Fumble: How a Botched Software Upgrade Hurt J. Crew’s Bottom Line

What is the cost of a fumbled upgrade? For J. Crew it was $3MM in unanticipated costs which, according to the company, contributed to its recently announced earnings miss of 4 cents a share. How did Wall Street respond? With its typical overcorrection: shares traded down 15% in after-hours activity.

Ben Worthen highlighted J. Crew’s stumble in his business technology blog in the Wall Street Journal, mentioning that J. Crew isn’t the first company to blame poor earnings results on technology.

There was a wave of businesses blaming poor results on tech-projects-gone-bad in the early part of the decade. We haven’t seen it much lately, though.

One difference: Nike, Hershey and others that had problems in the past went out of their way to blame the tech vendor. J. Crew never once tried to pass the buck. The company didn’t respond to our requests for comment, which also means we don’t know which company sold the offending technology. You can search the Web for “J. Crew” and “systems” and find the names of several companies J. Crew buys software from, but there’s not enough evidence for us to point a finger.

What strikes a chord with me in the report is Worthen's assumption that the poor upgrade is the result of "offending technology." Our experience, however, makes me much more suspicious of the implementation/upgrade approach, the executive sponsorship, the project budget and timeline constraints, and ultimately the implementation team itself. All too often we see companies approach an upgrade as a routine activity that their staff can perform easily (all while that staff stays on top of its regular day-to-day responsibilities). Supplemental staff is reluctantly brought in through commodity staffing firms that can provide only bodies, not real experience from either a people or a process standpoint.

Unsurprisingly, this approach rarely flies.

What happens next? For a while, nothing. Status reports look good; the project seems to be on track because nobody on the team really knows any better. Then one day, usually a month or two before testing, the cracks start to show. The team isn't ready to start testing, but the "go-live" date isn't movable. Then some genius somewhere makes the brilliant decision to give the team the extra time it needs by cutting the time budgeted for testing. Things get worse, because you can't slash QA, a lesson that has been taught a hundred times over yet rarely sticks at crunch time. Testing gets cut to a bare minimum, and the result is an inflated sense of confidence about go-live readiness.

You know the rest of the story. The new system is turned on and all hell breaks loose. Someone orders a few items of clothing and is charged $9,200. Oops. Then you spend a bunch of extra money repairing production business logic that should have been caught in the QA you cut, you miss your earnings estimate, and you get punished by Wall Street. Fumble.

The moral here? Self-awareness. Understand what you do well and do that really, really well. For everything else, work with trusted partners who can bring you the experience and expertise you need to get the job done…the right way…the first time.