Preface

Evaluating big IT projects and trends has consumed a large part of my professional life. After graduating from engineering school in 1979, I spent nearly a decade as a technical journalist, reporting on technologies ranging from robots that painted automobiles, to voice-recognition systems that took orders over the phone, to software that controlled the brewing of beer. I toured hundreds of facilities in the United States and learned firsthand how companies used technology. When the technology was used well, the results were amazing. There was always a risk of failure, however, when companies jumped into a technology too quickly.

I then became a technology analyst and spent over a decade at Gartner Inc., starting when it was shy of $40 million per year in revenues and leaving when revenues had grown above $600 million. During that time, I became a trusted advisor to both sellers and buyers of complex technologies, particularly in the area of back-office and operations software. I worked with many of the Fortune 500 companies around the world as well as some of the largest software providers, including SAP, Oracle, Computer Associates, and IBM. During my tenure at Gartner, I advised more than 1,000 companies.

As I spent more time with buyers I began to see that the promise of technology was not always fulfilled. Even when projects were successful in the beginning, they were often followed by unanticipated difficulties downstream.

"How do you do a live upgrade with R/3 [SAP's ERP system]?" asked a client over lunch one day. I didn't have the faintest idea - R/3 infrastructure was covered by another Gartner analyst. He continued: "If we do it the way that SAP recommends, we will have to shut down our systems for four days. I don't know how we'd do it." I still remember my sharp sense of embarrassment. Here was a technology that I had readily recommended to the company, yet I had no idea of some of the consequences of its use. I passed the buck to my Gartner colleague.

Another time, I was working with a client who wanted to replace a perfectly good piece of software with a new one because the older one ran on mainframe technology. I kept trying to talk him out of it, encouraging him to look for alternatives. I couldn't understand what the point was. The client made it very clear to me: "It's a good career move for me."

By the end of my tenure at Gartner, in the late 1990s, I was spending more time helping clients clean up problems than launch new technology initiatives. By then I had serious doubts about the widely held theory that technology would always help businesses improve themselves.

In the next stretch of my career, as a consultant, I started to look with fresh eyes at what it takes to be successful with technology. Over the next few years, I found more questions than answers.

Back in the 1990s, it appeared that an irrational exuberance for IT-based capital spending grabbed hold of corporate America and didn't let go for a long time. (Ironically, Europe and Asia did not catch this illness to the same degree, though I had non-U.S. clients who made mistakes similar to those of their U.S. counterparts.)

When the exuberance finally died down in 2001, corporate technology buying was left in a state of exhaustion and malaise, with little future direction. A new way of thinking and implementing IT was long overdue.

In 2002, the Sand Hill Group asked me to discuss the future of the software industry during its annual Enterprise conference. This was well past the bursting of the IT stock-market bubble, yet few pundits or companies believed that the worst was over. My presentation, "The New Competition," described how new technology, combined with a new buying sobriety and economic shortfalls, had changed the landscape of corporate IT buying and selling, perhaps forever. While it was a downbeat presentation to more than 200 CEOs and decision makers in the IT industry, it was received thoughtfully.

Before dinner, one conference attendee approached me and asked, "If things are going to become so bad, how will you make a living?" We both laughed, and although he may have been kidding, I felt a twinge of discomfort. A change was in the air for corporate buyers and sellers of technology, but I couldn't put my finger on what it was. Unlike during the previous 15 years, there was no Big Thing. No Y2K. No ERP. No Internet mania to stoke the hype fires. Instead, many small things were happening: major companies were experimenting with offshore development; clients were choosing to build open-source software solutions for under $100,000 rather than buying $2 million software packages; CIOs were looking to cut back rather than accelerate spending. Everyone was waiting for the Next Big Thing. But what was it?

The answer may seem obvious given all the technological advances in cell phones, broadband Internet access, DVDs, instant messaging, and personal computers. But we need to distinguish between consumer technologies, which are booming, and corporate technologies, which are not.

I was not the only one thinking this way. In mid-2003, Nicholas Carr's eight-page article "IT Doesn't Matter" in the Harvard Business Review triggered a firestorm of criticism. Regardless of the details of Carr's argument that the strategic importance of technology had diminished, his article served a useful purpose in triggering serious public debate. This debate covers three important themes: 1) the appropriate role of IT within business, 2) the right level of investment in technology, and 3) the benefits ultimately derived from technology.

With this book, I am attempting to consider all sides. You will find discussions and examples that identify participants by no more than a first name; to protect the confidentiality of my clients, I could not identify them more precisely. All of the examples and conversations described in the book are real; they took place at various points over the course of my career.

In contrast to much of the research done by traditional analyst firms (which assumes and, in fact, promotes more spending in IT), this book maintains that companies can move ahead over the next few years without large increases in their IT budgets. The only thing a company needs is a different perspective.

A lot of books and reports focusing on business improvement via technology have a catch: Before you can hope to see any benefits, you must spend more money on new technology than you spent the first time around. While I have favorite technologies that I believe can improve business processes, those technologies are not the subject of this book. This book is about increasing the effectiveness of technology while reducing IT expenses. Instead of a silver-bullet technology, companies need silver-haired thinking, grounded in solid returns, not airy promises.

Let me be blunt: The 1990s way of doing business is dead. Buyers and sellers of technology have two choices: Deny the practice of spending less and getting more, and be rolled over by change, or embrace the change and move forward into a new business environment.

If I am right and we are living through an IT inflection point, it is useful to know that such points occurred in all major technological revolutions dating back to the industrial revolution of the 18th century. Carlota Perez's excellent book, Technological Revolutions and Financial Capital, places IT as the fifth such revolution. Nothing is new about this revolution except that few people remember the prior ones.

These inflection points have one consistent feature: The companies that recognize the nature of change before everyone else prosper greatly. The goal of this book is to help anyone whose livelihood depends on the IT industry to navigate the difficult times ahead.