I came across a blog post recently entitled "Is Enterprise 2.0 Helping to Kill Off the IT Department?" (For those not up-to-date on the latest fashions, Enterprise 2.0 refers to the use of tools such as blogs, wikis, RSS, mashups, and social networks to capture and manage unstructured information in business enterprises.)
What set me off about this post was not the part about "Enterprise 2.0," but the part about "killing off the IT department." I've been reading articles like this ever since I started working with IT several decades ago.
About every 10 years or so, some new technology comes along that observers trumpet as so radical and innovative that it will result in nothing less than the death of the corporate IT department. And every time, IT ultimately adapts, though at first it may resist.
The PC revolution
As I recall, the first death knell was sounded when PCs arrived. These started showing up in businesses in a big way in the late 1970s. At the time, I was working as a manufacturing systems analyst for an oil field services firm. I was in the midst of gathering requirements for a custom system that would manage tooling inventory on the shop floor.
One day, I went out to visit the tool crib in one of our plants and found that my users had already built their own tooling inventory system using a TRS-80 PC from Radio Shack. They said they didn't need help from corporate IT. I argued with the manager, "Look, I'm not out here setting up my own tool crib--you shouldn't be out here building your own computer systems." Ultimately, we did end up building the corporate tooling system, but not without a lot of resistance from users who felt that what they could build themselves was good enough.
It took more than 10 years--some might say 15 to 20--for IT to figure out how to adapt to the PC revolution. The first big issue was "connectivity." Users were buying personal computers, then spending way too much time re-keying data from mainframe-printed reports into PC spreadsheets. So corporate IT was given the task of connecting them while still maintaining the security and integrity of corporate data.
The next challenge was to harness all that computing power sitting on the desktop, which brought in the era of client-server computing. Centralized systems would maintain master file data, while desktop systems would be used for data entry and analysis, so that the server and the client were each doing what they did best. Response time would be fast on the desktop PC, since only local processing was required, and data integrity would be maintained, since master files were kept at the server level. Eventually, you had three-tier (client, application server, and database server) and n-tier systems.
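To make that division of labor concrete, here is a minimal sketch of the client-server split--in Python, chosen purely for brevity (the systems of that era were written in very different languages). The server tier owns the master file; the client fetches records over the wire and does its analysis locally. The part numbers and fields are invented for illustration.

```python
# Server tier: owns the master file, so data integrity stays central.
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

MASTER_FILE = {  # master data lives only on the server
    "T-100": {"description": "carbide drill bit", "on_hand": 42},
    "T-200": {"description": "boring bar", "on_hand": 7},
}

def get_tool(part_no):
    return MASTER_FILE[part_no]

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(get_tool)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client tier: data entry and analysis happen on the desktop, where
# response time is fast because only local processing is involved.
proxy = ServerProxy("http://localhost:8000")
tool = proxy.get_tool("T-200")
if tool["on_hand"] < 10:
    print(f"Reorder {tool['description']}: only {tool['on_hand']} on hand")
```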
Along with client-server computing, we had to deal with the need for graphical user interfaces. As users became accustomed to the Windows GUI in the late 80s and early 90s, they became frustrated with the character-based interfaces of their corporate mainframe and minicomputer systems. Some users, learning Visual Basic on their own, became better PC programmers than those of us in corporate IT. So we had a whole new set of skills to learn. But corporate IT ultimately adapted.
What corporate IT went through to adapt to the PC revolution was much greater than anything it faces today in adapting to the presence of Facebook, Twitter, and iPhones.
The Web revolution
Just around the time corporate IT figured out how to manage PCs and client-server systems, the Internet changed everything. Again, a new technology served to empower users, who were now able to build their own websites and simple web-based applications, bypassing corporate IT.
But the Internet and Web technologies turned out to be a great boon to corporate IT. The first way they helped was by simplifying system design and system management. In truth, client-server was a very cumbersome way to build and manage systems, with code sitting both on the server and on each client. Web technologies made it possible to create thin-client systems, with all business logic sitting at the server tier and the desktop PC used only for presentation. Web technologies allowed systems to be re-centralized--a return to the host-based computing of the mainframe days, but with a GUI.
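To see why thin client was such a relief, consider a sketch of the shape it took--again in Python, just for illustration. All the business logic, and even the rendering of the user interface, lives at the server tier; the desktop contributes nothing but a browser. The inventory data and reorder rule here are invented.

```python
# A minimal thin-client sketch: the server renders everything,
# and any machine with a browser can be a client.
from http.server import BaseHTTPRequestHandler, HTTPServer

INVENTORY = {"T-100": 42, "T-200": 7}
REORDER_POINT = 10  # the business rule lives server-side

class ToolCribHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Business logic and presentation are both produced here;
        # the client merely displays the result.
        rows = "".join(
            f"<tr><td>{part}</td><td>{qty}</td>"
            f"<td>{'REORDER' if qty < REORDER_POINT else 'ok'}</td></tr>"
            for part, qty in INVENTORY.items()
        )
        body = (
            "<table><tr><th>Part</th><th>On hand</th><th>Status</th></tr>"
            f"{rows}</table>"
        )
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ToolCribHandler).serve_forever()
```

Nothing needs to be installed or maintained on the desktop--which is exactly what made re-centralization possible.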
Second, the Internet made it possible to connect customers, partners, and suppliers in a way that was much simpler than it was with the old electronic data interchange (EDI) technologies--the so-called e-business revolution. Some of this resulted in the dot-com boom and bust, but much still remains and has become part of doing business today.
Once again, the corporate IT department adapted and survived.
The cloud-computing revolution
Today, we're in the midst of another revolution--cloud computing--which is really just an extension of the Internet wave. As Web-based systems became mainstream, it became possible to move an application system with its entire infrastructure--data centers, servers, databases, application code, network gear, and infrastructure support personnel--out of the corporate data center to a third-party provider.
At first it was a one-for-one replacement, simply picking up the application and moving it offsite--what was called, in the late 1990s, the application service provider (ASP) model. Then providers began to build their own applications that could support multiple customers on a single instance of the application code--so-called multi-tenant systems, or software-as-a-service (SaaS). Some would say this was a return to the old time-sharing or service bureau model of the earliest days of corporate computing.
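At the data layer, multi-tenancy comes down to something like the following sketch, using Python's built-in sqlite3 as a stand-in for the provider's database. One instance of the application code and one schema serve every customer, with a tenant_id column keeping each customer's rows invisible to the others. The table and tenant names are invented for illustration.

```python
import sqlite3

# One shared schema for all customers -- the essence of multi-tenancy.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (tenant_id TEXT, order_no TEXT, amount REAL)")
db.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("acme", "A-1", 500.0), ("acme", "A-2", 75.0), ("globex", "G-1", 900.0)],
)

def orders_for(tenant_id):
    # Every query is scoped by tenant, so customers never see
    # one another's data despite sharing one application instance.
    return db.execute(
        "SELECT order_no, amount FROM orders WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchall()

print(orders_for("acme"))    # [('A-1', 500.0), ('A-2', 75.0)]
print(orders_for("globex"))  # [('G-1', 900.0)]
```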
The rise of SaaS applications, from providers such as Salesforce.com, meant that user departments could go out and buy their own applications as a service, without any involvement from corporate IT.
Earlier this year, Ray Wang did a quick poll of 46 large-company CIOs and found only 11 (24%) of them indicating they had SaaS applications running in their organizations. But when he polled the procurement managers from these same organizations, he found that 100% of them reported the presence of SaaS applications in various business units. Clearly, CIOs in three-quarters of these companies were being cut out of the SaaS procurement decision.
Doesn't this mean the death of the corporate IT department? I don't think so, because nearly every one of these SaaS applications, ultimately, will need to be integrated into an enterprise architecture, just like all those PCs back in the 80s needed to be connected to the corporate mainframe. You can see this happening already. As one respondent in Ray's survey said, "The business heads keep showing up with these SaaS apps and then want us to integrate them. We need to get a handle on all this!"
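In practice, that integration work is often as mundane as pulling records out of the provider's API into a corporate data store on a schedule. Here is a rough sketch in Python; the endpoint, token, and field names are hypothetical, not any particular vendor's actual API.

```python
import json
import sqlite3
import urllib.request

API_URL = "https://api.example-saas.com/v1/accounts"  # hypothetical endpoint
API_TOKEN = "..."  # hypothetical credential

def fetch_accounts():
    # Pull the current account list from the (hypothetical) SaaS API.
    req = urllib.request.Request(
        API_URL, headers={"Authorization": f"Bearer {API_TOKEN}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def load_into_warehouse(accounts):
    # Land the records in a corporate store where other systems
    # can join them with the rest of the enterprise's data.
    db = sqlite3.connect("warehouse.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS saas_accounts (id TEXT PRIMARY KEY, name TEXT)"
    )
    db.executemany(
        "INSERT OR REPLACE INTO saas_accounts VALUES (?, ?)",
        [(a["id"], a["name"]) for a in accounts],
    )
    db.commit()

if __name__ == "__main__":
    load_into_warehouse(fetch_accounts())
```

Multiply that by every SaaS app each business unit signs up for, and it becomes clear why someone still has to own the enterprise architecture.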
The architectural role of corporate IT
No one talks about "the death of corporate HR" or "the death of corporate accounting." So why does the thought of "the death of corporate IT" keep recurring? I think it's because many observers see corporate IT only as a manager of technology. On that view, every time a new technology comes along that users can deploy for themselves, it would seem to obviate the need for a corporate IT department at all.
Many observers don't realize that information technology--software, hardware, networks--is just one part of "corporate IT." The other major part is designing and managing business processes--especially cross-functional business processes--that utilize technology. This part also includes developing and maintaining the enterprise architecture--the combination of technology and business processes--that supports the organization's objectives.
Granted, some IT departments are not very good at the second part. But if they aren't, someone needs to be. I don't care what you call it, or who it reports to. Someone needs to be concerned about the integration of all these systems with one another, and no single user department is in a position to do that.
Back to the beginning
In thinking about these matters, I was reminded of my introduction to corporate IT. My first job was as a programmer trainee at Macy's at Herald Square in New York (as seen in Miracle on 34th Street). Macy's was one of the first organizations to purchase an electronic computer for business purposes--one of the first National Cash Register (NCR) computers--in the early 1950s.
I reported to some of those first programmers, who were still working there when I started as a trainee in 1974. (In fact, when we had trouble with some obscure piece of logic in those core systems--which had long since been migrated from NCR to the IBM mainframe--these old boys would pull out some dog-eared pages of ancient bubble flow-charts to answer our questions.)
Interestingly, prior to the arrival of its first NCR mainframe, there was no "corporate IT department" at Macy's. Rather, Macy's assigned responsibility for the new technology to a group that already existed within the organization--called, as I believe I was told, the "Systems and Procedures Department."
What was the previous mission of the Systems and Procedures Department? It was to design and maintain all business processes within Macy's. For example, the procedure for handling and accounting for returned merchandise involved several departments and required tight controls. The Systems and Procedures Department designed, tested, and implemented this procedure. When the NCR mainframe arrived, the department simply incorporated computer technology to automate parts of that process.
The "systems and procedures" roots of the Macy's corporate IT department were still in evidence when I arrived in 1974, fresh out of college as a programmer trainee. After two or three days studying programming manuals, bored out of my mind, I was ready for my first assignment--to write a program to print "Holiday Money" (those fake currency certificates that department stores used to mail during holiday season to credit card holders, allowing them to spend them like real money and have the amounts charged to their credit cards).
But rather than give me programming specs, my manager (one of the old guard) gave me my first task: go talk to the users. He said, "Around here, no one is just a programmer--you have to understand the business. Go talk to the users and see what they want. Then write it up and run it by me."
He also told me about a friendly rivalry he had with his counterpart at another well-known department store. From time to time each would visit the other's store and attempt to find weaknesses in the other's business processes.
In fact, just the previous year my manager had discovered a hole in his rival's Holiday Money process! It was possible to return merchandise purchased with holiday money and receive cash back--effectively getting a cash advance on the consumer's credit card, which was a violation of the store's policy. He then notified his rival, who plugged the hole, but lost bragging rights for that round.
Understand, these were not two "business managers." These were two IT managers.
So I learned, in my first week on the job, that corporate IT is not just about technology. It's about "systems and processes." And if "systems and processes" pre-dated the introduction of computers to business, then this role for corporate IT will survive even if we someday get rid of the computers.
Related posts
The inexorable dominance of cloud computing
IT departments face extinction
The end of corporate computing