Monday, September 27, 2010

Assessing and rationalizing the IT applications portfolio

There's nothing like putting out a big hairy number to get media attention, and Gartner has just done that with its new statistic it calls "global IT debt." Gartner defines it as "the cost of clearing the backlog of maintenance that would be required to bring the corporate applications portfolio to a fully supported current release state."

And how big a number is it? According to Gartner's press release, it's approximately $500 billion today "with the potential to rise to $1 trillion by 2015."

A bogus statistic
I first heard about the press release through my associate, Vinnie Mirchandani, who points out that this latest statistic was in keeping with a long-standing Gartner tradition:
Gartner made a name starting in the mid-90s forecasting the estimated cumulative cost of Y2K remediation. I was there – and the big numbers it bandied about helped focus enterprises on the core problem. But it also led to hype, panic buying (and exaggerated market declines on the other side of the peak) and many, many poor IT investments. Since then Gartner has picked on many events – such as the introduction of the Euro between 1999 and 2002 - and piled up potential costs to come up with a single, usually scary, related aggregated IT project cost forecast.
Vinnie goes on to point out five reasons why it may be in an organization's best interest NOT to upgrade applications to the current release. Read Vinnie's whole post.

At this point, the easiest thing for me to do would be to pile on. In fact, my first reaction upon hearing about IT debt was to call the concept "bone-headed."

For example, if IT organizations are incurring a debt, to whom do they owe it? To software vendors? If so, it would betray a very vendor-centric view of IT. Gartner's view also assumes that having a software package up to date on the current release is a desired state--a concept that Vinnie pretty much demolishes.

Rationalizing the application portfolio
But, to be fair, it appears that Gartner is using this statistic to highlight the state of disarray in the applications portfolios of many organizations and to "make the problem bigger."

With that in mind, let's stipulate that application proliferation and unsupported versions add up to a big problem in many organizations. What, then, should a CIO do about it?

I would recommend a formal process to evaluate the entire applications portfolio and assign each application to a category, such as the following:
  1. Retirement Candidates: those applications that can simply be retired, as they may have little current usage or minimal business value.

  2. Consolidation Candidates: those that duplicate functionality found in other applications. For example, two SAP instances, or an SAP system and an Oracle system, might be combined: pick one and standardize on it.

  3. Freeze Candidates: those older applications that have little business value from upgrading to the current release. Mixing metaphors here, in Gartner's concept this would be "debt forgiveness." This might be a temporary strategy for an application that ultimately is slated for retirement or consolidation.

  4. 3PM Candidates: those older applications that still need to be maintained but would gain little business value from upgrading. These are good candidates for third-party maintenance (3PM). This wouldn't be a full "debt forgiveness" but more like a "loan modification": you still need the application but are looking for a cheaper alternative to vendor maintenance.

  5. SaaS Candidates: those that would benefit from moving away from traditional on-premise to software-as-a-service. Such applications do not incur "IT debt" in the future, because the service provider keeps the application up to date at all times, unlike traditional on-premise software that requires customers to upgrade.

  6. Upgrade Candidates: those that do not qualify for SaaS conversion but represent strategic platforms that the customer wants to keep current on some sort of schedule. You should avoid making custom modifications to these applications as much as possible, as modifications make them more difficult to upgrade.
Now, this is not an exhaustive list of categories, but you get the idea. You have to know what applications you have and what condition they are in, and then come up with a strategy for optimizing the value of each. Of course, the list should also be prioritized according to the criticality and business value of the proposed changes.
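To make the inventory concrete, here is a minimal sketch in Python of how each application might be tagged with a proposed disposition. The attributes, thresholds, and rules are illustrative assumptions on my part, not a prescribed methodology; a real assessment would weigh many more factors.

```python
from dataclasses import dataclass
from enum import Enum

class Disposition(Enum):
    RETIRE = "Retirement"
    CONSOLIDATE = "Consolidation"
    FREEZE = "Freeze"
    THIRD_PARTY_MAINT = "3PM"
    SAAS = "SaaS"
    UPGRADE = "Upgrade"

@dataclass
class Application:
    name: str
    active_users: int           # rough measure of current usage
    business_value: int         # 1 (low) to 5 (high), agreed on with the business
    duplicates_other_app: bool  # overlaps functionality elsewhere in the portfolio
    strategic_platform: bool    # part of the go-forward architecture
    saas_alternative: bool      # a viable SaaS offering exists

def propose_disposition(app: Application) -> Disposition:
    """Very rough first-pass rules; a real assessment weighs many more factors."""
    if app.active_users == 0 or app.business_value <= 1:
        return Disposition.RETIRE
    if app.duplicates_other_app:
        return Disposition.CONSOLIDATE
    if app.saas_alternative and not app.strategic_platform:
        return Disposition.SAAS
    if app.strategic_platform:
        return Disposition.UPGRADE
    # Still needed, but little value from upgrading: keep it running cheaply.
    return Disposition.THIRD_PARTY_MAINT if app.business_value >= 3 else Disposition.FREEZE

# Example: a little-used legacy reporting tool falls out as a retirement candidate.
legacy = Application("LegacyReports", active_users=3, business_value=1,
                     duplicates_other_app=False, strategic_platform=False,
                     saas_alternative=False)
print(propose_disposition(legacy).value)  # Retirement
```

Note that even in a toy version like this, the ordering of the rules is a policy decision: here consolidation is considered before a SaaS move, but an organization could just as reasonably reverse that.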

Conducting the assessment
On the other hand, it is not always an easy matter to put applications into the right category. Users may have one opinion of an application's business value or technical quality, while the IT organization may have another. Evaluating in-house-written systems and modifications to packaged software further complicates the problem, as the IT organization may have considerable pride of ownership in that work.

In conducting such assessments, we have sometimes found widely differing opinions about whether the problem is the application itself, the way the application was installed or configured, or the business processes that use the application--or all three. In many cases, therefore, it helps to have a neutral third party participate in the evaluation.
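To illustrate, here is a minimal sketch (with invented application names, rating scales, and a made-up threshold) of how business-side and IT-side ratings could be compared to flag the applications that most need a facilitated, neutral review.

```python
# Hypothetical ratings (1 = poor, 5 = excellent) gathered separately from
# business users and from IT for each application in the portfolio.
assessments = {
    # app name: (business value per users, per IT, technical quality per users, per IT)
    "OrderEntry": (5, 4, 2, 4),
    "LegacyHR":   (2, 2, 2, 2),
    "PlantMaint": (4, 1, 3, 2),
}

DISAGREEMENT_THRESHOLD = 2  # arbitrary cutoff for this sketch

for app, (bv_users, bv_it, tq_users, tq_it) in assessments.items():
    gap = max(abs(bv_users - bv_it), abs(tq_users - tq_it))
    if gap >= DISAGREEMENT_THRESHOLD:
        print(f"{app}: users and IT differ by {gap} points; "
              "flag for review with a neutral facilitator")
```

The point is not the scoring mechanics but surfacing the disagreements early, so the neutral party can spend its time where perceptions diverge most.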

My consulting firm Strativa has experience in conducting these sorts of assessments. You can also read more on our approach in the blog post I wrote several years ago, Four problems with ERP, and a follow-up post, Solving the Four Problems with ERP.

Email me if this is something of interest to your organization.

Update, Sep. 29: My colleague Dennis Howlett has a good post on ZDNet with further reaction to Gartner's IT debt proposition, as well as to Vinnie's and my posts. Read Dennis's entire post.

Monday, September 20, 2010

Oracle Apps User Survey: first look at early results

Update: the final results of our survey are complete. See this post: Oracle applications customers: wedded bliss or battered wives? The full report, Go-Forward Strategies for Oracle Application Customers, is available from Computer Economics.


Over at Computer Economics, we've been running a survey for customers of Oracle Applications. We've received nearly 100 responses so far, which is enough for us to begin to see some patterns taking shape.
  1. Service and support. There is a lot of dissatisfaction among Oracle customers concerning the quality and cost of Oracle maintenance and support. For example, 48% of E-Business Suite users and 41% of PeopleSoft users express dissatisfaction.

    But what really bothers customers is the cost of support: a whopping 63% of EBS users and about half of the PeopleSoft and J.D. Edwards users say Oracle support costs too much.

  2. Stay or leave? In spite of unhappiness with Oracle's support, most Oracle Apps customers aren't going anywhere. About 75% see Oracle maintaining the same or greater share of their IT budgets three years from now.

  3. Fusion what? Oracle has its work cut out for it in selling its product roadmap. Fusion Apps are not on the radar for most Oracle Apps customers. For example, only 25% of E-Business Suite and PeopleSoft customers are considering a migration to Fusion. This may change for the better after this Open World conference, where Fusion Apps will be getting a lot more attention and Oracle's installed-base reps will finally be able to start demonstrating the product.
Our survey covers several other areas as well, such as experience with Sun under Oracle, Exadata, and views toward Oracle's litigation of competitors.

Dennis Howlett videotaped me commenting on these results.



We plan to publish the full results in Q4. In the meantime, if you are an Oracle Apps customer, please take the 10-minute survey here. Qualified respondents will receive a free copy of the full final report from Computer Economics.

Related posts
Oracle Open World 2010: First Night Vibe

Sunday, September 19, 2010

Oracle Open World 2010: First Night Vibe

I'm here again at this year's Oracle Open World conference in San Francisco. The conference opened for me with my participation in a panel discussion for Oracle customers, but outside of Oracle's control, moderated by Ray Wang of the Altimeter Group. Lots of dialog there that I need to process.

I'm now listening to the opening presentations, in the comfort of the blogger/press room, awaiting the much-anticipated keynote by Oracle CEO Larry Ellison.

In the meantime, here's a quick YouTube video of sights and sounds in and around the Moscone Center, leading up to the first night's keynote.



Update, 7:00 p.m.--Oracle's View of Cloud Computing
We're now into Ellison's keynote, which is heavily focused on cloud computing and quickly moves to the latest developments in Oracle/Sun hardware. At the beginning of his talk, Ellison laid out two definitions of cloud computing:
  1. Virtualized cloud computing infrastructure services, as offered by Amazon.com's Elastic Compute Cloud (EC2), and
  2. Software applications that are offered as a service over the Internet, as typified by Salesforce.com.
Ellison says that Oracle's definition of cloud computing matches Amazon's, not Salesforce.com's. By then moving quickly to an overview of Oracle/Sun hardware, he makes his purpose clear: Oracle wants to be a provider of infrastructure hardware and software to cloud computing providers--both public clouds and private clouds (similar virtualized data center services run by large organizations for their internal purposes).

By implication, then, Oracle does not intend to broadly offer software-as-a-service, as Salesforce.com does. Ironically, Oracle does have some SaaS offerings today, such as its CRM On-Demand, which it inherited from Siebel. But as we all know, such offerings do not carry the high margins of Oracle's on-premise software applications, or the margins it thinks it can get by selling high-end database appliance boxes (i.e. Exadata). In my view, then, Oracle would rather sell the infrastructure (hardware, database, middleware) than the service (SaaS applications delivered over that infrastructure).

There's something to be said for having a clear strategy and executing consistently with it. However, I have to wonder--if SaaS becomes the norm for delivering application functionality in the future, will the hardware/database/middleware market really grow to match Oracle's expectations?

Update, 8:00 p.m.--Fusion Apps on the Way
Ellison devoted the last part of his keynote to Oracle's much-awaited Fusion Applications, which he said would be released to some customers late this year, with general availability in the first quarter of 2011. I won't go into the details at this time. I see now that there's a new section on Oracle's website with much new detail on what's in Fusion Apps.

Postscript: Oracle Apps User Survey. Our 10-minute Oracle Apps User Survey is still open for responses. If you're a customer of any of Oracle's Application products, take the survey here.

Wednesday, September 15, 2010

Rumors of the death of corporate IT greatly exaggerated

I came across a blog post recently entitled, Is Enterprise 2.0 Helping to Kill Off the IT Department? (For those not up to date on the latest fashions, Enterprise 2.0 refers to the use of tools such as blogs, wikis, RSS, mashups, and social networks to capture and manage unstructured information in business enterprises.)

What set me off about this post was not the part about "Enterprise 2.0," but the part about "killing off the IT department." I've been reading articles like this ever since I started working with IT several decades ago.

About every 10 years or so, some new technology comes along that observers trumpet as so radical and innovative that it will result in nothing less than the death of the corporate IT department. And every time, IT ultimately adapts, though at first it may resist.

The PC revolution
As I recall, the first death knell was sounded when PCs arrived. These started showing up in businesses in a big way in the late 1970s. At the time, I was working as a manufacturing systems analyst for an oil field services firm. I was in the midst of gathering requirements for a custom system that would manage tooling inventory on the shop floor.

One day, I went out to visit the tool crib in one of our plants and found that my users had already built their own tooling inventory system using a TRS-80 PC from Radio Shack. They said they didn't need help from corporate IT. I argued with the manager, "Look, I'm not out here setting up my own tool crib--you shouldn't be out here building your own computer systems." Ultimately, we did end up building the corporate tooling system, but not without a lot of resistance from users who felt that what they could build themselves was good enough.

It took more than 10 years--some might say, 15-20 years--for IT to figure out how to adapt to the PC revolution. The first big issue was "connectivity." Users were buying personal computers, then spending way too much time re-keying data from mainframe-printed reports into PC spreadsheets. So, corporate IT was given the task of connecting those PCs to the corporate systems while still maintaining the security and integrity of corporate data.

The next challenge was to harness all that computing power sitting on the desktop, which ushered in the era of client-server computing. Centralized systems would maintain master file data, while desktop systems would be used for data entry and analysis, so that the server and the client were each used for what they did best. Response time would be faster, since much of the processing was local to the desktop PC, and data integrity would be maintained, since master files were kept at the server level. Eventually, you had three-tier (client, application server, and database server) and n-tier systems.

Along with client-server computing, we had to deal with the need for graphical user interfaces. As users became accustomed to the Windows GUI in the late 80s and early 90s, they became frustrated with the character-based interfaces of their corporate mainframe and minicomputer systems. Some users, learning Visual Basic on their own, became better PC programmers than those of us in corporate IT. So we had a whole new set of skills to learn. But corporate IT ultimately adapted.

What corporate IT went through to adapt to the PC revolution was much greater than anything it faces today in adapting to the presence of Facebook, Twitter, and iPhones.

The Web revolution
Just around the time corporate IT figured out how to manage PCs and client-server systems, the Internet changed everything. Again, a new technology served to empower users, who were now able to build their own websites and simple web-based applications, bypassing corporate IT.

But the Internet and Web technologies turned out to be a great boon to corporate IT. The first way they helped was in simplifying system design and system management. In truth, client-server was a very cumbersome way to build and manage systems, with code sitting both on the server and on each client. Web technologies made it possible to create thin-client systems, with all business logic sitting at the server tier and the desktop PC used only for presentation. Web technologies allowed systems to be re-centralized--a return to the host-based computing of the mainframe days, but with a GUI.

Second, the Internet made it possible to connect customers, partners, and suppliers in a way that was much simpler than with the old electronic data interchange (EDI) technologies--the so-called e-business revolution. Some of this resulted in the dot-com boom and bust, but much of it remains and has become part of doing business today.

Once again, the corporate IT department adapted and survived.

The cloud-computing revolution
Today, we're in the midst of another revolution--cloud computing--which is really just an extension of the Internet wave. As Web-based systems became mainstream, it became possible to move an application system with its entire infrastructure--data centers, servers, databases, application code, network gear, and infrastructure support personnel--out of the corporate data center to a third-party provider.

At first it was a one-for-one replacement, simply picking up the application and moving it offsite--what was called, in the late 90's, the application service provider (ASP) model. Then, providers began to build their own applications that could support multiple customers on a single instance of the application code, so-called multi-tenant systems, or software-as-a-service (SaaS). Some would say this was a return to the old time-sharing or service bureau model of the earliest days of corporate computing.

The rise of SaaS applications, from providers such as Salesforce.com, meant that user departments could go out and buy their own applications as a service, without any involvement from corporate IT.

Earlier this year, Ray Wang did a quick poll of 46 large-company CIOs and found only 11 of them (24%) indicating they had SaaS applications running in their organizations. But when he polled the procurement managers at these same organizations, he found 100% of them reporting the presence of SaaS applications in various business units. Clearly, CIOs in three-quarters of these companies were being cut out of the SaaS procurement decision.

Doesn't this mean the death of the corporate IT department? I don't think so, because nearly every one of these SaaS applications, ultimately, will need to be integrated into an enterprise architecture, just like all those PCs back in the 80's needed to be connected to the corporate mainframe. You can see this happening already. As one respondent in Ray's survey said, "The business heads keep showing up with these SaaS apps and then want us to integrate them. We need to get a handle on all this!"

The architectural role of corporate IT
No one talks about "the death of corporate HR" or "the death of corporate accounting." Why, then, does the idea of "the death of corporate IT" keep recurring? I think it's because many observers see corporate IT only as a manager of technology. So, every time a new technology comes along that users can deploy for themselves, it would seem to obviate the need for a corporate IT department to exist at all.

Many observers don't realize that information technology--software, hardware, networks--is just one part of "corporate IT." The other major part is designing and managing business processes--especially cross-functional business processes--that utilize that technology. This part also includes maintaining the enterprise architecture--the combination of technology and business processes--to support the organization's objectives.

Granted, some IT departments are not very good at the second part. But if they aren't, someone needs to be. I don't care what you call it, or who it reports to. Someone needs to be concerned about the integration of all these systems with one another, and no single user department is in a position to do that.

Back to the beginning
In thinking about these matters, I was reminded of my introduction to corporate IT. My first job was as a programmer trainee at Macy's at Herald Square in New York (as seen in Miracle on 34th Street). Macy's was one of the first organizations to purchase an electronic computer for business purposes--one of the first National Cash Register (NCR) computers--in the early 1950s.

I reported to some of those first programmers, who were still working there when I started as a trainee in 1974. (In fact, when we had trouble with some obscure piece of logic in those core systems--which had long since been migrated from NCR to the IBM mainframe--these old boys would pull out some dog-eared pages of ancient bubble flow-charts to answer our questions.)

Interestingly, prior to the arrival of its first NCR mainframe, there was no "corporate IT department" at Macy's. Rather, Macy's assigned the responsibility for this new technology to a group that already existed within the organization--it was called (as I believe I was told), the "Systems and Procedures Department."

What was the previous mission of the Systems and Procedures Department? It was to design and maintain all business processes within Macy's. For example, the procedure for handling and accounting for returned merchandise involved several departments and required tight controls. The Systems and Procedures Department designed, tested, and implemented this procedure. When the NCR mainframe arrived, the department simply incorporated computer technology to automate parts of that process.

The "systems and procedures" roots of the Macy's corporate IT department were still in evidence when I arrived in 1974, fresh out of college as a programmer trainee. After two or three days studying programming manuals, bored out of my mind, I was ready for my first assignment--to write a program to print "Holiday Money" (those fake currency certificates that department stores used to mail during holiday season to credit card holders, allowing them to spend them like real money and have the amounts charged to their credit cards).

But rather than give me programming specs, my manager (one of the old guard) gave me my first task: go talk to the users. He said, "Around here, no one is just a programmer--you have to understand the business. Go talk to the users and see what they want. Then write it up and run it by me."

He also told me about a friendly rivalry he had with his counterpart at another well-known department store. From time to time each would visit the other's store and attempt to find weaknesses in the other's business processes.

In fact, just the previous year my manager had discovered a hole in his rival's Holiday Money process! It was possible to return merchandise purchased with holiday money and receive cash back--effectively getting a cash advance on the consumer's credit card, which was a violation of the store's policy. He then notified his rival, who plugged the hole, but lost bragging rights for that round.

Understand, these were not two "business managers." These were two IT managers.

So I learned in my first week on the job that corporate IT is not just about technology. It's about "systems and processes." And if "systems and processes" pre-dated the introduction of computers to business, then this role for corporate IT will survive, even if we get rid of the computers.

Related posts
The inexorable dominance of cloud computing
IT departments face extinction
The end of corporate computing