Friday, February 21, 2003

FDA drops the other shoe on Part 11

FDA has just announced that it is issuing a single new draft guidance document for 21 CFR Part 11, and it is withdrawing all prior agency draft guidance on Part 11. In its announcement, FDA stated clearly that a re-examination of Part 11 is already underway that may result in revision of Part 11 itself. FDA also indicated that for the time being it will "not normally take regulatory action to enforce Part 11 with regard to systems that were operational before August 20, 1997 . . . while we are examining Part 11." In other words, for now, legacy systems are grandfathered. Furthermore, FDA indicated specific concerns about some Part 11 requirements for validation, audit trails, record retention, and record copying.

I was at the Medical Device Manufacturing conference in Anaheim when word of this announcement began to spread through the exhibit floor. But after carefully reading the new guidance this morning, it is clear that FDA is not abandoning its concern about the use of computer systems. I say this for three reasons:
  1. Even though FDA withdrew Part 11 guidance regarding validation, validation of computer systems is still a requirement under the predicate rules (e.g., 21 CFR Parts 210, 211, and 820). Validation was a requirement even before Part 11 was originally promulgated.

  2. FDA stated clearly that it will continue enforcement of certain controls for closed systems (11.10) and open systems (11.30), such as limiting access, operational checks, authority checks, device checks, and administrative/procedural controls.

  3. FDA stated it would continue to enforce all of the Part 11 requirements for electronic signatures. Few, if any, legacy systems meet these requirements without remediation or adoption of a hybrid approach of handwritten signatures executed to electronic records.

As I wrote earlier this month, FDA is not abandoning its interest in regulating the use of electronic records and electronic signatures. Regulated companies should continue to implement the administrative and procedural controls called for by Part 11, since for the most part they are not difficult to implement, and they represent best security practices that will increase the trustworthiness and reliability of any system. Vendors of packaged software (such as ERP, PDM, document management, and quality assurance systems) that are working on adding the technical controls required by Part 11 should continue their efforts. Nevertheless, FDA’s announcement gives both users and software vendors some breathing space to implement proper controls over electronic records and signatures, with the hope that a better-defined, risk-based approach to Part 11 will emerge in the future.

Friday, February 14, 2003

Corporations—the next target for crackdown on piracy

Just three weeks ago, I predicted that large corporations would be the next target for the entertainment industry’s crackdown on Internet piracy of copyrighted media content. But it turns out that my prediction is coming true faster than I expected. The entertainment industry is already distributing a brochure to hundreds of corporations around the world, urging them to take action against employee downloading, or face legal consequences. ZDNet has a full report on this latest warning from the entertainment industry.

As I noted earlier, companies need to get their desktops under control. Many companies already have policies in place regarding acceptable use of corporate systems and desktops, but many of those same companies do not take the next step of directly auditing desktops for compliance. All companies, large and small, need to adopt periodic desktop auditing as a best practice to mitigate liability.

My firm, Strativa, has already conducted one such audit on behalf of a large company with a worldwide network, and the results were a real eye-opener. We wrote a white paper on the subject, which is available here [no longer available--contact me if interested--FS].

Wednesday, February 05, 2003

FDA signals change in approach to Part 11

Last week, FDA announced that it is withdrawing its draft guidance regarding the electronic copies requirements of 21 CFR Part 11. This is good news for all companies regulated by FDA. When FDA first issued this draft guidance less than three months ago, it was clear to me that, unless something changed, it was going to be nearly impossible to implement. For example, the guidance called for companies to provide FDA with capabilities to "perform the same kinds of data processing" on the electronic copies that the company’s own system allows on the original records. Other consultants I’ve spoken to had basically the same reaction. So, withdrawal of this guidance is welcome.

There are hints that FDA soon may be making more changes to its approach to Part 11. FDA made this announcement in the context of the initiative it began last August to update its current good manufacturing practice (cGMP) program to a more risk-based approach. In this context, FDA indicates that the withdrawn guidance on Part 11 "may no longer represent FDA’s approach under the CGMP initiative." Furthermore, FDA announced that primary responsibility for implementing Part 11 is shifting from the Office of Regulatory Affairs to the Center for Drug Evaluation and Research (CDER), the FDA center that regulates drugs.

The implications of FDA’s announcement are (a) that a more risk-based approach to Part 11 may be forthcoming, something practitioners have been calling for since Part 11 was first promulgated, and (b) that Part 11 should be applied on an industry-specific basis, by those who best understand each industry's issues and risks. Although CDER will take the lead in implementing Part 11, it seems likely that inspections for Part 11 compliance will still be carried out by investigators from each FDA center.

Companies struggling with Part 11 compliance should view FDA’s announcement and its implications as providing some breathing space--not as an abandonment of FDA’s interest in regulating the use of electronic records and electronic signatures. Regulated companies should continue to implement the administrative and procedural controls called for by Part 11, since for the most part they are not difficult to implement, and they represent best security practices that will increase the trustworthiness and reliability of any system. Vendors of packaged software (such as ERP, PDM, document management, and quality assurance systems) that are working on adding the technical controls required by Part 11 should continue their efforts. Nevertheless, FDA’s announcement suggests that both users and vendors may be able to deal with Part 11 with less uncertainty than in the past.

For more discussion on Part 11 and its implications for users and vendors, see the posts I wrote in October, November, and December of last year.

Wednesday, January 22, 2003

Desktop auditing is crucial to avoid employer liability. Recently, my firm conducted an automated worldwide desktop audit for a major client. We have since received other inquiries and have seen a general increase in interest in this type of auditing service. The interest seems to be driven by two key issues: 1) the need to comply with software licensing agreements, and 2) the need to avoid liability for employee downloading of copyrighted materials.

Executives have generally understood their responsibilities under software licensing agreements, and most recognize, at least in the abstract, that they are liable when employees install pirated software or media content on company computers. Under the doctrine of vicarious liability in U.S. copyright law, an employer is liable for acts committed by its employees when those acts are within the scope of their employment duties. Therefore, most companies have policies in place regarding acceptable use of company computers.

But why have these needs suddenly become "front-burner" topics at many companies? Although executives know of these risks, until recently most have not done much about them because enforcement had been lax. Software vendors would occasionally audit large firms to ensure they had purchased sufficient licenses, but it was nearly impossible to find a case where a company was actually sued for an employee's downloading of copyrighted materials. The times, however, are changing. Software vendors and the entertainment industry are both cracking down hard on non-compliance and piracy. The risk to companies -- large and small -- is real.

Software publishers stepping up audits

By some estimates, 24% of installed software in the U.S. is not licensed. According to the Software and Information Industry Association (SIIA), the worldwide cost of software piracy to vendors in 1999 was over $12.2 billion. With the software industry no longer enjoying the double-digit growth of the 1990s, that's a lot of money left on the table. Therefore, vendors are looking to increase revenues by enforcing license agreements and cracking down on pirated software. The Software Publishers Association (SPA), the Business Software Alliance (BSA), and the UK-based Federation Against Software Theft (FAST) are all ramping up enforcement activity as vendors, such as Microsoft, continue to focus on combating non-compliance.

This risk is not limited to large companies. From time to time in the U.S., the BSA announces a piracy truce for small and medium businesses, encouraging companies to turn themselves in if they suspect they are harboring unlicensed or counterfeit software. Furthermore, the BSA, the SPA, and some software vendors, such as Microsoft, solicit tipsters to anonymously report suspected cases of software piracy. So, even a company that wants to be compliant could face an audit based on an anonymous tip from a disgruntled employee.

Entertainment industry cracking down on piracy

Media and entertainment companies, likewise, face loss of revenue due to illegal file-sharing. The industry has already shut down Napster, an early peer-to-peer file-sharing service, and it is now turning its attention to other services, such as Kazaa and Morpheus, which are used to trade copyrighted materials such as movies, music, software, and games. It is aggressively targeting Web sites that offer media for download, as well as their Internet service providers. Just this week, the Recording Industry Association of America (RIAA) persuaded a federal judge to order Verizon to turn over the identity of a subscriber suspected of making available unauthorized copies of several hundred songs. The industry is also putting pressure on universities: the RIAA is currently sponsoring an educational program with more than 300 universities, claiming a 55% drop in the number of sites on university servers offering illegal downloads.

After the RIAA gets file-sharing services, ISPs, and universities under control, there is only one other place where consumers can easily download illegal content: the workplace. Is there any doubt that corporations with high-speed Internet connections and large networks of desktop computers will be the next target of the entertainment industry?

Therefore, companies need to get desktops under control. To avoid software license non-compliance, companies need to periodically audit license usage. And, to avoid liability for employee actions, companies must demonstrate "reasonable effort" to deter misappropriation or theft of computer software and intellectual property. Efforts should include a clear company policy regarding desktop software and content, consequences for non-compliance, periodic desktop audits, and documentation of the results.
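For readers who want a sense of what the inventory step of such an audit looks like, below is a minimal Python sketch that lists the software installed on a single Windows desktop by reading the registry's Uninstall keys. It is illustrative only -- it is not the tool my firm used -- and a real audit would also scan the 32-bit (WOW6432Node) and per-user registry hives, gather results from every machine on the network, and compare the inventory against license entitlements and the company's acceptable-use policy.

    # Minimal, illustrative sketch: inventory installed software on one Windows PC
    # by reading the registry's Uninstall keys. Not the tool used in our audits.
    import winreg

    UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

    def installed_software():
        """Yield (name, version, publisher) for each program registered on this PC."""
        root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY)
        subkey_count = winreg.QueryInfoKey(root)[0]       # number of subkeys under Uninstall
        for i in range(subkey_count):
            entry = winreg.OpenKey(root, winreg.EnumKey(root, i))

            def field(value_name):
                try:
                    return winreg.QueryValueEx(entry, value_name)[0]
                except OSError:
                    return ""                             # value not present for this entry

            display_name = field("DisplayName")
            if display_name:                              # skip hidden/system entries
                yield display_name, field("DisplayVersion"), field("Publisher")

    if __name__ == "__main__":
        for name, version, publisher in sorted(installed_software()):
            print(f"{name}\t{version}\t{publisher}")

The output from each desktop can then be rolled up centrally, which is what makes periodic auditing practical even across a worldwide network.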

Tuesday, January 21, 2003

Blocking and tackling in the warehouse

A recent survey of supply chain managers and executives revealed that a surprising number of companies have yet to fully adopt transportation and warehouse management systems (TMS and WMS) and advanced planning and scheduling (APS) systems, and that these systems are only now becoming more broadly accepted.

The study found that only 15 percent of responding companies had fully implemented TMS, while another 10 percent were in the process of implementation. Adoption was somewhat better for WMS, with 35 percent having a WMS in place and another 10 percent in the process of implementation. Still, the study found that interest in such solutions is high, with an additional 24 percent considering TMS and an additional 35 percent considering WMS.

"We were surprised by how interested people are in those fundamentals — systems that help in the warehouse or that help in the transportation function, for instance," said Thomas Goldsby, assistant professor of marketing and logistics at Ohio State University, who was one of the study’s authors. He also said they were surprised that these systems, which he calls "blocking and tackling" technologies of logistics — had not seen broader adoption given the IT spending spree of the late 1990s and the widespread buzz about supply chain collaboration in recent years.

In my opinion, such findings are not surprising to anyone who has actually spent any time among midsize manufacturing and distribution firms. The gap is huge between what vendors offer and what most midsize hard-goods firms have actually implemented. Companies may have back office systems such as accounting, purchasing, and customer order management under control, but out in the warehouse there is still a tremendous opportunity for basic execution systems such as warehouse and transportation management. This should be good news for vendors of such systems, such as EXE, Manhattan Associates, Optum, and Catalyst, as well as the hundreds of niche vendors that offer point solutions in this space.

iSource has a complete report on the study.

Monday, January 13, 2003

ROI study is a huge embarrassment for i2. Nucleus Research has just published a study on the return on investment (ROI) for users of i2, a Tier I supply chain management vendor. It found that over half of the companies surveyed have not yet received a positive ROI from their investments in i2 software. The study, based on interviews with companies that i2 touts as references on its own Web site, is bad news for i2. Worse still, Nucleus is freely distributing the report, so anyone even thinking about i2 is going to hear about it, probably from i2's competitors.

The study found several factors contributing to negative ROI:
  1. Customers buying modules they will never use, due to i2 bundling of products.
  2. Software license fees based on i2's upfront estimates of the benefits the client will achieve, resulting in license costs that are simply too high.
  3. Steep learning curves and long training periods, with average training time of nearly two weeks per user, and super-users requiring up to two months of training.
  4. Consultants from i2 not sufficiently knowledgeable in their own products.
  5. Implementation taking longer than estimated in nearly 70 percent of the companies surveyed.
Adding to the embarrassment, the study found that i2's Web site overstates the benefits received by specific customers and that i2 continues to reference customers that are no longer using its products.

To be fair, many companies have been successful in implementing i2. And, i2 has been targeting the largest organizations with the most complex supply chain problems, a high-risk environment for any technology vendor. Furthermore, customers as well as vendors bear responsibility for the success of any software implementation. Nevertheless, it is impossible to escape the conclusion that in too many cases i2 has been over-promising and under-delivering. As a result, supply chain vendors, including i2, have already adjusted their strategies to simplify their applications, break up sales into more bite-sized pieces, and focus on more targeted solutions, such as supply chain event management.

The study itself is on the Nucleus Research website. Computerworld has an article with some interesting, if somewhat muted, reaction to the study from i2.

Saturday, January 11, 2003

E-learning ROI isn't a slam dunk. Earlier this year, I wrote about a study that found a strong return on investment from e-learning initiatives. I still believe that the business case for e-learning can be strong, but getting the payback takes more effort than vendors of e-learning solutions like to admit.

Achieving e-learning benefits.
First, companies should devote more effort to ensuring that employees actually use the system and learn something. Simply implementing Web-based training is not enough. E-learning is a very different experience from live classroom training. If this is not recognized, companies may replace classroom training with e-learning only to find that employees are not learning anything.

For example, Web-based training tends to shift responsibility for learning from the trainer to the trainee. E-learning simply requires more self-discipline on the part of the student. One study indicated that only 25% of students who start an e-learning course actually complete it. When Bob goes off to classroom training, he leaves his office and faces a live instructor who, if he or she is any good, holds his attention in the classroom. But when Bob stays at his desk and participates in Web-based training, it's too easy for him to answer the phone or read e-mails on the side. It's also too easy for his manager to pop in with a quick question, or ask him to defer his lesson in order to deal with some crisis. Of course, sometimes the Web-based content is simply boring. Therefore, CIOs and HR professionals must remind themselves that the goal of e-learning is not merely to save training costs, but to more effectively train the workforce. Those responsible for e-learning must continually assess how well learning objectives are being met.

Finding the sweet spot.
Second, decision makers must remind themselves that e-learning is just one element of a comprehensive employee development program. Web-based training is good for some things — it is not good for others.

For example, Web-based training often does not accommodate labs or hands-on exercises as well as classroom training does. I saw this first-hand about a year ago while working with a Fortune 50 technology firm that had already made a huge investment in e-learning. I was helping to manage a program to develop classroom material along with hands-on exercises involving this firm's products. Our charter was to conduct a series of instructor-led classes, refine the material, and then convert it for Web-based delivery. Although we could see how to translate the lecture material for the Web, the courseware authors (all highly experienced system engineers) could not imagine how the lab exercises, where most of the real learning occurs, could be delivered over the Web. Therefore, the best approach for many technical subjects would be a combination of Web-based training with in-person labs or practical exercises.

Web-based training, or distance learning, is not going to replace all classroom training. More likely, it will be useful for basic subjects that must be taught to a large pool of employees, such as new employee orientation, EEOC training, HIPAA compliance, or basic user training during an ERP or CRM rollout. It also may be useful to cover prerequisite subjects prior to classroom training, or to provide follow-up. Training professionals have already discovered that computer-based training is useful for measuring the effectiveness of any kind of training. Nevertheless, for some subject matter, classroom training is simply the better vehicle. The challenge for companies will be to find the best combination of training formats to develop employee skills most cost-effectively.

Per-user Pricing Can Be Costly.
Finally, buyers should check their assumptions on the cost side of the equation. E-learning solutions can be expensive. Vendors often price their solutions based on total employee headcount or total named users. But this assumes that a large percentage of the employee population will adopt Web-based training. Companies that have signed up for such deals often find that actual adoption, or use of the system, lags far behind the total number of seats the company has licensed.

For most other types of IT investments, lack of system usage is not a significant risk to the business case. For example, end users of a newly implemented transactional system, such as ERP, have little choice when it comes to using the new system. Because the end users can't do their jobs apart from the ERP system, assumptions regarding the user count in an ERP business case tend to be accurate. However, employees or entire departments can choose not to take advantage of an e-learning system. Therefore, when building a business case for an e-learning initiative, buyers should try to structure the deal to specify a conservative base number of users, with terms that allow additional users to be added on a per-user basis.
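To make the point concrete, here is a back-of-the-envelope illustration in Python. The figures are entirely hypothetical -- they are not drawn from any actual deal -- but they show how a modest per-seat price turns into a much higher cost per employee actually trained when adoption lags the licensed headcount.

    # Hypothetical e-learning license economics; every figure below is made up.
    employees_licensed = 5000        # total headcount the vendor prices against
    price_per_seat = 40              # annual fee per licensed seat (hypothetical)
    actual_users = 1200              # employees who actually take courses in year one

    total_cost = employees_licensed * price_per_seat
    cost_per_seat = total_cost / employees_licensed
    cost_per_actual_user = total_cost / actual_users

    print(f"Total annual cost:      ${total_cost:,}")
    print(f"Cost per licensed seat: ${cost_per_seat:,.2f}")
    print(f"Cost per actual user:   ${cost_per_actual_user:,.2f}")   # roughly four times the sticker price

In this example the effective cost per trained employee is more than four times the quoted per-seat price, which is exactly why a conservative base user count, with the ability to add users later, protects the business case.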

Vendors like to justify the cost of their e-learning systems by pointing to the huge savings in travel costs that will result if much of the live classroom training in central locations is replaced with distance learning. But, as noted, this benefit may be overstated. I believe that there is already a backlash developing against unreasonable expectations for e-learning.

I still believe that the business case for e-learning is strong. I also believe that we are still early in the adoption life cycle, as companies learn how to leverage the unique strengths of Web-based training. But ultimately, the case has to rest on much more than simply avoiding travel costs. It comes down to how much knowledge is effectively transferred and to what extent employee skills are actually enhanced. By focusing on these objectives, as well as the cost savings, executives can achieve a more reliable business case for e-learning.

E-Learning vendors.
For companies considering development of an e-learning capability, there are dozens of niche vendors offering solutions, some on a license basis, others on a hosted basis. Some of the current vendors include Click2Learn, Docent, Element K, GeoLearning, Intralearn, KnowledgePlanet, NETg, Pathlore, ReadyGo, Saba, Skillsoft, and Skillview Technologies. In addition, many of the enterprise application vendors, such as SAP, PeopleSoft, J.D. Edwards, Oracle, and Siebel have introduced e-learning capabilities as part of their suite of products.

Wednesday, January 01, 2003

Aberdeen: new poster child for sloppy research

Earlier this month, Aberdeen Group released a study that claims “the poster child for security glitches is no longer Microsoft; this label now belongs to open source and Linux software suppliers.” However, a closer look at Aberdeen’s research indicates that it may be more appropriate to focus the spotlight on Aberdeen itself.

Aberdeen found that:
“Open source software, commonly used in many versions of Linux, Unix, and network routing equipment, is now the major source of elevated security vulnerabilities for IT buyers. Security advisories for open source and Linux software accounted for 16 out of the 29 security advisories — about one of every two advisories — published for the first 10 months of 2002 by CERT (Computer Emergency Response Team). Keeping pace with Linux and open source software are traditional Unix-based software products, which have been affected by 16 of the 29 — about half of all — advisories to date during 2002. During this same time, vulnerabilities affecting Microsoft products numbered seven, or about one in four of all advisories.”

If true, this is a stunning turnabout. It is common knowledge that Microsoft has received much bad press over the security of its products, such as the deficiencies in IIS. Organizations with mission-critical security requirements have traditionally favored Unix, Linux, and open source products for systems exposed to the Internet. So, if Aberdeen’s analysis is correct, the trend has been reversed, with Microsoft’s efforts over the past year in “trustworthy computing” paying off to make Microsoft now more secure than Unix/Linux and open source software in general.

However, Aberdeen's analysis is faulty. Because CERT advisories are public information, it is a simple matter to look at the raw data behind Aberdeen’s conclusions and see where Aberdeen erred. David Kelsheimer, network services practice director for Strativa, assisted me in dissecting Aberdeen’s conclusions. Based on our analysis of the 2002 CERT advisories, we can summarize the problems with Aberdeen’s study as follows:

1. Aberdeen counts CERT advisories, ignoring multiple vulnerabilities per advisory. This is like counting the number of guests arriving at a party by counting the automobiles they come in, regardless of the number of passengers in each car. For example, CA-2002-09 describes 10 separate vulnerabilities in Microsoft’s IIS, but Aberdeen counts them as one advisory. Thus, by Aberdeen’s reckoning, in the first 10 months of 2002, 16 out of the 29 security advisories are for open source/Linux, and 7 out of 29 are for Microsoft.

However, when we count based on the number of vulnerabilities within the advisories, the score is 18 for open source/Linux and 24 for Microsoft. We categorize another 34 as “other” or cross-platform vulnerabilities, because they are difficult to attribute to Microsoft or Linux/open source. (A small sketch of the two counting methods appears after this list. E-mail me at the address in the right column if you would like a copy of our worksheet.)

2. CERT advisories are not an adequate sample. CERT itself has said as much, in response to Aberdeen’s study. CERT’s comments were reported in an InternetWeek article, which said,

“CERT believes Aberdeen drew too much from its numbers. The organization doesn't draw any conclusions from its advisories on the vulnerability of open-source software vs. Microsoft or any other seller of proprietary applications. Instead of comparisons, the group focuses on identifying and studying security problems it considers most serious based on CERT's own metrics. That covers about 20 percent of all known vulnerabilities, said Shawn Hernan, senior member of the CERT technical staff.”

If Aberdeen were interested in a more complete sample, it could have looked at the complete database of CERT vulnerabilities, which lists over 3,000 vulnerabilities for the first ten months of 2002. The fact that it didn’t is puzzling.

3. Comparing Microsoft with “open source/Linux” is not a fair comparison. As one correspondent to SecurityFocus pointed out,
“…to take a listing of vulnerabilities from CERT (not a comprehensive list by any means!) and say that Linux is less secure because there are more open source advisories is laughable. There are more types of open source software out there, than there are software packages from Microsoft. To attribute open source flaws to Linux is like blaming Microsoft for the holes in AOL Instant Messenger.”

4. Aberdeen fails to note other problems with the use of CERT data. First, CERT only reports vulnerabilities that are confirmed by the software developer. Because of the nature of open source, vulnerabilities tend to be reported and confirmed more transparently for open source products than for closed source products, such as those of Microsoft, which has the option of not disclosing vulnerabilities that it finds and patches itself in the next release. Second, CERT issues advisories only for those vulnerabilities with the potential for the widest impact on the Internet. As one correspondent to OSOpinion pointed out,
“Since a large percentage of Internet infrastructure is based on open source software such as BIND, sendmail, and Apache, it makes sense that security flaws in these products would be considered serious, while flaws in Microsoft products may not be counted because they have much less of an impact. A security flaw in Microsoft Word may be bad, but it does not have the potential to bring down much of the Internet or compromise the integrity of millions of dollars in e-commerce.”
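To make the counting issue in point 1 above concrete, here is a small Python sketch of the two methods. The advisory identifiers, platform tags, and counts below are hypothetical placeholders, not our actual worksheet of the 2002 CERT advisories; they simply show how counting advisories and counting the vulnerabilities within them can produce very different scores.

    # Toy illustration of the two counting methods; the data below is hypothetical,
    # not our actual categorization of the 2002 CERT advisories.
    advisories = [
        # (advisory id, platform category, number of vulnerabilities described)
        ("CA-2002-AA", "microsoft", 10),        # e.g., one advisory covering ten IIS flaws
        ("CA-2002-BB", "open_source_linux", 1),
        ("CA-2002-CC", "cross_platform", 3),
    ]

    def count_by_advisory(items):
        """Aberdeen's method: one tally per advisory, no matter how many flaws it covers."""
        totals = {}
        for _, platform, _ in items:
            totals[platform] = totals.get(platform, 0) + 1
        return totals

    def count_by_vulnerability(items):
        """Our method: tally every vulnerability described within each advisory."""
        totals = {}
        for _, platform, vuln_count in items:
            totals[platform] = totals.get(platform, 0) + vuln_count
        return totals

    print("By advisory:     ", count_by_advisory(advisories))
    print("By vulnerability:", count_by_vulnerability(advisories))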

Aberdeen’s failure to use CERT data properly can have only two explanations: either Aberdeen’s researchers did not realize the shortcomings of such data, or worse, they had a conclusion they wanted to reach and searched for data to support it. If Aberdeen’s research is this poor when based on public data, which can be independently verified, how can we trust its research when it develops the data itself and does not release it in raw form?

Thursday, December 19, 2002

QAD hooks up with IBM

Manufacturing Systems reports on QAD's deal with IBM to standardize QAD's eQ supply chain management product on IBM's database, middleware, and Web services products. According to QAD's press release, QAD's eQ applications will now be "pre-integrated with IBM's WebSphere Application Server and WebSphere MQ, and DB2 database software." (It should be noted that the deal does not involve QAD's flagship ERP system, MFG/PRO, which is built on the Progress development platform and therefore not easily rehosted on IBM's technology.)

Although Manufacturing Systems treats this as a ground-breaking alliance, J.D. Edwards (JDE) announced a similar deal with IBM in September, which I outlined in my post on Sep. 22. The contrast between IBM's strategy for enterprise applications and Microsoft's is striking. Microsoft is moving directly into the enterprise systems space, with its Microsoft Business Solutions group, whereas IBM is taking the partnership route, bundling its offerings with those of its ISV partners, such as QAD and J.D. Edwards.

Wednesday, December 18, 2002

The first time buyer’s mistake

I had an interesting discussion today with a software sales representative who described her efforts to sell a new enterprise system to a small company. Because the company is relatively unsophisticated, its best choice would be a simple system that meets its key needs -- but not too much more -- along with a local value-added reseller (VAR) that can provide support. But the prospect refuses to consider anything other than price. As a result, the prospect is considering a bargain basement system that does not have certain key features and offers only Web-based support. Furthermore, because the prospect views all software vendors as “used car salesmen,” the sales rep cannot convince him that he is heading for trouble.

Selecting a system solely on price is the classic first-time buyer’s mistake. First-time buyers often forget that price is only one factor in success. An enterprise system must meet key functional requirements of the business. It must be able to scale to support the anticipated number of users and transaction volume. It must not be too complicated for employees to learn or use. It must come with adequate support. It must operate with a certain level of reliability. When buyers ignore these other factors, they risk failure and end up spending more than if they had made a sensible choice to begin with. This lesson is so simple that it is almost common sense. But too often, common sense is not so common.