Desktop auditing is crucial to avoiding employer liability. Recently, my firm conducted an automated worldwide desktop audit for a major client. We have since received other inquiries and have seen a general increase in interest in this type of auditing service. The interest appears to be driven by two issues: 1) the need to comply with software licensing agreements, and 2) the need to avoid liability for employee downloading of copyrighted materials.
Executives generally understand their responsibilities under software licensing agreements, and most recognize, at least in the abstract, that they are liable when employees install pirated software or media content on company computers. Under the doctrine of vicarious liability in U.S. copyright law, an employer is liable for acts committed by its employees when those acts are within the scope of their employment duties. Therefore, most companies have policies in place regarding acceptable use of company computers.
But why have these needs suddenly become "front-burner" topics at so many companies? Although executives have known of these risks, until recently most did little about them because enforcement was lax. Software vendors would occasionally audit large firms to ensure they had purchased sufficient licenses, but it was nearly impossible to find a case in which a company was actually sued over an employee's downloading of copyrighted materials. The times are changing, however. Software vendors and the entertainment industry are both cracking down hard on non-compliance and piracy. The risk to companies, large and small, is real.
Software publishers stepping up audits
By some estimates, 24% of installed software in the U.S. is not licensed. According to the Software and Information Industry Association (SIIA), the worldwide cost of software piracy to vendors in 1999 was over $12.2 billion. With the software industry no longer enjoying the double-digit growth of the 1990s, that's a lot of money left on the table. Therefore, vendors are looking to increase revenues by enforcing license agreements and cracking down on pirated software. The Software Publishers Association (SPA), the Business Software Alliance (BSA), and the UK-based Federation Against Software Theft (FAST) are all ramping up enforcement activity as vendors such as Microsoft continue to focus on combating non-compliance.
This risk is not limited to large companies. From time to time in the U.S., the BSA announces a piracy truce for small and medium businesses, during which companies are encouraged to turn themselves in if they suspect they are harboring unlicensed or counterfeit software. Furthermore, the BSA, the SPA, and some software vendors, such as Microsoft, solicit tipsters to report suspected cases of software piracy anonymously. So even companies that want to be compliant could face an audit based on an anonymous tip from a disgruntled employee.
Entertainment industry cracking down on piracy
Media and entertainment companies likewise face loss of revenue due to illegal file-sharing. The industry has already shut down Napster, an early peer-to-peer file-sharing service, and is now turning its attention to other services, such as Kazaa and Morpheus, that are used to trade copyrighted movies, music, software, and games. Rights holders are aggressively targeting Web sites that offer media for download, as well as their Internet service providers. Just this week, the Recording Industry Association of America (RIAA) persuaded a federal judge to order Verizon to turn over the identity of a subscriber suspected of making available unauthorized copies of several hundred songs. The industry is also putting pressure on universities: the RIAA is currently sponsoring an educational program with more than 300 universities and claims a 55% drop in the number of sites on university servers offering illegal downloads.
After the RIAA gets file-sharing services, ISPs, and universities under control, there is only one other place where consumers can easily download illegal content: the workplace. Is there any doubt that corporations with high-speed Internet connections and large networks of desktop computers will be the next target of the entertainment industry?
Therefore, companies need to get desktops under control. To avoid software license non-compliance, companies need to periodically audit license usage. And, to avoid liability for employee actions, companies must demonstrate "reasonable effort" to deter misappropriation or theft of computer software and intellectual property. Efforts should include a clear company policy regarding desktop software and content, consequences for non-compliance, periodic desktop audits, and documentation of the results.
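To make the "periodic desktop audit" step concrete, here is a minimal sketch of the kind of inventory script such an audit might start from. It is an illustration only, assuming Windows desktops with Python available; a real audit tool would also capture license keys, run on a schedule, and reconcile results against purchase records.

```python
# Minimal desktop software inventory sketch (Windows, illustrative only).
import csv
import winreg

UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def read_value(key, name):
    """Return a registry value, or an empty string if it is absent."""
    try:
        return winreg.QueryValueEx(key, name)[0]
    except OSError:
        return ""

def installed_software():
    """Yield (name, version, publisher) for each registered application."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
        subkey_count = winreg.QueryInfoKey(root)[0]
        for i in range(subkey_count):
            with winreg.OpenKey(root, winreg.EnumKey(root, i)) as app:
                name = read_value(app, "DisplayName")
                if name:  # skip registry entries with no display name
                    yield (name,
                           read_value(app, "DisplayVersion"),
                           read_value(app, "Publisher"))

if __name__ == "__main__":
    # Writing the inventory to a file provides the "documentation of
    # the results" that a defensible audit trail requires.
    with open("inventory.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "version", "publisher"])
        writer.writerows(sorted(installed_software()))
```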
Tuesday, January 21, 2003
Blocking and tackling in the warehouse
A recent survey of supply chain managers and executives revealed that a surprising number of companies have yet to fully adopt transportation and warehouse management systems (TMS and WMS) and advanced planning and scheduling (APS) systems, and that these systems are only now becoming more broadly accepted.
The study found that only 15 percent of responding companies had fully implemented TMS, while another 10 percent were in the process of implementation. Adoption was somewhat better for WMS, with 35 percent having a WMS in place and another 10 percent in process of implementation. Still the study found that interest in such solutions is high, with an additional 24 percent considering TMS and an additional 35 percent considering WMS.
"We were surprised by how interested people are in those fundamentals — systems that help in the warehouse or that help in the transportation function, for instance," said Thomas Goldsby, assistant professor of marketing and logistics at Ohio State University, who was one of the study’s authors. He also said they were surprised that these systems, which he calls "blocking and tackling" technologies of logistics — had not seen broader adoption given the IT spending spree of the late 1990s and the widespread buzz about supply chain collaboration in recent years.
In my opinion, such findings are not surprising to anyone who has actually spent any time among midsize manufacturing and distribution firms. The gap is huge between what vendors offer and what most midsize hard-goods firms have actually implemented. Companies may have back office systems such as accounting, purchasing, and customer order management under control, but out in the warehouse there is still a tremendous opportunity for basic execution systems such as warehouse and transportation management. This should be good news for vendors of such systems, such as EXE, Manhattan Associates, Optum, and Catalyst, as well as the hundreds of niche vendors that offer point solutions in this space.
iSource has a complete report on the study.
Monday, January 13, 2003
ROI study is a huge embarrassment for i2. Nucleus Research has just published a study on the return on investment (ROI) for users of i2, a Tier I supply chain management vendor. It found that over half of the companies surveyed have not yet received a positive ROI from their investments in i2 software. The study, based on interviews with companies that i2 touts as references on its own Web site, is bad news for i2. Worse still, Nucleus is freely distributing the report, so anyone even thinking about i2 is going to hear about it, probably from i2's competitors.
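As a refresher on what "positive ROI" means here, the sketch below walks through the basic arithmetic. Every figure in it is hypothetical, invented for illustration; none of them come from the Nucleus report.

```python
# Hypothetical ROI arithmetic; these figures are NOT from the Nucleus study.
def roi(total_benefits, total_costs):
    """Classic ROI: net benefit as a fraction of total cost."""
    return (total_benefits - total_costs) / total_costs

upfront_cost = 10_000_000      # license plus implementation (hypothetical)
annual_benefit = 2_500_000     # realized savings per year (hypothetical)

for year in range(1, 6):
    print(f"Year {year}: ROI = {roi(annual_benefit * year, upfront_cost):+.0%}")
# A deal like this stays ROI-negative until year 4 -- which is how more
# than half of surveyed customers can report no positive return to date.
```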
The study found several factors contributing to negative ROI:
- Customers buying modules they will never use, due to i2 bundling of products.
- Software license cost based on upfront estimates by i2 of benefits that the client will achieve, resulting in license costs that are simply too high.
- Steep learning curves and long training periods, with average training time of nearly two weeks per user, and super-users requiring up to two months of training.
- Consultants from i2 not sufficiently knowledgeable in their own products.
- Implementation taking longer than estimated in nearly 70 percent of the companies surveyed.
To be fair, many companies have been successful in implementing i2. And, i2 has been targeting the largest organizations with the most complex supply chain problems, a high-risk environment for any technology vendor. Furthermore, customers as well as vendors bear responsibility for the success of any software implementation. Nevertheless, it is impossible to escape the conclusion that in too many cases i2 has been over-promising and under-delivering. As a result, supply chain vendors, including i2, have already adjusted their strategies to simplify their applications, break up sales into more bite-sized pieces, and to focus on more targeted solutions, such as supply chain event management.
The study itself is on the Nucleus Research website. Computerworld has an article with some interesting, if somewhat muted, reaction to the study from i2.
Saturday, January 11, 2003
E-learning ROI isn't a slam dunk. Earlier this year, I wrote about a study that found a strong return on investment from e-learning initiatives. I still believe that the business case for e-learning can be strong, but getting the payback takes more effort than vendors of e-learning solutions like to admit.
Achieving e-learning benefits.
First, companies should devote more effort to ensuring that employees actually use the system and learn something. Simply implementing Web-based training is not enough. E-learning is a whole different type of experience from live classroom training, and companies that fail to recognize this may replace classroom training with e-learning only to find that employees are not learning anything.
For example, Web-based training tends to shift responsibility for learning from the trainer to the trainee. E-learning simply requires more self-discipline on the part of the student. One study indicated that only 25% of students who start an e-learning course actually complete it. When Bob goes off to classroom training, he leaves his office and faces a live instructor who, if he or she is any good, holds his attention in the classroom. But when Bob stays at his desk and participates in Web-based training, it's too easy for him to answer the phone or read e-mails on the side. It's also too easy for his manager to pop in with a quick question, or ask him to defer his lesson in order to deal with some crisis. Of course, sometimes the Web-based content is simply boring. Therefore, CIOs and HR professionals must remind themselves that the goal of e-learning is not merely to save training costs, but to more effectively train the workforce. Those responsible for e-learning must continually assess how well learning objectives are being met.
Finding the sweet spot.
Second, decision makers must remind themselves that e-learning is just one element of a comprehensive employee development program. Web-based training is good for some things — it is not good for others.
For example, Web-based training often does not accommodate labs or hands-on exercises as well as classroom training does. I saw this first-hand about a year ago while working with a Fortune 50 technology firm that had already made a huge investment in e-learning. I was helping to manage a program to develop classroom material along with hands-on exercises involving this firm's products. Our charter was to conduct a series of instructor-led classes, refine the material, and then convert it for Web-based delivery. Although we could see how to translate the lecture material for the Web, the courseware authors (all highly experienced system engineers) could not imagine how the lab exercises, where most of the real learning occurs, could be delivered over the Web. Therefore, the best approach for many technical subjects would be a combination of Web-based training with in-person labs or practical exercises.
Web-based training, or distance learning, is not going to replace all classroom training. More likely, it will be useful for basic subjects that must be taught to a large pool of employees, such as new employee orientation, EEOC training, HIPAA compliance, or basic user training during an ERP or CRM rollout. It also may be useful to cover prerequisite subjects prior to classroom training, or to provide follow-up. Training professionals have already discovered that computer-based training is useful for measuring the effectiveness of any kind of training. Nevertheless, for some subject matter, classroom training is simply the better vehicle. The challenge for companies will be to find the best combination of training formats to develop employee skills most cost-effectively.
Per-user pricing can be costly.
Finally, buyers should check their assumptions on the cost side of the equation. E-learning solutions can be expensive. Vendors often price their solutions based on total employee headcount or total named users. But this assumes that a large percentage of the employee population will adopt Web-based training. Companies that have signed up for such deals often find that actual adoption, or use of the system, lags far behind the total number of seats the company has licensed.
Under-utilization is typically not a risk in the business case for other types of IT investments. For example, end users of a newly implemented transactional system, such as ERP, have little choice about using the new system. Because end users cannot do their jobs apart from the ERP system, the user-count assumptions in an ERP business case tend to be accurate. Employees or entire departments, however, can simply choose not to take advantage of an e-learning system. Therefore, when building a business case for an e-learning initiative, buyers should try to structure the deal around a conservative base number of users, with terms that allow additional users to be added on a per-user basis, as illustrated in the sketch below.
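Here is a hypothetical comparison of the two deal structures. Every price and adoption figure below is invented for illustration; the point is only how strongly the outcome depends on the adoption assumption.

```python
# Hypothetical comparison of two e-learning pricing structures.
# All prices and adoption figures are invented for illustration.
headcount = 5_000            # total employees
adoption = 0.30              # fraction who actually use the system

# Deal A: license every employee up front at a volume-discounted rate.
deal_a = headcount * 40      # $40 per seat per year

# Deal B: a conservative base of users, adding seats as adoption grows,
# at a higher unit price ($60) for the smaller commitment.
base_users = 1_000
actual_users = int(headcount * adoption)
deal_b = (base_users + max(0, actual_users - base_users)) * 60

print(f"Deal A ({headcount} licensed seats): ${deal_a:,}")
print(f"Deal B ({actual_users} actual users):  ${deal_b:,}")
# Deal A: $200,000 vs. Deal B: $90,000 -- at 30% adoption, the
# conservative structure wins despite a 50% higher per-seat price.
```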
Vendors like to justify the cost of their e-learning systems by pointing to the huge savings in travel costs that will result if much of the live classroom training in central locations is replaced with distance learning. But, as noted, this benefit may be overstated. I believe that there is already a backlash developing against unreasonable expectations for e-learning.
I still believe that the business case for e-learning is strong. I also believe that we are still early in the life cycle of adoption, as companies learn how to leverage the unique strengths of Web-based training. But ultimately, it has to be much more than simply avoiding travel costs. It comes down to how much knowledge is effectively transferred and to what extent employee skills are actually enhanced. By focusing on the objectives, as well as the cost savings, executives can achieve a more reliable business case for e-learning.
E-Learning vendors.
For companies considering development of an e-learning capability, there are dozens of niche vendors offering solutions, some on a license basis, others on a hosted basis. Some of the current vendors include Click2Learn, Docent, Element K, GeoLearning, Intralearn, KnowledgePlanet, NETg, Pathlore, ReadyGo, Saba, Skillsoft, and Skillview Technologies. In addition, many of the enterprise application vendors, such as SAP, PeopleSoft, J.D. Edwards, Oracle, and Siebel have introduced e-learning capabilities as part of their suite of products.
Wednesday, January 01, 2003
Aberdeen: new poster child for sloppy research
Earlier this month, Aberdeen Group released a study that claims “the poster child for security glitches is no longer Microsoft; this label now belongs to open source and Linux software suppliers.” However, a closer look at Aberdeen’s research indicates that it may be more appropriate to focus the spotlight on Aberdeen itself.
Aberdeen found that:
“Open source software, commonly used in many versions of Linux, Unix, and network routing equipment, is now the major source of elevated security vulnerabilities for IT buyers. Security advisories for open source and Linux software accounted for 16 out of the 29 security advisories — about one of every two advisories — published for the first 10 months of 2002 by CERT (Computer Emergency Response Team). Keeping pace with Linux and open source software are traditional Unix-based software products, which have been affected by 16 of the 29 — about half of all — advisories to date during 2002. During this same time, vulnerabilities affecting Microsoft products numbered seven, or about one in four of all advisories.”
If true, this is a stunning turnabout. It is common knowledge that Microsoft has received much bad press over the security of its products, such as the deficiencies in IIS. Organizations with mission-critical security requirements have traditionally deployed Unix, Linux, and other open source products for systems exposed to the Internet. So, if Aberdeen’s analysis is correct, the trend has reversed, with Microsoft’s efforts over the past year in “trustworthy computing” paying off to make Microsoft now more secure than Unix/Linux and open source in general.
However, Aberdeen's analysis is faulty. Because CERT advisories are public information, it is a simple matter to look at the raw data behind Aberdeen’s conclusions and see where Aberdeen erred. David Kelsheimer, network services practice director for Strativa, assisted me in dissecting Aberdeen’s conclusions. Based on our analysis of the 2002 CERT advisories, we can summarize the problems with Aberdeen’s study as follows:
1. Aberdeen counts CERT advisories, ignoring multiple vulnerabilities per advisory. This is like counting the number of guests arriving at a party by counting the automobiles they come in, regardless of the number of passengers in each car. For example, CA-2002-09 describes 10 separate vulnerabilities in Microsoft’s IIS, but Aberdeen counts them as one advisory. Thus, by Aberdeen’s reckoning, in the first 10 months of 2002, 16 out of the 29 security advisories are for open source/Linux, and 7 out of 29 are for Microsoft.
However, when we count based on the number of vulnerabilities within the advisories, the score is 18 for open source/Linux and 24 for Microsoft. We categorize another 34 as “other” or cross-platform vulnerabilities, because they are difficult to attribute to Microsoft or Linux/open source. (E-mail me at the address in the right column if you would like a copy of our worksheet.)
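The difference between the two counting methods is easy to demonstrate. The toy sketch below uses made-up records (not the 2002 CERT data) to show how per-advisory and per-vulnerability tallies diverge as soon as one advisory bundles many flaws:

```python
# Toy illustration of advisory counting vs. vulnerability counting.
# These records are invented; they are NOT the 2002 CERT data.
from collections import Counter

# (advisory id, platform, number of vulnerabilities it describes)
advisories = [
    ("CA-X-01", "microsoft", 10),   # one advisory bundling ten flaws
    ("CA-X-02", "open_source", 1),
    ("CA-X-03", "open_source", 1),
    ("CA-X-04", "microsoft", 1),
    ("CA-X-05", "other", 3),
]

by_advisory = Counter(platform for _, platform, _ in advisories)
by_vulnerability = Counter()
for _, platform, vuln_count in advisories:
    by_vulnerability[platform] += vuln_count

print("Counting advisories:     ", dict(by_advisory))
# {'microsoft': 2, 'open_source': 2, 'other': 1}
print("Counting vulnerabilities:", dict(by_vulnerability))
# {'microsoft': 11, 'open_source': 2, 'other': 3}
```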
2. CERT advisories are not an adequate sample. CERT itself has said as much, in response to Aberdeen’s study. CERT’s comments were reported in an InternetWeek article, which said,
“CERT believes Aberdeen drew too much from its numbers. The organization doesn't draw any conclusions from its advisories on the vulnerability of open-source software vs. Microsoft or any other seller of proprietary applications. Instead of comparisons, the group focuses on identifying and studying security problems it considers most serious based on CERT's own metrics. That covers about 20 percent of all known vulnerabilities, said Shawn Hernan, senior member of the CERT technical staff.”
If Aberdeen were interested in a more complete sample, it could have looked at the complete database of CERT vulnerabilities, which lists over 3,000 vulnerabilities for the first ten months of 2002. The fact that it didn’t is puzzling.
3. Comparing Microsoft with “open source/Linux” is not a fair comparison. As one correspondent to SecurityFocus pointed out,
“…to take a listing of vulnerabilities from CERT (not a comprehensive list by any means!) and say that Linux is less secure because there are more open source advisories is laughable. There are more types of open source software out there, than there are software packages from Microsoft. To attribute open source flaws to Linux is like blaming Microsoft for the holes in AOL Instant Messenger.”
4. Aberdeen fails to note other problems with the use of CERT data. For example, CERT only reports vulnerabilities that are confirmed by the software developer. Because of the nature of open source, vulnerabilities tend to be reported and confirmed more transparently for open source than for closed source products, such as those of Microsoft, which has the option of not disclosing vulnerabilities that it finds and patches itself in the next release. Second, CERT issues advisories only for those vulnerabilities with the potential for the widest impact on the Internet. As one correspondent to OSOpinion pointed out,
“Since a large percentage of Internet infrastructure is based on open source software such as BIND, sendmail, and Apache, it makes sense that security flaws in these products would be considered serious, while flaws in Microsoft products may not be counted because they have much less of an impact. A security flaw in Microsoft Word may be bad, but it does not have the potential to bring down much of the Internet or compromise the integrity of millions of dollars in e-commerce.”
Aberdeen’s failure to use CERT data properly can have only two explanations: either Aberdeen researchers did not realize the shortcomings of such data, or worse, they had a conclusion they wanted to reach and searched for data to help them reach it. If Aberdeen’s research is this poor when based on public data, which can be independently verified, how can we trust its research when it develops the data itself and does not release it in raw form?