Y2K - An Uncertain Time

The British financial magazine The Economist (Sept. 19, 1998) published a thorough article on Y2K. The opening piece stated: "The millennium computer bug is totally predictable in its timing, but completely unpredictable in its effects. Its greatest danger lies in that uncertainty."
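For readers wondering what the "bug" actually is, here is a minimal sketch (mine, not The Economist's) of the classic two-digit-year problem: legacy systems that stored only the last two digits of the year computed nonsense once the calendar rolled past 1999.

```python
# Sketch of the classic two-digit-year bug (illustrative only, not drawn
# from any particular system). With only two digits, the century is lost.

def years_elapsed(start_yy, end_yy):
    # Legacy-style subtraction on two-digit years.
    return end_yy - start_yy

# An account opened in 1998 ("98") and checked in 2000 ("00"):
print(years_elapsed(98, 0))   # prints -98, not the correct 2
```

Interest calculations, expiration checks, and sorting all go wrong the same way, which is why the failure date is so predictable even though the consequences are not.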

This past month has seen the emergence of a number of high-profile surveys, studies, reports, and news items, which unfortunately have been under-reported in the mainstream press due to Monica-Gate. Merrill Lynch issued a 440-page report based on a survey of its various domestic and overseas offices and clients. The objective of the survey was to determine the economic impact of Y2K on international companies as well as on global financial markets. Most of the report was guardedly positive, with exceptions such as the likely failure of Brazil's telephone systems. However, within two weeks of the report's release, Merrill Lynch announced that it had underestimated its own cost of bringing its systems into compliance and increased its Y2K budget by an additional $100 million. Uncertainty strikes again!

The Gartner Group, a well-respected major IT consulting firm, reported that of the 15,000 companies and government agencies it surveyed, 23% had not even started a millennium project, and most of those (80%) were small-to-medium enterprises. Only 11% of those surveyed had started to look at embedded chips. There are about 50 billion embedded chips lurking in systems throughout the world.

As reported in last month's Hard-Copy, the state of Pennsylvania has been in the forefront of Y2K assessment and remediation. So too has been the state of Washington. In the middle of September, the Seattle Times reported that of the state's 454 mission-critical systems, only 4% had been certified as Y2K-compliant. An interesting detail that accompanied this report was that the state had to re-define what was and wasn't mission-critical. Last summer the state identified more than 600 mission-critical systems; this summer it is down to 454.

The uncertainty on the Federal level is even greater. As an example, the head of the FAA made a number of pronouncements in the media that the FAA was now ahead of the power curve in Y2K remediation and testing. She was immediately criticized by the heads of the Senate and House Y2K subcommittees and by the GAO, which is charged with analyzing this problem, whereupon she announced that she appreciated her colleagues' constructive input. Then in the middle of September, Congressman Horn's Y2K subcommittee issued its report card on Federal agencies' Y2K status. In February the overall grade for the government was D-, in May the grade was F, and in August it was D. More telling than this general grade of all government agencies, however, is the budget creep. In February the estimated cost to the Federal government for Y2K was approximately $4.7 billion, with 6 major agencies in trouble; in May the budget rose to $5 billion, with 6 agencies still in trouble. Then in August the budget was raised to $6.3 billion, with 7 agencies in trouble, one of which was the Department of Transportation, which of course includes the FAA. Lest you think the FAA is the only agency misunderstanding or misreporting its real Y2K status, the GAO also came down hard on the Department of Defense for understating the scope and costs of the problem. How are we doing on uncertainty?

Capers Jones is widely recognized as a leader in the field of Information Technology research and analysis, and he is the author of numerous IT textbooks. His current estimate of the cost of Y2K over the next 10 years is approximately $3.5 trillion, which includes $1.5 trillion in consequential litigation costs. The table below helps place this figure in historical perspective.

EVENT                                    Estimated Cost ($ billions)
World War II
Capers Jones 10-yr. estimated costs      3,500
Vietnam War
Kobe, Japan earthquake
Los Angeles earthquake

Furthermore, he points out that in the 50-year history of the computer industry, there has never been defect-free software. In fact, he says, "It is naive and risky to assume that 100% of Year 2000 errors will be found and repaired, since the U.S. average for other bugs is only about 85% defect removal efficiency and even best in class results are below 99% in efficiency." He goes on to say, "It would be a reasonable contingency plan to have emergency response teams available in every company and government agency to deal with the impact of undiscovered Year 2000 problems." How's that for uncertainty!
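Jones's efficiency figures are easy to make concrete. Here is a back-of-the-envelope sketch; the 10,000-defect portfolio is my own assumption for illustration, while the 85% and 99% efficiencies are the figures he cites.

```python
# How many Year 2000 defects survive repair, given a removal efficiency.
# The 10,000-defect portfolio is a made-up illustration; the 85% (U.S.
# average) and 99% (best in class) efficiencies are the figures Jones cites.

def residual_defects(total_defects, removal_efficiency):
    # Defects left over = total * (1 - efficiency).
    return total_defects * (1.0 - removal_efficiency)

print(round(residual_defects(10_000, 0.85)))  # U.S. average: 1500 remain
print(round(residual_defects(10_000, 0.99)))  # best in class: 100 remain
```

Even the best-in-class shop, by this arithmetic, would cross into the year 2000 with some defects still in place, which is exactly why Jones argues for standing emergency response teams.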

The number of news items that have recently come across my desk dealing with contingency planning has surprised even me. To name just two by way of illustration: the Canadian government recently announced that its military forces would begin coordinating contingency plans with provincial and local governments to ensure continuity of government services, an effort named Project Abacus; and a major conference on Y2K contingency planning was held in Houston, Texas, by the electrical power industry, with representatives of most of the major utilities, associated vendors, and service companies in attendance. Now what does contingency planning by the utility industry mean for us consumers and owners of PCs, I ask you?

Well, that's enough of the doom and gloom. So what is CCS doing about the time we have left? I'll leave you with the main points of our work thus far:

To stay on top of this important issue, check our regular Y2K updates at the CCS Y2K Main Page. The one thing you can be certain of is that CCS is taking its commitment to you, our members, seriously, and we will be keeping you fully informed.