Exchanging data electronically is a common method of transferring information among federal, state, and local governments; private sector organizations; and nations around the world. As computers play an ever-increasing role in our society, more information is being exchanged regularly. Federal agencies now depend on electronic data exchanges to execute programs and facilitate commerce.
This may be one of the biggest yet most frequently underestimated problems Y2K poses to our way of life: the failure of computers to communicate with each other because of corrupt, noncompliant data.
For example, an airline ticket purchased over the Internet passes through about seven levels of transactions before it is processed. If even one of those terminals is not compliant, the whole transaction fails.
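To make the chain-of-dependence point concrete, here is a toy sketch in Python. It is strictly hypothetical: the stage logic and the two-digit-year bug are invented for illustration, and no real reservation network works this way. The only point is that a single noncompliant stage in a serial chain is enough to break the whole transaction.

```python
from datetime import date

# Hypothetical sketch: a booking passes through several processing stages
# in sequence. Six stages parse the travel date correctly; one still uses
# a two-digit year and misreads 2000 as 1900.

def compliant_stage(yyyy_mm_dd: str) -> date:
    y, m, d = map(int, yyyy_mm_dd.split("-"))
    return date(y, m, d)

def noncompliant_stage(yyyy_mm_dd: str) -> date:
    y, m, d = map(int, yyyy_mm_dd.split("-"))
    return date(1900 + y % 100, m, d)   # drops the century: 2000 becomes 1900

stages = [compliant_stage] * 6 + [noncompliant_stage]   # seven "levels"

booking = "2000-03-15"
for i, stage in enumerate(stages, 1):
    seen = stage(booking)
    if seen < date(1999, 1, 1):          # travel date apparently in the past
        print(f"stage {i} rejects the booking: it read the date as {seen}")
        break
else:
    print("all seven stages agree; ticket issued")
```

Six of the seven stages do everything right, yet the transaction still fails at the last hop.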
The U.S. government is one giant network with mind-boggling interconnections between agencies and data interfaces. Social Security, after 8 years of working on y2k, managed to get most of its systems compliant. What is rarely mentioned by the spin-meisters is the fact that in order to deliver benefits:
The computers used to determine eligibility at the state level must be compliant, and they must be able to transfer uncorrupted data to the main Social Security computer systems.
Think of the data interchanges between GM and its 100,000 suppliers! Or Ford, Chrysler, Exxon... Or the hundreds of thousands of terminals that interact globally in our international banking system or currency markets where several trillion dollars in transactions take place every day!
The following is from Gary North's site. When he says, "It's systemic. It can't be fixed!" this is what he's talking about. I tend to agree.
On April 16, 1996, the Assistant Secretary of Defense in charge of y2k testified before a Congressional committee. He offered this warning:
"The management aspects associated with the Year 2000 are a real concern. With our global economy and the vast electronic exchange of information among our systems and databases, the timing of coordinated changes in date formats is critical. Much dialogue will need to occur in order to prevent a 'fix' in one system from causing another system to 'crash.' If a system fails to properly process information, the result could be the corruption of other databases, extending perhaps to databases in other government agencies or countries. Again, inaction is simply unacceptable; coordinated action is imperative."
He was saying that if a compliant computer sends compliant data in a compliant format to another computer, the transfer may crash that computer. Or the recipient computer may not recognize the compliant format. Either way, the system breaks down.
In a July 1998 report by the General Accounting Office of the U.S. Congress, we read: "In addition to using bridges, filters may be needed to screen and identify incoming noncompliant data to prevent it from corrupting data in the receiving system."
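What the GAO calls a bridge or filter might look something like the following minimal sketch. It assumes the exchange carries year fields as plain strings; the pivot-year windowing rule (00-29 mapped to the 2000s) is one common convention of the era, not something prescribed by the GAO report, and the record names are invented.

```python
# Minimal sketch of a "bridge/filter" on incoming records, assuming the
# exchange format carries a year as a string. Four-digit years pass
# through; two-digit years are windowed onto a century (a bridge);
# unreadable values are quarantined instead of corrupting the database.

PIVOT = 30  # assumed windowing rule: 00-29 -> 2000s, 30-99 -> 1900s

def bridge_year(year_text: str) -> int:
    if len(year_text) == 4:
        return int(year_text)                  # already compliant
    yy = int(year_text)
    return 2000 + yy if yy < PIVOT else 1900 + yy

accepted, quarantined = [], []
for record in [("INV-1001", "1999"), ("INV-1002", "00"), ("INV-1003", "19XX")]:
    ref, year_field = record
    try:
        accepted.append((ref, bridge_year(year_field)))
    except ValueError:
        quarantined.append(record)             # unreadable: keep it out

print("accepted:", accepted)       # [('INV-1001', 1999), ('INV-1002', 2000)]
print("quarantined:", quarantined) # [('INV-1003', '19XX')]
```

Note what the filter cannot do: it only catches data it can recognize as bad. A two-digit year that windows onto the wrong century sails straight through.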
On August 6, 1998, Joel Willemssen of the General Accounting Office testified before the Technology Subcommittee of the House Science Committee. He made the following observation:
"Examination of data exchanges is essential to every Year 2000 program. Even if an agency's--or company's--internal systems are Year 2000 compliant, unless external entities with which data are exchanged are likewise compliant, critical systems may fail. The first step is to inventory all data exchanges. Exchange partners, once inventoried, must be contacted; agreements must be reached as to what corrections must be made, by whom, and on what schedule; and requisite testing must be defined and performed to ensure that the corrections do, in fact, work."
This, in my view, is the biggest unsolvable problem of the y2k challenge. If a company somehow revises its computer systems' legacy code, tests it by parallel testing, does not crash its systems during the testing, and transports all of its old data to the newly compliant system, it faces a problem: it is part of a larger system. Computers transfer data to other computers. If the compliant computer imports data from a noncompliant computer, the noncompliant data will corrupt the compliant system's data. A company may have spent tens of millions on its repair, but once it imports noncompliant data, or extrapolations based on bad data, it has the y2k problem again.
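Here is a hypothetical illustration of that reinfection. The account record, the partner feed, and the late-fee rule are all invented; the point is only that the repaired system's own arithmetic is sound, yet it produces nonsense the moment it trusts a two-digit-year field from outside.

```python
from datetime import date

# The repaired system stores full four-digit years internally. A trading
# partner still exports "last payment year" as two digits, and its feed
# handler pads the value naively, turning 2000 into 1900.

def import_partner_year(two_digit: str) -> int:
    return int("19" + two_digit)      # noncompliant assumption: always 19xx

last_payment_year = import_partner_year("00")    # the partner meant 2000
today = date(2000, 1, 3)

years_since_payment = today.year - last_payment_year
late_fee = 25.00 * years_since_payment           # compliant code, bad input

print(years_since_payment)  # 100 -- the account looks a century overdue
print(late_fee)             # 2500.0 -- corrupt results now sit in the "compliant" system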
Understand, this is a strictly hypothetical problem. There is no compliant industry anywhere on earth. I am aware of no company in any industry that (1) has 10 million lines of code and (2) claims to be compliant. I argue that there is not going to be a compliant industry, one in which all the participants are compliant. But if there were one where half the participants were compliant -- and we will not see this -- the other half would pass bad data on to the others. And if the compliant half could somehow identify and block all data produced by noncompliant code, the industry would collapse. The data lockout would bankrupt the industry. Banking is the obvious example.
This has been denied by a few of my critics, though not many. These people are in y2k denial. Here is the assessment of Action 2000, which the British government has set up to warn businesses about y2k. The problem is not just software; faulty embedded chips/systems can transmit bad data:
"In the most serious situation, embedded systems can stop working entirely, sometimes shutting down equipment or making it unsafe or unreliable. Less obviously, they can produce false information, which can mislead other systems or human users."
In short, a noncompliant computer's data can corrupt a compliant computer's data. But those in charge of the compliant computer may not recognize this when it happens. They may then allow their supposedly compliant computer to spread the data to others. Like a virus, the bad data will reinfect the system. I describe this dilemma as "reinfection vs. quarantine."
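A small sketch of why the corruption can go unnoticed and then spread. The validation routine is hypothetical; the point is that a date shifted by a century is still a perfectly legal calendar date, so a format-level check waves it through to every downstream system.

```python
from datetime import date

def looks_valid(d: date) -> bool:
    # A format-level check only: is this a real calendar date in a sane range?
    return date(1900, 1, 1) <= d <= date(2099, 12, 31)

correct = date(2000, 2, 29)    # what the sender meant (2000 is a leap year)
corrupted = date(1900, 3, 1)   # what a noncompliant hop produced after
                               # dropping the century (1900 has no Feb 29)

downstream_feeds = []
for record in (correct, corrupted):
    if looks_valid(record):
        downstream_feeds.append(record)   # both pass; the bad one spreads

print(downstream_feeds)   # [datetime.date(2000, 2, 29), datetime.date(1900, 3, 1)]
```

Catching the corrupted record would require knowing what the value was supposed to be, which is exactly the knowledge the receiving system does not have. Hence the choice: accept the risk of reinfection, or quarantine the source entirely.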
Every organization that operates in an environment of other organizations' computers is part of a larger system. If it imports bad data from other computers in the overall network, the y2k problem reappears. But if it locks out all data from noncompliant sources, it must remove itself from the overall system until that system is compliant. This threatens the survival of the entire system. Only if most of the participants in a system are compliant will the system survive.
Consider banking. A bank that is taken out of the banking system for a month -- possibly a week -- will go bankrupt. But if it imports noncompliant data, it will go bankrupt. A banking system filled with banks that lock out each other is no longer a system.
There is no universally agreed-upon y2k compliance standard. There is also no sovereign authority possessing negative sanctions that can impose such a standard. Who can oversee the repairs, so that all of the participants in an interdependent system adopt a technical solution that is coherent with others in the system?
Corrupt data vs. no system: here is a major dilemma. Anyone who says that y2k is a solvable problem ought to be able to present a technically workable solution to this dilemma, as well as a politically acceptable way to persuade every organization on earth to adopt it and apply it in the time remaining, including all those that have started their repairs using conflicting standards and approaches.
Some people say that y2k is primarily a technical problem. Others say it is primarily a managerial problem. They are both wrong. It is primarily a systemic problem. To fix one component of a system is insufficient. Some agency in authority (none exists) must fix most of them. Those organizations whose computer systems are repaired must then avoid bankruptcy when those organizations whose systems are not compliant get locked out of the compliant system and go bankrupt.
If there is a solution to this dilemma, especially in banking, I do not see it.
My critics offer no solution. All they offer is this refrain: "North is not a programmer." But what has this to do with the entries in this category? Nothing at all.