Windows Buffer Overflows: Bugs or Intended Backdoors?

by Doctor Electron

Nov 12, 2003, updated

The Great Buffer Overflows Mystery

Buffer overflow vulnerabilities, often reported by Microsoft Corporation as Windows software bugs, may be intended, built-in backdoors. A bug is a fault in software code which software makers desire to fix. A backdoor is software code that allows others access to a computer over a network connection.

Evidence strongly suggests that buffer overflow bug reports arise mainly, if not entirely, when backdoor code is discovered by outsiders. Then the outsider may be blamed or vilified as a hacker, cracker or attacker as part of a possible coverup of the existence of built-in backdoor facilities in Microsoft Windows.

Who are the insiders? Good question. It is plausible that insiders have long had knowledge of the built-in backdoor facilities in Windows. In addition, specific software tools may be available for insiders to gain access to computers as simply as clicking "Start".

One of the mysteries of our times is how it is possible to have buffer overflow bugs in Windows. Let us consider facts and try to use sound reasoning:

Windows Buffer Overflows As Software Bugs

Pro Source code consists of the statements, written by programmers, that form the basis of the distributed Windows software. How many hundreds of component modules and how many thousands or millions of source code statements are potentially involved in this issue? A lot.

So it may be said that the source code is so extensive, and the resulting system so complex, that buffer overflows are likely to occur as mistakes or unintended outcomes. It may be further said that such bugs are very difficult to identify and fix.

Is there any other fact or rationale that should be listed here as favoring the thesis that Windows buffer overflows are software bugs?

Con A buffer overflow occurs when software code (1) sets aside a fixed amount of memory as a place to hold data and (2) copies more data into that memory space than the fixed amount allows. As a result, whatever lies just after this fixed memory area is overwritten. That sounds like something even beginner programmers learn to avoid.

A backdoor for access to the computer is built into the source code when executable statements are placed just after the fixed-length buffer. If the excess data also contains executable instructions, then whenever the system tries to run the original statements, the overwritten instructions run instead.

Microsoft Corporation -- not malicious outsiders -- writes and ships to customers the software containing this covert backdoor access mechanism.

When data is copied into such a buffer, the code placed after it is likely to be called shortly afterward to further process that data.
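Here is a minimal, hypothetical C sketch of that mechanism. The structure layout and the names (process_record, handle_input, next_step) are invented for illustration and are not claimed to come from any Windows module; they simply stand in for "something executable placed right after a fixed-length buffer".

    /* Hypothetical illustration only.  A fixed-length buffer sits directly
     * in front of a pointer to the routine that will further process the
     * data.  An unchecked copy of oversized input overwrites that pointer,
     * so the next call through it runs whatever the input put there. */
    #include <stdio.h>
    #include <string.h>

    static void process_record(void)            /* the intended follow-up routine */
    {
        puts("processing record normally");
    }

    struct request {
        char buffer[16];                         /* fixed-length destination buffer */
        void (*next_step)(void);                 /* something executable placed after it */
    };

    static void handle_input(struct request *req, const char *data, size_t len)
    {
        memcpy(req->buffer, data, len);          /* no comparison with sizeof req->buffer */
        req->next_step();                        /* runs whatever the copy left behind */
    }

    int main(void)
    {
        struct request req;
        char oversized[sizeof(struct request)];  /* long enough to reach past the buffer */

        req.next_step = process_record;
        memset(oversized, 'A', sizeof oversized);
        handle_input(&req, oversized, sizeof oversized);
        /* On a typical machine the call above now faults or jumps to the
         * bytes the input supplied; control has left the original code. */
        return 0;
    }

If the oversized input carried working instructions and a pointer to them instead of filler bytes, that last call would execute them. That is the dispute in miniature.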

Is there hard evidence that certain buffer overflow bugs are really intended backdoors?

Exhibit A? It would be relatively easy to rank the program procedures in a module according to how often they are run, especially when the fixed-length buffer receives new data. Question: Are the most frequently run procedures the ones that have been placed exactly after the buffer? If so, we have circumstantial evidence of intent. Most programmers arrange the order of procedures in source code thematically, alphabetically or by some other criterion, not necessarily by frequency of usage.

Exhibit B? Select Windows modules written in the same time period and divide them into two groups -- (1) those that process data received from network connections (required for backdoor access) and (2) all others. Compare allowed buffer overflows per procedure in the two groups of modules. Is there a difference? If so, why?

In group #1, backdoor access requires data from the network -- data from the party seeking access. In group #2, we are processing boring data in humdrum work and do not want buffer overflows that would cause errors or system failure. Customers might think the software is no good if buffer overflows were allowed in group #2.

Back in network group #1, data whose length exceeds the destination buffer size is probably faulty, and in the ordinary case it contains no malicious code, so the worst that could happen is a program error or system failure. Besides, the backdoor designers made the buffers long enough to avoid this in normal use. However, they also knew exactly where to place the backdoor "setup code" in the input supplied by users of the unauthorized-access backdoor.

Exhibit C? Take the network vs. non-network origin of the source data, as in Exhibit B, and cross-tabulate it with what sits after the overflowed buffer: executable code vs. other data. If buffer overflows are truly bugs -- mistakes or oversights by programmers -- they should be randomly distributed across this cross-tabulation, as across the comparisons in Exhibits A and B. On the other hand, an association of network-origin data with code-after-buffer makes a strong case for intentional backdoor functionality.

Exhibit D? The programmers writing the backdoors. Names and serial numbers, please. When one ferry crashes into a pier in New York, detailed investigation of who, what, when and where takes place. Exactly who was where at what time and did what. The backdoors are the cyber equivalent of a whole fleet of ferries crashing into piers. Strange how authorities want to question everyone to learn who was steering the ill-fated ferry but no one wants to identify and question the programmers writing backdoors into Microsoft Windows. This contrast is even more remarkable since the ferry tragedy appears to be an accident, but Windows backdoors may have been used to commit major crimes.

The lack of investigation of the cause of Windows backdoors is stunning. The computer security community and officials in government and the private sector appear to be out to lunch on this one. No questions asked. No subpoenas to Microsoft programmers. No study of Microsoft work logs. No hearings. No special commissions. No blood tests. No lie detector tests requested or denied. No "Microsoft Staff Interviewed by FBI" headlines. Nothing.

And when computer technology luminaries are quoted in the major media, the backdoors are described with the same old "bugs are lamentable" refrain. One might think that people responsible for Windows backdoors would be investigated down to every single person they have been associated with since the day they were born. The "It's a pity" crowd appears to be unconcerned with the security of government and private data.

In summary, Exhibits A, B, C and D are examples of analysis of source code that could provide evidence of intent which might favor either the bug theory or the backdoor theory. Various governments reportedly have this source code and can conduct their own independent analysis. On the other hand, maybe the entities that have the source code are among insiders engaged in using built-in Windows backdoor functionality for their own snooping.

The only problem in such a scheme is that outsiders discovered it and rained on the insiders' parade. It is almost comical. The outsiders thought they discovered something. Well, they did. But most likely, they simply duplicated the access that insiders had long used for routine clandestine entry into computers through this built-in Windows backdoor functionality.

Con Microsoft Corporation cannot plausibly be said to lack the personnel or resources to remove every backdoor vulnerability via buffer overflow in a very short time. It is the old "willing and able" test. Microsoft is "able" but, apparently, not "willing".

In fact, a simple program could quickly create a list of all source code statements prone to cause buffer overflows, given a list of source code files and a list of keywords. The program would scan every source code file and output the filename and line number of potentially every buffer overflow in Microsoft's entire source library.

Then, in a few days -- or do we have to say weeks? -- source code could be edited and recompiled. Is the editing extensive? Nope. About a half dozen additional bytes of instructions is all that is needed in most cases.

What keywords (or phrases) would the scanning program look for? Easy. Data is copied either with in-line code, which uses a finite list of instructions, or by a call to a routine that performs the copy operation; either way, the relevant statements can be found by name. All that needs to be done is truncate data that is too long. Instead of a buffer overflow, some other, less serious error will probably occur because the data "may not make sense" in its truncated form.
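A minimal C sketch of such a scanner follows. The keyword list (strcpy, strcat, sprintf, gets, memcpy) is only a common example of copy routines that take no destination size; it is an assumption for illustration, not a statement about which routines Microsoft's source code actually uses.

    /* Sketch of the scanning program described above: read the source files
     * named on the command line, flag every line mentioning a copy routine
     * that performs no destination-size check, and print the filename and
     * line number for human review.  The keyword list is illustrative only. */
    #include <stdio.h>
    #include <string.h>

    static const char *keywords[] = { "strcpy", "strcat", "sprintf", "gets", "memcpy", NULL };

    static void scan_file(const char *path)
    {
        FILE *fp = fopen(path, "r");
        char line[4096];
        long lineno = 0;

        if (fp == NULL) {
            fprintf(stderr, "cannot open %s\n", path);
            return;
        }
        while (fgets(line, sizeof line, fp) != NULL) {
            lineno++;
            for (int i = 0; keywords[i] != NULL; i++) {
                if (strstr(line, keywords[i]) != NULL) {
                    printf("%s:%ld: possible unchecked copy (%s)\n", path, lineno, keywords[i]);
                    break;
                }
            }
        }
        fclose(fp);
    }

    int main(int argc, char **argv)
    {
        for (int i = 1; i < argc; i++)           /* the list of source code files */
            scan_file(argv[i]);
        return 0;
    }

Each flagged site still needs a human eye, since not every hit is an overflow, but producing the list is an afternoon's work, not a research program.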

Con It cannot be overemphasized how easy this is to do. The in-line code or the data copy procedure needs to know only the size of the destination memory space (the buffer) and to do a simple comparison of that value with the size of the data to be copied. Data cannot be copied anywhere, in any circumstance, if its size is unknown. Thus it is a physical impossibility for a buffer overflow to occur (absent hardware error) if the sizes of the destination buffer and of the source data are both known when the copy operation is done.
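A sketch of that single comparison, assuming an ordinary C copy via memcpy (the function name copy_bounded is invented for illustration):

    /* The size comparison described above.  If the destination size and the
     * source length are both known at the call site, an overflow is
     * mechanically impossible: oversized input is truncated to fit. */
    #include <stdio.h>
    #include <string.h>

    static size_t copy_bounded(char *dst, size_t dst_size, const char *src, size_t src_len)
    {
        size_t n = (src_len < dst_size) ? src_len : dst_size;   /* the one comparison */
        memcpy(dst, src, n);                                     /* never writes past the buffer */
        return n;                             /* caller can tell whether data was truncated */
    }

    int main(void)
    {
        char buffer[8];
        const char *input = "far more bytes than the buffer can hold";
        size_t copied = copy_bounded(buffer, sizeof buffer, input, strlen(input));

        printf("copied %zu of %zu bytes; nothing beyond the buffer was touched\n",
               copied, strlen(input));
        return 0;
    }

Truncated data may still produce an error further along, as noted above, but it can no longer overwrite anything beyond the buffer.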

So how could a programmer not know the size of a destination buffer in the program or other software module? Does the buffer overflow bug thesis assume that the programmers at the largest software maker in the world are incompetent? Right, fat chance.

It is like saying that suicide bombers are really unlucky people who all coincidentally make the same mistake. Or like security and media analysts saying that Donald Trump does not know how to develop real estate. But many explosions and tall buildings indicate otherwise -- something is going on. Terrorism is real. Mr. Trump does know about building things. And it is all but certain that what is going on is not software bugs. How many years has Microsoft had to write a reliable data copy routine? Fact: they do have this capability.

The foregoing is so obvious to programmers that the "buffer overflow bug" smokescreen was required. This suggests that some mighty important people may have been using this apparent built-in backdoor functionality of Microsoft Windows.

In brief, a buffer overflow "bug" is plausibly either a physical impossibility or an intended event.

Windows Buffer Overflows As Built-in Backdoors

Pro If not knowing the source and destination data sizes is basically impossible, how could a buffer overflow bug exist? It seems that it cannot, unless it was intended to occur.

If buffer overflows were allowed indiscriminately, numerous malfunctions of application programs and of the Windows operating system would be expected. Thus, the software vendor has every motive to prevent buffer overflows. In this context, one must ask why specific overflows are allowed. The present thesis is that these are mechanisms to implement backdoor access to computers.

These must be limited in number to prevent certain software failures, yet sufficient in number to form a sort of backdoor cache. As backdoor mechanisms are discovered by outsiders, enough must remain "unpatched" for insiders to retain the ability to access computers. At least, so the theory goes.

The backdoor theory also explains why all buffer overflow mechanisms are not removed immediately by Microsoft. Instead, we have a weekly or monthly drip-drip of "new" (are they kidding?) buffer overflows and patches, where supposedly the insiders give up one or a few of their remaining backdoors.

The lack of decisive action suggests that certain buffer overflows may be allowed for specific purposes such as backdoor access. Since security analysts and hackers have put this activity into the "Game Over" category, look for other methods which may be employed to build backdoors into Windows.

The buffer overflow mechanism was an easy way to implement backdoor access, at least before outsiders blew the whistle. Still, reporters should be asking that the complete list of allowed backdoor mechanisms be published after they all are removed.

Con Any valid facts or rationales should be inserted here. Anybody?

We might mention again an argument against the built-in backdoor thesis -- namely, administrative and code review failure or programmer incompetence. Let's look again at how far-fetched this idea is.

First, Microsoft Windows may well find a place among the great achievements of the human intellect of all time. Yet the "Oops, there goes another bug" publicists want us to think that their work is sloppy or their workers are lousy. Not credible. If anybody falls short in the competence department, it is the technology news media, who nod their heads and report each "Oops, another buffer overflow" at face value.

Second, as far as the author can see, all requests by application programs to copy data into application memory space require the application to specify the size of the destination buffer. This ensures reliable and secure data copying. Do the "It's a bug" people expect us to accept that the same reliable and secure methods of data transfer are not built into the Windows operating system among its various modules? It may come down to how gullible one wants to be.
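A familiar example of that convention is the long-documented Win32 call GetWindowsDirectoryA, which takes the destination buffer and its size together and reports how much space was actually needed. The snippet below is only an illustration of the caller-supplies-the-size pattern, not of any internal Windows module.

    /* The caller-supplies-the-size convention of the public Win32 interface,
     * illustrated with GetWindowsDirectoryA.  The callee is told exactly how
     * large the destination is and reports how much room it needed, so it
     * has no reason ever to write past the end of the buffer. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        char path[MAX_PATH];
        UINT needed = GetWindowsDirectoryA(path, sizeof path);  /* buffer and its size, together */

        if (needed == 0)
            printf("call failed\n");
        else if (needed >= sizeof path)
            printf("buffer too small: %u bytes required\n", needed);
        else
            printf("Windows directory: %s\n", path);
        return 0;
    }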

Third, the backdoor access functionality built into Windows is very clever -- far from the work of incompetents. If one looks for "Trojan horse" backdoor code, it will not be found. Rather, the overflow data contains a brief "setup and launch" procedure that is not permanently stored anywhere. So where is the evidence? Just clever? Let's call that ingenious.

Unlike a propagating virus, which may create files to relaunch itself when the computer reboots, this snooping mechanism runs entirely from memory: the initial setup code fetches the rest of its instructions over the network. No files need to be created, since access is already assured to the snooper. Conventional Trojan remote-access servers, which might be detected, are not required either.

Yes, ingenious. Since the remote snooping code is refreshed from the network with each use, it can be polished independently. There might be a huge library of snooping tools by now, and how much revenue might Microsoft earn from such for-insiders-only products? Meanwhile, even security analysts buy the "bug" story.

It is sad to think that an analyst at a computer security firm recently seems to have lost his job for writing a paper that apparently criticized Microsoft for sloppy work. While all signs point to built-in backdoor functionality, he, like many others, seems to have completely swallowed the "another bug" illusion. So whose work is sloppy?

If there is any evidence at all of sloppy work by Microsoft in this area, it should be presented. From technology to public relations, the work seems to be of uniformly high quality in selling exposed Windows backdoors as software bugs. And the attention to detail is commendable -- such as training courses for staff on writing secure software, and dutiful expressions of measured embarrassment while, according to the present thesis, bug-free Windows features enabling clandestine remote access are repainted as buffer overflow bugs.

Incidentally, such very intelligent people no doubt began work on alternative backdoor access methods as soon as, if not before, the first "bug report" by outsiders appeared.

The Big Buffer Overflow Scam

So what is really happening behind the scenes in what we might call "the big buffer overflow scam"? If the real problem cannot plausibly be buffer overflows, then what are the bugs, if any? Are we to believe a programmer doesn't know the size of the program's buffer? The demonstrated fact is that the built-in backdoors were already debugged.

The great buffer overflows mystery may now be solved with this headline on another Net Census exclusive:

"Long History of Built-in Backdoors in Microsoft Windows"

Does the backdoor theory assert that Microsoft Corporation has engaged in any wrongdoing? Absolutely not. A good theory is just the best explanation of all known facts. Indeed, apparent Windows backdoors may have been a lucrative source of revenue, and stockholders might be quite happy. Talk of wrongdoing requires investigation of the cause and effects of Windows backdoors.

Does the backdoor theory assert that computer security analysts are gullible in apparently accepting the bug theory in spite of many contradictions enumerated above? And how about national security officials, corporate security officials, the technology media and privacy advocates? Answers may be found once these players get facts and sound reasoning in the same place at the same time.

The "We goofed again" bug theory is not supported by evidence. All we have is that people say it. The backdoor theory is consistent with all the facts.

Finally, Microsoft may owe thanks to the outsiders who first demonstrated the backdoors for helping create and foster the software bug theory as a workable coverup for what appears to be Windows backdoors by design.

Copyright © 2003 Global Services
Original publication: Oct 16, 2003
