
2004-08-30
Featured Article

Operating System Exploits of 2004

Tagged As: Hacking, Privacy, and Security

Nothing is perfect … but some things are further from perfect than others. Computer software, particularly the operating system, is among the most complex systems created by man. How does one analyze and troubleshoot a complicated system of components that has no physical properties? To further exacerbate the situation, consumers demand that the same code behave identically across a variety of architectures. Contemporary programmers are thus faced with forcing the same outcome from processors that represent number formats differently, handle stack and memory implementations differently, and even differ in the atomicity of opcode timing and hardware clock rates. Add to this a variety of hardware, technophobic consumer demand for simplicity through "wizard" configurations, and the need to support legacy software, and software companies are left to walk a narrow path between ideal product development and market pressure to ship against the competition.1

These factors alone do not account for today’s software flaws; modern programming practices themselves have much to do with it. When lines of code number in the hundreds of millions, it is imperative to divide the programming tasks across development teams. While good project management and tools such as data flow diagrams can certainly ease component integration, the system is still a collaboration of human efforts and bound to incorporate errors from its sheer scale. Reliance on ever-changing libraries, and the operational trust inherent in a layered approach, forces programmers to accept that their code probably will not operate exactly as planned should a dependency change down the road. As such, a more lax attitude takes hold, with blame shifted onto someone else’s code behind the excuse – "it worked before."

Lastly, code may be fundamentally sound and devoid of problems at the source level, but unless the compiler, assembler, and linker are absolutely reliable, all will be for naught at the final stage. As algorithms evolve, the compiler may not construct machine code as the original programmer intended. Variances are introduced through processor optimizations, memory optimizations, and lexical algorithms. Such variances may add points of failure in device drivers, potential overflows in parameter passing, or dependencies on memory locations, or mistakenly introduce timing issues into operations that were intended to be atomic. These problems are relatively easy to tackle on the academic side of the tracks but next to impossible to eliminate in the enormous projects of the private sector.

Hence, modern operating systems ship in an imperfect state. At the moment of production, the system was more than likely the "best" possible version prior to hackers discovering nuances in the code. The Internet’s growing sphere of influence accelerates the progression of these flaws from discovery to announcement to exploitation. As hacking "kits" abound, the timeline from discovery to exploitation has shortened dramatically, to the point that a system is likely to be compromised faster than it can be secured. Ongoing analysis by the SANS Internet Storm Center reports that, as of August 2004, the average Windows-based computer stands to survive approximately 20 minutes on the Internet before succumbing to vulnerabilities (see the graphic below from the SANS webpage). Global network insecurity will continue to grow as networks grow in speed and users continue to run unpatched, legacy systems.

Figure 1. 'Survival Time History' [on-line] (accessed 27AUG04) available from http://isc.sans.org/survivalhistory.php

The following links contain vulnerability scans produced by Nessus against operating systems in various states of configuration. The intent is to reveal the "out-of-box" vulnerabilities a modern computer exposes to the Internet. A scan was run against each operating system during the installation phase (after the networking stack was in place) and again after booting into the live system. Whenever possible, the configurations were left in as pure a default state as possible; most workstation installations consist merely of clicking "NEXT" until completion. For the server installations, the most likely server daemons were selected to show their default vulnerabilities. Following installation, no settings were changed within the operating system unless noted below. As a control, each operating system was installed into a VMware Workstation virtual machine to provide a common hardware platform. Furthermore, these installations were performed on a closed network to avoid taint from third-party exploitation.
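For readers who want to reproduce the service-discovery portion of this methodology without Nessus, a minimal TCP connect sweep conveys the idea. This is only a sketch, not the scan policy used for the results linked above; the target address is a placeholder for a lab virtual machine on a closed network.

    # Minimal TCP connect() sweep -- a crude stand-in for the service-discovery
    # phase of a Nessus or nmap scan. The target address is a placeholder.
    import socket

    TARGET = "192.168.56.101"   # hypothetical lab VM on a closed network
    PORTS = range(1, 1025)      # well-known ports only

    open_ports = []
    for port in PORTS:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(0.5)       # keep the sweep quick on a quiet LAN
        try:
            if s.connect_ex((TARGET, port)) == 0:
                open_ports.append(port)
        finally:
            s.close()

    print("Open ports:", open_ports)

A sweep like this only reports listening ports; Nessus additionally probes each service for known vulnerabilities, which is what the linked reports capture.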

It should be noted that these scans do not reveal locally exploitable or user-triggered vulnerabilities. For instance, race conditions within the system that can be exploited to gain local root access will not be tested, nor will bugs that require a user to open a malicious webpage or email be evaluated. These tests are designed to reveal what exploits are remotely available to malcontents seeking to compromise a computer from afar without any user involvement.

Windows 2000 Scans: There are intriguing elements to examine in the W2K installs. During the installation phase (and remaining afterwards), nmap revealed port 21 [FTP] as active. An FTP client will successfully connect to the computer and receive a 421 error code: "Service not available, remote server has closed connection." Furthermore, the operating system detection routine could not match the TCP/IP stack used during the installation to any known fingerprints. It is possible that the Microsoft installer utilizes a different network stack than the operating system being installed. (NOTE: For the Server editions of Windows, the FTP, DNS, DHCP, SNMP, and WINS services were added to the default list as they are commonly used services.)
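The 421 behavior is easy to confirm without a full FTP client. The short check below simply connects to port 21 and prints whatever response the installer's stack returns; the host address is a placeholder for the Windows 2000 machine under test.

    # Connect to the FTP port and print the first server response.
    # The address is a placeholder for the W2K system being installed.
    import socket

    HOST, PORT = "192.168.56.102", 21

    with socket.create_connection((HOST, PORT), timeout=5) as s:
        banner = s.recv(1024).decode(errors="replace")
        # Expected along the lines of:
        # 421 Service not available, remote server has closed connection
        print(banner.strip())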

Linux Scans: For the Linux installations, the FULL install option was selected for all distributions. This produces a common benchmark for the "total system" out-of-box vulnerabilities present in each release. Scans against Live CD style systems were made after booting was complete but prior to any system login.

Apple Scans: The test platform did not support OS X versions prior to 10.3.3, precluding tests of earlier releases. To evaluate Panther’s out-of-box exposure, the firewall was left off and all built-in services were enabled (Personal File Sharing, Windows Sharing, Personal Web Sharing, Remote Login, FTP, Apple Remote Desktop, Remote Apple Events, and Printer Sharing).

UNIX Scans: NetBSD installs by default with all services off; therefore, inetd was modified to enable every service in its default state to get a picture of the system’s security. No additional services were added or removed; the "#" comment delimiter was simply removed from all of the non-IPv6, built-in service entries in the /etc/inetd.conf file, as sketched below. FreeBSD’s installation process includes a step to enable services in inetd, so all "out-of-box" services offered by the installer were enabled for the security check. An additional install option set the default security level to moderate.
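The NetBSD edit described above was done by hand, but the following sketch shows the same idea in script form: strip the leading "#" from lines that look like service entries while leaving IPv6 entries and prose comments alone. The file path matches a stock NetBSD layout, and the entry-matching heuristic is illustrative only, since the set of built-in services varies between releases.

    # Sketch: uncomment the built-in, non-IPv6 service entries in /etc/inetd.conf.
    # Illustrative only -- the commented-out entries vary between NetBSD releases.
    INETD_CONF = "/etc/inetd.conf"

    with open(INETD_CONF) as conf:
        lines = conf.readlines()

    enabled = []
    for line in lines:
        candidate = line.lstrip("#")
        fields = candidate.split()
        # A service entry looks like: name  stream|dgram  proto  wait|nowait  user  program [args]
        is_entry = len(fields) >= 6 and fields[1] in ("stream", "dgram")
        is_ipv6 = len(fields) >= 3 and fields[2].endswith("6")
        enabled.append(candidate if is_entry and not is_ipv6 else line)

    with open(INETD_CONF, "w") as conf:
        conf.writelines(enabled)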

1 This article can also be found on Matthew Vea’s homepage as "Default Operating System Exploits" [on-line] (accessed 01DEC06) available from http://www.vnutz.com/content/exploit_in_box


