I have compiled this history purely out of my personal interest in the subject, and I apologise for any omissions or mistakes in the documents. If you have any suggestions, comments, corrections or additions, please e-mail me: swhite@ox.compsoc.net.
I've re-organised the timelines by splitting everything into a series of smaller timelines. There's still some work to do in sorting out exactly what should go in each timeline, and quite a lot of updating that I want to do. Hopefully it's now much easier to find things, and people on slower connections can avoid downloading the entire timeline! The entire timeline is still available for those who want it.
First generation computers were normally based around wired circuits containing vacuum valves and used punched cards as the main (non-volatile) storage medium. Another general purpose computer of this era was 'ENIAC' (Electronic Numerical Integrator and Computer), completed in 1946. It was typical of first generation computers: it weighed 30 tonnes, contained around 18,000 electronic valves and consumed around 150 kW of electrical power. It was, however, capable of an impressive 5,000 additions a second.
The next major step in the history of computing was the invention of the transistor in 1947. This replaced the inefficient valves with a much smaller and more reliable component. Transistorised computers are normally referred to as 'Second Generation' and dominated the late 1950s and early 1960s. Despite using transistors and printed circuits, these computers were still bulky and strictly the domain of universities and governments.
The explosion in the use of computers began with 'Third Generation' computers. These relied on Jack St. Clair Kilby's invention - the integrated circuit or microchip; the first integrated circuit was produced in September 1958, but computers using them didn't begin to appear until 1963. While large 'mainframes' such as the I.B.M. 360 increased storage and processing capabilities further, the integrated circuit allowed the development of minicomputers that began to bring computing into many smaller businesses. Large scale integration of circuits led to the development of very small processing units; an early example of this is the processor used for analysing flight data in the US Navy's F-14A 'Tomcat' fighter jet. This processor was developed by Steve Geller, Ray Holt and a team from AiResearch and American Microsystems.
On November 15th, 1971, Intel released the world's first commercial microprocessor, the 4004. Fourth generation computers were developed, using a microprocessor to locate much of the computer's processing abilities on a single (small) chip. Coupled with one of Intel's inventions - the RAM chip (kilobits of memory on a single chip) - the microprocessor allowed fourth generation computers to be even smaller and faster than ever before. The 4004 was only capable of 60,000 instructions per second, but later processors (such as the 8086, on which all of Intel's processors for the IBM PC and compatibles are based) brought ever increasing speed and power to the computers. Supercomputers of the era were immensely powerful: the Cray-1, for example, could calculate 150 million floating point operations per second. The microprocessor allowed the development of microcomputers, personal computers that were small and cheap enough to be available to ordinary people. The first such personal computer was the MITS Altair 8800, released at the end of 1974, but it was followed by computers such as the Apple I and II, the Commodore PET and eventually the original IBM PC in 1981.
Although processing power and storage capacities have increased beyond all recognition since the 1970s, the underlying technology of LSI (large scale integration) and VLSI (very large scale integration) microchips has remained basically the same, so most of today's computers are still widely regarded as belonging to the fourth generation.
Many people wanted to put their ideas into the standards for communication between the computers that made up this network, so a system was devised for putting forward ideas. Basically you wrote your ideas in a paper called a 'Request for Comments' (RFC for short), and let everyone else read it. People commented on and improved your ideas in new RFCs. The first RFC (RFC0001) was written on April 7th, 1969. There are now well over 2000 RFCs, describing every aspect of how the internet functions.
The first Interface Message Processor (IMP) was plugged in and switched on in Len Kleinrock's lab at UCLA on 2nd September 1969, forming the beginning of a real network to test the ideas.
ARPAnet was opened to non-military users later in the 1970s, and early takers were the big universities - although at this stage it bore little resemblance to the internet we know today. International connections (i.e. outside America) started in 1973, but the internet was still just a way for computers to talk to each other and a testbed for research into networking; there was no World-Wide-Web and no email as we now know it.
It wasn't until the early to mid 1980s that the services we now use most on the internet started appearing. The concept of 'domain names' - things like 'microsoft.com' - and special 'Domain Name Servers' wasn't even introduced until 1984; before that all the computers were just addressed by their IP addresses (numbers). Most of the protocols used for email and other services appeared after this - although email itself had been around much longer, the way it was sent between institutions was less standardised.
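The job a Domain Name Server does - turning a human-readable name into the numeric IP address that computers were originally addressed by directly - can be sketched in a couple of lines of Python using the standard socket module. (This is just an illustration: 'localhost' is resolved locally, so no network connection or DNS server is actually involved.)

```python
import socket

# A resolver turns a human-readable name into a numeric IP address.
# Before DNS, computers had to be addressed by these numbers directly.
# 'localhost' is resolved locally, so this works without a network.
address = socket.gethostbyname("localhost")
print(address)  # the IPv4 loopback address, 127.0.0.1
```

Resolving a real domain name like 'microsoft.com' works the same way, but sends a query over the network to a Domain Name Server.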
The part of the internet most people are probably most familiar with is the World-Wide-Web. This is a collection of hyperlinked pages of information distributed over the internet via a network protocol called HTTP (HyperText Transfer Protocol). It was invented by Tim Berners-Lee in 1989. He was a physicist working at CERN, the European Particle Physics Laboratory, and wanted a way for physicists to share information about their research - the World-Wide-Web was his solution. So the web was started, although at this time it was text-only. Graphics came later with a browser called NCSA Mosaic. Microsoft's Internet Explorer was originally based on licensed Mosaic code, and Netscape was written by many of Mosaic's original developers.
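One reason HTTP spread so quickly is that it is strikingly simple: a request is just plain text sent to the server. Here is a rough Python sketch of the kind of GET request a browser builds - a request line, some headers, then a blank line. (The host name info.cern.ch, home of the first web site, is used purely as an illustration; no connection is actually made.)

```python
# Build a minimal HTTP/1.0-style GET request by hand, to show that
# the protocol is plain text: a request line, headers, then a blank
# line that terminates the headers.
def build_get_request(host, path):
    lines = [
        f"GET {path} HTTP/1.0",
        f"Host: {host}",
        "",  # blank line marks the end of the headers
        "",
    ]
    return "\r\n".join(lines)

request = build_get_request("info.cern.ch", "/")
print(request)
```

Sending that text down a socket to port 80 of a web server and reading the reply is, in essence, all a browser's networking layer does for each page.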
The graphical interface opened up the internet to novice users, and in 1993 its use exploded as people were able to 'dial in' to the internet using a home computer and a modem to ring up an 'Internet Service Provider' (ISP) and get a connection to this (now huge) network. Before this the only computers connected were at universities and other large organisations that could afford to lease cables between each other to carry the data - but now anyone could use the internet, and it evolved into the 'Information Superhighway' that we know and (possibly) love today.
This first version of Windows wasn't very powerful and so not very popular. Popularity increased with the release of Windows 2 in 1987. The first really popular version of Microsoft Windows was version 3.0, released in 1990. This benefited from the improved graphics available on PCs by this time, and also from the 80386 processor, whose virtual-8086 mode allowed it to run and multitask even older MS-DOS based software (Windows applications themselves were still multitasked cooperatively). This made it more efficient and more reliable when running more than one piece of software at a time. Windows 3 made the IBM PC a serious competitor to the Apple Mac. Various improvements - Windows 3.1 and Windows 3.11 - were released, although to the user they looked very similar to Windows 3.
Also available at a similar time to Windows 3 was IBM's OS/2 (originally written in partnership with Microsoft). This was followed by OS/2 Warp, a full 32-bit operating system designed exclusively for the 80386 and better processors. It came out several months before Windows 95 and boasted many similar features. Unfortunately IBM failed to market it successfully enough and it never caught on.
Windows 95 was released (no surprises there) in August 1995. Although it shared much code with Windows 3 and even MS-DOS, Windows 95 had two big advantages. First, it was an entire operating system: you no longer needed to buy MS-DOS and then install Windows on top of it. Second, it was specially written for 80386 and better processors and made 'full' use of their 32-bit facilities. In this respect Windows 95 moved closer to Windows NT.
Windows NT (New Technology) was developed alongside Windows for use on servers and in businesses. It is designed to be more reliable and secure than Windows 95, but as a trade-off it is less compatible with older MS-DOS based software (crucially for the home market, it won't run many video games).
1998 (June 25th) saw the release of Windows 98, which was very similar to Windows 95 except that it provided an improved file system, FAT32 (the file system controls the way data is stored on disks); the improvements allowed it to support disks larger than the 2 GB limit of the first release of Windows 95. Windows 98 also brought support for USB and AGP.
It was Microsoft's aim - with Windows 2000 - to merge the two separate versions of Windows (Windows 95/98 and Windows NT) into one product. This failed. Windows 2000 was based on Windows NT - internally it often refers to itself as 'NT 5' - but it boasts a slightly prettier interface and a more exciting name than previous members of the NT series. Because of the memory protection (which helps provide reliability and security for the NT series), Windows 2000 is unable to run some of the 'legacy software' (in particular games) that Windows 95 and 98 can. This resulted in the development of Windows Millennium Edition (ME), a new member of the 95/98 family.
The next major release of Microsoft Windows was 'Windows XP'; again Microsoft's intention was that this release would replace both of the older lines of Windows: NT/2000 and the 95/98/ME family. Windows XP is really a continuation of the Windows NT product line, but this time Microsoft were successful in selling it to the home users that traditionally bought 95, 98 or ME. This success was partially because of improved technology, but largely because a long time (6 years or more) had passed since the first release of Windows 95 (which marked the end of MS-DOS development). This meant that much of the 'legacy' MS-DOS/non-Windows software (in particular games) that caused problems under Windows 2000 had been re-written or replaced to work properly under Windows, and so was no longer such an issue for Windows XP.
All this leaves the question of the court case between Apple and Microsoft - the one Apple started in 1988 by suing Microsoft for copying the 'look and feel' of their operating system. The answer came on August 6th, 1997: after 18 months of losses by Apple, Microsoft helped 'bail' them out of serious financial trouble by buying $150 million of non-voting shares in the company. Microsoft had several political reasons for doing this, but one condition was that Apple had to drop this long-running court case.
It is also worth mentioning another windowing system, the 'X Window System'. This was developed at MIT, starting in 1984, for use on graphics workstations. Due largely to the availability of its source code, it has become the standard graphical interface on many Unix based systems - including most Linux distributions. Although the X Window System provides functionality for drawing and moving windows on the screen, and for providing a mouse cursor, it provides none of the user interface features (such as buttons, menus, window title bars and so on) that people expect. These are provided by other pieces of software: window managers, graphics toolkits, and the like. The most popular graphical desktop environments under Linux rely on the X Window System but provide all of the other features themselves, so presenting an integrated and uniform interface to the user. The most popular of these are KDE and GNOME. Solaris users have CDE, which provides similar functionality for their workstations.