The development of computing

Based on the binary system of 0s and 1s, the modern computer age began in the Second World War, with COLOSSUS in England, used for deciphering German military codes, and ENIAC in the United States. These machines were large and cumbersome: ENIAC was eight feet high and contained 17,000 vacuum tubes, 70,000 resistors, 10,000 capacitors, 1,500 relays and 6,000 switches.

Data processing with punched cards was developed by Herman Hollerith for the US population census of 1890. Along with punched tape, such cards were in widespread use for many years.

The ‘first generation’ of electronic digital computers spanned the 1950s. These computers stored their programs internally and initially used vacuum tubes as their switching technology. However, such tubes were bulky, hot and unreliable, and were gradually replaced by transistors. This in turn gave rise to the problem of interconnecting many thousands of simple circuits to form a system with sufficient computing power.

Since the 1950s, computers have developed at an ever-accelerating pace, with a massive increase in speed and power and a corresponding decrease in size and cost. The late 1960s saw the development of printed circuit boards, on which thin strips of copper were ‘printed’ to connect the transistors and other electronic components. This led to the all-important introduction of the integrated circuit: an assembly of thousands of transistors, resistors, capacitors and other devices, all interconnected electronically and packaged as a single functional item.

In the 1970s the first personal computers became available for use in the home and office. Communications were also transformed by the introduction of electronic mail and the internet; the Thailand stamp portrays King Bhumibol checking his e-mail.

[Bosnia and Herzegovina 2001; Ivory Coast 1972; Japan 1980; Marshall Islands 1999; Norway 1969; Switzerland 1970; Thailand 1997]


Published/edited: 14/03/2015