Technology History

Technology runs our lives these days. Smartphones, tablets and computers – we can hardly seem to function without them. Technology has exploded into the market, and many people can no longer imagine life without it.

But what exactly is technology? We shall give many related definitions of technology on this site; indeed, one could say that answering this question is the primary goal of all the work presented here. Technology refers to tools and techniques that are used for solving problems, ranging from simple stone tools to the complex genetic engineering and information technology that have emerged since the 1980s.

Here on this site we shall focus on those particular classes of technology that involve computers and machines. Basically, we are concerned with the application and embedding of information technology into the real world (i.e. problem solving using data processing and data networking capabilities).

Put simply, we wish to place chips and connectivity into everything; that is, we wish to beneficially apply knowledge machines, AI, IoT, virtual reality, robotics and so on to all human activities (mental and physical).

Purpose Drives Technology

The term technology comes from the Greek word techne, meaning art and craft, and the word logos, meaning word and speech. It was first used to describe the applied arts, but it is now used to describe advancements and changes that affect the environment in which we humans operate. In this sense, technology refers to tools that magnify or extend human capability in one way or another.

All technologies are born out of purpose. Search engines, for example, were created to sort through the massive amounts of data online. With each upgrade, a new technology compounds existing technologies to create something better than what came before. Technologies feed – or are built – one on top of another, ad infinitum.

We end up with a mind-blowing technology such as the iPhone, which represents perhaps millions of technologies bundled together in the most sophisticated manner imaginable – a single technology that can be used for millions of different purposes.

But we are immersed in a vast number (and range) of other technologies that shape all of our lives profoundly (i.e. overtly and visibly, and/or invisibly), such as news sites, banking systems, shopping sites, social media platforms etc. And each technology assembly may – and often will – have quite dramatic implications for potentially millions of humans. Accordingly, we must ask ourselves whether each new technology is affecting each individual human beneficially. Are new technologies optimal and humane? And if not – how can we make them so, or prevent their development and deployment?

It is worth keeping this key question – the social impact of technology – in mind as we look over the history of information technology.

Brief History of the Computer / Internet

The computer was born not for entertainment or email but out of a need to solve number-crunching problems. A series of developments was instrumental in shaping the digital computers we know today; some important ones are listed below.

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers.

1890: Herman Hollerith designs a punch card system to calculate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer is based on his ideas (a small illustrative sketch of such a machine appears just after this timeline).

1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.

1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a garage in Palo Alto, California.

1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory.

1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials, with no need for a vacuum.

1953: Grace Hopper develops the first computer language compiler; her work eventually leads to COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip.

1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the “floppy disk,” allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack’s TRS-80 – affectionately known as the “Trash 80” – and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the “world’s first minicomputer kit to rival commercial models.”

1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fool’s Day and roll out the Apple I, the first computer with a single circuit board.

1977: Radio Shack’s initial production run of the TRS-80 is just 3,000 units. It sells like crazy. For the first time, non-geeks can write programs and make a computer do what they wish.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

1979: Word processing becomes a reality as MicroPro International releases WordStar.

1981: The first IBM personal computer, code-named “Acorn,” is introduced. It uses Microsoft’s MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor.

1983: Apple’s Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh.

1985: Microsoft announces Windows, its response to Apple’s GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.

1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of the Internet. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.

1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to mainframes.

1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1994: PCs become gaming machines as “Command & Conquer,” “Alone in the Dark 2,” “Theme Park,” “Magic Carpet,” “Descent” and “Little Big Adventure” are among the games to hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple’s court case against Microsoft in which it alleged that Microsoft copied the “look and feel” of its operating system.

1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.

2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.

2003: The first 64-bit processor, AMD’s Athlon 64, becomes available to the consumer market.

2004: Mozilla’s Firefox 1.0 challenges Microsoft’s Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.

2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo’s Wii game console hits the market.

2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.

2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.

2012: Facebook reaches 1 billion users on October 4.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer is created.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new “Molecular Informatics” program that uses molecules as computers. “Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing.”

2017: AlphaGo, an AI program developed by DeepMind Technologies (later acquired by Google) that uses deep learning to play the board game Go, beats Ke Jie, the world’s No. 1 ranked human player at the time.
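
To make the 1936 entry above a little more concrete, here is a minimal sketch of a Turing machine simulator in Python. It is purely illustrative: the machine, its states, symbols and rules are hypothetical choices for this example (a tiny machine that adds one to a binary number) and are not drawn from Turing's own work or from any source in the timeline.

# Minimal Turing machine simulator (illustrative sketch only).
# The example machine below adds one to a binary number on the tape,
# e.g. "1011" -> "1100". States, symbols and rules are hypothetical.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.
    rules maps (state, symbol) -> (new_state, new_symbol, move),
    where move is -1 (left), 0 (stay) or +1 (right). The machine
    halts in the state "halt" or when no rule applies."""
    cells = list(tape)
    head = len(cells) - 1                      # start at the rightmost symbol
    for _ in range(max_steps):
        symbol = cells[head] if 0 <= head < len(cells) else blank
        if state == "halt" or (state, symbol) not in rules:
            break
        state, new_symbol, move = rules[(state, symbol)]
        if head < 0:                           # grow the tape to the left
            cells.insert(0, blank)
            head = 0
        elif head >= len(cells):               # grow the tape to the right
            cells.append(blank)
        cells[head] = new_symbol
        head += move
    return "".join(cells).strip(blank)

# Rules for binary increment: turn trailing 1s into 0s while carrying,
# then write a 1 when a 0 (or the blank left edge) is reached.
increment_rules = {
    ("start", "1"): ("start", "0", -1),
    ("start", "0"): ("halt",  "1",  0),
    ("start", "_"): ("halt",  "1",  0),
}

print(run_turing_machine("1011", increment_rules))   # prints 1100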

Conclusion

It is surprising how far the field of computing has progressed in just over 200 years of history. Indeed, there is no area of human activity, or thought, that this IT revolution has not transformed. Doubtless the next 200 years will see even more incredible progress – and one could ask where, precisely, it is all headed. What will be the end result for humanity as a whole – and for ordinary humans in particular?

The answer is almost impossible to predict; but what we can say is that, barring a large-scale catastrophe of some kind (e.g. a world war, an environmental disaster), progress will continue unabated.

The premise of the present site is that we cannot just let technology develop as it may, but rather that we must carefully manage (or guide) developments in a direction that provides real benefits (and opportunities) for all.