Time flies when it comes to tech.
And from dial-up to data scams, it’s crazy to think how much has happened in such a short period of time.
But looking back even further, this December marks the anniversary of a milestone event in computing: 125 years since Herman Hollerith founded The Tabulating Machine Company.
Hmmm, that name not ringing any bells?
Well, it was actually what would later become International Business Machines - or IBM.
So to mark this significant occasion, we’re hopping inside the DeLorean and punching a destination date for 1896; essentially, the birth year of modern computing.
But first, we need to set the scene with a little context.
Because early experiments with what would later become the computer were already happening long before Hollerith invented the tabulating machine.
Early attempts at crunching the numbers
In 1801, a French weaver and merchant named Joseph Marie Jacquard invented a device that would become crucial in developing the first computer. The Jacquard machine is a loom that uses punched cards to automatically weave fabric designs.
OK - but what’s the significance of that, you ask?
Well, early computers would actually use similar punch cards to tabulate results.
But across the Channel in 1822, an English mathematician called Charles Babbage was working on his own invention: the Difference Engine, the first steam-powered machine designed to calculate and compute tables of numbers. It looked like the future.
Until it wasn’t.
Funded by the British government, the project was ultimately deemed a complete failure.
But this appetite to find an easier way of crunching the numbers wasn’t going away.
In fact, the U.S. population had grown so large by 1880 that it would take over seven years to tabulate the Census results (according to the United States Census Bureau). The government had to find a faster way to get the job done. Thankfully, our man Herman was up to the task.
In 1890, he designed a punch card system to calculate the results. Accomplishing the task in just three years, Hollerith’s punch card system saved the government a whopping five million dollars.
It may have been big enough to take up an entire room, but it had finally happened: the first computer was born.
A techie timeline of computing events (1896 - 2021)
At that point - as is the case with technological advancements - things began to move lightning fast.
1896: Herman Hollerith incorporates the Tabulating Machine Company. The first innovation comes in this very year, when he introduces the Hollerith Integrating Tabulator. This can add numbers coded on punched cards - not just count the number of holes.
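To picture the difference, here’s a tiny Python sketch (purely illustrative - the card layout and the “household_size” field are our invention, not Hollerith’s actual format): a plain counting machine only tallies cards, while an integrating tabulator adds up the values the holes encode.

```python
# Toy punch-card tally - each "card" is a record whose holes encode digits.
cards = [
    {"household_size": 4},
    {"household_size": 2},
    {"household_size": 7},
]

card_count = len(cards)  # what a simple counting machine reported
total = sum(card["household_size"] for card in cards)  # what an integrating tabulator could add

print(card_count, total)  # 3 13
```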
1924: The Tabulating Machine Company - by then part of the Computing-Tabulating-Recording Company (CTR) - outgrows its roots and becomes IBM (International Business Machines Corporation).
1936: Alan Turing invents the Turing machine. Basically, it’s a theoretical system capable of computing anything that’s computable. Crucially, the central concept of the modern computer is based on his ideas.
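If you fancy seeing the idea in action, here’s a minimal sketch of a Turing-style machine in Python (our own illustrative rules and naming, not Turing’s original notation). A small table of rules reads and writes symbols on a tape - and that alone is enough to, say, add one to a binary number.

```python
# A minimal Turing machine simulator: a rule table, a tape, a head, a state.
def run_turing_machine(rules, tape, head, state, blank="_"):
    cells = dict(enumerate(tape))  # the tape, stored sparsely
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# (state, symbol read) -> (symbol to write, head move, next state)
# These rules increment a binary number, starting from its rightmost digit.
increment_rules = {
    ("inc", "1"): ("0", -1, "inc"),  # 1 plus carry is 0; carry moves left
    ("inc", "0"): ("1", 0, "halt"),  # 0 plus carry is 1; done
    ("inc", "_"): ("1", 0, "halt"),  # ran off the left edge; new leading 1
}

print(run_turing_machine(increment_rules, "1011", head=3, state="inc"))  # 1100
```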
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts. Remember that name as he’ll come up again...
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California, garage (according to the Computer History Museum).
1941: Atanasoff has a milestone moment in computing. Along with his graduate student Clifford Berry, he designs a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory. Told you he’d be back.
1943: Another historical landmark is reached when a couple of University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Filling a 20-foot by 40-foot room and made up of 18,000 vacuum tubes, it’s considered the granddaddy of digital computers.
1946: Computers go commercial when Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC. This is the first commercial computer for business and government applications.
1947: Adios, vacuum tubes. This is the year William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor: an electric switch made of solid materials that needs no vacuum.
1953: Another mega moment in tech, as Grace Hopper develops the first computer language. This would eventually be known as COBOL.
1958: Chips ahoy! Jack Kilby and Robert Noyce unveil the integrated circuit - or the computer chip.
1964: Computers go mainstream. Douglas Engelbart reveals a prototype of the modern computer - complete with mouse and graphical user interface (GUI). No longer reserved for scientists and mathematicians, computers start to become accessible to the public.
1971: Sharing’s caring. Alan Shugart and his team of IBM engineers invent the "floppy disk". Data can now be easily shared among computers.
1973: Robert Metcalfe of Xerox develops Ethernet. Now multiple computers and other hardware can be connected to one another.
1975: Another tech titan is born. Paul Allen and Bill Gates form their own software company: Microsoft.
1976: Not to be outdone, Steve Jobs and Steve Wozniak launch Apple Computer when they release the Apple I. According to Stanford University, it was the first computer with a single-circuit board.
1979: Word processing becomes a reality as MicroPro International releases WordStar.
1981: The first IBM personal computer is introduced. It uses Microsoft's MS-DOS operating system, has an Intel chip, two floppy disk drives and an optional colour monitor. It’s the first time a computer is available through outside distributors, and it popularises the term PC.
1983: Apple's Lisa is the first personal computer with a GUI, drop-down menus and icons. Unfortunately, it doesn’t do well. However, it does eventually evolve into the Macintosh. Meanwhile, the Gavilan SC is the first portable computer to be marketed as a "laptop."
1985: Microsoft announces Windows, according to Encyclopedia Britannica. Apparently, it was the company's response to Apple's GUI. The first dot-com domain name is also registered: a small Massachusetts computer manufacturer called The Symbolics Computer Company registers Symbolics.com.
1990: A researcher at CERN named Tim Berners-Lee develops HyperText Markup Language (HTML). This innovation gives rise to the World Wide Web.
1991: Blockchain technology is first conceived by two researchers, Stuart Haber and W. Scott Stornetta. The idea: a system where document timestamps cannot be tampered with.
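In modern terms, their trick is a hash chain. Here’s a minimal Python sketch (an illustration of the general idea, not Haber and Stornetta’s actual scheme): every timestamp entry bakes in the hash of the entry before it, so quietly rewriting an old record breaks every hash that follows.

```python
import hashlib
import json
import time

def timestamp_document(chain, document_text):
    """Append a tamper-evident timestamp entry to the chain."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "doc_digest": hashlib.sha256(document_text.encode()).hexdigest(),
        "timestamp": time.time(),
        "prev_hash": prev_hash,  # links this entry to the previous one
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)
    return entry

chain = []
timestamp_document(chain, "First draft of the contract")
timestamp_document(chain, "Signed final version")
# Altering the first document now invalidates the second entry's prev_hash.
```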
1992: Dial-up internet access is first offered commercially. Pipex supplies the UK’s connection, whilst Sprint provides it in the United States.
1993: The Pentium microprocessor pushes forward the capabilities of graphics and music on PCs. This would have huge ramifications for gamers the following year.
1994: PCs become gaming machines. Games like Command & Conquer and Theme Park are just a couple of the now-iconic titles released this year.
1996: The Google search engine is developed at Stanford University by Sergey Brin and Larry Page.
1997: The first mobile app appears when the Nokia 6110 launches with a built-in version of Snake.
1999: No strings attached. Wi-Fi becomes part of the computing language as users start connecting to the Internet without wires.
2000: EA releases The Sims on February 4th; it goes on to become the best-selling PC game in history. Oh yeah, and computers keep on working, as the millennium bug doesn’t show up after all.
2002: The “digital information age” begins. This is the year the world’s total digitised information exceeds its traditional analogue information.
2003: Apple opens the iTunes store on April 28th.
2004: Mozilla's Firefox challenges Microsoft's Internet Explorer, the world’s dominant web browser. Oh, and Facebook launches too.
2005: YouTube is founded and Google acquires Android.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features. Not only that, pseudonymous developer Satoshi Nakamoto creates Bitcoin, the first decentralised cryptocurrency.
2010: Apple changes the game again. This year sees the big unveil of the iPad, kickstarting the tablet computer craze.
2011: Google releases the Chromebook: a laptop running Google Chrome OS.
2015: Apple launches the Apple Watch and Microsoft releases Windows 10.
2016: Quantum leap… This year sees the birth of the first reprogrammable quantum computer - the first of its kind with the capacity to program new algorithms into its system.
2018: The year of the scandal. On January 3rd, information about the Meltdown and Spectre attacks is publicly released; these security flaws affect nearly all the world's computers and smartphones. Then, on March 17th, it’s revealed that Cambridge Analytica harvested 50 million Facebook profiles and used that data to help Donald Trump's election team.
2019: Into the fold. In February, the first folding smartphones are introduced to the world; the Samsung Galaxy Fold and Huawei Mate X are the first to be unveiled.
2020: Tech goes track and trace. In light of the pandemic, the NHS Covid-19 app is launched. Despite troubled beginnings, it’s downloaded more than 20.7 million times (according to the BBC).
2021: Back to life, back to (virtual) reality. English scientist Tim Berners-Lee sells an NFT of the original World Wide Web source code, and Facebook announces the Metaverse. In beta testing this year, a full launch is planned for 2023.
OK, with 125 years of computing history to cover, we’ve just scratched the surface here. Know any major techie events our timeline’s missing? Get in touch and let us know.