A Technical Timeline of Computing
Compiled in conjunction with Gemini
2700 BCE – 1000 CE
- 2700–2300 BCE: The Abacus (Sumer/Babylon) – The first "data storage" device. It let humans track and manipulate numbers too large to hold in working memory, speeding up calculation.
- 150 – 100 BCE: The Antikythera Mechanism (Greece) – The peak of ancient engineering. A mechanical analog computer used to predict eclipses and planetary motion with 30+ bronze gears.
- c. 800 – 1000 CE: The Astrolabe (Islamic Golden Age) – A highly sophisticated analog "GPS" and calculator used by astronomers and navigators to determine time and position from the stars.
1600s – 1800s
- 1642: The Pascaline – Blaise Pascal creates a mechanical calculator that can add and subtract using a series of notched wheels.
- 1673: Leibniz’s Stepped Reckoner – Gottfried Wilhelm Leibniz extends Pascal’s work to create a machine that can multiply and divide. Crucially, Leibniz is also the father of binary arithmetic (0 and 1), which he proposed as a way to represent all mathematical truths.
- 1801: The Jacquard Loom – Joseph Marie Jacquard uses punched cards to automate weaving patterns. This is the first instance of "binary" storage used to control a machine.
- 1837: The Analytical Engine – Charles Babbage combines the "Logic" of Leibniz with the "Punched Cards" of Jacquard to design the first general-purpose computer.
- 1843: Ada Lovelace writes the first algorithm for the Analytical Engine, becoming the world's first computer programmer.
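Leibniz's insight that any number can be written using only the digits 0 and 1 underlies all modern digital hardware. A minimal sketch in Python (the conversion routine is illustrative, not Leibniz's own notation):

```python
def to_binary(n: int) -> str:
    """Represent a non-negative integer using only the digits 0 and 1."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2                  # integer-divide to shift right
    return "".join(reversed(bits))

print(to_binary(13))  # 13 = 8 + 4 + 1 -> "1101"
```

Repeated division by two is the standard decimal-to-binary method; each remainder reveals one bit, from least significant to most.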
1930s – 1950s
- 1936: The Universal Turing Machine – Alan Turing provides the mathematical proof that a single machine could, in theory, perform any calculation if given the right instructions.
- 1943: Colossus – The world's first electronic digital programmable computer, built by British codebreakers to crack German ciphers during WWII.
- 1945: Von Neumann Architecture – John von Neumann describes the design for a computer where the program and the data are stored in the same memory—the blueprint for every computer you use today.
- 1945: ENIAC is completed (and publicly unveiled in 1946). It was the first electronic, general-purpose digital computer, weighing 30 tons and using roughly 18,000 vacuum tubes.
- 1947: The Transistor is invented at Bell Labs. This eventually replaced bulky vacuum tubes, allowing computers to become smaller, faster, and more reliable.
- 1952: Grace Hopper develops the A-0 system, the first compiler, paving the way for high-level languages like COBOL.
- 1958–1959: The Integrated Circuit (microchip) is invented independently by Jack Kilby (1958) and Robert Noyce (1959), enabling multiple electronic components to be placed on a single sliver of silicon.
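Turing's universal machine is just a tape, a read/write head, and a table of rules, yet it can compute anything a modern computer can. A toy sketch in Python (the rule-table format is the standard (state, symbol) -> (write, move, next state) convention; this particular machine, which inverts every bit and halts on a blank, is illustrative):

```python
def run_tm(tape: str, rules: dict, state: str = "scan", blank: str = "_") -> str:
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    cells = list(tape)
    pos = 0
    while state != "halt":
        sym = cells[pos] if pos < len(cells) else blank
        write, move, state = rules[(state, sym)]  # look up the rule
        if pos < len(cells):
            cells[pos] = write
        else:
            cells.append(write)                   # grow the tape on demand
        pos += 1 if move == "R" else -1
    return "".join(cells).rstrip(blank)

# One state, three rules: flip 0<->1, halt when a blank cell is reached.
rules = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}
print(run_tm("1011", rules))  # -> "0100"
```

Swapping in a different rule table changes what the machine computes without touching the interpreter, which is exactly Turing's point: one machine, fed the right instructions, can perform any calculation.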
1960s – 2010s
- 1964: IBM System/360 is launched. This was the first "family" of computers that allowed software to be shared across different models, revolutionizing business computing.
- 1971: Intel 4004 is released—the first commercial Microprocessor. The entire CPU was now on a single chip.
- 1973: Xerox Alto is developed. It was the first computer to use a Graphical User Interface (GUI) and a mouse, though it was never sold commercially.
- 1977: The "Trinity" of home computing—the Apple II, Commodore PET, and TRS-80—brings computing into the average home.
- 1984: The Apple Macintosh popularizes the GUI, making computers intuitive for non-technical users.
- 1991: Tim Berners-Lee releases the code for the World Wide Web, creating a user-friendly layer on top of the existing Internet.
- 1997: IBM Deep Blue defeats Garry Kasparov at chess, marking a massive milestone in Artificial Intelligence.
- 2007: The iPhone is released, shifting the computing paradigm from the desktop to "Mobile-First" and making the internet ubiquitous via smartphones.
- 2010s: The rise of Cloud Computing (AWS, Azure) allows massive computing power to be accessed on-demand over the web rather than on local hardware.
2020 – Present
- 2022–2023: Large Language Models (LLMs) like GPT-4 and Gemini mainstream Generative AI, changing how humans interact with machines through natural language.
- 2024–2025: Agentic AI and specialized NPU (Neural Processing Unit) hardware become standard, allowing computers to perform complex, multi-step tasks autonomously.
- Current Frontier: Quantum Computing reaches "Quantum Utility," where quantum processors begin solving specific scientific problems that are impossible for classical supercomputers.