The Machine That Changed the World: The World at Your Fingertips

Here’s the fifth and final episode of The Machine That Changed the World, this one focusing on global information networks including the Internet, and the communication benefits and privacy risks they create. This is the most familiar material of the documentary, so I’m going to skip the notes and annotations this time. I hope you enjoyed the documentary as much as I did.

And, as promised, here’s the BitTorrent file for high-resolution copies of all five videos. It’s a 3.1GB download containing five H.264-encoded MP4 files. (If you only want a single video, use your BitTorrent client to select just the files you need.) Enjoy!

(Previously: Part 1, Part 2, Part 3, Part 4.)

Interviews:

Robert Lucky (AT&T Bell Labs), Dave Hughes, Kathleen Bonner (Trader, Fidelity), George Hayter (Former Head of Trading, London Stock Exchange), Ben Bagdikian (UC Berkeley), Arthur Miller (Harvard Law School), Forman Brown (songwriter, died 1996), Tan Chin Nam (Chairman, National Computer Board of Singapore), B.G. Lee (Minister of Trade and Industry, Singapore), Lee Fook Wah (Assistant Traffic Manager, MRT Singapore), David Assouline (French activist, now a senator), Mitch Kapor (founder, Lotus), Michael Drennan (Air traffic controller, Dallas-Fort Worth)

The Machine That Changed the World: The Paperback Computer

The third episode of The Machine That Changed the World covers the development of the personal computer and the modern graphical user interface, which made computers easy for anyone to use. Highlights include interviews with Apple’s Steve Jobs and Steve Wozniak, drawing with a computer in 1963, great footage from Xerox PARC, and some 1992-era predictions of the future from Apple and others.

Continue reading “The Machine That Changed the World: The Paperback Computer”

The Machine That Changed the World: Inventing the Future

The first part of The Machine That Changed the World covered the earliest roots of computing, from Charles Babbage and Ada Lovelace in the 1800s to the first working computers of the 1940s. The second part, “Inventing the Future,” picks up the story of ENIAC’s creators as they embark on building the first commercial computer company in 1946, and ends with the moon landing in 1969 and the birth of Silicon Valley.

Notes:

Shortly after the war ended, ENIAC’s creators founded the first commercial computer company, the Eckert-Mauchly Computer Corporation, in 1946. The early history of the company’s funding and progress is told through interviews and personal home videos. They underestimated the cost and time needed to build UNIVAC I, their new computer for the US Census Bureau, quickly sending the company into financial trouble. Meanwhile, in London, the J. Lyons and Co. food empire teamed up with the EDSAC developers at Cambridge to build LEO, their own computer to manage inventory and payroll. It was a huge success, inspiring Lyons to start building computers for other companies.

The Eckert-Mauchly company was in trouble, with several high-profile Defense Department contracts withdrawn because of a mistaken belief that John Mauchly had Communist ties. After several attempts to save it, the company was sold to Remington Rand in 1950. Remington Rand, then focused on electric razors and business machines, gave UNIVAC its television debut by tabulating live returns during the 1952 presidential election. To CBS’s amazement, it accurately predicted an Eisenhower landslide with only 1% of the vote counted. UNIVAC soon made appearances in movies and cartoons, leading to more business.

IBM was late to enter the computing business, though it had built the massive SSEC in 1948 for scientific research. When the US Census Bureau ordered a UNIVAC, Thomas Watson, Jr. recognized the threat to IBM’s tabulating machine business. IBM introduced its first mass-produced commercial business computer, the IBM 650, in 1953. Though technologically inferior, it soon dominated the market thanks to IBM’s strong sales force, relative affordability, and integration with existing tabulating machines. In 1956, IBM soared past Remington Rand to become the largest computer company in the world. By 1960, IBM had captured 75% of the US computer market.

But developing software for these systems often cost several times as much as the hardware itself, because programming was so difficult and programmers were hard to find. FORTRAN was one of the first high-level languages, designed for scientists and mathematicians. It didn’t work well for business use, so COBOL soon followed. This led to wider adoption across industries, as software was developed that could automate human labor. “Automation” became a serious fear, as workers worried they’d lose their jobs to machines. Across the country, companies like Bank of America (with ERMA) were eliminating thousands of tedious tabulating jobs with a single computer, though the country’s prosperity and booming job market tempered some of that fear.

In the ’50s, vacuum tubes were an essential component of the electronics industry, found in every computer, radio, and television. Transistors, their smaller and more reliable replacement, meant that far more complex computers could be designed, but those machines couldn’t be built because wiring thousands of individual components together was a logistical nightmare. This “tyranny of numbers” was solved with the first working integrated circuits, developed independently by Texas Instruments and Fairchild at the end of the 1950s. But ICs were virtually ignored until NASA and the military adopted them for use in lunar landers, guided missiles, and jets. Electronics manufacturers soon realized that ICs could be mass-produced. Within a decade, ICs cost pennies to produce while becoming a thousand times more powerful. The result was the birth of Silicon Valley and a reborn electronics industry.

Interviews:

Ted Withington (network engineer, industry analyst), Paul Ceruzzi (Smithsonian), J. Presper Eckert (ENIAC co-inventor, died 1995), Morris Hansen (former US Census Bureau, died 1990), John Pinkerton (Chief Engineer, LEO, died 1997), Thomas J. Watson, Jr. (Chairman Emeritus, IBM, died 1993), James W. Birkenstock (retired Vice President, IBM, died 2003), Jean Sammet (programming language historian), Dick Davis (retired Senior V.P., Bank of America), Robert Noyce (co-inventor, integrated circuit, died 1990), Gordon Moore (former Chairman of the Board, Intel), Steve Wozniak (Co-founder, Apple)

Up Next…

Part 3: The Paperback Computer. The development of the personal computer and user interfaces, from Doug Engelbart and Xerox PARC to the Apple and IBM PCs.

The Machine That Changed the World: Giant Brains

The Machine That Changed the World is the longest, most comprehensive documentary about the history of computing ever produced, but since its release in 1992, it’s become virtually extinct. Out of print and never released online, it survives only on VHS tapes floating around school libraries or in the homes of fans who dubbed the original shows when they aired.

It’s a whirlwind tour of computing before the Web, with brilliant archival footage and interviews with key players, several of whom have passed away since filming. Jointly produced by WGBH Boston and the BBC, it originally aired in the UK as The Dream Machine before its U.S. premiere in January 1992. Its broadcast was accompanied by a book co-written by the documentary’s producer, Jon Palfreman.

With the help of Simon Willison, Jesse Legg, and (unofficially) the Portland State University library, we’ve tracked down and digitized all five parts. This week, I’m uploading them, annotating them with Viddler, and posting them here as streaming Flash video as they’re finished. The complete set is also available as high-quality MP4 downloads via BitTorrent.

Here’s the first of the five-part series, The Machine That Changed the World. Enjoy!

Continue reading “The Machine That Changed the World: Giant Brains”