From the Collection Archives - CHM
https://computerhistory.org/blog/category/from-the-collection/

Memories of Lisa
https://computerhistory.org/blog/memories-of-lisa/
January 30, 2025

Listen to new oral history interviews with Apple engineer Bill Atkinson, who shares his insights and personal experiences about working on the legendary Lisa computer.

Apple’s Bill Atkinson In His Own Words

Two years ago, CHM celebrated the 40th anniversary of the Apple Lisa, an innovative computer that brought the graphical user interface to a mass market for the first time. Apple insiders shared their insights and reminiscences with an enthusiastic audience in-person and online. Read all about it.

At the same time, CHM released the source code to the Lisa’s OS and applications software, which you can read more about and access on CHM’s website.

As part of this celebration, CHM recorded a series of three interviews with Bill Atkinson, a software engineer who developed key portions of the Lisa’s software, especially its QuickDraw graphics library, as well as important aspects of the Lisa’s user interface.

Session 1: Challenges and Decisions

The first interview, conducted by CHM’s Marguerite Gong Hancock, asked Bill Atkinson about technical challenges, eureka moments, team dynamics, and the Lisa’s significance. Atkinson discusses crucial decisions that made the Lisa what it became:

  1. Switching to the Motorola 68000 microprocessor;
  2. Making the mouse standard equipment;
  3. Using black text on a white background;
  4. Moving the menu bar to the top of the screen.

Atkinson also recounts how he was recruited to Apple by Steve Jobs to help “change the world.” An important aspect of Atkinson’s work on Lisa was his collaboration with the late Larry Tesler, a former PARC engineer who led the Lisa’s applications team and advocated user testing to make the interface more accessible to everyday people.

A common misconception Atkinson wanted to clear up was that Apple merely copied the GUI from Xerox PARC. Instead, his team improved on the interface far beyond what they had seen during that fateful visit to PARC in 1979. Many of these refinements were informed by user testing and were later inherited by the Macintosh. Without the Lisa, says Atkinson, there would be no Macintosh, the machine which ultimately brought the GUI to the rest of us.

Two excerpts of this interview, “Something Special” and “Catcher’s Mitt,” were shown during the CHM Live Lisa 40th anniversary event; we are pleased to offer the full interview below.

Bill Atkinson interview, part 1.

Session 2: UI Development

A second interview was conducted by CHM curator Hansen Hsu. For that session, Bill Atkinson brought a binder of Polaroid photographs which he took during the development of the Lisa, providing a personal, narrated tour of the stages of development of the UI.

Through these photographs, one can trace the evolution of the Lisa user interface from its earliest iteration with softkeys, through the addition of overlapping windows and pulldown menus, to its final form with icons on a desktop. Atkinson also discusses a prototype graphics editor that later evolved into MacPaint, and the halftone dithering algorithm he created for displaying scanned photographs on a black-and-white screen.
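For the curious, here is a minimal sketch of the error-diffusion technique now commonly known as “Atkinson dithering.” Whether this exact variant matches the Lisa-era code is an assumption on our part; the sketch simply illustrates how a grayscale image can be rendered with only black and white pixels:

```python
# A toy version of the error-diffusion dither commonly attributed to Bill
# Atkinson (assumed here for illustration; not taken from the Lisa source).
import numpy as np

def atkinson_dither(gray):
    """gray: 2-D float array in [0, 1]. Returns a 0/1 black-and-white image."""
    img = gray.astype(float).copy()
    out = np.zeros_like(img)
    h, w = img.shape
    # Neighbors that each receive 1/8 of the quantization error. Only 6/8 of
    # the error is propagated, which tends to preserve contrast in highlights.
    taps = [(0, 1), (0, 2), (1, -1), (1, 0), (1, 1), (2, 0)]
    for y in range(h):
        for x in range(w):
            new = 1.0 if img[y, x] >= 0.5 else 0.0
            err = (img[y, x] - new) / 8.0
            out[y, x] = new
            for dy, dx in taps:
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w:
                    img[yy, xx] += err
    return out

# A smooth gradient becomes a pattern of pure black and white pixels.
gradient = np.tile(np.linspace(0, 1, 16), (8, 1))
print(atkinson_dither(gradient).astype(int))
```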

Watch the full Session 2 video below.

Bill Atkinson traces the evolution of the Lisa GUI.

The story of this evolution, as documented through these Polaroids, was previously told by Macintosh software engineer Andy Hertzfeld on Folklore.org, a site now hosted by CHM.

Session 3: Source Code

In a third interview, Hansen Hsu spoke with Bill Atkinson about the Lisa source code, which CHM released to coincide with the Lisa’s 40th anniversary.

In this interview, Atkinson goes over particular portions of the Lisa source code that he worked on, focusing especially on QuickDraw and the Window Manager. A key innovation was his development of “Regions,” which identified portions of the screen that were obscured by windows on top of them and therefore did not need to be drawn, saving time. He also remarks on the implementation of Rounded Rectangles, the importance of which he was convinced of by Steve Jobs.
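To make the idea concrete, here is a toy sketch of region-style clipping in Python (hypothetical code, not Atkinson’s implementation, which used a far more compact scanline encoding): subtract the rectangles of the windows stacked above from a window’s own rectangle, leaving only the visible pieces that actually need redrawing.

```python
# Toy illustration of the "Regions" idea: compute the visible parts of a
# window so only those parts are redrawn. (Illustrative only -- QuickDraw's
# real Region data structure was a run-length-style scanline encoding.)
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, right, bottom)

def subtract(r: Rect, cut: Rect) -> List[Rect]:
    """Return the parts of r not covered by cut, as up to four rectangles."""
    l, t, rt, b = r
    cl, ct = max(cut[0], l), max(cut[1], t)
    cr, cb = min(cut[2], rt), min(cut[3], b)
    if cl >= cr or ct >= cb:              # no overlap: r is fully visible
        return [r]
    pieces = []
    if t < ct:  pieces.append((l, t, rt, ct))    # strip above the cut
    if cb < b:  pieces.append((l, cb, rt, b))    # strip below the cut
    if l < cl:  pieces.append((l, ct, cl, cb))   # strip left of the cut
    if cr < rt: pieces.append((cr, ct, rt, cb))  # strip right of the cut
    return pieces

def visible_region(window: Rect, windows_above: List[Rect]) -> List[Rect]:
    region = [window]
    for w in windows_above:
        region = [piece for r in region for piece in subtract(r, w)]
    return region  # only these rectangles need to be drawn

# One window partially covered by another: two visible strips remain.
print(visible_region((0, 0, 100, 100), [(50, 50, 150, 150)]))
```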

Other topics Atkinson discusses include:

  1. Lisa’s non-square pixels;
  2. QuickDraw’s ability to save a recording of drawing operations that could be replayed later;
  3. The Font Manager’s ability to algorithmically generate text styles;
  4. The invention of the I-beam cursor;
  5. How a conversation with Doug Engelbart inspired him to create keyboard shortcuts for menu commands.

Several excerpts of this interview were previously posted on CHM’s social media channels.

Now, you can view the interview in its entirety.

Bill Atkinson: Lisa Source Code

Additional Resources

Sadly, Bill Atkinson passed away on June 5, 2025. To learn more about his pioneering work that helped shape the personal computing revolution, explore the resources below:

Insanely Great: The Apple Mac at 40

MacPaint and QuickDraw Source Code

The Lisa: Apple’s Most Influential Failure

 

SUPPORT CHM’S MISSION

Collecting unique interviews like these would not be possible without the generous support of people like you who care deeply about decoding technology for everyone. Please consider making a donation.

Hearing Tech History
https://computerhistory.org/blog/hearing-tech-history/
October 22, 2024

CHM has received a donation of a rare, one-of-a-kind portable speech processing device designed at Stanford University for use with cochlear implants to aid hearing.

Speech Processor Joins the Collection

CHM curators like me are always scanning the historical horizon for artifacts and materials that can reveal the pre-history of significant computer-based devices we now take for granted. Recently, the Museum received a rare donation with real human benefit: a one-of-a-kind prototype of a cochlear implant (CI) speech processor developed at Stanford University in the late 1970s and early 1980s.

Cochlear implants are surgically implanted devices that provide a sense of sound to people with severe or profound hearing loss. The speech processor does the heavy lifting of converting sounds picked up by the cochlear implant’s microphone into signals sent directly to the wearer’s auditory nerve via a brain-computer interface (an implanted electrode array). The earliest implants used a single channel, producing very poor results; you can compare a single channel to later multi-channel cochlear implants in the video below.

Cochlear implant simulation.
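As a rough sketch of what those channels mean (illustrative modern code with made-up band edges, not the Stanford processor’s actual signal chain), a multi-channel processor splits the microphone signal into frequency bands and extracts each band’s loudness envelope, which would drive the corresponding electrode:

```python
# Minimal sketch of a six-channel speech-processing front end (assumed
# parameters for illustration only -- not the PSP-2's actual design).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 16000                                        # sample rate, Hz
EDGES = [100, 200, 400, 800, 1600, 3200, 6400]    # hypothetical band edges

def channel_envelopes(signal):
    envelopes = []
    for lo, hi in zip(EDGES[:-1], EDGES[1:]):
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        band = filtfilt(b, a, signal)             # isolate one frequency band
        envelopes.append(np.abs(hilbert(band)))   # slowly varying loudness
    return np.array(envelopes)                    # shape: (6, len(signal))

# A 1 kHz tone shows up mostly in the 800-1600 Hz channel (index 3).
t = np.arange(FS) / FS
env = channel_envelopes(np.sin(2 * np.pi * 1000 * t))
print(env.mean(axis=1).round(3))
```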

Built into a 1970s-era briefcase, the portable speech processor (PSP) was an important step in using computers to address profound hearing loss, a widespread disability. According to the World Health Organization (WHO), approximately 1.5 billion people (nearly 20% of the global population) experience some degree of hearing loss. Of these, around 430 million people have disabling hearing loss, which significantly impacts their ability to communicate and perform daily activities.

The Stanford PSP-2 Portable Speech Processor

The donation process began in April 2024, when NYU professor Dr. Mara Mills approached CHM curator David Brock about a history of the cochlear implant she is co-writing. She put us in touch with Dr. Les Atlas, one of several Stanford electrical engineering PhDs who had worked on the PSP-2 in the 1980s, to consider preserving some of the physical artifacts from this project. Back then, Atlas and other graduate students created a six-channel system under principal investigators Professor Robert L. White (Department of Electrical Engineering) and Dr. Blair Simmons (School of Medicine).

Teams around the world experimenting with cochlear implants (time on vertical axis). Source: National Library of Medicine.

An equally important piece of the puzzle was the brain-computer interface: the method of connecting thin electrical wires to human acoustic nerves. This was partially solved by specialized sensors built by the Stanford team.

Custom brain-computer interface auditory nerve stimulation array

Fun fact: In 1790, electrical experimenter Alessandro Volta (for whom the volt is named) attached a 50-volt battery to metal rods in both his ears and reported hearing bubbling sounds.

How it works

A small pocket in the skull (behind the ear) is made surgically to hold the internal receiver-stimulator unit, and then an electrode array is threaded into the cochlea. The implant is securely attached to the skull using sutures or clips, and the incision is closed. After the surgical site heals, the external processor is magnetically attached to the internal implant, enabling the device to function.

Components of a cochlear implant system. In this modern system, the speech processor that once needed a briefcase to hold it now sits behind the ear.

Preserving and Sharing History

The donation handoff of the PSP-2 to the Museum occurred on May 31, 2024. I attended a special presentation of the prototype to CHM at the Stanford School of Medicine. In the room were some of the original inventors whose PhD work produced the prototype: Dr. Les Atlas, Dr. Rob Mathews, and Dr. Martin Walker.

CHM’s Senior Curator Dag Spicer receiving the Stanford Portable Speech Processor (PSP-2) from Dr. Martin Walker.

During the handover, a surgical resident in otolaryngology, Dr. Shayna Cooperman, joined the conversation. Cooperman implants CIs into patients and wears a CI herself, having been mostly deaf from birth. One of the speakers, Les Atlas, co-inventor of the PSP-2, could not hear well at the event due to his own hearing loss, and it was Dr. Cooperman who listened and repeated the questions for him. So a current CI wearer, and a surgeon who implants them, was facilitating communication between one of the original CI designers and his audience. From prototype to life-changing technology in the same room!

Martin Walker, Shayna Cooperman

Dr. Shayna Cooperman, showing her own CI, with Dr. Marty Walker, PSP-2 designer.

Back at CHM, I worked with CHM’s Director of Collections Aurora Tucker to create a lobby display of the PSP-2, where its unique packaging and role in helping others can be appreciated by visitors. Centered on the PSP-2, the display also shows a model of the ear, period advertisements for early hearing aids, and the world’s first hybrid (vacuum tube and transistor) hearing aid, ca. 1954.

Four original Stanford PSP-2 PhD students of Dr. Robert White, who created the hardware and software for the PSP-2, visit the display at CHM. (From left to right) Les Atlas, Martin Walker, Rob Mathews, Mathew Herndon.

Over one million cochlear implants have been installed to date. The breakthrough came in 1984, when the FDA approved CIs for human use. Since then, they have been implanted in people of all ages, from young children to the elderly.

Computers and medicine have a long relationship; in fact, it is impossible to conceive of most of modern medicine without computers. Whether for PCR machines, electronic health records, or robotic surgery, computers bring accuracy, precision, repeatability, and safety to healthcare.

While CIs are now very good at conveying human speech (less so music), recent developments in neural networks and artificial intelligence should result in a new generation of even better implants.

Dig Deeper

The Early History of the Cochlear Implant: https://pubmed.ncbi.nlm.nih.gov/23681026/

The Long Road to Today’s Cochlear Implant: https://spectrum.ieee.org/cochlear-implant-history

Cochlear implants: a remarkable past and a brilliant future: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3707130/

A Machine Hearing (by Dr Dick Lyon): https://hearinghub.edu.au/education/resources/Signal%20Processing%20for%20Hearing%20-%20Lecture%20Series/

Special thanks to: Martin Slaney; Jay Rubinstein, MD; Martin Walker; Shayna Cooperman, MD; Rob Mathews; Dick Lyon; Mara Mills; Aurora Tucker; and Les Atlas.

Neural Network Chip Joins the Collection
https://computerhistory.org/blog/neural-network-chip-joins-the-collection/
August 9, 2024

New additions to the collection, including a pair of Intel 80170 ETANN chips, help to tell the story of early neural networks.


It is something of a breakthrough, having achieved the theoretical intelligence level of a cockroach.

— John C. Dvorak, PC Magazine, May 29, 1990, p. 77.

Recently, Intel ETANN pioneer Mark Holler donated a pair of Intel 80170 ETANN integrated circuits and other materials related to early neural networks to the Computer History Museum.

Neural networks are very much in the news these days as the enabling technology for much of the artificial intelligence boom, including large language models, search, generative AI, and chatbots like ChatGPT. Yet the idea has deep roots, going back to the 1950s and the work of Professor Frank Rosenblatt at Cornell University, whose Perceptron, one of the earliest neural networks, could recognize simple numbers and letters.

Frank Rosenblatt, left, and Charles W. Wightman work on part of the unit that became the first perceptron in December 1958. Credit: Cornell University, Division of Rare and Manuscript Collections.

Over the course of several AI “winters” since that time, during which generous government funding dried up, neural networks waited in the background for their moment. That moment came about a decade ago, when computing power driven by Moore’s Law allowed a new level of neural network complexity and “neural nets” began to be deployed in real-world contexts. For example, Google’s Tensor Processing Units, in development since 2013, are essentially neural network accelerators that Google uses to improve search. NVIDIA, Amazon, Meta, and (again) Intel have all designed their own neural network processors.

NVIDIA Blackwell AI Accelerator

Meta Training and Inference Accelerator (MTIA)

Intel Gaudi 3 AI Accelerator

While there have been many milestones in neural networks, especially recently, one of the most distinctive implementations arrived in 1989, when Intel announced its 80170 ETANN integrated circuit at that year’s International Joint Conference on Neural Networks (IJCNN).

ETANN stands for Electrically Trainable Analog Neural Network. Neural networks aren’t really programmed in the traditional sense; they are trained. An AI programmer’s job is therefore not writing instructions per se but organizing the data fed to a neural network so that, through repetition across the network’s multiple “layers,” it can discover patterns in incomplete information. Some might call this a form of “thinking.” Others, notably computational linguist Emily Bender and her coauthors in a now-famous paper, consider neural networks and the large language models they support a form of “stochastic parrot”: no thought at all, just a clever statistical parlor game.
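As a toy illustration of “trained, not programmed” (modern Python rather than the ETANN’s analog circuitry), the classic perceptron learning rule lets a single artificial neuron learn the AND function from repeated examples instead of hand-written rules:

```python
# A single neuron learns AND by repetition -- no explicit rules are coded.
# (Toy example only; the ETANN was trained through analog weight updates.)
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([0, 0, 0, 1], dtype=float)                      # desired outputs

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # "synaptic" weights, initially random
b = 0.0                  # bias
lr = 0.1                 # learning rate

for epoch in range(50):                 # repetition over the training data
    for xi, target in zip(X, y):
        out = 1.0 if xi @ w + b > 0 else 0.0
        err = target - out              # perceptron learning rule
        w += lr * err * xi
        b += lr * err

print([1.0 if xi @ w + b > 0 else 0.0 for xi in X])  # -> [0.0, 0.0, 0.0, 1.0]
```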

Back to the ETANN. In practical terms, to “feed” data into the neural network, Intel provided the Intel Neural Network Training System (iNNTS), a training device controlled from a standard Intel PC, along with a set of software and drivers. The ETANN incorporated 64 analog neurons and 10,240 analog synapses. At the time, neural networks were largely a dead end in terms of applications: there was neither the computing power nor the understanding to deploy them at scale, though there was at least one military attempt to base a missile seeker on the 80170 chip. As personal computing columnist John C. Dvorak wrote at the time, “Nobody at Intel knows what to make of this thing.”

Block diagram of Intel’s ETANN 80170 chip.

But the potential seemed there, if only a bit into the future. As Pashley recalled in a recent email, “… I remember when Bruce McCormick first became interested in Neural Nets. It was sometime in the winter of 1987-88 when Bruce and I were on an Intel recruiting trip to Caltech. On this trip to Caltech, we met with Carver [Mead] to talk about his EE281 Integrated Circuit Design students and ended up talking about the potential of Neural Nets. Personally, I knew nothing about Neural Nets, but Carver and Bruce were really excited. On the flight back to Intel, Bruce was bubbling with ideas to use EEPROM and Flash Memory to test out Neural Net’s potential.”

Intel 80170 ETANN Neural Network Integrated Circuit, 1989.

Like other ideas “ahead of their time” in the history of technology, we see in the Intel 80170 chip the earliest implementation of a reasonably sophisticated neural network in silicon, a prefiguring of the more complex—indeed, world-changing—AI accelerators of today.

Main image: ETANN Development Team. Mark Holler is holding the ETANN chip.

Dig Deeper

Holler, Mark, et al., “An Electrically Trainable Artificial Neural Network (ETANN) with 10240 Floating Gate Synapses,” International Joint Conference on Neural Networks, vol. 2, 1989.

Castro, Hernan A., Simon M. Tam, and Mark A. Holler, “Implementation and Performance of an Analog Nonvolatile Neural Network,” Analog Integrated Circuits and Signal Processing, vol. 4, no. 2 (1993): 97–113.

Kern, L. R., “Design and Development of a Real-Time Neural Processor Using the Intel 80170NX ETANN,” [Proceedings 1992] IJCNN International Joint Conference on Neural Networks, Baltimore, MD, USA, 1992, pp. 684–689, vol. 2, doi: 10.1109/IJCNN.1992.226908.

CHM Donation Interview with Mark Holler: https://www.dropbox.com/scl/fi/4ybj210gjtmlxwp7pfg71/Mark-Holler-Interview.mp4?rlkey=68wvbse2js7ntmb5w6zb3fn2u&e=1&dl=0

On Frank Rosenblatt: Professor’s perceptron paved the way for AI – 60 years too soon, https://news.cornell.edu/stories/2019/09/professors-perceptron-paved-way-ai-60-years-too-soon

“On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021, pp. 610–623.

Mark Holler’s Donation 

  • Handwritten timeline of Neural Network Development at Intel
  • Two neural network chips in PGA packaging (Ni1000, 80170)
  • Approximately 8 loose photos of chips and members of team
  • Framed photo of 64 Neuron ETANN chip
  • Copies of 2 technical papers: “A Configurable Multi-Chip Analog Neural Network: Recognition and Back Propagation Training” and “Extraction of Fingerprint Orientation Maps Using a Radial Basis Function Recognition Accelerator”
  • Approximately 5 trade publication articles of Intel’s NN chips
  • Datasheet for 80170NX “Electrically Trainable Analog Neural Network” – 40 pages
  • Intel Article Reprint: “An Electrically Trainable Artificial Neural Network (ETANN) with 10240 Floating Gate Synapses” from International Joint Conference on Neural Networks – 1989
  • ETANN application note: “Neural Networks”
  • Article entitled “Neural Network Research at Intel” published in Intel’s publication “Innovator”, Winter 1992
  • Datasheet: Ni1000 Development System

Contributors to the Project

The ETANN chip was a team effort by some very talented engineers. Here they are:

  • Mark Holler, project manager, chip definition, development tools, marketing, personnel
  • Simon Tam, principal engineer, architecture, floating gate synapse cell design, training algorithm
  • Hernan Castro, analog design engineer, non-linear neuron amplifier, I/O buffers, feedback S/H
  • Been-Jon Woo, process engineer, ETANN fabrication, Intel Flash memory process
  • Mike Roy, product/test engineer, analog test set, production
  • Ken Buckmann, senior technician, testing, reliability, production
  • Lily Law, ETANN mask design/layout

In the image at the top of the article and above: back row on far left, Hernan Castro; back row second from left, Been-Jon Woo; back row third from left, ?; back row fourth from left, Ken Buckmann; back row holding chip, Mark Holler; back row far right, Mike Roy; front row on left, Lily Law; front row middle, Simon Tam; front row far right, ?

Thank you!

This donation would not have been possible without first hearing about the 80170 from CHM Semiconductor Special Interest Group (SEMISIG) member Jesse Jenkins. The historical and research assistance of SEMISIG chair Doug Fairbairn and Intel Alumni Network member Dane Elliot was invaluable in effecting the donation of the historical hardware and supporting materials from donor Mark Holler, to whom, of course, we offer our deepest thanks.

Game On
https://computerhistory.org/blog/game-on/
May 24, 2024

Find out about the little-known history of the consoles and gaming PCs that ran the classic video games we know and love.

Retro Computer Games at CHM

The Computer History Museum’s new exhibit, Retro Games: From Atari to Xbox, showcases classic games and gaming systems from CHM’s collection and runs from May 24 through August 4, 2024. You may remember many of the games featured, but you might not know the history behind the machines that ran these addictive diversions.

Today, the video game market is divided between consoles and gaming PCs, but throughout history the line between them has often been blurry. These two categories are closer than you think.

Early Consoles and PCs: late 1970s–early 1980s

The 1970s saw the birth of the video game and PC industries, thanks to the invention of the microprocessor by Intel in 1971. Early PCs, like the Altair 8800, were hobbyist kits that required technical skills to use. Video games, starting with arcade machines like Atari’s Pong in 1972, were accessible to all. But early home video game consoles could only play a limited number of built-in games, usually Pong or a close imitation, that were hard-wired into the electronics.

1976’s Channel F, by Fairchild Semiconductor, was the first console to use a microprocessor and interchangeable cartridges to hold games. Because microprocessors are computers on a chip, games can be written as software and stored in a cartridge’s read-only memory chip, or “ROM.”

The most successful console of the 1970s and early 1980s was the Atari 2600. In 1985, Nintendo took the lead in the US with the Nintendo Entertainment System (NES). Both were initially branded as computers for the home: the 2600 as the “Video Computer System,” or VCS, and the NES as the “Family Computer,” or Famicom in Japan. They used variants or clones of the MOS Technology 6502 microprocessor, the same chip used in the Apple II, BBC Micro, Commodore 64, and Atari 800 home computers.

The MOS Technology 6502 was a low-cost 8-bit microprocessor popular in both game consoles and home PCs. Photo by Aurora Tucker.

Atari 2600 (102730637) with Pitfall! game cartridge (102705661).

Japanese market Nintendo Famicom (102716419) with Super Mario Bros. game cartridge (102746997).

Atari entered the home PC market with the Atari 400 and 800 in 1979; these machines shared similar hardware with 1977’s 2600 console but offered better graphics, sound, memory, and expandability. They also had cartridge slots, making them popular for gaming. 1982’s Commodore 64, also a popular gaming computer with a cartridge slot, was cheaper than its competitors and became the best-selling computer of all time. Despite their gaming features, both the Atari 800 and the Commodore 64 were considered PCs, not consoles.

Commodore 64 computer (102674084) with joystick (102647138) and Castle Wolfenstein game (102689439). Monitor not shown.

Some companies tried to build hybrid systems by creating add-on devices for existing consoles, turning them into full-blown PCs. APF’s Imagination Machine and Mattel’s Intellivision were both unsuccessful attempts at this strategy.

APF Imagination Machine microcomputer (102801477) with attached APF MP1000 game console (102801478) and Blackjack game cartridge (102801482).

1990s: PCs as an Open Gaming Platform

By the late 1980s, the PC market began to shift from proprietary systems to an open platform governed by industry standards. 1981’s IBM PC, though initially proprietary, used standard components and licensed its operating system, DOS, from Microsoft. This allowed other companies to create compatible “clones,” leading to the rise of the Wintel (Windows and Intel) PC.

Consoles remained proprietary, with Nintendo and Sega dominating the market in the late 1980s, followed by Sony’s PlayStation in the 1990s. Game consoles were designed like appliances and could not be easily upgraded; PCs, by contrast, could add new components and evolve over time. By the mid-1990s, PCs with aftermarket graphics and sound cards were outperforming many dedicated game consoles. The debut of 3D graphics cards and new game genres like first-person shooters further boosted PC gaming.

Quake I CD-ROM disc and manual (102752170). First person shooter games like Quake drove the development of 3D graphics hardware for PCs in the 1990s.

During the mid-1990s, Microsoft also created DirectX, a dedicated software layer for games on Windows. Before DirectX, most PC games ran directly on DOS, which was faster, and gamers didn’t tolerate slowdowns. Windows 95 removed this option, so Microsoft provided DirectX to give game developers direct access to PC hardware such as graphics cards.

Xbox: Console or PC?

In the 1990s, video game companies began integrating PC technologies into their consoles. CD-ROMs, which stored more data and were cheaper than ROM cartridges or floppy disks, became common first on PCs, with games such as 1993’s Myst. The Sega Saturn and Sony PlayStation both used CD-ROMs for storage, with Nintendo’s N64 being a noteworthy holdout. However, Nintendo did incorporate 3D graphics technology by partnering with Silicon Graphics for the N64. 3D graphics and optical disc-based media marked a significant shift in video game console technology.

The Sony PlayStation 2 (102752166, 102752167) is the bestselling video game console of all time. Grand Theft Auto: San Andreas (102752169) is the top selling game for the PS2, with Final Fantasy X coming in at 5th (102752168).

In the early 2000s, Microsoft was trying to expand into the living room and saw Sony’s PlayStation 2, which also played DVD movies, as a threat. Microsoft responded with the Xbox in 2001, a system that was essentially a Windows PC turned into a console. It used an Intel microprocessor, NVIDIA graphics chips, and a hard drive, and it ran a specialized version of Windows with DirectX. This allowed game developers to easily port PC games to the Xbox.

Xbox debug kit (102718658), Xbox controller (102752163), Halo: Combat Evolved game (102752162) and Halo Masterchief action figure. Halo’s popularity drove sales of the Xbox. Photo by Aurora Tucker.

Gaming in the 2020s

Both the Xbox and PlayStation now use PC technologies, with many games based on Epic Games’ Unreal Engine, which was first used in the PC first-person shooter game Unreal in 1998. First-person shooters, a genre that originated on PCs in the 1990s, became popular on consoles with games like Halo and Call of Duty in the 2000s.

Today, the main difference between game consoles and gaming PCs is cost and upgradeability. The same types of games are available for both through online stores. Vintage console games like Street Fighter II can be played on PCs, while complex simulations like Civilization can be enjoyed on consoles.

Console or PC, does it matter what you’re playing on if you’re having fun?

Get Your Game On

We hope to see you at CHM for the Retro Games showcase—you’ll even be able to play some games, and you can discover more about games in the Revolution exhibition. Retro Games closes August 4, 2024, so get your ticket now!

A Computer for the Rest of Us
https://computerhistory.org/blog/a-computer-for-the-rest-of-us/
December 15, 2023

Check out some highlights from CHM’s exhibit on the iconic Apple Macintosh computer that is turning 40 next year.

The Apple Macintosh Turns 40

On January 24, 1984, Apple Computer launched its new Macintosh computer in a Super Bowl ad that aired only once. Alluding to George Orwell’s novel 1984, it both impressed and bewildered the millions who viewed it. Directed by Ridley Scott, the ad symbolized Apple’s desire to “rescue” humanity from the conformity of computer industry giant IBM. It was a call for “a computer for the rest of us.”

In a market dominated by the IBM PC, most computer users were struggling to learn tricky commands and special keywords to run their software. But Mac users could just “point and click.” The Mac used a graphical user interface and a mouse—both new features on mass market computers.

The Macintosh with graphical user interface, keyboard, and mouse.

Even with its splashy introduction and its breakthroughs in usability and design, however, the Mac started slowly in the marketplace, and sales were modest in the first year. Moreover, Steve Jobs’ intense personality, drive for perfection, and difficult management style frequently clashed with others at Apple, and in 1985 he suffered the same fate as many Silicon Valley founders when he was forced out by the board of directors. Jobs’ departure marked the end of an era and the beginning of a ten-year period of massive hits and equally big misses for him outside of Apple.

Over the years, Apple has continued to cultivate an “outsider” image with campaigns that portray Mac users as rebels, and the Mac’s hardware and software have changed many times since 1984. Apple’s way of combining elegant industrial design, brilliant marketing, and advanced engineering remains a core value, expressed in everything it makes. The company consistently ranks first among global brands.

The Impact of the Mac

The celebration of the Apple Macintosh’s 40th anniversary is not merely a retrospective glance at a piece of technology; it’s a reflection on how a single innovation, introduced at the crossroads of design and functionality, has rippled through time, shaping the way we interact with and perceive computing.

CHM has curated a unique array of Apple Macintosh artifacts, both from the Museum’s collection and on loan from Apple alums. The temporary retrospective mini pop-up showcases the technological achievements, the team behind them, the cultural influences that shaped the brand, and a few more surprises.

Pop-up Highlights

The evolution of Macintosh hardware began with wire-wrap prototypes built in the early 1980s by Apple employees Dan Kottke and Brian Howard. As the design changed throughout the development process, new versions were made. Shown here is Prototype #4.

Macintosh Wire-wrap Prototype #4, Apple Computer, Inc., 1981, Gift of Andy Hertzfeld, catalog number 102638251.

Once the hardware was finalized using wire-wrap, a printed circuit board was designed by team member Colette Askeland for mass production. This board was very compact, allowing for the Mac’s highly portable “all-in-one” packaging. To make a complete Macintosh system, a screen, floppy disk drive, keyboard, mouse, and power supply were added.

Macintosh Main Logic Board, Apple Computer, Inc., 1984, Gift of Henri Socha, catalog number 102667053.

 

A marketer’s dream, Apple’s loyal customers feel a strong emotional connection with the company that transforms them from customers into promoters. Beyond regularly buying new Apple products, Apple fans have shown their affection for the Mac in many different, and sometimes oddball, ways: Mac tattoos, a “MacQuarium,” Apple logo haircuts, paper Macs, Mac-inspired music, and more.

“What is love?” Mac the Rapper, performed by the Apple Macintosh, Shinola Records, 1987, catalog number 102651542. You can listen to it here.

One last thing: For the very first public showing of the Macintosh, Steve Jobs, Steve Wozniak, and the leaders of the Macintosh design team came to Boston on January 30, 1984, to put on a spectacular show for 1,200 members of the Boston Computer Society, then the largest personal computer user organization in the country with 32,000 members across all 50 states and 57 countries. The BCS general meeting recordings are now part of the CHM collection. To watch the video of the first public showing of the Macintosh, see “The Very First ‘Stevenote’.”

We hope to see you at CHM for our Apple Mac at 40 mini pop-up, closing February 25, 2024. Get your ticket now!

If you can’t make it in person, check out the online pop-up.

How Old Is Your Furby?
https://computerhistory.org/blog/how-old-is-your-furby/
December 13, 2023

CHM remembers the Furby frenzy on the 25th anniversary of the fuzzy robot toy that took the world by storm and is still popular today.

The iconic toy turns 25

Do you remember 1998? That was the year Google was founded, France won the World Cup, and Microsoft was the largest company in the world. And a furry little creature—half owl, half hamster—was the “it” gift for the holiday season. Marketed in a high-profile advertising campaign on television and mass media, Furby was a $35 digital toy that captured the imaginations of millions around the world and continues to have an active community of fans today.

Furby was a robotic “friend” that could respond to touch, light, and sound. An electric motor and a system of cams and gears closed the Furby’s eyes and mouth, raised its ears, and lifted it off the ground. Equipped with these sensors and motors, Furbys could blink, wiggle their ears, and even dance to music. They came in several different outfits and dozens of colors, but they all spoke “Furbish,” an imaginary language that slowly morphed into English as the creature interacted more with its owner.

Kids going wild for Furbys at New York’s FAO Schwarz store, 1998. Credit: UNITED STATES – OCTOBER 02: Students from PS 59 meet Furby, a new interactive toy, at FAO Schwarz. (Photo by Susan Watts/NY Daily News Archive via Getty Images)

The Furby Frenzy

Skyrocketing demand for the toys during the initial 1998 holiday season drove the resale price to well over US$100, and sometimes several times that. As supplies dwindled, arguments and fistfights broke out between rival parents at Toys “R” Us stores. When supplies ran out, consumers turned to the internet, where Furbys could be purchased for many times their retail price.

Furby also caught the attention of American intelligence agencies and the Federal Aviation Administration. The FAA was concerned that the little creatures might interfere with takeoffs and landings, and it even issued an alert that “Furbys should not be on when the plane is below 10,000 feet.” And because of the way Furbys reacted to light, touch, and sound, and fearing that Furbys might record their surroundings (they don’t), the US National Security Agency banned the toy from its headquarters in 1999, as did the US Naval Shipyard in Portsmouth, VA.

But no one else seemed worried: during the first three years of production, over 40 million Furbys were sold. A global phenomenon, the toy had people from Birmingham to Berlin to Bombay clamoring for a fuzzy little creature of their own. The hysteria continued for several years, in part because Furbys were so customizable, making them among the hottest toys of the late ’90s and early 2000s.

Speaking Furbish

Unveiling of the new Furby toy at the UN Plaza on Aug. 2, 2005, in New York City. https://www.nydailynews.com/2023/06/27/furby-then-and-now/

In 2019, CHM’s Director of Curatorial Affairs David Brock interviewed Furby co-inventor David Hampton about his goals in creating Furby and its language abilities.

So later when it came to Furby, I said, “I do not want to do anything that interrupts the play, imagination, of a child,” and that’s why I developed a Furby language . . . It follows a format, it follows lots of details of a language, and by some it’s been classified officially as a language . . . I was interviewed at a period of time later and they said, “You’ve got children that are talking back and forth to each other kind of like pig Latin. They’ve learned Furbish and the parents don’t understand what they’re saying. What do you think about that?” I said, “What could be better?”

— David Hampton, Furby co-inventor

In his oral history, Hampton also relates that when customers who reported a broken Furby were told they would have to send it in to get a replacement, they often decided to keep the original instead. It was their Furby, drooping ear or broken gears and all.

Furby Forever

Like many things that change popular culture and, in turn, are changed by it, Furbys appeared in movies, TV shows, and even song lyrics. Sometimes the line blurred: Tiger Electronics (Hasbro), which made the toy, ended up in a legal dispute with the Warner Brothers film studio over the Furby’s resemblance to the “Gizmo” character in the comedy-horror movie Gremlins. (The dispute was settled out of court.)

Actress Hilary Duff unwraps a Furby during her 18th birthday party on Sept. 28, 2005, inside Club Mood in Hollywood, CA. https://www.nydailynews.com/2023/06/27/furby-then-and-now/

With the rise of the internet, online communities dedicated to Furbys sprang up rapidly, with enthusiasts sharing tips on how to care for their robotic pets and teaching each other the nuances of Furbish.

Furby’s popularity continues to this day, with thousands of passionate collectors and experts discussing Furby facts and trivia online, as well as a very active market for the toys on eBay. Furbys are a durable cultural touchstone that brought a new degree of fun and interactivity to toys. Their simplicity and customizability remain their strengths, and with the 2023 Furby the legend lives on for new generations to enjoy.

The secret to its success? Furby creator Hampton says, “Here’s what I think: The magic act worked. . . . people added their own imagination to the toy and made it become better than what I had created. They became part of it.”

Visit the Furbys in CHM’s collection and see David Hampton’s full video oral history here.

Main image: Original Furby, 1998. https://official-furby.fandom.com/wiki/Official_Furby_Wiki

 

SUPPORT CHM’S MISSION

The care and feeding of our Furbys and other artifacts of the computing revolution would not be possible without the generous support of people like you who care deeply about decoding technology. Please consider making a donation.

Turtles, Blocks, and Memories
https://computerhistory.org/blog/turtles-blocks-and-memories/
December 5, 2023

CHM welcomes the new VMware Founders Collection, a collection of artifacts and oral history interviews with the innovative Silicon Valley company.

The VMware Founders Collection

Last spring, on the occasion of its 25th anniversary, VMware reached out to the Computer History Museum with a proposal to create the VMware Founders Collection. The resulting collaboration preserves the history of one of Silicon Valley’s most innovative and successful companies. I’m the archivist tasked with building it from scratch.

The VM in VMware stands for “virtual machine.” When presented with this assignment, I was a bit stumped. What artifacts can we collect for a company whose product is virtual?

Solving Hard Problems

VMware’s history is one of the most compelling Silicon Valley stories I’ve ever heard. It begins with smart people trying to solve difficult problems. In the late 1990s, one of those problems was that a computer could run only one operating system at a time. In some cases, an individual business application took up an entire machine while using less than 20% of the machine’s capacity. The x86 virtualization software developed by VMware’s founding technologists Mendel Rosenblum, Scott Devine, Ellen Wang, and Edouard Bugnion enabled a single machine to be segmented into several virtual machines, each with its own operating system.

Once virtualized, computers could handle multiple functions safely and securely. The savings in hardware, space, electricity, and human resources were incredible.
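Some rough, hypothetical arithmetic shows why (the only figure taken from above is the under-20% utilization; the rest is illustrative):

```python
# Back-of-the-envelope server consolidation (illustrative numbers only).
servers = 10          # physical machines, one application each
utilization = 0.20    # fraction of each machine actually doing work
target = 0.80         # desired utilization of each virtualized host

work = servers * utilization            # total work: 2.0 machines' worth
hosts = -(-work // target)              # ceiling division -> 3 hosts
print(f"{servers} servers -> {int(hosts)} virtualized hosts")
```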

The front cover of VMware Ready to Run Virtual Machine software CDs with a photograph of company founders, 2000. Catalog number 102801286.

Virtualize Everything

Thus was VMware’s vision born: to “virtualize everything.” It was a pretty easy sell to enterprises in the early 2000s. In just a few years, VMware had succeeded in virtualizing individual computers and turned its attention to virtualizing networks, another formidable challenge. Under the leadership of CEO Diane Greene, VMware built its smart team into a company with some of the best engineers in the business. Over the course of several years, VMware’s innovators developed effective and reliable ways to virtualize all kinds of networks.

Installation discs for VMware vSphere 4 and an installation disc for the Cisco Nexus 1000V Virtual Ethernet Module for VMware vSphere 4, 2009. Catalog number 102801283.

Virtualization at many levels and in many different areas of computing has enabled the IT revolution in which we are currently living. It is, in fact, the very basis of cloud computing.

But still, as an archivist, I had to ask, “What are the things we can collect? What can we put in our museum to tell this story?”

The Real (Hi)Story

Most long-lived companies develop a very strong corporate culture that’s reflected in the material culture they produce, like marketing collateral, T-shirts, awards, etc. (SWAG, anyone?) We collect these things because they encapsulate the spirit of the company and the uniqueness of the innovation environment.

For VMware, the spirit of innovation is found in the form of a block. The earliest concept of the virtual machine was as a hardware block (a CPU) divided into virtual blocks (virtual machines). Employees receive glass blocks for their 4th, 8th, and 12th anniversaries. We’ve collected these, along with boxes and discs of VMware’s early products.

Glass cubes awarded to employees on their 4th anniversary with VMware. Catalog number 102801292.

VMware’s creative and passionate spirit can be found in T-shirts featuring cheeky taglines and images; digital materials documenting VMware employee activities; and a crew of turtles who call the pond on VMware’s Palo Alto campus their home. We aren’t collecting the turtles, of course, but we will hold on to Rosie’s Turtle Tales, a book chronicling their adventures, along with a representative sample of VMware SWAG.

Hardback book titled Rosie’s Turtle Tales, by Cathy Luo, illustrated by Chelsea Wilson, 2019. Catalog number 102801296.

T-shirt designed by VMware with a graphic design showing network components and that reads “Liberation Through Virtualization.” Catalog number 102801297.

All of these things are fun and interesting, but they only tell part of the story. The real manifestation of the culture is in the people who have made VMware’s history through their work. We must collect their memories too.

I’ve had the privilege to interview 21 current and former VMware employees from all areas of the company and all regions of the world. I’ve learned a lot about VMware’s groundbreaking innovations, their worldwide impact, and the culture that made it all possible.

The author with Ganesh Venkitachalam, VP of Engineering & Product Management, Cloud Storage and Data Protection, VMware, at his interview on September 12, 2023.

On Innovation

Ganesh Venkitachalam on practical innovation.

Ray O’Farrell on respect for engineers.

On Strong Leadership and Strong Core Values

Pam Cass on making a difference.

On Lasting Impact

Chris Wolf on how virtualization changed the world.

We invite you to explore the VMware Founders Collection here, including my interviews with VMware employees. More material will be added in the coming months. And CHM curators will be conducting full-length oral history interviews with some of VMware’s most important contributors.

It has been a real pleasure for me to get to know VMware—its technologies, its culture, and most of all its people around the globe. I would like to thank Amy Plunkett, Director—Global Communications, for being my guide and partner in building the VMware Founders Collection.

See more VMware artifacts in the VMware Founders Collection. If you have materials (or memories) related to VMware, please reach out to us at vmware-history@computerhistory.org.

Main image: VMware book titled “Who We Are, 2013/2014.” Catalog number 102801295.

 

SUPPORT CHM’S MISSION

Preserving collections like these would not be possible without the generous support of people like you who care deeply about computing history. Please consider making a donation.

ERMA Can Do It!
https://computerhistory.org/blog/erma-can-do-it/
June 26, 2023

Find out the story behind ERMA, CHM’s exciting (and rare) new addition to the collection!


ERMA was the absolute beginning of the mechanization of business.

— Thomas Morrin, SRI Director of Engineering

On May 17, the Computer History Museum received a very special artifact for its permanent collection. This object, a highly specialized computer system, rescued Bank of America (indeed, the entire American banking industry) from being buried under an avalanche of paper, and it marked one of the earliest large-scale uses of computers in business. The machine was called ERMA, the Electronic Recording Machine, Accounting, and the system came from the Bank of America building in Concord, California, where it had proudly been on display for several decades.

ERMA display at Bank of America’s Concord campus. Only the major parts of ERMA were preserved. Photo by Aurora Tucker.

Recently arrived ERMA units at CHM’s environmentally controlled storage facility. Photo by Aurora Tucker.

ERMA’s Story

Before the mid-twentieth century, banking was a time-consuming, manual process. Deposits and withdrawals were recorded by hand, and account balances were calculated using mechanical adding machines. An experienced bookkeeper could post 245 bank accounts in an hour—about 2,000 in an eight-hour workday, or approximately 10,000 per week. (Even today, despite new payment systems like Venmo and Paypal, the average American writes 38 checks per year.) The system was prone to errors, and transactions could take days to process. And, because communication and record-keeping were largely done on paper, banks were limited in their ability to serve any but local customers.

Amy Weaver Fisher and James L. McKenney, “The Development of the ERMA Banking System: Lessons From History,” IEEE Annals of the History of Computing, vol. 15, no. 1 (1993): 45.

After WWII, a booming middle class placed huge demands on American banks. At Bank of America (BofA), checking accounts were growing at a rate of 23,000 per month, and branches had to close every day at 2:00 or 3:00 p.m. to handle that day’s paperwork. Even as early as 1948, the bank was processing over two billion checks per year.

ERMA was designed and built in mid-century. Note the atomic and “space-age” themes. https://archive.computerhistory.org/resources/access/text/2023/04/102726943-05-01-acc.pdf

Bank of America, called Bank of Italy until 1930, was founded in 1904 by A.P. Giannini as a small neighborhood bank in San Francisco’s North Beach district. The bank‘s philosophy was to provide banking services to those not traditionally served by local banks, like immigrants and new small business owners. The bank’s success was phenomenal. By the end of 1941, Bank of America boasted 495 branches and $2.1 billion in assets. During World War II, California’s population and economy mushroomed, boosting Bank of America’s resources to more than $5 billion, more than any commercial bank in the world.

Preparing ERMA’s sorter for transport while A.P. Giannini looks on. Photo by Aurora Tucker.

Given this growth, and the expected growth in bank accounts after the war, BofA Vice President S. Clark Beise went to business machine companies like IBM and Burroughs to see if they could design a solution. But these companies were not interested. Beise then approached the Stanford Research Institute (SRI) in Menlo Park, California, about applying automated methods to the problem, and they agreed to explore possible solutions. The proposed system was called ERMA, and it took some seven years of development to complete.

ERMA’s development, led by SRI’s director of engineering Thomas H. Morrin, was a complex undertaking involving multiple teams and disciplines, and the prototype never functioned fully reliably. After being awarded the contract by Bank of America to build ERMA banking systems, General Electric found the SRI ERMA obsolete in both system architecture and vacuum tube technology. Little, if any, of the prototype could be reused, and GE had no alternative but to renegotiate the contract. To execute the new contract, GE established a Computer Department and staffed it with experienced computer engineers and programmers. It took a major engineering effort to design and build a working, manufacturable product, but the result, a stored-program computer implemented with transistor circuitry, was an extremely reliable, efficient, and general-purpose design that launched GE into the computer business.

On the BofA side, Al Zipf headed the equipment research department and coordinated development of important subsystems, including MICR. At its core, ERMA was a combination of hardware and software that automated banking operations. The hardware included a magnetic ink character recognition (MICR) system that could read information encoded on checks, a high-speed printer and sorter for check processing, and a computer to process transactions.

ERMA’s high-speed sorter. Built by GE in partnership with National and Pitney-Bowes. Photo by Aurora Tucker.

A major challenge was developing the software to process transactions. The team had to design a system that could handle millions of transactions per day, while ensuring that account balances were accurate and up-to-date. They accomplished this by using a combination of processing methods. Batch processing was used to process transactions in large groups, while real-time processing was used for high-priority transactions, such as those involving large sums of money. Interestingly, future AI pioneer Joseph Weizenbaum was on the ERMA software team, just a few years prior to writing his famous (and controversial) ELIZA chatbot program.
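A toy sketch of that mixed strategy (modern Python, purely illustrative; not ERMA’s actual design) might post urgent transactions immediately while accumulating routine ones for a batch run:

```python
# Illustrative mix of batch and real-time posting (not ERMA's actual code).
from dataclasses import dataclass

@dataclass
class Txn:
    account: str
    amount: float        # positive = deposit, negative = withdrawal
    urgent: bool = False

balances = {"0012345": 500.0}
batch_queue = []

def submit(txn):
    if txn.urgent:
        balances[txn.account] += txn.amount   # real-time posting
    else:
        batch_queue.append(txn)               # deferred to the batch run

def run_batch():
    for txn in batch_queue:                   # e.g. the overnight run
        balances[txn.account] += txn.amount
    batch_queue.clear()

submit(Txn("0012345", -25.00))                 # routine check: batched
submit(Txn("0012345", 10000.00, urgent=True))  # large sum: posted at once
run_batch()
print(balances)                                # {'0012345': 10475.0}
```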

ERMA tape drives. Made by Ampex. Photo by Aurora Tucker.

It took GE two years under the renegotiated contract to manufacture the first production ERMA system, which was delivered to BofA in September 1959. Over the next two years, an additional 32 systems were built, tested, and installed, and by 1966, 12 regional ERMA centers served all but 21 of Bank of America’s 900 branches. Handling more than 750 million checks a year, the new ERMA was an immediate success. Within just a couple of years, it had shown the entire banking industry a new way of doing business.

ERMA was introduced to the world in September 1959 by General Electric spokesman and future American president Ronald Reagan.

By 1961, ERMA was handling 2.3 million accounts. The system was eventually able to read ten checks a second, with errors on the order of 1 per 100,000 checks. ERMA contained more than a million feet of wiring, 5 input consoles with MICR readers, 2 magnetic memory drums, the check sorter, a high-speed printer, a power control panel, a maintenance board, 24 racks holding 1,500 electrical packages and 500 relay packages, and 12 magnetic tape drives for 2,400-foot tape reels.

ERMA’s Impact

The development of a Common Machine Language [MICR] had more impact than any other bank operation in the 20th century.

— The Federal Reserve

ERMA sped up check processing by 80%, handling 33,000 accounts in the time it would take a human teller to process 250, and it did so without error. The system relied on a clever encoding scheme: by printing the three numbers required to process a check (the bank routing number, the customer’s account number, and the check number) in special magnetic ink at the bottom of every check, ERMA could instantly read the information it needed for the transaction. Magnetic ink was chosen for its resistance to smudging and wrinkling, and the scheme, called Magnetic Ink Character Recognition (MICR), was quickly adopted by the American Bankers Association as a standard. Soon all banks were using the MICR system invented for ERMA. Take a look at one of your checks today: the MICR characters are still there.

Modern check showing MICR characters at bottom.
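For a sense of how little per-check information the system needed, here is a hypothetical sketch of extracting the three MICR fields once the characters have been read (real E-13B lines use special transit and on-us delimiter symbols, and field layouts vary by bank):

```python
# Simplified MICR-line parsing (illustrative field layout and values only).
import re

def parse_micr(line):
    """Parse a 'T<routing>T <account>U <check#>' style line of MICR text."""
    m = re.match(r"T(\d{9})T\s+(\d+)U\s+(\d+)", line)
    if not m:
        raise ValueError("not a recognizable MICR line")
    routing, account, check_no = m.groups()
    return {"routing": routing, "account": account, "check_number": check_no}

print(parse_micr("T121000358T 001234567U 1042"))
# -> {'routing': '121000358', 'account': '001234567', 'check_number': '1042'}
```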

In the 1970s, BofA offered a credit card linked to customer checking accounts, another first, making ERMA a precursor to modern electronic banking and credit card systems as well. More generally, ERMA demonstrated the potential of electronic data processing for banking transactions, brought GE into the computer business, and was one of the earliest successful large-scale applications of computers to business anywhere. Given that electronic digital stored-program computers were less than five years old when the project started, ERMA was incredibly ambitious.

Today, electronic banking is a ubiquitous part of our lives—we can even use our smartphones to transfer money, pay bills, and check our account balances. When we take photos of our checks for mobile deposit, our smartphones read the MICR characters. ERMA’s legacy is still with us.

Main image: ERMA graphic from Bank of America publicity brochure, https://archive.computerhistory.org/resources/access/text/2023/04/102726943-05-01-acc.pdf

Learn more about ERMA

  1. A Survey of Digital Computers, Ballistic Research Laboratory, 1961, GE 100 ERMA, p. 263 ff.

  2. Interview with Dr. Robert Johnson, Manager, GE Computer Department, Annals of the History of Computing, vol. 12, no. 2, 1990, pp. 130–137.

  3. SRI Alumni Hall of Fame: https://srialumni.org/halloffame-archive.html

  4. Woodbury, David, O., Let ERMA Do it: The Full Story of Automation, New York: Harcourt, Brace, 1956.

  5. Fisher, A. W., and McKenney, J. L., “The development of the ERMA banking systems: lessons from history,” Annals of the History of Computing, vol. 15, no. 1, 1993, pp. 44-57.

  6. Head, R.V., “ERMA’s lost battalion,” Annals of the History of Computing, vol. 23, no. 3, pp. 64-72.

  7. 1955 SRI newsletter, Research for Industry: http://ed-thelen.org/comp-hist/SRI_Newsletter_Oct55-1.pdf

  8. GE Computer, GE 210 ERMA: https://www.youtube.com/watch?v=QfHMu75cfjg

  9. Kim, H. Hannah, ERMA’s Whiz Kids: https://increment.com/teams/ermas-whiz-kids/
