Dating in a Digital World https://computerhistory.org/blog/dating-in-a-digital-world/ Fri, 13 Feb 2026 16:38:45 +0000 https://computerhistory.org/?p=33366 Computer dating experts from different eras share their experiences coding machines to tackle the ancient challenge of human attraction.

The post Dating in a Digital World appeared first on CHM.


Anything that could create more love is a positive thing.

— Gary Kremen

Have you ever used a dating app? If so, you’re not alone: hundreds of millions of people use dating apps every day. Just in time for Valentine’s Day, CHM Live featured a panel of online dating experts from different eras: Jeffrey Tarr, cofounder of Operation Match; Gary Kremen, founder of Match.com; and online dating consultant Steve Dean. The panel was moderated by Hanna Kozlowska, author of an upcoming book on the topic.

Punch Card Connecting

“We were two males in college who were very unlucky at dating,” explains Jeff Tarr, cofounder of Operation Match. It was 1965, and he was a 19-year-old undergraduate at Harvard. With money he’d won on a quiz show and a knowledge of IBM machines gained in a summer job, he and a friend launched a new endeavor. Offered at elite colleges in New England, Operation Match was originally a questionnaire with 75 questions that hopeful students could submit to have an “all-knowing” computer match them with a compatible date for $3.

Advertised in newspapers that received 10% of the take, the service drew 7,800 responses. Tarr and his partner paid to have them punched onto cards and rented time on an IBM 1401 during cheap off-hours to process them. Participants received six “ideal dates,” and Operation Match was up and running. With an improved questionnaire and expansion across the country, the second version was wildly successful.

Jeffrey Tarr describes the popularity of Operation Match.

Operation Match worked on a simple basis, Tarr noted, nowhere near today’s dating apps: of the 150 questions, they effectively used only 10 in the computer sort. Still, there is plenty of anecdotal evidence that it worked; married couples still approach Tarr to thank him for connecting them.
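Tarr didn’t describe how the sort itself worked, but the core idea of matching people on shared questionnaire answers can be illustrated with a short, purely hypothetical sketch. Everything here is invented for illustration: the scoring rule, the names, and the data shape are assumptions, not Operation Match’s actual program.

```python
def match_score(a, b, keys):
    """Count the questions (by key) on which two participants gave the same answer."""
    return sum(a[k] == b[k] for k in keys)

def top_matches(people, keys, n=6):
    """For each participant, return the names of the n others with the most
    shared answers, echoing the six 'ideal dates' each subscriber received.
    Ties are broken arbitrarily; a real service would also filter by stated
    preferences rather than compare everyone against everyone."""
    results = {}
    for name, answers in people.items():
        scored = sorted(
            ((match_score(answers, other, keys), other_name)
             for other_name, other in people.items()
             if other_name != name),
            reverse=True,
        )
        results[name] = [other_name for _, other_name in scored[:n]]
    return results
```

The sketch only counts agreements on the handful of questions used in the sort; any real system would at minimum weight questions unevenly and constrain who can be matched with whom.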

Internet Introductions

When Gary Kremen was in his late 20s and a graduate student at Stanford’s business school, he was looking for dates through personal ads and 900 numbers without success. Good at computers and intrigued by how personal ads drove revenue for newspapers, he devised Match.com, the first and biggest online dating service.

The first incarnation, in 1993–1994, was based on email, since few people had web browsers at the time. Once the service moved to the web, Match’s servers would go into overdrive at lunchtime because many people could only get online at work. At that time, more women were entering the workforce, people were marrying later, and everyone seemed eager to find a match efficiently. When Kremen noticed a disconnect over profile photos (men wanted them, women didn’t), he decided to dig deeper.

Gary Kremen explains how talking to and hiring women improved Match.com.

Talking to women customers and bringing women into the company made Match.com better, adding safety features like blocking. Back then, the market was so huge that Kremen wasn’t worried about customer acquisition, even though every true success cost him two customers: a couple who committed to each other and dropped off the service.

App Attraction

Steve Dean says that today’s dating apps have made the cost of rejection very low, and users don’t usually leave an app permanently. After all, relationships often end. Lifetime user value is calculated not just over the initial period after a user joins an app but over the course of years. Often, a new user will burn out in the first couple of months because they’re using many apps at the same time, but after that wears off, they’re back on again. People clearly want to believe the apps work, but do they?

Dean believes dating apps have solved the problem of compatibility (delivering attractive matches), but that longer-term commitment, which probably has a certain element of randomness, is more difficult to deliver. Mobile devices and the ability to make profiles quickly have streamlined the industry. In 2012, Tinder collapsed everything down to four taps: a user could make a profile and get a match in seconds. That was unheard of at the time, when platforms required extensive questionnaires; an eHarmony profile took 45 minutes to complete, for example.

But lately, dating app fatigue seems to be setting in. Dean is clear on the cause—the monopolistic Match Group and their addictive products.

Steve Dean says Match Group is causing dating app fatigue.

Match Group owns Tinder, Hinge, Match, Plenty of Fish, and countless other dating apps. Dean treats himself as a guinea pig, joining all of them and more so that he can see what people are experiencing. That sometimes involves messages coming in during the middle of the night trying to get him to engage. He takes screenshots of those and puts them in a folder he calls “Notification Hell.”

AI is now playing a role in the industry. Dean notes that it is now possible to find yourself interacting only with an AI on a dating app, further eroding the human authenticity that people crave. On the positive side, some apps are adding AI that can help a user create a better profile or join in on a thread to help them flirt. As Gary points out, AI is like any other new tool or platform, and it can be used for good and bad. The tech is still in its infancy as far as helping to solve the business side of dating apps. Once it succeeds, we’ll see connection like we never have before.

So, hold on a little longer and you just might find that special someone!

Main image: From left to right, Hanna Kozlowska, Gary Kremen, Steve Dean.

Watch the Full Conversation

Algorithms of Love | CHM Live, February 4, 2026

 

SUPPORT CHM’S MISSION

Free events like these would not be possible without the generous support of people like you who care deeply about decoding technology for everyone. Please consider making a donation.

Read Me https://computerhistory.org/blog/read-me/ Wed, 11 Feb 2026 19:25:22 +0000 https://computerhistory.org/?p=33317 Historian and author Patrick McCray shares stories about writing his new book that explores a variety of books about computing.

The post Read Me appeared first on CHM.

Exploring the Books of the Computing Revolution

It may be hard to imagine in our current digital age, but printed books were once considered an innovative (and dangerous!) technology. And, for decades, books have helped people to understand the quickly evolving revolution in computing.

On January 20, 2026, author and UC Santa Barbara history professor W. Patrick McCray was on stage at CHM Live to share his experiences writing ReadMe: A Bookish History of Computing from Electronic Brains to Everything Machines (MIT Press, 2025). The fireside chat was moderated by David C. Brock, CHM’s Robert and Bette Finnigan Fellow.

Prologue

McCray began thinking about the project during the pandemic, when he realized that a book about books would let him read at home and avoid traveling to archives. Those circumstances made him consider how every book has its own origin story and history. He became just as interested in the authors and their writing processes, their relationships with their publishers, and the cultural context of the books he was writing about as he was in their content.

For ReadMe, McCray wanted to explore the ways technology was presented to the public. He limited his selection to nonfiction books that represented a range of different functions. Some, like Alvin Toffler’s Future Shock (1970), were bestsellers by any measure, some were technical textbooks, and others popularized computing for a general audience. He also had to choose his historical timeframe.

Patrick McCray explains the rationale for the books he selected.

Records really do matter, says McCray, noting that it’s critical that people donate their papers to a repository. He shared what he found in some of those archives.

Chapter 1: Giant Brains

McCray first discussed Giant Brains: Or, Machines That Think, published in 1949. The author, Edmund C. Berkeley, had worked in the booming insurance industry from the 1930s to ‘50s, a sector rich in data and among the major adopters of digital and electronic computers—called “giant brains” at the time.

Berkeley wanted to explain what computers were and how they worked and hired a writing coach to ensure that the average person could understand his book. Alarmed by post-WWII geopolitical tensions and the threat of nuclear war, Berkeley used the book to warn that computers were powerful tools that could be dangerous if not developed with robust ethics and morals.

Chapter 2: Power vs. Reason

MIT has a robust archival collection for Joseph Weizenbaum and his book Computer Power and Human Reason (1976), reported McCray. Weizenbaum, a computer scientist at MIT, was famous for writing ELIZA, an early-1960s chatbot program that functioned as a Rogerian psychotherapist. Many users felt that ELIZA was interacting with them on an emotional level. Horrified to see how people imbued the program with empathy, Weizenbaum became concerned about what computers and computer scientists should and shouldn’t do. His book was a critique of the profession, and perhaps of himself.

Patrick McCray describes an author’s personal dilemma.

Chapter 3: Manifesto

Ted Nelson’s 1974 Computer Lib/Dream Machines combined two books back-to-back, assembled and self-published by the author himself. It was a political manifesto that promoted computers as tools for personal liberation, freedom, and democracy. Nelson also predicted that people would someday have computers in their pockets and considered how they would interact with hypermedia and multimedia like images, sound, and text. In the 1987 reprint, after the PC revolution had not unfolded the way he’d hoped, Nelson lamented that ubiquitous computers could become instruments of oppression everywhere.

Chapter 4: Fonts and Text

Don Knuth, author of the influential The Art of Computer Programming series, was sitting in front of the CHM stage. Reading Knuth’s lectures on typography and typesetting, McCray recognized a shared appreciation for printing, fonts, books, and history. He felt that Knuth’s book on creating digital typesetting, The TeXbook (1986), had to be included in his own book somehow. Fortunately, many of the papers related to the book are at Stanford, and much of the collection has been digitized.

Chapter 5: Radical Textbook

Carver Mead and Lynn Conway’s Introduction to VLSI Systems, published in 1979, was a textbook, but also a catalyst for the formation of a community. McCray found robust archival materials for Mead at Caltech and in Conway’s online archive, which he used to explore how a textbook about how chips were designed actually had a radical agenda.

Patrick McCray explains how a textbook can be radical.

Later, in the ‘80s and ‘90s, Conway told an interviewer that she could see her enduring influence at engineering schools, where chip designs reflected the principles laid out in her and Mead’s textbook.

Chapter 6: Newspapers and Newsletters

In addition to books, McCray also included newspapers and newsletters as important media for communicating about the computing revolution. In her newsletter Release 1.0, business analyst Esther Dyson explained the new world of cyberspace to the average reader in the 1980s and ‘90s and became a regular talk show guest.

McCray included the San Jose Mercury News in his book in order to discuss the evolution of the modern tech journalist. Today, there are hundreds writing about some aspect of the tech industry, but the tech journalist was only beginning to emerge in the late ‘70s and early ‘80s.

People like Evelyn Richards, a Mercury business reporter with traditional journalism training, and freelancer Michael Malone, who had once written promotional copy for HP, began writing about Silicon Valley at a time when mainstream publications were still learning what the place was all about. They helped bring attention and understanding to it, covering both the good and the bad.

Chapter 7: Readers

The enthusiastic audience shared their favorite books about computers and computing, with the clear winner being Tracy Kidder’s The Soul of a New Machine, published in 1981. They also reported which of the books featured in McCray’s ReadMe they had read.

Epilogue

McCray says that one of the most interesting aspects of writing his book was seeing how ideas like “author,” “writer,” “publisher,” and “bookstore” changed over the period he covered. How students learn about books, purchase them, and consume them today is very different from past decades. And introducing AI into the mix is making things even more dynamic. What does it mean when computers become authors? That remains to be seen.

In the meantime, be assured that Patrick McCray wrote every word of his book himself.

Watch the Full Conversation

ReadMe | CHM Live, January 20, 2026

 

Pixar’s True Story https://computerhistory.org/blog/pixars-true-story/ Fri, 05 Dec 2025 19:32:02 +0000 https://computerhistory.org/?p=32951 The true story of Pixar's IPO and the Silicon Valley investment bankers who took a chance on Steve Jobs' passion project.

The post Pixar’s True Story appeared first on CHM.

In a world where we’ve become more cynical about technology, there’s something pure about Pixar that people trust, says former CFO Lawrence Levy. With 29 films over 30 years, the company has never compromised in striving to entertain families in a wholesome way. But in the early days, Pixar almost didn’t make it.

On stage for CHM Live on November 20, 2025, insiders told the behind-the-scenes story of how Silicon Valley investment bankers rallied around the struggling company next door. They wrangled Steve Jobs and engineered an improbable IPO that rescued Pixar and delivered the first feature-length, computer-animated film, the beloved Toy Story. The program was made possible by the generous support of J.P. Morgan.

Moderator Paul Noglows, formerly of Hambrecht & Quist, is cowriting a book with JP Mark, formerly of Robertson Stephens, on the two companies, which, along with Cowen, were the investment banks behind the Pixar IPO (initial public offering). He opened the discussion by asking Levy what it was like at Pixar in the spring of 1995, less than a year before the IPO.

The Setting

Levy had arrived at Pixar in late 1994 and quickly realized the company was doomed. It was facing three major challenges. The first was Steve Jobs, who was at a low point in his career and as difficult as ever. The second was that Pixar had no business, profits, or money. Despite their groundbreaking RenderMan graphics software, Jobs was covering payroll with personal checks. The third problem was that the company had signed a crippling contract with Disney.

Lawrence Levy unpacks Pixar’s contract with Disney.

Although Pixar was in dire straits, Steve Jobs had aspirations for it to go public, and he wanted Morgan Stanley and Goldman Sachs to underwrite the IPO. But the investment banking behemoths immediately saw that the company did not have “up and to the right” growth potential and declined to take it on.

So, with Jobs’ begrudging agreement, Levy took the deal to his “local heroes at Robertson Stephens.” Former President and CEO Michael McCaffery remembered that it was hard to figure out who on staff could check out a company that wasn’t like anything on their typical list of semiconductors, software, computing systems, and communications. They, too, realized the numbers weren’t there, but that didn’t scare them. And when they saw what Pixar was doing, they were excited.

Cristina Morgan, then the head of technology investment banking at Hambrecht & Quist, also went down to see Pixar. H&Q’s CEO, Dan Case, a board member of Steve Jobs’ NeXT, had told Jobs that the firm would play any role he wanted in an IPO. Like the Robertson Stephens team, Morgan was impressed with what she saw at the Pixar studio.

Cristina Morgan describes her first visit to Pixar.

The bankers knew they were taking a risk with Pixar, but they believed that Pixar’s first movie, Toy Story, was worth betting on.

The Plot

With the investment banks on board, the Pixar team had to finish Toy Story, a nearly impossible task from a technical standpoint. Everything in the movie was set in rooms inside a house because the computer graphics of the day could handle boxes. They didn’t know if they could even make an outdoor scene. And they had only a matter of months before the film’s scheduled Thanksgiving release to figure it out.

Then there was the challenge of deciding when the IPO should happen. If they did it after a successful movie release, they could be accused of hyping the stock. If they did it before, and the movie was a flop, they could be accused of duping investors. And, of course, if the movie flopped, Pixar was dead.

Plot Twist

They decided to move forward with the IPO, and Steve Jobs set out on a three-week “road show” to pitch the company to potential investors. Cristina Morgan and Mike McCaffery went along. Picky about everything—from the hotels to the food and every detail in between—Jobs created plenty of difficult moments.

In New York City, potential investors were invited to a rented theater on the Upper East Side and told to bring their families to view Toy Story. To sweeten the pot, they offered free candy. The events were designed to, in Mike McCaffery’s words, “create the sugar high of all time.” After New York, the road show was supposed to go on to Boston for a breakfast meeting with investors. But there was trouble.

Mike McCaffery tackles a snowstorm for Steve Jobs.

While the investment bankers knew that Pixar’s future depended on Toy Story’s opening box office success, Levy says that he and Jobs worried about the stock price that had been set. No one knew if investors would pay $22 per share, and if the offering wasn’t “oversubscribed,” the IPO could be deemed a failure.

And, of course, Jobs felt that Disney was not doing enough marketing and everything they did do was terrible. He was on the phone telling a company that had been releasing movies for 50 years how it should be done. The stress was getting to everyone.

Point of No Return

Toy Story opened on Wednesday, November 22, 1995, the night before Thanksgiving. It made $29 million over its opening weekend and went on to become the #1 film in the US. It was the first non-Disney animated film to be a blockbuster.

The IPO happened a week later, and shares closed at $39, up 77% from the $22 offering price. Jobs’s 80% stake was worth over $1 billion. Everyone involved could enjoy the success. Morgan recalled the incredible talent and the artistry of the revolutionary graphics and technology. She said that it was striking how different and compelling Toy Story was, and that without the movie’s magic there would have been no IPO.

Happy Ending

Although the stock price had dropped to $12 three weeks later, Pixar’s IPO had been a success as well as something of a miracle. Morgan credits the investors for their long-term vision in seeing the company’s potential. And Toy Story’s success allowed Levy to renegotiate the terrible Disney contract.

Twelve years after Levy arrived at a company with negative retained earnings of $50 million, Pixar was sold to Disney for $7.4 billion. He recalled “walks and talks” with Jobs to make decisions and appreciated that Jobs was always more interested in getting to the right answer than in being right. Jobs also returned to Apple in a remarkable comeback story that resulted in the revolutionary iPod and iPhone.

Watch the Full Conversation

To Infinity and Beyond | CHM Live, November 20, 2025

 

Taiwan Rising https://computerhistory.org/blog/taiwan-rising/ Thu, 13 Nov 2025 17:03:45 +0000 https://computerhistory.org/?p=32883 Honghong Tinn, author of Island Tinkerers, shares the fascinating history of how hobbyists and enthusiasts in Taiwan helped transform the country through innovative and creative computer use.

The post Taiwan Rising appeared first on CHM.

The Origins of a High-Tech Industry

In college, Honghong Tinn built her own computers using parts from electronics stores at her local shopping mall. While pursuing a PhD, she decided to research other Taiwanese “tinkerers,” uncovering how in the 1960s, ‘70s, and ‘80s they gained the skills and laid the groundwork for global tech giants like Acer, Asus, Quanta, and TSMC.

On November 4, 2025, Tinn, an assistant professor at the University of Minnesota, was on stage at CHM Live to share insights from her book Island Tinkerers: Innovation and Transformation in the Making of Taiwan’s Computing Industry. CHM Curator Hansen Hsu moderated the discussion.

Foundations

Tinn first provided a helpful summary of Taiwanese history. After World War II and the Communist takeover of mainland China, Nationalist leader Chiang Kai-shek moved to Taiwan with 1.2 million followers. A thousand of them were alumni of National Chiao-Tung University, an engineering school dubbed the “MIT of the Orient.” They worked together to lobby the government to reopen the university in Taiwan, arguing that electrical engineering was critical for both the economy and the military during the Cold War. They succeeded, and the university opened in 1958, training a new generation of engineers.

A United Nations technical aid program allowed National Chiao-Tung University to install the first two mainframe computers in Taiwan, an IBM 650 and an IBM 1620. Technicians, visiting professors, and other computer users had the opportunity to tinker with the mainframes. Soon, students in Taiwan began to build minicomputers and calculators from scratch. Many of the parts were not available, said Tinn, so they had to source recycled items, work with factories to custom-make some components, or import expensive parts. Future business leaders were among those students, including Barry Lam, the founder of Quanta Computer.

Honghong Tinn describes how tinkering inspired Barry Lam’s career.

Factories

Taiwan became an important components manufacturing center in the mid-1960s, when the government used tax breaks and inexpensive labor to encourage multinational corporations to set up factories. American, European, and Japanese companies like Wang Laboratories, Philips, General Instrument, and Philco-Ford signed on. Women factory workers soldered IC chips, assembled transistor radios and black-and-white TVs, and wove copper wire into magnetic core memory units, sometimes under a microscope.

Honghong Tinn explores the experience of women factory workers in Taiwan.

In 1972, just US $200 was enough for a tinkerer to buy a microprocessor and build a calculator, creating many entrepreneurial opportunities, and by 1978, 20% of calculators on the global market were made by Taiwanese companies. Those companies often transitioned to building computers in the 1980s. Entrepreneurs could choose to build one-of-a-kind computers and find customers, create an Apple- or IBM-compatible computer, or make a counterfeit knockoff.

Companies that built compatible machines for the export market had to make sure they weren’t committing copyright infringement, or they risked being labeled counterfeiters. Apple, in particular, aggressively pushed back against compatible computers with lawsuits claiming unfair foreign trade practices, working with US Customs and Congress to bolster its position. Tinn related how Taiwanese products and entrepreneurs were often stereotyped as counterfeiters.

Honghong Tinn unpacks counterfeiting and stereotypes.

Tinn used CHM oral histories to explore the computer company Multitech (later renamed Acer), whose founder, Stan Shih, worked with engineers to ensure that his compatible computers did not copy Apple. Because Multitech was a franchisee for US companies like Texas Instruments, Zilog, and Intel, it was important that Shih not be seen as a counterfeiter. In fact, his computers had a unique feature missing from US machines: they could display Chinese characters.

Unlike Apple, IBM allowed compatible computers until 1987, when it began to charge royalties for patents and licensing, with each company, including Compaq and Acer, negotiating its own rates. In the 1980s, those two companies, one American and the other Taiwanese, were the first to produce IBM PC compatible computers using Intel’s new 32-bit 386 chip amid global competition. Doing so was a great technical accomplishment, and the companies also demonstrated their strong manufacturing capabilities and even marketing skills.

By around 2011, Taiwan had 90% of the global market share for laptops. Desktop market share was also growing, and if components made in Taiwan were counted, the numbers would be much higher. When a huge earthquake rocked Taiwan in 1999, CNN interviewed Steve Jobs, who noted that the whole industry gets components from Taiwan and implied that it could cause significant supply chain delays for people building computers.

Foundries

Tinn believes that tinkering prepared Taiwanese entrepreneurs and skilled workers to advance computing technologies. For example, in addition to engineers, companies developed strong quality control and equipment maintenance roles and processes. In fact, an entire ecosystem of universities, factories, startups, and hobbyists was interested in engaging with hardware and tinkering with technology.

This entrepreneurial ecosystem was evident in the case of global giant TSMC, founded by Morris Chang, who combined governmental and non-governmental support to create a company dedicated to fabricating chips for designers in a “foundry” model.

Honghong Tinn explores the origins of TSMC.

Founded in 1987, TSMC grew along with ASML, a Dutch spinoff of Philips that supplied lithography machines for TSMC’s integrated circuit (IC) wafer manufacturing. By 1995–96, 60% of TSMC’s revenue came from IC design houses, and Nvidia began to work with TSMC around 1998. In 2014, the company reached a turning point when Apple became a client and TSMC began making chips for iPhones. Looking back on his long career, Morris Chang was most proud of his contribution to advancing the evolution of smartphones.

Gone were the days when Taiwanese tinkerers were seen as counterfeiters.

Watch the Full Conversation

Taiwan Rising | CHM Live, November 4, 2025

Is Today’s AI Boom Different? https://computerhistory.org/blog/is-todays-ai-boom-different/ Fri, 17 Oct 2025 15:02:25 +0000 https://computerhistory.org/?p=32730 AI company founders from three different eras of artificial intelligence booms (and busts) share their experiences and insights into the future of AI technology.

The post Is Today’s AI Boom Different? appeared first on CHM.


What’s important is that the public begins to develop some AI literacy.

— Daniela Rus, Cofounder, Liquid AI

We’re living through a boom in artificial intelligence. But many people may not realize that there have been AI booms, and busts, before. Is it different this time? CHM sought to find out by inviting three AI pioneers, each of whom has navigated a distinct era of AI innovation, for a discussion on October 7, 2025. The CHM Live event, “This Time It’s Different: AI Startups Across Three Generations,” was made possible by the generous support of Mark and Mary Stevens.

Marc Weber, a CHM curator and director of the Internet History Program, moderated the discussion. He noted that past AI busts made the term itself toxic, even as startups and companies continued to use the technology, and only with more recent hype has AI become popular again.

AI Company Founders

Jerry Kaplan (center) makes a point. Marc Weber is at left and Adam Cheyer at right.

Jerry Kaplan, cofounder of Teknowledge, an AI company that helped ignite the 1980s expert systems boom, was one of the first people to earn a PhD in artificial intelligence. The company he founded took the expert systems work he and others were doing at Stanford and sold it to corporations to solve problems. But, he notes, the business model was based on incorrect assumptions, just like the AI industry today.

Representing the next generation, Adam Cheyer, cofounder of Siri, explained that they didn’t pitch their company as AI. They called Siri a “do engine” to distinguish it from a search engine: it could combine knowledge and action to serve as a virtual assistant. With its “ecosystem” of business partners, for example, Siri could call you a cab or buy movie tickets. Cheyer credits Steve Jobs, who bought Siri for Apple in 2010 and publicly called it an AI company, with reinvigorating the field.

Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is also the cofounder of the AI startup Liquid AI. The company’s AI model operates on physical laws rather than the statistical basis of LLMs (large language models). It’s inspired by nature: a worm whose relatively small number of neurons and synapses is remarkably powerful and energy efficient. Where a classical AI model needs about 100,000 neurons to keep an autonomous car in a lane, for example, Liquid AI needs only 19. The model can also explain how the vehicle makes decisions and gains skills, allowing for generalization and application to different environments, which classical AI transformers are not good at. Further, the energy savings are significant: the model is 1,000 times more efficient. It could, Rus said, democratize AI.

Daniela Rus explains how a new model can open AI tools to everyone. 

AI Problem Solvers

Jerry Kaplan believes that expert systems technology was not good enough to do the things it promised to do and that’s why it collapsed. He sees the ambitions of his era’s academics in AI as part of a continuum of people’s fascination with using technology to enhance or serve humanity, from Frankenstein to today’s dream of artificial general intelligence.

Cheyer was looking at the problem of how to make language practical. He says that Siri shocked the world when it first came out.

Adam Cheyer describes how Siri had to make sense of language.

One of the big innovations with Siri was doing language well in a conversational way and executing on actions—what we would call agentic AI today.

Boom Effects

Both Cheyer and Kaplan believe that the timing of a tech innovation is what makes or breaks it. Rus argued that technological entrepreneurship, bringing new ideas and new capabilities into the world, matters more than timing. But when she recalled how hard it often is to get the rest of the world to back new ideas, she came around to agreeing with her co-panelists that timing is critical.

So, is today’s AI boom different from previous booms? Kaplan says, “It’s different just like last time.” He believes this bubble will burst, with dire consequences for society as a whole.

Jerry Kaplan predicts the AI bust.

What effect will today’s AI revolution have on jobs? Kaplan believes it will have the same effects as other technological advances that substitute capital for labor—it will change the nature of work. Some will lose their jobs, but new kinds of jobs will arise. Rus noted that her MIT colleague, economist David Autor, has done research showing that more than 60% of what people do today did not exist before 1948. No one foretold the rise of the service industry. She believes there will be a flurry of economic activity around AI that will draw people with all kinds of skills and talents into the industry. It’s critical, she says, that the public begin to develop some AI literacy and empower themselves by understanding which aspects of the technology will affect their roles.

Adam Cheyer describes the power of AI.

Adam Cheyer notes that AI is already solving problems, including 50-year-old math problems and exponentially advancing our understanding of protein folding, critical for drug discovery. He acknowledges that AI will also cause problems, but overall he’s optimistic it will solve a broad range of challenges. It is, he believes, the most powerful tool humanity has ever created.

For young people aspiring to found new AI companies, Cheyer and Kaplan urge caution. Don’t do it, they say, just because you want to be an entrepreneur or to make money. Do it because you’re passionate about what you’re creating. Hopefully, aspiring AI founders will take Daniela Rus’ words to heart: recognize the extraordinary opportunity AI provides to enrich our lives and look after humanity and the planet, and create responsible AI tools that serve the greater good.

Watch the Full Conversation

This Time It’s Different | CHM Live, October 7, 2025

Main Image: From left to right, Marc Weber, Jerry Kaplan, Adam Cheyer, Daniela Rus.

 

SUPPORT CHM’S MISSION

Free events like these would not be possible without the generous support of people like you who care deeply about decoding technology for everyone. Please consider making a donation.


The post Is Today’s AI Boom Different? appeared first on CHM.

]]>
Cold War Computing https://computerhistory.org/blog/cold-war-computing/ Mon, 29 Sep 2025 18:09:45 +0000 https://computerhistory.org/?p=32638 Historian and author Victor Petrov explores the rise of Bulgaria's powerhouse computing industry during the Cold War and how it evolved.

The post Cold War Computing appeared first on CHM.

]]>
Did you know that Bulgaria became an electronics powerhouse during the Cold War? In an illuminating lecture for CHM Live on September 18, 2025, historian Victor Petrov shared insights from his new book, Balkan Cyberia, a CHM Book Prize winner.

Petrov joked that few people can find Bulgaria on a map. But in 1981, the small country in Southeastern Europe launched itself into the world of computing with a satellite. Commemorating the 1,300th anniversary of the Bulgarian state, the satellite sent a nationalist message: the country was one of the oldest in Europe, and its Communist leadership presided over a highly technological society.

Apples to Electronics

In 1944, when the Communists took over, Bulgaria was an agricultural country with few cities and almost no heavy industry. But, by the 1970s and ‘80s, 47% of electronics exports from the Eastern Bloc were Bulgarian, and the industry employed about 13% of the country’s workforce—215,000 people out of a population of under 9 million.

By the late 1950s, Bulgaria’s fast-paced industrialization triggered a debt crisis, and the ruling Communist Party realized it needed a cash cow. For a country without many natural resources, an industry that needed only capital and labor was ideal. Electrical engineer Ivan Popov, who was pursuing a PhD in East Germany, convinced the Party leadership that mass producing computers was the answer.

The first Bulgarian computer was the Vitosha. Built in 1962 with vacuum tubes and lamps, development of the machine was rushed in order to present it in an exhibit in Moscow. It apparently needed so much power that a Russian engineer had to be bribed with brandy to steal power from the Indian delegation’s pavilion. In any case, the Vitosha was a key step forward.

Victor Petrov explains early mass production of electronics in Bulgaria.

The success of ELKA calculators wasn’t enough for the ruling party, which wanted billions in profits. Through his connections with the head of Fujitsu in Japan, Ivan Popov secured the first license to mass produce computers in the Eastern Bloc—a functional copy of the Fujitsu FACOM, called the ZIT in Bulgaria. From 1965 to 1969, hundreds of engineers trained in Japan to produce 20 of the machines. When they returned home, they brought with them ideas about the Japanese style of management and work that differed sharply from socialism.

Pirates and Pravetz

By the 1970s, the computing industry was bringing in billions. Bloc countries cooperated to build IBM 360-compatible computers, and Bulgaria produced the processors. The conglomerate also reverse-engineered minicomputers when they came on the scene, and then the personal computer, all with components from Eastern Bloc countries. Bulgaria’s Pravetz PC was introduced in 1979 and mass produced and exported. At a cost of more than a year’s salary, it wasn’t accessible to the average Bulgarian.

Unique machines were made in addition to IBM copies, like the MIK-16 that operated on the Russian Mir space station. For Bulgaria, the most lucrative products were memory devices, bought in large quantities and at high prices by the Soviet military, among other customers. Electronic secrets became the focus of spy games during the Cold War.

Victor Petrov shares stories of tech espionage.

Exports and Impacts

Exploring the computing industry outside of the Cold War framework, Petrov found that Bulgaria exported to 54 countries and that its biggest market in Asia was India. That relationship, he believes, became a conduit for capitalist thinking.

Victor Petrov explains how Indian customers required new thinking.

Petrov also examined how computerization impacted Bulgarian society through the prism of socialism. He found that everything bad that happened in factories and on farms was blamed on workers rather than on the automated machines that had been introduced by the 1980s. Workers at the time experienced anxiety and physical strain similar to that of people in the industry today. There is evidence that they sabotaged machines.

Engineering and computing also became intertwined with creative pursuits and had cultural and gender dimensions. For example, the vast majority of factory workers were women, who, far from receiving the promised benefits of socialism, like three years off to care for a child, were instead expected to breastfeed while programming.

Bulgarian children had a chance to take computer classes, with the home-grown Pravetz computers provided to schools and computer clubs. The last socialist generation, steeped in sci-fi as well as computing, made a memorable contribution to tech history—in the early ‘90s, the vast majority of the world’s computer viruses came from Bulgaria.

But today, the computing industry that Petrov studied no longer exists.

Watch the Full Conversation

Cold War Computing | CHM Live, September 18, 2025

 



]]>
Decoding Ancient History With AI https://computerhistory.org/blog/decoding-ancient-history-with-ai/ Mon, 16 Jun 2025 21:20:00 +0000 https://computerhistory.org/?p=32357 Experts in ancient history, computer science, and technology team up to use artificial intelligence to virtually unroll and decipher papyrus scrolls burned in the same volcanic eruption that destroyed Pompeii.

The post Decoding Ancient History With AI appeared first on CHM.

]]>
The Herculaneum Scrolls

Innovations in artificial intelligence are not only changing the present and supercharging a whole new future, they’re also revolutionizing the study of history. On stage at CHM Live, an expert panel shared groundbreaking work deciphering the Herculaneum scrolls, fragile ancient Greek texts that were burned in the same volcanic eruption that destroyed nearby Pompeii and were thought to be lost forever.

A burnt scroll, still rolled up, from Herculaneum.

CHM Senior Producer and Manager of Programming Russell Ihrig moderated the fascinating discussion with investor and entrepreneur Nat Friedman, who co-launched the Vesuvius Challenge, Federica Nicolardi, assistant professor of papyrology at the University of Naples Federico II, and Brent Seales, the Stanley and Karen Pigman Chair of Heritage Science and professor of computer science at the University of Kentucky. The program was made possible by the generous support of the Patrick J. McGovern Foundation.

Preserved By Destruction

In 79 CE, Herculaneum was a vibrant Roman city, says Federica Nicolardi. It had shops and tavernas, homes, public buildings, and even ancient fast food. The populace was used to being shaken by frequent earthquakes. Everything changed on August 24, when Mount Vesuvius erupted. The devastating effects of the volcano were different in the neighboring cities of Pompeii and Herculaneum—and key to the survival of the scrolls.

Federica Nicolardi describes the eruption of Mount Vesuvius. 

Buried in 60 feet of thick mud, the city was lost for 1,700 years, until Italian farmers digging wells began to find ancient statues and marbles. An official excavation began in 1738, but it was not conducted in the top-down method used by modern archaeologists. Instead, exploration was done by tunneling, which was difficult and dangerous, as the tunnels could collapse at any time.

Unrolling the Scrolls

When the Herculaneum scrolls were discovered in the ruins of a villa, it wasn’t clear what the compact, irregular, black shapes were. Then, when pieces began to come off and ink became visible, people tried to open them. Seales noted that over a 50-year span, various methods were used to attempt to unroll the scrolls, including with a specialized machine. The results varied widely.

Friedman tried to replicate the process at home with papyrus he bought on Amazon and cooked in a Dutch oven. The result was a flakey, light, very delicate object. Trying to cut it with a knife, soaking it in water, and pouring mercury into it—all methods tried in the 1700s—did not work very well and gave him an appreciation for the challenge.

A scroll that was unrolled physically.

The advent of photography starting in the 1860s helped make the contrast of the writing—essentially black on black—more readable, but since the early 2000s, there’s been a moratorium on further attempts to unroll the scrolls to prevent damage.

Seales had the idea to virtually unwrap those scrolls that hadn’t been opened at all by using a scanner. His team developed software to trace the surface of the scrolls and reconstruct where the glued sheets overlapped. Then they had to find the ink. They’d had a little success with machine learning computing models when Seales received a cold call from Friedman. The two hit it off, Friedman suggested they “open source” the ink challenge, and the Vesuvius Challenge was born.

Nat Friedman describes the Vesuvius Challenge.

The team hired a dozen people to look at cross-section X-rays of the papyrus and follow the spiral so they could provide flattened segments to the community, making the challenge of finding the ink easier. While some contestants ran machine learning models, one took an unusual approach—he just looked at the X-rays for hours until he began to identify patterns of cracks and realized they might be dried ink. That revelation was used by another contestant to train an ink detection model. Seales explains how critical AI has been to the project.

Brent Seales explains AI’s role in deciphering the scrolls.

Making History

The first word deciphered was “porphyra,” which means purple in Greek. Nicolardi notes that it’s an interesting word and hard to understand without context. Soon, however, pieces of five or six columns were deciphered and progress was rapid. Today, there are around 15 columns that are readable out of 160, and two thirds of the upper parts of those columns are decipherable. The Greek texts are likely from a specialized part of the Italian villa’s library and relate to Epicurean philosophy. Occurrences of the words “music” and “pleasure” are key.

A scroll that has been digitally “unscrolled.”

There are hundreds of scrolls still to be examined, and many more are likely buried in the vast unexcavated areas of the Herculaneum site. The chance to restore entire works of ancient Greek and Latin texts rather than the fragments scholars usually find is a compelling challenge. And it’s exciting to imagine how the tech of the future is bringing the past into the present and could help to solve the mysteries that remain. In fact, it’s enough to make anyone “scroll obsessed.”

Main image: From left to right, Russell Ihrig, Federica Nicolardi, Brent Seales, Nat Friedman.

 

Watch the Full Conversation

AI Decodes Ancient History | CHM Live, June 10, 2025



]]>
Encoding Language https://computerhistory.org/blog/encoding-language/ Fri, 23 May 2025 19:37:43 +0000 https://computerhistory.org/?p=32278 What do you do if your language is not available on devices you want to use to communicate, like computers and smart phones? Experts discuss how Unicode works to make our digital world inclusive.

The post Encoding Language appeared first on CHM.

]]>
How can we ensure that every language—and the communities that speak it—can fully participate in the digital world? That was the question explored at the CHM Live event Character Building: Bridging Code and Culture through Unicode. With over 7,000 modern languages in use today, it’s a difficult task, but the Unicode Consortium, a nonprofit organization that establishes and maintains standards for representing written language, is trying.

An expert panel decoded how Unicode works for the audience and included Roy Boney, Jr., Cherokee language revitalization manager at Cherokee Film, Mark Davis, cofounder and CTO of the Unicode Consortium, and Anushah Hossain, research director of the Script Encoding Initiative. The moderator was Teresa Marshall, vice president of Globalization & Localization at Salesforce.

Equality

In a video clip from a recent CHM oral history interview with Unicode cofounders, Lee Collins and Mark Davis made the point that Unicode aims to enable people everywhere to communicate digitally in their own language. That means Unicode is always evolving. For example, new Chinese ideographs are often added, and additional levels of support are provided, like being able to read or type in a particular language.

Hossain added that it’s hard to overstate how important it is that Unicode found a common way to treat the wide variety of writing systems we have in the world.

Anushah Hossain explains the difference between language and script.

Globally, there are close to 350 writing systems, and 170 are currently in Unicode. The CLDR (Common Locale Data Repository) project at Unicode deals with language-specific issues, with the goal of customizing everything so that the specifics of a language work, like how dates, times, numbers, and currency formats are portrayed in a particular location. Unicode also produces code libraries that can be taken into any product and used so that programmers don’t have to manage all the data behind character properties.

Inclusivity

The first step in getting a new script into Unicode is to submit a proposal to a subcommittee called the Script Encoding Working Group, explained Hossain. The 15 or so experts on the committee have linguistic backgrounds or a deep interest in language as well as a programming background. They meet once a month to review all the proposals for new characters or scripts and discuss how the script works, if the proposal adequately explains all the characters, and the reach and legitimacy of the script.

Successful proposals often go back and forth with the authors two or three times before being approved, and then they advance to the Unicode Technical Committee that meets once a quarter. The ISO (International Organization for Standardization) also has a specific working group dedicated to a universal character code, and they review the same proposals. It’s a complex, multistakeholder process. Davis added that Unicode also hopes to make it easier for individuals and organizations to contribute to fleshing out their own language in Unicode.

Mark Davis explains how Unicode tries to be inclusive.

Like most indigenous languages in the US, Cherokee is endangered, says Roy Boney, so for the last 40 years the tribe has been trying to preserve and revitalize it. But getting people to shift away from the fonts they had created to the Unicode script has not been easy. There’s been a lot of education in the community about what the tools are and what they can be used for. Originally, they needed a font, a keyboard, and operating systems that would support the language, and then they began working with companies in Silicon Valley to make sure the language was supported on all their products.

Sometimes the process of adding a language to Unicode can become controversial if different groups disagree on what the script actually looks like, noted Hossain. Old Hungarian, for instance, went through 13 proposals because social and political tensions around a few characters stalled the process. Boney described how a team that included scholars, font designers, historians of the language, and community members worked together to research and craft a proposal that still required revisions.

Davis noted that occasionally characters are fast-tracked, like when a Japanese emperor died, requiring a new era character to be used in dates. Chinese ideographs are the largest part of Unicode, outnumbering all other characters. Tranches come out regularly and involve very large data sets. It’s an involved process because they have to verify that an ideograph is actually new and not a variation on an existing one.

Access

Many of us may take for granted that our language is supported on the devices we use every day—like our computer or smart phone. When it’s not, says Boney, you realize very quickly how limited you are in what you can do.

Roy Boney describes the impact of Unicode Cherokee.

Now that it’s common for people in the Cherokee community to have access to their language on digital devices, more and more people are making their own content. Access gives you confidence to do things in your language and pursue your dreams, says Boney, and he’s thankful for Unicode.

While most people’s languages are in Unicode, and it has fairly full support for about 100, many languages don’t yet have enough support to reach the level Cherokee enjoys. And there are still a lot of historical works, like those in hieroglyphics, that cannot yet be represented digitally, notes Davis. And, as people find more things they want to do on computers, Unicode has to adapt to meet product requirements.

Hossain says it’s important for Unicode to maintain what’s already there and respond to reported bugs. Arabic is in Unicode, but it doesn’t work well everywhere, and there’s a lot still to do to make it fully functional for people. If there is even a little friction, it’s easy for people to just switch to Latin script or come up with a hack. That’s a problem, because the text won’t be processed properly by search engines or anything else on the internet.

These are big challenges for a small organization that has more work than people. While everyone benefits from Unicode’s vital work, it’s easy to use their tools without contributing to help it survive. But perhaps telling real-life stories about the positive impacts and the challenges of language inclusivity can help inspire and motivate stakeholders to continue to invest in Unicode and our collective digital future.

Main image: From left to right, Teresa Marshall, Roy Boney, Jr., Mark Davis, Anushah Hossain.

Watch the Full Conversation

Character Building | CHM Live, May 13, 2025

 



]]>