retrotech

Gordon Bell, an architect of our digital age, dies at age 89

the great memory register in the sky —

Bell architected DEC’s VAX minicomputers, championed computer history, mentored at Microsoft.

A photo of Gordon Bell speaking at the annual PC Forum in Palm Springs, California, March 1989.

Computer pioneer Gordon Bell, who as an early employee of Digital Equipment Corporation (DEC) played a key role in the development of several influential minicomputer systems and also co-founded the first major computer museum, passed away on Friday, according to Bell Labs veteran John Mashey. Mashey announced Bell’s passing in a social media post on Tuesday morning.

“I am very sad to report [the] death May 17 at age 89 of Gordon Bell, famous computer pioneer, a founder of Computer Museum in Boston, and a force behind the @ComputerHistory here in Silicon Valley, and good friend since the 1980s,” wrote Mashey in his announcement. “He succumbed to aspiration pneumonia in Coronado, CA.”

Bell was a pivotal figure in the history of computing and a notable champion of tech history: with his wife, Gwen Bell, he founded Boston’s Computer Museum in 1979, which later became the heart of the Computer History Museum in Mountain View. He was also the namesake of the ACM’s prestigious Gordon Bell Prize, created to spur innovations in parallel processing.

Born in 1934 in Kirksville, Missouri, Gordon Bell earned degrees in electrical engineering from MIT before being recruited in 1960 by DEC founders Ken Olsen and Harlan Anderson. As the second computer engineer hired at DEC, Bell worked on various components for the PDP-1 system, including floating-point subroutines, tape controllers, and a drum controller.

Bell also invented the first UART (Universal Asynchronous Receiver-Transmitter) for serial communication during his time at DEC. He went on to architect several influential DEC systems, including the PDP-4 and PDP-6. In the 1970s, he played a key role in overseeing the aforementioned VAX minicomputer line as the engineering manager, with Bill Strecker serving as the primary architect for the VAX architecture.

After retiring from DEC in 1983, Bell remained active as an entrepreneur, policy adviser, and researcher. He co-founded Encore Computer and helped establish the NSF’s Computing and Information Science and Engineering Directorate.

In 1995, Bell joined Microsoft Research where he studied telepresence technologies and served as the subject of the MyLifeBits life-logging project. The initiative aimed to realize Vannevar Bush’s vision of a system that could store all the documents, photos, and audio a person experienced in their lifetime.

Bell was elected to the National Academy of Engineering, the National Academy of Sciences, and the American Academy of Arts and Sciences. He received the National Medal of Technology from President George H.W. Bush in 1991 and the IEEE’s John von Neumann Medal in 1992.

“He was immeasurably helpful”

As news of Bell’s passing spread on social media Tuesday, industry veterans began sharing their memories and condolences. Former Microsoft CTO Ray Ozzie wrote, “I can’t adequately describe how much I loved Gordon and respected what he did for the industry. As a kid I first ran into him at Digital (I was then at DG) when he and Dave were working on VAX. So brilliant, so calm, so very upbeat and optimistic about what the future might hold.”

Ozzie also recalled Bell’s role as a helpful mentor. “The number of times Gordon and I met while at Microsoft – acting as a sounding board, helping me through challenges I was facing – is uncountable,” he wrote.

Former Windows VP Steven Sinofsky also paid tribute to Bell on X, writing, “He was immeasurably helpful at Microsoft where he was a founding advisor and later full time leader in Microsoft Research. He advised and supported countless researchers, projects, and product teams. He was always supportive and insightful beyond words. He never hesitated to provide insights and a few sparks at so many of the offsites that were so important to the evolution of Microsoft.”

“His memory is a blessing to so many,” wrote Sinofsky in his tweet memorializing Bell. “His impact on all of us in technology will be felt for generations. May he rest in peace.”

Here’s your chance to own a decommissioned US government supercomputer

But can it run Crysis? —

145,152-core Cheyenne supercomputer was 20th most powerful in the world in 2016.

A photo of the Cheyenne supercomputer, which is now up for auction.

On Tuesday, the US General Services Administration began an auction for the decommissioned Cheyenne supercomputer, located in Cheyenne, Wyoming. The 5.34-petaflop supercomputer ranked as the 20th most powerful in the world at the time of its installation in 2016. Bidding started at $2,500, but its price is currently $27,643, with the reserve not yet met.

The supercomputer, which officially operated between January 12, 2017, and December 31, 2023, at the NCAR-Wyoming Supercomputing Center, was a powerful (and once considered energy-efficient) system that significantly advanced atmospheric and Earth system sciences research.

“In its lifetime, Cheyenne delivered over 7 billion core-hours, served over 4,400 users, and supported nearly 1,300 NSF awards,” writes the University Corporation for Atmospheric Research (UCAR) on its official Cheyenne information page. “It played a key role in education, supporting more than 80 university courses and training events. Nearly 1,000 projects were awarded for early-career graduate students and postdocs. Perhaps most tellingly, Cheyenne-powered research generated over 4,500 peer-review publications, dissertations and theses, and other works.”

UCAR says that Cheyenne was originally slated to be replaced after five years, but the COVID-19 pandemic severely disrupted supply chains, and the system clocked two extra years in its tour of duty. The auction page says that Cheyenne recently experienced maintenance limitations due to faulty quick disconnects in its cooling system. As a result, approximately 1 percent of the compute nodes have failed, primarily due to ECC errors in the DIMMs. Given the expense and downtime associated with repairs, the decision was made to auction off the components.

  • A photo gallery of the Cheyenne supercomputer up for auction.

With a peak performance of 5,340 teraflops (4,788 Linpack teraflops), this SGI ICE XA system was capable of performing over 3 billion calculations per second for every watt of energy consumed, making it three times more energy-efficient than its predecessor, Yellowstone. The system featured 4,032 dual-socket nodes, each with two 18-core, 2.3-GHz Intel Xeon E5-2697v4 processors, for a total of 145,152 CPU cores. It also included 313 terabytes of memory and 40 petabytes of storage. The entire system in operation consumed about 1.7 megawatts of power.
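
For a quick sense of how those figures fit together, here is a rough back-of-the-envelope check in Python. It’s only a sketch based on the numbers quoted above, reading “calculations per second for every watt” loosely as flops per watt.

```python
# Back-of-the-envelope check of the Cheyenne figures quoted above
# (a sketch from the article's numbers, not an official benchmark).
nodes = 4032                 # dual-socket SGI ICE XA nodes
sockets_per_node = 2
cores_per_socket = 18        # 18-core Intel Xeon E5-2697v4 per socket

total_cores = nodes * sockets_per_node * cores_per_socket
print(total_cores)           # 145152 CPU cores

peak_flops = 5_340e12        # 5,340 teraflops peak performance
power_watts = 1.7e6          # ~1.7 megawatts in operation

print(peak_flops / power_watts)  # ~3.1e9, i.e. "over 3 billion" flops per watt
```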

Just to compare, the world’s top-rated supercomputer at the moment—Frontier at Oak Ridge National Labs in Tennessee—features a theoretical peak performance of 1,679.82 petaflops, includes 8,699,904 CPU cores, and uses 22.7 megawatts of power.

The GSA notes that potential buyers of Cheyenne should be aware that professional movers with appropriate equipment will be required to handle the heavy racks and components. The auction includes seven E-Cell pairs (14 total), each with a cooling distribution unit (CDU). Each E-Cell weighs approximately 1,500 lbs. Additionally, the auction features two air-cooled Cheyenne Management Racks, each weighing 2,500 lbs, that contain servers, switches, and power units.

As of this writing, 12 potential buyers have bid on this computing monster so far. If you’re interested in bidding, the auction closes on May 5 at 6:11 pm Central Time. But don’t get too excited by photos of the extensive cabling: As the auction site notes, “fiber optic and CAT5/6 cabling are excluded from the resale package.”

After 48 years, Zilog is killing the classic standalone Z80 microprocessor chip

rest in silicon —

Z80 powered game consoles, ZX Spectrum, Pac-Man, and a 1970s PC standard based on CP/M.

A cropped portion of a ca. 1980 ad for the Microsoft Z80 SoftCard, which allowed Apple II users to run the CP/M operating system.

Microsoft

Last week, chip manufacturer Zilog announced that after 48 years on the market, its line of standalone DIP (dual inline package) Z80 CPUs is coming to an end, ceasing sales on June 14, 2024. The 8-bit Z80 architecture debuted in 1976 and powered a small-business-PC revolution in conjunction with CP/M, also serving as the heart of the Nintendo Game Boy, Sinclair ZX Spectrum, the Radio Shack TRS-80, the Pac-Man arcade game, and the TI-83 graphing calculator in various forms.

In a letter to customers dated April 15, 2024, Zilog wrote, “Please be advised that our Wafer Foundry Manufacturer will be discontinuing support for the Z80 product and other product lines. Refer to the attached list of the Z84C00 Z80 products affected.”

Designers typically use the Z84C00 chips because of familiarity with the Z80 architecture or to allow legacy system upgrades without needing significant system redesigns. And while many other embedded chip architectures have superseded these Z80 chips in speed, processing power, and capability, they remained go-to solutions for decades in products that didn’t need any extra horsepower.

Zilog will continue to manufacture the eZ80 microcontroller family, which was introduced in 2001 as a faster version of the Z80 series and comes in different physical package configurations (pin layouts).

Powering a microcomputer revolution

The 8-bit Z80 microprocessor was designed in 1974 by Federico Faggin as a binary-compatible, improved version of the Intel 8080 with a higher clock speed, a built-in DRAM refresh controller, and an extended instruction set. It was extensively used in desktop computers of the late 1970s and early 1980s, arcade video game machines, and embedded systems, and it became a cornerstone of several gaming consoles, like the Sega Master System.

The Tandy Radio Shack TRS-80 (1977), which used the Zilog Z80.

SSPL/Getty Images

During the mid-late 1970s, the Z80 became a popular CPU for S-100 bus machines, which were early personal computers with a 100-pin modular bus system that allowed swapping cards to build systems based on parts from various manufacturers. Digital Research targeted the Z80 as a key platform for its CP/M operating system, and the association between Z80 and CP/M stuck, powering dozens of small business computers until the mid-1980s, when IBM PC clones running Microsoft’s MS-DOS became the new industry standard.

Interestingly, Microsoft’s first hardware product, the Z80 SoftCard for the Apple II in 1980, added the famous Zilog CPU to the classic personal computer and allowed users to run CP/M on that machine. In 1982, Bill Gates claimed that SoftCard installations represented the largest single user base of CP/M machines.

Last call in June 2024

Zilog is notably discontinuing several Z84C00 chips that are still available in classic 40-pin DIP packages. (These standalone chips include a CPU and nothing else, unlike a microcontroller, which can include RAM and other accessory devices.) The DIP design features two rows of 20 pins flanking a plastic package that contains the actual silicon die, resembling the classic Z80 CPU chips of the 1970s.

After June 14, Zilog will stop taking orders, manufacture a final run of chips if the outstanding orders are sufficient in quantity, and then ship those last batches to resellers like Mouser Electronics and Digikey.

A classic dual inline package (DIP) version of the Z80 from the 1970s. It features two rows of 20 pins in a ceramic package.

The discontinuation list provided by Zilog in its letter includes 13 products from the Z84C00 series, which are chips in the Z80 family that run at clock speeds from 6 to 20 MHz and maintain compatibility with the original Z80 architecture. Here’s the full list of part numbers that will be discontinued:

  • Z84C0006VEG
  • Z84C0006PEG
  • Z84C0010PEG
  • Z84C0008AEG
  • Z84C0020VEG
  • Z84C0008PEG
  • Z84C0010AEG
  • Z84C0008VEG
  • Z84C0010VEG
  • Z84C0010VEG00TR
  • Z84C0020AEG
  • Z84C0020PEG
  • Z84C0006AEG

So while the Z80 architecture will stick around in eZ80 form, it appears that this is the last call for newly manufactured standalone 8-bit Z80 CPU chips in the classic DIP form factor. We reached out to Zilog for clarification about its plans for the future of the Z80 platform but did not receive a response by press time.

It’s no accident: These automotive safety features flopped

safety first —

Over the years, inventors have had some weird ideas about how to make cars safer.

A toy car crashing into another toy car.

Aurich Lawson | Getty Images

Turn signals have been a vehicle safety staple since they first appeared on Buicks in 1939. Of course, many drivers don’t use them, perhaps believing that other motorists can telepathically divine others’ intentions.

More people might use turn signals if they knew that drivers’ failure to do so leads to more than 2 million accidents annually, according to a study conducted by the Society of Automotive Engineers. That’s 2 percent of all crashes, according to the National Highway Traffic Safety Administration. And not using turn signals increases the likelihood of an accident by 40 percent, according to the University of Michigan Research Institute.

Human nature could be to blame—death and injury will never happen to us, only others.

You wish.

So, is it any wonder that during the first six decades of automobile production, there were few safety features? The world into which the automobile was born was one in which horses powered most transportation, but that didn’t mean getting around was safe. Say a horse got spooked. If the animal was pulling a carriage, its actions could cause the carriage to barrel away or even overturn, injuring or killing its occupants. Or the horse could cause death directly. In fact, a surprising number of kings met their end over the centuries by a horse’s swift kick. And rail travel proved even deadlier. Studies comparing modern traffic accidents with those of the early 20th century reveal that death from travel is 90 percent less likely today than it was in 1925.

Yet America’s passive acceptance of death from vehicle travel in the late 19th and early 20th century explains why auto safety was sporadically addressed, if at all. Sure, there were attempts at offering basic safety in early automobiles, like windshield wipers and improved lighting. And some safety features endured, such as Ford’s introduction of safety glass as standard equipment in 1927 or GM’s turn signals. But while other car safety features appeared from time to time, many of them just didn’t pan out.

Dead ends on the road to safer cars

Among the earliest attempts at providing safety was the O’Leary Fender, invented by John O’Leary of Cohoes, New York, in 1906. “It is made of bands of iron of such shape and design that falling into it is declared to be like the embrace of a summer girl on a moonlit night on the shore,” wrote The Buffalo News in 1919, with more than a little poetic license.

Advertisement for Pennsylvania Vacuum Cup Tires by the Pennsylvania Rubber Company in Jeannette, Pennsylvania. The Pennsylvania Auto Tube is pictured, 1919.

Jay Paull/Getty Images

According to the account, O’Leary was so confident of the fender’s ability to save lives that he used his own child to prove its safety. “The babe was gathered up on the folds of the fender as tenderly as it had ever been in the arms of its mother,” the newspaper reported, “and was not only uninjured but seemed to enjoy the experience.”

There’s no word on what Mrs. O’Leary thought of using the couple’s child as a crash test dummy. But the invention seemed worthy enough that an unnamed car manufacturer battled O’Leary in court over it and lost. Ultimately, his victory proved futile, as the feature was not adopted.

Others also tried to bring some measure of safety to automobiles, chief among them the Pennsylvania Rubber Company of Jeannette, Pennsylvania. The company’s idea: make a tire tread of small suction cups to improve traction. Called the Pennsylvania Vacuum Cup tire, the product proved to be popular for a while, with reports of sales outnumbering conventional tires 10 to 1, according to the Salt Lake Tribune in 1919. While Pennsylvania wasn’t the only rubber company to offer vacuum cup tires, the concept had its day before fading, although the idea does resurface from time to time.

Nevertheless, safety remained unaddressed, even as the number of deaths was rising substantially.

“Last year more than 22,000 persons were killed in or by automobiles, and something like three quarters of a million injured,” wrote The New Republic in 1926. “The number of dead is almost half as large as the list of fatalities during the nineteen months of America’s participation in the Great War.”

“The 1925 total is 10 percent larger than that for 1924,” the publication added.

The chief causes cited were the same as they are today—namely, speeding, violating the rules of the road, inattention, inexperience, and confusion. But at least one automaker—Stutz—was trying to put safety first.

After 32 years, one of the ’Net’s oldest software archives is shutting down

Ancient server dept. —

Hobbes OS/2 Archive: “As of April 15th, 2024, this site will no longer exist.”

Box art for IBM OS/2 Warp version 3, an OS released in 1995 that competed with Windows.

IBM

In a move that marks the end of an era, New Mexico State University (NMSU) recently announced the impending closure of its Hobbes OS/2 Archive on April 15, 2024. For over three decades, the archive has been a key resource for users of the IBM OS/2 operating system and its successors, which once competed fiercely with Microsoft Windows.

In a statement made to The Register, a representative of NMSU wrote, “We have made the difficult decision to no longer host these files on hobbes.nmsu.edu. Although I am unable to go into specifics, we had to evaluate our priorities and had to make the difficult decision to discontinue the service.”

Hobbes is hosted by the Department of Information & Communication Technologies at New Mexico State University in Las Cruces, New Mexico. The official announcement on the site reads, “After many years of service, hobbes.nmsu.edu will be decommissioned and will no longer be available. As of April 15th, 2024, this site will no longer exist.”

OS/2 version 1.2, released in late 1989.

os2museum.com

We reached out to New Mexico State University to inquire about the history of the Hobbes archive but did not receive a response. The earliest record we’ve found of the Hobbes archive online is this 1992 Walnut Creek CD-ROM collection that gathered up the contents of the archive for offline distribution. At around 32 years old, minimum, that makes Hobbes one of the oldest software archives on the Internet, akin to the University of Michigan’s archives and ibiblio at UNC.

Archivists such as Jason Scott of the Internet Archive have stepped up to say that the files hosted on Hobbes are safe and already mirrored elsewhere. “Nobody should worry about Hobbes, I’ve got Hobbes handled,” wrote Scott on Mastodon in early January. OS/2 World.com also published a statement about making a mirror. But it’s still notable whenever such an old and important piece of Internet history bites the dust.

Like many archives, Hobbes started as an FTP site. “The primary distribution of files on the Internet were via FTP servers,” Scott tells Ars Technica. “And as FTP servers went down, they would also be mirrored as subdirectories in other FTP servers. Companies like CDROM.COM / Walnut Creek became ways to just get a CD-ROM of the items, but they would often make the data available at http://ftp.cdrom.com to download.”

The Hobbes site is a priceless digital time capsule. You can still find the Top 50 Downloads page, which includes sound and image editors, and OS/2 builds of the Thunderbird email client. The archive contains thousands of OS/2 games, applications, utilities, software development tools, documentation, and server software dating back to the launch of OS/2 in 1987. There’s a certain charm in running across OS/2 wallpapers from 1990, and even the archive’s Update Policy is a historical gem—last updated on March 12, 1999.

The legacy of OS/2

The final major IBM release of OS/2, Warp version 4.0, as seen running in an emulator.

OS/2 began as a joint venture between IBM and Microsoft, undertaken as a planned replacement for IBM PC DOS (also called “MS-DOS” in the form sold by Microsoft for PC clones). Despite advanced capabilities like 32-bit processing and multitasking, OS/2 later competed with and struggled to gain traction against Windows. The partnership between IBM and Microsoft dissolved after the success of Windows 3.0, leading to divergent paths in OS strategies for the two companies.

Through iterations like the Warp series, OS/2 established a key presence in niche markets that required high stability, such as ATMs and the New York subway system. Today, its legacy continues in specialized applications and in newer versions (like eComStation) maintained by third-party vendors—despite being overshadowed in the broader market by Linux and Windows.

A footprint like that is worth preserving, and the loss of one of OS/2’s primary archives, even if mirrored elsewhere, is a cultural blow. Hobbes has reportedly almost disappeared before but received a stay of execution. In the comments section for an article on The Register, someone named “TrevorH” wrote, “This is not the first time that Hobbes has announced it’s going away. Last time it was rescued after a lot of complaints and a number of students or faculty came forward to continue to maintain it.”

As the final shutdown approaches in April, the legacy of Hobbes is a reminder of the importance of preserving the digital heritage of software for future generations—so that decades from now, historians can look back and see how things got to where they are today.

Inventor of NTP protocol that keeps time on billions of devices dies at age 85

A legend in his own time —

Dave Mills created NTP, the protocol that holds the temporal Internet together, in 1985.

A photo of David L. Mills taken by David Woolley on April 27, 2005.

David Woolley / Benj Edwards / Getty Images

On Thursday, Internet pioneer Vint Cerf announced that Dr. David L. Mills, the inventor of Network Time Protocol (NTP), died peacefully at age 85 on January 17, 2024. The announcement came in a post on the Internet Society mailing list after Cerf was informed of Mills’ death by his daughter, Leigh.

“He was such an iconic element of the early Internet,” wrote Cerf.

Dr. Mills created the Network Time Protocol (NTP) in 1985 to address a crucial challenge in the online world: the synchronization of time across different computer systems and networks. In a digital environment where computers and servers are located all over the world, each with its own internal clock, there’s a significant need for a standardized and accurate timekeeping system.

NTP provides the solution by allowing clocks of computers over a network to synchronize to a common time source. This synchronization is vital for everything from data integrity to network security. For example, NTP keeps network financial transaction timestamps accurate, and it ensures accurate and synchronized timestamps for logging and monitoring network activities.
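
For readers curious what that synchronization looks like in practice, below is a minimal sketch of an SNTP-style query (the simplified form of NTP) written in Python. It assumes a reachable public pool server such as pool.ntp.org and is purely illustrative; it is not the reference ntpd implementation that Mills maintained, and a full NTP client uses four timestamps to estimate both network delay and clock offset rather than just reading the server’s transmit time, as this sketch does.

```python
# Minimal SNTP query sketch: ask a public time server for its clock reading
# and compare it with the local clock. Illustrative only; real NTP clients
# take many samples, filter them, and account for round-trip delay.
import socket
import struct
import time

NTP_EPOCH_OFFSET = 2_208_988_800  # seconds between 1900-01-01 (NTP era) and 1970-01-01 (Unix)

def sntp_time(server: str = "pool.ntp.org", port: int = 123, timeout: float = 5.0) -> float:
    """Return the server's transmit timestamp as Unix time (seconds)."""
    packet = b"\x1b" + 47 * b"\0"  # LI=0, version 3, mode 3 (client); rest zeroed
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(packet, (server, port))
        data, _ = sock.recvfrom(512)
    seconds = struct.unpack("!I", data[40:44])[0]  # transmit timestamp, seconds field
    return seconds - NTP_EPOCH_OFFSET

if __name__ == "__main__":
    offset = sntp_time() - time.time()
    print(f"local clock differs from the server by roughly {offset:.3f} s")
```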

In the 1970s, during his tenure at COMSAT and involvement with ARPANET (the precursor to the Internet), Mills first identified the need for synchronized time across computer networks. His solution aligned computers to within tens of milliseconds. NTP now operates on billions of devices worldwide, coordinating time across every continent, and has become a cornerstone of modern digital infrastructure.

As detailed in an excellent 2022 New Yorker profile by Nate Hopper, Mills faced significant challenges in maintaining and evolving the protocol, especially as the Internet grew in scale and complexity. His work highlighted the often under-appreciated role of key open source software developers (a topic explored quite well in a 2020 xkcd comic). Mills was born with glaucoma and gradually lost his sight, eventually becoming completely blind. As his vision failed, he turned over control of the protocol to Harlan Stenn in the 2000s.

A screenshot of Dr. David L. Mills’ website at the University of Delaware captured on January 19, 2024.

Aside from his work on NTP, Mills also invented the first “Fuzzball router” for NSFNET (one of the first modern routers, based on the DEC PDP-11 computer), created one of the first implementations of FTP, inspired the creation of “ping,” and played a key role in Internet architecture as the first chairman of the Internet Architecture Task Force.

Mills was widely recognized for his work, becoming a Fellow of the Association for Computing Machinery in 1999 and the Institute of Electrical and Electronics Engineers in 2002, as well as receiving the IEEE Internet Award in 2013 for contributions to network protocols and timekeeping in the development of the Internet.

Mills received his PhD in Computer and Communication Sciences from the University of Michigan in 1971. At the time of his death, Mills was an emeritus professor at the University of Delaware, having retired in 2008 after teaching there for 22 years.

Why I hope the Atari 400 Mini will bring respect to Atari’s most underrated platform

Have you played Atari today? —

Can USB, HDMI, and built-in games raise awareness for a platform overshadowed by the C64?

Retro Games’ THE400 Mini console.

Retro Games / Benj Edwards

Last week, UK-based Retro Games, Ltd. announced a mini console version of the Atari 400 home computer, first released in 1979. It’s called “THE400 Mini,” and it includes HDMI video output, 25 built-in games, and a USB version of Atari’s famous joystick; it retails for $120. But this release means something more to me personally because my first computer was an Atari 400—and as any other Atari 8-bit computer fan can tell you, the platform often doesn’t get the respect it should. This will be the first time Atari’s 8-bit computer line has received a major retro-remake release.

My Atari 400 story goes a little something like this. Around the time I was born in 1981, my dad bought my older brother (then 5 years old) an Atari 400 so he could play games and learn to program. My brother almost immediately found its flat membrane keyboard frustrating and the Atari 410 cassette drive too slow, so my dad ordered an Atari 800 and an Atari 810 disk drive instead. This began our family’s golden age of Atari 800 gaming, which I’ve written about elsewhere.

I’ve often said if a modern game designer wants to learn how to make games, just dive into the Atari 400/800 game library. There are some priceless gems there you can’t find anywhere else, plus others that play best on the platform. OK, I’ll name a few: The Seven Cities of Gold, Archon, M.U.L.E., Wizard of Wor, Salmon Run, Star Raiders, The Halley Project, and so much more.

A photo of Benj Edwards’ family Atari 800 and Atari 400 in his brother’s room, Christmas 1985.

Even with the new 800, it seems that my dad must have kept the original Atari 400, because by the time I grew up more and wanted “my own computer” in the late 1980s, he gave me the Atari 400. The 800 was still my brother’s baby and typically remained in his bedroom. When I wasn’t playing more complex games like M.U.L.E. and Archon on the 800 with my brother, I hooked up the 400 to a small black-and-white TV set in my room and mostly played Galaxian, Pac-Man, and Donkey Kong on a cartridge. Not long after, I got an Apple II Plus and learned BASIC on that, but the Atari 400 always got pride of place in my growing computer collection.

A snippet from a 1988 to-do list written by Benj Edwards’ dad that says “Get TV/monitor for Benj’s Atari 400 computer,” completed 4/14/88.

But enough about me. Let’s talk about the new Atari 400 Mini. I haven’t used it myself yet, so all we have to go on is the information provided by the company—and the company’s reputation. Retro Games has previously released full-sized remakes of the Commodore VIC-20 and the Commodore 64, and mini consoles of the Amiga 500 and the Commodore 64. In 2020, Engadget gave the company’s “THE64 Mini” mixed reviews, praising its looks but complaining about its joystick and poor game selection. We’ll admit preconceived bias and hope the 400 Mini fares much better. Even if the joystick ends up a dud, Retro Games says you can provide your own USB stick or controller.

I also hope THE400 does well because Atari 8-bit fans have a tough time with group identity in the span of retro tech history. Few Americans aside from Atari 400/800 owners have heard of the platform (though the platform did very well in Eastern Europe). The Atari 8-bit series didn’t sell nearly as well as competitors like the Commodore 64 in the US (although Sean Lennon had an Atari 400 as a kid—cool trivia).

And even though the Atari 400/800 series provided the template for Commodore to imitate with the VIC-20 and C64, Commodore undercut Atari in price with cheaper parts, which contributed to Atari’s crash in 1983 and drove Texas Instruments out of the home computer business. More recently, the Commodore 64 has had several retro re-releases since the Commodore 64 Direct-to-TV in 2004. The Atari 400/800 platform has had none until now.
