The Apple IIGS Megahertz Myth

There’s many legends in computer history. But a legend is nothing but a story. Someone tells it, someone else remembers it, and everybody passes it on. And the Apple IIGS has a legend all its own. Here, in Userlandia, we’re going to bust some megahertz myths.

[A side note before proceeding… I see all you Hacker News people filing in. I haven’t had the time recently to properly format this with the numerous source links, citations, and footnotes that exist in the video version. I’ll try to get those filled in here ASAP. For now, you might want to experience it all in the video version.]

I love the Apple IIGS. It’s the fabulous home computer you’d have to be crazy to hate. One look at its spec sheet will tell you why. The Ensoniq synthesizer chip brings 32 voices of polyphonic power to the desktop. Apple’s Video Graphics Controller paints beautiful on-screen pictures from a palette of thousands of colors. Seven slots and seven ports provide plenty of potential for powerful peripherals. These ingredients make a great recipe for a succulent home computer. But you can’t forget the most central ingredient: the central processing unit. It’s a GTE 65SC816 clocked at 2.8 MHz—about 2.72 times faster than an Apple IIe. When the IIGS launched in September 1986 its contemporaries were systems like the Atari 1040ST, the Commodore Amiga 1000, and of course Apple’s own Macintosh Plus. These machines all sported a Motorola 68000 clocked between 7 and 8 MHz. If I know anything about which number is bigger than the other number, I’d say that Motorola’s CPU is faster.

“Now hold on there,” you say! “Megahertz is just the clock speed of the chip—it says nothing about how many instructions are actually executed during those cycles, let alone the time spent reading and writing to RAM!” And you know what, that’s true! The Apple II and Commodore 64 with their 6502 and 6510 CPUs clocked at 1 MHz could trade blows with Z80-powered computers running at three times the clock speed. And the IIGS had the 6502’s 16-bit descendant: the 65C816. Steve Wozniak thought Western Design Center had something special with that chip. In a famous interview in the January 1985 issue of Byte magazine, Woz said,

“[the 65816] should be available soon in an 8 MHz version that will beat the pants off the 68000 in most applications, and in graphics applications it comes pretty close.” End quote. That’s already high praise, but he continues further: “An 8 MHz 65816 is about equivalent to a 16 MHz 68000 in speed, and a 16 MHz 68000 doesn’t exist.”

If you read this in January of 1985 you’d have little reason to doubt Woz’s words. He built the Apple I in a bedroom from a box of scraps and when given real resources followed it up with the Apple II. Even when faced with doubters, he’s got the confidence that comes from engineering the impossible.

“Some of the Macintosh people might disagree with me, but there are ways around most of the problems they see.”

But that “should” in “should be available” was doing a lot of work. Eighteen months later when the IIGS finally shipped, there was no 8 MHz ’816. It was as nonexistent as Woz’s imaginary 16 MHz 68000. Even three years later, 8 MHz chips were barely available. What happened? Woz promised us 8 MHz of 68000-crushing glory!

If you poll a random vintage computer user they might posit the idea that Apple held the IIGS’ processor speed back during its development so it wouldn’t compete with the Macintosh. There’s an unsourced claim on Wikipedia that limiting the CPU speed to 2.8MHz was a deliberate decision, followed by a note—which is sourced, thank you very much—that the original CPU was certified to run at 4MHz. And that’s true—there’s many IIGSes that have CPUs labeled for 4MHz. This idea has bounced around newsgroups and webpages for decades, so it’s no surprise that it made its way into a Wiki article too.

But this theory never sat well with me. People making the claim that Apple restrained the IIGS’ CPU speed for marketing purposes rarely provide sources to back it up. I understand their logic—Apple spent the majority of its marketing and monetary might making the Macintosh the machine of the future. Because the Mac was Steve Jobs’ baby, you end up with declarations like “Steve Jobs hobbled the IIGS so it wouldn’t compete with the Mac.” It’s a common take, especially because it plays into a lot of people’s dislike of Jobs. But there’s one major problem with it: all of Steve Jobs’ power at Apple was stripped away in May of 1985 after the months of executive turmoil that led to the company's first quarterly loss. The IIGS didn't launch until 16 months later.

So why were IIGSes with chips rated at 4 MHz not running them at that speed? Why 2.8 MHz? Isn't that… weirdly specific? Did an 8 MHz machine really get put on ice due to executive meddling? To solve these mysteries I descended into the depths of Usenet, usergroup newsletters, magazines, and interviews. My journey took me through a world of development Hell, problematic yields, and CPU cycle quirks. And this walk down the Apple chip road starts with the wonderful wizard named Woz.

The Apple IIx

It’s the summer of 1983 and Steve Wozniak has just wrapped up a two-year leave of absence from Apple Computer. It all started with a six-week stint in the hospital after crashing his Beechcraft Bonanza airplane on February 7, 1981. Amnesia from a hard concussion short-circuited Wozniak’s short-term memory. After his recovery he took a leave from Apple to go back to UC Berkeley and finish his degree in computer science and electrical engineering… and run a few rock festivals. By June of 1983 Woz felt he was ready to return to work, and asked Dave Paterson, the head of Apple II engineering, for a job—but this time, in the trenches.

His timing was excellent. Even though the Lisa was taking headlines and the Macintosh was shaking up R&D, the Apple II was making all the money. Brisk sales of the IIe along with the imminent launch of the IIc meant the Apple II division was busier than ever even if they weren’t getting all the glory. And while Steve Jobs was heralding the Mac as the Next Big Thing, designing a next-generation Apple II as a contingency plan was just good business.

At the heart of this proposed machine was a brand new CPU. Bill Mensch’s Western Design Center was busy developing the 65816, a 16-bit update to the venerable 6502 architecture. This chip would bring 16-bit computing to the Apple II while promising backwards compatibility. Users wouldn’t lose their investment in applications, add-in cards, or accessories. Alongside the new CPU was a special coprocessor slot that allowed the user to install a board with a 68000 or 8088. The goal was to build a bridge between the 8- and 16-bit world, so the project had code names like Brooklyn and Golden Gate.

This project would be known publicly as the IIx thanks to Wozniak discussing it on CompuServe or at user groups. But as late ’83 rolled into early ’84 the IIx project stumbled over multiple roadblocks. The coprocessor slot added layers of complexity that distracted from the mission of architecting a new Apple II. But a major complication was the 65816 itself. Apple expected engineering samples in November 1983, but didn’t actually get them until February 1984. What’s worse, those late chips were buggy, unstable, and ultimately unusable. WDC delivered a second batch of chips a few weeks later, but they were no more reliable than the first.

Even if Apple abandoned the coprocessor slot, the project couldn’t go forward without a CPU, and Apple cancelled the IIx project in March of 1984. Now before you aim your ire at Steve Jobs, A+ Magazine, in their IIGS development history, says it was the team leads who suggested canning the troubled machine. With no managerial appetite for a next-generation machine, the Apple II team pivoted from a moonshot to something more achievable. Dan Hillman and Jay Rickard started a project to consolidate most of the discrete chips of an Apple II into a single chip called the Mega II. When they finished the design six months later they weren’t quite sure what to do with it. Would they reduce the cost of the IIe or reduce the size of the IIc?

Imagine their surprise when Woz suggested a second shot at a 16-bit Apple II. The conditions seemed right to give it another go. Less expensive 16-bit computers like the Atari ST were looming on the horizon and the Mac’s hot start was slowing down. By October 1984 Apple finally had a good supply of working 65816 CPUs to design a system. And the Mega II would free up a lot of board space to add new graphics and sound chips. But just as important as chips and specs was a new, focused mission statement. This computer would be an Apple II by Apple II fans for Apple II fans. Woz, Hillman, Rickard, Harvey Leitman, and Lee Collings spent the rest of 1984 hammering out the specifications and solving hard problems like how to speed up memory access without breaking existing software.

Now we’re finally back to that Woz interview I quoted earlier. Byte published it in two parts across the December ’84 and January ’85 issues, and based on the average time to press I reckon it took place in October 1984. By this point the IIx is dead and buried and he’s talking about the new 16-bit Apple II, now codenamed Phoenix. His excitement about an 8MHz ’816 is palpable, but, again, Woz was careful to say it “should be available soon.” Woz left Apple in February 1985 when the ink for this interview was barely dry. He had a dramatic fight with John Sculley after the Apple II was snubbed at the annual Apple shareholders’ meeting in January 1985. Apple II sales supplied seventy percent of Apple’s revenue in 1984 and Woz’s Apple II compatriots felt they weren’t getting their due. Steve Jobs may not have dialed back the IIGS’ clock speed, but he did shovel endless piles of money towards his pet projects at the expense of the Apple II. Even if Woz had stuck around, the 8 MHz ’816 of his dreams was years away. The IIGS wouldn’t sniff 8 MHz until Applied Engineering released the 7 MHz TransWarp GS accelerator four years later in 1989.

The Need for Speed

If you go looking online for photos of Apple IIGS logic boards, there’s a decent chance you’ll see a 4MHz GTE G65SC816 CPU. Most IIGSes had 4 or 3 MHz CPUs running at 2.8 MHz regardless of the chip’s rated speed. Why?

First, we must understand where that clock signal comes from. The IIGS, like many computers of its era, derives its base clock from a crystal oscillator. The one in the IIGS vibrates 28.636 million times per second: 28.636 MHz. The VGC divides that 28.636 MHz in half, and the resulting 14.318 MHz is supplied along with a stretch signal to other parts of the system. I bet you've already noticed that these frequencies are multiples of the NTSC colorburst frequency of 3.58MHz. Keep that in mind; it'll be relevant later.
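If you want to sanity-check those relationships yourself, here’s a quick Python sketch. The frequencies are the ones quoted above; nothing here comes from Apple’s actual schematics:

```python
# The IIGS clock tree, reduced to arithmetic. Every frequency in the
# chain is a multiple of the NTSC colorburst frequency.
COLORBURST = 3.579545            # MHz

crystal = COLORBURST * 8         # 28.636 MHz master oscillator
vgc_output = crystal / 2         # 14.318 MHz clock distributed by the VGC

print(f"crystal:    {crystal:.3f} MHz")     # 28.636
print(f"VGC output: {vgc_output:.3f} MHz")  # 14.318
```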

This 14.318 MHz clock travels to a special ASIC which—depending on your board revision—is called the Fast Processor Interface or Control Your Apple chip. One of its responsibilities is dynamically controlling the CPU’s clock speed. The IIGS normally runs in Fast mode at 2.8 MHz, but the user can switch to Normal mode in the Control Panel which reduces the speed to 1.023 MHz. It’s like the turbo switch on PCs, except controlled by the FPI. This lets the IIGS run speed-sensitive software at the same speed as an original Apple II. But even in fast mode there are times the CPU needs to slow down to access other parts of the system.

The CPU, ROM, and Fast RAM are on the 2.8 MHz “fast” side, while the Mega II chip, slots, sound, video, and so on are on the 1 MHz “slow” side. When the CPU is running in fast mode and needs to talk to something on the slow side the FPI dynamically throttles the clock signal to a 1 MHz cycle time to let the CPU synchronize with the Mega II side. This is usually invisible to the user because the system still executes the majority of its cycles at the faster speed, but it means the CPU is not always executing as fast as it could. I haven’t even touched on the eight percent performance penalty from the cycles the FPI spends refreshing RAM.
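To see how much those slow-side detours cost, here’s a toy model in Python. The slow-side fractions are numbers I made up for illustration; the real ratio depends entirely on what the software is doing:

```python
# Toy model: average execution rate when some fraction of bus cycles
# must drop to the Mega II side, plus the ~8% RAM refresh penalty.
FAST = 2.8636            # MHz, fast-side cycle rate
SLOW = 1.0227            # MHz, slow-side cycle rate
REFRESH_PENALTY = 0.08   # fraction of throughput lost to refresh

def effective_mhz(slow_fraction: float) -> float:
    # Weight by time: each slow cycle takes FAST/SLOW times as long.
    avg_cycle_time = (1 - slow_fraction) / FAST + slow_fraction / SLOW
    return (1 / avg_cycle_time) * (1 - REFRESH_PENALTY)

for frac in (0.00, 0.05, 0.10):
    print(f"{frac:.0%} slow cycles -> {effective_mhz(frac):.2f} MHz effective")
```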

There’s nothing inherent to this design that limits the system to 2.8 MHz. The Apple IIGS Hardware Reference, Wayne Lowry’s Cortland Custom IC Notes, and Daniel Kruszyna’s KansasFest 2011 presentation about the FPI make it clear that the IIGS’s fast mode could support higher speeds. In principle a redesigned FPI or CYA could use a divider other than 1/5 for the clock signal. A 1/4 divider of 14.318 MHz yields a 3.58 MHz clock speed, which should be well within the capabilities of a 4 MHz chip. And once again, that “should” is doing a lot of work. So why didn’t it run at that speed?
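The arithmetic is easy to check. Assuming the FPI simply divides the 14.318 MHz reference by an integer (my simplification of the divider logic; the real normal-mode clock also involves that stretch signal), the candidate speeds fall out directly:

```python
# Integer divisions of the 14.318 MHz reference clock.
REFERENCE = 14.31818  # MHz

for divider in (14, 5, 4, 3):
    print(f"1/{divider}: {REFERENCE / divider:.3f} MHz")

# 1/14: 1.023 MHz -- normal mode, classic Apple II speed
# 1/5:  2.864 MHz -- fast mode as shipped
# 1/4:  3.580 MHz -- within a 4 MHz part's rating
# 1/3:  4.773 MHz -- would need a 6 MHz part
```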

The Birth of the 65C816

The 65C816 and IIGS are inextricably linked, and the ‘816’s story starts in 1981 when GTE’s Desmond Sheahan approached Bill Mensch and Western Design Center about designing a CPU. GTE wanted to manufacture their own chips for their key phone systems, so they licensed both a CMOS manufacturing process and a 6802 CPU design from Mitel. But Mitel’s CPU design team wasn’t up to the task, so GTE asked Mensch to step in. Mensch offered two options: a two-chip 6802 solution or a 6502-based microcontroller. Either one could be manufactured on the CMOS process GTE licensed from Mitel. Sheahan convinced the GTE brass that the one-chip solution was the way to go, and the 65C02 was born.

GTE became the first licensee of the 65C02 thanks to WDC designing their 65SC150 telephony microcontroller. Eventually WDC would license the core to other companies, like NCR, Synertek, and Rockwell. The result was WDC netting a royalty every time one of these licensees sold a standalone CPU or microcontroller with its 65C02 core. Functional 65C02 silicon was available in 1982, and the revenues from licensing deals were finally filling up WDC’s coffers. This is when—according to Mensch’s Computer History Museum oral history and other sources—Apple approached him about a 16-bit successor to the 6502.

The prospect of a big player like Apple being the launch client for a new CPU was hard to resist. Further information on the early days of the ‘816 is fairly elusive, but looking at both historic and contemporary interviews with Mensch and Woz reveals Apple’s influence on the chip’s design. One influence was compatibility. When designing the ‘816 Mensch improved the 6502 bus architecture to eliminate little weirdnesses like false read cycles during accumulator store instructions. That change broke Steve Wozniak’s Disk II controller, which had been designed to rely on that exact weirdness.

Now there’s two ways to solve this problem: redesign the floppy controller, or add the weirdnesses back in. Apple tried the first one, redesigning its floppy controller into a single chip called the Integrated Woz Machine. This chip was developed independently for the Apple IIc to control its built-in floppy drive. Among its many improvements was eliminating the false read cycle requirement. The Apple IIx could have used an IWM, but the UniDisk drives that took advantage of it wouldn’t be out until 1985. Therefore existing Disk II interfaces and drives still had to work with the IIx. If Apple claimed Apple II compatibility but couldn’t support one of the platform’s most popular boards, well, there’d be protests lining the sidewalks of De Anza Boulevard. Other cards might also depend on specific 6502 timing behaviors. Mensch eventually honored Apple’s request that the 65C816 be fully compatible with the NMOS 6502’s timings.

Apple wouldn’t be Mensch’s only licensee for the ‘816—the core can be found in countless embedded systems and microcontrollers. Licensees could extend the core to customize it into a microcontroller or a system-on-chip, or add features specific to their needs. A great example is Ricoh, who licensed the ‘816 core and added several functions, like dedicated multiply and divide units along with special DMA functions. All these add-ons were at the request of a pretty famous customer: Nintendo. The result was the Ricoh 5A22, the CPU inside the Super NES.

One of the famous tales of the ‘816’s development is how it was laid out without computer-aided design. As told by Bill Mensch, he designed the ‘816 on a card table at WDC’s humble headquarters in Mesa, Arizona. His sister Katherine did the layout on mylar sheets. By then, many semiconductor designers had moved on to computer-aided design tools to help design and lay out their circuits. Mensch is rightly proud of this feat, but the decision to lay it out by hand with mylar and rubylith wasn’t without consequences.

Let’s circle back to that interview with Woz. GTE published data sheets in 1985 for the G65SC816 which detailed 2, 4, 6, and 8 MHz variants of the chip. Apple, as the prime customer, would’ve had these data sheets far in advance. This would be consistent with Woz’s belief that faster variants were on the way, but for purposes of designing the IIGS they had to settle for 4MHz chips. Multiple photographs of IIGS prototype logic boards with 4MHz chips are on the web, and 4MHz parts shipped in production IIGS computers. But the promise of a faster CPU is very tantalizing to a computer designer, and I’m sure it wasn’t just Woz who was excited about future faster CPUs.

But when would GTE actually produce those faster variants? That’s the question. One source is a 1986 interview with Mensch published in the Fall/Winter issue of COMPUTE!’s Apple Applications magazine. This interview took place before the IIGS announcement, likely sometime in the summer of ’86. Mensch states their average part on a 3 micron process should be 4MHz, and an upcoming 2.4 micron process would yield 6MHz parts. I’d wager that the 8 MHz parts would rely on a 2 micron process at that reduction rate. Some 6MHz parts did exist, and Bill Mensch’s Cortland-badged prototype IIGS has one. A photo of it was shown at his 2021 VCF East panel, and I will note that it’s a WDC branded chip bearing a date code of the first week of February 1987. Whether Mensch put this chip in there himself or it was a sample installed by an Apple engineer is not explained. Nor is it known if its FPI chip actually drives it at a higher speed. But this shows that faster ‘816s did exist. So what was getting in the way of these faster chips actually being used?

Yields, Bugs, and Errata

This brings us to the 65816’s next quandary: yields. This might be a surprise to you, given that the CPU is found in embedded devices worldwide. But a common thread through many of the reports I’ve read about the early days of the ‘816 is that WDC and its fabrication partners struggled to deliver 4MHz and faster chips on time, in volume, and at spec.

Mensch positioned the 4MHz chip as the ‘816’s volume product and said as much in that earlier interview with COMPUTE!.

“Our typical product is 4MHz. We sell 2MHz into the retrofit market, but our typical run-of-the-mill is 4MHz.”

But in reality the IIGS shipped with a mixture of 3 and 4 MHz parts which actually ran at 2.8 MHz in fast mode. Which brings us back to the question of why a machine designed around a 4MHz part would ship with an almost 30% haircut in clock speed. Conspiracy theories aside, could there be a technical reason why the IIGS would run slower than its CPU’s rated speed?

In the same Fall/Winter 1986 issue of COMPUTE!’s Apple Applications where Mensch talked about future plans for the ‘816, David Thornburg interviewed Steve Wozniak about his role in developing the IIGS. The subject of the 65816 came up and Woz delved into some details about its clock speed.

“Our early ideas for the computer had it running around 8MHz. Soon we found we had to back off to about 5.5MHz, and then to 2MHz for that version of the processor. In the end the product came out around 3MHz, which is a good compromise.”

This is consistent with his comments about the desire for 8MHz more than a year earlier in the Byte interview. Woz doesn’t mention what factors made them back off on the clock speed, but during my research I learned a lot about the troubles the ‘816 faced in terms of scaling and yields.

One problem was GTE’s ability to supply chips—not just to Apple, but to other customers. The IIGS would be shipping by the tens of thousands when it launched, and this necessitated a good quantity of chips on hand. Dave Haynie—yes, the Amiga’s Dave Haynie—had trouble in 1985 sourcing 4 MHz 65816s for a potential Commodore 128 successor. He posted about this experience on Usenet in March of 1990.

“At that time, they had fully specced 8MHz parts, yet in ’85, GTE (the only company actually MAKING 65816s) had all they could do to make enough 4MHz parts. Rumor is that Apple managed get enough for the IIGS by actually having a special 2.8MHz version tested.”

He further elaborates with:

“When the GS came out, the only company making '816s was GTE. The main reason I couldn't get any 4MHz ‘816s in quantity was that Apple bought them all. They could make a real deal on 2MHz parts, since the yield on 4MHz was so low, they had more of those than they knew what to do with.”

Haynie also comments in other posts about how early samples of the ‘816 were delivered at 500KHz—yes, kilohertz—and maybe that’s a clue as to why Apple was unhappy in the Apple IIx timeframe.

Yields are a common woe in semiconductor manufacturing, and Haynie’s comments line up with what we see in the real world. GTE’s three micron process apparently had problems producing enough 4 MHz chips in the volumes its customers needed. Many IIGSes have a 3 MHz G65SC816 despite this speed rating not showing up in GTE’s data sheets. My guess—I can’t find any confirmation for this, but it's what makes the most sense—is that these were chips that couldn't be binned as 4 MHz, so GTE labeled them as 3MHz. Insisting on 4MHz would have meant shipping fewer computers, and the IIGS was delayed enough as it was. While VLSI supplied some 4MHz 65C816 CPUs later in the IIGS’ life, the vast majority of CPUs found in these computers were made by GTE—or, eventually, by California Micro Devices, which purchased GTE’s Microcircuits division in September 1987 after GTE decided to get out of the business. Apple was also negotiating with NCR as a second source, but according to Mensch and others, the deal fell apart before the IIGS shipped.

But let’s say for the sake of argument that GTE was able to produce as many 4 MHz chips as Apple or anyone else wanted to buy. Based on the FPI’s clock divider mechanism and a 14.318 MHz source clock, Apple had a logical clock speed target of 3.58 MHz using a 1/4 divider. That’d still be a compromise over Woz’s dream machine, but it’d be faster than what we got. And if (or when) faster processors became available, the FPI clock divider could be adjusted for them.

Yet those faster processors weren’t materializing; at least, not in any volume. Yields were a factor, yes, but the faster speeds revealed other problems. Trying to learn more about this took me down a rabbit hole of Usenet posts, Applied Engineering documentation, AppleFest event reports, and 6502 development forums. All of them pointed to a common factor: the REP and SEP opcodes. When designing the new 16-bit native mode for the ‘816, Bill Mensch added many new opcodes to enable new features and capabilities for programmers. Two of these new opcodes—SEP, for Set Processor Status Bits, and REP, for Reset Processor Status Bits—control flag bits in the processor’s status register. They are crucial to the dual 8/16-bit nature of the ‘816 and how it switches between 8- and 16-bit operations on the fly.
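To make that concrete, here’s a minimal Python sketch of what the two instructions do to the status register P, using the documented flag bit positions (the M and X bits are the width switches):

```python
# SEP sets whichever P bits are named in its operand; REP clears them.
M_FLAG = 0x20   # accumulator/memory width: 1 = 8-bit, 0 = 16-bit
X_FLAG = 0x10   # index register width:     1 = 8-bit, 0 = 16-bit

def sep(p: int, mask: int) -> int:
    """SEP #mask: set the masked status bits."""
    return (p | mask) & 0xFF

def rep(p: int, mask: int) -> int:
    """REP #mask: reset (clear) the masked status bits."""
    return p & ~mask & 0xFF

p = 0x30                     # both widths start at 8 bits
p = rep(p, M_FLAG | X_FLAG)  # REP #$30: 16-bit accumulator and indexes
assert p & (M_FLAG | X_FLAG) == 0
p = sep(p, M_FLAG)           # SEP #$20: back to an 8-bit accumulator
```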

Unfortunately these opcodes proved problematic at higher speeds. Multiple posts relay statements from WDC or programming guides that timing problems with the layout mask prevented these instructions from completing in their allowed cycle times. These problems only got worse as they tried to shrink the mask down to smaller process nodes. I’m loath to take what amounts to second and sometimes even third-hand accounts from newsgroups and forums at face value—they don't call it the Net of a Million Lies for nothing, after all. But I’m willing to believe the overall theory based on other corroborating evidence (like this WDC data sheet note from 1991). If you look at an Apple IIGS accelerator like the TransWarp GS or the ZipGSX, you’ll notice that they’re not just a CPU and some cache memory. The TransWarp GS has a bunch of support chips and gate array logic, while the ZipGSX has a full-blown ASIC on board.

The GALs for the TransWarp GS were reverse engineered long ago, and Reactive Micro lays it out plainly: GAL3 handles opcode detection and speed control. This matches up with posts relaying comments from Applied Engineering about stretching clocks to handle problematic REP and SEP opcode timings.
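Based on those descriptions, my mental model of what GAL3 does looks something like the sketch below. The opcode values are real, but the cycle counts and the stretch mechanism are my guesses at the behavior, not a dump of the actual GAL equations:

```python
# Guessed behavior: spot REP/SEP in the instruction stream and give
# them an extra clock period so they can settle at high speed.
REP_OPCODE = 0xC2
SEP_OPCODE = 0xE2

def clocks_to_allot(opcode: int, base_clocks: int) -> int:
    if opcode in (REP_OPCODE, SEP_OPCODE):
        return base_clocks + 1   # stretch the clock for the slow path
    return base_clocks
```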

Analyzing these posts also reveals the temperature of the Apple II community circa 1990. Apple announced a revised IIGS in San Francisco just before AppleFest in September 1989, and the reaction from attendees was muted at best. Unfortunately there was no CPU speed increase, but base memory was increased to 1MB and the new ROM added more toolbox routines and some feature enhancements. There was someone whose reaction was anything but muted, though, and it was one William David Mensch.

According to multiple accounts of the event, during his keynote address Jean-Louis Gassée said that there would be no speed increase for the revised IIGS because of difficulties securing faster 65816s. Mensch was in attendance and was understandably displeased with this statement. He approached one of the crowd mics and said that he was the designer of the CPU and had in his hand a bag of 12 MHz ‘816s. He proclaimed that if Apple ordered the 12 MHz chips, he would ship them. Jean-Louis reportedly made a comment about wanting a reliable production run, and the two men got into a heated back-and-forth before Gassée left and Mensch was escorted out. No recordings of this keynote exist, or if they do they’re locked away in the Apple Archives at Stanford University.

The version of the story Mensch recounts at his 2021 panel at VCF East largely follows what I’ve read in contemporary reports, except for one difference. He includes an anecdote about how he got Jean-Louis’ Apple coffee cup. He mentions running into Gassée after the keynote, and says that Gassée was very upset and threatened to, quote, “kick [Mensch’s] butt.” No punches were actually thrown, and no butts were actually kicked, and the story peters out without really explaining how Mensch got the coffee cup, but it shows just how bad Apple and WDC's relationship had become.

Now, you’re a smart viewer, so I bet you know how chickens and eggs work. A company like Apple doesn’t just buy bags of chips; they buy Frito-Lay’s entire annual output. Mensch could have samples of silicon, but WDC wasn’t the one making the chips in volume; its licensees like GTE were. If the fab (say, GTE/CMD or VLSI) can’t guarantee a production run of, say, 50,000 chips, the order doesn’t happen. The evidence in front of us—the delays during IIx development, the inability to deliver 4MHz parts at volume, and the opcode issues that prevented scaling—would certainly justify skepticism of WDC’s ability to work with a fab to deliver faster chips at volume. There were still possibilities for a faster IIGS, though, and these would play right into an Apple II fan’s belief that Apple held it back.

Accelerators, ASIC Incorporated, and Mark Twain

But let’s say you weren’t buying tens of thousands of chips like Apple was; maybe you only needed a thousand or two. Were smaller customers being taken care of? Ray Settle of the Washington Apple Pi Journal was also at the fall 1989 AppleFest, where he reported on the confrontation between Mensch and Gassée. Afterwards, he mentioned a visit with an Applied Engineering representative. Settle still hadn’t received his TransWarp GS, and the AE rep pinned the blame on unreliable 7 MHz CPUs. Another attendee report posted on Usenet by Michael Steele featured similar comments. Keep in mind that the TransWarp was first announced in 1988, didn’t ship until the fall of 1989, and struggled with speed and supply restrictions through 1990. This is further supported by A2-Central’s accelerator reviews in February 1991, where it’s mentioned that AE resorted to offering 6.25 MHz versions of the accelerator because of supply issues—and reliability issues—with 7 MHz chips. Zip Technologies also took a while to ship their 7MHz ZipGSX accelerator, which finally appeared in October 1990, almost a year after the TransWarp GS.

But we don’t have to settle for second-hand reports. Enter Applied Engineering employee John Stephen III. In an October 1989 Usenet post he mentions the problems with getting 7MHz chips, and alludes to the REP/SEP timing issues. But the other interesting thing he mentions is that getting high-speed 10 MHz ‘816s to run at their rated speeds required boosting their input voltage well beyond the standard 5 volts. This made the chips run hotter and often resulted in crashes. And I see all you overclockers in the audience nodding your heads.

Scaling problems aren’t unusual when you move to a smaller process node, and sometimes a layout needs fixes—or even a complete redesign. The original mask laid out by Katherine Mensch on the WDC card table had reached its limits. Redoing the mask wouldn’t be easy or cheap, especially when higher speed ‘816s were a smaller piece of the revenue pie. Royalties from the 65C02, 65802, and slower embedded ‘816s were paying the bills. Mensch was also busy working on a 32-bit iteration of the architecture: the 65832. But this proposal never made the jump from datasheet to shipping dock.

Interestingly, this is where a new player steps in. While researching the yield problems I encountered numerous posts on comp.sys.apple2 about an “ASIC 65816.” This tripped me up at first, because there are numerous application-specific integrated circuits that contain an ‘816 core. But no, it turns out that a company named ASIC was creating a new 65816 processor using gate arrays. And they promised that this redesigned ‘816 would reach speeds of 20MHz and beyond.

Imagine my surprise when I saw the name Anthony Fadell mentioned in these posts. Could that be the same Anthony Fadell—Tony Fadell—who was in charge of the iPod? Sure enough, I found an excerpt of Tony’s book, Build, where he talks about designing a 65816 for his startup company, ASIC Incorporated! Now we’re on to something. This gave me enough clues to dig up the November/December 1989 issue of The Michigan Alumnus magazine, where an article tells the tale of the fateful summer of 1989. Fadell was an undergrad student at the University of Michigan, and he spent his summer internship designing nuclear control computers at Ronan Engineering. When he met William Hayes the two hit it off immediately. Fadell was a big Apple II fanboy and yearned for more power. Hayes knew chip design and had connections with other chip designers. The two decided that they would design a better version of the 65816 using a sea-of-gates array. Fadell borrowed $5,000 from his uncle, Hayes leveraged his connections to get cut-rate pricing on supercomputer time, and the two reverse engineered a 65816 design in six weeks. After finding a fab willing to manufacture their design, Fadell was prepared to pitch his prototype to potential patrons.

The Michigan Alumnus article is coy about who the chip was for, but it mentions Fadell flying to Texas in September 1989 to meet with a manufacturer of Apple IIGS accelerator boards. There he negotiated a purchase order and a two year contract, catapulting him to the role of CPU vendor at the age of twenty. With these clues we can deduce that Fadell’s customer was Applied Engineering. If all had gone according to plan, ASIC's chips should have started production in early 1990, and formed the basis of the next generation of TransWarp accelerators. There’s that word again—should.

Ultimately, no TransWarp GS or ZipGSX accelerators ever shipped with ASIC’s design. The chips did exist—multiple trip reports from 1990 AppleFests mention Fadell demonstrating his CPUs in TransWarp accelerators. And in 2022, Fadell posted a photo on Twitter of an ASIC 65816 which he claims would still work today—I’m guessing this is one of the 17 MHz chips used in the AppleFest demos. But posts about ASIC fizzled out after the spring of 1991, which coincides with Fadell graduating from Michigan. The ultimate fate of the chip isn’t really known—did Fadell face legal challenges from WDC? Did Applied Engineering give up on him? Or was it because—as described in an interview with WIRED—he was just too busy roaming the halls of General Magic trying to score a job with former Apple superstars Andy Hertzfeld and Bill Atkinson? My guess is the latter, since General Magic hired him later that year.

In the same tweet as the photo of the chip, Fadell said that he, quote: “sold some to Apple for a new Apple IIGS revision before they canceled it!” Many IIGS superfans have heard tell of the cancelled final revision of the IIGS code named Mark Twain. Featuring an internal floppy drive, a hard disk, and a redesigned logic board, the Mark Twain is what many thought the ROM 03 IIGS should have been. It’s entirely probable that Apple fitted some prototypes with faster CPUs. But when media outlets like InCider/A+ magazine were given a top secret demo a mere month before its cancellation, the clock speed was still the same. And the few Mark Twain prototypes that escaped Apple’s development dungeons were equipped with the usual 4MHz chip. This is where the business side and the technical side butt heads.

The Mark Twain was under development in late 1990 into 1991 and then mysteriously no longer under development as of June 1991. Rumors of Apple killing the entire Apple II line hung over the product like a dark cloud, and the dev team had hoped to prove they were greatly exaggerated. Despite John Sculley’s statements promising Apple’s full support for an installed base over five million strong, the lack of marketing and technical improvements to the Apple II over the years meant his words rang hollow. Apple had just introduced the Macintosh LC as their low-cost color Mac, and IBM compatible PCs were getting cheaper and more capable. If Apple released the Mark Twain without any CPU speed boosts, it would’ve appealed mostly to the Apple II’s cost-conscious institutional education market. Would existing IIGS owners buy one instead of just getting an accelerator card for a fraction of the price? And how many new users would it bring to the platform? The Mark Twain would’ve been like the Amiga 1200: a decent improvement, but ultimately too little and too late. The March 1991 release of the Apple IIe card for the Mac LC also put another nail in Mark Twain’s coffin, because many users—especially educators—bought a IIGS for backwards compatibility with classic Apple II software. If you’re the number cruncher who had to choose between spinning up a run of 50,000 Mark Twains that cost a lot more to build than 50,000 IIe cards for Mac LCs that are already in production, already in a warehouse, or already sold to customers, which would you pick?

Now this is where you can rightfully criticize Apple for holding back the Apple IIGS. A Mark Twain with even a 12MHz CPU from ASIC would’ve been a legitimately powerful, well equipped computer for its price. But that would’ve been one last campaign in a war long since lost. Maybe ASIC could have helped, but the good ship Apple was sailing straight into a storm called the beleaguered era. Costs, staff, and projects were starting to spiral out of control, and the Mark Twain would’ve only delayed the inevitable death of the Apple II.

Sanyo, Nintendo, and the End of Acceleration

Even if Apple didn’t see fit to release a faster IIGS, accelerator cards kept the enthusiast market alive for a few more years. Upgrade guides for the TransWarp gave tips on how to install higher-rated ‘816s to get even more speed. This usually required buying samples of higher speed processors from WDC, changing crystals, upgrading the cache, and acquiring new GALs. Once you hot-rodded your card you’d often need to supply more juice over the 5V rail to keep things stable.

All this hackery was finally put to bed in 1992 when new 65C816s rated at 14 MHz hit the market. These chips took Usenet posters by surprise, especially after the ASIC saga. Fabbed by Sanyo, the 14 MHz models could run cool and stable at 5V and apparently solved the issues with the REP and SEP opcodes. Sanyo achieved this by abandoning the die laid out by Katherine Mensch and creating a new layout from scratch. Why Sanyo chose to invest in this is unclear—I found a lot of speculation that they wanted to build a PDA based on the ‘816. Sanyo’s a giant manufacturer, so I’m sure they found a use for it. Maybe WDC felt the heat from ASIC, or maybe they saw ARM and 68K pivoting to the embedded market and finally bit the bullet to stay competitive.

Existing accelerators could be upgraded by changing out their CPU and clock crystal, but by this point the IIGS was but a fraction of the ‘816s out in the wild. Optimistic estimates of the number of IIGSes shipped hover around 1.25 million. The other ‘816 system that most people know—the Super NES—sold 1.3 million units in 1990 just in Japan according to numbers compiled by NeoGAF user Aquamarine, who claims to have gotten them directly from Nintendo's Kyoto offices. The system sold nearly 50 million units worldwide in its lifetime. Mensch is very proud of the SNES using the 65816 and speaks highly of working with Ricoh, the manufacturer of Nintendo’s chips. Seeing as the 5A22 was a custom job by Ricoh, I wondered if they fixed the SEP and REP problem during its design. It wouldn’t be beyond them; they did tweak the 6502 to dodge patents when designing the NES’ 2A03. I haven’t seen the SNES assembler community complain about issues with REP and SEP with the basic 3.58 MHz part, but that doesn’t mean problems don’t exist. Same with emulation and FPGA groups, though I’d still defer to experts in that field. Checking a high-resolution decapped scan of a 5A22, the CPU core looks very similar to the first-generation layout seen in the Programming the 65816 book—I’m guessing it still has the REP/SEP flaw.

And like the IIGS, the SNES could take advantage of accelerators thanks to add-on processors in cartridges. The most well known is the SuperFX, but Ricoh also made a faster 65816. Better known as the Super Accelerator-1, the Ricoh 5A123 runs at a 10.74 MHz clock speed—three times faster than a 5A22. You’ll find it in fan-favorites like Super Mario RPG and Kirby Super Star. SA-1 games started shipping in 1995, which is late in the Super NES’ lifecycle. I haven’t found a decapped scan of the 5A123, but Nintendo was shipping 1-Chip SNES consoles around the same time. The 5A122, which combined the CPU along with the SNES’ custom PPU chips, formed the heart of the 1-Chip console. Decapped scans of that chip exist, and its layout looks very similar to the post-Sanyo core. I’d bet that the 5A123 has that same core, and doesn’t suffer from REP and SEP issues.

ARM, Möbius, and The Road Not Taken

Apple discontinued the IIGS in December 1992. Even though the IIe hung on for another year or so, the platform truly died that day. Even in the timeline where WDC had 8MHz chips ready in 1983, and Apple put the GUI on the IIx in 1984, I still think the Apple II as we knew it would’ve died eventually. There’s several limitations that would necessitate an eventual migration to a new architecture.

The primary roadblock is the Mega II side of the machine. This is supposed to be what makes it an Apple II, but it drags the rest of the machine down with it. Depending on the video system for clock generation and timing was a practical engineering choice that became hard technical debt for almost all 1980s computer architectures, especially ones that traced their roots to the 1970s. The GS is an excellent example of trying to maintain compatibility while adding capability, but it had an obvious ceiling.

But something that gets lost in the 65816 versus 68000 arguments is that CPU architectures have qualities beyond raw speed. You might care about power consumption, memory mapping, or ease of acquisition. And all these factors are a balancing act depending on your application. The 68K’s massive flat memory space was a big point in its favor, as well as its native support for user and supervisor separation. These don’t matter as much to someone who’s writing hand-crafted single-tasking assembly language apps, but they very much matter to someone building a multitasking UNIX workstation.

And that’s not to say that 68K isn’t good for assembly applications. It’s got a great reputation among assembly programmers. But as the world turned to compilers and development environments the 68K architecture was primed for the rising popularity of languages like C. More people were getting into programming, and it’s unrealistic to expect them all to have the potential to become machine language maestros like Rebecca Heineman or Nasir Gebelli. C compilers exist for 6502 and 65816, but it's fair to say that these architectures weren't particularly suited to the intricacies of C.

Another sticking point for high-speed 65816s is the speed of memory. Assuming you threw the entire Mega II side of the IIGS away and built a new ‘816 computer from scratch, a 14 MHz example would need very fast main memory to operate without a cache. In the early 90s, that kind of memory was barely available. How Woz would have built his 8 MHz IIGS in 1985 while affording a decent amount of memory is a rather inconvenient question.
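Some rough arithmetic shows the scale of the problem. These numbers are my own illustration, not anything from Apple or WDC:

```python
# A cache-less '816 gives memory roughly one clock period per access.
cpu_mhz = 14.0
cycle_ns = 1000 / cpu_mhz
print(f"{cycle_ns:.0f} ns per bus cycle")            # ~71 ns

# Addresses aren't valid for the whole cycle; assume (generously)
# that half of it is left for the RAM to respond.
print(f"~{cycle_ns / 2:.0f} ns usable access time")  # ~36 ns
# Commodity DRAM of the era was ~70-100 ns: not nearly fast enough.
```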

Apple wasn’t the only company facing the limitations of their early designs. Commodore and Atari decided to pivot to the 68000, like Apple did with the Macintosh. Tech debt eventually caught up with them too, especially Commodore, but the problems weren’t unsolvable—the Amiga tried to migrate to PowerPC, after all. Even the Macintosh and IBM PCs had eventual reckonings with fundamental planks of their platforms. Another company was facing the same conundrum: Acorn Computers. The similarities to Apple are there: they were dependent on a 6502-based architecture and had a crucial education market with a significant installed base. Acorn did ship a computer with the 65816—the Acorn Communicator—but when Sophie Wilson visited WDC in 1983 and saw Mensch and crew laying out the 65816 by hand, it struck her: if this motley crew on a card table could design a CPU, so could Acorn. Thus Wilson and Steve Furber forged their own CPU: the Acorn RISC Machine.

Though today’s Arm and WDC sell very different CPUs, they do have something in common: they’re fabless semiconductor designers that license their cores and architectures to other companies. Apple is of course Arm’s most famous partner: they joined Acorn and VLSI to form Advanced RISC Machines in 1990 to create the ARM610 CPU to power the Newton. But what you might not know is that Apple’s involvement with ARM originates with a desire to replace the CPU in the Apple II. The little known Möbius project helmed by Paul Gavarini and Tom Pittard in the Advanced Technology Group was an ARM2-based computer that could emulate 6502, 65816, and 68000 code. Gavarini and Pittard started the project in 1986 and were demoing and compiling benchmarks in 1987—right on the heels of Acorn releasing the Archimedes! There’s little information about this on the web, with Tom Pittard’s bio and Art Sobel’s ARM pages being some of the surviving hints to its existence.

Based on my knowledge of how ARM2 works, I believe the emulation performance of Möbius is wholly derived from the ARM2’s memory system. ARM’s designers were inspired by the 6502’s efficient memory access and optimized the 8 MHz ARM2 to wring as much performance out of the available memory as possible. By using 4 MHz 32-bit-wide fast page memory, pipelining, and special load-store instructions, ARM2 could perform burst transactions at twice the speed of random ones. With a theoretical maximum of 32 MB/s bandwidth in fast page mode, this was eight times the maximum bandwidth of an 8 MHz 68K shackled to 2 MHz DRAM. This strategy would peter out eventually because memory speed couldn’t keep up with CPU speed, but hey—that’s what cache is for!
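The bandwidth comparison works out like this. The figures are the ones quoted above; the 68000’s 16-bit data bus width is the only thing I’ve added:

```python
# ARM2 vs. 68000 memory bandwidth, in MB/s.
arm2_random = 4 * 4             # 4 MHz random accesses x 4 bytes = 16 MB/s
arm2_burst = arm2_random * 2    # fast-page bursts run twice as fast = 32 MB/s

m68k_peak = 2 * 2               # 2 MHz accesses x 2 bytes = 4 MB/s

print(arm2_burst, m68k_peak, arm2_burst // m68k_peak)   # 32 4 8
```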

I’m not sure if Möbius would have been the Apple II’s savior—Acorn’s Archimedes wasn’t 100% backwards compatible despite including a BBC Basic interpreter and eventually a BBC Micro emulator. But with Econet network adapters and expansion podules to connect old Beeb peripherals, the Arc could ingratiate itself with Acorn’s existing installed base. Could the Apple II have been reborn with an ARM CPU? Maybe. Nobody mentions how well Möbius integrated the rest of the Apple II architecture like slots or video or sound. And say what you will about Apple’s troubles with Western Design Center; ARM was barely a blip on the radar in 1988. Apple wasn’t going to upturn their cart for an unproven architecture from a competitor. Möbius was a skunkworks project; it could’ve taken years to turn its demo into a shipping product and the 68040 was already on the drawing board in 1988. But it was still worth it: Möbius’ benchmarks convinced Larry Tesler that ARM could save the Newton from the disastrous AT&T Hobbit processor. And hey—without ARM, Apple wouldn’t have the iPod, iPhone, and Apple Silicon today. So it worked out in the end.

The End of an Odyssey

What a ride, huh? Thanks for making it this far down a fifty-plus minute rabbit hole. I can’t claim that this is the final take on the subject—so many of the players aren’t on the record—but I’m pretty confident in saying that Apple did not artificially limit the IIGS’ clock speed during its development for marketing purposes. Now, I’m not a fool—I know Apple didn’t push the IIGS as hard as it could, and it was very much neglected towards the end of its run. If the REP/SEP flaws hadn’t existed and GTE could’ve shipped stable 4 MHz chips in volume, I’m sure Apple would’ve clocked them as fast as possible in 1986.

I’ll admit that I initially started this deep dive out of spite. The idea that “Steve Jobs deliberately held the IIGS back, bleh bleh” is everywhere, but it's all just people saying things in an endless game of telephone, with no actual evidence. That’s enough to grind anybody’s gears, but what’s worse are people who I respect uncritically repeating these things in videos and blog posts. It hurts me to see videos with millions of views repeating old Internet urban legends pushed by partisans filtered through Wikipedia articles with bad citations.

But that spite subsided quickly once I started untangling the messy web of people and circumstances that wrapped around this story. I realized that what I wanted wasn’t to prove anybody wrong. What I wanted was to get people to think about these stories and why they became legends. Of course, you could flip that right back around at me and say “who made you the guardian of love and justice?” And that’s a fair point. But my goal here isn’t to push an agenda, but to get a better understanding of how things happened and why history went the way that it did. I’ve provided my evidence, and it’s up to you to judge if my argument is compelling enough.

And even then, one of those people who needed a refresher on computer legends was yours truly. I’ve done my share of repeating things based on bad sources, just because I had a fanboy desire to defend something. Or because I just thought a story was neat and didn’t look too deeply into it. It’s not so much that somebody’s being wrong on the internet; it’s the realization that I could be the somebody who’s being wrong on the internet! It was humbling, really. As the years go on I’ve realized that there’s so much out there to learn, even if I thought I was already an expert. A perfect example is that Bill Gates BASIC easter egg, where I got tripped up by an oft-repeated legend until I actually dug into it. And as vintage and retro tech enthusiasts, are we not people of science? Are not our minds open to new ideas? We’re into this because we enjoy the history and personal connections, and we should all be excited about digging deep and not just repeating the same old story.

Even though the IIGS may not have been able to unleash its full potential, it’s still an amazing machine even at its base speed. If you haven’t had the chance to play with one, give it a try. And turn on accelerator mode in your emulator to get a feel for what could have been.

Installing OS/2 on a PS/2 Model 80: 5000 Subscriber Special

Hi, I’m Dan. You may remember this PS/2 Model 80 from such videos as the IBM 386 Tower of Power. Back in January I threatened you with the good time of installing OS/2 on this beast, and today’s your lucky day. Here, in Userlandia, we’re getting warped.

OS/2. Other than some hands-on time at vintage computer festivals in the past few years I’m a complete noob when it comes to the quote “Better Windows than Windows.” I never saw a live Warp 3 system during the Nineties, and I touched Warp 4 once or twice in the Aughts. But I do have a fascination with operating systems and how we interface with computers. And when I found this Ten Minute Guide to OS/2 Warp at the thrift store a few weeks ago it reminded me that yes, I promised you an installation of OS/2 on this mighty Model 80. And I’m a nerd of their word, so let’s cash in that IOU and take this machine to Warp.

Today I’ll be installing OS/2 Warp 3 Connect Blue Spine with the OS/2 Bonus Pack. Sorry to disappoint anyone who wanted to see me swap forty-plus floppies—I’m not enough of a masochist to risk one bad floppy torpedoing the entire install. What sets the Blue Spine edition apart from this boxed Red Spine edition is that Blue Spine comes with Windows 3.1. Red Spine was pitched to users who already owned Windows, thereby saving IBM and the customer the cost of a Windows license. Using a Blue Spine CD-ROM will save me from finagling more floppies, and Connect includes local area networking. I’ll also need to apply FixPaks at some point, but I’m leaving that for another day.

Returning viewers might recall that this Model 80’s equipped with a 90MB hard disk and no CD-ROM drive. Instead of messing around with other disks and drives I’m using a BlueSCSI to emulate a hard disk and CD-ROM. This requires booting from the reference disk and sitting through minutes of clunks and chirps while it auto-configures. But my patience was rewarded as it successfully identified the new drives. Huzzah! Next, installing from CD requires booting up from a pair of floppies. After I chose my desired installation type—I’m an advanced user, and I need an advanced installation—the installer partitioned the hard drive image and forced the first of many restarts.

Another reason to choose Advanced Install is to set up IBM’s Boot Manager, which lets you boot multiple operating systems from various drives or partitions. It’s a fairly robust and easy-to-use boot loader for the era. You can also pick the file system for your hard drive: MS-DOS FAT or OS/2’s High Performance File System. HPFS provides nice features like long filenames and large volume support, so I’ll choose HPFS.

I’ve got some time to kill while the installer installs, so let’s chat about OS/2. What I won’t be doing is an exhaustive history of OS/2 in this video. If you’re interested in good modern takes on OS/2’s turbulent history you should check out the excellent retrospectives by Another Boring Topic and RetroBytes. This is more about exploring what it’s like to run OS/2 on a 386. I missed out on OS/2 back in the day, mostly because Microsoft and Apple consumed so much air during the mid-Nineties in the US. When I started reading mainstream computer mags in 1995 the hype train for Windows 95 was full steam ahead. I saw OS/2 ads in PC/Computing but the actual editorial content barely discussed it. A three-way comparison between OS/2 Warp, Windows 95, and Windows 3.11 in the August 95 issue was about the most exposure I got to Warp’s headline features.

What little I read about OS/2 was more than I’d seen in person. OS/2 was completely absent from my schools, even in the tech labs that ran powerful software like CAD programs. I grew up in Digital country, where we used DEC terminals to search library card catalogs that ran on VMS. Accessing email at school required telnetting into a UNIX shell to run Pine. Maybe the ATMs for our local banks used OS/2, but I never knew for sure. Then again, I had the same lack of experience with other OSes like NextStep, Solaris or IRIX.

So far the installation is going well, which wasn’t always the case. Warp’s installer was a major improvement over 2.0’s, but I’ve still seen several sorrowful stories of setups gone awry. To be fair, installing on a PS/2 is basically doing this on easy mode. Users on clone systems might not be so lucky depending on their BIOS and cards. Straying outside of IBM’s hardware garden in early versions of OS/2 was a path to pain, although it did get better in Warp 3 and 4 because each version included more drivers. I don’t think Warp completely shook OS/2’s reputation for brittle installers, but I’d love to hear the tales of your installation successes or failures.

Let’s check back in on the installer. The first phase is nearly complete, and after a reboot it’ll boot from the hard drive to start the second phase. This is where I learned a quirk about the MCA Tribble SCSI adapters—apparently they can only boot from drives that are less than a gigabyte. This and other limitations can be lifted if you swap in newer ROMs from a Spock adapter. I mention this because I originally started with a 1GB disk image and found myself stuck at a blinking cursor after the first phase completed. I was more successful after switching to an 800MB disk image and re-running the installer.

At the start of phase two I’m asked to confirm hardware devices like graphics, sound, printers, and ports. Once again installing on a PS/2 made things easy—it’s all detected and ready to go. The next step is selecting OS/2 software components, and I’ll select them all for a complete install. Lastly, since I’m installing Connect I must choose my networking stacks. Might as well YOLO and install them all! Interestingly, it didn’t auto-detect my Western Digital ethernet card even though it’s configured in the PS/2 BIOS. I thought I’d have to load a driver from a floppy, but after a quick Google I learned that the bundled Standard Microsystems EtherPlus MCA driver was compatible. I selected it and the installer carried on.

While phase 2 continues let’s investigate what’s inside an OS/2 Warp box. This Red Spine edition—along with most of the boxed operating systems and office suites you see on my shelves—was part of a software collection I acquired from somebody decluttering their garage. As far as I can tell this is a complete copy that’s never been used. The top layer of this box cake is a sub-package for the OS/2 Bonus Pack. Inside is a license agreement, a thin guidebook, and an old disclaimer scroll warning you about the dangers of the internet. After thumbing through the inserts you’re left with sixteen floppy disks, which is almost as many as the OS itself. I don’t envy whoever drew the short straw to babysit floppy installs back in the day. I’ll babysit a CD-ROM Bonus Pack instead.

Next is a 400-plus-page OS/2 Warp User’s Guide. Printed in glorious black, white, and Pantone 286 blue, it explains the basics of installing and using OS/2. It’s pretty thorough, with multiple chapters on troubleshooting and fixing compatibility problems. OS/2 Warp included an interactive tutorial, so you might not need most of the getting started part of the manual, but it’s there in case you’d like some light evening reading.

If that’s not enough reading material, then check out the Fall/Winter 1994 edition of IBM Sources and Solutions. This tome cataloged OS/2 compatible hardware and software along with qualified vendors and consultants. Some fun IBM ads are sprinkled throughout, but the most entertaining bit was finding all the defunct stores and consultants in my area. RIP to Egghead and Software Etcetera. And I have to call out the copy editor for letting misspellings of Worcester, Massachusetts escape their red pen. For shame, for shame!

It wouldn’t be a software box without registration and support cards. The OS/2 registration card is pretty straightforward—fill out the fields and drop it into a mailbox to tell Big Brother… er, Big Blue all of your demographic info and usage habits. You can rest easy knowing that you’ve sold your personal data to a major corporation for absolutely nothing in return. There’s also a technical support card entitling you to sixty days of telephone support starting from your first call. I dialed the 800 number just for laughs, and these days it goes to what I presume is a scam to get seniors to sign up for a knockoff Life Alert device. Bummer.

IBM also saw fit to include ads for several third-party software products. Golden ComPass is an assistant for CompuServe that downloads emails, news, forum posts, and so on for offline consumption. Computer Associates’ CA-Realizer is a graphical BASIC development environment. Watcom VX-REXX is another visual development tool, this time for IBM’s REXX scripting language. And Adobe wanted you to buy some fun font packs to leverage OS/2’s built-in Adobe Type Manager. $80 for 64 fonts doesn’t sound too bad for the time, though I’d classify most of these typefaces as gimmicks.

Lastly, there’s the requisite legal paperwork and last-minute technical bulletins. The license agreement isn’t as interesting as this warning about XDF-formatted diskettes, or a tip sheet for installing OS/2 on top of an existing Windows setup. But the most fascinating one is this folded sheet asking you to call IBM if you’re installing OS/2 on an IBM Aptiva. Based on the Aptiva’s launch date my hunch is these boxed copies might be missing some necessary drivers.

After digging through all this paperwork we’ve reached the treasure hidden at the bottom of this beastly box: a cache of 21 floppy disks. For those keeping score, that’s eight more than the retail RTM version of Windows 95 and one fewer than Windows NT 3.1. And that number’s actually lower than it could be thanks to XDF-formatted floppies cramming 1.86MB of data onto standard 1.44MB high-density disks. Combining these floppies with the bonus pack makes for a whopping 37 disks, and a Blue Spine box would have even more for Windows support.

With the lengthy phase two finally complete, we’re on to the even lengthier third and final phase: OS/2 Warp Connect. It got its own special installation display, and partway through it switched to a video mode that none of my LCDs liked. I had to dig out a CRT to track its progress, and whatever mode it used was more unfilmable than usual. So in lieu of that, there’ll be some Bonus Pack installation footage instead. Check out that happy little PC!

And while that installs, this wouldn’t be a 5,000 subscriber special without a 5,000 subscriber special channel update. I’m humbled by the positive reception to my videos, and I genuinely appreciate the time you spend watching, discussing, and sharing my work. I resisted YouTube for a long time because I’ve always had a rocky relationship with recording and editing video. That’s a bit ironic considering that in 2007 it was Slowbeef, Maxwell Adams, and yours truly who respectively made the first, second, and third video Let’s Plays on the SomethingAwful forums. I hung around as one of the original LP superstars until I bowed out around 2012 to concentrate on my convention artist alley tables.

I’ve been writing about computers here and there for almost twenty years, though most of that material has vanished after various websites went offline. If you’ve been listening since the Icon Garden days you might remember my solo Macinography podcasts, some of which might be adapted into future videos. Although my style’s changed a bit since then, what I launched as Userlandia in 2021 is broadly similar to that work. But after a year and change of Userlandia not getting much traction as a blog/podcast I conceded to the reality that the audience for what I’m doing is on YouTube.

I made the switch to video with what amounted to a glorified slideshow for the 2022 VCF Midwest podcast. It was also my first time using DaVinci Resolve. My prior video editing experience was with Camtasia or iMovie, and the one time I tried Premiere didn’t leave me with a good impression. But I picked up Resolve pretty quickly—as it turns out, my skills as an audio editor translate pretty well to editing video. As time went on, though, I realized I needed to take my production to the next level and record actual footage. Thus began my crash course in motion picture filming. My video quality has improved compared to my early attempts, but I’ve still got a ways to go.

Most of my work starts with a script. For something like a Computers of Significant History video I start by researching a subject, taking notes, and writing an outline. Scriptwriting can take weeks, depending on the complexity of the subject and whether I need to run things by an editor friend for a double-check. Once I’m happy with a script I record and edit audio, which takes around three to four hours for a half-hour track. Then I compile a shot list to figure out what needs to be filmed and what gaps can be filled with stills, clips, or gag reels. My camera moves are fairly basic—pans, tilts, slides, zooms, and focus pulls. All the footage and resources are cut together to form a video, and then I polish it up with some background music. This process is largely the same for my trip report videos, except the footage is recorded before scriptwriting. And this video is a bit of a hybrid—I wrote some script ahead of time, recorded some footage, and then wrote some more during and after the installation.

Sticking to a script has its pros and cons, but it’s probably the best for me. I have a lot of respect for people who can speak extemporaneously to a camera, or narrate off-the-cuff over footage they’ve edited together. Being on camera is difficult for me, and until recently I didn’t have enough room to really try it. Over the past month I’ve picked up a wireless microphone and rearranged my filming room to create more space, so I may try being on camera in the future. Another complication is my filming camera: a Sony a99 II. It’s great for stills photography, and its mediocre video capabilities weren’t really a concern when I upgraded from my a99 mark 1 in 2018. But now that I’m recording more video I’m crashing right into its limitations. It overheats when recording long 4K clips and its 1080p output is plagued with moiré on screens and textures. There’s also many restrictions on video autofocus, and tracking moving objects or my face is a real challenge. At some point this year I’ll migrate to E-mount—probably around my trip to Japan in November—but I’m making do for now.

All these factors influence my production style. My goal is to produce long-form narrative stories, like telling my personal history of personal computing with Computers of Significant History. Or examining a specific machine like the SE/30 or this Model 80 from an angle you might not have seen before. I don’t think I’d bring my best material if I filmed my unscripted thoughts. And I don’t think that kind of video is bad—far from it! What I am saying is that it doesn’t play to my strengths. There’s tons of talented people making great walkthrough and talk-through material, and they all deserve your attention.

But I believe one of the best pieces of advice for anyone in a creative field—whether you’re an artist, writer, videographer, whatever—is to make the things you want to see. I was raised on a healthy diet of The New Yankee Workshop, Shadetree Mechanic, and The History Channel. YouTube is full of people making shows like these but for computers, which spoils me for choice. But I love watching people who know what they’re talking about sharing their expertise and enthusiasm with others while showing they can learn and grow. That’s what I want to bring to the table.

LGR remains the gold standard in this space for a reason: he’s constantly tweaking and improving his production in small but noticeable ways while consistently releasing new material. Every video might not be to every viewer’s taste, but his frankly astonishing ability as a one-man band to publish polished pieces on a near-weekly basis means that if you’re not feeling this week’s subject, the odds are pretty good you’ll vibe with next week’s. Clint makes what he wants to make, and even when things go sideways he still commits to the bit. He never assumes he knows everything, and he’s open to suggestions and constructive criticism aimed at improving his work. I was really flattered when I got some comments on the Model 80 video comparing me favorably to LGR. We share a lot of common influences and tastes, after all. But I’m not trying to be LGR, because he’s already doing a great job of being LGR. I want to do my own thing.

Lastly, LGR is one of many creators who’re willing to spotlight other people’s channels and products or lend their expertise when somebody’s in a bind. Folks helping folks succeed is honestly one of the best aspects of this community. This stuff isn’t zero-sum—helping somebody with their production or shining a spotlight on a new creator won’t detract from the success of your own work. Keeping the community’s enthusiasm high will lift the boats of view counts and watch time for everybody.

I also want to share some plans about future projects I have coming down the pike. The biggest one is a nearly hour-long video with a working title of “The Apple IIGS Megahertz Myth.” It’s about the development of the 65C816 CPU and the IIGS leading to a fraught relationship between Western Design Center, Apple, and the Apple II community. I’m pretty happy with the state of the final draft—I finished the script in March right before my trip to England—but I need to record the audio and assemble all the video coverage. One reason I’ve put off recording it is that I’ve been using Invisalign for the past ten months. My treatment will be done by the time you hear this, but it’s put a real damper on recording sessions. You’re not supposed to take out your aligners for more than an hour at a time, and recording with trays on my teeth was a no-go. Hopefully I’ll get video production going by the end of June and maybe release in late July or early August. Aside from that mega-project I’ll aim to produce something a little lighter to fit in between that and VCF Midwest. Yes, I’ll be there and I’m planning my usual coverage.

How about Computers of Significant History? That’s a project with a definite end point, though I don’t see myself getting there for a while. I have four machines in the queue: HP Minitowers of the year 2000, the Titanium PowerBook G4, the Blue and White G3, and my first self-built computer. I’ve already acquired the hardware and the next step is writing the scripts and producing the videos. My plan is to use them as a springboard to discuss online communities and consumer operating system transitions. Obviously I’ll run out of PCs for this timeline, but I’m planning a series about software which I’ll probably call Software of Considerable Importance. Just in time for the thirtieth anniversary of Windows 95!

The last bit of literal and figurative business I’d like to bring up is Patreon. People have been asking me about ways to support the show, and I said I would consider something like Patreon after passing 5,000 subscribers. Well, here we are. I’m under no delusions that I’ll be able to quit my day job and live off this hobby’s revenue. But what I’d like to do is get this channel on the path to self-sufficiency, so Patreon funds will be invested back into video production. Computers and parts cost money, and so do cameras, hardware, lights, and music licenses. My initial focus for the launch will be early video access and occasional bonus content. Maybe I could post work-in-progress scripts. If you’re one of the people who’ve been bugging me about it, you can find it at patreon.com/userlandia.

And now, three installation phases later, we’ve arrived at OS/2’s desktop: the Workplace Shell. The subjective experience on this Model 80 with a 20MHz 386DX, 387 FPU, 16MB of RAM, and an XGA-2 graphics card is a slow one. It takes minutes to bootstrap—which is mostly my fault for loading up on network services—and once you get to the desktop you should let it sit until the disk activity light calms down. It’s not unusable, but you’ll encounter some slowdowns and stopwatches. Mac OS X 10.0 Cheetah feels snappy by comparison.

Disk access times aren't too bad thanks to BlueSCSI, but throughput is held back by the 386 and Tribble card. Some folders take forever to open if they’re full of objects with lots of metadata, like a template folder, because the system has to parse and interpret every object’s properties. If all you asked of this machine was to serve files or host BBS nodes its performance would probably be acceptable. I’ll give it credit for being performant enough to play the video equivalent of canyon.mid: macaw.avi. IBM loved touting OS/2’s multimedia bona fides, though much like the rest of OS/2 Warp these multimedia features would be happier running on a 486.

But what I lost in speed I gained in reliability. OS/2 had a reputation for stability thanks to preemptive multitasking and memory protection. This foundation wasn’t just for OS/2-native apps—it enabled OS/2’s Windows subsystem to run Windows programs in an isolated process. When your monster Excel spreadsheet inevitably crashes it won't take the rest of OS/2 down with it. If you had lots of RAM you could even run each Windows program in its own instance, so that same Excel crash would only take out Excel and not your other Windows apps. OS/2, DOS, and Windows programs could run side by side without stepping on each others’ toes. Yes, there is the synchronous input queue problem, which could cause the user interface to lock up while the underpinnings chugged along. But that was… mostly addressed in Warp 4 and backported to Warp 3 via FixPaks.

If you want to try some native OS/2 apps the Bonus Pack’s got you covered. IBM Works is a light productivity suite featuring a basic word processor, spreadsheet, database, and personal information manager. And while they’re not on the level of Lotus SmartSuite they are fully-featured OS/2 apps that leverage IBM’s System Object Model. You can easily embed an auto-recalculating spreadsheet chart in a word processor document, which was all the rage at the time. When you need a bit of downtime you can switch over to OS/2 Solitaire, which presumably is sixteen more bits better than Windows Solitaire. Too bad those extra bits can’t buy whimsy because Windows still has the edge when it comes to art design.

The Bonus Pack also includes an IBM internet connection kit, which bundles online services and ISP connections along with dial-up networking. But the most interesting piece is IBM WebExplorer, an in-house web browser developed specifically for OS/2. The history of this browser could easily be its own video, but it’s probably best known for letting websites customize the throbber animation. It was also remarkably fast at downloading and rendering webpages for its era. But WebExplorer was doomed to fail. The egalitarian ideal that anyone could make their own web browser was no match for the reality that developers and users would gravitate to one or two popular choices. IBM would abandon WebExplorer in favor of Netscape Navigator for OS/2.

The GUI glue that binds all these programs together is the Workplace Shell, and its user experience is… reasonable. IBM got religion about objects in the 90s and launched the System Object Model, or SOM, with OS/2 version 2. Icons are objects, printers are objects, drives are objects—I had a hard time finding something that wasn’t an object! You can install new object classes, like the light table slide objects from the Bonus Pack’s Multimedia Kit that create thumbnail previews for photos and videos. Third parties could leverage SOM too—Stardock’s flagship product is called Object Desktop because it started out on OS/2.

SOM turned out to be a dead end—forcing every user interface problem into an object metaphor was a bit of a stretch. There are other UI shortcomings of the time, like the absence of a visual task switcher and the directory browser tree windows. But there’s a lot to like about Workplace Shell if your alternatives were the Windows Program and File Managers. It’s very good at remembering the positions of your icons and windows. Everything is right-clickable with robust and well-organized context menus. The system can restore your programs and workspaces on a restart. Shadows and Program Objects behave like aliases on the Mac and shortcuts on Windows. You can group objects together into a Work Area folder, which, when double-clicked, will automatically open all of the included documents and programs. The LaunchPad isn’t as good as a taskbar in Windows or a Dock in NeXTSTEP, but it’s leagues ahead of Windows 3.1’s nonexistent launcher.

Exploring all these features made me feel, oh, 20% more productive. If Mac OS is a quirky artist and Amiga OS is the auteur filmmaker, then OS/2 is the business casual middle manager. So many elements of this system feel corporate, for better or worse. The UI is called the Workplace Shell. The deletion object is called “the shredder,” and not because the designers were disciples of Oroku Saki. IBM’s marketing and training material all focused on typical business tasks. Games were scarce. Multitasking was pitched as a way to get more things done and still have time to break for lunch. Networking was about servers and directories and meetings.

Pitching a system that was work-focused wasn’t unheard of in an era when “workstation” was a thriving market segment. Workstations were serious business, and they ran OSes with higher system requirements for tasks where crashes could prove costly. Domain OS, VMS, Solaris, AIX, IRIX, WinNT, the list goes on. And it’s not like microcomputer users couldn’t use the benefits of preemptive multitasking, protected memory, and networking. Amiga users had preemptive multitasking out of the gate! But doing all three in the limited memory and CPU power of most 1980s PCs was too great of an ask.

But consumer PCs in 1995 finally had enough memory and CPU power to multitask, and average users who tried multitasking in Windows 3.1 were met with constant crashes and freezes. In that context OS/2 Warp could’ve been a contender. Its system requirements weren’t all that different from Windows 95’s, plus it had strong compatibility with existing DOS and Windows programs. It also beat Win95 to release by nine months. The opportunity was there for Warp to overtake Windows, but the collapse of IBM’s Personal Systems division sealed its fate despite OS/2 Warp 4’s launch in September 1996. How did IBM squander so much money, talent, and technology?

A constant source of trouble for OS/2 was its reputation as a memory hog. In the late 80s and early 90s the US-Japan DRAM tariff war meant average PC users couldn’t afford enough RAM to run OS/2 without performance-degrading virtual memory. By the time computers caught up to OS/2’s requirements IBM’s position in the market had drastically changed. Users who made the commitment to OS/2 found it difficult to find native programs to exploit its power and stability. Big Blue had a hard time convincing developers to stick with OS/2 due to expensive developer tools and slow market growth. Microsoft was dominating with Windows 3.x, and used its power to forge aggressive license agreements to freeze other OSes out of the OEM market. And we’re not even touching IBM’s misguided microkernel misadventures which resulted in the abortive OS/2 Warp for PowerPC.

Yet even if OS/2 was preinstalled on more PCs there were other factors that would doom Warp. IBM had about zero skill in pitching OS/2 to the broader consumer market. Its Get Warped advertising campaign was a tired, out-of-touch exercise that excited nobody. The Fiesta Bowl sponsorship cost IBM millions of dollars with nothing much to show for it but broadcasters cracking jokes about OS/2. And their plans for a Warp Speed-themed launch event were thwarted when negotiations with Paramount’s legal department broke down. Compare that to Microsoft, who pulled out all the stops to bring the Rolling Stones on board to start up Windows 95’s marketing juggernaut. But let’s say IBM managed to market OS/2 better and captured ten or twenty percent more market share. It wouldn’t have survived Lou Gerstner’s radical reorganization of IBM, which saw the eventual divestment of its consumer PC business.

Yet despite all the bungling and false starts IBM did manage to produce in Warp a stable, powerful OS that was beloved by the users who found it. These fans—and slow-moving, conservative businesses with sunk costs—kept its spirit alive even after IBM pulled the plug. Maybe I would have been one of those fans if I had used it back in the day. Yes, there were several things about it that frustrated me, but overall I found OS/2 Warp 3 to be remarkably charming. Warp 4 fixes a lot of those frustrations while keeping its foundation as a powerful, reliable system. I can see a path where it evolved into something that satisfied both the home and business user in the same way Windows NT became the Windows we begrudgingly tolerate today. That is, if IBM had paid any attention to markets other than office productivity. My gut tells me that they wouldn’t have prioritized development of an API like DirectX. But I’ll set aside that argument for another time.

I suppose I should call it a day for my adventures with OS/2. If you made it this far, I’d like to thank you for sticking with me and indulging this 5,000 subscriber celebration. I’m amazed the installation went as well as it did. There weren’t any real problems aside from that 1GB drive limitation, which wasn’t OS/2’s fault. Installing a different OS on a weekend afternoon is something I liked to do back when I was a computer-obsessed teen thumbing through stacks of computer books. If you enjoyed this, please let me know and consider subscribing if you’d like to see more. Thank you for reading, and I’ll catch you next time here in Userlandia.

The VCF East 2024 Review

It’s spring. Do you know where your vintage computer enthusiasts are? Here, in Userlandia, they’re returning to VCF East.

Welcome back, my friends, to the computer show that never ends. That’s because Vintage Computer Festival East—the great gathering of obsolete computer fans on the eastern seaboard—is upon us once again. As I bombed down the bag-o-change expressway—better known as the Garden State Parkway—I pondered the questions that must be facing every returning attendee. Last year’s event surpassed all expectations in terms of turnout and things to do. Would there be better management of crowds and people traffic? Would the new consignment arrangements maximize the movement of merch? And what about food service and parking? All very valid concerns. Jeff Brace, the lead organizer, held some livestreams leading up to the show promising many improvements for the 2024 event. But would those promises match up with reality? The only way to know is to return once again to the Jersey shore. And although I’m not a YouTube celebrity, I did stay at a Holiday Inn Express during my trip, which I’m sure qualifies my credentials as a commentator.

The first change—for me at least—came at check-in time. Folks who shelled out eighty-ish bucks for a three-day pass now got a real badge instead of suffering the indignity of a wristband all weekend. A secondary gate on the west side of the campus was open during the day, which made the secondary parking lot less of a penalty box for those who couldn’t arrive at the crack of dawn to snipe a spot. The merch booth was moved out of consignment and into a dedicated room, making it easier to buy a T-shirt without waiting in long lines. And the cafeteria was fully dedicated to lunch—more on that later.

One thing that hasn’t changed much is the schedule, which is still packed full of panels, exhibits, classes, and consignments. So packed, in fact, that tables and panel slots filled up faster than ever before. Arranging exhibit halls and organizing panel times isn’t as simple as just placing tables and putting people here and there. Some tables need to be near each other; others might need to be placed in certain areas to account for electrical needs. People traffic is another concern, because crowds need to flow through rooms and bad table placement can cause choke points. All these factors play into the exhibitor layout, especially when the show is still using the same exhibit spaces.

Hall A seems to have become an unofficial vendor room, with tables focusing on folks who had cool things to sell. Eli’s Software Encyclopedia is back with another bounty of big boxes for browsing. I saw many titles that weren’t there the year before, so whatever warehouse he’s scouring for these relics is still paying out. Next door was author Jamie Lendino selling his books about computer and gaming history. Atari fans will be delighted with the available tomes for purchase.

Tech Dungeon is back, this time with a selection of joysticks! These new controllers for old computers are all made with arcade-quality parts, which means they won’t collapse from exhaustion after some strenuous exercise in Summer Games. Jeff’s Vintage Electronics returns with tables layered with cards and connectors. If you were hunting for an ISA sound card, ethernet card, or some oddly specific part he might have just what you need. Meanwhile, Emmy Bear Retro had all manner of storage tech: Greaseweazles, ZuluSCSIs, and Goteks were on hand for reasonable prices to help you replace failing drives or image your disks before they succumb to the ravages of time.

Hall B is where the exhibits begin, and we’re welcomed by RCA computer systems. Josh Bensadon’s traveling exhibit has been making the rounds from show to show—you might remember it from my VCF Midwest video last year—but this time I had enough time to try RCA’s early attempt at a home game console: the Studio II Home TV Programmer.

Want to give your beau a retro gift? Happy Hardwear had an arrangement of retro-themed pixel jewelry. Floppy disk earrings or a necklace is the perfect way to say I love you to your favorite geek. Or if you’re more interested in gifts for your beloved CoCo or Commodore, Retro Innovations returns with their array of add-ons.

Amiga of Rochester’s table was busy performing life-saving operations all weekend long. One particular A4000 board saw extensive rework and troubleshooting on Saturday night.

Across the hall FujiNet had their new Macintosh version on display. It joined the Commodore, Apple II, and Atari versions to show how this little device keeps getting more powerful. And once you bought one, you could attend one of the FujiNet sessions to learn how to get the most out of it!

You can’t have a theme about graphical interfaces without mentioning GEOS. Most people associate GEOS with Commodores, but the 8-bit GUI was ported to multiple platforms. You could swap between four different architectures using Jonathan Sturges and Alex Jacocks’ neat picture-in-picture setup.

Nicholas Mailloux’s Eighties Luggables table had some suitcase-sized semi portables running games and productivity apps. Hey, if you squint hard enough a compact Mac counts as a luggable computer.

Most people think of serial terminals as a text-only affair, but Ethan Dicks’ graphical terminals will wow you with their ability to move images over RS232. Maki Kato’s Motorola 88000 systems let you play with working examples of a rarer RISC workstation. The Core Memory crew was back with a display that seems to be getting cooler and flashier at every show. And one of the corners was dedicated to Heathkit computers, with Glenn Roberts and Alex Bodnar’s tables featuring many Heath systems and restorations.

If you’re interested in pen plotters, Paul Rickard and Erin Sicuranza’s table was drawing cool artwork all weekend at the aptly named The Plot Thickens. Plotters are making a bit of a comeback, and as a former plotter user I approve. You could even buy some of their finished work at consignment.

South America was famous for its unauthorized copies of various microcomputers, and Ricardo Setti displayed many examples at the appropriately named Clones of South America. Apple II, Commodore, and Sinclair clones were produced by several Brazilian and Argentinian companies. Most of these used pirated ROMs and therefore were quite illegal, but despite their illicit nature they encouraged many an aspiring programmer or user to start a career in computing.

Coming all the way from the Netherlands is the Home Computer Museum, with a display featuring two Dutch PCs. Philips sold PC clones in America under the Magnavox brand but as far as I know we never got their P2000 microcomputer or their MSX machines. These machines have a Dutch flavor that pairs perfectly with their licorice and stroopwafels.

Over in Hall C Ryan Burke took the “rise of the GUI” theme to heart. This gaggle of graphical Apple computers was the largest single exhibit of the show. Start your six-table-long odyssey with the Lisa, then partake in the timeline of compact Macintoshes. Your reward for making it through this expansive exhibit is this cool custom G4 Cube with a retro Apple paint job.

Apple wasn’t the only company that influenced the GUI, and the aptly named History of the GUI exhibit displayed its origins from electronic typesetting to early toolkits running on UNIX workstations.

And though you might not think of DEC minicomputers as visual powerhouses, there were several on hand to demonstrate what Digital Equipment can do for digital graphics. Doug Taylor’s Tektronix terminal displayed renders from a PDP-11, while David Gesswein’s PDP-8 spooled ASCII art to a plotter.

Amiga fans had lots to love at this year’s show, with multiple exhibits catering to Commodore’s colorful computer. Dave Test and AmigaBill’s accessory showcase spanned three tables covered in modern add-ons for Amigas. RGB lights and custom cases aren’t just for modern PCs! PiStorms, Vampires, and even the AmigaOne PowerPC tower are available to let you test drive an Amiga with a turbo boost. When you’re done checking out modern accelerators, head over to the GVP table to chat with GVP veterans Robert Miranda and Pete Keretz. They had a full deck of Amiga expansion cards from their tenure at GVP on display. Disk controllers, SCSI cards, retargetable graphics, and accelerators supercharged many Amigas back in the day. And if you wanted to see the cards in action, the fellas had upgraded Amiga demos running all weekend.

If you need a reference for TRS-80 expansion cards, Pete Cetinski’s table had two towering displays showing the many ways you could add functions to your Tandy. With some of these cards you can go where no Trash-80 has gone before!

System Source was back again with a spotlight on IBM. This massive IBM 1130 is what counted as a “midrange” computer back in the days of mammoth mainframes.

Behind the Screens set up shop again with its usual Weather Channel and Prevue Guide systems, but this year saw a new addition to their cable company contraptions: a working cable modem system. The gear inside this rack delivered broadband internet to many American homes at the turn of the millennium. All it’s missing is a Road Runner sticker. Meep meep!

More classic video fun could be found at Dave’s Retro Video Lab. His monster Sony camera attracted a lot of attention. I owe Dave an apology for not being able to chat as much this year—we’ll meet up next time!

Hey, it’s the lovable tramp, and he’s here to tell you all about the greatness of the IBM PCjr. Just don’t ask him about that chiclet keyboard. All kidding aside, the PCjr’s more capable than you think. Dan Fitzgerald set up this booth for you to try it yourself instead of repeating internet hearsay.

Friend of the show BigBadBiologist’s booth had several neat projects from their workbench on display. The key attraction was the cordyceps Mac, which had its Motorola 68030 attached to the logic board by a series of wires. IIIDIY’s station presented an eclectic collection of Apple rarities, like a Twentieth Anniversary Mac, a PowerCD, and a Mac TV connected to a Super Nintendo.

More Mac mayhem was provided by Collin Mistr—who you might know better as DosDude—and his table of hacked and modified systems. There’s no live upgrades this year, but you can see the results of his handiwork with an iMac that’s had a G4 CPU transplant.

This year’s prize for most obscure system might go to Edgardo Saez’ Seequa Chameleon. It’s a dual CPU luggable with a split personality. Those dual CPUs let the Chameleon run Z80 CP/M or 8088 MS-DOS in one box. Alas, this two-in-one combo didn’t get Seequa much traction in the market.

Brave beta testers in the crowd could test drive two cancelled operating systems at Katherine Ahlksog’s OS What-Ifs. The two systems on display were a land of contrasts: Mac OS Copland was a notorious disaster while Windows Neptune was more of a quiet detour. Publicly available Copland builds are a hot mess, and the crowd played a fun game of guessing how long it could run before crashing.

Taking over a whole corner of the hall was Totally Normal Computing, and everyone’s favorite gang of Mac Mavens returned with all sorts of new ideas. Sean of Action Retro brought his modern BeBox to spread the good news about Haiku. Mike’s Mac Shack had an Apple IIc+, the rarest and fastest of all Apple IIcs. Steve from Mac 84’s Mac-controlled LaserDisc player gathered a lot of attention, though I found this wacky split keyboard to be particularly fascinating. And Ron from Ron’s Computer Vids rounded out the crew with his very helpful collection of boards and adapters.

The last set of exhibits is over in Hall D, and the first one up is J&M Consulting. LED keyboard kits can brighten up your Commodore 64 or make your Speccy display a literal spectrum. There’s also the Retro Chip Tester Pro which can test RAM, dump or program ROMs, or even sniff out PALs and GALs.

Next door is the MIT AI Lab recreation team, simulating a PDP-10 from the Massachusetts Institute of Technology’s artificial intelligence lab in the ‘60s.

Never Obsolete: the Race to the Bottom paid tribute to eMachines, everybody’s favorite discount PC punching bag. A complete example of an eMachines Celeron with matching accessories will tug at the heartstrings of those who got their first PC experience thanks to these affordable computers.

And last, but not least, Henry Rietveld had a Nabu PC connected to RetroNet. If you managed to pick up one of those new old stock Nabu PCs, you can bring it back online with RetroNet and the Nabu Internet Adapter. Thanks to Cloud CP/M you can do fun stuff like play MSX games! Radical.

Events and Panels

If it ain’t broke, you don’t fix it, and that’s the sense I get from VCF’s event scheduling. There’s tweaks and tune-ups, but the folks in charge aren’t reinventing the wheel every year. Workshops, classes, and panels fill out a three day schedule catering to all aspects of vintage computing. It starts at the Computer Destruction Lab, where Atari 8-bit computers were the stars of this year’s computer classroom. Sessions included programming classes, free play, and deep dives into hardware architecture. Glitchworks once again offered soldering tutorials and DIY build sessions featuring their usual array of kits. This year’s spotlight was an updated version of the 8085 single board computer, and what computer is more personal than the one you assembled yourself?

Something I haven’t mentioned in recaps of past events is that a ticket to VCF East allows entry into the Federation’s computer museum, which hosts dozens of computers for you to see and use. Almost all of the microcomputers are functional, even some real rarities like a GRIDcase laptop. A working Xerox Star anchors a showcase of graphical user interfaces, and it joins a Lisa, Macintosh, and NeXT cube in demonstrating how pointing and clicking evolved over the years.

Some computers are lucky enough to get dedicated displays with artifacts of their time, like this Atari Mega STE set up as a MIDI digital audio workstation. Most systems are set up on space-saving shelving in a grouping that generally correlates with their time and contemporaries. IBM PCs, Amigas, Ataris, CP/M machines, and even a few Brits are available to play games, process some words, or partake in some programming. And before you think there’s only Johnny-come-lately microcomputers, there’s plenty here for big iron enthusiasts. The museum’s headliner is a UNIVAC-based guided missile computer that dwarfs everything else in the room. Flanked by Wang, DEC, and Data General minis, the museum has restored these old beasts to paint a picture of the pre-microprocessor period.

Next to the museum was a room hosting a live restoration by Dave of Usagi Electric. He was tasked with reviving a Control Data Hawk hard drive, which is no easy job given the complexity of these vintage drives. A crowd of onlookers watched as he carefully cleaned the platters and serviced the mechanical bits. Although it took something like eight hours, he was successful in bringing it back to life. Bravo!

Running between all these activities burns a lot of energy, and before I knew it I was hankering for some lunch. In an effort to bring in new dining options the VCF staff reached out to several commercial food trucks, but were turned down because of profitability concerns and local complications with the township. I’m not in the food truck business, but color me surprised that hundreds of computer geeks plus many families couldn’t meet the threshold for profitability.

I was hoping the fire team from the last swap meet would return with their scrumptious chili dogs, but apparently they were already booked for the weekend. The show reached out to OCEAN Community Action Partnership, a local organization that helped with Hurricane Sandy relief. Their kitchen truck served hamburgers, hot dogs, breakfast sandwiches, and snacks all weekend long for very reasonable prices. The bacon, egg, and cheese bulkie really hit the spot after a long morning waiting in line for consignments. This isn’t flashy food, but it was good service with good food for a good price, and I definitely appreciated it. I believe the proceeds go right back into OCEAN’s community relief efforts, so I hope they made out well over the weekend.

Joining the classes and workshops are the show’s many panels and roundtables. Some focused on the show’s theme of graphical interfaces, but the overall schedule had something for everybody. Whether it’s programming techniques, history, collecting, or oddly specific deep dives, you’re bound to find at least one to put on your must-see list. Dave McMurtrie of the Commodore International Historical Society hosted interviews with two Commodore alumni: Andy Finkel and Al Charpentier. Both were key characters in the development of the C64, and McMurtrie teased out many interesting stories and anecdotes. Ron Nicholson regaled the crowd with tales of his tenure at Apple. As a member of the original Mac team he has an insider’s perspective of the wild and crazy days of Apple’s pirate flag era.

Friday's streamer and creator panel, hosted by Sean of Action Retro, pulled together a fantastic combination of AmigaBill, AshSaidHi, Ron McAdams, and LadyAiluros. These are peers who understand the struggles of growing a channel, blog, or podcast, and the guests bounced between funny stories and advice for those who might want to break in. Saturday’s roundtable hosted by Dr. Rebecca Mercuri with Joyce Weisbecker and Rebecca Heineman explored their careers in the early days of computer game development. There are no dull moments in these group discussions, which goes to show how just letting interesting people talk to each other is a great recipe for learning and entertainment.

Managing the crowd at these panels is always a concern, and I wondered if the seating setup carried over from last year would be OK. Thankfully there was enough space to accommodate everyone who wanted to watch, even during the popular roundtables. What wasn’t carried over from last year was live-streaming. Dropouts and crashes plagued last year’s event livestreams, and the AV team decided the juice wasn’t worth the squeeze. This let the folks behind the recording setup concentrate on recording the panels instead of troubleshooting internet issues. Don't fret if you missed out on a panel—recordings will be posted on the VCF YouTube page.

Consignment Considerations

Last year’s consignment was, shall we say, overwhelming. The tsunami of people and product flooded the cramped spaces of InfoAge’s kitchen and rec rooms, forcing staff and attendees to cope with unexpected complications. In the aftermath of the event VCF’s consignment crew were open about the fact that they underestimated the demand. While total disaster was averted, the arrangements were no longer fit for purpose. Changes would have to be made for 2024.

The first order of business was ditching the hacky Google Sheets intermediary and creating a new self-service inventory management system. After registering an account on the online portal you could enter your items for sale with quantities, descriptions, and price tags. You could even print out your own price tag barcodes at home if you wanted to skip the label line at the show. If you needed to update prices or correct mistakes you could do it from your phone or one of the terminals in the hall. And when I reported a particularly gnarly bug in the site the team hotfixed it within an hour. That’s one advantage of a self-hosted product like NexoPOS over a service like Square. Overall it was a massive improvement over last year’s registration.

The second item on the to-do list was securing more space, and 2024’s consignment moved from the confines of building 9010-C to the Monmouth County Fire Museum’s engine house. On paper this was a brilliant idea—it’s a bigger building with more square footage and lots of outdoor space for people to line up and cars to unload. But you know what they say about the best-laid plans, and the crew ran into a major issue: a large part of the engine house’s concrete floor collapsed before the show. This rendered nearly half of the floor plan unusable for consignment. VCF’s volunteer team dealt with it as best as they could, and they still managed to carve out more space than last year despite this curveball.

Floor space wasn’t the only improvement gained by moving to the firehouse. Its position near the side gate made load-in easier than ever. People could drive their cars up to the loading doors or park beside the building for a stairway-free and hallway-free unloading experience. After getting barcode labels from the volunteers, sellers could place their stuff at any open spot. According to the schedule consignment drop-off opened at 5 PM, but shelves were starting to get a bit crowded when I unloaded at 5:15. I’m guessing the doors were opened slightly earlier.

After a busy set-up day on Friday I prepared for an early start to Saturday’s buying bonanza. I learned my lesson from last year and arrived early on Saturday morning to secure my spot in line. The gates opened at 7:30 AM and I was among the first people to line up for the 9 AM opening. Lining up outside meant the crowd wasn’t as cramped as last year, but Saturday wound up being the coldest day of the weekend and few people were dressed for the occasion. I bet somebody could’ve made some decent coin selling hot drinks or hand warmers to the ever-expanding line. But time passed quickly, and at 9 AM the gates opened to the hungry horde. The buying experience was painless—grab your find, bring it to the register, and pay with cash or card.

Lines weren’t completely eliminated this year, thanks to the aforementioned floor collapse throwing a wrench into the gears. But the volunteers did a good job at crowd control, and aside from the morning rush the lines and waits were pretty reasonable. A checkout line wound up snaking through the narrow aisles in between the back shelves, but folks in line were good about letting people through to browse. This cleared up as the day went on, and by the afternoon traffic was flowing freely in and out of the hall. Another clever touch was the free stuff shelves, which were positioned in one of the loading doors. They were still covered by a roof, yet people could walk right up from outside and take free things without having to wait in line or wade through a crowd. Smart!

I checked in on the hall at various points over the weekend to monitor the vibe and watch for new arrivals. While browsing I noticed fewer bargains on rarer or unique items compared to last year. Not to say they didn’t exist, but I didn’t see anything quite on the level of the $100 A600 from last year. I believe there’s a few reasons for this, most of which are outside of VCF’s control.

First, sellers might be doing a little bit of fishing by putting rare or unique items out there with a higher price in the hopes that someone’s fear of missing out will get them to open up their wallet. If nobody takes the bait, the seller will lower it to garner a few more bites. The new POS system’s database made it easier to add contact information for negotiations, so haggling or trading was more accessible than before. I actually got a phone call from someone who wanted to do a trade, though I declined because I already had what they were offering. It’s not as smooth as negotiating at a swap meet, but it gets the job done.

Second, I think sellers are less willing to offer low prices due to flippers. Wouldn’t you be annoyed if you saw someone buying a machine you listed at $100 and saying they were gonna flip it for double the price? I know people want to keep the spirit of community in mind by not maximizing their profits, but allowing flippers to exploit people is probably worse for the community. Pricing wares somewhat below instead of significantly below market value deters the flippers while still feeling like a deal to most buyers.

Third, sellers are factoring the show’s commission into their prices. This year the cut was 18%—up three percentage points from last year. I think people generally recognize the value of the commission, because it helps fund the show and run the hall. I certainly value it, because it saves me from vendor complications like being chained to a table and dealing with sales tax. But people are cognizant of the commission as a cost of doing business and are pricing accordingly. If you’re looking to put $300 in your pocket for an A500, you’d need to price it around $365 to pass the 18% commission on to the buyer.
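If you want to sanity-check that sticker price yourself, here’s a quick back-of-the-envelope sketch in Python. The helper function is purely hypothetical, with this year’s 18% cut baked in as the default rate:

```python
def sticker_price(target_net: float, commission_rate: float = 0.18) -> float:
    """Price an item so the seller still nets target_net after the show's cut."""
    return target_net / (1.0 - commission_rate)

# Netting $300 on an A500 with an 18% commission:
print(f"${sticker_price(300):.2f}")  # $365.85 -- call it $365 on the price tag
```

Run the numbers the other way and you’ll see why a $350 tag doesn’t quite get you there: 82% of $350 is only $287.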

Pricing observations aside, merchandise moved at a healthy pace throughout the weekend. At closing time on Sunday there were rows of empty shelving and only a smattering of parts and systems left over. Exhausted but happy volunteers were satisfied that things went mostly to plan. I didn’t go nuts and buy anything super expensive, because there was no oddly specific PC that I needed to buy at any cost. A complete Atari 1040ST setup was very tempting, but I wound up passing on it. Instead I bought a few smaller but still neat things: a PicoGUS, a boxed copy of GEOS, and a copy of Springboard Certificate Maker for the C64.

Assuming the floor problem is fixed for next year, what other issues remain? One concern I have is that consignment is vulnerable to bad actors. The show’s been incredibly lucky that, as far as I can tell, everyone’s been acting in good faith. Volunteers were minding the doors and I believe there was a security camera set up to monitor the area. When things were misplaced, staff helped sort things out. But there’s vulnerabilities ripe for exploiting if people aren’t careful. Nobody was checking merchandise on the way out the door, so what was there to keep sticky-fingered individuals from just… walking out with stuff? I realized this at the end of the show when I walked in and took my two unsold towers back to my car with nary a peep. What was stopping me from taking somebody else’s tower? I’m not saying that consignment suffered from loss and theft, just that my QA security research brain is poisoned to always be on the lookout for flaws.

I also think there’s room for improvement in the general layout and handling of items in consignment. Right now consignment is a kind of free-for-all, where people put stuff wherever they can. This results in smaller, but still valuable, items being overlooked. Denser shelves or racks could help with this. Space concerns often meant people’s stuff got spread out throughout the hall—I had computers on three different tables. Sellers also moved other people’s items around to give their own items a better chance of being seen. Many computers wound up getting stashed on the floor, which cut down on their visibility. You have to wonder how many people overlooked matching accessories or useful add-ons. RetroTech Chris bought a PS/2 Model 70 and missed out on my PS/2-compatible SIMMs because they weren’t nearby and weren’t obvious. It worked out in the end because I mailed them to him after the show, but how many others missed out?

If I’m permitted a spin on the what-if machine, I wonder how things would go with a radical redesign. Basically make the hall into a giant computer store. Have the staff tell consignors where to put their things. Sort and organize systems into one area, software in another, have a section just for parts, and so on. This is wholly impractical, for a variety of reasons—most of them staffing, others because you can’t predict how much of something you’ll get—but I can dream. Honestly, we just need more space so everyone can keep their own stuff together.

My pie-in-the-sky dreams aside, 2024’s consignment was a massive improvement over last year’s. It’s not perfect, but nothing ever is. I won’t let that be the enemy of the good, and the changes made for this year made it very good indeed.

A Change Can Do You Good

One of the most challenging aspects of putting on a yearly show is mitigating mistakes or grappling with growth without killing the appeal that draws people to your event. My concerns from last year—consignment issues, food, and crowd control—were largely addressed. The Vintage Computer Federation did a good job executing these year-over-year improvements, and its reward is a crowd that consistently returns.

As I wandered across the InfoAge campus on Saturday—usually the busiest day of the weekend—I sensed that things were a little less busy than last year. There were still plenty of people roaming the exhibit halls, but it never quite reached the frenzied status of last year. I haven’t heard any post-mortem reports from the staff yet, but I wouldn’t be surprised if absolute numbers were better than 2022 but on par with or slightly below 2023.

Last year’s attendance numbers were goosed by appearances from two big-name personalities making their first visits to the show: Adrian Black and David Murray. I wager some attendees came specifically to meet these two guests, and it wouldn’t surprise me if some of those first-timers didn’t return this year. That’s not unusual at all for conventions, because not every first-timer gets converted into an annual visitor. Not every guest is able to return to a show because of schedule conflicts or personal reasons, and honestly I don’t think the show wants to become too dependent on VIP guests for turnout.

My take is that VCF East isn’t prioritizing growth at all costs, and that’s a good thing. They’re joined at the hip with InfoAge, for better or worse, and its headroom is limited. Although I don’t think it’ll happen any time soon, the boom in retro/vintage computing will cool to a certain degree, and VCF has been around long enough to know that growing too fast can backfire. If they continue their current approach they’ll be able to weather the ups and downs.

A show is nothing without its community, of course, and the people that continue to show up to VCF East give it a flavor that you can’t find at other events. Although you’ll find many familiar faces, there’s always new exhibitors and panelists who want to tell you all about their technological passions and pasts. After all, connections are what conventions are all about. It doesn’t have to be one with a VIP guest—it might be the one you make with someone sitting next to you at a panel. And even if you’re not able to make it to VCF East, I’ll always encourage you to look up your local vintage computing events. It’s a great way to make a new kind of local area network connection.

An American Visits British Computer Museums

Fancy a holiday? Here in Userlandia, I’m hopping across the pond to check out Britannia’s vintage computer scene.

The United Kingdom! It’s the royal nation famous for its tea, biscuits, and computers! Well, it might be famous for computers if you’re a nerd like me. The list of influential PCs that hail from the British Isles is the computer equivalent of the Line of Kings: the ZX Spectrum, BBC Micro, Amstrad CPC, and on and on. And if you’re a filthy Yank like me the odds of you seeing—let alone using—one of these machines are just about nil. There’s a few examples lurking about in the States, but to get the authentic experience with a packet of crisps you need to take a jumbo across the water to see Britannia.

And since I was overdue for a vacation, that’s exactly what I did! I’d never been to the UK, and visiting London was tops on my travel to-do list. Luckily for me, JetBlue’s price war with British Airways has made flights from Boston to Britain more affordable than ever before. I took advantage of these low, low prices and cashed in some TrueBlue points for a ticket down to London town. Now all I had to do was plan a jam-packed week of holiday fun. A day trip to Cambridge to visit the Centre for Computing History was a must. Same goes for Stroud and the Retro Collective. I could spend other days visiting the British Museum, the Tower of London, and the Royal Air Force Museum. Top it off with London’s many fine restaurants, parks, and historic landmarks and I’ve got a solid itinerary for my first time in the UK.

Of course there’s more to traveling than boarding an airplane. Getting around London is easy enough because the Tube can take me anywhere I need to go. But traveling outside London requires a bit more effort. Since a Rick Steves Guide to British Computer Attractions doesn’t exist, I had to pick apart sticky details like train and bus schedules. And if you’re planning on going to the same places, maybe some of my travel experiences could be helpful to you. So join me as I make my way through the English countryside to explore some fine museums of British computing.

Cambridge and the Centre for Computing History

My first destination was The Centre for Computing History in Cambridge. Traveling from London to the museum without a car isn’t too hard. Thameslink and Great Northern operate trains to Cambridge from King’s Cross station, and I recommend buying advance tickets online to save time and money. Express trains run throughout the day and the ride’s about fifty minutes long. The Centre is located about a mile and a half from Cambridge station, and the route’s perfectly walkable if that’s your thing. The number 3 bus from the train station can get you pretty close, but you’ll still need to walk almost half a mile to reach the museum.

Since walking is my thing, I made my way through some quiet Cambridge neighborhoods until I arrived at the Centre for Computing History. After paying ten quid to get in the door I turned right into a lobby lined with glass cases featuring numerous computing artifacts. Some are interesting one-offs, like a prototype ZX Spectrum. Another case contains clones of various systems, like our Americanized Timex versions of Sinclair computers and French Apple II clones. But the centerpieces are tributes to the titans of Cambridge computing: Sinclair and Acorn. Both companies called Cambridge their home, and the curators have done a good job chronicling the evolution of their products.

The lobby is also home to the world’s largest microprocessor: the Megaprocessor. Designed and built by Cambridge local James Newman, the Megaprocessor is a way for people to get up close and personal with the architecture of a CPU. Fans of blinking lights will love watching the logic units working in real time, and it’s not just for show—you can test it out by playing a game of Tetris!

After a proper warmup from the lobby’s opening act it was time to head down the hallway and see the main event. The Centre’s Main Gallery houses most of the museum’s interactive collection, which stretches from the dawn of data processing to our modern wireless era. The majority of the computers on display are functional, which is pretty impressive given the rarity of certain specimens. Sections of the hall are dedicated to various eras and facets of computing like video games, office work, computer graphics, and portable computing.

While my primary motivation for visiting the Centre was seeing and using British and European computers, its scope goes well beyond the hometown heroes. Their mission is to present an overall history of computing from its earliest days to the present, and manufacturers from across the globe are represented in contexts that explain how they competed and evolved over time. The walls are lined with big iron and mainframes, creating a corridor of time through the world of data processing. New England gets some representation in Old England thanks to a Digital Equipment exhibit with working terminals and a PDP-11. The minis transition to microcomputers and workstations, with a line of office-oriented computers showing how things got more powerful while costing less money.

This leads nicely into a novel exhibit: the 1970s office. It’s like stepping into the jungle exhibit at the zoo—we’re observing these lumbering beasts in a simulacrum of their native habitat. A PET and TRS-80 command respect from the lesser office tools as they pay tribute to their new silicon overlords. I can feel the malaise washing over me from every corner of this room. After leaving the seventies behind we roll straight into the innovations of the eighties and nineties. Take a deep dive into portable computing with an evolutionary display stretching from back-breaking Osborne luggables to the book-sized Toshiba Libretto. Other exhibits examine how multimedia, interactivity, and networking defined the information age of the nineties. The spotlight on convergence shows how so many of our disparate gadgets like phones, music players, PDAs, and camcorders eventually formed our modern smartphone Voltron. Palms, Psions, Sonys, and more are given their due as the building blocks to the modern wonder device. Even my personal favorite, the Casio E-125, is there!

The most eye-catching displays are the colorful timelines of computers and game consoles. Kids of all ages will be impressed by the lineup of home game consoles stretching from Pong to the Gamecube. Trying out all the different controller designs is the best part of these displays. You can literally feel which ideas worked and which ideas didn’t. Across from the consoles are more video game history exhibits, like a lineup of historically significant arcade cabinets. Bright, colorful backers pepper you with fun facts while you’re munching dots in Ms. Pac-Man.

Of course consoles aren’t the only way to play games. Eight- and sixteen-bit home computers that powered the British computing revolution are all on hand to help jog your childhood memories—or give you a glimpse of what you might’ve missed. Fancy a round of Chuckie Egg on the Dragon 32? This was easily the most exciting part for me—I could finally get some one-on-one time with machines I’ve never used before. Each system has a curated list of games to try, and I took turns playing a game at each machine to get a feel for their joysticks and keyboards. Some visitors might like the ability to sample every machine and compare their strengths and weaknesses. Others will zip straight to their old favorite to relive their memories of hardware they might not have touched in decades.

A central island hosts the machines that powered the second wave of home computing, where the Atari ST and Commodore Amiga are flanked by the Sinclair QL and the Acorn Archimedes. No arguments are made as to whether the Commodore Amiga or the Atari ST is the superior computer, which takes a remarkable amount of restraint. I gravitated towards the Archimedes, which was running the famous 3D Lander game. I could’ve seen myself being a big RISC OS fanboy back in the day if we had these in school. I can’t say the same about the QL, though I’ll admit I found it appealing in other ways. The QL was a famous flop—“ambitious but rubbish” as the lads would say. But while its production problems were legendary, they’re only one part of the QL’s story. Here you can see the QL put its best foot forward, and appreciate what might have been if things went differently for Sir Clive.

Interspersed through these major exhibits are smaller spotlights on specific computers and themes. Torch PCs, the Acorn Business Computer, and a RISC PC! Japanese versions of our favorite consoles! And if you want to challenge a supercomputer at chess, well, no one’s going to stop you! There’s even a mini-museum of mini-consoles to demonstrate how we can relive our old games even if our parents threw them out all those years ago.

While touring these exhibits it’s plain to see that the Centre has done an excellent job of balancing informative aspects with entertainment value. Although nerds like me will love seeing oddly specific pieces of hardware, a museum like this is meant to attract the wider public. There’s a danger in getting too niche, and the Centre avoids this pitfall by presenting its artifacts neutrally but enthusiastically, in a way both children and adults will find accessible. The final exhibit room plays this balance perfectly by recreating a 1980s classroom complete with 1980s classroom computers. This is the Centre’s tribute to the BBC Computer Literacy Project, a groundbreaking initiative to cultivate computer skills by seeding schools with computers. BBC Micros running BASIC or educational software are ready for you to experience what it was like for students in the UK to use computers in the 1980s.

I had never used real BBC hardware before, so I spent a few minutes thumbing through a how-to guide on BBC BASIC while punching in a Hello World. After getting my fill of AMX Super Art and BBC BASIC I turned my attention to another row of classroom computers. These were less common than the Micro but no less influential. A BBC Master hooked up to a Philips LaserVision player demonstrated a working recreation of the BBC Domesday Project. Next to it was an Archimedes running an early version of RISC OS. Even a working network of RM Nimbus systems was available to show how a network could benefit students and teachers alike.

After I got out of class I picked up a snack and beverage and took a break in the lobby. It was nearly 4 PM, and I’d seen just about everything available to the public. My last task was to peruse the items for sale in the gift shop, and I picked up a copy of From Vultures to Vampires, the tumultuous tale of the Amiga’s post-Commodore years. Satisfied with a jam-packed day of retro computing, I left the Centre and walked towards downtown Cambridge to see one more thing before heading back to London.

On December 21, 1984 a fight broke out here at the Baron of Beef pub between Sir Clive Sinclair and Acorn’s Chris Curry. The clash was dramatized in Micro Men, the BBC’s 2009 film where Danger Mouse called Bilbo Baggins colorful names and thwacked him with a newspaper. Since video of the original fight doesn’t exist, we can’t tell how much the filmmakers embellished the story. But Sinclair was candid in interviews about his regret for the fight, and he eventually apologized and worked things out with Curry. I can’t speak to the quality of the Baron’s food, drink, or service—I had dinner plans elsewhere that night. But if you’re interested in computer sight-seeing, it’s just a short walk down Bridge Street in central Cambridge.

A Colossal Cave Adventure

My next destination on my tour of UK retro computing is in the Gloucestershire town of Stroud. Nestled in the outskirts of this hillside hamlet is an eighteenth century industrial mill that houses the Retro Collective, a co-op focused on preserving and presenting the technology of the past. Its founding attraction is the Cave, an exhibition of retro gaming and vintage technology spearheaded by Neil Thomas, host of RMC. It serves triple duty as a public museum, film studio, and workshop. Neil soon joined forces with Alex Crowley to bring the Arcade Archive on board as the site’s living arcade museum. If I’m flying 3,000 miles to England, I’m definitely going the extra 90 to make it to the Cave.

A public day at the Cave starts at 11 AM, and I wanted to arrive on time to make the most of a three-hour session. Getting to Stroud without a car is easy enough—there’s frequent train service from London Paddington via Great Western Railway. But the mill is four miles away from downtown Stroud, and that’s an awfully long walk. I’m in good enough shape to do it, but that’s around 70 minutes of walking each way and I needed to save some time. So what other ways are there to get to the Cave?

The fastest way is hailing a taxicab, and there’s taxi stands at the train station and in front of Lloyd’s Bank in Stroud. I bet the same driver would pick you up later if you worked it out with them. Cycling was out, as the bike rental shops would be closed well before the time I got back to town. What about public transit? RMC Discord user MichelleB’s blog post about transit routes mentions the Stagecoach 67 bus as a good choice. It leaves from downtown Stroud and stops about a mile away from the mill.

As of this writing the Saturday 67 bus departs Lloyd’s Bank hourly at 28 minutes past the hour. Combined with the walk the 10:28 bus gets you to the Cave a few minutes past eleven—assuming the bus is on schedule. Taking the 9:28 bus will get you there a bit after ten, giving you plenty of buffer time to kill before the doors open. There are other buses that operate routes in the area, and I suggest using Google or Apple transit planners to see the other options. The closest eastbound stop to the mill is Toadsmoor Road. If you don’t know how these buses work, here’s a quick guide to save you some embarrassment. When you board the bus, tell the driver your stop, wait for the pay pad to light up, and then tap to pay your fare. There’s no announcements for stops, so follow along on your phone’s map to make sure you hit the stop button at the right time.

After hopping off the bus you’ll have to walk about a mile to the mill. You could walk along London Road, which has paved sidewalks, but the better route is along the canal tow path. There’s no cars to dodge and you can take in the lovely views of stone bridges and lush foliage. Just be mindful of low-lying areas—you might have to fall back to the road if stretches of the path are flooded. Before you know it you’ll find a sign for Chalford Industrial Estate, which leads right to the mill.

Upon my arrival I was greeted by Neil and his crewmates, Dan and Holly. They were expecting me, for reasons I’ll get into later. Other guests received a similar welcome and a suggestion for a place to start their day in the Cave. Neil knew I would be very interested in the day’s special event: a hands-on preview of Thunder Helix. It’s an arcade-style attack helicopter sim in the vein of LHX Attack Chopper, Desert Strike, and Gunship 2000. The goals are easy to grasp: attack the enemy, rescue the friendlies, and bask in retro-styled dithered 3D environments. I was able to jump right in and start blowing up some targets at a silky smooth sixty frames per second. The CRT was icing on the cake. Thunder Helix should be available in early access on Steam at this point, so check it out!

After a few sorties of desert chopper action I was ready to explore the rest of the Cave. The day wasn’t fully booked, but there was still a decent crowd of folks and families enjoying everything on offer. Speakers were bleeping and blooping, keyboards were clacking, joysticks were jumping. There’s so much stuff that it felt a bit overwhelming. I caught up on the life and times of Alan Sugar thanks to an exhibit dedicated to the Amstrad founder. I marveled at the stacks of magazines and books in the library. I thumbed through the tons of titles in the recreation video game shop, which scores an S for stupendous. Neil’s inspiration is the ubiquitous UK retailer W.H. Smith, where many a lad and lass bought their cassettes and diskettes back in the day. We don’t have Smith’s in the US, so I’ll have to take the word of the many visitors who say Neil’s nailed the look. But the homage isn’t completely lost on me, because those slatted walls and shelves remind me of Babbage’s, the long-gone software store of my own youth. But form is nothing without function, and Neil went all the way by stocking the shelves with real games. You can grab a box off the shelf, tear off the shrinkwrap, and open-palm slam it into a system to get an instant hit of nostalgic bliss. Or you can scan its barcode to play it on the MiSTer Multisystem demo kiosk.

After getting my fill of browsing the shelves I was ready for another round of gaming. This is where spending time with Spectrums, Beebs, and Amstrads at the Centre gave me more time to devote to the Cave’s exclusives. If you’re a foreigner like me your priority should be machines you don’t have at home. Rare systems like the FM Towns CarMarty, Sharp X68000, and Nintendo Virtual Boy are must-plays. Want to take a stand on the Amiga versus ST debate but aren’t sure where to begin? Neil’s curated games list will give you the education you need to take either side of the fence. An ongoing Galaxian high score challenge lets you try for bragging rights amongst your fellow cave dwellers. Neil really has built something special here, and he’s very open about the fact that it couldn’t happen without countless volunteer and community contributions.

But as amazing as a visit to the Cave can be, there’s a catch—a three-hour time limit. I was so enchanted by the sheer amount of everything that those hours passed by in a flash. Experiencing its full depth and breadth in one visit is impossible. You could easily spend a whole day reading through the extensive library of vintage computer books and magazines. If you want to play through one of your old favorites to completion, well, the power is yours. Make sure to give yourself some time to just sit back and chill in the big chair with the big telly. I had great conversations with Dan and Holly. And I even made a donation: an original copy of the first subLogic Flight Simulator for the Apple II. I found it in a massive Apple II collection I acquired a few years ago, complete with manual and registration card inside its original ziploc bag. This is how software was sold in America in 1980, and I figure it belongs in a museum instead of being locked up in a file box in my closet. Neil is a fellow flight sim fan, and I know he’ll put it to good use either in a video or on display to the public.

When the clock tolled 2, my time at the Cave was up. My day wasn’t over yet because I planned on visiting the Arcade Archive, but I still had an hour before it opened. After a lunch break at the nearby Lavender Bakehouse I headed into the Archive and found an arcade after my own heart. There’s just the right amount of dank for the Carl Carlson in you, with blacked out windows and ambient light provided by cabinet marquees. Games run the gamut from essentials like Donkey Kong to rarities like Nintendo Sheriff. All the cabs are set to free play, and you can spend as much time on a game as you’d like. I actually completed Operation Wolf on real hardware for the first time, and I admit I lost track of how many quarters it would’ve cost me to do it.

Some arcade museums lean more on the museum side of things, but Alex has gone all-in on recreating an idealized 80s arcade, and it works. Some museum pieces, like the original Pong, get appropriate info cards, but there’s less in the way of memorabilia and artifacts than at the Cave. This is fine—there’s more than one way to celebrate history. The lived experiences of arcades are an important part of their history, and recreating that is just as important as restoring and maintaining the cabinets. Alex and I had a chat about his efforts towards these restorations, and if you haven’t checked out his YouTube channel you definitely should. The Nintendo Sky Skipper story in particular is super compelling—it’s every vintage collector’s dream to find a lost piece of history.

By the time 6 PM rolled around my thumbs were sore from all the button mashing. It was time to say goodbye to the mill and head back to the city. The return bus trip requires a bit more planning than the incoming one. None of the bus schedules are good for leaving at 6 PM. If you don’t catch the 67 bus at Lewiston Mill at 6:05 PM, you’ll have to wait there until 7:35 PM for the next one. That would mean leaving early. And if you decide to leave early, then you’re better off taking the 54A, which stops right next to the Cave at 5:40 PM! I wound up walking an extra quarter mile down London Road to the War Memorial stop, where the 69 bus stops at 6:53 PM. Tell the driver to stop at Russell Street in Stroud and you’ll arrive in front of the train station. After dinner at one of Stroud’s fine eateries I caught my train back to London. My retro tour of England was finished.

The Best of Britain

As the locals would say, I was right chuffed with my visits to the Centre for Computing History and the Retro Collective. They’re two very different venues, but they complement each other well. If you’re angling for a more educational experience, the Centre would be your first choice. The Cave is less formal and focuses more on gaming. But both venues share a love of history and the stories behind today’s ubiquitous computing. You can’t go wrong with visiting either one.

As we get older—as these devices get older—preserving these experiences gets more difficult. Parts get scarcer and knowledge fades away. Although the Collective and the Centre operate on different scales, it’s great to see their approaches to solving common problems. How do you keep rare machines running? How do you present something from a completely different culture in context? Every museum or exhibition tackles these challenges differently and that’s part of the fun. And if these venues are out of your travel range, don’t worry! There’s plenty of computer and gaming museums across the world. Maybe there’s one right in your back yard.

I don’t know when I’ll be able to make it back to the UK. So many places were left off my list because there’s only so much you can do in a week. Hey, maybe I’ll come back some day. After all, the UK is chock full of computer museums. There’s the North West Computer Museum, the Retro Computer Museum, the Derby Computer Museum, and more besides! Looks like I need to plan another trip. Maybe I’ll see you there.

Garmin iQue 3600: How A Palm OS PDA with GPS Predicted The Future

How do you get the-ah from hee-ah? Here in Userlandia we’re piloting a car with Palm OS.

It’s hard to get lost nowadays. After forty-odd years of research and development we’ve finally arrived at our science fiction future where magic devices pulled from our pockets can pinpoint our positions precisely. But as revolutionary as they are, smartphones are just the latest milestone in a long trail of satellite navigation tools. Milestones like this: the Garmin iQue 3600.

At first glance it looks like a normal Palm OS personal digital assistant with its normal apps. But there’s something about it that most PDA users wouldn’t expect. It has a secret—a GPS antenna! Garmin released the first StreetPilot portable automotive GPS in 1998, so it’s only logical that they were the first to combine a GPS with a PDA. The iQue was announced at the January 2003 Consumer Electronics Show and shipped six months later at an MSRP of $589. That seems high for a PDA, but consider how much it cost to buy comparable devices. A Palm Tungsten T2 was $399 and a Garmin StreetPilot 2610 was $799, so the iQue combining both of them into one device for half the price seemed like quite the bargain.

I found this one complete in box at the thrift store for $13, and I couldn’t resist bringing it home. Inside that box is the PDA itself, a dashboard mount with car charger, HotSync Cradle, software CDs, and manuals. The only item that appears to be missing is the leather flip-cover, which is too bad because half the fun of a PDA is flipping the cover open like Lois Lane when she’s jotting down a big scoop.

From a design standpoint the iQue doesn’t stand out from the crowd of silvery rectangular PDAs. It’s got the same aura as a silver Toyota Camry: practical, reliable, and tasteful. Such designs were popular because there were two kinds of people who bought PDAs: geeks and traveling businesspeople. They could take out a subtle silver PDA on location or at a business meeting and not feel like they’re waving money in their clients’ faces. If Garmin showed these road warriors a pocketable device that told them how to get from point A to point B in unfamiliar territory, they’d scream “shut up and take my money!”

What the iQue lacks in flash it makes up for in function. There’s a bevy of buttons for quickly controlling its intuitive interface. Three of the four standard Palm face buttons are there—the Date Book, Address Book, and To Do List—and the fourth cycles through Garmin’s GPS-enabled apps like the QueMap, QueTurns, and GPS status. On the side of the device is a shortcut button for recording voice clips, a scroll rocker, and an escape button for backing out of dialogs or returning to the home screen.

Garmin also included decent connectivity options. Along the top is a headphone jack, an SD card slot, an external GPS antenna port, and an infrared transceiver for wireless data transfer. On the bottom is the Palm Universal Connector for attaching cradles, keyboards, and accessories designed for Palm devices. The included USB HotSync cradle is surprisingly hefty due to its metal construction. A car cradle is also included, complete with a charger and a beanbag to keep it planted on your dashboard.

The integrated GPS receiver adds some thickness, but Garmin’s designers styled it to minimize its impact. Between the tapered back, rounded edges, and strategic bevels it still fits well in my hand and in my pocket. Popping the stylus in and out of its slot is a satisfying fidget exercise, complete with a lovely click. Many of its contemporaries had plastic styli, so credit to Garmin for not pinching pennies here.

Inside the iQue is a 200MHz Motorola DragonBall MXL ARM9 CPU paired with 32MB of memory, which is comparable to other premium Palm OS devices of its day. The transflective color screen, which lets you switch off the backlight outdoors and still read the display, is also class competitive. Ditto its expandable storage and MP3 playback capability. The only downside was price—similarly specced Palm Tungstens or Sony Cliés cost $399, nearly $200 less than the iQue.

But the iQue isn’t an ordinary multimedia PDA. Garmin bragged that the iQue was the first PDA with an integrated GPS, and based on my research that appears to be true. Some mobile phones had an integrated GPS, and there were GPS devices built on Windows CE, but they weren’t PDAs. The first Windows PocketPC with a GPS, the Mitac Mio 168, wouldn’t debut for another year. So was the GPS worth the extra cost in 2003? And is it still useful today?

Inside the iQue

Out of the box the iQue wasn’t very useful because it shut down when removed from its cradle. The first order of business with any portable device old enough to drink is changing its battery. Fortunately, reasonably priced replacements are available on eBay and Amazon. This job wasn’t difficult thanks to an instructional video from NewPower99.com. Now that the iQue can hold a charge, it’s safe to set it up and go mobile.

Setup starts with Garmin’s software CD and a two-disc MapSource map library. Tough luck, Mac users, because Garmin’s map software is a Windows exclusive. In case you’re missing the installer disc, Garmin’s website still hosts support pages with manuals and software downloads. That’s nice, considering they discontinued support back in 2008. I transferred these installers to an era-appropriate host PC: a ThinkPad T21 running Windows 2000.

The downloaded Palm Desktop installer turned out to be a dud because it wanted to download online InstallShield dependencies that no longer existed. I had to fall back to the installer disc which wound up being the same version as the one from the web. After running the disc’s installer, the web installer started working—go figure. I’ve uploaded an ISO to archive.org in case you need it.

Now I need to update the iQue’s OS. Garmin’s firmware support was solid for the era; they released six updates over the course of four years. Pro tip: obey the instructions to the letter unless you want to put your device into a boot loop. After I followed all the steps the device finally started up to Palm OS version 5.2.1r6.

The next step is to install some maps, and the process is similar to other Garmin GPSes of the era. The base map has a 5 mile scale that only shows major highways and byways. To get detailed maps and points of interest you’ll need to install the included MapSource City Select mapping database. It’s a hefty amount of data, totaling over 1.2GB of disk space. Although the iQue supports 2GB SD cards, they weren’t available when it was launched. You couldn’t buy a 1GB card until January 2004 and at $499 it cost almost as much as an iQue itself. Most users methodically managed multiple megabytes of mapping memory by syncing maps as needed or swapping between map sets stored on multiple 128 or 256MB cards.

After wrapping up the installations I was ready for an iQue test. This meant going outside, folding out the flip-up GPS receiver, and waiting. This thing has a satisfying thunk when you close it, like a good flip phone. Getting its first satellite fix took a while—almost five minutes. Unlike modern devices there’s no cell tower triangulation or constellation assistance data to speed up this process. But I have good news on the 2019 GPS date rollover front! Legacy GPS signals broadcast the week number as a 10-bit counter that wraps every 1024 weeks (about 19.7 years), and the April 2019 wraparound broke the calendar on plenty of older receivers. The clock displayed the correct time after acquiring a fix, which meant Garmin’s engineers either anticipated or fixed the problem—nice.
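
If you’re curious what that rollover actually looks like, here’s a minimal sketch of the week-counter math in Python. It’s my own illustration of the general problem, not anything from Garmin’s firmware:

    # Legacy GPS broadcasts its week number as a 10-bit counter that wraps
    # every 1024 weeks (about 19.7 years). Firmware has to know how many
    # wraps have happened to turn a fix into a calendar date.
    from datetime import datetime, timedelta

    GPS_EPOCH = datetime(1980, 1, 6)  # GPS week 0 began January 6, 1980

    def gps_date(week_10bit, rollovers):
        """Convert a wrapped 10-bit week count into a calendar date."""
        return GPS_EPOCH + timedelta(weeks=week_10bit + 1024 * rollovers)

    # Week 42 after the April 2019 wraparound (rollovers=2) lands in 2020.
    # Firmware stuck on rollovers=1 would decode the same fix as mid-2000.
    print(gps_date(42, rollovers=2))  # 2020-01-26 00:00:00
    print(gps_date(42, rollovers=1))  # 2000-06-11 00:00:00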

You can customize the Que button to add or remove apps from its cycle, and by default that’s QueMap, QueTurns, QueTrip, and QueFind. I added the QueRoute controls. If there’s no active route the map will follow along as you’re driving just like a modern phone or GPS. The update rate ain’t exactly great—I figure it’s two, maybe three frames per second at best. The size of the map display is decent, and more can be shown by collapsing the speed and compass overlay. The scroll rocker zooms in and out, while dragging the stylus pans the map.

When you’re ready to hit the road the QueFind app will be your guide. There’s a point of interest database of interesting points like hotels, tourist attractions, roadside services, and more. If where you need to go isn’t in the database you can enter a specific address, intersection, or city name instead. After tapping Route To the iQue generates a list of turn-by-turn directions for your trip.

This experience is surprisingly close to a modern smartphone map app. Maybe not from a presentation standpoint, what with the mediocre frame rate, lack of 3D projection, and pixelated map lines. But from a function standpoint it’s all there. Need to make multiple stops? Use the Add Vias feature. Need to get around road closures or traffic? Go ahead and request a detour. Want a preview of a long trip? Simulated drives are there to guide you. Do you drive a truck, motorcycle, or taxi? Select a specific vehicle type to optimize your routing; there are even profiles for walking and cycling. Boats and off-roaders aren’t left out either thanks to support for nautical charts and topographical maps. Garmin didn’t hold back—they crammed all the functionality and features of a StreetPilot into a PDA without compromising on user experience.

Rounding out the app suite are programs aimed at traditional handheld GPS users. QueTracks creates a virtual trail of breadcrumbs while the GPS is active, and these logs can be saved as paths to review on the QueMap or downloaded to a desktop PC. Hunt&Fish tries to predict the best time of day for hunting and fishing based on the Solunar theory. Sun&Moon isn’t a Pokémon game; it shows the predicted time of sunrise and sunset, moon phases, and the position of those celestial bodies relative to your location and orientation.

Garmin also extended GPS functionality to other areas of Palm OS. Want to navigate to your favorite bar in the Berkshires? Just open its Address Book card and tap the Route button. Same goes for appointments in the Date Book. You can even attach your current location to a voice memo! We take these integrations for granted today, but in 2003 these were clever and cutting edge.

All these features made for a pretty capable navigation device, so long as you were driving, boating, or walking. But there’s another group that hungered for real-time GPS navigation: airplane pilots! Introduced in January 2005 for $1,099, the aviation-focused iQue 3600A added aeronautical amenities like Jeppesen charts, terrain data, and virtual flight displays. Also included was a yoke-mounted smart cradle with extra controls to navigate the map and input data without using a stylus. And because it’s still a 3600 at heart, Garmin offered the automotive adapter kit which includes the dashboard mount and the MapSource road maps for an extra $200. The 3600A automatically switches between aviation and automotive modes when placed in the appropriate cradle—nice touch!

Navigating with the iQue

Given how advanced the iQue was for its time, an obvious question arises: can it still be useful for navigation today? Let’s hit the road and find out! My regular thrifting route from Wilmington, Massachusetts to Hudson, New Hampshire is a good test route. There’s plenty of twists and turns along a mixture of highways, suburban arterials, and country roads. I know it well enough to judge the iQue’s routing quality, and since the roads haven’t changed much over the past twenty years the old maps should still work.

Planning a route starts with the QueFind app. My hunch that the circa 2004 POI database wouldn’t be helpful was proven correct when my searches returned zero results. Even back then there was always a possibility that the database wouldn’t have what you needed, so I searched for my destinations’ addresses instead and saved them as waypoints. Waypoints are actually entries in the Palm Address Book, which is a clever way to exploit an existing database.

Entering this data was easy enough—search for address, save waypoint, and repeat—but it required my full attention. Being out of practice with Graffiti didn’t help. Doing this in motion would be impossible. At least the rocker can scroll through lists and select waypoints or address book entries. I wouldn’t call the UI sluggish, but it’s definitely not snappy. After creating the waypoints I added them to a route using Add Vias and saved it in QueRoute.

After I activated the route the iQue spent about six seconds calculating turn-by-turn directions, which isn’t fast by today’s standards but is good for 2003. It dutifully announced my upcoming turns and destinations using voice clips with progressive distances like “in 1.6 miles turn right, in 1000 feet turn right,” and so on. And this guidance comes through loud and clear from the auxiliary speaker built into the power adapter, even when tucked away in a center console. When you approach that turn the map automatically zooms in to preview the intersection. Combine this with an overlay that displays the current speed, compass direction, and the next turn and the iQue provides a pretty good overview of where you’re going. You can also drive around without an active route and it’ll tell you the name of the road you’re on as well as upcoming cross streets. Overall I’d rate its routing as acceptable, and when I veered off-course it quickly recalculated to get me back on track.
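
As a toy model of how that progressive guidance might work (my own sketch with made-up threshold distances, not Garmin’s actual logic):

    # Toy model of progressive turn prompts: "in 1.6 miles...",
    # "in 1000 feet...", then "turn right" near the intersection.
    FEET_PER_MILE = 5280
    THRESHOLDS = [1.6 * FEET_PER_MILE, 1000, 150]  # illustrative guesses

    def next_prompt(distance_ft, already_said):
        """Announce each threshold once as the car closes in on the turn."""
        for t in THRESHOLDS:
            if distance_ft <= t and t not in already_said:
                already_said.add(t)
                if t >= FEET_PER_MILE:
                    return f"in {t / FEET_PER_MILE:.1f} miles, turn right"
                return f"in {t:.0f} feet, turn right" if t > 200 else "turn right"
        return None

    said = set()
    for d in (9000, 8000, 900, 100):  # simulated distances to the turn, in feet
        prompt = next_prompt(d, said)
        if prompt:
            print(prompt)  # "in 1.6 miles...", "in 1000 feet...", "turn right"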

The included dashboard mount is well designed and securely grabs the iQue when placed in the cradle. When you’re ready to get out of the car a press of the release button frees it from the cradle with no fiddling required. There’s horizontal and vertical articulation to help angle the iQue for visibility, which you’ll absolutely need because the display’s viewing angles aren’t great. The transflective display puts up a valiant effort but the end result is still a little dim when it can’t catch direct sunlight. It’s still usable but this would’ve been called out in a review twenty years ago. At least the angle of polarization didn’t conflict with my polarized shades. Unfortunately the base wasn’t heavy or grippy enough to stay planted during spirited driving, and I’m sure my car’s smooth pleather dashboard didn’t help.

There’s a couple other gotchas. Even though the resistive touch screen recognizes finger taps, the touch targets like the re-center button are teensy tiny and impossible to activate without a stylus. I, uh, wouldn’t recommend doing that while driving. The iQue also locked up a few times during my journey, and I’m guessing these hangs were thermal related because it felt fairly warm when I pulled it from the cradle. I won’t hold these crashes against it; this is a twenty year old device after all. Thankfully it automatically resumed navigation after a restart.

If someone handed me an iQue in 2003 I would’ve been ecstatic. Back then I was using road atlases and printouts from MapQuest to find my way around in unfamiliar territory. Being able to get instant turn-by-turn directions with live maps from my pocket computer would’ve made me feel like Geordi La Forge. The only thing that compares would’ve been a dedicated unit like the StreetPilot, and those were just as uncommon in 2003. And even with its very real 2003-era limitations it’s amazingly close to a modern navigation experience!

Now while the iQue was the first PDA with integrated GPS, that didn’t mean it was the first PDA to use a GPS. People connected a GPS to a PDA using a serial cable to log their locations, and the GPS makers noticed. They soon unveiled devices designed to attach directly to specific PDA models, and some even offered turn-by-turn navigation. As luck would have it, I have one right here: the Rand McNally StreetFinder Deluxe for the Palm V.

Released in the year 2000—three years before the iQue—the StreetFinder has so many limitations that it puts the minimum in minimum viable product. The Palm Vx’s 8MB of memory and lack of an SD slot mean it can barely hold any map data—and God help you if you have a vanilla 2MB Palm V. Routes and map data must be calculated ahead of time on a desktop PC and then HotSynced, which takes forever over a serial cable. And when you finally hit the road the experience is grade-A jank. There’s no voice guidance for upcoming turns, which would be okay except the pea-green screen isn’t very readable in a car. If you get lost or need to make a detour there’s no dynamic re-routing. And the map display doesn’t move along with you—it’s always north-up and when you reach the edge of the screen it scrolls to a new page. I could go on and on about the limitations of this setup, and I recognize that this was the best they could do at the time given limited memory and processing power. I’m not saying it’s completely useless; it’s just that turn-by-turn navigation was beyond this device’s grasp. If you use it to log data or to show your location on a static map, it’s a similar experience to a serial GPS except it’s easier to connect and takes up less space.

Unlike the StreetFinder, the iQue is capable enough that if it had modern maps it could be a viable navigational aid today. So it’s too bad that Garmin stopped publishing map updates for it. Oh egads, my trip is ruined! But what if… I were to load new, up-to-date maps on this obsolete PDA with a GPS? Ho ho ho ho, delightfully devilish, Daniel! There’s plenty of maps on the web in Garmin’s IMG format, or you can roll your own if you’re feeling plucky. I downloaded OpenMapChest’s database for the Northeast US, which is based on OpenStreetMap data. After installation the new tile set showed up in the Map Installer plugin and I could send them to the iQue’s SD card. The downside is that these maps consume more storage, which means longer HotSync times and more card swapping. The fact that I could load new maps without any trouble is a wonderful surprise—I really thought I’d have to do some hacking to load new data. Thanks, OpenMapChest!
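
And if you do feel plucky, the open-source mkgmap tool can convert OpenStreetMap extracts into Garmin’s IMG format. Here’s a rough sketch of how I’d script it in Python; the family ID and file names are placeholders, and you’d still have to register the output with MapSource before syncing:

    # Convert an OpenStreetMap extract into Garmin IMG tiles with mkgmap.
    # Assumes Java and mkgmap.jar are on hand; IDs and paths are made up.
    import subprocess

    def build_garmin_map(osm_extract):
        """Run mkgmap to produce IMG tiles plus a .tdb index for desktop tools."""
        subprocess.run(
            [
                "java", "-jar", "mkgmap.jar",
                "--family-id=9001",  # arbitrary ID so the map set gets its own entry
                "--tdbfile",         # emit a .tdb so MapSource can see the set
                osm_extract,
            ],
            check=True,
        )

    build_garmin_map("us-northeast-latest.osm.pbf")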

Garmin also created an API so other Palm OS mapping apps like PathAway could use the iQue’s GPS. And it does work! PathAway is more like a traditional map-and-compass setup, perfect for folks who like geocaching, boating, or wilderness exploration. It’s not a turn-by-turn navigation app, so it can’t replace the Que suite, but that’s okay. Because the iQue’s a general purpose device, you can extend its capabilities much like modern smartphones. A StreetPilot has advantages like a bigger screen and car-focused controls, but you’ll never be able to run games or spreadsheets on it. So how well does it wear both hats?

The iQue as a PDA

2003 was a busy year for the PDA market. Manufacturers needed new features to differentiate their devices once color screens were commonplace. They also faced increasing pressure for pocket space from cell phones, MP3 players, and digital cameras. Sony Cliés added hardware keyboards, wireless networking, and digital cameras. Handspring created the Treo smartphone by combining the Handspring Visor and its VisorPhone Springboard module into one device. So Garmin embedding a GPS into a Palm PDA was a gambit in line with the overall market trend towards convergence.

The iQue is a PDA at its core, and people bought one instead of a StreetPilot because they needed to do PDA things like managing contacts and appointments. The good news is that Garmin licensed Palm OS instead of rolling their own system. The iQue runs Palm OS 5.2.1 and takes advantage of its many improvements like support for ARM processors, SD cards, audio mixing, and a collapsible on-screen Graffiti zone to give apps more screen real estate. OS 5.2 also introduced the Status Bar, which Garmin customized with Que app shortcuts, the GPS status indicator, and buttons to adjust brightness and volume.

Palm OS users will be right at home with an iQue. Basic organizer functions are fulfilled by the standard Palm app suite. All your favorites are there, like the Address Book, Date Book, To Do List, and Memo Pad. It’s a shame that the iQue came out a few months before Palm completely redesigned their suite for the Tungsten T3, but it’s still perfectly usable. Some default apps, like Expense and Mail, are missing entirely. You could use the bundled Documents to Go spreadsheet app to track your expenses—that is, until the free trial runs out and you have to cough up 50 bucks to keep using it.

The missing Mail app demonstrates the iQue’s blind spot around networking. 2003 was a big year for wireless connectivity for PDAs. SDIO Bluetooth and Wi-Fi cards let small, pocketable devices join networks or tether to a cell phone to receive texts and emails. The iQue 3600 doesn’t support these cards at all, and you must use a cable to physically tether a phone. Even if you want to HotSync email for offline reading you’ll need a third-party client. A trial copy of MailToGo is on the installer disc, but I think making users pay for an offline email app is rough.

Garmin also included some non-GPS-focused apps in their Que suite: QueClock, QueAudio, and QueVoice. QueClock is a straightforward alarm app that lets you set single or repeating alarms without needing to dive into the Date Book. It also shows the current precise time and your next upcoming event. It should really be a world clock, too, but there’s plenty of those out there for Palm OS if you wanted one.

QueAudio is a bare-bones MP3 player that lets you listen to files stored in an SD card’s Audio directory. There’s no playlists and sorting is very rudimentary, but it does have a shuffle function. It choked on many of my modern MP3 files—I suspect it doesn’t like 48KHz sample rates—so I dug up an old one to give it a test. The speaker’s audio quality is, well, bad, but it sounds fine through headphones. It’s no iPod, but it’s on par with other PDAs.

QueVoice records voice memos when you press and hold the record button. The audio quality isn’t great, thanks to the mediocre mic and the crushing 11KHz sample rate. There’s an option to use ADPCM compression which significantly cuts down on file size without further compromising the already rough quality. It works, but don’t use it to record your demo reel.
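
To put rough numbers on that savings, assuming the uncompressed clips are 16-bit mono PCM and the ADPCM option is a typical 4-bit codec (my guesses, not documented specs):

    # Back-of-envelope math on voice memo sizes at an 11KHz sample rate.
    RATE = 11025                      # samples per second
    pcm_bytes_per_sec = RATE * 2      # 16 bits per sample: ~22 KB/s
    adpcm_bytes_per_sec = RATE // 2   # 4 bits per sample: ~5.5 KB/s
    print(pcm_bytes_per_sec * 60 // 1024, "KB per minute uncompressed")  # 1291
    print(adpcm_bytes_per_sec * 60 // 1024, "KB per minute with ADPCM")  # 322

A four-to-one cut like that matters when your voice memos share an SD card with a gigabyte of maps.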

There’s one last criterion to judge the iQue as both a PDA and a GPS: battery life. And it’s not great! Garmin’s public statements about the iQue’s longevity didn’t inspire confidence. Quote: “approximately 10 days if used an average of 30 minutes per day with backlight off.” I find that answer vague and unconvincing. It’s just like saying my car gets 40 miles per gallon if I always drive downhill and lay off the turbo. More useful numbers are found in the 3600A’s manual, which claims 5 hours of battery life with the backlight set to 50%. (Funnily enough, 30 minutes a day for 10 days works out to the same five hours.)

After spending some time with a fresh battery I’m inclined to agree with that five hour estimate. That is, until I turn on the GPS and the battery drains even faster. Reviews at the time were not kind to its battery usage, with CNET calling it “gluttonous.” If you want any kind of reasonable runtime outdoors with the GPS enabled you must forgo the backlight and rely on the transflective display. The GPS polling rate can be reduced to save battery, but I would only suggest turning that knob for hiking or walking. Be sure to plug it in to a power port while driving.

Take The Long Way Home

A convergence device often ends up being less than the sum of its features, but that’s not the case with the iQue. It’s a Peanut Butter Cup fusion of Palm PDA and Garmin GPS thanks to good software design. Much like the Treo gave us a preview of our data-connected computer phone future, the iQue did the same for the omnipresent navigation device.

I say this a lot, but it bears repeating: part of the fun of the retro experience is using something you couldn’t afford or didn’t know about back in the day. I had PDAs. I had a GPS. But I never had a PDA GPS. And seeing just how much Garmin got right years before location-aware smartphones is uncanny. They might not have known it at the time, but this was the precursor to how millions—perhaps billions—across the globe navigate their daily lives.

Garmin tried to grow their share of the PDA market by making more models at different price and feature points. In addition to the Palm-based iQues they produced Windows Mobile Pocket PCs like the iQue M5, M4, and M3. If you already had a PDA you could pick up one of Garmin’s Compact Flash or Bluetooth receivers which came with the same suite of iQue apps as the 3600.

But the iQue’s time in the navigator’s seat was short lived. Garmin shipped their last PDA in 2006. Support for all iQue products ended in June 2008, which in an interesting coincidence is the same month Apple announced their first phone with a GPS: the iPhone 3G. I don’t think the iPhone had anything to do with the discontinuation of iQues—other market forces were to blame. PDAs were already disappearing because Blackberries, feature phones, and early smartphones were choking them out of their market niche. Meanwhile the portable automotive GPS business kicked off by Garmin’s own StreetPilot was booming. Standalone units became so cheap and useful that regular people were buying them to install in their cars—or buying a car that had a navigation system built in.

Still, I’m happy to add it to my collection, even if I won’t be using the GPS much. Why have a boring regular Palm when I can have something with a twist? Even if I didn’t have one of these back in the day, I can see the appeal, and I applaud Garmin for showing us how to find our way before satellite navigation went mainstream.

IBM's 386 Tower of Power: The PS/2 Model 80

Stand aside. This is a large computer and it takes large steps. Yes, it’s IBM’s biggest, baddest, and heaviest computer of 1987: The PS/2 Model 80 386. If one of these behemoths lumbers into your collection you need to be prepared for all the gotchas it brings. Here in Userlandia we’re adopting a Micro Channel mammoth.

Oh, Micro Channel Architecture. I sure talked a lot about you in the PS/2 Model 30 286 Computers of Significant History video. I had to, really, because a lack of Micro Channel was a defining trait of that PC. To understand that machine, you had to look at what it lacked. But as I wrote the script, I realized I had a problem: I didn’t own a Micro Channel PS/2. What was I going to use for video footage?

My original strategy was to use still photos and archive footage from my various VCF trips for coverage. I had no plans to buy a Micro Channel machine due to expensive eBay listings and high shipping fees. But I’ve always wanted an MCA PS/2 in my collection, and if I had an opportunity to get one at a good price I’d take it. And wouldn’t you know it, just as I was polishing up the script an alert popped up on my local Craigslist for a working PS/2 Model 80. The machine appeared to be in great shape, with a working SCSI hard drive and a memory expansion card. $200 wasn’t an awful price either. The odds of finding a better deal elsewhere—especially for a machine in this condition—were pretty low. After an hour of hemming and hawing I finally emailed an offer to buy it.

Craigslist Ad for Model 80

Buying stuff on Craigslist or Marketplace can be an adventure, and this was one of the better ones. The PS/2 was located in Townsend, Massachusetts, a bucolic New England town forty-five minutes away from me. I met up with Aaron, who’s a technician at a computer shop. This Model 80 was one of his vintage restoration projects. He cleaned it from top to bottom, fixed a blown tantalum capacitor, and replaced the 6V CMOS battery. We had a great conversation about old computers and his cool Dodge Daytona. I handed him two hundred bucks, he handed me a pack of setup floppies, and I drove this beast home.

I’ll give Aaron extra credit for the effort he put into this machine before selling it. We’ve all been there with people demanding ridiculous sums for something that’s been in a basement for twenty years because they know what they have. A little spit shine and TLC go a long way when making a deal, and as a buyer I try to respond in kind. I don’t want to waste my time or yours, and I’m willing to show up with cash in hand for a sensible price and a good experience.

Aaron’s efforts also made this computer camera ready. The exterior looks like it's fresh out of a detailer. All the icky black foam’s been plucked from the side panel. And look at all these clean, shiny boards! The Model 80 was also mechanically ready and fully configured thanks to a new CMOS battery, a working floppy drive, and a fresh reference disk. When I flipped the power switch the machine booted right to MS-DOS, with some games and a copy of Windows ready to roll. A turnkey experience like this is a rarity in retro these days. Accounting for the value of this work in the price tag makes this a veritable bargain compared to the ones I’ve seen on eBay.

But let’s say you’re not as lucky and you stumble upon a PS/2 that needs some work. A Micro Channel machine can bamboozle seasoned vintage computer collectors with the myriad ways it differs from a regular PC. So I’m going to explore how these machines are built, what they’re capable of, how to upgrade and configure them, and the bumps you’ll hit along the way. I’ve got plenty of knowledge from books and past encounters with PS/2s, but there’s bound to be gaps in my memory. I’d also need resources like disk images, drivers, and ADFs (Adapter Description Files). Luckily, I can stand on the shoulders of big blue giants like The Ardent Tool of Capitalism and IBM Museum. They’re a treasure trove of downloads, documentation, and answers to oddly specific questions. Now that you have an idea of where we’re going, let’s start by exploring the Model 80’s case.

Tower cases were still a rarity back in 1987, and IBM chose to rotate the case 90 degrees because… well, look at the size of this thing! The Models 60 and 80 were designed as floor standing towers that live underneath a desk. Yet despite considerable height and depth it’s fairly narrow compared to most tower cases. IBM was so enthralled with 3.5 inch drives that they designed the enclosure around them. Still, there is a full-height 5 1/4 inch bay—it’s just rotated 90 degrees to fit.

But a big case has a cost, and the price you pay is weight. It’s so heavy that there’s a sticker listing its weight at 40.5 pounds. And this isn’t even the heaviest Model 80 variant! Perhaps IBM took pity on the people who had to port these machines around, because they included a handy handle on the top of the case. I suppose that makes it portable in some way, but I think “liftable” is the better term.

On the bottom of the case are two fold-out pedestal feet. Because the case was intended to stand underneath a desk, there’s a risk that the tower would tip over after an accidental kick from a desk jockey. The case’s tall, skinny design combined with a top-mounted power supply and heavy drives gives it a high center of gravity. Instead of making the whole case wider—or perhaps as a consequence of designing the narrowest case possible—IBM added these feet to balance things out. You don’t have to use them, but you probably should.

For connectivity, the Model 80 offers a standard array of ports: keyboard, mouse, serial, parallel, and VGA. This makes it easy to hook this old boat anchor up to modern-ish displays and input devices.

Finally, the pièce de résistance: the power switch. IBM may have changed the color from red to white, but it still has that trademark chunky sound. Mmm, satisfying.

Now let’s take a look inside. The side panel is secured by two captive screws, and a coin is the perfect tool to loosen them. There’s a key lock too, but this case’s keys are lost to time. If it was locked, all it takes to open it is cutting the end off a Bic pen and shoving it in.

Normally this panel would be covered in a black noise dampening foam, which over time decomposes into an awful black goo. It’s sticky and gross and irritating and it gets everywhere. Thankfully Aaron removed it all during his restoration. If you’re buying a Model 60 or 80, be prepared for a full de-lousing.

Once you look inside you’ll understand why this computer is so capacious: this case corrals clusters of cards, cages, and cables. IBM built this machine to grow along with its customers, and that meant a lot of space for said customers to cram in expensive IBM components. It’s not too chaotic for an enclosure of its era, all things considered. There’s even some basic cable management features.

Storage accounts for most of the volume. The Models 60 and 80 offer “up to six direct access storage bays,” but what does that mean, exactly? That figure includes two externally accessible 3.5 inch bays, one of which is fitted with a 1.44MB floppy drive from the factory. The second bay was free for you to install a second floppy drive, a tape drive, or nowadays a Gotek.

Next is a full-height 5 1/4 inch drive bay. It could hold a floppy drive, optical drive, or a massive full-height hard disk. Installing a drive is straightforward—attach some AT drive rails, slide it in, and tighten the clamp using these thumbscrews. Good luck finding a bezel for most drives, though. Later Model 80s ditched full-height hard drives for a dual drive bracket. Buy a second one and—ta-da—you now have the advertised maximum of six drive bays.

A lot of drives will draw a lot of power, and a 242 watt power supply feeds all these components. That might not seem like much compared to modern workstations, but it was pretty good for its day. PC ATs maxed out at 192 watts, while desktop PS/2s offered something between 90 and 132 watts. All this juice flowed through fully modular cables, which was another rarity for the era.

What if you needed to replace a dead drive or a faulty power supply? Taking apart these towers is easier than you’d expect because they were designed to be serviced on site by corporate IT techs. The first step is to remove the front bezel. It comes off with a solid pull from the bottom, just like many modern PCs. The 3.5 inch bay covers are held in with clips and pop out easily by pushing in at the edge, and the same goes for the 5 1/4 bay bezel.

Like other PS/2s the floppy drives are attached to sleds. Unplug the floppy cable, press upwards on the tabs, and it slides right out of the case. Next is the hard drive bracket. This is easy—just loosen the thumbscrews and the bracket slides back for removal. Hard drives are installed using standard drive screws. Four screws secure the drive bay carrier to the chassis. Use a nut driver or a big flathead screwdriver to loosen them. Once the screws are removed, the bay carrier lifts up and out of the case. The power supply is just as easy. First, unplug the motherboard cable. Then unscrew the three screws securing the power supply to the chassis. Now you can grab this handle and pull up.

With all those components removed we’re left with a sizable motherboard. IBM made three different versions of the Model 80 planar, and this one’s a type 2. It’s distinguished by two ROM BIOS chips versus the type 1’s four chips, and three 32-bit MCA slots versus the type 3’s four slots. Google the FRU number if you’re unsure.

Processing power comes from a 20MHz 386DX CPU and 387DX FPU. There’s no cache on this board; that was reserved for the 25MHz type 3.

Memory for the Model 80 comes in the form of these planar memory cards. IBM shipped these -081s with one 2MB card, and this example has a second 2MB card bringing the total to 4MB. Up to 8MB could be installed on this planar via two 4MB memory cards. Finding them isn’t easy, so you might have better luck installing Micro Channel RAM expansion boards.

Here’s a neat curiosity: an add-on floppy controller plugged into the planar’s floppy port. I haven’t been able to identify this specific board or find any documentation for it, but based on the jumpers my hunch is that it came with a tape backup kit. Attached to its secondary port is a long ribbon cable that leads to a slot bracket with a 37 pin external floppy connector. Presumably this would work with the IBM External 5 1/4 inch drive, but I don’t have one to test this theory. I’m also curious if it supports more than two floppy drives, as the standard dual-drive ribbon cable is still attached to the primary port. If you have any ideas, leave me a note.

Video is provided by IBM’s base PS/2 VGA graphics chipset. A VGA controller, 256K of video memory, and an Inmos RAMDAC deliver standard VGA graphics modes like 320x200 at 256 colors.

Unlike my Model 30 and its Dallas clock chip, the Model 80 has a separate battery for its realtime clock and CMOS memory. It’s a common 6V lithium camera battery that costs around ten to fifteen bucks to replace. These machines won’t boot without a working battery, so make sure to put one on your shopping list.

Lastly, it’s time for slots and cards. Micro Channel Architecture is the defining feature of these PS/2s, and you’ll need to learn its intricacies before planning any upgrades. This Model 80 came with three cards: an IBM Token Ring Adapter, an IBM PS/2 SCSI Adapter, and an IBM Enhanced 32-bit Memory Adapter.

Before adding or removing any cards I suggest you make a note of your existing configuration. Adding, removing, or changing cards will force you to run the reference disk’s configuration utility at the next boot. I’ve got the memory card in slot 2, SCSI card in slot 4, and token ring card in slot 7.

Installing Micro Channel cards is pretty easy. Unlike ISA and PCI cards which are secured by regular screws inside the case, MCA cards are secured by thumbscrews that are outside the case. I find them a bit fiddly for my fingers, but the ends are slotted so a screwdriver or coin makes quick work of loosening them. Then each card can be lifted up and out of the case thanks to these plastic guide pieces.

Another difference between MCA and other types of cards is the bracket. One of the reported flaws of the classic card bracket is that it’s not great at stopping electromagnetic leakage. Micro Channel brackets are designed to slide into, well, metal channels formed into the back of the case. A set of springy fingers on each side of the bracket grips the channel sidewalls and makes a solid ground connection.

With all the cards removed we can see three different types of MCA slots. This planar has three 32-bit slots with matched memory extension, four standard 16-bit slots, and one 16-bit slot with Auxiliary Video Extension. Notice that despite their varying lengths, all the slots are aligned to this one key pin. A 32-bit slot has a longer connector than a 16-bit slot, but since the key pin is in the same location, a 16-bit card works in a 32-bit slot. Some 32-bit cards can be used in 16-bit slots if they have a compatibility key—just make sure the hanging edge connector doesn’t interfere with other components.

Extensions aren’t just for bit width, either. Check out this one on the front of slot 6: it’s the Auxiliary Video Extension, yet another evolutionary dead-end for PC graphics. Imagine you’re buying a PC to run CAD applications in the mid-1980s. Basic CGA and MDA graphics cards weren’t up to snuff, so you’d buy a dedicated graphics card and a matching monitor. But you can’t get rid of your original video card and monitor, because that expensive new card and monitor can’t display those old modes! So your original graphics card and monitor would live side-by-side with your new graphics card and monitor, usually with text on the former and graphics on the latter.

IBM’s first attempt at solving this problem was the beast known as the Professional Graphics Controller, or Adapter, or Array. This triple-decker circuit board sandwich layered with chips and controllers served up analog RGB video at 640x480 resolution with 256 colors out of a palette of 4096. But that’s not all—it also emulated CGA graphics modes over the same analog output! This meant you could run Lotus 1-2-3 and AutoCAD on the same monitor without needing to flip switches or change inputs.

So when you look at the PGC and its layers of complexity, you can see the logic behind the AVE. If IBM was already including a VGA chipset on every PS/2 motherboard, why not use it? Installing an AVE card trips a presence switch that enables a digital passthrough for the onboard VGA graphics. Whenever a CGA, EGA, or basic VGA mode is requested, the planar’s VGA is routed through the AVE to the add-in card’s monitor port. If you’re thinking “Wait a minute, that sounds suspiciously similar to how 3dfx cards passed through VGA signals from a 2D graphics card,” you’re not wrong! The execution’s different, but the spirit’s the same.

There’s nothing technically wrong with AVE, but it was doomed to fail. Partly because it was a Micro Channel exclusive and Micro Channel failed, but mostly because IBM underestimated the power of Moore’s law. Makers of graphics boards took advantage of the rapid pace of technological advancement to soup up VGA into more powerful Super VGA cards that could display both standard modes and their higher resolution modes without passthroughs. Even IBM had to admit it was a dead end when they integrated VGA support into XGA. But that’s enough about video for the moment. Let’s examine the cards that came with this system.

First and certainly least is a Token Ring 16/4 Network Adapter. IBM’s version of Token Ring is probably the most famous—or infamous—ring topology network. Setting up a simple Token Ring network isn’t easy, because a Multistation Access Unit is required for even the simplest scenario of connecting two computers together. Stay tuned, because I have a different solution for getting this machine on a network.

Next is the IBM PS/2 SCSI Adapter. It’s better known by its codename Tribble, as it’s one of two SCSI cards with Star Trek names. The other Star Trek SCSI card—Spock—has cache memory which, logically, should yield better performance. Model 80s won’t see much of a benefit from the cache, so a Tribble card is fine. A ribbon cable attaches to the top edge connector and provides three 50-pin connectors for internal drives. The external connection is, annoyingly, a proprietary IBM HDCN-60 pin connector. That means no external CD-ROM or Zip drives without finding a unicorn cable or adapter. At least an internal BlueSCSI can serve as a CD-ROM emulator.

Lastly, we have the IBM Enhanced 80386 Memory Adapter with ROM. This long board with the longer name is a great example of IBM’s penchant for proprietary peculiarities. Model 80 planars are limited to a maximum of 4 or 8 MB of on-board memory depending on your particular variant. Further memory expansion requires a Micro Channel memory expander from IBM or third parties like Intel, Kingston, and so on.

One downside to IBM memory adapters is that they require IBM-branded SIMMs. The system checks for presence bits on the SIMMs that match specific combinations of speed and size, and if it doesn’t find them it’ll error during POST. I bought a pair of IBM 4MB 70ns SIMMs because various sites and newsgroup posts claimed the part number was compatible. 70ns SIMMs should work in an 80ns system; they’ll just run at 80ns. Yet after I installed them the PS/2 returned an 18441 Unsupported SIMM error.

The seller had validated them as working IBM SIMMs, and I had no reason to doubt them because I’ve bought plenty of good RAM from them before. And yes, I tried it with only the 70ns SIMMs installed. Maybe 70ns SIMMs only work in this card when it’s installed in a type 3 planar, which requires 70ns RAM. I eventually gave up and bought two 80ns SIMMs that matched the one I already had, and those worked fine.

Now I have 16MB of RAM, which is great for DOS and Windows 3.1, and probably acceptable for OS/2. Coincidentally, 16MB is the soft limit for RAM in a Model 80. Yet another quirk of this machine is the 24-bit direct memory access controller, which means any DMA transfers must occur in the first 16 MB of RAM. If a bus-master card or an OS performs a DMA transfer above that barrier, well… brace yourself for a crash. That’s a bit of a bummer for a machine like the Model 80 which is designed around DMA and bus mastering.
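
To make that concrete, here’s a minimal sketch in C of the classic workaround: a bounce buffer parked below the 16MB line. The names and the low-memory allocation are hypothetical (they’d depend on the OS and driver model), but the boundary check is the heart of it.

    #include <string.h>

    #define DMA_LIMIT (1UL << 24)   /* a 24-bit DMA controller reaches only the first 16 MB */

    /* Hypothetical bounce buffer, allocated below the 16 MB line at driver init. */
    static unsigned char *bounce_buf;
    static unsigned long  bounce_phys;   /* physical address of bounce_buf, below DMA_LIMIT */

    /* Return a physical address the controller can actually reach for a
       device-bound transfer of len bytes starting at physical address phys. */
    unsigned long dma_prepare(const void *virt, unsigned long phys, unsigned long len)
    {
        if (phys + len <= DMA_LIMIT)
            return phys;                   /* already reachable; use it as-is */
        memcpy(bounce_buf, virt, len);     /* stage the data through low memory */
        return bounce_phys;                /* hand the controller the safe copy */
    }

An OS or bus-master driver that skips this check and hands the controller an address above the line is exactly the crash scenario described above.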

But where there’s a will to install more RAM, there’s a way. IBM released the Bypass One Problem Temporarily patch, which updates the reference disk and the ROM on this card to work around the 16MB limit. The requirements for this are a bit arcane—you’ll need at least two MCA memory adapters, one of which has ROM. I highly recommend reading a copy of the update instructions posted by Ian Brown on the comp.sys.ibm.ps2.hardware newsgroup. I updated my adapter by following these instructions and the process was tedious but straightforward. Pro tip: save yourself from dozens of disk swaps and extract the updated ADFs and SC.EXE from a disk image and copy them to your reference disk using a modern PC. If this is too hacky for you, boards like the Acculogic and Kingston memory expanders have their own memory mappers and can break the 16MB barrier without BOPT according to PS/2 wizard Peter Wendt. My advice is to avoid this problem entirely and stick to the maximum 16MB of RAM.

Imagine that you’ve finished restoring a PS/2 just like this one. You’ve replaced the CMOS battery, upgraded the memory, serviced the drives, and stuffed all those cards into its slots. It’d be a big white doorstop unless you have the key to start it: the reference disk. Every MCA PS/2 has a Reference Disk with utilities and configuration files to set up its BIOS, and you’ll need to fish it out whenever hardware changes are made. Odds are it’s not the only disk you’ll need, either, thanks to the architecture of IBM’s software device configuration.

Plug and play systems need a way for the BIOS or operating system to identify hardware, and IBM’s idea was Programmable Option Select. POS uses eight hardware registers to declare the card’s identity and capabilities. The first two contain adapter and manufacturer information encoded into a unique 16-bit identifier. The next four are programmable option registers that use bit masks to define the card’s option settings. They’re like a software version of DIP switches or jumpers that let you virtually select configuration options like an IRQ or port address. The last two are subaddress registers which can read or write data to additional memory on the board. I’m glossing over a lot here, but if you want more gory details about MCA’s architecture I’d recommend checking out the sources I used for research: Tube Time’s MCA Tutorial or the MCA Architecture Handbook on Archive.org.
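
In practice all of this lives behind a few I/O ports. Here’s a rough C sketch of how a setup program might read a slot’s POS registers; the ports match the MCA documentation (slot setup at 96h, POS registers at 100h through 107h), but treat the code as illustrative rather than gospel. An ID of FFFFh conventionally means an empty slot.

    #include <conio.h>   /* inp()/outp() in DOS-era Microsoft C; Borland uses inportb()/outportb() */
    #include <stdio.h>

    #define ADAPTER_SETUP  0x96    /* writing 0x08 | slot puts that slot in setup mode */
    #define POS_BASE      0x100    /* POS registers 0-7 appear here during setup */

    void dump_pos(int slot)        /* slot = 0 through 7 */
    {
        unsigned id;
        int i;

        outp(ADAPTER_SETUP, 0x08 | slot);
        id = inp(POS_BASE) | (inp(POS_BASE + 1) << 8);   /* POS 0/1: adapter ID */
        printf("Slot %d: adapter ID %04Xh, options:", slot + 1, id);
        for (i = 2; i <= 5; i++)                         /* POS 2-5: option registers */
            printf(" %02Xh", inp(POS_BASE + i));
        printf("\n");
        outp(ADAPTER_SETUP, 0x00);   /* return the slot to normal operation */
    }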

POS’ fatal flaw—unfortunate abbreviation aside—is that it doesn’t actually tell the BIOS much of anything about the card other than the unique ID. A few reserved option bits have fixed functions defined in the spec, but otherwise you’d have no idea what a card is capable of just by looking at the registers. That’s why every card needs a matching Adapter Description File. ADFs are text files which translate the unique ID into an actual card name and the option registers into settings like IRQs and I/O addresses. Without ADF files MCA cards are worthless. Back in the day these files came on Option floppies, or you could download them from a BBS and copy them to your reference diskette.
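
Every ADF follows the same basic shape. Here’s an invented example (fake ID, fake card, trimmed way down) just to show how it works: the AdapterId matches the card’s POS identifier, and each choice maps a friendly setting name onto an option register bit pattern plus the resource it claims, with the X bits left untouched.

    AdapterId 06042h
    AdapterName "Hypothetical Ethernet Adapter/A"
    NumBytes 2

    NamedItem
      Prompt "Interrupt Level"
        choice "IRQ 3"  pos[0]=XXXX00XXb  int 3
        choice "IRQ 7"  pos[0]=XXXX01XXb  int 7

    NamedItem
      Prompt "I/O Address"
        choice "280h"  pos[1]=XXX0XXXXb  io 0280h-029fh
        choice "300h"  pos[1]=XXX1XXXXb  io 0300h-031fh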

How does software configuration work in practice? Let’s demonstrate by trading out the Token Ring card for an Ethernet card; specifically this Western Digital EtherCard Plus 10T. After installing the card I’ll need to boot up with the reference disk.

God, the memory test on this is slow.

The system recognized that the adapter configuration has changed and I’m prompted to run automatic configuration. Normally this is when you’d use Copy Option Diskette to load the ADF, but I’ve already copied the ADF to the reference disk using a modern PC. The auto-configuration process takes a few minutes, and when it’s done you’re dropped into the reference disk’s main menu.

Looking in View Configuration, the Ethernet card shows up in slot 7 and we have some settings to tweak like its I/O resource, IRQ, BIOS ROM, and so on. This is POS working as designed, automatically selecting the appropriate resources based on the installed cards. Seems pretty easy, so why does everybody grouse about it?

It’s not that POS doesn’t work; it’s that it doesn’t go far enough for something that breaks backwards compatibility. POS might have eliminated jumpers and switches, but it didn’t solve the architectural issues that made them necessary in the first place. There’s still the potential for resource conflicts with an unlucky combination of cards. It doesn’t matter if the system can auto-assign an IRQ when the only one the card can select is 7. Then there’s the tedious reference and option disk shuffling, which nobody liked then and nobody likes now. If my reference disk were lost or destroyed I’d have to re-copy all the ADFs and re-do all my custom configurations. That Ethernet card didn’t come with an option disk when I bought it at a swap meet. If I hadn’t found its ADF over at the Ardent Tool, it would be utterly useless.

On the one hand IBM was working within the constraints of its era, which makes some of these decisions understandable. But on the other hand NuBus and Zorro were true plug-and-play busses, and they were developed and released at the same time! At least Micro Channel gave a blueprint for mistakes to avoid when the PCI group got around to defining the PCI configuration space.

But whatever. You’ve successfully restored and configured a Micro Channel monolith. Only one question remains—what are you going to do with it? I can hear all the shouts of “GAMES! PLAY SOME GAMES!” But look at this thing—people weren’t buying them for games. This particular Model 80 is a type 8580-081, which was announced on October 30, 1990 for a list price of $6,845—that’s almost $16,000 in today’s purchasing power. That chunk of change bought you a 20MHz Intel 386DX, 2MB of RAM, a 1.4MB floppy drive, VGA graphics, and an 80MB SCSI hard drive.

With such a lofty price tag, the only way someone could afford a machine like this was to put it to work. And wouldn’t you know it, I’ve got a productivity app perfectly suited to pay those bills: AutoCAD! Behold as it slowly paints the famous Autodesk Space Shuttle demo file line by line. I won't guess how long an actual render job would take. Honestly, this machine’s performance in Windows 3.1 is more than acceptable, even within the limitations of cooperative multitasking. 16MB of RAM and a SCSI hard drive certainly help on that front. But could I be even more productive with a higher resolution display with extra colors? If only there was a way to… extend the graphics capability of this machine. Wait a minute, there is! It’s the IBM XGA-2 graphics adapter!

This card is notable because it was IBM’s last-gasp attempt to stay in charge of PC graphics standards. Introduced in September 1992, it improved upon the first XGA card by upgrading the VRAM to a standard 1MB, adding support for 800x600 resolutions, and offering non-interlaced 1024x768 modes. And XGA-2 did all this for a list price of $360, which is a fraction of the $1,095 you’d pay for a 512K XGA-1 back in 1990.

Installing this card opens up a new world of video capabilities. A graphical OS is a much better experience on a large 1024x768 desktop with 256 colors at a flicker-free 70Hz. Or I can trade resolution for color depth and get 65,000 colors at 640x480 or 800x600. And fixed-function graphics acceleration routines prevent all this pixel-pushing from pulverizing performance.
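
Those mode combinations fall straight out of the framebuffer math on 1MB of VRAM:

    1024 x 768 x 1 byte/pixel  =   786,432 bytes   (256 colors: fits in 1MB)
     640 x 480 x 2 bytes/pixel =   614,400 bytes   (65K colors: fits)
     800 x 600 x 2 bytes/pixel =   960,000 bytes   (65K colors: just fits)
    1024 x 768 x 2 bytes/pixel = 1,572,864 bytes   (65K colors: over 1MB, no dice)

This chipset had all the ingredients to succeed VGA as the de facto PC graphics standard—so why did it fail?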

When IBM launched XGA in 1990 the landscape of PC graphics was shifting constantly. VGA was a massive improvement over CGA and EGA, but the hunger for higher fidelity graphics seemed insatiable. Companies like Genoa, Western Digital, and ATI released so-called Super VGA cards that displayed more colors or higher resolutions than regular VGA. Some Super VGA cards were clones of IBM’s 8514/A, the predecessor to XGA. Others extended VGA in their own ways which required specific drivers for each application. SVGA wasn’t really a standard; it was just a label that meant “somehow better than VGA.” IBM’s plan with XGA was to define a real successor architecture, publish the hardware specifications, and license out the chipset. Other manufacturers adopting XGA would kill the incompatible Super VGAs, just as VGA did to the Extended EGAs of yore.

That was a sensible strategy in 1987, but IBM’s influence had waned by 1990. Video card and monitor manufacturers agreed to work together and formed the Video Electronics Standards Association in July 1989. Going forward VESA would define a vendor-agnostic method to display Super VGA modes. The first round of VESA BIOS Extensions released in 1990 had their limitations, but it was a sign to application developers that the madness of writing a driver for every video card would eventually come to an end. Now they could request a VESA video mode and either the card’s VBIOS or a terminate-and-stay-resident program would answer the request by changing the graphics card to the desired mode.
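
That request is a single software interrupt. Here’s a minimal sketch in DOS-era C (assuming a compiler like Borland or Microsoft C, which provide int86() in dos.h):

    #include <dos.h>   /* union REGS and int86() */

    /* VBE function 4F02h: ask the video BIOS (or a TSR) to set a VESA
       mode. Mode 101h is 640x480 with 256 colors in the VBE numbering.
       Returns nonzero on success (AL=4Fh supported, AH=00h succeeded). */
    int vbe_set_mode(unsigned mode)
    {
        union REGS r;
        r.x.ax = 0x4F02;
        r.x.bx = mode;
        int86(0x10, &r, &r);
        return r.x.ax == 0x004F;
    }

Because the mode numbers came from the VBE spec, an app could ask for 640x480 at 256 colors without knowing or caring whose chipset answered.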

IBM was a member of VESA, and XGA cards do support VESA modes thanks to an IBM TSR. But the catch is that IBM expected developers using VESA modes to obey the spec, and—surprise—most did not. Many apps and games directly manipulated the VGA color registers while in VESA modes. XGA’s high resolution modes don’t use the VGA palette registers, so attempts to modify them result in a corrupted palette. IBM’s TSR ends up being useless, and UniVBE just hangs, so there isn’t really an alternative.
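
For reference, the offending habit is only a few port writes. This sketch (outp() as in DOS-era Microsoft C; Borland calls it outportb()) sets one palette entry the way VGA programmers had always done it:

    #include <conio.h>   /* outp() */

    /* Program one DAC palette entry by banging the VGA registers directly.
       Works on real VGA and most Super VGAs; in XGA's high-res modes these
       registers don't drive the palette, so the colors just get scrambled. */
    void set_palette_entry(unsigned char index, unsigned char r,
                           unsigned char g, unsigned char b)
    {
        outp(0x3C8, index);   /* DAC write index */
        outp(0x3C9, r);       /* red:   6 bits (0-63) */
        outp(0x3C9, g);       /* green: 6 bits */
        outp(0x3C9, b);       /* blue:  6 bits */
    }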

The practical impact of this problem on a Model 80 is minimal, because the number of games that run on a 386 with Super VGA graphics can probably be counted on one hand. The two most popular would be Links 386 and SimCity 2000, and luckily they both have XGA compatibility switches. 486 PS/2 owners with XGA will face more challenges. According to old newsgroup posts the VESA emulation in Windows 95 actually fixes the palette problem, so you might have better luck running DOS games in Windows instead.

If VESA had adopted XGA, maybe things would have been different, but IBM admitted defeat in 1993 when they switched to Cirrus-based SVGA chipsets and cards. Will an XGA-2 card help your MCA PS/2? It couldn’t hurt, especially if you like to run Windows. There’s updated drivers for Windows 9x that enable more color depths at higher resolutions. If your goal is pure DOS gaming, you might want to look elsewhere. But adding it to this Model 80 feels right. This machine deserves better than VGA graphics, and now it feels more complete.

I hope you enjoyed my tour of this monument to business arrogance. I’m not an IBM fanboy, but I have to say there’s a devilish appeal to owning something that was way out of your price range when it was new. Aaron and I agreed that if I was going to own a Micro Channel machine, this one had to be it. The only way I could make this thing even more IBM is by installing OS/2, and maybe I will some day. Until then, it’ll stay in my collection as a piece of Big Blue history.

IBM PS/2 Model 30 286 - Computers of Significant History

The slide rules, the jacquard looms, the abacus—when did you first get into collecting retro tech? We might not be going back as far as Herman Hollerith and his punchcards, but we will take a look at his great-great-grandputer. How are we gonna do it? Here in Userlandia, we’re gonna PS/2 it.

Welcome back to Computers of Significant History for another personal chapter in my personal chronology of personal computing. After examining the role of the Commodore 64 and Apple IIe in my life and yours, I’d be remiss in not addressing the big blue elephant in the room. The influence of International Business Machines could be felt everywhere before, during, and after the PC revolution. “Nobody ever got fired for buying IBM” as the old saying goes, and many people's first exposure to computers was an IBM PC being plopped onto their desk.

You might recall from a previous Computers of Significant History that a Commodore 64 was my primary home computer until October 1997 when my uncle gave me his Compaq ProLinea 486. That’s technically correct, which as you know is the best kind of correct. But that's not the whole story, because there was another computer evicted from the house that fateful October evening. Tossed out the door with the C64 was an IBM PS/2 Model 30 286 that I had acquired just six months earlier in April. As a fellow collector of old tech, you might feel bad for those computers now—but they had no feelings. And the new one was much better. Clearly I hadn't learned anything from watching The Brave Little Toaster as a kid.

That's right, my very first IBM compatible PC, the one I left out in the cold on that windy October evening, was a genuine IBM. It came with a matching color VGA monitor, a modem, a Microsoft Mouse, and a Model M meyboard-er, keyboard. With 2 MB of RAM and a 20 MB hard drive, it met the minimum requirements to run Windows 3.1, though the experience felt less like running and more like walking. Still, it could run Windows! And I so desperately wanted a machine that could run Windows, even though I couldn’t do much more than play Solitaire or type something in Windows Write.

So in an effort to bring you that same experience, I found one on eBay for a sensible price. With 1MB of RAM and a 30MB hard drive, this config would’ve retailed for $1,895 in 1989 (which is about $4700 in today's dollars). It also came with a Digiboard multiport serial card which apparently connects to a semiconductor validation machine! That's for inspecting silicon wafers for defects at chip foundries. Neat! For a computer that’s apparently lived a life of industrial servitude it wasn’t terribly dirty inside or out when it arrived. And the floppy and hard drives still work, which is impressive given their reputation for quitting without notice.

If the drives were dead, replacing them would be challenging. IBM said that the PS in PS/2 stood for Personal System, but I wouldn’t be alone in saying that “Proprietary System” would be more accurate. Not Invented Here is a powerful drug, and IBM was high on its own supply. The PS/2 series is the classic work of an addict. You might be familiar with the most famous symptom: Micro Channel Architecture, IBM’s advanced expansion bus with features like plug and play (sort of) and 32-bit data widths (sometimes). But look inside a Model 30 and you’ll see no Micro Channel slots at all. There’s three 16-bit AT bus slots, which you know better as the Industry Standard Architecture. Omitting the patented protocol had a practical purpose; the entry level models needed lower costs to maximize margins and ISA was cheaper.

But everything else inside these boxes was just as proprietary as its more expensive siblings. Need more memory? You couldn’t use ordinary 30-pin SIMMs; you needed IBM-branded SIMMs with specific pinouts for the PS/2. Floppy drive gone bad? Its power and data connections are combined into one cable with a completely different pinout than a standard drive. The hard drive is proprietary too—it uses a special edge connector and its custom ST506 controller is unique to these low-end PS/2s. Even the power supply had to be special, with this wacky lever for the power switch and a connector that’s completely different from an AT.

Thirty years on, upgrading and repairing these PS/2s is more complicated than other PCs. PS/2 compatible SIMMs command a premium on the used market, though handy hackers can rewire some traces on standard modules to make them work. If your floppy drive can’t be repaired, you’ll need an adapter to use a common one, and you’ll need to 3D print a bracket to mount it. Unless you stick to specific IBM-branded hard disks you’ll need to sacrifice one of your slots for a disk controller or XT-IDE adapter.

And this doesn’t stop with hardware. IBM rewrote its BIOS for the PS/2, so a machine like the Model 30 286 that walks and talks like an AT isn’t actually an AT and shouldn’t be treated as such. The PC AT included a Setup floppy to configure its BIOS settings, and the PS/2 retooled this concept into Reference Disks for Micro Channel models and Starter Disks for ISA models. We gripe about setup disks today, but firmware storage back then was pretty limited and a setup program took up too much space on a ROM or flash chip.

Since the 35 year old battery inside the PS/2’s Dallas clock chip had expired, I needed to replace it. Instead of dremeling out the epoxy and soldering in a new battery, I bought a replacement chip with a removable battery. Next step: the starter disk dance. This was no problem thanks to disk images from the Ardent Tool of Capitalism website. I imaged a floppy on my modern PC, popped it into the PS/2, and booted into IBM’s system setup utility. BIOS configuration is pretty painless for a machine of that era—all I needed to do was set the time. And, credit where it's due, it didn't even complain about Y2K. There’s even a system tutorial on the starter disk, which is surprisingly friendly.

Doing this setup routine reminded me of the last time I ran the starter disk. My original PS/2 came my way thanks to the generosity of one of my middle school teachers. In the spring of ’97 I was a fourteen year old seventh grader who’d earned a reputation as a computer whisperer. This was before formal district IT departments had taken over the management of my middle school’s tech stack, and computer labs were more like fiefdoms of the individual teachers who ran them. If a regular classroom had a computer, that was yet another responsibility thrust upon our overworked and underpaid educators. Precocious kids who spent too much time reading computer books could be tapped to solve pesky computer problems.

Seventh grade was also when students were introduced to their first technology classes. “Technology” was a catch-all term for classes about applied engineering. One day you could be building a balsa wood bridge and testing its weight load, while the next day you could be drawing blueprints in computer aided design software. Mr. Reardon’s computer lab was full of the early nineties PC clones that we’re all trying to recollect today. A motley collection of 386, 486, and Pentium PCs, this local area network introduced me to the magic of AutoCAD.

Across the hall was Mr. Nerrie’s shop. Kids today would call it a “maker space,” what with the fabrication machinery. There were plenty of computers mixed in amongst the lathes and saws: an old Mac Classic, a no-name 386, and a PS/2 Model 30 286. They ran software like circuit building programs, wind tunnel simulators, and bridge construction games. Though the PS/2 wasn’t a speedy machine it eventually told me all the flaws in my designs. Mr. Nerrie had picked up on my affinity for computers, and encouraged me to try board-level repairs. My only experience with building electronics was one of those Radio Shack circuit builder kits, so learning how to use a soldering iron helped me level up my hardware skills.

One morning in April I noticed the PS/2 had vanished from its desk. In its place was a 486 tower that had migrated from Mr. Reardon’s lab. Now the PS/2 was sitting near the outside door alongside a box of accessories.

I asked what had happened to the PS/2, and Mr. Nerrie said that it was destined for the dumpster. Then the gears started turning in my head. “Well, if they’re just throwing it away… can I take it?” After a short pause, he said “Sure, why not. It’s better off being used than in the scrap pile.” I hooked it up to some nearby peripherals and started with the starter disk. After setting up the BIOS, I formatted the hard drive and installed a fresh copy of MS-DOS and Windows 3.1. With a wink and a nod to cover this misappropriation of school property, I could give this machine a second life. Getting it home was a chore—thanks for picking me up, mom. After setting it up in my bedroom, I stared at the DOS prompt. Now that I had this PC, what was I going to do with it?

Baud to the Bone

A PS/2 Model 30 286 wasn’t exactly a barn burner when it was new, and by the time I got one in 1997 it was laughably obsolete. That was the year of the Pentium II and 3dfx Voodoo breaking boundaries in 3D gaming. The only thing more absurd than using a 286 every day in 1997 would be… well, using a C64 every day in 1997. But when you’re lost in an 8-bit desert, any 16-bit machine feels like a refreshing glass of water.

Compared to my Commodore, the PS/2 had some significant advantages. Its 10MHz 286 CPU wasn’t a Pentium, but it was far more capable at crunching numbers than a 6510. 2 MB of RAM dwarfed the 64K that gave the C64 its name. VGA graphics with 256K of video memory gave glorious 256 color video, which was sixteen times the 64’s sixteen colors. 1.4MB floppies had nearly ten times the storage of a 170K CBM disk. The cherry on top was the 20MB hard drive, which wasn’t much but was still better than the C64’s nothing. The only advantage the C64 had was its SID sound chip which blew the PS/2’s piezoelectric PC speaker away, Memorex style.

Sadly, this machine wasn’t meant for games. The true second wave of DOS PC gaming relied on the combined power of a 386 CPU, SoundBlaster digital audio, and VGA graphics. Even if I added a sound card to the PS/2’s VGA graphics, I would still be missing the third piece of the PC gaming triforce. At least the PS/2 could manage some rounds of SimCity Classic or Wolfenstein 3D. That was fine by me because game consoles filled in the gap. We still had our collection of regular and Super Nintendo games, and my older brother had recently bought a PlayStation thanks to money he earned from his job at Stop and Shop.

The PS/2 might have lacked gaming prowess, but it could do something that my game consoles and C64 couldn’t: connect to the outside world. Included in the accessories box was a 1200 baud Hayes Smartmodem. Before you “well, actually” me, I know that my Commodore could dial into a BBS. There were plenty of C64 BBSes back in the eighties. But our C64 didn’t have a modem and by the time I was old enough to dial in—well, there may have been some Commodore boards left, but I certainly couldn’t find them. PC BBSes, though—there were plenty of those.

Being a terminally online teen was harder in 1997. If I wanted to surf the Information Superhighway, I had to do it after school in the computer lab. Even if I could have afforded an ISP’s monthly service costs, a 286 PS/2 couldn’t run a graphical web browser. Bulletin boards were text based and thus had much lower system requirements. And they were free, so even a broke teen like me could use them. But there were strings attached. Because most boards had only one or two phone lines, every user had a connection time limit. After using your hour of credit you were done for the day. The same went for file transfers—if you wanted to download a file you needed to upload something in return lest you be branded a leech. And the Sysop who’s actually footing the bill can ban anyone at any time for any reason.

Armed with a copy of Telix I downloaded at school and a notepad of numbers from my buddy Scott, I was ready to dial in to my first bulletin board. He recommended The Maze, so I keyed in the number: 413-684-… well, I won’t say the last four digits. After some squawking and buzzing noises from the modem the PS/2 successfully connected to this magical world. I sat and watched as a logo crafted from ASCII text slowly painted line by line across the VGA monitor. 1200 bits per second was excruciating; I was used to my middle school’s blazing fast 128 kilobit frame relay connection. After this interminable wait, I was presented with a login prompt.

I had to register an account to gain access. This meant writing an introductory post to the sysop and creating a handle. Introducing myself was easy enough because Scott was already a Maze member and he could vouch for me. But a handle, that was more difficult. Bandit or Rubber Duck were too obvious. Then it struck me: I could use a character name from a video game. Even though I was hooked on the PlayStation at the time, I still had a soft spot for the Super Nintendo. Final Fantasy was one of my favorite video game series and it was full of memorable characters. It was settled: Kefka would be my handle.

After I filled out the registration form, the board said to check in the next day to see if my account was approved. After a seemingly endless day of school, I raced home to dial in and see if my request had been granted. I powered on the PS/2, launched Telix, and mashed number one in the phone book. After waiting an eternity for the logo to draw line by line, I typed in my user name and password—and it worked! The main menu slowly painted in its options: file downloads, message boards, something called doors—was the owner a Jim Morrison fan? Navigating the board took ages because of the modem’s narrow 120 character per second pipe. In hindsight it was probably a bad idea for a 14 year old kid—even a smart and precocious one like me—to be in an environment like this. The average user of The Maze was a college or late high school student with interests to match. I was a stupid newbie which meant I made a bunch of mistakes. Shep—the sysop—was a friendly enough guy who gave me some firm lessons in board etiquette, like “don’t drop carrier” and “don’t ping the sysop for a chat every time you log in.” I learned quickly enough to avoid being branded a lamer.

But while I never got into trouble online, my bulletin board adventures managed to get me into some trouble offline. Thankfully not the legal kind, as no teenage boy would ever try to download warez or inappropriate material. No, I made the most common mistake for a computer-loving kid: costing your parents money. One day I came home from school to find my mother angrily waving a phone bill in my face. My parents begrudgingly tolerated my modem usage as long as it was early in the morning or late in the evening when we weren’t expecting phone calls. However, there was one rule I literally couldn’t afford to break: no long distance calls.

Massachusetts’ 413 area code spans a lot of territory, from the Worcester County line all the way to the New York state border. You could incur toll charges calling within your own area code, and NYNEX offered a Dial-Around plan for those who wanted a consistent bill instead of surprise charges. But most people were like my parents—they didn’t make enough toll calls to justify its price and took their chances on metered billing instead. NYNEX and the other baby Bells published a list of exchanges that were within your local dialing area. Pittsfield’s list felt arbitrary—one number fifteen miles away in Adams was free, while another number eight miles away in Lee was a toll call. So I checked every board’s number to make sure I wouldn’t rack up a bill.

My cunning scheme was undone by the fact that NYNEX would occasionally break apart exchanges and shift around which ones were in your toll-free region. These weren't unannounced, but I was fourteen and I didn't read the phone bill. One local BBS number turned into a long distance call overnight, and I wasn’t checking the toll status after marking local numbers. Unbeknownst to me I was racking up toll charges left and right. My free computer and free bulletin boards wound up costing me eighty dollars, and I had to pay back every penny to the bank of mom and dad.

Most of my memories of the summer of 1997 revolve around the PS/2 and the hours I spent dialing in to various bulletin boards. Another teacher lending me a faster modem certainly helped. Mrs. Pughn, who ran the Mac lab, lent me a US Robotics Sportster 9600 modem to use over the break. This was a far more usable experience than my 1200 bit slowpoke. Menu screens painted almost instantly. I could download a whole day’s worth of message board posts for offline reading in two minutes instead of fifteen. That saved valuable time credits that could be spent playing door games instead.

Every bulletin board had its own flavor imparted by the sysop that fronted the cash for its lines and hardware. This was especially true of The Maze, which ran CNet BBS software on an Amiga 4000. I learned that a door wasn’t a music reference but a term for games and software launched by the BBS. One of my favorite door games was Hacker, where other board users set up simulated computer systems that everyone else tries to take down with virtual viruses. I played a decent amount of classics like Legend of the Red Dragon and TradeWars 2002, but nothing quite lived up to Hacker in my eyes.

My uncle’s hand-me-down 486 came with a 28.8K modem, and that opened up even more BBS opportunities. Downloading software, listening to MOD music, and even dial-up Doom sessions were now part of daily life. But the 486 also brought the Internet into my home. By 1997 America Online had an internet gateway and could access the World Wide Web. My uncle was an AOL subscriber, and he convinced my dad to sign up too. Now that I was browsing the web at home, how could a BBS ever compete? BBSes were already declining in 1997, but 1998 was when things really fell apart in the 413. One by one boards went dark as their owners traded BBS software for web servers. I spent the summer of ’98 crafting my first Geocities homepage and getting sucked into my first online fandom: Final Fantasy VII.

By the fall of 1998 The Maze had shut down, and my regular BBS usage died with it. The web was just too compelling. Some 413 regulars tried to set up an IRC channel called 413scene on the EFnet network, but it didn’t last beyond 1999. I still remember boards like The Maze, Mos Eisley, and The Void the way people remember old nightclubs. Handles like Menelaus and Boo Berry still stick with me even though I have no way of contacting them. If you were active in the 413 scene in the late nineties, send me a message. Maybe we crossed digital paths.

The Big Blue Meanies

Once upon a time a personal computer was any kind of small system meant for a single user. But somewhere along the way the term came to mean something much more specific: an IBM compatible personal computer. One by one the myriad independent microcomputers of the 1980s succumbed to the “IBM and Compatible” hegemony. Even if you stood proud aboard another platform, it was only a matter of time until IBM and the armada of cloners aimed their cannons at your ship’s market share.

But despite having created the leading personal computing platform of its day, IBM wasn’t as in control as they thought. Lawsuits could stop the blatant copies of their BIOS made by the likes of Eagle—but not clean-room reverse engineering. Cloners were grudgingly tolerated because they followed IBM’s standards. But the sheer effrontery of a cloner thinking they can dictate a standard?! Preposterous! We are IBM! We are personal computers!

So when Intel announced the 80386 CPU in October 1985, everyone in the industry—especially Intel—expected IBM to adopt the new 32-bit processor. The 286’s segmented memory model was unpopular with programmers, and the 386 addressed those criticisms directly with an overhauled memory model. And Intel managed to do this without breaking backwards compatibility. This was perfect for the next generation of PCs—great performance for today and 32-bit capability for tomorrow. But the mood in Armonk was muted at best.

Intel planned to ship the 386 in volume in June of 1986, but IBM’s top brass was skeptical that Intel could hit that date based on prior experience with the 286. They also thought it was too soon to be replacing the two-year-old PC AT. Big Blue’s mainframe and workstation divisions thought a 32-bit personal computer would encroach on their very expensive turf. This was in direct conflict with IBM’s PC pioneer Don Estridge, who saw the potential of the 386 as he watched its development in late 1984 into early ‘85. He wanted to aggressively adopt the new CPU, but he faced tough internal barriers. Estridge was losing other political battles inside IBM, and by March 1985 he lost his spot atop the Entry Systems Division to Bill Lowe. After Estridge tragically died in the crash of Delta flight 191 in August 1985, there was no one left in the higher echelons of IBM to advocate for the 386. They were committed to the 286 in both hardware and software. And so IBM gave Intel a congratulatory statement in public and a “we’ll think about it” in private.

But you know who didn’t have any of those pesky internal politics? A feisty little Texas startup called Compaq. Intel wanted a partner to aggressively push the 386, and Compaq wanted to prove that they were more than just a cloner. This was a pivotal moment in the evolution of the PC, because Compaq and Intel weren’t waiting around for IBM to advance the industry standard. Compaq launched the DeskPro 386 in September 1986. It was effectively a slap in the face from Compaq CEO Rod Canion and chairman Ben Rosen, daring IBM to release a 386 machine in six months or lose their title of industry leader. Such a challenge could not go unanswered. Seven months later in April 1987 IBM announced the Personal System/2, a line of computers that thoroughly reimagined what the “IBM PC” was supposed to be. Big Blue would exert their overwhelming influence to change the course of an entire industry, just like they did six years earlier. Or, at least, that was the plan.

The PS/2’s development history isn’t well documented—contemporary sources are thin on details about the engineering team, and there are no present-day oral histories or deep dives from primary sources about its development timeline. According to the book Big Blues: The Unmaking of IBM by Wall Street Journal reporter Paul Carroll, there wasn’t a single 386 chip inside IBM when Compaq announced the DeskPro. I do know that Chet Heath, the architect of Micro Channel, started designing the bus back in 1983. So when Bill Lowe was forced to react to Compaq’s challenge, he saw an opportunity to pair Micro Channel with the 386. IBM announced the Model 80 386 along with its siblings in April of ’87, but it didn’t actually ship until August. That was almost a year after Compaq launched the DeskPro 386. Compaq issued that six-month challenge because they were confident they'd win.

Then again, IBM didn’t see it that way. They weren’t designing just a single computer, but a whole family of computers. The PS/2’s launch lineup—the Models 30, 50, 60, and 80—covered every segment of the market from basic office PC to monster workstation server tower. Besides Micro Channel the PS/2 would usher in new technologies like ESDI and SCSI hard drives, VGA graphics, bidirectional parallel ports, high density 1.4MB 3 1/2 inch floppies, and tool-less case designs. IBM was sure the PS/2 would redefine the trajectory of the entire industry. And, to be fair to Big Blue, that's exactly what happened. Just… not in the way they'd hoped.

Aesthetically the PS/2 was a clean break from IBM’s previous PCs. IBM always had a flair for industrial design, and a PS/2’s bright white chassis featured sharp angles, repeating slats, and a signature blue accent color. They wouldn’t be mistaken for any old clone, that’s for sure. These sharp dressed desktops were also considerably smaller than previous PCs. Integrating more standard features on the motherboard meant fewer slots and cards were needed for a fully functioning system. Disk drives took up a lot of space, and using 3 1/2 inch floppy and hard disk drives let them save many cubic inches. Ultimately, the Model 30's slimline case was almost half the size of a PC AT by volume.

A consequence of this strategy was abandoning the standard AT motherboard layout. Now each model had its own specially designed motherboard—excuse me, planar—and the number of slots you got depended on how much you were willing to spend. The entry-level Model 30 only had three 8-bit ISA slots. The mid-range Model 50 bought you four 16-bit MCA slots. The high-end Models 60 and 80 came in tower cases with eight MCA slots each, three of which were 32-bit in the Model 80 to unleash the power of the 386. It’s ladder-style market segmentation that only an MBA could love.

IBM had a very specific customer in mind when they designed the Model 30: businesspeople who spent all day in WordPerfect or Lotus 1-2-3 and weren’t particularly picky about performance. IBM took the aging PC XT, dumped it in a blender with some PS/2 spices for flavor, and the result was a slimline PC with just enough basic features for basic business work. And if IBM sold a few to the home office types, well, all the better. The stylish new Model 30 was your ticket to the modern business world, and your friends at IBM were happy to welcome you aboard their merry-go-round of service agreements and replacement parts.

A launch day Model 30 with an 8086 CPU, 640K of RAM, 720K floppy drive, and 20 MB hard disk retailed for $2,295 in 1987. If you were strapped for cash, ditching the hard disk for a second floppy would save you $600—which you could use to buy yourself a monitor, because those weren't included. And don't go thinking you'd get full-fat VGA graphics for that price, either—the Model 30 had MCGA, a skim version of VGA that was never widely supported.

PC XT clones were still a thing in 1987, but they were advertised as bargain machines, which the Model 30 was decidedly not. Flipping through the PS/2’s launch issue of PC Magazine shows the state of the market. Michael Dell’s PCs Limited would happily sell you a Turbo XT clone with EGA graphics, 20MB hard drive, and color monitor for $1699. Or you could get an 8MHz 286 for the same price as a Model 30 with hard drive. If you were feeling more adventurous, the back-page advertisers were offering monochrome Turbo XTs for $995 and under. Things aren’t much better when we compare the Model 50 and 60 against Turbo AT clones, let alone the Model 80. CDW offered Compaq DeskPro 386s with 40 meg drives for $4879. A base Model 80—with 1MB of RAM, a 1.4 Meg floppy, and a 44MB hard disk—was just under $7,000.

And IBM’s prices didn’t get any better despite improvements to the Model 30. Fourteen months later, the Model 30 286 came out. With 512K of RAM, VGA graphics, and—oops, no hard drive—it could be yours for just under two thousand dollars. If you wanted a 20MB hard drive, that was a mere $600 extra! These prices looked even worse a few months later, when the cloners would sell you a 10MHz AT with the same specs and a color VGA monitor for $1899. The price for a Model 30 286 with 1MB RAM and 20MB hard disk fell to $1795 in late 1989, but the competition was getting even tougher. By 1990 you could get double the memory, double the hard disk space, and double the processor speed for the same price. All IBM could muster in 1990 was a 40MB hard drive option before discontinuing the Model 30 in 1991.

It’s a pretty cool coincidence that IBM announced the Model 30 286 on the same day that Compaq and eight other manufacturers announced the 32-bit Extended Industry Standard Architecture bus. The new Model 30 286 was seen as an admission by IBM that they couldn’t quite kill ISA despite Big Blue’s protestations. For all of Micro Channel’s vaunted technical superiority, IBM had a hard time convincing others to adopt the bus. The main impediment was royalty fees. Building MCA machines required a license from IBM plus a royalty of up to five percent of your revenue for each machine sold—and back payments for all prior machines sold with the AT bus. Some clone makers did end up licensing MCA. Tandy was the first to sell a third-party MCA machine, and ALR and NCR produced a decent amount. But vendors like Dell backed out because MCA was either too expensive or difficult to implement. And even if you did put the money and effort into it, Micro Channel clones were slow sellers.

Peripheral makers weren’t doing any better. Micro Channel introduced the concept of vendor IDs, which told the computer who made the board and what kind of board it was. These IDs were a requirement for MCA’s self-configuration ability. But IBM slow-rolled the assignment of those IDs, leaving board makers like AST in the lurch when IBM didn’t answer their phones—and that's not just an expression, IBM literally didn't answer when AST called them. Even when IBM got around to assigning IDs, sometimes their left hand issued ones that their right hand had already issued to different vendors, resulting in device conflicts. For a while there was a risk that being “Compliant” would be no better than just assigning yourself an ID number and hoping it wouldn't conflict.

By 1992 Micro Channel’s window of opportunity had closed despite IBM adding it to their RISC workstations. EISA gained a foothold in the server market, VESA launched a new Local Bus standard for graphics cards, and Intel was busy drafting the first version of PCI. The PS/2 wasn’t a complete failure, because IBM did sell a lot of them, but its signature feature ironically worked against their plans to reclaim control of the PC market. Its real legacy was its keyboard and mouse ports along with the VGA graphics standard, because IBM didn't keep them in its hoard along with Micro Channel.

By the fall of 1994 the Personal System brand was dead. The entry-level PS/2s, the PS/1, and the PS/ValuePoint systems for home users fell in battle and were replaced by the Aptiva line of PCs. Higher-end PS/2s were replaced by the “PC Series” computers, which is totally not a confusing name. The clones had won so decisively that they had evolved beyond that simplistic moniker. Compaq, Gateway, Dell, and others joined with Intel to define the standard for PC hardware, and the era of “IBM and compatible” was well and truly dead. It was replaced by the Wintel platform of x86 Intel processors running Microsoft Windows. It was like the end of Animal Farm—or maybe Server Farm. Looking from IBM to the cloners and back to IBM, and not being able to tell which was which.

Mr. Big Blue Sky

I still have some misplaced fondness for IBM, even though they haven’t manufactured a PC for nearly twenty years. One part comes from that summer of 1997 where my scrap-heap PS/2 was my way of connecting to a new and unfamiliar world. Another part is industrial design. They’re not Apple, but you can look at an IBM product and know that it was designed by people with principles. The last part is the fact that IBMs were Serious Computers for Serious Business, and having one meant you were a Serious Somebody.

But in order for “nobody ever got fired for buying IBM” to be true, IBM had to remain a safe bet. And while PS/2s were generally reliable and performant machines, IBM had stepped on several strategic rakes with Micro Channel. That, plus uncompetitive pricing, meant that IBM products weren’t automatically the safe choice. The mark against clones was that they could be incompatible in ways you didn’t expect. Or, at least, that was true in the mid-80s, but by the time 1990 rolled around BIOS developers like Phoenix, Award, and American Megatrends had everything sorted out. Even if the cloners were competitors, they could work together when it counted and develop standards like EISA and ATA to fill in their technological gaps. If IBM products couldn’t actually do the job you wanted them to do, what was the point of sticking with Big Blue?

So consider the Model 30 286 in this scenario. Because it used 16-bit ISA slots and was an unassuming office machine, it was able to wiggle into more places than a Micro Channel Model 50 could. That’s why the Model 30 286 sold as well as it did to business and government customers, even when faced with stiff clone competition. But even those sales dried up when 286 PCs stopped being competitive. The release of Windows 3.0 and the ascendancy of PC multimedia energized the home PC market, which is where IBM historically had problems—hello, PCjr. It’s not that they didn’t try—see the PS/1 and PS/ValuePoint computers—but, like today, most people were more price sensitive than brand loyal. When they were flipping through mail-order catalogs of Dells and Gateways or going down the aisles of Nobody Beats the Wiz, they weren’t going to spend an extra 20% on the security blanket of an IBM badge. After all, "nobody ever got fired" only really applies to jobs. This was reflected in IBM’s balance sheet, where they posted billions of dollars of losses in 1993.

Thankfully for IBM there was still a place for them to flex their innovative and proprietary muscles, and that was the laptop market. The ThinkPad started out as a PS/2 adjacent project, and the less price sensitive nature of business travelers meant IBM didn’t have to worry about economizing. Featuring tank-like build quality wrapped in a distinctive red-on-black industrial design, the ThinkPad actually stood out from other PC laptops thanks to innovations like the TrackPoint and the butterfly keyboard. But the ThinkPad is best saved for another episode.

I can't recommend a Model 30 286 for, say, your primary retro gaming PC. It’s slower and harder to fix than contemporary AT clones and isn’t quite up to snuff for enjoying the greatest hits of DOS gaming. 386 and 486 PS/2s might have enough CPU power, but finding Micro Channel sound cards is a pain and the same proprietary pitfalls apply. But that doesn’t mean they’re not worth collecting—just have one as your secondary machine or run software that plays to its strengths. They look sharp on display, especially if you can scrounge up all the matching accessories. Besides, Micro Channel machines are historical artifacts that provide a window into another weird, forgotten standard that lost its war. Kinda like Betamax, or SmartMedia cards, or HD-DVD.

Now that we’re thirty years beyond the turmoil of Micro Channel and the clone wars, a PS/2 is no longer bound by the rules and desires of its creator. A computer may just be a tool, and unfeeling Swedish minimalists might say that our things don’t have feelings, but we’re humans and we have feelings, damnit. And an object can express our feelings through its design or function. So while my specific PS/2 didn’t go on a globetrotting adventure to find its way back home, I think spiritually it’s found its way back, and I’ve lovingly restored it just as the Master did to his toaster. While some might scoff at a Model 30 and say “no true PS/2 has ISA slots,” the badge on the front is pretty clear. Although its tenure in my life was a short one, this system was pretty personal to me.