Dropbox Drops the Ball


You never know when you’ll fall in love with a piece of software. One day you’re implementing your carefully crafted workflow when a friend or colleague DMs you a link. It’s for a hot new utility that all the tech tastemakers are talking about. Before you know it that utility’s solved a problem you never knew you had, and worked its way into your heart and your login items. The developer is responsive, the app is snappy, and you’re happy to toss in a few bucks to support a good product. But as time goes on, something changes. The developer grows distant, the app eats up all your RAM, and you wonder if it’s still worth the money—or your love.

That’s my story with Dropbox, the app that keeps all your stuff in sync. I still remember the day—well, my inbox remembers the day. It was June 2nd, 2010, when my coworker Stephen strolled into my cubicle and said “Hey, I started using this Dropbox thing, you should check it out.” Stephen has a habit of understatement, so from him that's high praise. Minutes later I registered an account, installed the app, and tossed some files into my newly minted Dropbox folder. It was love at first sync, because Dropbox did exactly what it said on the tin: seamlessly synchronize files and folders across computers with speed and security. A public folder and right-click sharing shortcuts made it easy to share images, files, and folders with anyone at any time. I could shuttle documents back and forth from work without relying on a crusty old FTP server. This utility was a direct hit to my heart.

How Dropbox Beat Apple at File Sync

Of course, remote file sync wasn’t a new concept to me—I’d used Apple’s iDisk for years, which was one of many precursors to Dropbox. Mac users could mount an iDisk on their desktop and copy files to Apple servers with just the classic drag and drop. Applications could open or save files to an iDisk like any other disk drive. Yet despite this easy-breezy user interface, the actual user experience of iDisk left a lot to be desired. Let’s say you have a one megabyte text file. Your Mac would re-upload the entire one meg file every time you saved it to an iDisk, even if you only changed a single character. Today, "ooh, we had to upload a full meg of text every time" doesn't sound like any sort of problem, but remember: iDisk came out in 2000. A cable modem back then could upload at maybe 512 kilobits per second—and yes, that's kilobits, not kilobytes. So a one-character change meant at least a sixteen-second upload, during which your app would sit there, unresponsive. And this was considered super fast at the time—not compared to the immediate access of your local hard disk, of course, but trust me, dial-up was much, much worse. The sensible thing was to download the file from your iDisk to your hard drive, work on it, and copy it back when you were done. In practice, that was no different from using FTP.
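For the record, that sixteen seconds isn't hyperbole. Here's the back-of-the-envelope math, assuming an ideal 512 kilobit uplink with zero protocol overhead:

```python
# Time to re-upload a 1 MB file over a 512 kbit/s cable uplink (best case, no overhead).
file_size_bits = 1 * 1024 * 1024 * 8    # one megabyte, expressed in bits
uplink_bits_per_second = 512 * 1024     # 512 kilobits per second (bits, not bytes!)
print(file_size_bits / uplink_bits_per_second)  # -> 16.0 seconds
```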

Needless to say, Apple felt they could do better. Steve Jobs himself announced major changes to iDisk in Mac OS 10.3 Panther at the 2003 WWDC Keynote.

“We’ve enhanced iDisk significantly for Panther. iDisk, as you know, is for our .Mac customers. The hundreds of thousands of people that signed up for .Mac. And iDisk has been a place where you can manually upload files to the .Mac server and manually download them. Well, that’s all changing in Panther, because in Panther we’re automatically syncing the files. And what that means is that stuff that’s in your iDisk will automatically sync with our servers on .Mac—in both directions—and it does it in the background. So what it really means is your iDisk becomes basically a local folder that syncs. You don’t put stuff in your iDisk to send it up to .Mac, you leave it in your iDisk. You can leave a document in your iDisk, open it up, modify it, close it, and the minute you close it, it will sync back up to .Mac in the background automatically.

“So you can just leave stuff in your iDisk, and this is pretty cool. It’s a great way to back stuff up, but in addition to that it really shines when you have more than one computer. If I have three computers here, each with their own iDisk, I can leave a copy of the same document in the iDisk of each one, open up the document in one of those iDisks, change it and close it, and it’ll automatically sync back through .Mac to the other two. It’s really nice. In addition to this, it really works when you have untethered portables. You can be out in the field not connected to a network, change a document in your iDisk, the minute you’re connected whether you walk to an AirPort base station or hook back up to a terrestrial net, boom—that document and its change will automatically sync with .Mac.”

It’s hard not to hear the similarities between Steve’s pitch for the new iDisk and what Drew Houston and Arash Ferdowsi pitched for Dropbox. But even with offline sync, iDisk still had speed and reliability issues. And even after Apple finally ironed out iDisk’s wrinkles, it and iCloud Drive still trailed Dropbox in terms of features. Apple had a five-year head start. How could they lose to Dropbox at the "it just works" game?

Houston and Ferdowsi’s secret sauce was Dropbox’s differential sync engine. Remember that one meg text file from earlier? Every time you overwrite a file, Dropbox compares it against the previous version. If the difference is just one byte, then Dropbox uploads only that byte. It was the feather in the cap of Dropbox’s excellent file transfer performance. Its reliability and speed left iDisk in the iDust. Yet all that technowizardry would be worthless without an easy user experience. Dropbox’s deep integration into Windows Explorer and the Macintosh Finder meant it could slot into almost any file management workflow. I knew at a glance when file transfers started and finished thanks to dynamic status icons overlaid on files and folders. Clumsy network mounts were unnecessary, because Dropbox was just a plain old folder. Best of all, it was a cross-platform application that obeyed the rules and conventions of its hosts. I was so smitten with its ease of use and reliability that I moved a full gig of files from iDisk to Dropbox in less than a week.
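Dropbox never published the engine's internals, but the core idea is easy to sketch. Here's a toy block-level version in Python. The names, the SHA-256 hashing, and the four-megabyte block size are my assumptions for illustration; Dropbox's real delta sync worked at a much finer grain than whole blocks:

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # an assumed chunk size, purely for illustration

def block_hashes(path):
    """Hash a file block by block; the hash list stands in for the last synced version."""
    with open(path, "rb") as f:
        return [hashlib.sha256(chunk).hexdigest()
                for chunk in iter(lambda: f.read(BLOCK_SIZE), b"")]

def blocks_to_upload(old_hashes, path):
    """Yield (index, data) only for blocks that differ from the previous version."""
    with open(path, "rb") as f:
        for i, chunk in enumerate(iter(lambda: f.read(BLOCK_SIZE), b"")):
            if i >= len(old_hashes) or hashlib.sha256(chunk).hexdigest() != old_hashes[i]:
                yield i, chunk  # a one-character edit re-sends one block, not the whole file
```

Bolt on filesystem change notifications and a server that stores blocks by hash, and you have the skeleton of a sync client. The hard parts, like conflict resolution and resumable transfers, are where the real engineering went.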

Dropbox fulfilled iDisk’s original promise of synchronized web storage, and its public launch in September 2008 was a huge success. A free tier was available with two gigs of storage, but if you needed more space you could sign up for a fifty-gig Dropbox Plus plan at $9.99 per month. Today that same price gets you two terabytes of space. And Plus plans weren't just about storage space—paying users got more collaboration features, longer deleted file recovery times, and better version tracking. And yes, I realize that I'm starting to sound like an influencer who wants to tell you about this fantastic new product entirely out of pure unsullied altruism. Trust me, though—that’s not where this is going. Remember: first you fall in love, then they break your heart. Dropbox's core functionality was file syncing, and this was available to freeloader and subscriber alike.

Dropbox Giveth, and Dropbox Taketh Away

This isn’t an uncommon arrangement—business and professional users will pay for the space and version tracking features they need to do their jobs. But in March 2019, Dropbox dropped the number of devices linked to a basic free account from unlimited… to three. The only way to raise the device limit was upgrading to a Plus plan. Three devices is an incredibly restrictive limit, and basic tier users were caught off guard. My account alone had seven linked devices: iPhone, iPad, MacBook Pro, desktop PC, two work computers, and work phone. Dropbox’s intent with this change was clear—they wanted to shed unprofitable users. If a free user abandons Dropbox, that’s almost as helpful to their bottom line as that same user paying to upgrade.

Speaking of their bottom line, Dropbox Plus plan pricing actually went up to $11.99 per month soon after the device limit change. To keep the $9.99 per month price, you have to commit to a one-year subscription. There are also no options for a lower-priced tier with less storage—it’s two terabytes, take it or leave it. In comparison, Apple and Google offer the same two terabytes for $9.99 per month with no yearly commitment. Both offer 200 gigs for $2.99 per month, and if that’s still too rich they offer even cheaper plans. Microsoft includes one terabyte of OneDrive storage when you subscribe to Office 365 for $6.99 a month, and if you’re already an Office user that sounds like a sensible deal. If you’re a basic user looking for a more permanent home, the competition’s carrots look a lot better than Dropbox’s stick.

Even paying users might reconsider their Dropbox subscriptions in the wake of behavior that had left user-friendly far behind, and was verging on user-hostile. Free and paying users alike grumbled when Dropbox discontinued the Public folder in 2017, even though I understand why they cut it. People were treating the Public folder as a webhost and filesharer, and that was more trouble than it was worth. But compared to the device limit, killing the public folder was a minor loss. Photo galleries suffered the same fate. Technically savvy users were annoyed and alarmed when they noticed Dropbox aggressively modifying Mac OS security permissions to grant itself levels of access beyond what was reasonably expected. And even if paying users didn't notice the device limits or the public folder or the photo album or the security misbehaviors... they definitely noticed the new Dropbox client introduced in June 2019.

Dropbox Desktop

This is what Dropbox thought people wanted. From their own blog.

A zippy utility was now a bloated Chromium Embedded Framework app. After all, what's a file sync utility without its very own Chromium instance? While the new client introduced many new features, these came at the cost of resources and performance. Dropbox wasn’t just annoying free users; it was annoying paying customers by guzzling hundreds of megabytes of RAM and gobbling up CPU cycles. With an obnoxious new user interface and, for several months, irritants like an icon that wouldn't let itself be removed from your Dock, the new client made a terrible first impression.

The Apple Silicon Compatibility Kerfuffle

The latest example of Dropbox irritating customers is their lateness in delivering a native client for Apple’s new processors. Apple launched the first ARM-based Macs in November 2020, and developers had dev kits for months before that. Rosetta emulation allows the Intel version of Dropbox to run on Apple Silicon Macs, but emulation inflicts a penalty on performance and battery life. With no public timelines or announcements, users grew restless as the months dragged on. When Dropbox did say something, their response rang hollow. After hundreds of posts in their forums requesting an ARM-native client, Dropbox support replied with “[Apple Silicon support] needs more votes”—definitely not a good look. Supporting an architecture isn't a feature, it's part of being a citizen of the platform! Customers shouldn't have to vote for that like it's "add support for trimming videos," it's part of keeping your product viable.

Niche market software usually takes forever to support new architectures on Mac OS or Windows, but Dropbox hasn't been niche since 2009. I expect better from them. I’ve worked for companies whose management let technical debt like architecture support accumulate until Apple or Microsoft forced our hands by breaking compatibility. But our userbase was barely a few thousand people, and our dev teams were tiny. Dropbox has over fifteen million paying users (not counting the freeloaders), a massive R&D budget, and an army of engineers to spend it. The expectations are a bit higher. After multiple Apple-focused news sites highlighted Dropbox’s blasé attitude towards updating their app, CEO Drew Houston said that they hoped to be able to support Apple Silicon in, quote, "H1 2022.” More on that later.

Compare Dropbox’s response to other major tech companies like Microsoft and Adobe. Microsoft released a universal version of Office in December 2020—just one month after Apple shipped the first M1 Macs. The holy trinity of Adobe Creative Suite—Photoshop, Illustrator, and InDesign—were all native by June 2021. Considering these apps aren’t one-button recompiles, that’s a remarkably fast turnaround. On the other hand, this isn’t the first rodeo for Microsoft and Adobe. Both companies lived through the PowerPC, Mac OS X, and Intel transitions. They know firsthand that botching a platform migration costs goodwill. And goodwill is hard to win back.

Dropbox is young enough that they haven’t lived through Apple’s previous architecture changes. Apple announced the start of the Intel transition in June 2005, and shipped Intel Macs to the public in January 2006. Dropbox's public launch wasn't until September 2008, and their app supported both Intel and PowerPC from the start. Before the Apple Silicon announcement, the closest thing to a “transition” that Dropbox faced was Apple dropping support for 32-bit apps in Mac OS Catalina. Fortunately, Dropbox was prepared for such a move: they'd added 64-bit support to the client in 2015, two years before Apple hinted at the future demise of 32-bit apps at WWDC 2017. When Catalina arrived in 2019 and axed 32-bit apps for good, Dropbox had nothing to worry about. So why is it taking so long to get Dropbox fully ARMed and operational—pun intended?

One culprit is Dropbox’s GUI. Dropbox uses Chromium Embedded Framework to render its JavaScript UI code, and CEF wasn’t Apple Silicon native until July of 2021. My issues with desktop JavaScript frameworks are enough to fill an entire episode, but suffice it to say Dropbox isn’t alone on that front. Some Electron-based apps like Microsoft Teams have yet to ship ARM-native versions on the Mac despite the OpenJS Foundation releasing ARM-native Mac OS artifacts in Electron 11.0 in November 2020. I get it: dependencies are a bear—or, sometimes, a whole family of bears. But this is a case where some honest roadmapping with your customers earns a lot of goodwill. Microsoft announced Teams’ refactoring to Edge WebView2 back in June, so we know something is coming. Discord released an ARM-native version in their Canary nightly build branch back in November. Compare that to Spotify, which also uses CEF. They too fell into the trap of asking for votes for support on issues raised in their forum. Even so, Spotify managed to get a native beta client out in July and a release version in September.

CEF isn’t Dropbox’s only dependency problem, but it’s certainly the most visible. I’m sure there are plenty of Dropbox tech support people, QA engineers, and software devs who aren’t happy about the current state of affairs, and I’ve got plenty of sympathy for them. Because I’ve been in that situation, and it stinks. Paying customers shouldn’t have to complain to the press before they get an answer from the CEO about platform support.

The Cautionary Tale of Quark

Dropbox should heed the tale of Quark and its flagship app, QuarkXPress. Back in the nineties, printing and graphic arts professionals were the heart of the Mac’s user base, and QuarkXPress was a crucial ingredient in their creative soup. Apple announced Mac OS X in January 2000, and the new OS would feature badly needed modernizations like preemptive multitasking and protected memory. But—and this might sound familiar—existing apps needed updates to run natively under the new OS. To expedite this, Apple created the Carbon framework for their long-time developers like Adobe, Microsoft, Macromedia... and Quark. Carbonizing was a faster, easier way to update apps for Mac OS X without a ground-up rewrite. Apple needed these apps for a successful OS transition, so it was in everyone’s interest for developers to release Carbon versions as fast as possible.

The Carbon version of XPress 5.0 previewed in Macworld.

How long did it take developers to release these updates? Remember, Mac OS 10.0 came out in March 2001, and it was very raw. Critical features like disc burning and DVD playback were missing in action. Even if some users could live without those features, it was just too slow to be usable day-to-day. It wasn't until the 10.1 update in September 2001 that you could try to use it on a daily basis, instead of poking at a few apps, saying "cool" and then going back to OS 9 to get some work done. So Microsoft’s release of Office v.X for Mac in November 2001 was timed perfectly to catch the wave of new 10.1 users. Adobe wasn’t doing the whole Creative Suite thing at the time, so apps were released on their own schedules. Adobe’s Carbon conversions started with Illustrator 10 in October 2001, InDesign 2.0 in January 2002, and Photoshop 7.0 in March 2002. Macromedia was one of the first aboard the OS X train, releasing a Carbon version of Freehand in May 2001. Dreamweaver, Fireworks, and Flash all got Carbon versions with the MX Studio suite in the spring of 2002. Even smaller companies managed it—Extensis released a Carbon version of their font manager Suitcase in November 2001!

One year after the launch of Mac OS X, a working graphic designer could have an all OS X workflow, except for, you guessed it... QuarkXPress. How long would Quark make users wait? Well, in January 2002, they released QuarkXPress 5.0… except it wasn't a Carbon app, and it only ran in classic Mac OS. Journalists at the launch event asked about OS X, of course, and Quark PR flack Glen Turpin promised the Carbon version of QuarkXPress would be here Real Soon Now:

“The Carbon version of QuarkXPress 5 will be the next upgrade. There’s one thing we need to do before the Carbon version of QuarkXPress 5 is released: We need to launch QuarkXPress 5.0 in Europe.”

Would you believe that Quark, a company notorious for slow and unpredictable development, never shipped that promised Carbon update for version 5.0? Quark customers had to wait until QuarkXPress 6.0 in June 2003 for an OS X native version. Users who'd bought 5.0 had to upgrade again. And users who'd stayed with 4.x got charged double the price of a 5.0 upgrade—and yes, that's for upgrading to 6. Ask me how I know. Quark’s unfashionable lateness to the OS X party was another log in the chimney fire of failing customer relations. For all of QuarkXPress’s virtues, Quark charged out the ear for upgrades and tech support, and their leadership was openly hostile to customers. Quark CEO Fred Ebrahimi actually said that if you didn't like Quark's support for the Mac, you could, and I quote, “Switch to something else.” He thought that meant QuarkXPress for Windows. What it actually turned out to mean was Adobe InDesign.

The moral of the story is that customer dissatisfaction can reach a tipping point faster than CEOs expect. You can only take users for granted for so long before they decide to bail. Quark squandered fifteen years of market leadership and never recovered. Dropbox isn’t the only cloud storage solution out there, and they’d be wise to remember that. Google Drive and Microsoft OneDrive have native ARM clients in their beta channels. Box—not Dropbox, just plain old Box—released a native client in November 2021. Backblaze also has a native client, and NextCloud’s next release candidate is ARM native too.

When I was writing this episode, I had no idea when Dropbox would finally deliver an ARM-native client. The only clue I had was Houston’s tweet about the first half of 2022. At the time, I thought that “first half” could mean January. It could mean June. It could mean not even by June. Your guess would have been as good as mine. In my final draft I challenged Dropbox to release something in the first quarter of 2022. Imagine my surprise when, just before I sat down to record the first take of this episode, Dropbox announced an upcoming beta version supporting Apple Silicon. This beta was already in the hands of a small group of testers, and was released to the public beta channel on January 13. I had to make a few… minor revisions after that. There’s still no exact date for a full final version—I’ll guess, oh, springtime. Even though that challenge hadn’t been published yet, I still wrote it, and pretending I didn’t would be dishonest. I am a man of my word—you got me, Dropbox. Still, that doesn’t make up for poor communication and taking your users for granted. You still got work to do.

My Future with Dropbox and Comparing the Competition

Before my fellow nerds start heckling me, I know Mac users aren’t the majority of Dropbox’s customers. Windows users significantly outnumber Mac users, and their business won’t collapse if Mac users leave en masse. But like dropping client support for Linux, it’s another sign that Dropbox is starting to slip. You have to wonder what woes might befall Windows customers in due time. After all, Dropbox has yet to ship ARM binaries for Windows, which is a problem if you're using an ARM Windows device like a Microsoft Surface or virtualizing Windows on ARM. If you really want to access Dropbox on an ARM Windows device, you’re forced to use Dropbox’s tablet app, and that’s not quite right for a cursor and keyboard environment.

Amidst all this anguish about clients, I do want to emphasize that Dropbox’s core competency—hosting, storage, and syncing—is still very good. After all, the client might be the most visible part of a cloud-based storage system, but there's still… you know… the cloud-based part. People are willing to put up with a certain number of foibles from a client as long as their content syncs and doesn't disappear, and Dropbox's sync and web services are still top of the line. Considering how long it took Apple to get iCloud Drive to a reasonable level of service, that competency has a lot of value. External APIs bring Dropbox integration to other applications, and if you've still got a standalone 1Password vault, Dropbox will still be useful. All these factors make it hard to disentangle Dropbox from a workflow, and I get why people are waiting and won’t switch unless absolutely necessary.

So what’s the plan? For now, I’ve switched to Maestral, a third party Dropbox client. Maestral runs natively on Apple Silicon and consumes far fewer resources than the official client. While Maestral syncs files just fine, it does sacrifice some features like icon overlays in the Finder. I also signed up for Apple’s 50 gigabyte iCloud plan, and in my mixed Mac and Windows environment it works pretty well. And it’s only a fraction of the price of Dropbox. iCloud’s syncing performance is satisfactory, but it still lags when it comes to workflow. Take a simple action like copying a share link. Apple’s share sheet is fine as far as interfaces go, but I don’t need to set permissions all the time. Just give me a simple right click option to copy a public link to the file or folder, please. As for Google Drive, their client software has been an absolute disaster every time I’ve used it, regardless of whether it’s on Mac or Windows. Microsoft OneDrive seems reasonable so far, but I haven’t subjected it to any kind of strenuous tests. If push comes to shove, I’ll probably go all-in on iCloud.

This is complete overkill when most of the time you just need to copy a public link.

I miss what Dropbox was a decade ago, and I’m sad that it might end this way. It’s not over between us yet, but the passion’s dying. Without a serious turnaround, like a leaner native client and cheaper plans, I’ll have a hard time recommending them. It’s not my first software heartache, and I doubt it'll be my last, but I’d hoped Dropbox would be different. Naive of me, maybe, but Dropbox won’t shed any tears over me. Maybe the number of people I've signed up for their paid service balances out my basic account use over the years. Enthusiasm for Dropbox has all but dried up as they’ve prioritized IPOs and venture capital over their actual users. It’s that old Silicon Valley story—you either die the hero, or live long enough to become the venture capital villain. In the meantime, I’m sure there’ll be another cute utility that’ll catch my eye—and yes, that sounds flirtatious and silly. I began this episode with a “boy meets program” metaphor, but everybody knows that fairy tales are just that—fairy tales. Relationships take work, and that includes customer relationships. If one half isn't upholding their side, maybe it's time to move on.

It's not impossible that Dropbox could win me back... but it's more likely that I'll drop them.

Happy Twentieth Birthday, iMac G4


What is a computer? A miserable little pile of… yeah, yeah, I’ve done that bit before. These days it’s hard for a new personal computer to truly surprise you. When you scroll through a site like Newegg or Best Buy, you’ll see the same old story. Laptops are the most popular form factor, flanked by towers on one side and all-in-one slabs on the other. Old-style horizontal desktops are D-E-D dead, replaced by even tinier towers or micro-PCs. The Raspberry Pi 400 brought the wedge-shaped keyboard computer combo back from the dead, which I appreciate. But seeing a brand new design, something no one else has done before? That’s a rare opportunity indeed.

Hop in the time machine and let’s visit twenty years ago today: January 7th, 2002. The place: a foggy San Francisco, California, where the Moscone Center opened its doors to the journalists and attendees of Macworld San Francisco. This day—keynote day—was a very special day, and Apple CEO Steve Jobs would present all kinds of new and shiny things. Steve warmed up the audience with the announcements of iPhoto and the 14 inch iBook, which was all well and good. As well paced and exciting as these keynotes were, everybody in the audience was waiting impatiently for Steve’s magic words: they wanted One More Thing. I can only imagine how it felt in person, but mortal geeks like me could stream it via QuickTime in all of its MPEG glory. I was virtually there, watching as Steve launched an all-new computer. That was my first exposure to the brand new iMac G4: a pixelated, compressed internet live stream. But even a video crushed by a low bitrate couldn’t obscure this reveal.

A black podium slowly rose from the center of the stage. My brain, poisoned from years of pop culture, imagined an orchestra swelling with the strains of Also Sprach Zarathustra. From within the monolith came a snow white computer that could have been plucked right off the set of 2001. A 15 inch liquid crystal display stood above a hemispherical base, its panel framed by a white bezel with a clear acrylic ring that reflected the stage lighting like a halo. As the podium turned I caught a glimpse of the silver cylinder that connected the two together. Oohs and aahs flowed from the crowd as Steve gently moved the display with only his fingertips. He pulled it up and down, then tilted it forwards and backwards, and even swiveled it from side to side. I didn’t think a screen could perform such gymnastics—it was like the display weighed nothing at all, yet when Steve let go it stayed firmly in place with no wobbles or wiggles. CRTs could swivel and pivot, but adjusting the height usually required plopping them on a stack of old encyclopedias. Other LCDs could only tilt forwards or backwards, including Apple’s pricey Cinema Displays.

Official Apple photo of the iMac G4.

I didn’t have to suffer with low-quality video for long. Apple posted some high-resolution beauty shots of the iMac on their website after the show. Photos couldn’t convey the monitor’s range of motion, but they could show off its unique design. When you look at the gumdrop-shaped iMac G3, you can see its evolutionary connection to the all-in-one Macs that came before it. Those computers were defined by a CRT stacked on top of disk drives and circuit boards, and the cases around these elements were shaped accordingly. iMac G3s were smoother and rounder, but you can see their family resemblance to a Power Mac 5500 or a Macintosh SE. An iMac G4 looks like a completely different species in comparison. It shares more visual design DNA with a desk lamp than the Macintosh 128K.

While iMacs are all-in-one computers, the iMac G4 feels the least all-in-one of them all. A literal all-in-one LCD computer puts everything into one chassis, but the iMac G4 is more of a spiritual all-in-one. Distinct components, like the display and the base, are tied into a cohesive whole thanks to the articulating arm. Jony Ive and his design team wanted to emphasize the natural thinness of an LCD display. So they let the thin display stand on its own, and all the computery bits were housed in a separate hemispherical base. Unusual for sure, but this form did have a function—it allowed for that lovely 180 degree swivel with no risk of bumps. Reviewers and users alike praised the original iMac for its friendliness and approachability, but the new model seemed even more personable.

Steve Jobs really thought Apple was on to something with the iMac’s new design. The new iMac was, quote, “The opportunity of the decade to reshape desktop computers.” Jobs, Jon Rubinstein, Jony Ive, and the rest of Apple’s hardware and industrial design teams knew that flat panel displays would radically change desktop computers. For a long time LCDs were found only on laptops or other portable devices because they were very expensive. Their advantages—less eyestrain, less power draw, thinness—came with disadvantages like slow refresh rates, poor color quality, and small sizes. Panel makers kept iterating and improving their product during the 1990s, slowly but surely chipping away at their limitations while bringing down costs. By the turn of the millennium, flat panels were finally good enough to make a play at the desktop.

The Gateway Profile. Official Gateway photo.

IBM NetVista X40. Official IBM Photo.

The new iMac wasn’t the first all-in-one desktop LCD computer, much like the Macintosh 128K wasn’t the first all-in-one CRT computer. Both the Gateway Profile in 1999 and IBM NetVista X series in 2000 beat Apple to the flat-panel punch. Gateway chose to layer laptop components behind the LCD, turning a nominally thin display into a thick computer. It was still thinner than a CRT all-in-one, but it was slower and more expensive. IBM took a different route with their NetVista X40. Sculpted by ThinkPad designer Richard Sapper, the NetVista X40 evokes Lockheed’s F-117 stealth fighter with its angular black fuselage. Eschewing Gateway’s method of mounting everything behind the LCD, Sapper instead put the big, bulky items in a base and smoothly blended it into the display, forming an L-shaped pedestal. Place it next to the iMac G4 and you can see how Ive and Sapper came to the same conclusion: let each element be true to itself. Where their executions diverge is in the display’s range of adjustability—you can only tilt the NetVista X40’s display forwards or backwards. If you wanted height or swivel adjustments, you needed to shell out two hundred bucks for a Sapper-designed radial arm. Think of the desk-mounted monitor arms you can buy today, except this one suspends the whole computer above your desk.

Steve Jobs called out these competitors indirectly during the keynote by reciting the flaws of slab-style all-in-ones. Glomming the drives and electronics behind the display made for a thick chassis, negating the thinness of a flat panel display. All those components in a tight space generated a lot of heat, which affected the performance of both the computer and the display. Side-mounted optical drives had to run slower, and thinner drives couldn’t burn DVDs either. Previous LCD all-in-ones also placed their ports on the side of their displays, forcing unsightly cables into your field of vision. The new iMac’s design solved all these problems while having a more functional neck than the NetVista X40.

But there was another all-in-one LCD computer that influenced the new iMac, and it came out years before Gateway and IBM’s attempts: the Twentieth Anniversary Macintosh. Coincidentally, this is also the twenty-fifth anniversary of the Twentieth Anniversary Macintosh, which was announced on another January 7th, back in 1997. Nicknamed the TAM, it was the swan song for Robert Brunner, Apple’s chief designer during the 1990s. Brunner’s Industrial Design Group—including Jony Ive—had been experimenting with flat-panel all-in-one designs since 1992 in a project called Pomona. Designers from inside and outside Apple contributed ideas that all shared the same core concept: Apple’s future was an all-in-one flat panel Macintosh. One of these ideas was a Mac sketched by Eric Chan and modeled by Robert Brunner. This design was inspired by and named after Richard Sapper’s Tizio desk lamp, which goes to show how referential all these designers are. You might have seen it before—it was on the cover of the May 1995 issue of Macworld. Tizio was a jet-black Mac with an LCD display attached to its base via an articulating arm—sounds familiar, doesn’t it? After reviewing many wildly different design concepts like Tizio and a Mac shaped like a vintage television, the team settled on a Brunner-designed Mac that resembled a Bang and Olufsen stereo. Jonathan Ive then transformed Brunner’s models into an actual case design, code named Spartacus.

The Twentieth Anniversary Macintosh. Official Apple photo.

When members of the industrial design team finished the first Spartacus prototype in November of 1995, they envisioned it as a $3500 computer. Sure, that’s a premium price, but it was in line with Apple’s other premium products. But when Apple marketing executives saw the twentieth anniversary of the company looming on the horizon, they saw Spartacus as an opportunity. These executives decided to make Spartacus a limited edition collector’s computer, with a maximum production run of 20,000 units. The price ballooned to an outrageous $7499, and for an extra $2500 it would be delivered to your door in a limousine and set up by a tuxedoed technician. All the pomp and circumstance was the wrong way to market this otherwise interestingly designed computer, and the TAM flopped hard.

But the TAM’s outrageous price and marketing stunts are separate from its actual worth as a computer or as a design. From a technical point of view, it was a Power Mac 5500 that borrowed parts from a PowerBook 3400 and crammed them all into a case that looked more like hi-fi equipment than a computer. But the legacy of the Twentieth Anniversary Mac was more than just the computer itself—the process that gave us the TAM also gave Jony Ive and his team valuable experience with materials like aluminum and curved plastic surfaces, as well as new computer aided design techniques. Now that Apple was in a better place at the turn of the millennium, Industrial Design surely wanted another shot at a definitive LCD all-in-one Macintosh. I can imagine a meeting between Jony and Steve where Steve asks “if you could do it again, what would you do differently?” Fortunately, Jony Ive knew the TAM and its history inside and out—remember, he designed the production model. With a second chance to create a definitive LCD all-in-one, Ive and his team took the lessons they learned since designing the TAM and vowed to do it right this time.

iMac G5. Official Apple photo.

During the iMac’s reveal, Jobs predicted that the iMac G4’s beauty and grace would redefine desktop computers for the next decade. Like wishing on a monkey’s paw, Steve’s prediction came true—just not in the way he thought it would. After only two years on the market, the beautiful and graceful iMac G4 was replaced by the iMac G5. The complicated gooseneck was out and a simple aluminum stand was in. All the computer components and the display were crammed into a two inch thick white plastic case. Apple pitched this new design as bringing the iPod’s style to the desktop, but anyone who’d paid attention two years earlier saw this white computer as a white flag. Apple had given up on their radical design and retreated to the safety of a slab. I don’t hate the iMac G5—it’s not an unattractive machine, but I can’t help but feel a little sad about what we lost in the iMac G4.

The M1 iMac with a VESA mount. Official Apple Photo.

Twenty years later, today’s iMacs carry the torch of the iMac G5, not the G4. Even the iMac G3’s radical rainbow color choices are lovingly homaged in the new Apple Silicon design. Where’s the love for the G4’s height adjustable screen? For years the slab-style iMacs have been stuck with tilt-only adjustment, though admittedly they are light enough that you can simply turn the whole computer left and right. Astute listeners and readers won’t hesitate to point out the availability of VESA-mount iMacs. Since the slab iMac’s introduction, Apple has offered the ability to attach the iMac to any standard 100 by 100 VESA mount, like a wall mount or a desk arm. Some models can be converted with an add-on kit, but most require a trip to a fruit stand or an Apple authorized service provider to perform the conversion. Some are just plain stuck with their factory stand configurations. That said, adding a desk-mounted arm does bring back a lot of positional freedom. Alas, a VESA-mounted Mac won’t have the same effortless, soft-touch action as the iMac G4. Without something explicitly designed for the iMac’s weight and balance, it’ll always be a little tight or a little sloppy no matter how much you adjust the tension.

Steve might have cited “fatal flaws” as reasons to avoid an all-in-one slab, but as time went on the iMac G4 revealed its own set of flaws. That wonderful articulating arm was complex and expensive, and it could develop a droop over time. The base wasn’t exactly well ventilated, and the G4 processor ran quite hot. Apple never managed to put the even hotter G5 chips under its dome. But the most fatal of them all was, ironically, the cohesive visual design that made it so special. That free-floating display with its freedom of movement was still bound to the laws of physics. Without sufficient weight in the base to act as an anchor, the iMac could tip over when you push or pull on the screen. Apple only needed a few pounds of ballast to make this design work when paired with its original 15 inch display. But what happens when you attach a larger display?

Comparing the two screen sizes. Official Apple photos.

iMac G4s came in three sizes: 15, 17, and 20 inches, and the latter two were wide-screen ratios. An original 15 inch iMac G4 weighs 21 pounds. Upgrading to a 17 inch widescreen brought the weight up to 22.8 pounds, which isn’t much of a difference. But the 20 inch iMac G4, the biggest of them all, tipped the scales at a staggering 40 pounds—that made it heavier than an old CRT iMac G3! All the extra weight was ballast required to counterbalance the extra large screen size. Imagine how heavy 24 or 27 inch models would be! Another flaw with the 20 inch model was the visual proportions of the display when paired with the base. The same 10.8 inch diameter base supported all three display sizes, and what looked just right with the 15 and 17 inch screens didn’t pair well with the 20 inch. A larger base would consume more space on a desk and cost more to manufacture since it would reduce economies of scale. It’s a danger of making a design centered around solving a singular problem: sometimes it just doesn’t scale.

The iMac G4 might not look like the Mac 128K, but peel back their visual differences and you’ll find a similar philosophical core. All of its pieces work together in harmony to appeal to a more elegant idea of computing. Steve pitched it as the ultimate digital hub, where you’d edit your home movies and touch up your vacation photos, and which would double as your digital jukebox. Part of this was thanks to the G4’s Velocity Engine, but it was also because iMacs are meant to look like a part of your home. Even though it evokes the same kind of glossy-white minimalism you’d find in an art museum, I have yet to see an iMac G4 look out of place whether it’s in a garage, a workshop, or a living room. You were inviting this computer into your home, and the iMac was designed to be the friendliest of guests.

The IBM ThinkPad 701’s trick keyboard let you have a full-sized keyboard on a teeny tiny notebook. Official Richard Sapper photo.

Separating emotions from the iMac G4 is very difficult because it is an emotional machine. It looks like a person and tries to move like one. Even if it died due to practical realities, the world is still a better place for its existence. The iMac G4 joins such illustrious examples as the ThinkPad 701’s butterfly keyboard—the good butterfly keyboard. History is littered with designs like these—great solutions that get left behind because other designs were deemed “good enough.” Or in the case of the ThinkPad 701, the problem it was engineered to solve doesn’t exist anymore. It’s harder to justify a trick keyboard when you can make a laptop with a bigger screen that weighs less than the 701.

I didn’t own one back in the day, but I did procure a well-loved example a few years ago. My iMac G4 lives on more as an ornament than a computer, operating as a digital photo frame and jukebox. Every time I look at it, I get a little wistful and think of what might have been. Somehow the iMac G4 managed to pull off what the G4 Cube couldn’t: it was a computer that was both a work of art and a sales success. Let's raise a toast to the anniversary of this confluence of design and engineering. Twenty years later, the iMac G4 is still the computer that’s the most human of them all.

The Toshiba Satellite Pro 460CDT - Nifty Thrifties

Here in Userlandia: a new home for wayward laptops.

Do you like searching for old tech? Sure, you can try Craigslist, Letgo, or even—ugh—Facebook Marketplace. But if you're really feeling adventurous, there's nothing like a trip to a thrift store. If you're someone who'd rescue a lonely old computer abandoned by the side of the road, then Nifty Thrifties is the series for you. After all, one person’s obsolete is another’s retro treasure. Like most retro enthusiasts, I’m always on the hunt for old junk. My usual thrifting circuit consists of Savers, Goodwill, and Salvation Army stores in the Merrimack River valley of Massachusetts and southern New Hampshire. I leave empty handed more times than I care to admit, but every once in a while fortune smiles upon me and I find something special.

Here’s a recent example. Back in August, I was combing through the usual pile of DVD players and iPod docks in the electronics section at the Savers in Nashua, New Hampshire. It was about to be another regulation day ending in regulation disappointment when two platinum slabs caught my eye. I dug them out and was quite surprised to find two identical Toshiba Satellite Pro 460CDT laptops, tagged at $7 apiece. Dock connectors, PCMCIA ethernet cards, and Pentium MMX stickers pegged their vintage around 1997. Toshiba always made good laptops, and Satellite Pros were business machines aimed at a demanding clientele. Both laptops were in decent physical condition, but they lacked power supplies—hence the low price. Missing power adapters don’t faze me since I have a universal laptop power adapter. Whatever their problems, I figured I could probably make one working laptop out of two broken ones. I happily paid the fourteen dollars total and headed home with my prize.

Not bad, for a machine old enough to drink.

The first order of business when picking up old tech is a thorough cleaning. “You don’t know where they’ve been,” as my mom would say. Although these didn't look too dirty, a basic rubdown with a damp cloth still removed a fair bit of grime. After cleanup comes the smoke test. We begin with laptop A, distinguished by a label on the bottom referencing its previous owner—hello, JG! After a bit of trial and error, I found the correct tip for the universal charger, plugged it in, and held my breath. After a tense moment, the laptop’s power and charge LEDs glowed green and orange. Success—the patient has a pulse!

Confident that the laptop wouldn’t burst into flames, I pressed the power button and waited for signs of life. An old hard drive spun up with a whine, but no grinding or clicking noises—a good sign. Next came the display, whose backlight flickered with that familiar active matrix glow. A few seconds later the BIOS copyright text announced a Chips and Technologies BIOS, a common one for the time. Things were looking good until my new friend finished its memory test. A cursor blinked at me, cheerfully asking: “Password?” My new friend had a BIOS supervisor password! I tried a few basic guesses—Toshiba? Password? 12345?—but JG hadn't been that sloppy. New Friend called me out with a loud beep and shut itself down.

Well, there was always laptop B. I plugged in the charger, the LEDs came on, I powered it up… and got the same result. Both of the laptops had supervisor passwords. Great. Adding injury to insult, laptop B’s display panel had multiple stripes of dead pixels. At least everything else on both computers seemed to be working. I bet they’d boot just fine if I could get around the password. This would be a delicate operation, one that required a light touch—like a safecracker.

Breaking Through The Back Door

Security for personal computing was an afterthought in the early days. Operating systems for single-user home computers were, well, single-user, and didn’t need any permissions or login security. But when laptops were invented, people asked inconvenient questions like "what happens when somebody steals one?” The laptop makers didn't have a good answer for that, so they hastily threw together some almost-solutions, like password-lock programs that ran during OS startup. In MS-DOS land, startup programs or drivers were specified in the autoexec.bat and config.sys files, and there were plenty of ways to bypass them. Even a password program embedded in a hard drive’s bootloader can’t stop someone from booting the computer with a floppy disk. It's like tying your bike to a parking meter with a rope. Inconvenient to defeat, but easy if you know how and have the right tools. There’s got to be a better way!

Well, that better way was a supervisor password. When a PC starts up, the system’s BIOS gets things moving by performing a power-on self test and configuring hardware devices. After finishing its work, the BIOS hands control over to a bootloader which then starts the operating system. A supervisor password check sits between the self-test and hardware configuration stages. If you don’t know the magic word, the BIOS will never finish its startup routine and thus will never start the bootloader. This closes the external storage loophole and ensures that only an authorized user can start the operating system.
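To make that concrete, here's a toy sketch of where the gate sits in the sequence. This is illustrative Python, not firmware, and the password and function layout are invented for the example:

```python
# Toy model of the supervisor password gate. Illustrative, not real BIOS code.
STORED_PASSWORD = "hunter2"   # on the real machine, this lives in non-volatile memory

def bios_boot(typed_password):
    print("power-on self test... ok")
    if typed_password != STORED_PASSWORD:
        print("Password? *beep*")            # halt: the bootloader never runs,
        return                               # so a boot floppy never gets a chance
    print("configuring hardware... ok")
    print("handing off to the bootloader")   # only reachable past the gate

bios_boot("12345")     # wrong guess: stops cold at the prompt
bios_boot("hunter2")   # correct: boots normally
```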

Early supervisor passwords were stored in the battery-backed CMOS settings memory—the very same memory used for disk configuration data and the real-time clock. To clear these passwords, all you had to do was unplug the computer’s clock battery. To close that hole, laptop makers pivoted to non-volatile memory. A password stored in an EEPROM or flash memory chip would never be forgotten even if batteries were removed, went flat, leaked acid, or—as can happen if you're really unlucky—literally exploded. So what kind of lock did my new friends have?

Some light Googling revealed that Toshiba laptops made from 1994 until sometime around 2006 stored the password in a reprogrammable ROM chip on the motherboard. Because Toshiba anticipated users forgetting their supervisor passwords, they included a backdoor in their password system. An authorized Toshiba service tech could convince the machine to forget its password by plugging a special dongle into the parallel port and powering on the locked laptop. Apparently this service cost $75, which is a bargain when you're locked out of a $3000 laptop.

Now, backdoors are generally a bad thing for security. But users and administrators are always making tradeoffs between security and usability. Businesses wanted the security of the password, but they also wanted the ability to reset it. In principle, only Toshiba and its techs knew about the backdoor. But once customers knew that resetting the passwords was possible, it was only a matter of time before some enterprising hacker—and/or unscrupulous former Toshiba employee—figured out how to replicate this. And the backdoor was just one of the Satellite’s security flaws. The hard disk carrier was held in place by a single screw. Anyone with physical access could yoink out the disk and read all its data, since there was no support for full disk encryption. Odds are, Toshiba thought being able to save customers from themselves was more important than pure security.

So how does this backdoor work? It’s actually quite simple—for a given value of “simple.” Toshiba used a parallel port loopback. By connecting the port’s transmit pins back to its own receive pins, the computer can send data to itself. It’s a common way to test a port and make sure all its data lines are working. When the laptop is powered on, the BIOS sends a signal out the parallel port’s transmit pins. If that signal makes it back to the receive pins, the BIOS clears the password stored on the EEPROM and the computer is ready to boot.
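Here's the same trick as a toy model, with the wiring expressed as a mapping from output pins to the input pins they feed. The pin numbers are placeholders; the genuine mapping was Toshiba's secret sauce:

```python
# Toy model of the dongle check. The wiring dict maps each output pin to the
# input pin it feeds; the pin numbers are placeholders, not Toshiba's real mapping.
def power_on(wiring):
    TX_PIN, RX_PIN = 2, 15                    # hypothetical transmit/receive pair
    if wiring.get(TX_PIN) == RX_PIN:          # the signal came back: dongle present
        return "password cleared, booting"    # BIOS wipes the stored password
    return "Password?"                        # bare port: prompt and halt as usual

print(power_on({2: 15}))   # homemade loopback dongle attached
print(power_on({}))        # no dongle
```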

So how would you reset the password without paying Toshiba, given that they stopped supporting these laptops fifteen years ago? Just wire up a homemade loopback dongle! It's easy enough—again, for a given value of “easy.” Multiple websites have instructions for building a DIY password reset dongle. You can cut up a parallel cable, solder some wires together to connect the right pins to each other, and you'll have those laptops unlocked before you know it.

Of course, I didn't actually have any parallel cables I could cut up. That would have been too convenient. Since I only needed this to work once for each machine, I took a page from Angus MacGyver's playbook and connected the pins using paperclips. If you want to try this yourself, just make sure none of the paperclips touch each other, except the ones for pins one, five, and ten. Make sure to unplug the power supply first and wear a grounded wrist strap while connecting the pins. And... well, basically, read all the instructions first.

As with the best MacGyver stories, the paperclips worked perfectly. Once the paperclips were in place, I powered the machines back on, and the password prompts disappeared. Both laptops carried on with their boot sequence and the familiar Windows 95 splash screen graced both displays. I opened the locks, but that was just step one in bringing these computers back to life.

Laptop B—the one with the half-working screen—made it to a working desktop. Unfortunately those black stripes running through the screen meant I needed an external display to do anything useful. Laptop A, which had a functioning screen, was problematic in other ways. It crashed halfway through startup with the following error:

"Cannot find a device file that may be needed to run Windows or a Windows application. The Windows registry or SYSTEM.INI file refers to this device file, but the device file no longer exists. If you deleted this file on purpose, try uninstalling the associated application using its uninstall program or setup program.”

I haven’t used a Windows 9x-based system in nearly two decades, but I still remember a lot from that era. I didn’t need Google to know this error meant there was a problem loading a device driver. Usually the error names which driver or service is misbehaving, but this time that line was blank. I rebooted while pressing the F8 key to start in safe mode—and it worked! I got to the desktop and saw a bunch of detritus from the previous owner. This machine hadn’t been cleanly formatted before it was abandoned, likely because nobody could remember the supervisor password. Safe Mode meant the problem was fixable—but Windows wasn’t going to make it easy.

Microsoft’s impressive ability to maintain backwards compatibility has a downside, and that downside is complexity. Troubleshooting startup problems in the Windows 9x era was part science, part art, and a huge helping of luck. Bypassing autoexec.bat and config.sys was the first step, but that didn’t make a difference. Next was swapping in backup copies of critical system configuration files like win.ini and system.ini, which didn’t help either. With the easy steps out of the way, I had to dig deeper. I rebooted and told Windows to generate a startup log, which would list every part of the boot sequence. According to the log, the sequence got partway through the list of VxDs—virtual device drivers—and then tripped over its own feet. Troubleshooting VxD problems requires a trip to that most annoying of places: the Windows Registry.
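A quick aside on that startup log: it lands in BOOTLOG.TXT at the root of the C drive, and failures are flagged right in the text. If the machine could have run it, a few lines of Python would have done the squinting for me. The "LoadFailed" and "InitCompleteFailed" markers are from my memory of the 9x log format, so treat them as a starting point:

```python
# Scan a Windows 9x BOOTLOG.TXT for drivers that failed to load or initialize.
# The failure markers here match my memory of the 9x log format; adjust as needed.
with open(r"C:\BOOTLOG.TXT", "r", errors="replace") as log:
    for line in log:
        if "LoadFailed" in line or "InitCompleteFailed" in line:
            print(line.strip())
```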

I can understand the logic behind creating the registry. It was supposed to order the chaos created from the sea of .INI files that programs littered across your hard drive. But in solving a thousand scattered small problems, Microsoft created one big centralized one. Even though I know the registry's logic and tricks, I avoid going in there unless I have to. And it looked like I had to. Since the problem was a VxD, I had to inspect every single key in the following location:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\VxD
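Inspecting every key by hand means a lot of clicking. On a modern Windows you could script the sweep with Python's standard winreg module; this is a latter-day sketch of the logic, not something I could have run on the actual Windows 95 box:

```python
# Sweep the VxD services keys and flag any entry missing its StaticVXD value.
# Windows-only sketch; winreg is in the Python standard library.
import winreg

VXD_PATH = r"System\CurrentControlSet\Services\VxD"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, VXD_PATH) as vxd:
    subkey_count = winreg.QueryInfoKey(vxd)[0]       # number of subkeys
    for i in range(subkey_count):
        name = winreg.EnumKey(vxd, i)
        with winreg.OpenKey(vxd, name) as key:
            try:
                winreg.QueryValueEx(key, "StaticVXD")    # the driver path entry
            except FileNotFoundError:
                print(f"suspect: {name} has no StaticVXD value")
```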

After inspecting dozens of keys, I found the culprit: a Symantec Norton Antivirus VxD key was missing its StaticVXD path. Without that path the OS tries to load an undefined driver, and the boot process stumbles to a halt. An antivirus program causing more problems than it solves? Whoever heard of such a thing! I deleted the entire key, rebooted, and New Friend started just fine. Hooray! I landed at a desktop full of productivity applications and Lotus Notes email archives. According to their labels, these laptops belonged to salespeople at a national life insurance company. Don’t worry—I cleaned things up, so all that personally identifiable information is gone. Still, it bears repeating: when disposing of old computers, format the disks. Shred your hard drives if you have to.

Where Do You Want To Go Today?

1997 was an amazing year for technology, or maybe for being a technologist. No one knew then that the merger of Apple and NeXT would change the world. Microsoft and Netscape’s browser war was drawing the attention of the US Justice Department. Palm Pilots were finally making handhelds useful. Sony’s PlayStation had finally wrested the title of most popular game console away from Nintendo. Demand for PCs was at a fever pitch because nobody wanted to miss out on the World Wide Web, and laptops were more affordable and user-friendly than ever before.

If you were looking for a laptop in 1997, what would you buy? Apple was selling the fastest notebook in the world with the PowerBook 3400C, but if you couldn’t—or wouldn’t—run Mac OS, that speed wasn’t helpful to you. DOS and Windows users were reaping the benefits of competition, with big names like IBM, Compaq, Dell, HP, and of course Toshiba, dueling for their dollars. Most buyers were shopping for midrange models, and Toshiba aimed the 1997 Satellite range directly at these Mister Sensible types. The lineup started with the Satellite 220CDS at $1899 and topped out with the 460CDT at $3659 according to an October 1997 CDW catalog. That works out to $3,272 to $6,305 in 2021 dollars. The Satellite family featured similar cases, ports, and expansion options across the lineup. What differentiated the models were case colors, types of screens, CPU type and speed, the amount of memory, and available hard drive space.

If you had the scratch for a 460CDT, you scored a well equipped laptop. The bottom-line specs are all competitive for the time: a 166MHz Pentium MMX processor, 32 megabytes of RAM, and a staggeringly huge two gigabyte hard drive. CD-ROMs were standard equipment across all of Toshiba’s Satellite laptops, though there wasn’t enough room for both a floppy and CD-ROM drive at the same time. Don’t worry, because the SelectBay system allowed the user to quickly swap the CD-ROM for a floppy drive, hard drive, or a second battery. Multimedia games and PowerPoint presentations were no problem thanks to integrated stereo sound and 24-bit true color Super VGA video output.

Despite all these standard features, laptops of 1997 were still significant compromises compared to their desktop counterparts. Active matrix color TFT screens looked beautiful—but only if your eyes stayed within a narrow viewing angle. Trackpoints and trackpads may have kicked trackballs to the curb, but most users still preferred a mouse when at a desk. Memory often came on proprietary boards, hard drives were smaller and more fragile, and PCMCIA cards were expensive. Power management features in Windows laptops were rudimentary at best—standby never worked very well and it drained the battery faster than a Mac’s sleep function. But this was the tradeoff for portability. To us, today, it's obvious that these are significant disadvantages. But back then, they were top of the line. Think about the average laptop buyer in 1997: mobile IT professionals, road warrior businesspeople, and well-off college students. They were not just willing, but eager to accept these compromises in the name of true portability.

In their prime, these laptops were beloved by demanding business users. Today they’re worth only a fraction of their original price tags, fated to rot in an attic or get melted down by a recycler. So if you stumbled across one in the wild, why would you grab it? Well, it turns out these laptops are decent retro gaming machines. It’s a bit ironic, because serious gamers in 1997 wouldn’t touch a laptop. But hear me out—for playing MS-DOS and Windows 95-era games, these machines are a great choice.

Most laptops of this era fall into a Goldilocks zone of compatibility. A Pentium MMX-era PC can still natively run MS-DOS along with Windows 95, 98, or even NT 4.0. Windows is still snappy and responsive, and demanding DOS games like Star Wars: Dark Forces are buttery smooth. Unlike most older laptops, these Toshiba models have built-in SoundBlaster-compatible digital sound with a genuine Yamaha OPL-3 synthesizer for authentic retro music. Though it lacks a 3D accelerator, the Chips & Technologies graphics processor supports your favorite DOS video modes and has good Windows performance. There’s even a joystick port, although granted, it requires an adapter. External video is available (and recommended), but the LCD panel can run both in scaled and unscaled modes, giving some flexibility compared to laptops that are forced to run 320x240 in a tiny portion of the panel.

Running games from across all these eras was painless—again, for a given value of “painless.” I tried my favorite DOS games first: Doom 2 and Warcraft 2. Blasting demons and bossing peons around was effortless on this Pentium machine. Windows and DOS versions of SimCity 2000 ran A-OK, though the FM synth version of the soundtrack isn’t my favorite. But this CD-ROM machine was made for multimedia masterpieces like You Don’t Know Jack, and announcer Cookie Masterson came through crystal clear on the built-in speakers. The most demanding game I tried, Quake, still ran acceptably in software rendering mode. For seven bucks, this is one of the best retro values I’ve ever picked up—and I have two of them! It’s a testament to Toshiba’s history as an innovator in the portable space that these machines still work this well twenty-five years on.

The Toshiba Satellite Legacy

Toshiba’s been a leading Japanese heavy manufacturing concern for over a century. Like Sony, their name is on so many products that it’s probably easier to list what they don’t make. With a history in computing stretching back to the mainframe era, and their expertise in consumer electronics, Toshiba personal computers were inevitable. After designing a few microcomputers of their own, Toshiba joined Microsoft and other Japanese electronics companies to form the MSX consortium. Toshiba’s MSX machines were perfectly fine, but they were mostly known only in Asian markets. If they wanted to compete on the global stage, they’d need to bring something unique to the table.

Everything changed for Toshiba in 1985 when they introduced the T1100, one of the first laptop computers. Toshiba liked to hype up the T1100 as “the first mass market laptop,” which is true from a certain point of view. It’s not the first clamshell laptop—that honor belongs to the GRiD Compass. Other clamshell-style machines followed suit, like the Sharp PC-5000 and the Gavilan SC. Don’t forget the Tandy TRS-80 Model 100 either, which was just as much of a laptop despite a flat slab chassis. So what did Toshiba bring to the table?

Each of those predecessors had some kind of compromise. The GRiD Compass was the first clamshell, but since it didn’t have a battery, its portability was limited to wherever you could plug into a power socket. Gavilan and Sharp’s offerings had batteries, but both machines had compromised displays that could only show eight lines of text at a time. What about operating systems? GRiD wrote a custom operating system for its PCs, while Sharp and Gavilan used MS-DOS. But they weren’t fully MS-DOS compatible, because MS-DOS software expected a 25-line display instead of that measly 8. The T1100 managed to beat them all by having a 25-line display, battery power, an integrated 3.5 inch floppy drive, and full MS-DOS compatibility.

Weighing in at 8.8 pounds, the T1100 was also the lightest of the first battery-powered clamshells. Toshiba’s PC engineers pitched it as a go-anywhere machine for a demanding user, but according to project leader Atsutoshi Nishida, Some Toshiba Executives Who Would Rather Not Be Named had their doubts about whether there was a market for something so expensive. The T1100 met Nishida’s first-year sales target of ten thousand units in Europe, proving that MS-DOS portable computers didn’t have to be back-breaking suitcase-sized luggables.

In 1989, Toshiba introduced the first super-slim, super-light notebook computer. They dubbed it Dynabook—the name computer pioneer Alan Kay had suggested for an always-connected, take-anywhere computer. The chief of Toshiba’s computer division, Tetsuya Mizoguchi, easily secured that name in European markets. Japan and the US were more difficult, because some other companies had trademarked that name already. In Japan, that was the ASCII Corporation. Mizoguchi called the president of ASCII, Kazuhiko Nishi, and secured a license for the Dynabook name. Unfortunately, Mizoguchi didn’t have those special connections in America. Because Toshiba wouldn’t—or couldn’t—cough up the licensing fees, models for the US market omitted the Dynabook name.

Steve Jobs running OpenStep on a Toshiba Tecra laptop.

Toshiba maintained a leadership position in the laptop market despite competition from the likes of Compaq, Dell, and IBM because they pushed the envelope on power and features. Toshiba laptops were some of the first to feature hard drives, lithium ion batteries, CD-ROM drives, PCMCIA card slots, and more. When NeXT was in its post-hardware days, Steve Jobs ran OpenStep on a Toshiba laptop, and it’s hard to find a better endorsement than that.

By the mid-nineties, competition in the laptop sector was stiff. Toshiba adapted to changing times by creating multiple product lines to attack all levels of the market. The Satellite and Satellite Pro series were the mainstream models, preferred by perpetrators of PowerPoint for their rugged construction and balanced feature list. If you desired something less weighty, the compact Portégé subnotebook gave you the essentials for portable computing in a smaller, lighter package. If the Portégé was still too big, you could try the Libretto: a petite palmtop with paperback proportions packing a Pentium-powered punch. Lastly, there’s the Tecra series. As Toshiba’s desktop replacements, Tecras had the biggest screens, the fastest processors, and a veritable Christmas list of features. All it cost you was most of your bank account and a tired shoulder from lugging all the weight around.

This strategy served Toshiba well for nearly two decades, but you know what they say about all good things. You might’ve seen the news in 2020 that Toshiba left the laptop market. Like IBM selling its PC business to Lenovo in 2005, Toshiba decided to call it quits after years of cutthroat, low-margin business. The first sell-off was in 2018, when Sharp purchased an 80% share of Toshiba’s Dynabook division. Two years later, Sharp bought the remaining 20%, completing Toshiba’s exit from the market. What used to be Toshiba laptops now bear the Dynabook name everywhere, not just in Japan.

It’s not like Toshiba hadn’t faced competition before. There were just as many companies making laptops in 1997 as there were in 2018. We still have the old stalwarts like Dell, Sony, and HP, and though the label now says Lenovo, the ThinkPad is as popular a choice as ever. Don’t forget Apple’s still sniping at all of them too. Old names like WinBook, AST, Micron, and NEC may have fallen by the wayside, but Asus, Acer, MSI, and Razer have taken their place. The field’s just as crowded today as it was back then. So why did Toshiba bail out of the market they helped create?

Like IBM before them, Toshiba simply decided that they’d had enough of chasing razor-thin margins in a cutthroat market. Their money could be better spent elsewhere. Business gotta business, I suppose. Seeing Toshiba exit the laptop market is like seeing Minolta leave the camera business. These companies were innovators that changed the very core of their markets, and seeing them fall by the wayside breaks my heart. In the case of Minolta, they wisely sold their camera division to another company with a history of innovation: Sony. Every Sony Alpha and RX series camera sold today has some Minolta expertise inside. I can only hope that Sharp carries the legacy of Toshiba to new heights.

The future may be uncertain, but when it comes to the past, Sharp might be all right. Dynabook’s website has a wealth of drivers, spec sheets, and knowledge base articles for decades-old computers. Go ahead and try to find drivers for a Compaq Armada of similar vintage on HP’s website—yeah, try. Most manufacturers are terrible about keeping any kind of support for vintage machines online, so major props to Toshiba, and now Dynabook, for keeping that support around long-term.

I didn’t own a Toshiba laptop back in the day, but I’ve always had a lot of respect for what they could do. Or at least, respect for what they could do, according to the tech journalists in PC/Computing magazine. Part of the fun of reviving these retro relics is experiencing first-hand the things you lusted after and seeing if the reality lives up to the legend. Thanks to a little effort and a little luck, I was able to appreciate these machines for a fraction of their eBay prices. These Satellites are welcome in my orbit anytime.

The Mystery of Mac OS’ Mangled Image Interpolation Implementation

Here in Userlandia, I’m talking rainbows, I’m talking pixels.

Bugs. Glitches. Unintended consequences. Computer software, like everything made by us imperfect humans, is full of imperfections of its own. When weird things happen, most people just mutter and/or swear. But I’m one of the few who feels compelled to learn why. When there’s something strange in the Network Neighborhood, I’m the one you call. But there’s nothing supernatural about software. Computers do exactly what they’re told, like a vexatiously literal genie. It’s not always obvious why bad things happen to good programs. And, as with any whodunit, the explanations may only be obvious in retrospect.

One such mystery crossed my path back in June. I ran into an interesting thread on one of my usual Mac haunts: Ars Technica’s Macintoshian Achaia forum. Forum user almops was having a weird problem with Keynote. When a specific PDF was placed into Keynote, its contents—a series of colored squares—became a smooth rainbow gradient! Don’t get me wrong, rainbows look cool, but they’re not helpful when you need distinct solid blocks of color. The PDFs in question had been created by a suite of command-line apps called generic-mapping-tools, or GMT, which generates maps and map accessories… like color bars. Almops said Adobe Acrobat displayed the PDF correctly, as did Chrome, and PDF viewers on other operating systems. Anything Apple, on the other hand—be it iWork, Preview, or Safari—displayed those color blocks as a gradient, ruining his presentation.

When I saw that thread, I knew I had to tackle the mystery. It’s the kind of obscure problem that calls for my very particular set of skills, skills I acquired over a long career. For fifteen years I worked for OEMs in the graphic arts industry—more specifically, in workflow software. These applications do the hard work of managing color, rasterizing vectors, and compositing transparencies so designs can be put on paper, film, or plates. I was part of the QA teams for these companies, where I designed features, sniffed out bugs, and figured out why things go sideways. This wasn’t the first time I’d seen an interpreter mangle something beyond recognition, but there’s almost always a way to work around it. I requested a copy of the problem file, and almops sent along both the PDF they imported into Keynote and the PostScript file used to generate said PDF. Concealed in those files was code that could clarify this calamitous conundrum of colorful confusion. Time to put on the deerstalker cap and do some old-fashioned detective work.

Layers of Quartz

This mystery revolves around Quartz, the display engine at the heart of Apple’s operating systems. Every copy of Mac OS (and iOS) uses Quartz to draw and composite on-screen graphics. The special thing about Quartz is that its programming model is based on PDF. That's why Mac OS applications can import PDFs into their documents without needing to roll their own PDF import routines. This is a legacy inherited from Mac OS X’s predecessor, NeXTSTEP. Though Mac OS’s Quartz is very different from NeXT’s Display PostScript, both systems are designed to bring the flexibility and fidelity of a print-oriented graphics model to a computer display.

Display PostScript had a lot of intricacies and gotchas—and I’m not even talking about the licensing fees. NeXTSTEP’s window server was a Display PostScript interpreter which executed PostScript code to update the display. When NeXTSTEP was remodeled into Mac OS X, Apple replaced Display PostScript with the Quartz display model. Quartz isn’t just a renderer—it’s a complete technology stack. One facet is Quartz 2D, better known today as Core Graphics. Quartz 2D is the graphics framework that does the hard work of drawing and rasterizing the contents of your windows. Those graphics are then passed on to the Quartz Compositor—also known as Mac OS’ Window Server—which composites all the windows together into a complete computer display.

Separating rendering from compositing was the trick that let Mac OS X build compatibility for legacy graphics and lead us into the future. Now the OS could easily combine the results of very different graphics APIs. Quartz 2D and the Cocoa framework were the way of the future, but apps built using the Carbon framework could carry over QuickDraw routines from classic Mac OS. QuickTime and OpenGL could render video and 3D graphics. Quartz Compositor combined the results from all these graphics libraries into one coherent display. Another advantage of this model was its extensibility—new libraries and APIs could be added without reinventing the entire display model, something that was very difficult to do in classic Mac OS.

An average user on the web might say “I’m not a developer. Why should I care what Quartz 2D can do for me?” Well, being able to print anything to a PDF file in Mac OS without shelling out big bucks for a copy of Adobe Acrobat Pro is pretty big. So is being able to import a PDF into almost any application. And since PDF is a direct descendant of PostScript, it’s still code that needs to be interpreted by something to display a result. That something could be a viewer application, like Adobe Acrobat, PDFpen, or PDF Expert. It could be an editor, like Callas PDFToolbox, Markzware FlightCheck, or Enfocus Pitstop Pro. Or it could be a renderer, like Adobe PDF Print Engine, Global Graphics Harlequin, or Quartz 2D. Because PDF is a codified standard, all of these applications adhere to the rules and principles of that standard when interpreting PDFs. Or, at least, that’s what’s supposed to happen.
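If you want to see how low the barrier is, here’s a minimal Swift sketch of what importing a PDF looks like when Quartz does the heavy lifting. The function and its parameters are my own illustration, but the Core Graphics calls are the real public API:

    import Foundation
    import CoreGraphics

    // Draw the first page of a PDF into any Quartz context: a window,
    // a bitmap, even another PDF. Quartz 2D is the interpreter here.
    func drawFirstPage(of url: URL, in context: CGContext, fitting target: CGRect) {
        guard let document = CGPDFDocument(url as CFURL),
              let page = document.page(at: 1) else { return } // pages are 1-indexed

        // Fit the page's media box into the target rectangle...
        context.concatenate(page.getDrawingTransform(.mediaBox, rect: target,
                                                     rotate: 0, preserveAspectRatio: true))
        // ...and let Quartz interpret and rasterize the page's contents.
        context.drawPDFPage(page)
    }

That’s the whole trick: any app with a CGContext gets a PDF interpreter for free.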

An example of banding.

Almops’ PDF problem was perplexing, that’s for sure. My first theory was a blend detection bug. Making gradients in older versions of PostScript and PDF wasn’t easy. In PostScript Level 1 and 2, gradients were built from an array of paths of varying color values. Think of it like arranging a series of color slices that, from a distance, look like a smooth gradation. There were a lot of problems with this, of course—too many slices, and the interpreter would run out of memory or crash. Not enough slices, and it would show hard color edges instead of a smooth blend. This is called banding, and it looks really awkward. Most interpreters detected these arrays as blends and post-processed them to improve their smoothness. Since the introduction of PostScript Level 3, making a gradient in an application is super easy. Set the start and end points along with the number of colors in-between, and ta-da—your PDF or PS file has an actual gradient object called an shfill. But there’s still plenty of old-school Level 1 and 2 blends out there, and maybe that’s what Quartz thought almops’ color bar was.
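Quartz has first-class gradient objects of its own, which is part of why a special blend-detection path seemed plausible to me. For the curious, here’s a hedged sketch of the modern way in Core Graphics: one gradient object, rasterized at device resolution, no slices required. The colors and geometry are arbitrary.

    import CoreGraphics

    // Quartz's analog to a PostScript Level 3 smooth shade:
    // a single gradient object instead of hundreds of thin fills.
    func drawSmoothBlend(in context: CGContext, over rect: CGRect) {
        let colors = [CGColor(red: 1, green: 0, blue: 0, alpha: 1),  // red
                      CGColor(red: 0, green: 0, blue: 1, alpha: 1)]  // blue
        let locations: [CGFloat] = [0.0, 1.0]
        guard let gradient = CGGradient(colorsSpace: CGColorSpaceCreateDeviceRGB(),
                                        colors: colors as CFArray,
                                        locations: locations) else { return }
        // The interpreter computes every in-between color at output resolution.
        context.drawLinearGradient(gradient,
                                   start: CGPoint(x: rect.minX, y: rect.minY),
                                   end: CGPoint(x: rect.minX, y: rect.maxY),
                                   options: [])
    }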

This theory was quickly disproven when I used Pitstop Pro’s inspector to examine individual objects. I discovered that they weren’t a series of fills, but an image! This couldn’t be—what would cause an image to transform into a gradient? An image should just be an image! Unlike a vector object, which needs to be rasterized, an image is just a series of pixels! All it needs is scaling to render at the appropriate size. What could possibly have happened to transform these chunky blocks of color into a smooth gradient?

I needed to look closer at the image’s details. I’m not talking about zooming in—I wanted to see the metadata attributes of the image. Once again, it's Pitstop’s inspector to the rescue. It was an RGB image, eight bits per pixel, and four inches tall by four tenths of an inch wide. In pixels, it was ten pixels tall by one pixel wide, giving an effective DPI of about two and a half... wait, what? ONE pixel wide?! I opened the image in Photoshop, and confirmed the ghastly truth: Almops' image was a single pixel wide. At one pixel wide by ten pixels tall, each pixel was a single block in the color bar. The rainbow, I realized, was the result of Keynote upscaling the lowest-resolution image possible.

Resolving Power

Why does resolution matter? If you’ve ever taken a photo from a random website, sent it to your printer, and been horrified by its lack of sharpness, congratulations—you’ve fallen prey to a low-res image. Computer displays historically have low resolution compared to printers, much to the consternation of graphic designers, typographers, tattoo artists, cake decorators, or anyone who just wants a high-fidelity image. An image designed for screens doesn’t need as much pixel resolution as one that’s going to be printed, because screens can’t resolve that much detail. Files used for printing often require three to four times the resolution that your monitor is capable of displaying! So how can we put a high resolution image in a page layout or drawing application, and be sure it’ll be printed at full resolution?

That's where device-independent page description languages like PostScript and PDF come in. These languages bridge the gap between the chunky pixel layouts of a display and the fine, densely packed dots of a printer. By describing the logical elements of a page—like shapes, text, and images—as a program, PostScript and PDF abstract away messy device dependencies like pixel grids. It’s up to an interpreter to rasterize PostScript or PDF objects into a format the device can understand.

Some PostScript code describing an image. An interpreter must parse this code to render it for an output device.

Remember, pixels don’t tell you anything about the physical size of an image. How big is a six-hundred-by-six-hundred-pixel image, for instance? On a six-hundred-DPI printer... it's one square inch. One very crisp and sharp square inch, because your eye can't see the individual pixels. But if you opened that same image on a one-hundred DPI computer monitor, it would display at six inches by six inches... with very obvious individual pixels. So if you wanted it to show as one square inch on both the monitor and the printer, there has to be some way to tell both the computer and the printer how large the image should be.

Well, that way is the DPI value. Take that same six hundred by six hundred pixel image mentioned earlier, set its DPI to three hundred, and a page layout application will size it at two inches by two inches. A printer will also know that image should be two inches by two inches, and it'll paint the source pixels into the device pixels, after which ink pixels will embed themselves into paper pixels, so that you can look at it with your eyeball pixels. We could scale the image up or down, but that will make the DPI go down or up. The more pixels you can pack into the same area, the sharper the image will look when printed. This isn't the same as making it bigger. If you make the image bigger but don't have more pixels to back that up, you won't get more detail no matter how many times you yell ENHANCE at the computer. 
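If you like seeing the arithmetic spelled out, here’s the whole relationship in a few lines of Swift. Consider it a toy sketch:

    // Physical size is just pixels divided by DPI, and effective DPI
    // is just pixels divided by physical size.
    func inches(pixels: Double, dpi: Double) -> Double { pixels / dpi }

    inches(pixels: 600, dpi: 600)  // 1.0 inch on a 600 DPI printer
    inches(pixels: 600, dpi: 100)  // 6.0 inches on a 100 DPI display
    inches(pixels: 600, dpi: 300)  // 2.0 inches once the image is tagged 300 DPI

    // And almops' color bar: ten pixels spread over four inches.
    let effectiveDPI = 10.0 / 4.0  // 2.5 DPI. Ouch.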

Given the barely-there resolution of almops’ image, I wondered what would happen if it got a bit of help. I opened the image in Photoshop and resampled it to 100x1000, using the nearest neighbor algorithm to preserve its hard pixel edges. I saved my edits, updated the PDF, and reopened it in Preview. The gradient was gone! I was greeted with a nice column of colors that looked just like the original file did in Acrobat. Case closed, mystery solved! I posted a theory for the rainbowfying in the thread:

My guess is that when Quartz sees images like this, it has a special handling exception. Quartz creates a replacement true gradient blend with those pixels as the control points of the blend. My hunch is that this is used somewhere in Quartz for UI drawing performance reasons when using small raster elements, and because Preview is a Quartz renderer, well...

Trust me—if you eat, sleep, and breathe Mac graphics software, it almost makes perfect sense. No other viewer was doing something like this, so Quartz had to be doing something special and unusual. I even helped almops tweak their software to output a file that would never rainbow again—but we’ll come back to that later.

Objection!

As the weeks went by, I gradually lost confidence in this theory. I just couldn’t shake the feeling that there was a simpler explanation. The gradient shortcut theory sounded right, yes, but what evidence did I actually have? After all, the first version of Quartz was PDF version 1.4 compatible, and PDF had added support for gradient shfill objects back in PDF version 1.3. Why, then, would Apple use one-pixel strips as a shortcut for gradient generation? That didn’t make any sense. What was I missing? I had to reopen the case, reexamine the evidence, and figure out the truth.

What’s the piece of evidence that will blow this case wide open?

I compared myself to Holmes earlier, and maybe that was wrong too. No, maybe I’m more like Phoenix Wright, from the Ace Attorney games. Ace Attorney is about finding contradictions. You comb through crime scenes, present your evidence, and examine witness testimony. Even when you think you’ve found the culprit, your reasoning and deductions are constantly challenged. I had to accept that my initial conclusion could be wrong and look at the case from another angle—just like Phoenix Wright.

I recalled some complaints that Mac OS’ Preview application made certain images look blurry. Could that be related to the rainbow gradient problem? I opened a PDF file containing some classic Mac OS icons—first in Preview, then in Acrobat Pro. These icons were only 32 pixels by 32, but they were scaled up to fill a page. Acrobat displayed clean, sharp pixels while Preview was a blurry mess—a tell-tale sign of bilinear interpolation. I opened that one-pixel-wide color-bar image and resampled it to 100 pixels by 1000, but this time I used the bilinear algorithm. The result was a familiar rainbow. That’s when it hit me—Preview wasn’t using a simple nearest-neighbor transformation, it was using a bilinear algorithm to smooth out the color values! How could I have missed this? It was right there the whole time! I sure hope somebody got fired for that blunder.

The last piece of the puzzle was to check whether Quartz 2D was in fact modifying the image contents, or just displaying them with a filter. I dumped Quartz 2D’s output to a PDF file, using Mac OS’ built-in print to PDF function. I cracked the new file open with BBEdit, and scrolled to the image dictionary to examine the code. The image was still defined as one pixel wide by ten pixels tall, and it was still the same physical size. But there was a new wrinkle: when Preview interpreted the PDF, it added the interpolate flag to the PDF’s code and set it to true. I opened this new file in Acrobat Pro, and sure enough, there was a rainbow gradient instead of solid blocks of color. I’d cracked the case, just like Phoenix Wright when—spoiler for the tutorial—he realized the clock wasn’t three hours slow, but nine hours fast! Cue the dramatic courtroom music.

Interpolation Interpretation

I hadn’t thought about the interpolate flag in years! But Quartz 2D is a PDF interpreter, and I should’ve known it was a possibility. Because PostScript and PDF are device independent, it’s up to the interpreter to scale the source pixels of the original image to the appropriate device pixels. Almops’ color bar consists of ten color swatches, each made of one image pixel and physically sized at four tenths of an inch. When viewed on a 100 DPI computer monitor, it would take forty device pixels to render one of those image pixels at the requested size. So where do all these new pixels come from?

Why, the computer makes them up, using the PIDOOMA method: Pulled It Directly Out Of My... uh, Algorithm. To scale one image pixel to forty device pixels, the PostScript or PDF interpreter uses a matrix transformation. Think of it like the paint bucket tool in an image editor—the interpreter samples the nearest source pixel’s color values and paints those values into the required device pixels. The interpreter calculates all the necessary values with a simple function that consumes a minimal amount of CPU cycles. Sounds great, doesn’t it—but that efficiency has a cost, and the cost is image quality. If you’ve ever resized an actual photo using Photoshop’s nearest neighbor algorithm, you know what I mean. When upscaling, continuous tone images like photographs look blocky or show jagged edges. When downscaling, fine details are smudged out, and you can get artifacts like moiré—that weird screen-door effect in repeating patterns.

To solve these problems some very smart mathematicians invented resampling algorithms to smoothly resize raster images. If you’ve ever looked at what Photoshop’s menus actually say, you might recognize terms like nearest neighbor, bilinear, and bicubic—they’re all different ways of filling in those missing pixels. Nearest neighbor is great for images that need hard edges, like retro video game sprites, but as mentioned earlier, it’s not great for images that need smooth color transitions. Bilinear is better for continuous tone images because it blends the four nearest pixels (a two-by-two neighborhood) into smooth color transitions. Bicubic is even better for photos because it samples a wider four-by-four neighborhood of sixteen pixels, creating a sharper image at the cost of more processor power. Wouldn’t it be cool if the printer’s interpreter could apply these fancier algorithms when scaling images to print them, so you wouldn’t have to open Photoshop every single time? Then all our photos would be as smooth as the music of Steely Dan!
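Quartz exposes this exact menu of tradeoffs to developers through a context’s interpolation quality setting. Here’s a sketch of my Photoshop experiment redone in Swift, upscaling a tiny image with hard edges versus smoothing. The Core Graphics calls are stock; only the scenario, with our one-pixel-wide color bar as the source image, is assumed:

    import CoreGraphics

    // Upscale a tiny image two ways: nearest neighbor vs. smoothed.
    // `source` is assumed to be the 1x10 color bar, decoded as a CGImage.
    func upscale(_ source: CGImage, smooth: Bool) -> CGImage? {
        let width = 100, height = 1000
        guard let context = CGContext(data: nil, width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: 0,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }

        // .none means nearest neighbor: ten crisp, distinct swatches.
        // .high lets Quartz blend neighboring pixels: hello, rainbow.
        context.interpolationQuality = smooth ? .high : .none
        context.draw(source, in: CGRect(x: 0, y: 0, width: width, height: height))
        return context.makeImage()
    }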

Downsampling comparison

The original image has been downsampled using nearest neighbor and bicubic methods. Notice the lack of jaggies on the bicubic example.

Adobe heard the demands for smoothness. They released the new and improved PostScript Level 2 in 1990, which added support for color graphics. Level 2 also added countless improvements for image objects, like the interpolate flag. Setting an image dictionary’s interpolate flag to true tells the interpreter to resample the image using a fancier algorithm like bilinear or bicubic. Even if your file had the flag set to false, you could override it at any time if the interpreter had options like “enable image smoothing.” Or the renderer could just ignore the flag entirely. The PDF and PostScript specs grant a lot of leeway to the interpreter in how it, well… interprets the interpolate flag. To wit, the PostScript Level 3 reference guide has this note at the end of interpolate’s definition:

Note: the interpolation algorithm is implementation-dependent and not under PostScript program control. Image interpolation may not always be performed for some classes of image or on some output devices.

A similar note can be found in the PDF reference guide.

NOTE: A conforming Reader may choose to not implement this feature of PDF, or may use any specific implementation of interpolation that it wishes.

This explains the difference between Adobe Acrobat and Apple’s apps. Acrobat follows the spec’s defaults: if the image object lacks an interpolate flag, Acrobat won’t apply any fancy algorithms when upscaling the image. When the flag is set to true, Acrobat applies bilinear interpolation, which averages the values of adjacent pixels together when scaling the image. This blurs the single pixel values together and creates—you guessed it—a smooth rainbow gradient.

Acrobat respecting the PDF interpolate flag.

The original PDF file didn’t have any interpolate flags set, but Preview interpolated all images anyway—which, per the reference guide, it’s completely allowed to do. But what if I set the flag to false? I opened almops’ original PDF in BBEdit, added an interpolate flag with a value of false, saved it, and reopened the file in Preview. No dice—it was the same old rainbow gradient. Preview doesn’t care whether the flag is missing or false—it will always interpolate.
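For reference, the flag in question lives right inside the image’s dictionary, where any text editor can see it. Here’s a stripped-down sketch of what an image XObject like almops’ might look like in raw PDF with interpolation switched on; the object number is arbitrary and the stream data is omitted:

    10 0 obj
    << /Type /XObject
       /Subtype /Image
       /Width 1
       /Height 10
       /ColorSpace /DeviceRGB
       /BitsPerComponent 8
       /Interpolate true    % the flag Preview adds (and ignores when false)
       /Length 30           % 1 x 10 pixels x 3 bytes of RGB
    >>
    stream
    …30 bytes of pixel data…
    endstream
    endobj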

I should’ve expected as much because Apple frequently uses interpolation in its own apps. Keynote, Numbers, and Pages apply interpolation to any images placed in your documents. Same goes for using Preview to view PDFs with embedded images. Images in Safari are interpolated when they’re scaled, usually because they lack high-res alternates. Parts of the operating system are constantly scaling, like growing icons in the Dock or dynamically scaled windows in Mission Control. Without interpolation, all those actions would be a rough, jagged mess. But does it make sense to always interpolate images in apps like the iWork suite? After all, look what happened to almops. Luckily, there is a way for almops to create PDFs that won’t go all rainbow in Keynote.

The Fix is In

If this were a one-off problem that wasn’t likely to happen again, I would just edit the image in the PDF, resize it with nearest neighbor to 100x1000 pixels, save the file, and call it a day. But that would just be a band-aid—I wanted a cure. After some research, I found a promising solution. Remember how back at the beginning I mentioned that these color bars were created by a program called GMT, or generic-mapping-tools? GMT is an open source library of command-line tools for generating maps and map-related graphics, and a major feature is its scriptability. Unlike iWork or Preview, GMT has a lot of knobs to turn.

I knew nothing about GMT, so I Googled “GMT psscale options” and the first hit was the command’s official documentation. Turns out that there’s a flag for psscale that determines how it writes out the color bar! Everything hinges on the -N flag and its arguments. The first helpful argument is P. When the P argument is passed, psscale draws the color bar components as a series of vector squares instead of as an image. This is the perfect solution for this scenario, because vector objects are paths made out of points connected by curves or lines. Because they’re math and not pixels, vectors are infinitely scalable, and drawn at the device’s output resolution.

So if this option is available, why would you want to generate a color bar as an image? GMT recommends using an image for gradients—my guess is that they don’t write smooth shades as shfill objects. Luckily, the other argument is a DPI value, which does exactly what you think it does. When set, psscale will generate the image at the requested effective DPI. So if you need an image, you can pass something like -N600 and it’ll generate the color bar at 600 DPI. Some interpreters also handle color management on raster versus vector objects differently, but that’s a problem for its own episode. Lastly, if you’re using GMT’s Modern mode and you stumble upon this same problem, the same -N flag and arguments exist for the colorbar command.
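To put it all together, a classic mode invocation might look something like the sketch below. The CPT file name and the -D placement values are placeholders, and the exact -D syntax varies between GMT versions, so check the docs for yours:

    # Draw the color bar as vector rectangles, immune to interpolation:
    gmt psscale -Crainbow.cpt -Dx0/0+w4i/0.4i -Np > colorbar.ps

    # Or, if you really need an image, render it at a sane resolution:
    gmt psscale -Crainbow.cpt -Dx0/0+w4i/0.4i -N600 > colorbar.ps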

The Final Cut

Well, there it is. Mystery solved—at least, for almops. I’d still like to talk to whoever at Apple decided to force all images to interpolate in most of their own apps, without exceptions for small images. I know, I know—exceptions are a rabbit hole that’ll leave somebody unhappy. If I were to file a bug radar or feedback about this behavior, it’d likely be closed with a “works as designed, won’t fix.” An anticlimactic end to an otherwise enjoyable investigation.

No matter how strange or inexplicable, there’s always a rational explanation—or, at least, an explanation—for why a piece of software behaves the way it does. Even the gnarliest of bugs—the ones that crash your computer and ruin your day—can be explained. It only takes the will to decipher the clues, and maybe a little stack tracing. What separates a bug from a glitch or unintended consequence? To someone unfamiliar with the fiendishly clever intricacies of software development, almops’ rainbow problem seems like a bug. Show the rainbow problem to a developer or product manager, and you’d get a different answer.

That’s why some of your software annoyances can hang on for so long. In the case of Preview and other Apple apps, Apple decided that always-on interpolation provides the best image quality for photos, which is what most images are. And you know what? I agree with them! Photos are the most common type of image, by a long shot. The only flaw in Apple’s plan is that you can’t turn interpolation off when it doesn’t work. A few users complaining about the occasional blurry image, versus a lot of users complaining about jaggies and moiré, isn’t a hard choice. That’s not to say that the occasional blurry image isn’t something to be disappointed by—but that’s the thing about compromises: they don’t make everyone happy.

But this time I don’t have to worry about convincing some PM that their decision is a problem. There’s something nice about figuring out a computer mystery without job-related stakes. Yes, Preview’s still going to interpolate images even when it’s a bad idea, and I can’t change that. But I managed to solve the mystery and supply a solution to prevent it from happening again. As far as I’m concerned, my job is done. Now if only Preview could interpolate an end to this episode…

Aldus PageMaker 4.0 for Windows - Nifty Thrifties

It wasn’t that long ago that you could find all sorts of big box PC software at the thrift store. But as the years go on, it gets rarer and rarer. People who don’t know about the collector’s market just toss old software in the trash. People who do know about the collector’s market are putting anything remotely interesting on eBay, which cuts into the number of bargain finds. It’s still worth the effort, though, because interesting items cross my path now and again. Luck was on my side in October when I found a boxed copy of Aldus PageMaker at a thrift store in southern New Hampshire. I brought home a piece of my career history for the cool cost of four dollars—a fraction of its original MSRP.

PageMaker found in store.

It belongs in a computer museum!

I first encountered PageMaker when I enrolled in my high school’s graphic arts program. I didn’t know a point from a pica, but I was a budding artist and a hopeless computer nerd, and so computer graphic design seemed like the best way to mash all of my interests together. Before I knew it I was thrust into a world of Illustrator, Photoshop, and yes, PageMaker. My high school used PageMaker extensively, thanks to educational licensing deals with Adobe. QuarkXPress had completely captured the professional market by the late nineties, but it was too XPensive for us. Adobe squeezed revenue out of the flagging PageMaker by catering to price-sensitive organizations like schools. Plenty of millennial designers like me graduated into PageMaker after an elementary curriculum of Print Shop and KidPix.

If you’ve got eagle eyes, you might notice that this copy of PageMaker is for Windows. The Mac’s long reign as the king of graphic arts was thanks largely to PageMaker being the Mac’s first killer app. But Paul Brainerd, founder of Aldus, knew that staying exclusive to the Mac would limit Aldus’ growth. IBM PC users needed page layout programs too, so in 1987 Aldus released PageMaker for IBM compatibles.

One problem with porting to IBM was that PageMaker required a graphical user interface. Instead of rolling their own GUI toolkit, Aldus ported PageMaker to Microsoft Windows and included a copy of Windows 1.0 in the box. It was a bold move at the time, since Windows was rough around the edges and years away from being the dominant PC OS we all know and tolerate. Later versions utilized the stripped-down Windows runtime to provide a graphical interface without the expense of a full copy of Windows. Shipping a GUI runtime with an app wasn’t unusual at the time—Microsoft did this with Word and Excel for years, and America Online’s standalone version used the Windows runtime too. By version 4.0 Aldus expected you to supply your own copy of Windows 3.0—there’s no runtime included at all. If Windows wasn’t your jam, PageMaker was also available for IBM’s 32-bit operating system, OS/2. That might be the business equivalent of lighting money on fire, but I’m sure OS/2 users appreciated it.

Aldus wasn’t the only company bringing graphical page layout apps to the PC. Ventura Publisher and Frame Technology’s FrameMaker were just two of PageMaker’s contemporary competitors. There was a healthy business selling to PC users, but the Mac continued to dominate the graphics and publishing industries. There was just one problem—Apple’s mid-nineties malaise meant that if Apple went down, they’d take the graphics industry ship down with them. Eventually Quark and Adobe followed Aldus’ lead and ported their applications to Windows, giving them insurance against Apple’s business blunders.

What’s In The Box?

If you were one of those Windows users who bought a copy of PageMaker, what did you get in the box? The software itself comes on five 1.2 megabyte high-density 5 1/4 inch floppy diskettes. In addition to these actual-floppies, Aldus offered PageMaker 4.0 on seven 3 1/2 inch 720K not-so-floppies. You could even order a copy on 360K double-density 5 1/4 inch disks, but I bet only a handful took Aldus up on that offer. I wonder which format was more popular, because computers of 1991 often had both styles of floppy drive. Since the 3 1/2 inch disks hold only 720K, that version needs seven disks to the larger format’s five. Version 4.0 was the last version to offer 5 1/4 inch floppies, since 5.0 offered a CD-ROM option in their place.

Inside the box is a full complement of manuals and documentation. The first group is what I’d call the supplementary materials. Things like a quick reference for keyboard shortcuts, a printed version of the software license agreement, and a listing of Aldus-approved educational materials, training companies, and service bureaus. A printed template guide provided a handy visual reference for all the included design templates for things like business cards, newsletters, and calendars.

The most amusing of these pack-in items is a very condescending anti-piracy leaflet. It implores you to think of all the theoretical sales you’re depriving the poor Aldus Corporation of when you copy that floppy. I won’t dwell on that leaflet too long, except to point out the irony of Aldus lecturing someone who already paid them for the software in question.

Next is a quick start guide along with the primary reference manual, table editor guide, and the version supplement. The quick start guide has all the steps for installing the software, a listing of menus and tools, and a quick tutorial for making a sample document. It’s nice and all, but that’s just a warmup for the main event: the reference manual. I love the reference manual—it’s well written and comparable to a guide book you’d buy in a store. This was back in the day when manuals were written like actual books, and companies ran their documentation teams like real publishers. Manuals like these died for a variety of reasons—they were heavy, costly, and not as convenient as online help. I also think a lot of companies—especially Adobe—realized that manuals were an untapped source of revenue. It’s no coincidence that Classroom in a Book's popularity soared after Adobe excised the printed materials from their software.

Improvements in PageMaker 4.0

If you ponied up $150 of 1991 dollars to buy a PageMaker 4.0 upgrade, you got a lot of new features for your money. Many of them were playing catch-up to QuarkXPress after it stole a big chunk of PageMaker’s market share at the end of the eighties. Still, if you were an everyday user, a lot of these features seem mighty compelling. Let’s check them out.

  • Long document support. PageMaker 3.0 had a 128 page limit per file. 4.0 introduced a 999 page limit per file, which as far as I can remember hung on until the bitter end of 7.0.

  • Color graphics support. Version 3.0 supported spot colors and you could colorize text or linework, but displaying actual color images was right out. 4.0 added support for 24-bit full color images. Better late than never.

  • Story editor with spell checking. Instead of writing in the pasteboard view, a word processor-like story editor allowed for composing long text documents without writing them in a different word processor first.

  • Search and Replace. Searching a document isn’t just for words, it’s also for styles and metadata. PageMaker 4.0 added style-based search and replace, making it easy to reformat documents without needing to manually select every instance of fourteen point Helvetica.

  • Inline graphics placement. Previous versions always required placing images in their own frames. Now you could place graphics inside of a text box. This made PageMaker easier to use for users coming from word processing programs like Microsoft Word. Inline images didn’t replace the old frame method, so you could use whichever mode you preferred.

  • Paragraph line control. Customizable paragraph break controls prevented widows and orphans from ruining the flow of your document. Hyphenation support was also new for 4.0.

  • Advanced type and text controls. PageMaker 4.0 could scale or compress type for stylish effects. 90 degree rotations for text lines were added as well. Kerning, tracking, and leading precision were also enhanced in 4.0. This was in response to QuarkXPress, which had much better type handling than PageMaker 3.0.

  • Book publication features. Previous versions of PageMaker lacked features that could help assemble books. Things like indexing, automatic page numbering with special section formats, and tables of contents were all new to 4.0. No more manually adjusting your TOC and page counts after cutting a weak chapter or page!

  • File linking improvements. PageMaker could now tell you the date and time that you placed or updated linked images and text files. It could even offer to update them automatically if it detected a new version. This was also in response to Quark, which had better link management. Alas, this is an area where PageMaker was always playing catchup.

  • Tables and chart support. A new utility could read table and chart data from various database and spreadsheet applications. Lotus 1-2-3, Microsoft Excel, and Ashton-Tate dBase were just a few of the available data sources.

Making the Page

It’s one thing to list and read about features—let’s give PageMaker a spin and check them out first-hand. Unfortunately, I don’t have a system with 5 1/4 inch disk drives to run this exact copy of PageMaker, so running a copy in DOSBox will have to do. There’s an installer app that copies all the files and configures all the settings, and it’s about as easy as 1991-era installers go. If hard drive space is tight, you can omit unnecessary printer drivers and template files during installation. One advantage of DOSBox is that things are much zippier thanks to solid-state storage. Actual hardware would require a lot more time and floppy swapping, so that’s one bullet dodged. Printer drivers come on a separate disk, and PageMaker supports HP LaserJets, a generic PCL device, and a generic PostScript device. PostScript Printer Description files—PPDs—are included for common PostScript printers of the day, like Apple LaserWriters. There’s no copy protection other than a serial number, though I wouldn’t be surprised if it checked for other copies running on a local area network.

After the installation finished there was a shiny new Aldus group in Program Manager. After the first launch complained about configuration settings—I needed to add some PATH entries to autoexec.bat—PageMaker finally came to life. So many memories came back to me as I perused the palettes and combed through the commands. I didn’t even need a refresher from the manual to place an image, set some text, and print a file—just like old times! PageMaker 4.0’s user interface is remarkably bare compared to 1990s QuarkXPress, let alone modern InDesign. It’s riddled with modal dialog boxes and pull-down menus—definitely signs of a pre-toolbar, pre-tabbed-palette world. Speaking of those dialog boxes, their layouts were ripped right out of the Mac version. Seasoned PageMaker users will appreciate the consistency, but they definitely look out of place in a Windows environment. When it comes to toolboxes, three palettes are all you get for your computer pasteup needs: tools, styles, and colors. Make it work, designers!

Despite its age, this software can still produce legitimate work—after all, PostScript is one of its output options. Just tell the PostScript printer driver to write your output to a PS file and you’ve got something that can be opened in Illustrator or distilled to a PDF. If you have a PostScript-compatible printer, I bet you could print to it directly with the appropriate PPD. I made a quick test document with some text and a graphic, saved it to a PostScript file, and dumped it into Acrobat Distiller. After a few seconds, I had a PDF file that I could send to any print shop in the world. If you go to the blog post for this episode, you can compare PageMaker’s on-screen display versus the finished PDF, which is equivalent to a “printed” piece. PageMaker’s display is a jagged, low resolution mess, while printed output is crisp and precise. Quite the difference, no?

Despite the popularity of “What You See Is What You Get” marketing, the actual quality of our screens paled in comparison to what a laser printer could do. 1991 was still the era of separate screen and printer fonts. Screen fonts were hand-drawn with pixels to match specific point sizes, whereas printer fonts used glyphs composed from curves and lines that could be rasterized to any point size. This was necessary at the time because computers were too slow to dynamically render those outline fonts to the display. Screen fonts also had hints to help fonts look better on low-resolution computer displays. So long as you stuck to the provided point sizes, you’d be fine. But choosing a non-standard point size with a screen font transformed your type into terrible tacky trash. Eventually programs like Adobe Type Manager brought the power of outline fonts to computer displays using antialiasing techniques, so long as you had a computer powerful enough to use it without lag.

Graphics also used low-resolution previews to save precious memory and CPU cycles. Vector graphics were infinitely scalable when printed, but all the user saw on screen was a blocky, low-resolution proxy. Raster images could also use a proxy workflow thanks to another Aldus invention: the Open Prepress Interface, or OPI for short. A designer would place a low-res proxy image into their document along with an OPI link to a high resolution file. At print time the raster image processor follows the link and overwrites the low-res image with the high-res one. By using OPI, all the heavy high-res files could live on a big, beefy server and reduce the time it takes to spool files to a printer or imagesetter. Because of these limitations, I frequently printed scrap proofs to double check my work. When InDesign launched with a high resolution preview mode for images and graphics, it was a revelation.

To Aldus’ credit, they ate their own dog food—the manuals and boxes were designed with PageMaker and FreeHand. The jury’s out on whether they used SuperPaint for the raster graphics. Even with all the improvements included in PageMaker 4.0 and 5.0, nothing could really stem the bleeding of users to QuarkXPress, because XPress’ frame-based toolset and mathematical precision were just that good. It made layout easier and more predictable than PageMaker, and its library of third-party XTensions helped you create designs that were impossible in PageMaker.

How could Aldus beat Quark under those tough circumstances? PageMaker’s code had aged poorly, and rewriting it would take a lot of time and money. Maybe it was time to start over. Aldus was already developing a replacement for PageMaker at the time of their merger with Adobe in 1994. This project, codenamed K2, wouldn’t just replace PageMaker; it would challenge QuarkXPress for the title of desktop publishing champion. Speaking of Quark, they attempted to buy Adobe in 1998. This incensed Adobe cofounder John Warnock. What gave Quark, a company a third the size of Adobe, the right to attempt a hostile takeover? Fueled by Adobe’s money and spite, the former Aldus team redoubled their efforts to build a Quark killer. K2 launched as Adobe InDesign in 1999, featuring high-res previews, native Illustrator and Photoshop file support, and killer typography. By 2003 it was the hot new design package everyone wanted to use—but we’ll come back to that story another day.

Looking back, I don’t think I have much fondness for PageMaker as a program. I was more productive when I used QuarkXPress, and the work I produced with Quark looked better, too. But it’s hard for me to separate my memories of PageMaker from my memories of learning the basics of design. It’s like looking back at the Commodore 64—I recognize PageMaker’s achievements, and the things we did together, but I’m perfectly fine with not using it on a daily basis anymore. I produced a lot of printed materials for the city of Pittsfield, Massachusetts and its public schools using PageMaker. None of it was particularly good or remarkable, but all artists say that about their early work. Still, I couldn’t have built my career in the graphic arts without PageMaker. I’m glad I found this copy, and I hope it enjoys a comfortable retirement on my shelf.

The 2021 MacBook Pro Review

Here in Userlandia, the Power’s back in the ‘Book.


I’ve always been a fan of The Incredibles, Brad Bird’s exploration of family dynamics with superhero set dressing. There’s a bit in the movie where Bob Parr—Mister Incredible—has one of his worst days ever. Getting chewed out by his boss for helping people was just the start. Indignities pile up one after another: monotonous stop-and-go traffic, nearly slipping to death on a loose skateboard, and accidentally breaking the door on his comically tiny car. Pushed to his absolute limit, Bob Hulks out and hoists his car into the air. Just as he’s about to hurl it down the street with all of his super-strength, he locks eyes with a neighborhood kid. Both Bob and the kid are trapped in an awkward silence. The poor boy’s staring with his mouth agape, his understanding of human strength now completely destroyed. Bob, realizing he accidentally outed himself as a superhero, quietly sets the car down and backs away, hoping the kid will forget and move on.

Time passes, but things aren’t any better the next time we see Bob arriving home from work. He’s at his absolute nadir—you’d be too for being fired after punching your boss through four concrete walls. Facing another disruptive relocation of his family, Bob Parr can’t muster up the anger anymore—he’s just depressed. Bob steps out of his car, and meets the hopeful gaze of the kid once again. “Well, what are you waiting for?” asks Bob. After a brief pause, the kid shrugs and says “I dunno… something amazing, I guess.” With a regretful sigh, Mister Incredible forlornly replies: “Me too, kid. Me too.”

For the past, oh, six years or so, Apple has found itself in a Mister Incredible-esque pickle when it comes to the MacBook Pro. And the iPhone, and iPad, and, well, everything else. People are always expecting something amazing, I guess. Apple really thought they had something amazing with the 2016 MacBook Pro. The lightest, thinnest, most powerful laptop that had the most advanced I/O interface on the market. It could have worked in the alternate universe where Intel didn’t fumble their ten and seven nanometer process nodes. Even if Intel had delivered perfect processors, the design was still fatally flawed. You have to go back to the PowerBook 5300 to find Apple laptops getting so much bad press. Skyrocketing warranty costs from failing keyboards and resentment for the dongle life dragged these machines for their entire run. Most MacBook Pro users were still waiting for something amazing, and it turned out Apple was too.

Spending five years chained to this albatross of a design felt like an eternity. But Apple hopes that a brand-new chassis design powered by their mightiest processors yet will be enough for you to forgive them. Last year’s highly anticipated Apple Silicon announcements made a lot of crazy promises. Could Apple Silicon really do all the things they said? Turns out the Apple Man can deliver—at least, at the lower end. But could it scale? Was Apple really capable of delivering a processor that could meet or beat the current high end? There’s only one way to find out—I coughed up $3500 of my own Tricky Dick Fun Bills and bought one. Now it’s time to see if it’s the real deal. 

Back to the (Retro) Future

For some context, I’ve been living the 13 inch laptop life since the 2006 white MacBook, which replaced a 15 inch Titanium PowerBook G4 from 2002. My previous MacBook Pro was a 2018 13 inch Touch Bar model with 16 gigabytes of RAM, a 512 gigabyte SSD, and four Thunderbolt ports. My new 16 inch MacBook Pro is the stock $3499 config: it comes equipped with an M1 Max with 32 GPU cores, 32 gigabytes of RAM, and a 1 terabyte SSD.

Think back thirteen years ago, when Apple announced the first aluminum unibody MacBook Pro. The unibody design was revealed eight months after the MacBook Air, and the lessons learned from that machine are apparent. Both computers explored new design techniques afforded by investments in new manufacturing processes. CNC milling and laser cutting of solid aluminum blocks allowed for more complex shapes in a sturdier package. For the first time, a laptop could have smooth curved surfaces while being made out of metal. While the unibody laptops were slightly thinner, they were about the same weight as their predecessors.

Apple reinvested all of the gains from the unibody manufacturing process into building a stronger chassis. PowerBooks and early MacBook Pros were known for being a little, oh, flexible. The unibody design fixed that problem for good with class-leading structural rigidity. But once chassis flex was solved, the designers wondered where to go next. Inspired by the success of the MacBook Air, every subsequent MacBook Pro design pushed that original unibody language towards a thinner and lighter goal. While professionals appreciate a lighter portable—no one really misses seven pound laptops—they don’t like sacrificing performance. Toss Intel’s unanticipated thermal troubles onto the pile, and Apple’s limit-pushing traded old limitations for new ones. It was time for a change.

Instead of taking inspiration from the MacBook Air, the 2021 MacBook Pro looks elsewhere: the current iPad Pro, and the Titanium PowerBook G4. The new models are square and angular, yet rounded in all the right places. Subtle rounded corners and edges along the bottom are reminiscent of an iPod classic. A slight radius along the edge of the display feels exactly like the TiBook. Put it all together and you have a machine that maximizes the internal volume of its external dimensions. Visual tricks to minimize the feeling of thickness are set aside for practical concerns like thermal capacity, repairability, and connectivity.

I’m seeing double. Four PowerBooks!

The downside of this approach is the computer feels significantly thicker in your hands. And yet the new 16 inch MacBook Pro is barely taller than its predecessor—.66 inches versus .64 (or 16.8mm versus 16.2). The 14 inch model is exactly the same thickness as its predecessor at .61 inches or 15.5mm. It feels thicker because the thickness is uniform, and there’s no hiding it when you pick it up from the table. The prior model’s gentle, pillowy curves weren’t just for show—they made the machine feel thinner because the part you grabbed was thinner.

Memories of PowerBooks past are on display the moment you lift the notebook out of its box. I know others have made this observation, but it’s hard to miss the resemblance between the new MacBooks and the titanium PowerBook G4. As a fan of the TiBook’s aesthetics, I understand the reference. The side profiles are remarkably similar, with the same square upper body and rounded bottom edges. A perfectly flat lid with gently rolled edges looks and feels just like a Titanium. If only the new MacBook Pro had a contrasting color around the top case edges like the Titanium’s carbon-fiber ring—it really makes the TiBook look smaller than it is. Lastly, blacking out the keyboard well helps the top look less like a sea of silver or grey, fixing one of my dislikes about the prior 15 and 16 inchers.

What the new MacBook Pro doesn’t borrow from the Titanium are weak hinges and chassis flex. The redesigned hinge mechanism is smoother and, according to teardowns, less likely to break display cables. It also fixes one of my biggest gripes: the dirt trap. On the previous generation MacBooks, the hinge was so close to the display that it created a crevice that seemed to attract every stray piece of lint, hair, and dust it could find. I resorted to toothbrushes and toothpicks to get the gunk out. Now, it’s more like the 2012 to 2015 models, with a wide gap and a sloping body join that lets the dust fall right out. A damp cloth is all it takes to clean out that gap, like how it used to be. That gap was my number one annoyance after the keyboard, and I thank whoever fixed it.

Something else you’ll notice if you’re coming from a 2016 through 2020 model is some extra mass. The 14 and 16 inch models, at 3.5 and 4.7 pounds respectively, have gained half a pound compared to their predecessors, which likely comes from a combination of battery, heatsink, metal, and screen. There’s no getting around that extra mass, but how you perceive it is a matter of distribution. I toted 2011 and 2012 13 inch models everywhere, and those weighed four and a half pounds. It may feel the same in a bag, but in your hands, the bigger laptop feels subjectively heavier. If you’re using a 2012 through 2015 model, you won’t notice a difference at all—the new 14 and 16 inch models weigh the same as the 13 and 15 inchers of that generation.

I don’t think Apple is going to completely abandon groundbreaking thin-and-light designs, but I do think they’re going to leave the envelope-pushing crown to the MacBook Air. There is one place Apple could throw us a bone on the style front, though, and that’s color choices. I would have bought an MBP in a “Special edition” color if they offered it, and I probably would have paid more too. Space Gray just isn’t dark enough for my tastes. Take the onyx black shade used in the keyboard surround and apply it to the entire machine—it would be the NeXT laptop we never had. I’d kill to have midnight green as well. Can’t win ‘em all, but I know people are dying for something other than “I only build in silver, and sometimes, very very dark gray.”

Alone Again, Notch-urally

There’s no getting around it: I gotta talk about the webcam. Because some people decided that bezels are now qualita non grata on modern laptops, everyone’s been racing to make computers as borderless as possible. But the want of a bezel-free life conflicts with the need for videoconferencing equipment, and you can’t have a camera inside your screen… or can you? An inch and a half wide by a quarter inch tall strip at the top of the display has been sacrificed for the webcam in an area we’ve collectively dubbed “the notch.” Now the MacBook Pro has skinny bezels measuring three-sixteenths of an inch, at the cost of some menu bar area.

Oh, hello. I didn’t… see you there. Would you like some… iStat menu items?

Dell once tried solving this problem in 2015 by embedding a webcam in the display hinge of their XPS series laptops. There’s a reason nobody else did it—the camera angle captured a view straight up your nose. Was that tiny top bezel really worth the cost of looking terrible to your family or coworkers? Eventually Dell made the top bezel slightly taller and crammed the least objectionable tiny 720p webcam into it. When other competitors started shrinking their bezels, people asked why Apple couldn’t do the same. Meanwhile, other users kept complaining about Apple’s lackluster webcam video quality. The only way to get better image quality is to have a bigger sensor and/or a better lens, and both solutions take up space in all three axes. Something had to give, and the loyal menu bar took one for the team.

The menu bar’s been a fixture on Macintoshes since 1984—a one-stop shop for all your commands and sometimes questionable system add-ons. It’s always there, claiming 20 to 24 precious points of vertical real estate. Apple’s usually conservative when it comes to changing how the bar looks or operates. When they have touched it, the reactions haven’t been kind. Center-aligned Apple logo in the Mac OS X Public Beta, anyone? Yet here we are, faced with a significant chunk of the menu bar’s lawn taken away by an act of eminent domain. As they say, progress has a price.

A few blurry pictures of a notched screen surfaced on MacRumors a few days before the announcement. I was baffled by the very idea. No way would Apple score such an own-goal on a machine that was trying to right all the wrongs of the past five years. They needed to avoid controversy, not create it! I couldn’t believe they would step on yet another rake. And yet, two days later, there was Craig Federighi revealing two newly benotched screens. I had to order up a heaping helping of roasted crow for dinner.

Now that the notch has staked its claim, what does that actually mean for prospective buyers? First, about an inch and a half of menu bar is no longer available for either menus or menu extras. If an app menu would run into the notch, it scoots over to the other side automatically. Existing truncation and hiding mechanisms in the OS help conserve space if both sides of your menu bar are full. Second, fullscreen apps won’t intrude into the menu bar area by default. When an app enters fullscreen mode the menu bar slides away, the mini-LED backlights turn off, and the blackened area blends in with the remaining bezel and notch. It’s as if there was nothing at all—a pretty clever illusion! Sling your mouse cursor up top and menu items fade back in, giving you access to your commands. When you’re done, it fades to black again. If an app doesn’t play nice with the notch, you can check a box in Get Info to force it to scale the display below the notch.
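Incidentally, macOS 12 exposes all of this to developers. Here’s a quick AppKit sketch of my own—not Apple sample code—that checks the built-in screen for a notch and the usable menu bar regions around it:

```swift
import AppKit

// On notched Macs, the camera housing shows up as a top safe-area
// inset on the built-in NSScreen (macOS 12 and later).
if let screen = NSScreen.main {
    let insets = screen.safeAreaInsets
    if insets.top > 0 {
        print("Notched display; top inset is \(insets.top) points")
        // The usable menu bar areas to the left and right of the notch:
        print(screen.auxiliaryTopLeftArea ?? .zero)
        print(screen.auxiliaryTopRightArea ?? .zero)
    } else {
        print("No notch here—the safe area is the whole screen")
    }
}
```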

Menu bar gif

How the menu bar fades and slides in fullscreen mode.

But are you losing any usable screen space due to the notch? Let’s do some math. The new 16 inch screen’s resolution is 3456 pixels wide by 2234 tall, compared to the previous model’s 3072 by 1920. Divide that by two for a native Retina scale and you get usable points of 1728 by 1117 versus 1536 by 960. So if you’re used to 2x integer scaling, the answer is no—you’re actually gaining vertical real estate with a notched Mac. Since I hate non-native scaling, I’ll take that as a win. If you’re coming from a prior generation 15 inch Retina MacBook Pro with a 1440 point by 900 display, that’s almost a 24 percent increase in vertical real estate. You could even switch to the 14 inch model and net more vertical space and save size and weight!
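If you want to double-check that arithmetic, here it is as a few lines of Swift—the panel numbers come straight from the spec sheets quoted above:

```swift
import Foundation

// Native panel pixels divided by the 2x Retina factor give usable points.
let panels = [
    (name: "2021 16-inch", width: 3456, height: 2234),
    (name: "2019 16-inch", width: 3072, height: 1920),
]
for panel in panels {
    print("\(panel.name): \(panel.width / 2) x \(panel.height / 2) points")
}

// Vertical gain over the old 15-inch's 1440x900 point grid:
let gain = (Double(2234 / 2) - 900.0) / 900.0 * 100.0
print(String(format: "%.1f%% more vertical points", gain)) // ≈ 24.1%
```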

Working space

What did the menu bar give up to make this happen? The notch claims about ten percent of the menu bar on a 16 inch screen, and closer to twelve and a half percent on a 14 inch. In point terms, it takes up about 188 horizontal points of space. Smaller displays are definitely going to feel the pinch, especially if you’re running a lot of iStat Menus or haven’t invested in Bartender. Some applications like Pro Tools have enough menus that they’ll spill over to the other side of the notch. With so many menus, you might trip Mac OS’ menu triage. It hides menu extras—NSStatusItems, in developer-speak—first, and then starts truncating app menu titles. Depending on where you start clicking, certain menu items might end up overlapping each other, which is disconcerting the first time you experience it. That’s not even getting into the bugs, like a menu extra sometimes showing up under the notch when it definitely shouldn’t. I think this is an area where Apple needs to rethink how we use menulings, menu extras, or whatever you want to call them. Microsoft had its reckoning with system tray icons two decades ago, and Apple’s bill is way past due. On the flip side, if you’re the developer of Bartender, you should expect a register full of revenue this quarter.

Audacity's window menu scoots over.

Audacity’s Window menu scoots over to the other side of the notch automatically.

On the vertical side, the menu bar is now 37 points tall versus the previous 24 in 2x Retina scale. Not a lot, but worth noting. It just means your effective working space increased by 144 points rather than the full 157-point gain in panel height. The bigger question is how the vertical height works in non-native resolutions. Selecting a virtual scaled resolution keeps the menu bar at the same physical size, but shrinks or grows its contents to match the scaling factor. It looks unusual, but if you’re fine with non-native scaling you’re probably fine with that too. The notch will never dip below the menu bar regardless of the scaling factor.
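Or, as a two-line sanity check of those numbers:

```swift
// 16-inch working space below the menu bar, in 2x points:
let oldWorkArea = 960 - 24      // old panel height minus old menu bar
let newWorkArea = 1117 - 37     // new panel height minus taller menu bar
print(newWorkArea - oldWorkArea) // 144 extra points of working space
```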

What about the camera? Has its quality improved enough to justify the land takings from the menu bar? Subjective results say yes, and you’ll get picture quality similar to the new iMac. People on the other side of my video conferences noticed the difference right away. iPads and iPhones with better front cameras will still beat it, but this is at least a usable camera versus an awful one. I’ve taken some comparison photos between my 2018 model, the new 2021 model, and a Logitech C920x, one of the most popular webcams on the market. It’s also a pretty old camera and not competitive with the current crop of $200+ “4K” webcams—and I use that term loosely—but it’s good enough quality for most people.

The 720p camera on the 2018 model is just plain terrible. The C920x is better than that, but it’s still pretty soft and has a bluer white balance. Apple actually comes out ahead in this comparison in terms of detail and sharpness. Note that the new camera’s field of view is wider than the old 720p camera’s—something to keep in mind.

Looking at the tradeoffs and benefits, I understand why Apple went with the notch. Camera quality is considerably improved, the bezels are tiny, and that area of the menu bar is dead space for an overwhelming majority of users. It doesn’t affect me much—I already use Bartender to tidy up my menu extras, and the occasional menu scooting to the other side of it is fine. Unless your menu extras consume half of your menu bar, the notch probably won’t affect your day to day Mac life either.

Bartender in action.

Without Bartender, I would have a crowded menu bar—regardless of a notch.

But there’s one thing we’ll all have to live with, and that’s the aesthetics. There’s a reason that almost all of Apple’s publicity photos on their website feature apps running in fullscreen mode, which conveniently blacks out the menu bar area. All of Apple’s software tricks to reduce the downsides of the notch can’t hide the fact that it’s, well… ugly. If you work primarily in light mode, there’s no avoiding it. Granted, you’re not looking at the menu bar all the time, but when you do, the notch stares back at you. If you’re like me and you use dark mode with a dark desktop picture, then the darker menu bar has a camouflage effect, making the notch less noticeable. There are also apps like Boring Old Menu Bar and Top Notch that can black out the menu bar completely in light mode. Even if you don’t care about the notch, you might like the all-black aesthetic.

Is it worse than a bezel? Personally, I don’t hate bezels. It’s helpful to have some separation between your content and the outside world. I have desktop displays with bezels so skinny that mounting a webcam on top of them introduces a notch, which also grinds my gears. At least I can solve that problem by mounting the webcam on a scissor arm. I also like the chin on the iMac—everyone sticks Post-Its to it and that’s a valid use case. It’s also nice to have somewhere to grab to adjust the display angle without actually touching the screen itself. Oh, and the rounded corners on each side of the menu bar? They’re fine. In fact, they’re retro—just like Macs of old. Round corners on the top and square at the bottom is the best compromise on that front.

All the logic is there. And yet, this intrusion upon my display still offends me in some way. I can rationalize it all I want, but the full-screen shame on display in Apple’s promotional photos is proof enough that if they could get tiny bezels without the notch, they would. It’s a compromise, and nobody likes those. The minute Apple is able to deliver a notch-less experience, there will be much rejoicing. Until then, we’ll just have to deal with it.

We All Scream for HDR Screens

The new Liquid Retina ProMotion XDR display’s been rumored for years, and not just for the mouthful of buzzwords required to name it. It’s the first major advancement in image quality for the MacBook Pro since the retina display in 2012. Instead of using an edge-lit LED or fluorescent backlight, the new display features ten thousand mini-LEDs behind the liquid crystal panel. If you’ve seen a TV with “adaptive backlighting zones,” it’s basically the same idea—the difference is that there are a lot more zones and they’re a lot smaller. Making an array of tiny, powerful, efficient LEDs with great response time and proper color response isn’t trivial. Now do it all at scale without costs going out of control. No wonder manufacturers struggled with these panels.

According to the spec sheet, the 16 inch MacBook Pro’s backlight consists of 10,000 mini-LEDs. I’ll be generous and assume each is an individually addressable backlight zone. The 16.2 inch diagonal display, at 13.6 by 8.8 inches, has about 119.7 square inches of screen area that needs backlighting. With 10,000 zones, that results in about 83.6 mini-LEDs per square inch. Each square inch of screen contains 64,516 individual pixels. That means each LED is responsible for around 771 pixels, which works out to a square zone roughly 27.8 pixels on a side. Now, I’m not a math guy—I’ve got an art degree. But if you’re expecting OLED-style per-pixel backlighting, you’ll need to keep waiting.
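Here’s that napkin math in code form, generous one-zone-per-LED assumption included:

```swift
// Mini-LED zone math, assuming every LED is its own backlight zone.
let ledCount = 10_000.0
let screenArea = 13.6 * 8.8               // ≈ 119.7 square inches
let ppi = 254.0                           // 16-inch panel pixel density
let ledsPerSquareInch = ledCount / screenArea        // ≈ 83.6
let pixelsPerLED = (ppi * ppi) / ledsPerSquareInch   // ≈ 771
let zoneSide = pixelsPerLED.squareRoot()             // ≈ 27.8 pixels
print(ledsPerSquareInch, pixelsPerLED, zoneSide)
```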

All these zones mean adaptive backlighting—previously found on the Pro Display XDR and the iPad Pro—is now a mainstream feature for Macs. Overall contrast is improved because blacks are darker and “dark” areas get less light than bright areas. It largely works as advertised—HDR content looks great. For average desktop use, you’ll see a noticeable increase in contrast across the board even at medium brightness levels.

Safari HDR support

Safari supports HDR video playback on YouTube, and while you can’t see the effects in this screenshot, it looks great on screen.

But that contrast boost doesn’t come for free. Because backlight zones cover multiple pixels, pinpoint light sources will exhibit some amount of glow. Bright white objects moving across a black background in a dark room will show some moving glow effect as well. Higher brightness levels and HDR mode make the effect more obvious. Whether this affects you or not depends on your workflow. The problem is most noticeable with small, bright white objects on large black areas. If you want a surefire way to demonstrate this, open up Photoshop, make a 1000x1000 document of just black pixels, and switch to the hand tool. Wave your cursor around the screen and you’ll see the subtle glow that follows the cursor as it moves. Other scenarios are less obvious—say, if you use terminal in dark mode. Lines of text will be big enough that they straddle multiple zones, so you may see a slight glow. I don’t think the effect is very noticeable unless you are intentionally trying to trigger it or you view a lot of tiny, pure white objects on a black screen in a dark room. I’ve only noticed it during software updates and reboots, and even then it is very subtle.

Watch this on a Mini-LED screen to see the glow effect. It’s otherwise unnoticeable in daily usage.

I don’t think this is a dealbreaker—I don’t work with tiny white squares on large black backgrounds all day long. But if you’re into astrophotography, you might want to try before you buy. Edge-lit displays have their own backlight foibles too, like piano keys and backlight uniformity problems, which mini-LEDs either eliminate or reduce significantly. Even CRTs suffered from bloom and blur, not to mention convergence issues. It’s just a tradeoff that you’ll have to judge. I believe the many positives will outweigh the few negatives for a majority of users.

The other major change with this screen and Mac OS Monterey is support for variable refresh rates. With a maximum refresh rate of 120Hz, the high frame rate life is now mainstream on the Mac. If you’re watching cinematic 24 frames per second content, you’ll notice the lack of judder right away—24 divides evenly into 120, so no pulldown trickery is needed. Grab a window and drag it around the screen and you’ll see how smooth it is. Most apps that support some kind of GPU rendering with vsync will yield good results. Photoshop’s rotate view tool is a great example—it’s smooth as butter. System animations and most scrolling operations are silky smooth. Mission Control is another showcase—the extra frames dramatically improve the shrinking and moving animations. Switching to fullscreen mode has a similar effect.

But the real gotcha is that most apps don’t render their content at 120 FPS. It’s a real mixed bag out there at the moment, and switching from a high frame rate app to a boring old 60Hz one is jarring. Safari, for instance, is stuck at 60 FPS. This is baffling, because a browser is the most used application on the system. Hopefully it’s fixed in 12.1.

But pushing frames is just one part of a responsive display. Pixel response time is still a factor in LCD displays, and Apple tends to use panels that are middle of the pack. Is the new MacBook Pro any different? My subjective opinion is that the 16 inch MBP is good, but not great in terms of pixel response. Compared to my ProMotion equipped 2018 iPad Pro, the Mac exhibits a lot less ghosting. Unfortunately, it’s not going to win awards for pixel response time. Until objective measurements are available, I’m going to hold off from saying it’s terrible, but compared to a 120 or 144Hz gaming-focused display, you’ll notice a difference. 

In addition to supporting variable and high refresh rates on the internal screen, the new MacBooks also support DisplayPort adaptive refresh rates—what you might know as AMD’s FreeSync. Plug in a FreeSync or DisplayPort adaptive refresh display using a Thunderbolt to DisplayPort cable and you’ll be able to set the refresh rate along with adaptive sync. Unfortunately, Adaptive Sync doesn’t work via the built-in HDMI port, because HDMI Variable Refresh Rate requires an HDMI 2.1 port. Also, Apple’s implemented the DisplayPort standard, and FreeSync over HDMI 2.0 is proprietary to AMD. I’m also not sure if Thunderbolt to HDMI 2.1 adapters will work either, because I don’t have one to test.

LG Display Prefs

The LG 5K2K Ultrawide works perfectly with these Macs and Monterey.

Speaking of external displays, I have a bunch that I tested with the M1 Max. The good news is that my LG 5K ultrawide works perfectly fine when attached via Thunderbolt. HiDPI modes are available and there are no issues with the downstream USB ports. The LG 5K ultrawide is the most obscure monitor I own, so this bodes well for other ultrawide displays, but I can’t speak definitively about those curved ultrawides that run at 144Hz. I wasn’t able to enable HDR output, and I believe this is due to a limitation of the display’s Thunderbolt controller—my Windows PC attached via DisplayPort switches to HDR just fine, so I can’t call it definitively until I test the Mac over a DisplayPort adapter too. My aging Wacom Cintiq 21UX DTK2100 works just fine with an HDMI to DVI adapter. An LG 27 inch 4K display works well over HDMI, and HDR is supported there too. The only missing feature with the LG 4K is Adaptive Sync, which requires a DisplayPort connection from this Mac. Despite that, you can still set a specific refresh rate via HDMI.

If multi-monitor limitations kept you away from the first round of M1 Macs, those limits are gone. The M1 Pro supports two 6K external displays, and the Max supports three 6K displays plus a 4K60 over HDMI. I was indeed able to hook up three external displays, and it worked just like on my Intel Macs. I did run into a few funny layout bugs with Monterey’s new display layout pane, which I’m sure will be ironed out in time. Or not—you never know what Apple will fix these days.

This happened a few times. Sometimes it fixed itself after a few minutes, other times it just sat there.

How about calibration? Something I’ve seen floating around is that “The XDR displays can’t be hardware calibrated,” and that’s not true. What most people think of as “hardware calibrating” is actually profiling, where a spectrophotometer or colorimeter is used to linearize and generate a color profile for managing a display’s color. You are using a piece of hardware to do the profiling, but you’re not actually calibrating anything internal to the monitor. For most users, this is fine—adjusting the color at the graphics card level does the job. For very demanding users, this isn’t enough, and that’s why companies like NEC and Eizo sell displays with hardware LUTs that can be calibrated by very special—and very expensive—equipment.

You can still run X-Rite i1 Profiler and use an i1 Display to generate a profile, and you can still assign it to a display preset. But the laptop XDR displays now have the same level of complexity as the Pro Display XDR when it comes to hardware profiles. You can use a spectroradiometer to fine-tune these built-in picture modes for the various gamuts in System Preferences, and Apple has a convenient support document detailing this process. This is not the same thing as a tristimulus colorimeter, which is what most people think of as a “monitor calibrator!” I’m still new to this, so I’m still working out the process. I’m used to profiling traditional displays for graphic arts, so these video-focused modes are out of my wheelhouse. I’ll be revisiting the subject of profiling these displays in a future episode.

Fine tune calibration

Here there be dragons, if you’re not equipped with expensive gear.

Related to this, a new (to me) feature in System Preferences’ Displays pane is Display Presets, which lets you choose different gamut and profile presets for the display. Previously this was only available for the Pro Display XDR, but a much wider audience will see it for the first time thanks to the MacBook Pro. Toggling between different targets is a handy shortcut, even though I don’t think I’ll ever use it. The majority are video mode simulations, and since I’m not a video editor, they don’t mean much to me. If they matter to you, then the XDR might make your on-the-go editing a little easier.

Bye Bye, Little Butterfly

When the final MacBook with a butterfly keyboard was eliminated from Apple’s portable lineup in the spring of 2020, many breathed a sigh of relief. Even if you weren’t a fan of the Dongle Life, you’d adjust to it. But a keyboard with busted keys drives me crazier than Walter White hunting a troublesome fly. The costs of the butterfly keyboard were greater than what Apple paid in product service programs and warranty repairs. The Unibody MacBook Pro built a ton of mind- and marketshare for Apple on the back of its structural integrity. It’s no ToughBook, but compared to the plasticky PC laptops and flexible MacBook Pro of 2008, it was a revelation. People were buying Macs just to run Windows because they didn’t crumble at first touch. All that goodwill flew away thanks to the butterfly effect. Though Apple rectified that mistake in 2020, winning back user trust is an uphill battle.

Part of rebuilding that trust is ditching the Touch Bar, the 2016 models’ other controversial keyboard addition. The 2021 models have sent the OLED strip packing in favor of full-sized function keys. Apple has an ugly habit of never admitting fault. If they made a mistake—like the Touch Bar—they tend to frame a reversion as “bringing back the thing you love, but now it’s better than ever!” That’s exactly what Apple’s done with the function keys—these MacBooks are the first Apple laptops to feature full-height function keys. Lastly, the Touch ID-equipped power button gets an upgrade too—it’s now a full-sized key with a guide ring.

Keyboard 2 Keyboard.

How does typing feel on this so-called “Magic” keyboard? I didn’t have any of the Magic Keyboard MacBooks, but I do have a desktop Magic Keyboard that I picked up at the thrift store for five bucks. The laptop feels nearly the same as that desktop keyboard in terms of travel. It’s way more responsive than a fresh butterfly keyboard, and I’m happy for the return of proper arrow keys. Keyboards are subjective, and if you’re unsure, try it yourself. If you’re happy with 2012 through 2015 MacBook Pro keyboards, you’ll be happy with this one. My opinion on keyboards is that you shouldn’t have to think about them. Whatever switch type works for you and lets your fingers fly is the right keyboard for you. My preferred mechanical switch is something like a Cherry Brown, though I’ve always had an affinity for the Alps on the Apple Extended Keyboard II and IBM’s buckling springs.

Since the revised keyboard introduced in the late 2019 to spring 2020 notebooks hasn’t suffered a sea of tweets and posts claiming it’s the worst keyboard ever, it’s probably fine. I’m ready to just not think about it anymore. My mid-2018 MacBook Pro had a keyboard replacement in the fall of 2019. It wasn’t even a year old—I bought it in January 2019! That replacement keyboard is now succumbing to bad keys, even though it features the “better” butterfly switches introduced in the mid-2019 models. My N key’s been misbehaving since springtime, and I’ve nursed it along waiting for a worthy replacement. On top of mechanical failures, the legends flaked off a few keys, which had never happened to me before on these laser-etched keycaps. With all of the problems Apple’s endured, will the new keyboard be easier to repair? Based on teardown reports, replacing the entire keyboard is still a very involved process, but at least you can remove keycaps again without breaking the keys.

More people will lament the passing of the Touch Bar than the butterfly, because it provided some interesting functionality. I completely understand the logic behind it. F-keys are a completely opaque mechanism for providing useful shortcuts, and the Fn key alternates always felt like a kludge. The sliders to adjust volume or brightness are a slick demo, and I do like me some context-sensitive shortcut switching. But without folivora’s BetterTouchTool, I don’t think I would have liked the Touch Bar as much as I did. 

BetterTouchTool

Goodbye, old friends. You made the Touch Bar tolerable.

Unfortunately, Apple just didn’t commit to the Touch Bar, and that’s why it failed. An update with haptic feedback and an always-on screen would have made a lot of users happy. At least, it would have made me happy, but haptic feedback wouldn’t fix the “you need to look at it” or “I accidentally brushed against it” problems. But I think what really killed it was the unwillingness to expand it to other Macs via, say, external keyboards. The only users that truly loved the Touch Bar were the ones who embraced BetterTouchTool’s contextual presets and customization. With every OS update I expected Apple to bring more power to the Touch Bar, but it never came. Ironically, by killing the Touch Bar, Apple killed a major reason to buy BetterTouchTool. It’s like Sherlocking, but in reverse… a Moriartying! Sure, let’s roll with that.

I’ll miss my BTT shortcuts. I really thought Apple was going to add half-height function keys and keep the Touch Bar. Maybe that was the plan at some point. Either way, the bill of materials for the Touch Bar has been traded for something else—a better screen, the beefier SOC, whatever. I may like BetterTouchTool, but I’m OK with trading the Touch Bar for an HDR screen.

Regardless of the technical merits for or against the Touch Bar, it will be remembered as a monument to Apple’s arrogance. They weren’t the first company to put a touch-sensitive strip above a laptop keyboard—Lenovo’s ThinkPad X1 Carbon tried the same exact idea in 2014, only to ditch it a year later. Apple’s attempt reeked of “anything you can do, I can do better!” After all, Apple had vertical integration on their side and provided third-party hooks to let developers leverage this new UI. But Apple never pushed the Touch Bar as hard as it could have, and users sense half-heartedness. If Apple wanted to make the Touch Bar a real thing, they should have gone all-out. Commit to the bit. Without upgrades and investments, people see half-measure gimmicks for what they really are. Hopefully they’ve learned a lesson.

Any Port in a Storm

When the leaked schematics for these new MacBooks spoiled the return of MagSafe, HDMI, and an SD card slot, users across Mac forums threw a party. Apple finally conceded that one kind of port wasn’t going to rule them all. It’s not the first time Apple’s course corrected like this—adding arrow keys back to the Mac’s keyboard, adding slots to the Macintosh II, everything about the 2019 Mac Pro. When those rumors were proven true in the announcement stream, many viewers breathed a sigh of relief. The whole band isn’t back together—USB A didn’t get a return invite—but three out of four is enough for a Hall of Fame performance.

From left to right: MagSafe, Thunderbolt 4, and headphones.

Let’s start with the left-hand side of the laptop. Two Thunderbolt 4 ports are joined by a relocated headphone jack and the returning champion of power ports: MagSafe. After five years of living on the right side of Apple’s pro notebooks, the headphone jack has returned to its historical home. The headphone jack can double as a microphone jack when used with TRRS headsets, and still supports wired remote commands like pause, fast forward, and rewind. Unfortunately, I can’t test its newest feature—support for high-impedance headphones—but if you’re interested in that, Apple’s got a support document for you. Optical TOSLINK hasn’t returned, so if you want a digital out to an amplifier, you’ll need to use HDMI or an adapter.

Next is the return of MagSafe. If you ask a group of geeks which Star Trek captain was the superior officer, you might cause a fistfight. But poll that same group about laptop power connectors, and they’ll all agree that MagSafe was the perfect proprietary power port. When the baby pulls your power cable or the dog runs over your extension cord, MagSafe prevents a multi-thousand dollar disaster. After all, you always watch where you’re going. You’d never be so careless… right? Perish the thought.

But every rose has its thorns, just like every port has its cons. Replacing a frayed or broken MagSafe cord was expensive and inconvenient because that fancy proprietary cable was permanently attached to the power brick. When MagSafe connectors were melting, Apple had to replace whole bricks instead of just swapping cables. Even after that problem was solved, my MagSafe cables kept fraying apart at the connector. I replaced three original MagSafe blocks on my white MacBook, two of which were complimentary under Apple’s official replacement plan. My aluminum MacBook Pro, which used the right-angle style connector, frayed apart at a similar rate. Having to replace a whole power brick due to fraying cables really soured me on this otherwise superior power plug.

Meet the new MagSafe—not the same as the old MagSafe.

When Apple moved away from MagSafe in 2016, I was one of the few people actually happy about it! All things equal, I prefer non-proprietary ports. When Lightning finally dies, I’ll throw a party. By adopting USB Power Delivery, Apple actually gave up some control and allowed users to charge from any PD-capable charger. Don’t want an expensive Apple charger? Buy Anker or Belkin chargers instead! Another advantage I love is the ability to charge from either side of the laptop. Sometimes charging on the right hand side is better! You could also choose your favorite kind of cable—I prefer braided right-angle ones.

But with every benefit comes a tradeoff. USB Power Delivery had a long-standing 100 watt limit when using a specially rated cable. If a laptop needed more power, OEMs had no choice but to use a proprietary connector. What’s worse, 100 watts might not be enough to run your CPU and GPU at full-tilt while maintaining a fast charging rate. If USB-C was going to displace proprietary chargers, it needed to deliver MORE POWER!

The USB Implementers Forum fixed these issues in May when it announced the Power Delivery 3.1 specification. The new spec allows for 140, 180, and 240 watt power supplies when paired with appropriately rated cables. These new power standards mean the clock is ticking on proprietary power plugs. So how can MagSafe return in a world where USB-C is getting more powerful?
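Those oddly specific wattages fall straight out of the spec: the Extended Power Range keeps the existing five-amp cable ceiling and raises the voltage. A quick sketch:

```swift
// USB PD 3.1 Extended Power Range: three new fixed voltages
// at the existing 5 amp limit for rated cables.
let eprVoltages = [28.0, 36.0, 48.0]   // volts
let maxCurrent = 5.0                   // amps
for volts in eprVoltages {
    print("\(Int(volts)) V × \(Int(maxCurrent)) A = \(Int(volts * maxCurrent)) W")
}
// Prints 140 W, 180 W, and 240 W—the spec's new tiers.
```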

Logo Soup

Of course, the proliferation of standards means adding a new logo every time. Thus solving the problem once and for all.

The good news is that the new power supply has a Type C connector and supports the new USB Power Delivery 3.1 standard. Apple’s 140 watt power supply will work just fine with 140W-rated USB-C cables. It so happens that Apple’s cable has Type C on one end and MagSafe on the other. That makes it user replaceable, to which I say thank freakin’ God. You can even use the MagSafe cable with other PD chargers, but you won’t get super fast charging if you use a lower-rated brick. The cable is now braided, which should provide better protection against strain and fraying, and the charging status light is back too.

Don’t fret if you use docks, hubs, or certain monitors—you can still charge over the Thunderbolt ports, though you’ll be limited to 100W of power draw. This means no half-hour fast charging on the 16 inch models, but depending on your CPU and GPU usage you’ll still charge at a reasonable rate. If you lose your MagSafe cable or otherwise need an emergency charge, regular USB power delivery is ready and waiting for you. I’ve been using my old 65W Apple charger and 45W Anker charger with the 16 inch and it still charges, just not as quickly.

Does the third iteration of MagSafe live up to the legacy of its forebears? Short answer: yes. The connector’s profile is thin and flat, measuring an eighth of an inch tall by three quarters wide. My first test was how well it grabs the port when I’m not looking at it. Well, it works—most of the time. One of the nice things about the taller MagSafe 1 connector was the magnet’s strong proximity effect. So long as the plug was in the general vicinity of the socket, it’d snap right into place. MagSafe 2 was a little shorter and wider, necessitating more precision to connect the cord. That same precision is required with MagSafe 3, but all it means is that you can’t just wave the cord near the port and expect it to connect. As long as you grasp the plug with your thumb and index finger you’ll always hit the spot, especially if you use the edge of the laptop as a guide.

In a remarkable act of faith, I subjected my new laptop to intentional yanks and trips to test MagSafe’s effectiveness. Yanking the cable straight out of the laptop is the easiest test, and unsurprisingly it works as expected. It takes a reasonable amount of force to dislodge the connector, so nuisance disconnects won’t happen when you pick up and move the laptop. The next test was tripping over the cord while the laptop was perched on my kitchen table. MagSafe passed the test—the cable broke away and the laptop barely moved. It’s much easier to disconnect the connector when pulling it up or down versus left or right, and that’s due to the magnet’s aspect ratio. There’s just more magnetic force acting on the left and right side of the connector. I would say this is a good tradeoff for the usual patterns of yanks and tugs that MagSafe defends against, like people tripping over cables on the floor that are attached to laptops perched on a desk, sofa, or table. The downside is that if your tug isn’t tough enough, you might end up yanking the laptop instead. USB-C could theoretically disconnect when pulled, but more often than not a laptop would go with it. Or your cable would take one for the team and break away literally and figuratively. Overall, I think everyone is glad to have MagSafe back on the team.

Oh, and one last thing: the MagSafe cable should have been color matched to the machine it came with. If Apple could do that for the iMac’s power supply cable, they should have done it for the laptops. My Space Gray machine demands a Space Gray power cable!

From left to right: UHS-II SD Card, Thunderbolt 4, and HDMI 2.0.

Let’s move on to the right-hand side’s ports. Apple has traded a Thunderbolt port for an HDMI 2.0 port and a full-sized SD card slot. These returning guests join the system’s third Thunderbolt 4 port. The necessity of these connections depends on who you ask, but for the MacBook Pro’s target audience of videographers, photographers, and traveling power users, they’ll say “yes, please!” TVs, projectors, and monitors aren’t abandoning HDMI any time soon. SD is still the most common card format for digital stills and video cameras, along with audio recorders. Don’t forget all the single board computers, game consoles, and other devices that use either SD or MicroSD cards.

First up: the HDMI port. There’s already grousing and grumbling about the fact that it’s only HDMI 2.0 compatible and not 2.1. There are a few reasons why this might be the case—bandwidth is a primary one, as HDMI 2.1 needs up to 48 gigabits per second of link bandwidth to fully exploit its potential. There’s also the physical layer, which is radically different compared to HDMI 2.0 PHYs. 4k120 monitors aren’t too common today, and the ones that do exist are focused on gamers. The most common use case for the HDMI port is connecting to TVs and projectors on the go, and supporting a native 4k60 is fine for today. It might not be fine in five years, but there’s still the Thunderbolt ports for more demanding monitors. You know what would have been really cool? If the HDMI port could also act like a video capture card when tethered to a camera. Let the laptop work like an Atomos external video recorder! I have no idea how practical it would be, but that hasn’t stopped me from dreaming before.
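Some rough math shows why 4k60 sits comfortably within HDMI 2.0 while 4k120 demands 2.1. This ignores blanking intervals and encoding overhead, so real link rates run higher, but the proportions hold:

```swift
// Raw, uncompressed video bandwidth in gigabits per second.
func rawGbps(_ width: Double, _ height: Double, _ hz: Double, bitsPerPixel: Double = 24) -> Double {
    width * height * hz * bitsPerPixel / 1e9
}
print(rawGbps(3840, 2160, 60))   // ≈ 11.9 — fits HDMI 2.0's 18 Gbit/s
print(rawGbps(3840, 2160, 120))  // ≈ 23.9 — needs HDMI 2.1's 48 Gbit/s
```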

The SD card sticks out pretty far from the slot.

Meanwhile, the SD slot does what it says on the tin. The port is capable of UHS-II speeds and is connected via PCI Express. Apple System Profiler says it’s a 1x link capable of 2.5GT/s—basically, a PCI-E 1.0 connection. That puts an effective cap of 250MB/s on the transfers. My stable of UHS-I V30 SanDisk Extreme cards work just fine, reading and writing the expected 80 to 90 megabytes per second. Alas, I don’t have any UHS-II cards to test. The acronym soup applied to SD cards is as confusing as ever, but if you’re looking at the latest crop of UHS-II cards, they tend to fall in two groups: slower v60, and faster v90. The fastest cards can reach over 300 megabytes per second, but Apple’s slot won’t go that fast. If you have the fastest, most expensive cards and demand the fastest speeds, you’ll still want to use a USB 3.0 or Thunderbolt card reader. As for why it’s not UHS-III, well, those cards don’t really exist yet. UHS-III slots also fall back to UHS-I speeds, not UHS-II. Given these realities, UHS-II is a safe choice.
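That 250 megabyte ceiling is just PCI Express 1.0 arithmetic—2.5 gigatransfers per second, minus the 8b/10b encoding tax:

```swift
// PCIe 1.0 x1: 2.5 GT/s on the wire, 8b/10b line code,
// eight bits to the byte.
let transfersPerSecond = 2.5e9
let encodingEfficiency = 8.0 / 10.0
let bytesPerSecond = transfersPerSecond * encodingEfficiency / 8.0
print(bytesPerSecond / 1e6)      // 250.0 MB/s
```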

Even if you’re not a pro or enthusiast camera user, the SD slot is still handy for always-available auxiliary storage. The fastest UHS-II cards can’t hope to compete with Apple’s 7.4 gigabytes-per-second NVME monster, but you still have some kind of escape hatch against soldered-on storage. Just keep in mind that the fastest UHS-II cards are not cheap—as of this writing, 256 gig V90 cards range between $250 and $400 depending on the brand. V60 cards are considerably cheaper, but you’ll sacrifice write performance. Also, the highest capacity UHS-II card you can buy is 256 gigs. If you want 512 gig or one terabyte cards, you’ll need to downgrade to UHS-I speeds. Finally, an SD card sticks out pretty far from the slot—something to keep in mind if you plan on leaving one attached all the time.

I know a few people who attach a lot of USB devices that are unhappy about the loss of a fourth Thunderbolt port. But the needs of the many outweigh the needs of the few, and there are many more who need always available SD cards and HDMI connectors without the pain of losing an adapter. Plus, having an escape hatch for auxiliary storage takes some of the sting out of Apple’s pricey storage upgrades.

Awesome. Awesome to the (M1) Max.

Yes, yes, we’ve finally arrived at the fireworks factory. Endless forum and Twitter posts debated what Apple’s chip design team could do with a high-end chip. Would Apple stick to a monolithic die or use chiplets? How many GPU cores could they cram in there? What about the memory architecture? And how would they deal with yields? Years of leaks and speculation whetted our appetites, and now the M1 Pro and Max are finally here to answer those questions.

The answer was obvious: take the recipe that worked in the M1 and double it. Heck, quadruple it! Throw the latest LPDDR5 memory technology in the package and you’ve got a killer system on chip. It honestly surprised me that Apple went all-out like this—I’m used to them holding back. Despite all its power, the first M1 was still an entry level chip. Power users who liked the CPU performance were turned off by memory, GPU, or display limitations. Now that Apple is making an offering to the more demanding user, will they accept it?

M1, M1 Pro, M1 Max

The M1 Pro is a chonky chip, but the M1 Max goes to an absurd size. (Apple official image)

Choosing between the M1 Pro and M1 Max was a tough decision to make in the moment. If you thought ARMageddon would mean less variety in processor choices, think again. It used to be that the 13 inch was stuck with less capable CPUs and a wimpy integrated GPU, while the 15 and 16 inch models were graced with more powerful CPUs and a discrete GPU. Now the most powerful processor and GPU options are available in both 14 and 16 inch forms for the first time. In fact, the average price difference between an equally equipped 14 and 16 inch model is just $200. Big power now comes in a small package.

Next, let’s talk performance. Everyone’s seen the Geekbench numbers, and if you haven’t, you can check out AnandTech. I look at performance from an application standpoint. My usage of a laptop largely fits into two categories: productivity and creativity. Productivity tasks would be things like word processing, browsing the web, listening to music, and so on. The M1 Max is absolutely overkill for these tasks, and if that’s all you did on the computer, the M1 MacBook Air is fast enough to keep you happy. But if you want a bigger screen, you’ll have to pay more for compute you don’t need in a heavier chassis. I think there’s room in the market for a 15 inch MacBook Air that prioritizes thinness and screen space over processor power. I’ll put a pin in that for a future episode.

In average use, the efficiency cores take control and even the behemoth M1 Max is passively cooled. My 2018 13 inch MacBook Pro with four Thunderbolt ports was always warm, even if the fans didn’t run. My usual suspects include things like Music, Discord, Tweetdeck, Safari, Telegram, Pages, and so on. That scenario is nothing for the 16 inch M1 Max—the bottom is cool to the touch. 4K YouTube turned my old Mac into a hand warmer, and now it doesn’t even register. For everyday uses like these, the E-cores keep everything moving. Just like the regular M1, they keep the system responsive under heavy load. Trading two E-cores for two P-cores negatively affects battery life, but not as much as you’d think. Doing light browsing, chatting, and listening to music at medium brightness used about 20% battery over the course of two and a half hours—call it twelve-plus hours of that workload, if the drain stays linear. Not a bad performance at all, but not as good as an M1 Air. The M1 Pro uses less power thanks to a lower transistor count, so if you need more battery life, get the Pro instead of the Max.

iZotope

iZotope RX is a suite of powerful audio processing tools, and it’s very CPU intensive.

How about something a bit more challenging? These podcasts don’t just produce themselves—it takes time for plugins and editing software to render them. My favorite audio plugins are cross-platform, and they’re very CPU intensive, making them perfect for real-world tests. I used a 1 hour 40 minute raw podcast vocal track as a test file for iZotope RX’s spectral de-noise and mouth de-click plugins. iZotope RX is still an Intel binary, so it runs under Rosetta and takes a performance penalty. Even with that penalty, I’m betting it’ll put in a good performance. My old MacBook Pro would transform into a hot, noisy jet engine when running iZotope. Here’s hoping the M1 Max does better.

These laptops also compete against desktop computers, so I’ll enter my workstation into the fight. My desktop features an AMD Ryzen 5950X and Nvidia 3080TI, and I’m very curious how the M1 Max compares. When it’s going full tilt, the CPU and GPU combined guzzle around 550 watts. I undervolt my 3080TI and use curve optimizer on my 5950X, so it’s tweaked for the best performance within its power limits.

For a baseline, my 13 inch Pro ran Spectral De-Noise in 4:02.79. The M1 Max put in two minutes flat. The Max’s fans were barely audible, and the bottom was slightly warm. It was the complete opposite of the 13 inch, which was quite toasty and uncomfortable to put on my lap. It would have been even hotter if the fans weren’t running at full blast. Lastly, the 5950X ran the same test in 2:33.8. The M1 Max actually beat my ridiculously overpowered PC workstation, and it did it under emulation. Consider that HWInfo64 reported 140W of CPU package power draw on the 5950X while Apple’s powermetrics tool reported around 35 watts for the M1 Max. That’s one quarter the power! Bananas.

iZotope RX Spectral De-Noise

Time in seconds. Shorter bars are better.

Next is Mouth De-click. It took my 13 inch 4 minutes 11.54 seconds to render the test file. The de-clicker stresses the processor in a different way, and it makes the laptop even hotter. It can’t sustain multi-core turbo under this load, and if it had better cooling it may have finished faster. The M1 Max accomplished the same goal in one minute, 26 seconds, again with minimum fan noise and slight heat. Both were left in the dust by the 5950X, which scored a blisteringly fast 22 seconds—clearly there are some optimizations going on. While the 5950X was pegged at 100% CPU usage across all cores on both tests, the de-noise test only boosted to 4.0GHz all-core. De-click boosted to 4.5GHz all-core and hit the 200W socket power limit. Unfortunately, I don’t know enough about the inner workings of the RX suite to say which plugins use AVX or other special instruction sets.

iZotope RX Mouth De-Click

Time in seconds. Shorter bars are better.

How about video rendering? I’m not a video guy, but my pal Steve—Mac84 on YouTube—asked me to give Final Cut Pro rendering a try. His current machine, a dual-socket 2012 Cheesegrater Mac Pro, renders an eleven minute YouTube video in 7 minutes 38 seconds. That’s at 1080p with the “better quality” preset, with a project that consists of mixed 4K and 1080p clips. The 13 inch MBP, in comparison, took 11 minutes 54 seconds to render the same project. Both were no match for the M1 Max, which took 3 minutes 24 seconds. Just for laughs, I exported it at 4K, which took 12 minutes 24 seconds. Again, not a video guy, but near-real-time rendering is probably pretty good! I don’t have any material to really push the video benefits of the Max, like the live preview improvements, but I’m sure you can find a video-oriented review to test those claims.

Final Cut Pro 1080p Better Quality Export

Time in minutes. Shorter bars are better.

Lightroom Classic is another app that I use a lot, and the M1 Max shines here too. After last year’s universal binary update, Lightroom runs natively on Apple Silicon, so I’m expecting both machines to run at their fullest potential. Exporting 74 Sony a99 II RAW files at full resolution—42 megapixels—took 1 minute 25 seconds on the M1 Max. My 5950X does the same task in 1 minute 15 seconds. Both machines pushed their CPU usage to 100% across all cores, and the 5950X hit 4.6GHz while drawing 200W. If you trust powermetrics, the M1 Max reported around 38 watts of power draw. Now, I know my PC is overclocked—an out-of-the-box 5950X tops out at 150W and 3.9GHz all-core. But AMD’s PBO features allow the processor to go as fast as cooling allows, and my Noctua cooler is pretty stout. Getting that extra 700 megahertz costs another 40 to 50 watts, and nets about an 18 percent clock improvement. Had I not been running Precision Boost Overdrive 2, the M1 Max may very well have won the match. Even without the help of PBO, it’s remarkable that the M1 Max is so close while using a fifth of the power. If you’re a photo editor doing on-site editing, this kind of on-battery performance is a huge deal.

Adobe Lightroom Classic 42MP RAW 74 Image Export

Time in seconds. Shorter bars are better.

Lastly, there’s been a lot of posting about game performance. Benchmarking games is very difficult right now, because anything that’s cross platform is probably running under Rosetta and might not even be optimized for Metal. But if your game is built on an engine like Unity, you might be lucky enough to have Metal support for graphics. I have one such game: Tabletop Simulator. TTS itself still runs in Rosetta, but its renderer can run in OpenGL or Metal modes, and the difference between the two is shocking. With a table like Villainous: Final Fantasy, OpenGL idles around 45-50 FPS with all settings maxed at native resolution. Switch to the Metal renderer and TTS locks at a stable, buttery smooth 120 FPS. Even when I spawned several hundred animated manticores, it still ran at a respectable 52 FPS, and ProMotion adaptive sync smoothed out the hitches. Compare that to the OpenGL renderer, which chugged along at a stutter-inducing 25 frames per second. TTS is a resource hungry monster of a game even on Windows, so this is great performance for a laptop. Oh, and even though the GPU was running flat out, the fans were deathly quiet. I had to put my ear against the keyboard to tell if they were running.

Tabletop Simulator Standard

Frames per Second. Taller bars are better.

Tabletop Simulator Stress Test

Frames per Second. Taller bars are better.

My impression of the M1 Max is that it’s a monster. Do I need this level of performance? Truthfully, no—I could have lived with an M1 Pro with 32 gigs of RAM. But there’s something special about witnessing this kind of computing power in a processor that uses so little energy. I was able to run all these tests on battery as well, and the times were identical. Tabletop Simulator was equally performant. Overall, this performance bodes well for upcoming desktop Macs. Most Mac Mini or 27 inch iMac buyers would be thrilled with this level of performance. And of course, the last big question remains: If they can pull off this kind of power in a laptop, what can they do for the Mac Pro?

But it’s not all about raw numbers. If your application hasn’t been ported to ARM or uses the Metal APIs, the M1 Max won’t be running at its full potential, but it’ll still put in a respectable performance. There are still a few apps that won’t run at all in Rosetta or in Monterey, so you should always check with your app’s developers or test things out before committing.

Chips and Bits

Before I close this out, there are a few miscellaneous observations that don’t quite fit anywhere else.

  • The fans, at minimum RPM, occasionally emit a slight buzz or coil whine. It doesn’t happen all the time, and I can’t hear it unless I put my ear against the keyboard.

  • High Power Mode didn’t make a difference in any of my tests. It’s likely good only for very long, sustained periods of CPU or GPU usage.

  • People griped about the shape of the feet, but you can’t see them at all on a table because the machine casts a shadow! I’m just glad they’re flat again and not those round things.

  • Kinda bummed that we still have a nearly square edge along the keyboard top case. I’m still unhappy with those sharp corners in the divot where you put your thumb to open the lid. You couldn’t round them off a little more, Apple?

  • The speakers are a massive improvement compared to the 2018 13 inch, but I don’t know if they’re better than the outgoing 16 inch. Spatial audio does work with them, and it sounds… fine, usually. The quality of spatial audio depends on masters and engineers, and some mixes are bad regardless of listening environment.

  • Unlike on Windows laptops with rounded screen corners, the cursor follows the radius of the corner when you mouse around it. You can still hide the cursor behind the notch—unless a menu is open, in which case it snaps immediately to the next menu in the overflow.

  • If only Apple could figure out a way to bring back the glowing Apple logo. That would complete these laptops and maybe get people to overlook the notch. Apple still seems to see the value too, because the glowing logo still shows up in their “Look at all these Macs in the field!” segments in livestreams. Apple, you gotta reclaim it from Razer and bring some class back to light-up logos!

  • Those fan intakes on the bottom of the chassis are positioned right where your hands want to grab the laptop. It feels a little off, like you’re grabbing butter knives out of the dishwasher. They’ve been slightly dulled, but I would have liked more of a radius on them.

  • I haven’t thought of a place to stick those cool new black Apple stickers.

Comparison Corner

So should you buy one of these laptops? I’ve got some suggestions based on what you currently use.

  • Pre-2016 Retina MacBook Pro (or older): My advice is to get the 14 inch with whatever configuration fits your needs. You’ll gain working space and if you’re a 15 inch user you’ll save some size and weight. A 16 inch weighs about the same as a 15 inch, but is slightly larger in footprint, so if you want the extra real estate it’s not much of a stretch. This is the replacement you’ve been waiting for.

  • Post-2016 13 inch MacBook Pro: The 14 inch is slightly larger and is half a pound heavier, but it’ll fit in your favorite bag without breaking your shoulder. Moving up to a 16 inch will be a significant change in size and weight, so you may want to try one in person first. Unless you really want the screen space, stick to the 14 inch. You’ll love the performance improvements even with the base model, but I’d still recommend the $2499 config.

  • Post-2016 15 and 16 inch MacBook Pro: You’ll take a half-pound weight increase. You’ll love the fact that it doesn’t cook your crotch. You won’t love that it feels thicker in your hands. You’ll love sustained performance that won’t cop out when there’s heat all about. If you really liked the thinness and lightness, I’m sorry for your loss. Maybe a 15 inch Air will arrive some day.

  • A Windows Laptop: You don’t need the 16 inch to get high performance. If you want to save size and weight, get the 14 inch. Either way you’ll either be gaining massive amounts of battery life compared to a mobile workstation or a lot of performance compared to an ultrabook-style laptop. There’s no touch screen, and if you need Windows apps, well… Windows on ARM is possible, but there are gotchas. Tread carefully.

  • An M1 MacBook Air or Pro: You’ll give up battery life. If you need more performance, it’s there for you. If you need or want a bigger screen, it’s your only choice, and that’s unfortunate. Stick to the $2699 config for the 16 inch if you only want a bigger screen.

  • A Mac Mini or Intel iMac: If you’re chained to a desk for performance reasons, these machines let you take that performance on the road. Maybe it makes sense to have a laptop stand with a separate external monitor—but the 16 inch is a legit desktop replacement thanks to all that screen area. If you’re not hurting for a new machine, I’d say wait for the theoretical iMac Pro and Mac Mini Pro that might use these exact SoCs.

Two Steps Forward and One Step Back—This Mac Gets Pretty Far Like That

In the climactic battle at the end of The Incredibles, the Parr family rescues baby Jack-Jack from the clutches of Syndrome. The clan touches down in front of their suburban home amidst a backdrop of fiery explosions. Bob, Helen, Violet, and Dash all share a quiet moment of family bonding despite all the ruckus. Just when it might feel a little sappy, a shout rings out. “That was totally wicked!” It turns out that little neighbor boy from way back witnessed the whole thing, and his patience has been rewarded. He finally saw something amazing.

If you’re a professional or power user who’s been waiting for Apple to put out a no-excuses laptop with great performance, congratulations: you’re that kid. It’s like someone shrank a Mac Pro and stuffed it inside a laptop. It tackles every photo, video, and 3D rendering job you can throw at it, all while asking for more. Being able to do it all on battery without throttling is the cherry on top.

So which do you choose? The price delta between similarly equipped 14 and 16 inch machines is only $200, so you should decide your form factor first. I believe most users will be happy with the balance of weight, footprint, and screen size of the 14 inch model, and the sweet spot is the $2499 config. Current 13 inch users will gain a ton of performance without gaining too much size and weight. Current 15 inch users could safely downsize and not sacrifice what they liked about the 15 incher’s performance. For anyone who needs a portable workstation, the 16 inch is a winner, but I think more people are better served with the smaller model. If you just need more screen space, the $2699 model is a good pick. If you need GPU power, step up to the $3499 model.

We’ve made a lot of progress in 30 years.

There’s no getting around the fact that this high level of performance has an equally high price tag. Apple’s BTO options inflate the price quickly, so if you can stick to the stock configs, you’ll be okay. Alas, you can’t upgrade them, so you’d better spec out what you need up front. The only saving grace is the SD slot, which can act as take-anywhere auxiliary storage. Comparing to PC laptops is tricky. There are gamer laptops that can get you similar performance at a lower price, but they won’t last nearly as long on battery and they’ll sound like an F-22. Workstation-class laptops cost just as much and usually throttle when running on batteries. PCs win on upgradability most of the time—it’s cheaper to buy one and replace DIMMs or SSDs. Ultimately I expect most buyers to buy configs priced from two to three grand, which seems competitive with other PC workstation laptops.

The takeaway from these eleven thousand words is that we’ve witnessed a 2019 Mac Pro moment for the MacBook Pro. I think the best way of putting it is that Apple has remembered the “industrial” part of industrial design. That’s why the Titanium G4 inspiration is so meaningful—it was one of the most beautiful laptops ever made, and its promise was “the most power you can get in an attractive package.” Yes, I know it was fragile, but you get my point. It’s still possible to make a machine that maximizes performance while still having a great looking design. That’s what we’ve wanted all along, and it’s great to see Apple finally coming around. Now, let’s see what new Mac Pro rumors are out there…

The Vintage Computer Festival East 2021 Report

Here in Userlandia, you'll never find a more fascinating hive of geeks and nerdery.

The following post is a transcript of a live, unscripted podcast. It has been edited for clarity.

Hey everybody, and welcome to a special off-the-cuff edition of Userlandia. I'm coming to you today with a post-mortem report on my trip down to the Vintage Computer Festival East, held at the beautiful Jersey Shore in New Jersey. It was a pretty fun show! I had a good time, met a lot of people, and saw a lot of neat and interesting old computers, so I figured it'd be a good idea to share what I felt worked, what could be improved, and other fascinating bits.

It was quite a drive from northeastern Massachusetts. It was a pretty tough drive to make on a Friday of a long weekend, but I made it there okay. The event itself was held at the InfoAge Science Center, on the grounds of the old Camp Evans, a decommissioned army base where the military did radio and signals intelligence work. It has a lot of neat history all on its own and would probably be a really interesting museum to visit under normal circumstances. But you might be asking yourself, “Dan, aren't there certain current events going on?” Yep, that’s true! Those current events stopped me from going to Vintage Computer Festival East in 2020, when it was canceled because it was scheduled right around the time things started happening with the pandemic.

You know how everything else is going—waves my hands at everything all going on in the world. As for myself, I'm double vaxxed, and I wore N95 masks the whole time. The folks at InfoAge and the Vintage Computer Federation had pretty reasonable protocols for people and everything else. It is what it is—it’s a fairly small show. I have no idea how many people were there, but I've done my fair share of conventions over the years where I've tabled as a vendor, and I would be surprised if there were more than a few hundred people there, tops. It was still a very fun and interesting show to go and visit, and I'd like to give you guys a feel for what it was like to see it as a first timer. I’m hoping to go back in the future. This is normally a springtime show, and they’ve got another one scheduled: VCF East 2022 is set for around April or May of next year. We'll see how it goes. Maybe I'll be there with a Userlandia table! You never know.

So why would you want to go down to a show like the Vintage Computer Festival? Well, if you go to their website—which is vcfed.org—they’ve got examples and stuff from all the various vintage computer shows that have been held over the years. About a month or so ago, there was VCF Midwest, which a friend of mine who is local to the Chicagoland area went to and had a very good time. Based on what he was telling me and other video reports I've seen on the interwebs, VCF Midwest is the bigger show. There are more people, it's held in a hotel, and there are more exhibits. Well, I’m not sure if there are more exhibits per se, but there are definitely more tables and other things. Compared to various conventions I've been to over the years, it definitely has a small convention feel. That said, it was a three-day show with Friday, Saturday, and Sunday events.

Friday was mostly what they would call learning exhibits, with people giving talks and such—not so many vendors or exhibitors, since most of those people were still getting set up on Friday. The average person would be going on Saturday, and indeed at these types of shows, Saturday is almost always the busiest day. That's when there were the most people, the most exhibits, and the most stuff to buy. If you're going to pick one day to go, Saturday is probably going to be it, but there was stuff on all days that you could go see and enjoy.

Exhibits

So what I'm going to do is talk about some of the highlights of the various exhibits at the show and give some impressions, because I really had a good time, and I like supporting the Vintage Computer Federation, which helps keep a lot of these old things alive. They supply knowledge on their forums, and they organize these events for people to buy, sell, trade, and exchange information. I think 90% of this really is just talking to people and sharing information about things that you enjoy. So why don't we talk about some of the exhibits and exhibit tours at the show?

Except for the last one, these are listed in no particular order—just things that I thought of when I was driving back in the car and decided to commit to paper. We'll start off with Doug Taylor. Doug had brought several old graphics workstations that were doing 3D visualizations, graph renders, all sorts of interesting stuff—at least to me. He had a Tektronix workstation, which was awesome. There was a DEC Alpha too. He had a few other things running simulators doing scientific calculations, plots, charts, 3D graphics, and renders. I found this highly cool and informative, because as a computer graphics person I would never have seen or used any of it in real life—it was all before my time.

The Tektronix workstation.

Watching that Tektronix workstation very slowly paint in a 3D map visualization was honestly one of the coolest things that was at the show. It was old and it was slow and it was amazing because they were trying to figure out at that time how to do Z-axis occlusion to say “don't render and paint the things we can't see; just go ahead and paint the things that actually are going to be visible on the display or on the output.” Today your phone can chew that up and spit it out and you'd have no problem with it at all. But I thought it was just very interesting and fun to see that all in action in real time. You can make a screensaver out of that or something—people probably have. I could just put it on in the background and enjoy it all day.
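
If you're curious what that occlusion trick boils down to, here's a minimal, hypothetical depth-buffer sketch in Python. It's purely my own illustration of the idea, not anything the Tektronix actually ran:

```python
# A toy depth buffer: keep a pixel only if its z is nearer than whatever
# was drawn there before. This is the "don't paint what you can't see" idea.

WIDTH, HEIGHT = 40, 20
INF = float("inf")

depth = [[INF] * WIDTH for _ in range(HEIGHT)]  # nearest z seen per pixel
frame = [["."] * WIDTH for _ in range(HEIGHT)]  # the "screen"

def plot(x, y, z, shade):
    """Draw a point only if nothing closer already occupies that pixel."""
    if 0 <= x < WIDTH and 0 <= y < HEIGHT and z < depth[y][x]:
        depth[y][x] = z
        frame[y][x] = shade

# Two overlapping rectangles at different depths; the nearer one wins
# wherever they overlap, no matter the drawing order.
for y in range(4, 14):
    for x in range(6, 26):
        plot(x, y, z=5.0, shade="#")   # far rectangle
for y in range(8, 18):
    for x in range(16, 36):
        plot(x, y, z=2.0, shade="@")   # near rectangle

print("\n".join("".join(row) for row in frame))
```

Run it and the nearer rectangle masks the farther one wherever they overlap, regardless of draw order, which is exactly the "only paint what's visible" behavior that machine was working out in real time.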

I've found that a lot of the attention given to vintage computers is a bit skewed—not just at shows, but on YouTube and other places as well. A lot of it is driven by games, and that's fair, because for a lot of people producing content today, their experience was with games. That was true for me too—when I was a kid, games were certainly a big part of my computer experience. That's why systems like the Commodore 64, the ZX Spectrum, even the old Apples have so much of a presence: a lot of people played games and want to go back and play those games again. It's a lot harder to find people bringing back stuff that was used for productivity and the like. So I was very happy to see a lot of stuff that was not just games. There was a good balance of games and other applications running on all of these old computers, and I really enjoyed that quite a bit.

One thing I found very amusing was a fellow named Alastair Hewitt. He was running a project about building a microcomputer out of TTL chips that would connect to modern peripherals and things like that. It's actually a very cool project—a link will be in the show notes—but what I found most amazing was the monitor he was using: a LaCie Electron Blue. I love those LaCie monitors. When I saw it, I was like, “Heeeey,” because I owned one of those. I worked with LaCie Electron Blue monitors in the graphic arts industry, and I bought a 19 inch ElectronBlue III in like 2001 or something like that. That was a $500 monitor in 2001 money, and I still regret giving mine away. Had I known he would have one there, I would have brought him the hood! In production environments, CRT monitors had hoods to shade them from ambient light to prevent contamination of the image, and I still have mine here in my closet. Like, “damn dude, if I had known you had that, I would have brought it down and given it to you.”

A LaCie ElectronBlue II.

He also had a BeBox, which was very cool because I've never seen one in person. I don't know if I could ever be productive on a BeBox, but I liked seeing it all the same, because part of the fun is seeing machines in the flesh that you might not have seen before—actually touching them and using them. It’s kind of like a traveling museum in some cases, where people bring all of their materials so that other people have a chance to enjoy them.

Something else that I thought was really fun, amusing, and kind of unusual was in one of the exhibit rooms: a series of computers all running MS-DOS or x86 emulators that you wouldn't expect to be running them. I think they were calling it the x86 challenge or something to that effect. So you had machines like an Apple IIGS with a PC Transporter and an Apple IIe, also with a PC Transporter. There was an Apple Lisa running some kind of SoftWindows type of thing, which I thought was neat. I didn't even care about it running Windows—I’d never used a Lisa before in my life, so it was fun to be able to go and poke around with it. There was also a IIci that had some kind of PC compatibility card in it.

Gotta love rolling shutter capture artifacts.

Lastly, there was an Acorn Archimedes. Yep, the good old Archie. It was my first time actually using a RISC OS/Acorn machine in real life. Acorn had those PC podules for the Risc PCs, and they probably had something similar for the Archimedes as well that allowed it to do that. That was just really fun. I enjoyed getting hands-on with an Archimedes. Those were not popular here in the United States at all, so it's definitely a rare thing. Once again, you can't really see that without going to a show like this. The odds of something like that coming up on Craigslist or Facebook Marketplace or whatever are incredibly low.

The x86 challenge was really just one corner of one exhibit hall that featured a lot of IBM and other types of things. They had a whole World of IBM exhibit. There were PS/2s of all different kinds: the all-in-ones, the portable PS/2s, and my old PS/2, a Model 30 286. I saw them and was all “aw, straight out of my heart.” It wasn't just PS/2s—there were also PC ATs, PC XTs… basically anything that was pre-1992 IBM, they had it all there. They even had one of those giant 19 or 20 inch IBM CRT monitors, which I had never seen before. I'd only seen the fairly small PS/2 monitors that were floating around the show. Part of the exhibit was OS/2 through the years. They had three machines laid out in a row running OS/2 2.1, OS/2 3, and OS/2 Warp. You could go from machine to machine and see the evolution of OS/2—and the way that OS/2 fell apart. I've used OS/2 in virtual machines, never on actual hardware, because why would I? But I enjoyed it quite a bit.

OS/2, in the flesh.

It was nice to see the actual evolution of it, from 2.x all the way up to OS/2 Warp. IBM had a lot of neat and interesting things. They had their own scripting language, REXX, which people might be familiar with on the Amiga as ARexx. They had their object model framework, the System Object Model, which they tried to adapt OpenDoc and other things into. And the GUI was just really nice—responsive, for the most part. The 2.x machine unfortunately didn't have as much RAM as it should have, and the exhibitor apologized profusely, but it was still fun to go and poke at it and see what was going on in that particular machine. Maybe it's gotten me willing to try OS/2 a little more and actually dive into it. For a quickie 10 minute session of interacting with it, it was nice to see the exhibit represent not just Windows and DOS, but the other parts of IBM's computing legacy as well.

That World of IBM stuff was really cool. Unfortunately, some of the machines were having trouble as the day went on. That's the risk with these old computers: they do break. They broke back in the day, and today they have on-again, off-again trouble with floppy drives and such. Fortunately people had parts, and there were people who knew how to fix things and get stuff back up and running again. But if you're going to be presenting at one of these kinds of shows with your own hardware, you've got to keep that in mind when you're hauling it all around.

Some other exhibitors had some extremely cool tech. We had Fujinet, which people have been talking about lately. It started off on the Atari, and it's a kind of network-attached intelligence—a peripheral that lets a retro computer reach services over your own network. They're expanding it to more systems, too. I'm interested in picking up the Apple II version to use with my IIGS, because I think that would be interesting. They had the Fujinet weather display up on the monitor, and you'll find out later that weather was kind of a theme at the show.

I talked with Tom Cherryhomes, who was a fellow there doing the presenting—very affable guy. I heard a lot of interesting things about Fujinet and how they were planning on bringing it to other retro computers. I have a feeling that these types of bridges to the outside world are going to become more and more important when it comes to retro devices—to at least give people a reason to use their old computers other than just to say, “oh, I'm going to boot it up and play a game for 15, 20 minutes and turn it off.” It's a way to try to make things a little more overall useful in a modern context. I applaud them for it and I hope more people pick up Fujinet and then it gets more popular.

Another cool thing was the theme of the show, which was adventure gaming, and at some of the exhibits there was a lot of it going on. Scott Adams, who wrote many adventure games, was a guest of honor, and his panel was very interesting. A lot of other people were in on the text adventure theme as well: people playing live text adventures, a multi-user dungeon, displays of old Infocom games, things like that. One thing that came up was Ken and Roberta Williams' new interactive adventure game. Keeping with the adventure theme, another exhibitor was Marcus Mira, who was there playing the hype man for that game, as he has been for a little while.

The details on that game are still a little scarce at the moment. I mean, it's been announced, and Marcus himself is doing a lot of work on it—3D modeling and other stuff. Marcus offered to teach me some 3D modeling, and, uh, hey, if you can make that happen, I'd be happy to stop by and see. As a regular artist I'm average at best, and sculpting was always my weakest point, so I would definitely be willing to try it sometime. He had an Apple III set up too, and other machines at his table running various Sierra games. There was a C128 that Bill Herd signed, which was pretty cool. But most of it was talking about the new game and hopefully trying to get people interested in it.

The Apple III.

I was never a Sierra On-Line guy—I was always a Lucasfilm guy, because my platforms didn't really have Sierra games. So I never really played King’s Quest or stuff like that when they were contemporary; it was always after they were past their prime. But I'd be willing to check it out and see what's going on. Marcus was very generous with his time, and at least within the span of questions he was allowed to answer, he gave some pretty good information about what people should expect from a new game by Ken and Roberta Williams.

But I think the exhibit that really stole the show, and the one that everybody was just completely 100% on board with, was Smooth Jazz and Stormy Skies. These people had two tables of vintage Weather Channel WeatherStar equipment. This is the gear that produced the images, slide shows, and graphics you would see when you tuned to the Weather Channel in the eighties, nineties, and early aughts. They had a bunch of CRTs set up, basically showing live local weather as if it were the old Weather Channel. It was great. There was some music too—you know the kind of music that you would hear on the Weather Channel. “And now the forecast for Belmar, New Jersey: cloudy with a high of 70.” They just ran that all weekend long.

I have to say, a lot of the fun I had at the show was just sitting there and watching the freaking weather. It certainly attracted the most attention of any exhibit, simply because they had a lot of space and a lot of equipment. You could come up and see all the various stages: the SGI-based equipment, the Intel-based equipment, their homegrown equipment. Just seeing it on all these old TVs—like an old Commodore monitor running the Weather Channel—something about that seems very appropriate to me. I would highly recommend checking it out if you have any affinity for the old aesthetic of the Weather Channel, or just how weather was delivered over the past 30 or 40 years. I enjoyed it quite a bit.

Classic weather for classic computers.

In addition to these sorts of exhibitors, who were there to talk about various things like the Heathkit computers, there were also people there to sell. These shows usually have buy ’n’ trades, and there was a consignment and free table, but there were also dealers selling things, which is cool—they had a lot of interesting things that I hadn't seen before. I think compared to VCF Midwest, there was definitely less stuff for sale. I stopped by and purchased a SCSI external enclosure from one of the fellows who was selling a whole bunch of cards of different provenance: ISA network adapters, Ethernet adapters, serial cards, parallel cards, all sorts of neat doodads that unfortunately were not on my doodads-to-buy list. It was still cool to see them all together.

Another thing to do was take photos, and I took a lot of pictures—I'll post some in the blog post. Mostly it was to get pictures of old computers that I can use if I ever write blog posts about them, because finding photos that aren't encumbered by copyright is kind of difficult, and I don't like taking things from people. It's always good to ask permission from photographers, but otherwise I try to stick to public domain images that have been released, instead of going to Google image search and just right-clicking to take some random person's photo. It’s not my photo, it's theirs, and I would rather use my own pictures if at all possible.

Panels and Keynotes

Aside from the vendors and exhibits, there were talks, panels, and keynotes. I saw two panels. The first was Mike Tomczyk, who was the first marketing executive at Commodore for their home computers. He has a very interesting life story. He talked about his experience in the army and how it prepared him for a computer market that was “business as war,” as Jack Tramiel put it—and he definitely knew what war was about, because he had been in one. Mike talked about the early computing days, when he knew people at Atari and Apple and so on, and how he decided to go with Commodore and built those early marketing campaigns for the VIC-20.

Mike Tomczyk.

Mike was part of the William Shatner commercials that everybody has seen. He was also part of getting things into magazines and changing their advertising strategy. Mike’s time with Commodore lasted until, I want to say, 1985-ish—I believe that was around when he left. So he was part of those early days when they introduced the Commodore 64. It was interesting to hear him tell some Jack Tramiel stories I hadn't heard before. They might've been out there, but I personally hadn't heard some of them. When he was asked about being in the Nazi prison camps, Jack would say, “I try not to live in the past. I try to live in the future.” And for a guy who was in the computer business, I think that was an apt way of thinking about it.

Mike didn't gloss over the problems at Commodore. He was willing to talk about how Jack's way of doing business sometimes chased short-term gains at the expense of the long term, to self-destructive effect. As he said, business was war and cutthroat, and there are positives and negatives to that. I thought it was really interesting to hear an insider's perspective on all of it, because I was never much of a VIC-20 guy, and they talked about how important it was to build something inexpensive.

One thing that was prevalent in Mike's talk was how he believed in making computing affordable for everybody. He wanted the VIC-20 to be under $300, and they had arguments with engineering about changing designs and other things to get there. To be fair, a lot of his engineers were willing to work with him on that. They produced the VIC-20, which compared to the Commodore 64 is definitely underpowered and has its share of problems. But the VIC-20 was a pretty popular machine; it brought in a lot of revenue and kept Commodore going. It would have been nice to have heard some of these Jack Tramiel anecdotes before I went and did my Commodore episode a couple of weeks ago, but c’est la vie.

Following Mike was Bill Herd, one of the designers of the Commodore 128, who also worked on the TED series of machines, like the Plus/4 and the C16. Bill was wearing a MOS Technology t-shirt, which was nice to see. Now, I kind of knew what to expect going into Bill's panel, because he has done these panels before. I think one thing that really makes him a good public speaker is that he knows some of this stuff is greatest hits material. It's been on YouTube; he’s given talks before about how he put certain things together on the 128 or the TED machines. Here, he did it in such a way that it wasn't the same as how I've seen him tell it before. He knows how to mix things up and play to the crowd a little bit. You don't know what level the audience is at when you're giving these kinds of talks, and some people there had probably heard him say these things before. So for him to go through and say, “Hey, this is what we did at Commodore. This is what I did. These are the machines I made. These are the troubles we ran into,” and still keep it fresh and interesting is a real skill.

Bill Herd.

And that's probably why people enjoy Bill so much: he has a candor that some other people don't have. He's willing to say, “Hey, you know, this is where we did things right, and this is where we might've screwed up a little bit.” It's an honest appraisal of what they were doing back in the day. You can go watch the livestreams on the Vintage Computer Festival YouTube channel. They'll probably divvy them up into separate panels eventually, but the livestreams are there and you can check them out at your leisure. That's pretty much how I spent Saturday: going to those panels, going to all the exhibits, buying stuff, and walking around seeing everything else.

Hanging Out on Sunday

Sunday was a much quieter day. I spent most of it just wandering around, seeing what was going on in consignment, and hanging out with various people. In one corner of the exhibit hall they had the Mac shenanigan zone, which was anchored by some YouTubers. We had Steve, AKA Mac84—you might've heard him before on the Icon Garden. There were also Mike from Mike's Mac Shack and Sean from Action Retro, all in this corner with their prototype Macs, a MUD running on a Quadra 950 server, some clones, and all sorts of interesting things. I hung out with them for most of Sunday afternoon. It was cool to put some names to faces and talk to people in person.

Yours truly on the left, Steve on the right.

We had a little bit of shenanigans ourselves, because Sean had accidentally torched his G4-upgraded clone by trying to boot it with a Leopard disc. We wound up doing a little on-the-show surgery to reset something using a spare video card from one of Steve's clones. You never know quite what you're going to get. It was also neat to see the prototype iMac G5 with the CompactFlash slot, which was deleted in the shipping model. We've heard of these things and seen them in pictures, but it's nice to actually see them in person. I'd recommend subscribing to these guys’ channels—they’re all good, and they all talk about old Macs. If you're interested in those kinds of old things, they're cool guys to know and hang around with.

Like I said earlier, if you want to see the majority of the show, you're better off going on Saturday than Sunday. But one of the nice things about a Sunday at a show is that there are fewer people and it's more relaxed. It's easier to hang out when there's less of a crowd—you can break off into smaller groups and just chit chat. It's also easier to do a little more negotiating on a Sunday if you're interested in buying stuff, though by then I had already done all of my purchasing.

My Big Get: A NeXTstation!

And speaking of purchasing: I bought a monochrome NeXTstation. That's right, a NeXT slab—I now own one! I was thinking really hard about buying the color one, but the problem was, would my monitors work with it? I had to think about it a little, and that hesitation cost me: by the time I said, “wait a minute, one of my monitors has RGB BNC connectors,” the color workstation was already gone. So I wound up buying the monochrome NeXTstation for 75 buckazoids. It doesn't have a hard drive in it, but the machine otherwise works, so I just have to put a SCSI2SD or something else into it. I can wire up some way to hook it up to a monitor, and I have accessories that will work for keyboards and mice. I'm looking forward to giving that machine a shot. Plus, I've always wanted a NeXT just to have in my collection. It's a pretty good example of one, and it's in very good shape, so even if it's just a display piece, I'm all for it.

NeXTstations, and I got one!

I bought the NeXT from Matt Goodrich of Drakware. He was also selling things like ADB to USB, SGI to USB, and NeXT to USB converters—basically ways of using modern keyboards and mice with older computers. There's still plenty of ADB gear out there, but sometimes you just want to use a nice new keyboard and mouse. Those things did what they said on the tin: he had a nice old Mac SE connected with one, and it worked. I’d have no complaints if I needed one. He also had a complete NeXT cube with a SCSI2SD, monitor, keyboard, and mouse. He listed it for a thousand dollars, and somebody bought it for a thousand dollars. Good for you, man—I'm glad that NeXT found a home. It was too rich for my blood, even though I would love to have a full NeXT setup. A thousand dollars was well out of my budget for big ticket items; I had told myself, “well, I would allow a $250 big ticket item,” and I didn't even spend that much. The NeXTstation was much cheaper than I thought it would be.

Final Thoughts on VCF East 2021

So after I got home, what would I say about the show overall? I enjoyed it a lot—it was a fun time. If you like old computers, you'll definitely have a good time there. I saw somebody with a Wang Professional Computer, which was nice to see as somebody who lived in the shadow of the Wang towers. There were a lot of unusual things, like the Heathkits. I have no attachment to those Heathkit machines, but it was nice to see them and actually play with them. And hopefully that gets other people involved, to say, “Hey, now I'm interested in collecting and restoring these.”

I really enjoyed my time at the show, but I hope we see some improvements at future ones. I can definitely tell that this show is a labor of love—it's run by volunteers, as most of these conventions are. But one thing that could be improved is how they handle the consignment section. Consignment would open up at nine o’clock, and if you were not there on time you could miss out. And I will say the pickings were slim half an hour after things opened—you could definitely tell that a lot of things got picked off very early. It was very hard to survey what was available, and you might not even have known something you wanted was there.

I don't really see how they can improve that in an equitable way without other knock-on effects. What I would do is open the consignment hall at nine o'clock for people to browse and bring stuff in and set up, but only allow purchases after, say, 10 o’clock. That way people at least have a chance to see what's there. Yes, stuff came in at various points during the day, but you would have had no idea what was coming and going unless you hung out in that hall all day. Truthfully, for a lot of the day it was kind of empty. Stuff came and went slowly, and you could have been in there for an hour and thought, okay, I've probably seen enough.

Unless you had some kind of notification system to know when things were going on sale, you would’ve had no idea when to check back to even be able to buy something. So I didn't get anything in the consignment hall. I was actually going to put a G4 in there, but fortunately somebody contacted me before the show and I was able to trade it for a PowerBook G4, so I didn't have to worry about any of that. The other stuff I brought with me was specifically to give to Steve: a Power Mac 8100 with a G3 card and a beige G3 tower. Hopefully we'll be seeing those on his channel in the near future.

Something else to improve would be the handling of the VIP guests. I know they had some people out in the front foyer at times, and at the end of Bill’s talk, someone said “you’ll see him in the cafeteria.” I'm like, well, where's the cafeteria? Is that the staff room where they're having the lunches, or is that in the consignment area? It wasn't really clear. Most conventions have dedicated tables for the guests of honor and such, and I think that would have made sense here. I had no idea where to find Bill or these other people. Maybe they didn't want people coming by to talk to them; maybe they just wanted to walk around and have fun. And they did—I saw Mike Tomczyk hanging around at various tables. But if I were running things, I would try to figure out a way to make those guests a little easier to find. I'm not saying they need to be chained to a table the entire show, just something like, “Hey, Bill Herd is going to be at X table at Y time during the day. Come by, buy his book, shake his hand.”

A thing I think they did really well: even if you weren't at the show, you could still see everything, because they livestreamed the keynotes. If you wanted to see the Bill Herd talk, or Mike’s talk, or Bill Mensch, or Scott Adams, you could just go on the VCF YouTube channel and watch them, which I think is very fair. It’s tough for some people to go to these things—maybe they have other people living in the house who are higher risk. I'm a single guy with nobody else living in my house, so my exposure risk is probably lower than other people's. Giving people the ability to see the talks without having to be there was a smart move.

So the question is, will I be back in April? The answer is maybe. I enjoyed it a lot, but I have a feeling I would be seeing a lot of the same stuff if I went back in April, and I don't know what they're going to do for more guests and things like that. It is kind of a long drive. Normally if I go to New York City for things, I take the train. I don't like driving through there, and even just going around New York City is a pain. Even if you go over the Tappan Zee and take the long way around, it's still a five to six hour drive. For a long weekend that’s doable—I’ve done it before—but it’s still a slog. Also, getting to the show pretty much requires a car. If you want to take the train, there is one that goes from Penn Station down to Belmar, but then you'll need to get a ride from the local station or arrange some other transportation to the show. It might be a good idea to bring your car anyway, because if you decide to buy something, you need a way to lug all that stuff home.

Next year I'm definitely going to try to go to VCF Midwest, mainly because I know people in the area and it would be fun to go with other people—and it is a bigger show. Will this show grow? I don't know. But if you have any interest at all in these old computers, or even just computing in general, there's other stuff to see at InfoAge as well. They have a World War II museum, and other things going on that would certainly interest you if you have a family or young kids. I saw a decent number of teenagers and other people who I could tell are getting into this, because it's a fun thing to get into.

I hope that winds up bringing more people into the fold, because these machines are getting older, and the reality is we’re all getting a little older too. So I'll close out saying it was nice to see some people and see some new things. And hey, now that I've got a NeXTstation, maybe I'll be able to make some more stuff about NeXT. Thanks for listening, and check out the Vintage Computer Federation—see if they have an event in your area. They have VCF East, VCF Midwest, and VCF West, which is in Portland, Oregon. So make sure to check it out if you're at all interested in old computers and old tech.