An Arduino keyboard for the Apple II

November 20th, 2017 7:50 AM
Filed under Hacks & mods.

My first computer was an Apple IIe that my father purchased to help manage the family business. Given the wealth of games that were also available for the Apple II, it was inevitable that its use spread to his four sons. All was going well until one of us reached for a box of floppies on the shelf above the computer and dropped it on the keyboard, busting a keycap. My father angrily decreed his expensive business computer was henceforth off-limits to us kids — a restriction that I don't recall lasting more than a week.

With the exception of that mishap, our keyboard always performed admirably, without failures or flaws. I don't recall the Apple IIe showing any other signs of wear, tear, or distress in the five years we owned it.

The same can't be said for Max Breedon, who recently unearthed his Epson AP-200, an Apple IIe clone he acquired from a pawn shop twenty years ago. The keyboard decoder chip, a C35224E, was non-functional — but that didn't stop Breedon. After consulting Mike Willegal's keyboard page and doing some testing of his own, Breedon put an Arduino on a daughterboard that connects the keyboard to the motherboard. His solution is actually better than the original, since it speeds data entry of program listings found on the Internet — something the clone's manufacturers never anticipated:

[T]he Arduino can not only decode the keyboard but also you can upload text directly into the Apple as if you typed it in. This is achieved through serial communication from your PC to the Arduino: the Arduino is listening for serial data and any that it receives it converts into keypresses and pipes it into the Apple. This means that you can cut and paste basic programs directly off the internet and upload them into the apple as if you typed it in on the actual keyboard!

Arduino keyboard

That's a neat trick! I've never used an Arduino, so I wouldn't be able to duplicate this functionality — but it could be the underpinnings for a product I'd purchase for an official Apple II. There's more technical information on Breedon's website, should anyone else wish to investigate or re-create his work.

(Hat tip to John Baichtal)

Apple IIe vs iMac throwdown

August 8th, 2016 9:22 AM
Filed under Mainstream coverage.

In 2010, the Apple iPad was brand new, having just been released that past April. At the time, I was an editor at Computerworld, where I provided annual coverage of KansasFest, the world's premier Apple II convention. Unlike Juiced.GS magazine, whose readers are retrocomputing enthusiasts, Computerworld's website had a more general audience, requiring me to connect our favorite 8-bit machine to something more modern and relevant — such as the iPad.

Thanks to the loan of Loren Damewood's iPad and Tony Diaz's Apple Graphics Tablet, I produced the photo gallery "Face-off: 1979 Apple Graphics Tablet vs. 2010 Apple iPad". Comparing a drawing tablet to a tablet computer was, of course, ridiculous; a fairer matchup would've pitted the Apple Graphics Tablet against a Wacom tablet. But where's the fun in a fair fight?

The esteemed WIRED magazine adopted a similar philosophy when they recently pitted ancient technology against new. They took an Apple IIe and an iMac — coincidentally, my father's first and last computers — and compared their specs, dimensions, expandability, and more. The resulting smackdown is this two-minute video:

When I bought my first Macintosh in 1997, I did so begrudgingly, to comply with the requirements of my university. At the time, I felt my Apple IIGS could still do everything I needed from a modern machine. Times have changed, of course, and an Apple II is no longer a viable primary computer for someone who wants to engage in mainstream multimedia, gaming, and social networking. But it's fun to see WIRED still acknowledge some of the foresight Apple had in designing their first machines, giving them strengths that modern computers lack.

Today's computers may be more powerful — but that doesn't necessarily make them "better".

(Hat tip to David Schmenk)