Tech luminaries we lost in 2017

January 1st, 2018 10:32 AM
by
Filed under People;
Comments Off on Tech luminaries we lost in 2017

Five years ago this month, my tenure as an editor at Computerworld ended. But that wasn’t the end of the story: the many colleagues I’d worked with extended an invitation to continue freelancing for the publication — an invitation I gladly accepted.

While Computerworld was happy to publish Apple II articles when they came for free from a staff writer, it’s harder to justify paying a freelancer by the word to cover a 40-year-old computer. So my articles over the last five years explored other topics, including an annual tradition that I inadvertently began: a slideshow of tech luminaries we lost.

It was October 2011, and Steve Jobs had just passed away. I was on the features team — a group of editors who met biweekly to discuss big ideas for stories. Compared to the daily news grind, a feature could take at least a month to write and was almost always farmed out to a freelancer. Several websites were disgruntled that Steve Jobs’ passing had gotten more publicity than that of Dennis Ritchie, who created the C programming language and co-created Unix. I thought this a good opportunity to shine the spotlight on other overlooked industry veterans, so I suggested we publish a feature in time for Halloween that asked the question: "Who’s next?!" What other aging founders were we likely to soon lose?

The features team leader politely said, "Ken, that’s a really terrible idea… but there may be a good idea we can get out of it."

Thus was born the annual end-of-year slideshow that looked back on tech luminaries we lost in that calendar year. For the next several years, including during my transition from editor to freelancer, I watched other writers assemble the slideshow. In 2014, I was honored to be assigned the story, finally getting the opportunity to execute the concept I’d proposed years earlier.

That first year, I included Bob Bishop, whom I’d had the pleasure to meet and photograph at KansasFest. I skipped 2015 but wrote the slideshow in 2016 and again just last week for 2017. This latest lineup was the first time I got to choose which luminaries to honor, instead of having them assigned to me. It made it much easier to ensure a diverse cast when that virtue was baked in from the beginning. It also allowed me to include luminaries who might not otherwise have made the cut at Computerworld, such as Keith Robinson of Intellivision fame.

While there were no Apple II legends in this year’s roundup, Apple Computer Inc. was doubtless influenced by the heroes we lost in 2017. Robert W. Taylor conceived of the ARPAnet, which became the Internet; he also worked at Xerox PARC, from which Steve Jobs got the ideas for the graphical user interface, the mouse, and more. Charles Thacker was another PARC alumnus who helped develop the Xerox Alto, the early computer that embodied these concepts.

Writing this slideshow is a morose way to lead up to the holiday season — but I take heart in being able to carry forward the legacies of these early innovators and ensure their stories are known. For everything they did for the Apple II and its users, I salute them.

Laboratory origins of the Apple II

August 18th, 2011 4:02 PM
by
Filed under History;
1 comment.

The past week has been one of milestones: the IBM PC turned 30, just days after the Web itself turned 20. Of course, both are still whippersnappers compared to the Apple II, but it’s timely to consider the birthplace of that and other innovations.

Steve Wozniak was of course the genius behind the Apple II, but many of the ideas found in it and other computers were devised in collaborative environments. A recent story looks at six computer labs that gave birth to the digital world. The penultimate page is dedicated to the many innovations to come out of Xerox’s Palo Alto Research Center (PARC) and how they influenced Apple’s products:

These inventions culminated in 1973 with the Xerox Alto, the first GUI-driven personal computer (check out the three-button mouse!) Sadly the Alto was never sold commercially, and only 2,000 units were built — but don’t worry, its legacy lived on with the 1977 Apple II, the first mouse-and-GUI-driven home computer, and the 1984 Macintosh.

Although the Apple II did indeed have a mouse, I (as do some of the article’s readers) think the author intended to reference the Apple Lisa. Steve Weyhrich’s history of the Apple II supports that reading:

Xerox’s Palo Alto Research Center (PARC) develops the “Alto”, a breakthrough computer which used a pointing device called a “mouse”, a bit-mapped graphic screen, and icons to represent documents. Also, it had a 2.5 megabyte removable disk cartridge and the first implementation of Ethernet. It cost $15,000 just to build it, and only 1,200 were ever produced. This computer and the Xerox Star were the inspirations within a decade for the Lisa and the Macintosh.

Readers also take exception to a statement on the preceding page that "in 1981, IBM released the Personal Computer, the first home computer made from off-the-shelf parts". Was not the Apple II that machine? At least one reader says yes:

The Apple II was all off the shelf parts 6502 processor, 4K memory (8102 DRAM?), etc. There were no custom IC’s. The only “non off the shelf” were things like the power supply, keyboard, paddles, etc. The same was true for other microcomputers sold before IBM played catch up with the 5150 (including the Altair 8800). On all of the microcomputers, (including the IBM 5150) the design, circuit board, case, keyboard, expansion slots, power supply, were custom as well as the BIOS. The IBM 5150 helped to mainstream microcomputers (outside the classroom) because “No one ever got fired for recommending IBM” — a common quote from that era.

What say you, dedicated retrocomputing enthusiasts? Did the author of this gallery do his homework, or are a few of his facts a wee bit off?