A generational hardware gap

January 10th, 2011 12:10 PM
Filed under History.

Anyone who has been using computers for a few decades has the questionable pleasure of reflecting on how far technology has come. We remember the massive leap forward that 3.5" floppy disks represented over the formats that preceded them, which makes us appreciate the volume and affordability of modern storage all the more.

Newcomers to technology don’t have that historical context — but rather than berate them for not knowing how good they have it, Montreal journalist Jean-Christophe Laurence brought them face-to-face with hardware older than they are. He presented several kids with equipment such as a Nintendo Game Boy, an LP record, and several floppy disks. With nary a hint as to their purpose, the children were tasked with determining the nature of the enigmatic tech. He recorded the results, which are presented in French with English subtitles:

It’s a creative scenario, as it doesn’t try to impress upon the students how different this stuff is from what they know: when they guessed an LP was like a CD, nobody said “Yes, but it holds this much less data, and its access times are this much slower.” It’s more a matter of function and design than of better or worse, which is likely to be more educational and to leave the kids more appreciative of (and familiar with) what’s come before. (Maybe they’ll pick up the better-or-worse comparisons later, by being taught programming on retrocomputers.)

It’s also similar to what older generations have to do when confronted with new technology. We’ve heard those old chestnuts of newbies mistaking a CD-ROM tray for a cup holder, or a mouse for a foot pedal or a TV remote. Those mistakes happen because users are familiar with cup holders and channel changers, so they bring those analogues to their new experiences. It’s impressive how spot-on many of these children’s guesses are, especially when they have to rely on modern metaphors to make them. Although it’s useful to have a frame of reference by which to learn new skills, as they demonstrated when confronted with a 3.5" floppy, it’s also occasionally necessary to abandon old ideas to grasp new ones.

What do you think? Should these kids have been able to identify these objects? Would you have been able to?

(Hat tip to Genevieve Koski)

The history of game design

November 22nd, 2010 11:11 AM
Filed under Game trail, History.

My alma mater offers a major in interactive media and game design, a field that didn’t exist during my time there as a student. It’s one of many such programs that have popped up across academia in the past decade, in response to the growing popularity and cultural acceptance of video games as an industry and pastime.

Yet electronic game design predates its study by decades. When there were no templates, exemplars, formulae, or rubrics, pioneering programmers experimented with creative and risky innovations, setting the course for thirty years of successors. Although modern games can still be ingenious, such variation from popular game design is often relegated to low-budget “indie” games rather than the big-budget blockbusters sold at retail, which are almost always sequels to existing intellectual properties (IP). This was not the case in the Apple II era: visionary games such as Lode Runner, Oregon Trail, and Choplifter were enormous successes and are remembered fondly today.

When today’s students are educated in game design and theory, it only makes sense to reflect on historical successes as well. Some academic institutions have wisely chosen to complement their modern game design programs with this retrospective look. Such a course was once offered at Stanford as the “History of Computer Game Design”.

This course provides a historical and critical approach to the evolution of computer and video game design from its beginnings to the present. It brings together cultural, business, and technical perspectives. Students should come away from the course with an understanding of the history of this medium, as well as insights into design, production, marketing, and socio-cultural impacts of interactive entertainment and communication.

The course’s required reading includes Dungeons & Dreamers, a book I gave high marks when I reviewed it for Juiced.GS, thanks to its analysis of the 1970s and that era’s intersecting enthusiasms for Dungeons & Dragons, The Lord of the Rings, and personal computing. Considering such engaging assignments, I have to wonder why Stanford’s course wasn’t popular enough to become a regular part of the school’s curriculum; sadly, the “History of Computer Game Design” course does not appear to have been offered since 2005.

This class was part of a larger interactive project that has likewise not been updated in nearly a decade. It had an ambitious and socially relevant mission:

The aim of this project is to explore the history and cultural impact of a crucial segment of contemporary new media: interactive simulations and video games. Once the late-night amusement of nerds and hackers who built “Space Wars” and the “Game of Life” in the 1950s and 1960s, video games and interactive media have emerged as one of the most vibrant elements of today’s entertainment industry. However, despite the growing popularity and legitimacy of video games, the importance of the medium itself has all but eluded notice by most scholars and media critics. As a result, this project seeks to ground the history and study of video games within a framework of rigorous academic discourse.

While Roger Ebert may contend that video games are not art, others have suggested the better question is: “Can artists express themselves through the video game medium?” I feel the answer to that is an obvious “Yes!”, as demonstrated by games from the Apple II to today. It’s only a matter of time before game design history is as common a field of study as art history, film theory, and music appreciation.

In addition to the aforementioned Dungeons & Dreamers, other books providing academic perspectives on game design’s history include Twisty Little Passages by Nick Montfort and Dungeons & Desktops by Matt Barton.

(Hat tip to Jason Scott)

The devolution of user engagement

November 1st, 2010 10:12 AM
Filed under Musings.

A colleague and I recently had a friendly debate over whether to use one space or two after periods. We agreed that, whatever our personal preference, we should settle on the standard of the publication for which we’re writing. In her work as a Web editor, she occasionally has to remove those extra spaces, a chore that was recently made easier when a co-worker showed her how to use the find and replace function to do so.

“Did you not know you could find and replace punctuation,” I asked, “or did you not know how to find and replace at all?”

“I didn’t know about it at all,” she clarified. “I’ve been living under a software rock.”
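
Her co-worker’s trick amounts to a single pattern substitution: find every run of two or more spaces that follows sentence-ending punctuation, and replace it with one space. As a minimal sketch of the same idea outside a word processor (this is Python, with a function name of my own invention, not anything she actually used), it might look like this:

import re

def collapse_sentence_spacing(text):
    # Hypothetical helper: replace two or more spaces after ., !, or ? with a single space.
    return re.sub(r'([.!?])  +', r'\1 ', text)

draft = "One space or two?  Whatever the publication prefers.  Every time."
print(collapse_sentence_spacing(draft))
# prints: One space or two? Whatever the publication prefers. Every time.

Any decent editor’s find-and-replace dialog can express the same rule, which was precisely the revelation at hand.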

I was astonished at her lack of familiarity with this basic word processing function. This oversight is not representative of inability: she had finished high school and college and been accepted into an esteemed graduate program in publishing, demonstrating a facility for learning. She’s also not alone in finding software foreign, as I’ve met many people who are happy to use these machines in a most inefficient way on a daily basis. Most consumers treat programs as something to be accommodated instead of tamed: they design their workflows around what the software expects, which is the antithesis of ergonomics.

As someone who grew up with the open architecture of the Apple II, I find this pattern unnerving at best, and it’s one I want to understand better. To what can we attribute this regression?

The improved accessibility of computers is certainly a factor, as there is now a lower barrier to entry. Computers of three decades ago required a basic understanding just to boot the machine and then run the software. Yesterday’s lack of intuitive graphical user interfaces (GUIs), online help systems, and large installed user bases meant each person was alone in deciphering the programs — or writing their own. Such arcane knowledge is merely optional today.

A consequence of this accessibility is diminished engagement on the part of users. Since they no longer need to hunt for features and commands, they no longer do so at all. They take everything at face value, not realizing how much of the program’s potential goes untapped. There isn’t even printed documentation that they can peruse to discover all the functions they’re not using.

You may think that, with the increasing prevalence of closed operating systems such as iOS, this accommodating mindset of computer users will become a necessary one. But a user needn’t have access to a computer’s command line to be able to use software efficiently. The first thing I do in any application I install, be it for a desktop computer or a cell phone, is investigate the preferences, so that I better understand the options and behaviors available to me. When I installed Microsoft Office 2011 for Mac last week, I immediately set Word to save its files in .DOC format by default, eschewing the more modern .DOCX. This single customization prevents the numerous headaches that arise when friends with older software can’t open my files. Most users wouldn’t think to explore that possibility.

That’s because making computers friendlier has, ironically, also made them less engaging, and thus less educational. Growing up with computers that made few efforts to be understood taught me how to understand today’s computers. For example, I recently purchased the new iLife suite from Apple. The multimedia editing software iMovie is immense in its capabilities, but because I’m curious and playful — qualities that are learned as much as they are innate — it wasn’t long before I was mixing clips, ducking audio, and exporting to YouTube.

Granted, programs like iMovie have overwhelming potential: today’s software is capable of so much more than could have ever fit onto a 140K floppy. Maybe today’s users are learning just as much about today’s computers as we did about the Apple II; it’s only the proportions that have changed, as the comprehensive simplicity of yesterday’s programs made it easier to grasp their commands in their entirety.

Behold, the AppleWorks word processor command set in its entirety: just enough to master.

As more and more people use computers, will they become less and less efficient at doing so? Will our programs continue to bloat until only artisans and the hardcore can do more than scratch their surface? Or is that potential rightly buried, rewarding those dedicated few who know there has to be something better lurking beneath the surface?

I don’t know the answers to these questions. But I’ve seen high school students marvel at being taught new computer skills, from as basic as Microsoft Word’s “Track Changes” function to as esoteric as the Scheme programming language. The earlier we introduce users to such concepts, the sooner we’ll ignite that creative spark that will drive them to learn what else these fascinating devices can do. Then we’ll have a next generation not just of programmers, but of power users — and anyone who wants to compete with them in the workforce had best start cracking the books.

Academic Ultima

October 7th, 2010 12:20 PM
Filed under Game trail, Musings, Software showcase.

Before my current job at Computerworld, I taught 11th grade tech writing at a math and science charter school. My fellow teachers had an open door policy that allowed me to observe their classes, and I developed a rapport with the computer science teacher. When an emergency called her away from class one day, she asked me to fill in but left me no lesson plan. Fortunately, I’d already installed both Adventure and VisiCalc for just such an emergency. The resulting lesson in computer history was reported in Juiced.GS, though I never did get the opportunity to explore Adventure with my students.

But other educators have had the opportunity to use electronic entertainment as a learning tool. Besides the use of interactive fiction in a classroom setting, as detailed in Get Lamp, Michael Abbott has taken a more ambitious approach to virtual adventuring by introducing his students to Richard Garriott‘s seminal role-playing game, Ultima IV: Quest of the Avatar.

In his blog, the teacher doesn’t outline his learning objectives, other than puzzle-solving and note-taking. I hope his goals were not much loftier than that, because it seems these students disappointed him:

It mostly came down to issues of user-interface, navigation, combat, and a general lack of clarity about what to do and how to do it. … it [wa]sn’t much fun for them. They want a radar in the corner of the screen. They want mission logs. They want fun combat. They want an in-game tutorial. They want a game that doesn’t feel like so much work.

That looks like a lot of reading! (Photo courtesy Blake Patterson.)

I’m unsure how many of these obstacles were inherent to the game, and how many were symptomatic of a generational gap. Today, I often want a game that immerses me within the first five minutes, and which I can put down after ten. That means either simple gameplay (in the case of classics like Pac-Man or Qix) or familiar gameplay (like Dragon Quest VIII, an RPG I played for hours on end, 80 in total, but whose mechanics had remained largely unchanged since the franchise’s origin in 1986).

I was not always so averse to learning curves. I grew up playing and enjoying Ultima, but not on the Apple II. I was and am primarily a console gamer, and I played these games’ Nintendo adaptations, which vexed me with none of the issues that these students encountered. I wondered if the NES version had been so dramatically different that it eased my entry into Britannia. Sure enough, one blog commenter wrote:

… have you considered giving people the NES port of Ultima IV? It faithfully retains the ethical systems design of Garriott’s original while reimagining the visual aesthetics and interface design according to the conventions of JRPGs. It was how I played Ultima IV back in the day, and it’s still probably a lot more in line with modern RPG convention than the original PC Ultima IV.

But I’d wager that most alumni of Ultima IV experienced it on the computer, which apparently did not preclude its success, so surely its original interface was not insurmountable. More likely is the change in gaming mores over the decades. In the book Dungeons & Dreamers, authors Brad King and John Borland relate the detail and intricacy with which the developers of Ultima Online imbued their world. Ecology, economy, and more were devised to create a world that lived and breathed along with the players. When it finally launched, the world was shot to hell: gamers traveled into the countryside, burning trees and killing animals. No world could long host such chaos, so the developers had to go back to the drawing board. I suspect Mr. Abbott’s students would have contributed to that headache.

Nonetheless, I hope the blog’s conclusion does not spell the end of this exercise: “I love great old games like Ultima IV, but I can no longer assume the game will make its case for greatness all by itself.” Just as we have courses in art and music appreciation, it’s important to understand and appreciate the origin of Ultima and other video game hallmarks. Today’s gaming industry was not born in a vacuum, and just as the bold experiments of yesteryear determined the future of the genre, they still have much to teach today’s gamers and programmers about what works, what doesn’t, and why things are the way they are. Finding a context in which to teach that lesson is, much like the games themselves, worth the effort.

(Hat tip to Richard Garriott)

Educational nightmare

September 30th, 2010 11:48 AM
Filed under History.

We’ve all had this nightmare, haven’t we? (xkcd: Students)

True to the comic’s caption, this dream has plagued me for nearly twenty years. As it’s most common just before a graduation, I’ll be due some sleepless nights come May 2011. Though logic has no place in the realm of sleep, upon waking, I find comfort not only in my diploma, but also in knowing I’m not alone in this fear.

Some companies are malicious enough to play on our fears for profit, and Apple is no exception:

Aren’t all the fond memories we have of the Apple II from writing term papers on it at 3 AM? How would we have graduated without it? On the other hand, data was far more fragile in the Eighties than it is now. How many nightmares were caused by school papers being lost to malfunctions, user error — or, worst of all, dysentery?

Teaching retroprogramming

September 13th, 2010 9:09 AM
Filed under Mainstream coverage, Musings.

The annual Beloit College Mindset List, which outlines the world in which the incoming class of college freshmen grew up, indicates that for members of the class of 2014, “The first home computer they probably touched was an Apple II or Mac II; they are now in a museum.”

Fortunately for students in Bletchley, Milton Keynes, England, their experience with retrocomputers is more recent — and eminently practical. BBC News reports:

As a former teacher, I can fully get behind this classroom curriculum. It wasn’t long ago that I suggested a lab of Apple II computers could be an effective and modern learning tool. Although the computers featured in this video are not Woz’s brainchild, they are its contemporaries and teach many of the same lessons my proposed lab would. As one student said, “The old machines have a lot to teach us. They run a lot slower, and you can actually see the instructions executing in real-time.”

What I hope the students learn is how to make the most of limited hardware and software resources, though this quotation makes me wonder if they missed that point: “It makes you a lot more efficient, and you think more about your code, because it’s harder to type it all in.” Although the arduousness of input can indeed be a powerful motivator against error, I don’t think it’s a constraint one need tolerate, even on a classic computer. The Apple II worked around this limitation with Beagle Bros’ excellent Program Writer for Applesoft BASIC. Such utilities don’t encourage sloppy programming; instead, they improve the rate at which you can learn from your mistakes. Modern machines and their gluttonous resources, by contrast, permit sloppy programming that would never fly on a computer whose memory is measured in kilobytes.

This classroom’s demographic reminds me of the demoparty I attended this summer, where most attendees were younger than the computers they were hacking. KansasFest likewise has an increasingly youthful attendance, with Apple II users still in or recently out of undergraduate programs. This next generation of retrocomputing enthusiasts has great potential to apply modern programming techniques and structure to classic design. For example, put these students into a limited-time programming contest, and you’d have HackFest. I wonder how they would fare.

I couldn’t help but take umbrage when the reporter said that the student’s work almost looked like a “real video game”. Of course it’s a real video game! Software doesn’t need rockstar programmers or cutting-edge technology to count as real. The original versions of Lode Runner and Oregon Trail have more staying power than any jazzed-up modern adaptations. I wouldn’t be surprised if these kids are the next programmers to recapture the fun and wonder of these classic games.

Because the BBC is awesome, their story also includes one of their own news reports from Oct. 17, 1986, showcasing the computers of the day, including the Apple IIGS. That video is not embeddable, so I encourage you to watch it on their site.

(Hat tip to Slashdot and Mitch Wagner)