A colleague and I recently had a friendly debate over whether to use one space or two after periods. We agreed that, whatever our personal preference, we should settle on the standard of the publication for which we're writing. In her work as a Web editor, she occasionally has to remove those extra spaces, a chore that was recently made easier when a co-worker showed her how to use the find and replace function to do so.
"Did you not know you could find and replace punctuation," I asked, "or did you not know how to find and replace at all?"
"I didn't know about it at all," she clarified. "I've been living under a software rock."
I was astonished at her lack of familiarity with this basic word processing function. The oversight is no measure of ability: she had finished high school and college and been accepted into an esteemed graduate program in publishing, demonstrating a facility for learning. Nor is she alone in finding software foreign; I've met many people who happily use these machines in the most inefficient ways every day. Most consumers treat programs as something to be obeyed instead of tamed: they design their workflows around what the software expects, which is the antithesis of ergonomics.
As someone who grew up with the open architecture of the Apple II, I find this pattern unnerving, and one I want to understand better. To what can we attribute this regression?
The improved accessibility of computers is certainly a factor, as the barrier to entry is now far lower. Computers of three decades ago demanded a basic understanding of their workings just to boot the machine and run the software. With no intuitive graphical user interfaces (GUIs), online help systems, or large installed user bases, each person was alone in deciphering the programs, or in writing their own. Such arcane knowledge is merely optional today.
A consequence of this accessibility is diminished engagement. Since users no longer need to hunt for features and commands, they no longer do so at all. They take everything at face value, never realizing how much of a program's potential goes untapped. There isn't even printed documentation they can peruse to discover the functions they're not using.
You may think that, with the increasing prevalence of closed operating systems such as iOS, this accommodating mindset of computer users will become a necessary one. But a user needn't have access to a computer's command line to be able to use software efficiently. The first thing I do in any application I install, be it for a desktop computer or a cell phone, is investigate the preferences, so I better understand the options and behaviors available to me. When I installed Microsoft Office 2011 for Mac last week, I immediately set Word to save its files in .DOC format by default, eschewing the more modern .DOCX. That single customization spares me the headache of friends with older software being unable to open my files. Most users wouldn't think to explore that possibility.
That's because making computers friendlier has, ironically, also made them less engaging, and thus less educational. Growing up with computers that made few efforts to be understood taught me how to understand today's computers. For example, I recently purchased the new iLife suite from Apple. The multimedia editing software iMovie is immense in its capabilities, but because I'm curious and playful — qualities that are learned as much as they are innate — it wasn't long before I was mixing clips, ducking audio, and exporting to YouTube.
Granted, programs like iMovie have overwhelming potential: today's software is capable of so much more than could ever have fit onto a 140K floppy. Maybe today's users are learning just as much about today's computers as we did about the Apple II; only the proportions have changed, as the comprehensive simplicity of yesterday's programs made their command sets easy to grasp in full.
Behold, the AppleWorks word processor command set in its entirety: just enough to master.
As more and more people use computers, will they become less and less efficient at doing so? Will our programs continue to bloat until only artisans and the hardcore can do more than scratch their surface? Or is that potential rightly buried, rewarding those dedicated few who know there has to be something better lurking beneath the surface?
I don't know the answers to these questions. But I've seen high school students marvel at being taught new computer skills, from as basic as Microsoft Word's "Track Changes" function to as esoteric as the Scheme programming language. The earlier we introduce users to such concepts, the sooner we'll ignite that creative spark that will drive them to learn what else these fascinating devices can do. Then we'll have a next generation not just of programmers, but of power users — and anyone who wants to compete with them in the workforce had best start cracking the books.