Wednesday, March 30, 2011

Fie on Both Their Mouses!

Folks,
The state of personal computing has come quite a way in its scant thirty or so years of existence. It’s remarkable how “civilized” and reliable these machines have become, considering that they were once toys for dedicated hobbyists armed with soldering irons and ribbons of punched paper bearing barely serviceable code that more often than not failed to do anything, and at best did something as useful as tricking a hijacked plotter into printing a giant image of Mr. Spock in Xs and Os on map paper. The college student who could coax a Mandelbrot set out of a gadget with computational abilities barely superior to a pile of paper clips clumped together with snot and ear wax was worshipped as a genius.
I am a bit of a geek, as was my Great Grandfather. He owned the first home telephone and personal automobile in his little town of Greenfield, MA. My Grandfather bought the first commercial photocopier, the “Haloid Xerox” company’s behemoth 1959 Xerox 914, to sell in his office supply store in Springfield, MA (once the Silicon Valley of the first Industrial Age). My Father owned the very first Polaroid Land Camera to hit the market. I have gadget lust in my genes, and thus early on became fascinated with computing, particularly personal computing.
But, I never gravitated toward coding. Rigorous logic and late nights sans intoxicants were not my youthful inclinations. The closest I’ve ever come to a hack was rescuing an early installation of Lotus Notes, which my boss had managed to crash by requiring a return receipt from all our three-hundred-thirty-four employees on the system while our single, much overworked and under-appreciated Sys Admin happened to be on vacation.
So, I’ll confess, I am a “Mac Person”; I have been since the first one rolled out in 1984. Over the years, I’ve bought nine Macs, and I’ve “sold” more of them than most people who are paid to do so. Relying on the meager tools bestowed by the authors of a superb GUI is my default state, and I’m happy to preach the religion of simplicity, usability, and usefulness in personal computing. Still, I’ve also used plenty of DOS (and related OSs) as well as Windows machines. I’ve bought several Amigas (remember those?), and I actually used the Mac’s Apple predecessor, the Lisa. Once upon a time, you could find me poking my way through the syntax and command lines of Unix-based systems, and mashing up the desktops of various Linux variants.
You know what? None of these systems have worked as well as they should have. Worse, as they've become more powerful and feature-full, they seem, subjectively at least, to have gotten all the more cantankerous, clunky and prone to misbehavior.
Maybe it's because the number of things that can go wrong has risen in proportion to the size of the system and application software. Maybe it's the fact that we now sit in front of these things all day long. Whatever the cause, we're all familiar with machines, Macs and PCs alike, that crash without apparent cause. Device drivers inexplicably change their settings or interfere with each other. The network has a little hiccup and our computers freeze. Then, when we restart them, they give us angry little warnings telling us that we've been bad users for not properly shutting them down.

There is also the matter of application software. The manual for my new publishing application is about the size of the King James Bible. The newest version of Word has so many buttons, doohickeys and menus that there's little space left for text. I guess I've made the mistake of assuming that a word processor is supposed to help you write, not to run the equivalent of a print shop inside your computer.
Here’s an illustrative true-life story of what can go wrong with the best, most highly developed, and supposedly simplest-to-use personal computer. I’m not talking about the iPad or the other tablet computers and smart phones now on the market; they are still special cases at this time (more on that in a bit). I’m talking about my trusty, but now aging, MacBook.
A few days ago, I booted the sleek little monster up only to find that the Date and Time setting had become corrupted and was locked up. It was now counting forward from 12:34:06 AM PST, March 24, 1969, and I could not change it to the present day and moment. This may seem like a non-problem, as I’ve got plenty of clocks and calendars around me; my cell phone has one that gets accurate network time from an atomic clock in Colorado, and I rely on that when on the go.
But on my personal computer, an array of programs and functions relies on accurate timekeeping within the machine itself. Take my personal date book and contact programs, whose alarms depend on an accurate clock. It won’t do to show up forty-two years, three days, sixteen hours and fifty-four seconds late for a meeting with my literary agent. Neither will it be helpful to get an urgent, automated security patch to my OS some decades after nefarious black-hat hackers have made off with my passwords, credit card info, and other private data.
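Just to put a number on that absurdity, here is a quick back-of-the-envelope sketch in Python; the stuck reading is what my machine showed me, while the meeting date and time are purely hypothetical:

```python
from datetime import datetime

# The value my machine's clock was stuck at (local Pacific time, as reported).
stuck = datetime(1969, 3, 24, 0, 34, 6)

# A hypothetical meeting my calendar alarm is supposed to fire for.
meeting = datetime(2011, 3, 27, 10, 0, 0)

# Anything scheduled against the machine's own clock is now off by:
print(meeting - stuck)  # 15343 days and change -- roughly forty-two years
```

Every alarm, log entry, and update check computed relative to that clock inherits the same four-decade error.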
Perhaps an errant cosmic ray had bored through my magnetic hard drive and flipped a bit in this seemingly trivial but crucial software resource. The date and time at which the clock was programmed to begin ticking was likely born of mere coincidence. It might have been the moment, one early morning, when a programmer at Bell Labs completed his mundane chore of building the original clock code for the first iteration of what was to become the venerable, flexible and extensible Unix OS. Maybe it’s the birthday of the then kid who rewrote the resource for NeXTSTEP or OS X. Whatever. Software engineers are notorious for their bad documentation, and as with all invention, serendipity and handy, half-assed solutions arrived at in the wee hours are always abundant. I just knew I had to fix a stupid problem with the most elegant personal computing system around, which was presently sitting like a paperweight on my hotel room desk.
I happened to have handy the original install discs for the OS that came with the machine when I bought it. The simplest solution was to reinstall the entire system to its original state from those discs. That would take an hour or so, and some courage. I was careful to select the install settings that preserve current data and program files. I made it so and waited for the hoped-for outcome. All went well. Now all I had to do was go through the time-consuming process of updating the OS to its most recent and secure form via the ’net. This only took an hour and a half, while I enjoyed watching the dismal news coming out of Libya on CNN. At least the hotel cable TV was still working.
Next, I found that my most relied-upon productivity software was no longer working. Those old install discs had rolled the wonderful word processing, presentation, and spreadsheet programs that came bundled with the OS back to an earlier version than my most recent updates, and they no longer worked with the now-updated operating system. My install discs for those programs were sitting in a box forty miles from my hotel room. Off I went to the Apple Store to spend two hours and eighty-odd dollars getting the most current versions of that essential software. At that price, it should have been a bargain, but for the time wasted and the expense of buying the same thing twice.
About five hours after my adventure began, now somewhat financially challenged and behind on an important deadline, I was again ready to enjoy the convenience of thoroughly modern personal computing.
But, what happened to the marvel of self-healing software, long promised, but still a tenuous gleam in the gauzy warp and weave of a computer scientist’s loom of logic? Will the dream of a world of reliable and super-fast delivery of software, security patches, and truly collaborative computing exemplified in primordial form by Google Apps soon appear? Will access to every sort of information sans physical media finally be as dependable as flicking a light switch and seeing your own bedroom appear before you in an instant and as though by magic? Will that software be as easy to use as it is to put on your slippers? Will the vaunted Cloud of cloud computing be so pillow soft that our minds can rest easy upon it to dream and create?
Fortunately, some help is on the way, and not a moment too soon. Cloud computing is starting to mature into a feasible technology and a realistic platform for commerce. Apple’s App Stores, Amazon’s various efforts, Google’s audacious innovations and experiments all point to a not very distant future of utility computing.
The iPad and similar devices are implementing the long-sought grail of banishing confusing, non-intuitive file management systems in favor of a model that associates all data files with application programs, so you can click on an app and simply see only the files it created. Or perhaps you’d like to see your files stored along a timeline, in just the way you recall life, as a story in time. David Gelernter, of Yale, and others have been working on such intuitive systems. In such a model, the user, a Human, no longer has to do machine-like work or keep a memory that cleaves to a machine’s mode of “thinking”.
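To make the idea concrete, here is a toy sketch, in Python, of what a “timeline” view of your files might look like under the hood. It is only my own illustration of the general concept, not anyone’s shipping product: it sorts whatever lives in a folder by when each file was last touched and reads the result back newest first, like a diary.

```python
from datetime import datetime
from pathlib import Path

def lifestream(folder: str, limit: int = 20) -> None:
    """Show files as a reverse-chronological stream, newest first,
    instead of as a hierarchy the user has to memorize."""
    root = Path(folder).expanduser()
    files = [p for p in root.rglob("*") if p.is_file()]
    files.sort(key=lambda p: p.stat().st_mtime, reverse=True)
    for p in files[:limit]:
        stamp = datetime.fromtimestamp(p.stat().st_mtime)
        print(f"{stamp:%Y-%m-%d %H:%M}  {p.relative_to(root)}")

lifestream("~/Documents")  # hypothetical folder; point it anywhere you like
```

A real system would, of course, index far more than a single folder and let you scroll the stream, but the principle is the same: the machine remembers where and when, so you don’t have to.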
Some decry this effort as wresting control from the Human operator by hiding the guts of their machine’s logic and memory inside a black box. Most Humans will likely find it liberating not to have to remember where they stored >ImSoMadAtObama< on their hard drive or in their Google Docs account, and which programs will open it. For those who want to dig into the guts of their machine’s “mind”, there will always be hacks and jailbreaks, and plenty of good fun to be had in the infinite space of imagination painted in digits and pixels.
In any case, whatever the near future brings, let’s hope our junk just works. We need hammers and crowbars, and so far, we’re still getting Rube Goldberg machines.
S
