Friday, April 1, 2011

An Extemporaneous Explateration on Life, the Universe and Everything…

Folks,

Last night in northern Connecticut was quite dreary, the woods out back socked in with chilly fog and a nasty drizzle more fitting mid-November than the early Spring. Over the creek behind my Mother’s house coursed a slow moving, pulsing mist, like a zephyr in slow motion. It was occasionally pierced by gobs of sweet water falling from the evergreens and sodden, winter-dead oaks that clung to the decaying banks of the rivulet.
The stars and the Moon were hidden behind a profound haze above. I acutely missed those signposts to the Heavens. I was reminded of Alan Watts’ wonderful title to his wonderful little book on a westerner’s take on Buddhism, “Cloud-hidden, Whereabouts Unknown.”
I’m not a religious soul, in any conventional sense, but my inclinations are acutely spiritual in equal proportion to my scientific leanings. Curiosity is a principal motivation in many of my interactions with other folks and the Universe. Science and spirituality are, to me, incomplete when not informing each other in my outlook and pursuits.
As Einstein said: “The finest emotion of which we are capable is the mystic emotion. Herein lies the germ of all art and all true science. Anyone to whom this feeling is alien, who is no longer capable of wonderment and lives in a state of fear is a dead man. To know that what is impenetrable for us really exists and manifests itself as the highest wisdom and the most radiant beauty, whose gross forms alone are intelligible to our poor faculties — this knowledge, this feeling ... that is the core of the true religious sentiment. In this sense, and in this sense alone, I rank myself among profoundly religious men.”
Read that again. Maybe read it aloud to yourself. It’s that good a notion and well put.
Anyhow, in the view of Einstein, myself, and no less than the current Dalai Lama, Lao Tzu, the man whose Taoism generated the lineage of profound Eastern philosophies that apprehended no central G-d in the Western sense, but are to westerners considered religions, was a superb scientist. Likewise, the ancient Greek practical philosophers, engineers and proto-scientists, those who trod the path of the Mystery Rites, were as divinely religious as our Uncle Al.
But, back to the stars. Peering into the vaporous ceiling occluding distant points in the Cosmos, I remembered an autumn evening lying, with no embarrassment, on the front lawn of our suburban home with my Father. We were staring at what he informed me was the constellation, Orion. I was about six years old. He asked me, “Do you ever get the feeling that somebody up there might be looking back at us, wondering if anybody is looking back at him?” That question has fascinated me ever since. I am not alone in being thus compelled back, again and again, to conjecture on the possibility, even likelihood, that we are not alone.
There was a fellow, another great physicist of Einstein’s generation, Enrico Fermi, who pondered this same non-trivial thought. He proposed what is known as Fermi’s Paradox in light of the possibility that we are indeed, against scientific odds, alone as intelligent life in this Universe. After all, if Life and Intelligence are inherently possible across the breadth of perhaps a septillion stars, with a surfeit of planetary orbs circling those suns, many of them amenable to the creation and sustenance of life as we know it from our own limited experience… where are the dang aliens!?!
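Fermi’s question can be put in rough numbers. The classic tool is Frank Drake’s back-of-the-envelope equation for counting communicating civilizations in our galaxy; the sketch below is only an illustration, and every parameter value in it is a guess, not a measurement:

```python
# A toy Drake-style estimate of communicating civilizations in the
# Milky Way. Every factor below is an illustrative guess, not data.
R_star = 7.0   # new stars formed per year in the galaxy
f_p = 0.5      # fraction of stars with planetary systems
n_e = 2.0      # habitable planets per such system
f_l = 0.33     # fraction of habitable planets where life arises
f_i = 0.01     # fraction of those that evolve intelligence
f_c = 0.01     # fraction that develop detectable technology
L = 10_000     # years a detectable civilization keeps broadcasting

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"Estimated communicating civilizations: {N:.1f}")
```

Plug in pessimistic guesses for the biological factors and N drops below one, which is one resolution of the paradox; optimistic guesses yield a galaxy that should be humming with signals, which sharpens it.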
Surely we should have spied them by now. If they are now buzzing about the heavens in super-tech spacecraft, or if they have ever even gotten to the point of playing with crystal radios or blowing themselves up with atom bombs, they should have left some clue to their presence in the starry bough.
Solutions to this enigma include that they are advanced enough to stay hidden, and don’t wish to listen to the wailing of our baby civilization in its crib. They wait and watch, and hope we grow up and fly right. They may be wise enough to fear our immature and violent tendencies. Or perhaps they don’t really give a damn about what seems to them to be barely more advanced than pond scum.
Another possibility is that we are the aliens. Perhaps our good Earth is but a petri dish, and we an experiment by an advanced species possessed of great patience born of billions of years of evolution prior to the initiation of our own synthetic creation from drops of water, bits of clay, RNA and a somewhat reliably clement environment. Our theoretical creators might thus someday be back, in their own good time, to reconnoiter the cold hard data revealed in their planetary wet lab. Let us hope we do not then wind up in the cosmic bio-hazard bin.
S
Enrico Fermi

Thursday, March 31, 2011

Reflections After the Flood…

Folks,
I’ve not before written about the catastrophe emerging from the fourfold whammy that has hit Japan. They’ve suffered a historically significant and death-dealing earthquake, a murderous tsunami, the meltdown of one or more nuclear reactors at Fukushima Daiichi, and the poisoning of their land, ground water, sea water, and animal and Human life with radiation. The disaster has worldwide health, social and economic ramifications. It may be that wide swaths of land and sea will be uninhabitable by Humans for at least 25,000 years. If plutonium has leaked from Reactor #3, as seems apparent now, the land, drinking water and nearby ocean will be toxic for as long as 250,000 years. That is longer than our species has trod the Earth that we now befoul.
But, the news of the day adequately covers, albeit in a cursory manner, the steady rain of bad news coming from the confluence of Nature’s shrugs and surly moods, and the consequences of Human muddling and arrogance. I’ll instead write a bit about what might be pondered as we move forward as a species from this very moment.
First, let me be clear. I am not a reflexive anti-nuke power sort. We, the people and our democratically chosen governments (where they exist, to whatever extent), can tell the nuke industry that the free ride is over; that it is time to solve the problems of nuclear waste storage and disposal, as well as the proliferation of nuclear materials across the globe. They must also design and build self-regulating nuclear machines, or none at all. It is well past time for global industry to stop behaving like an adolescent who might at any moment blow off his neighbor’s fingers with the toss of a firecracker. If we grow up and take responsibility for the perils inherent in sloppy engineering, we may gain a reasonable nuclear alternative to fossil fuels as just one arrow in our potentially environmentally benign quiver of tech.
Okay, so where are we today? We are dealing presently with the second most fundamentally powerful force at our disposal: nuclear fission. So far, we’ve been able to use it to build bombs that can flatten entire cities, and we can sort of use it to reliably and sometimes safely generate steam to make electricity with what is essentially 18th Century engineering and mid-20th Century physics.
Next up the ladder in terms of essential physical forces at our command is nuclear fusion, the power that makes the stars shine and gives birth to the elements of all life, including our own. As of this date, we can make bombs theoretically scalable to a power able to shatter planets. This potential was revealed in the math and physics of one Edward Teller, a real-life model for the cinematic Dr. Strangelove. It is a blessing that Teller’s conjectures have not been proven in experiment and practice. Meanwhile, the promise of harnessing this power for the generation of cheap, pollution-free energy has been perpetually just over the horizon, coming within fifty years, for the past fifty years.
But, there is more in store. Today, at the Large Hadron Collider (LHC), experiments are underway to probe deeper into the fabric of matter and energy with the grandest, most complex and sophisticated scientific instrument yet created on our pale blue dot of a world. As you read this, scientists and engineers are smashing sub-atomic particles together to potentially reveal the secrets of hidden dimensions in our own Universe, peering into theorized adjacent universes to our own, and they are perhaps on their way to producing sub-microscopic black holes that may, someday, be coaxed into birthing new universes on a lab bench. Shall we children of Earth then have at our disposal the power of a G-d? We better wise up fast, if that is in our future.
There is more, and of a more pressing nature in view of our carelessness with the relatively feeble energies of fission and fusion. From the LHC is likely to come the production of significant quantities of antimatter. That stuff will make the power of Teller’s Super Bomb look like a Fourth of July sparkler if some careless lab tech bumps into a chair and drops the beaker on the floor. Matter and antimatter annihilate each other completely, converting all of their mass to energy; gram for gram, it is the most concentrated release of energy physics allows, far beyond what fission or fusion can wring from the same material.
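To put a number on that, here is the arithmetic via E = mc². The constants are textbook values; the one-gram scenario is purely illustrative:

```python
# Back-of-the-envelope: annihilating 1 g of antimatter with 1 g of
# ordinary matter converts the full 2 g of mass to energy, E = m*c^2.
c = 2.998e8             # speed of light, m/s
m = 2.0e-3              # total mass converted, kg (1 g + 1 g)
E = m * c ** 2          # energy released, joules
KILOTON_TNT = 4.184e12  # joules per kiloton of TNT

print(f"{E:.2e} J, roughly {E / KILOTON_TNT:.0f} kilotons of TNT")
```

That is Hiroshima-scale energy from a coin’s worth of mass, which is the precise sense in which annihilation dwarfs fission and fusion.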
Still, there is even more to awe ourselves into humility in the face of our inventiveness. Far below the almost unimaginable energies emerging from the birth of all we now can know, is the subtle ebb and flow of simple, organic chemistry, and the fundamentals of life. Recently, Craig Venter’s team (the good folks who helped map the Human Genome) created the Earth’s first synthetic organism. It was just a very simple bacterium, and by no means the creation of life from scratch at Human hands. Nonetheless, the achievement was a signpost on the way to the day, coming soon, when our computers will control machine chemistry labs… and what will issue from those petri dishes will be Life born not of Nature directly, but through Human ingenuity and intention. We’ll be playing G-d, indeed.
So, here in the realm of The Living, we encounter the same issues as with nuclear tech and advanced physics. Can we be trusted with our own tech, or will it be our undoing? A generation or so ago, it was unthinkable that a teenager might have a computer capable of bringing a government agency such as NASA to its knees, or that a media-savvy hacker might steal away and then reveal secret State Department documents to the world. But, that is now happening, and we can see the results. So, what happens when some kid in the Ukraine or Spokane, WA decides to make his own bugs? What happens when a maniacal tyrant figures that nukes are old-fashioned, and would rather see a prolonged plague on his list of achievements, as opposed to a swift annihilation of his perceived enemies?
Yup… it’s time for our species to grow up.
S

Wednesday, March 30, 2011

Fie on Both Their Mouses!

Folks,
The state of personal computing has come quite a way in its scant thirty or so years of existence. It’s remarkable how “civilized” and reliable personal computers have become, considering they were once toys for dedicated hobbyists with soldering irons and ribbons of punched paper bearing barely serviceable code that more often than not failed to do anything, and at best did something as useful as tricking a hijacked plotter into printing a giant image of Mr. Spock in Xs and 0s on map paper. The college student who could coax a Mandelbrot set out of a gadget with computational abilities barely superior to a pile of paper clips clumped together with snot and ear wax was worshipped as a genius.
I am a bit of a geek, as was my Great Grandfather. He owned the first home telephone and personal automobile in his little town of Greenfield, MA. My Grandfather bought the first commercial photocopier, the “Haloid Xerox” company’s behemoth 1959 Xerox 914, to sell in his office supply store in Springfield, MA (once the Silicon Valley of the first Industrial Age). My Father owned the very first Polaroid Land Camera to hit the market. I have gadget lust in my genes, and thus, early on became fascinated with computing, particularly personal computing.
But, I never gravitated toward coding. Rigorous logic and late nights sans intoxicants were not my youthful inclinations. The closest I’ve ever come to a hack was rescuing an early installation of Lotus Notes, which my boss had managed to crash by requiring a return receipt from all our three-hundred-thirty-four employees on the system while our single, much overworked and under-appreciated Sys Admin happened to be on vacation.
So, I’ll confess, I am a “Mac Person”; have been since the first one rolled out in 1984. Over the years, I've bought nine Macs, and I've “sold” more of them than most people who are paid to do so. Relying on the meager tools bestowed by the authors of a superb GUI is my default state, and I’m happy to preach the religion of simplicity, usability, and usefulness in personal computing. Still, I've also used plenty of DOS (and related OSs) as well as Windows machines. I’ve bought several Amigas (remember those?), and actually used the Mac’s Apple predecessor, the Lisa. Once upon a time, you could find me poking my way through the syntax and command lines of Unix-based systems, and mashing up the desktops of various Linux variants.
You know what? None of these systems have worked as well as they should have. Worse, as they've become more powerful and feature-full, they seem, subjectively at least, to have gotten all the more cantankerous, clunky and prone to misbehavior.
Maybe it's because the number of things that can go wrong has risen in proportion to the size of the system and application software. Maybe it's the fact that we now sit in front of these things all day long. Whatever the cause, we're all familiar with machines, Macs and PCs, that crash without apparent cause. Device drivers inexplicably change their settings or interfere with each other. The network has a little hiccup and our computers freeze. Then, when we restart them, they give us angry little warnings telling us that we've been bad users for not properly shutting them down. There is also the matter of application software. The manual for my new publishing application is about the size of the King James Bible. The newest version of Word has so many buttons, doohickeys and menus that there's little space left for text. I guess I've made the mistake of assuming that a word processor is supposed to help you write, not to run the equivalent of a print shop inside your computer.
Here’s an illustrative true-life story of what can go wrong with the best, most highly developed, and supposedly simple to use personal computer. I’m not talking about the iPad or other tablet computers and smart phones now on the market. They are still special cases, at this time (more on that in a bit). I’m talking about my trusty, but now aging MacBook.
A few days ago, I booted the sleek little monster up only to find that the Date and Time setting had become corrupt and was locked up. It was now counting forward from 12:34:06AM PST, March 24, 1969, and I could not change it to the present day and moment. This may seem like a non-problem, as I’ve got plenty of clocks and calendars around me; my cell phone has one that gets accurate network time from an atomic clock in Colorado, and I rely on that when on the go.
But, on my personal computer an array of programs and functions rely on accurate time telling within the machine, itself. Take my personal date book and contact programs and their alarms that rely on an accurate clock. It won’t do to show up forty-two years, three days, sixteen hours and fifty-four seconds late for a meeting with my literary agent. Neither will it be helpful to get an urgent and automated security patch to my OS some decades after nefarious black-hat hackers have made off with my passwords, credit card info, and other private data.
Perhaps an errant cosmic ray had bored through my magnetic hard drive and flipped a bit in this seemingly trivial but crucial software resource. The date and time at which the clock was programmed to begin ticking was likely born of mere coincidence. It might have been the moment, one early morning, when a programmer at Bell Labs completed the mundane chore of building the original clock code for the first iteration of what was to become the venerable, flexible and extensible Unix OS. Maybe it’s the birthday of the then-kid who rewrote the resource for NeXTSTEP or OS X. Whatever. Software engineers are notorious for their bad documentation, and as with all invention, serendipity and handy, half-assed solutions arrived at in the wee hours are always abundant. I just knew I had to fix a stupid problem with the most elegant personal computing system presently sitting like a paperweight on my hotel room desk.
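For what it’s worth, Unix-descended systems keep time as a bare count of seconds since a fixed reference instant, the “epoch” of midnight UTC, January 1, 1970, so a zeroed or corrupted counter simply replays history from its starting point. A minimal sketch of the idea:

```python
from datetime import datetime, timezone

def clock_reading(seconds_since_epoch: float) -> datetime:
    """The wall-clock date a Unix-style clock shows for a raw counter value."""
    return datetime.fromtimestamp(seconds_since_epoch, tz=timezone.utc)

# A counter stuck at zero lands you at the epoch itself...
print(clock_reading(0))              # 1970-01-01 00:00:00+00:00
# ...while a healthy counter has long since ticked past a billion.
print(clock_reading(1_000_000_000))  # 2001-09-09 01:46:40+00:00
```

My MacBook’s particular 1969 starting date presumably reflects some other quirk of its firmware or settings, but the principle is the same: lose the count, and you lose the calendar.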
I happened to have handy the original install discs for the OS that came with the machine when I bought it. The simplest solution was to reinstall the entire system to its original state from those discs. That would take an hour or so, and some courage. I was careful to select the install settings to preserve my current data and program files. I made it so and waited for the hoped-for outcome. All went well. Now all I had to do was go through the time-consuming process of updating the OS to its most recent and secure form via the ‘net. This only took an hour and a half, while I enjoyed watching the dismal news coming out of Libya on CNN. At least the hotel cable TV was still working.
Next, I found that my most relied upon productivity software was no longer working. Those old install discs had reinstalled the wonderful word processing, presentation, and spreadsheet programs that came bundled with the OS to an earlier version than my most recent updates, and they no longer worked with the now updated operating system. My install discs for those programs were sitting in a box forty miles from my hotel room. Off I went to the Apple Store to spend two hours and more than $80 getting the most current versions of the essential software. At that price, it should have been a bargain, but for the time wasted and the expense of twice buying the same thing.
About five hours after my adventure began, now somewhat financially challenged and behind on an important deadline, I was again ready to enjoy the convenience of thoroughly modern personal computing.
But, what happened to the marvel of self-healing software, long promised, but still a tenuous gleam in the gauzy warp and weave of a computer scientist’s loom of logic? Will the dream of a world of reliable and super-fast delivery of software, security patches, and truly collaborative computing exemplified in primordial form by Google Apps soon appear? Will access to every sort of information sans physical media finally be as dependable as flicking a light switch and seeing your own bedroom appear before you in an instant and as though by magic? Will that software be as easy to use as it is to put on your slippers? Will the vaunted Cloud of cloud computing be so pillow soft that our minds can rest easy upon it to dream and create?
Fortunately, some help is on the way, and not a moment too soon. Cloud computing is starting to mature into a feasible tech and a realistic platform for commerce. Apple’s App Stores, Amazon’s various efforts, Google’s audacious innovations and experiments, all point to a not very distant future of utility computing.
The iPad and similar devices are implementing the long-sought grail of banishing confusing and non-intuitive file management systems with a model that associates all data files with application programs, so you can click on an app and simply see only the files created by it. Or, perhaps you’d like to see your files stored along a timeline, in just the way you recall life, as a story in time. David Gelernter, of Yale, and others have been working on such intuitive systems. In such a model, the user, a Human, no longer has to do machine-like work and have a memory that cleaves to a machine’s mode of “thinking”.
Some decry this effort as wresting control from the Human operator by hiding the guts of their machine’s logic and memory inside a black box. Most Humans will likely find it liberating not to have to remember where they stored >ImSoMadAtObama< on their hard drive or Google Docs account, and which programs will open it. For those who want to dig into the guts of their machine’s “mind”, there will always be hacks and jailbreaks, and plenty of good fun to be had in the infinite space of imagination painted in digits and pixels.
In any case, whatever the near future brings, let’s hope our junk just works. We need hammers and crowbars, and so far, we’re still getting Rube Goldberg Machines.
S

Tuesday, March 29, 2011

Fashionable Computers Disappear and Things Start Thinking

Folks,
In the long ago days of 1998, my Mom bought her first computer, an iMac. At the time of her purchase, the fact that it looked cute and didn't require a mess of wires to hook up was more important than all that megahertz and RAM stuff, which she still does not understand, nor cares to.
Gateway soon introduced a line of radically slimmed down desktop machines, as did Sony and other manufacturers. Intel was showing off prototype computers that looked like brightly colored Aztec pyramids or sleek modern sculpture. A company that made a popular line of web servers was packing their industrial electronics in a blue cube barely bigger than your hand. At the same time, AOL and Microsoft wanted to be everywhere and anywhere via a new generation of handheld and TV set-top devices. Palm Computing’s then current offering came in a sleek aluminum case that would not have been out of place on a Klingon Warrior, sheathed next to his Bat’leth.
Today, such design consciousness in digital devices, from computers to gadgets that had not existed in 1998 (WiFi hotspots and routers, digital music players, inexpensive consumer digital cameras, etc.) is the norm. As with an older tech, the automobile, consumers are making buying decisions based as much on a machine’s look and feel, as on the technology inside the box.
So what? Well, there are smart folks, such as the MIT Media Lab’s near-futurist Andy Lippman, who believe this tells us that we are witnessing both the first and final two or so decades of the personal computer as an Everyman's status object and fashion statement. They contend that when a technology gains more attention and confers more status as a fashion statement than its work-a-day purpose, it's probably about to disappear. That's "disappear", as in to be removed from view.
Confused? Let's look at an analogous situation. When was the last time you thought of the multi-gigawatt power plant on the other end of the wire that connects it to the motor you never hear in the compressor that you never think of in your fridge? All four pieces of technology just mentioned used to be big deals in the marketing of electric power, as well as the industrial design of fridges. Remember those old machines with the fat, round compressors on top? If you've never seen one for yourself, look for one in the background next time you're watching a 1930's vintage movie.
Today, the most important thing about picking a place to keep the beer cold is how well it disappears into the decor of our faux colonial kitchen. The last thing that we want from a reliable, ubiquitous technology is for it to call attention to itself. It should just be there, and be working the next time you feel like having some ice-cream. Thus, we hear the prediction that after the current phase of computer and telecom product design, the devices will begin to fade into the background of our environment. Their services, though, will still be there, but more reliably, like the light that comes on when you open the fridge.
What will emerge from this reinvented model for computing and telecommunications? Individually, many services will seem trivial from our present point of view. Milk cartons may access the internet-grocery store when they get low, and order replacements for themselves. A necktie might tell a business associate’s electronic rolodex what your email address is, as you shake his hand. Through the same tech, tablet computers distributed freely across the office will know who is holding them and what documents will be required for the next damn meeting. Pages in electronic books and catalogues, made with electronic paper, will update themselves when new information is available. Web-based information will be accessible not only from hand-held devices, but from previously "dumb" objects such as the tread of your car's wearing tires.
Other services will seem less trifling. Your tee-shirt may contain processors woven into its fabric and a web-connected cardiac monitor to let the hospital know to send an ambulance when you’ve eaten more heart-arresting calzones than you can jog away. That shirt and its wireless connection will be subsidized by a changing assortment of ads for the hospital’s services displayed on flexible video screens over your chest and back. Your own exertions will supply the power for the “smart” shirt’s connectivity and computing.
Devices like mice and trackpads, even keyboards, will truly disappear, and not just from view. They will be replaced by an intelligent environment that knows where you are looking and understands your facial and other gestures, as well as speech via cameras, microphones, and machine smarts. You may be wearing a hat or headband that puts you in direct connection to your outboard “brain”. The ubiquitous screen that we’ve been peering at for almost seventy years will disappear into your contact lenses or stylish shades, and they will also be your computer interface.
We are at the threshold of some truly remarkable tech. It will make the things around us seem to think, and we will think little of that, as the machines fade into the background and do their work invisibly. This prospect brings both promise and peril, of course. Do we really want our very environment to know all of our comings and goings, where our eyes wander, what stimulates various sectors of our neocortex… the host of our self and self-awareness? Will the last bastions of privacy fall with this new generation of hidden tech? Can a society and culture function without any secrets kept, not only from its denizens, but from their own machines?
Interesting questions to ponder on the high speed ride to the near future!
S

Monday, March 28, 2011

What’s in a Name?

Folks,
If you're like myself, or my father or mother for that matter, you probably had a little red wagon when you were a kid. It was probably called a Radio Flyer. Did you ever wonder why? What did radio have to do with a child's wagon? As for flying, it never went much faster than you could pull it over the sidewalk or lawn.
So, why call it Radio Flyer? Well, immigrant Antonio Pasin, inventor of the little red wagon formerly known as Model #18, needed something catchy and thoroughly modern sounding in a name. The time was the 1920s, and the two truly hot technologies coming to commercial prominence were radio broadcasting and human flight.
Other immigrants, such as Marconi, Tesla, and “General” David Sarnoff, were making distant sounds, and not inconsiderable money, appear out of thin air. People like Charles Lindbergh and Glenn Curtiss were finding some things to do with "aeroplanes", besides dropping grenades ineffectually onto the farm fields of WWI Europe. They were setting records, zipping across entire oceans in barely more than a day, and swiftly delivering mail across continents. People turned out at country fairs, and paid good money, to see "Barnstormers" perform death-defying aerial magic. What name could have been more trendy and cool than Radio Flyer? It even looks pretty good today at RadioFlyer.com. Check out their present retro/modern as tomorrow new/old logo.
Now, every generation since the dawn of the industrial revolution has stood somehow transfixed by the latest technology to take off in the commercial market. At the dawn of television, the public was treated to television programs with characters like Captain Video, a rocket pilot, not a camera man. A couple of generations before the renamed Model #18, writer Nathaniel Hawthorne remarked, "Is it a fact- or have I dreamed it- that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? Rather, the round globe is a vast head, a brain, instinct with intelligence!" He was referring, of course, to the telegraph. The year was 1851. Hawthorne, like Teilhard de Chardin, whose treatise “The Phenomenon of Man” appeared a century after the great novelist’s heralding of a new age, foresaw the advent of today’s Internet.
Whatever. Today, just about anything with a dot-com in its name is apt to predispose us to a favorable view of it. Things cyber and virtual have become so trendy, the words have almost lost their original meaning. In fact, the word cyber practically has no meaning! It's borrowed from the word cybernetics, itself an adaptation of the Greek word for a helmsman, kybernetes, and was coined by mathematician Norbert Wiener. The year was 1948, well before the advent of the microchip, and the term rose from Wiener's theories about the similarities between steering mechanisms for naval torpedoes and human mental processes.
Everything has a history. No technology emerges into the world as though from the Void. A company, technology or product with a name that contains dot-com, or for that matter, video, radio, steam or steel, is no more or less likely to be excellent or useful than one with the word ACME in it. But, names can sound cool. They can evoke emotions and spark notions. They can even offer a sense of dominion over what is named and pride in naming it. Naming things was, after all, one of the first gifts that G-d is alleged to have given to Man after He made the first Woman.
What’s in a name? 
S

Sunday, March 27, 2011

Great Grandmothers, Social Networks, Revolution!

Folks,

A long, long time ago, in the days of the late Victorian Era ladies and their daughters born before the arrival of the Roaring Twenties, most "civilized" women of the day's cultured society did not drive horseless carriages nor leave the house without extensive preparation in grooming and attire. Even after the Dress Reform Movement and associated undergarment reform, it might take hours just to get ready to go to the market on the horse-drawn trolley down a poop-laden, smelly dirt street. This after a woman woke at dawn or before to see to it that her husband and children were fed and clothed. It was an ordeal, not very glamorous, particularly for the wife of a successful business person or public figure in her community.

Alas, the market, the general store, the apothecary, and so on, were the only places during the average day where she could meet and chat with friends and acquaintances, those not her immediate neighbors over the fence behind the garden or grape arbor. Communication, idle or otherwise, was no easier to engage in than transportation.

My great grandmother was one of the ladies of that day. She lived in Greenfield, MA in the 1910s. Her husband was a bit of a techie for his day, a gadget freak. He had one of the first private autos in town. Several folks had early trucks to haul goods to and from market and the local metal and textile mills, but my great grandfather's car was for hauling people, for such purposes as having a picnic up the old farm road by the mountain stream on Sunday. Gasoline-powered vehicles were so rare in that day, he had to have his own big gas tank with attached hand-pump in what had been the carriage barn.

Around the same time that my great grandfather brought home that big car, he had the local phone company install two telephones. One was in the house that he, his wife, and my grandmother and grand uncle lived in. The other was at his mill up by the Vermont border, in Turner's Falls. Back at the turn of the last century, as the auto and telephone were ascending, and the region's textile and milling economy had yet to settle into utter decline, it was common for affluent owners of businesses to live a fair bit away from their going concerns. With no highways as we know them today, a telephone link from the owner's home to the plant or store was the way to ride herd on employees working nights and weekends. Likewise, pharmacists, doctors, shop owners and other business folk could call up their suppliers and commercial customers via the central operator (no dials or push buttons on phones yet).

But, what's this got to do with social networking and ladies who liked to gab with their turn o'the last century friends?

Well, remember how dirty, smelly, and time consuming it was for middle class and prosperous women to get down to the milliner's for a fancy hat and the essential community news of the day, trading word with their friends about who and what was up or down, and for coconspirators to discuss such things as the emerging Suffrage Movement?

My great grandmother and some of her colleagues noticed that they could pick up those phones and ring through Central to get in touch with each other while still dressed in their house coats, securely away from their husbands and children busy at work or school or toiling in the mills, and speak freely and intimately… even politically.


Today, girls in Tokyo, Seoul, Detroit, Toronto, Tel Aviv and elsewhere frantically text gossip and small news, while young women in the ferment of revolution in the Middle East lay their lives on the line in tele-communicated bits.


The beat goes on. It is interesting how some things do not change, but evolve and iterate with tech and culture. Anyhow, thank you Bubbie Fannie and your ilk, then and now and into the future, for your contributions.

S


Thursday, March 24, 2011

Folks,


When I was about six years old, my idea of geeky fun was to hide out in the dank basement of our family's house on Abbott Street, in Springfield, MA. There, I'd take apart old clocks, yank the vacuum tubes from ancient radios, and play jungle explorer with my father's night-watch kerosene lantern from his days defending Panama from WWII Japanese subs that never came.


But, my favorite activity was playing with a greasy, heavy wrought iron monkey wrench. I was fascinated by its heft in my little hands, and by the simple, dexterous motion of the thumbscrew that coiled and uncoiled to adjust the vise atop its hand grip. That it had only two moving parts was aesthetically pleasing to the arts and tech-inclined nerd I was to grow up to be.


But, why was it called a monkey wrench? It didn't look like a monkey. Even my young mind knew that arboreal, knuckle walking social primates had no fittings nor pipes to tighten. My all-knowing dad had not a clue to the moniker's origin, either. I soon moved on to other questions, and left the matter behind to ponder and execute the design of futuristic model cities out of balsa wood cubes, copper pipes, plastic safety razor holders, and Instamatic Flash Cubes recovered from the trash. It was now four years later, 1964, and I'd just seen the NYC-hosted World's Fair. The future was inhabiting my thoughts; a future where turbine powered cars would run on grain alcohol and people would make phone calls with video.


Some two decades later, however, I was looking toward the other end of history, rummaging through the fabulous Johnson's Used Book Store. That place is a treasure lost to the decay of downtown Springfield, so gone that it leaves not even a trace in the digital aether on Google or Wikipedia. In any case, on one sunny afternoon I somehow found the mouldering scent of old periodicals more enticing than spring blossoms and fetching young ladies, and I browsed the racks of tattered magazines. There, I stumbled upon one whose name I have long forgotten. Within it, though, was an article about the origin of the term monkey wrench. Shazzam! My inner six year old got his long delayed prize.


Although most tech and tool nerds believe that a British guy named Moncky (see the link above and illustration below) was the fellow who patented an adjustable wrench in 1858 and gave his name to the tool, the fact is that lexicographers have found the term monkey wrench dating back to at least 1840. Connecting the dots, the path to the tool's name leads to the hands of an iron miller named Monk, who had grown tired of lugging around his heavy tool box full of fixed-size wrenches. He worked in a small metal fab plant in Springfield on the bend where Mill Street meets southern Main Street. The mill was then owned by the Bemis and Call Company, and they proceeded to produce Monk's innovative gripper/twister/puller and plumber's helper. That place on the corner still stands today, by the way. It's now a furniture and home decorating shop for the affluent.


Whatever; some day a future father will get cranky, go down to the basement to bang on the fittings of the home's aging thin-film photovoltaic powered sodium/water heating system, and hand a wrench to a kid who will become fascinated with supple, useful, adjustable tech.


Keep an eye to the future, and an ear to the past…


S