Thursday, March 31, 2011

Reflections After the Flood…

Folks,
I’ve not before written about the catastrophe emerging from the fourfold whammy that has hit Japan. They’ve suffered a historically significant and death-dealing earthquake, a murderous tsunami, the meltdown of one or more nuclear reactors at Fukushima Daiichi, and the poisoning of their land, ground water, sea water, and animal and Human life with radiation. The disaster has worldwide health, social and economic ramifications. It may be that wide swaths of land and sea will be uninhabitable by Humans for at least 25,000 years. If plutonium has leaked from Reactor #3, as seems apparent now, the land, drinking water and nearby ocean will be toxic for as long as 250,000 years. That is longer than our species has trod the Earth that we now befoul.
But, the news of the day adequately covers, albeit in a cursory manner, the steady rain of bad news coming from the confluence of Nature’s shrugs and surly moods, and the consequences of Human muddling and arrogance. I’ll instead write a bit about what might be pondered as we move forward as a species from this very moment.
First, let me be clear. I am not a reflexive anti-nuke-power sort. We, the people, and our democratically chosen governments (where and to whatever extent they exist) can tell the nuke industry that the free ride is over; that it is time to solve the problems of nuclear waste storage and disposal, as well as the proliferation of nuclear materials across the globe. They must also design and build self-regulating nuclear machines, or none at all. It is well past time for global industry to stop behaving like an adolescent who might at any moment blow off his neighbor’s fingers with the toss of a firecracker. If we grow up and take responsibility for the perils inherent in sloppy engineering, we may gain a reasonable nuclear alternative to fossil fuels as just one arrow in our potentially environmentally benign quiver of tech.
Okay, so where are we today? We are dealing presently with the second most fundamentally powerful force at our disposal: nuclear fission. So far, we’ve been able to use it to build bombs that can flatten entire cities, and we can sort of use it to reliably, and sometimes safely, generate steam to make electricity with what is essentially 18th Century engineering and mid-20th Century physics.
Next up the ladder in terms of essential physical forces at our command is nuclear fusion, the power that makes the stars shine and gives birth to the elements of all life, including our own. As of this date, we can make bombs theoretically scalable to a power able to shatter planets. This potential was revealed in the math and physics of one Edward Teller, a real-life model for the cinematic Dr. Strangelove. It is a blessing that Teller’s conjectures have not been proven in experiment and practice. Meanwhile, the promise of harnessing this power for the generation of cheap, pollution-free energy has been perpetually just over the horizon, coming within fifty years, for the past fifty years.
But, there is more in store. Today, at the Large Hadron Collider (LHC), experiments are underway to probe deeper into the fabric of matter and energy with the grandest, most complex and sophisticated scientific instrument yet created on our pale blue dot of a world. As you read this, scientists and engineers are smashing sub-atomic particles together to potentially reveal the secrets of hidden dimensions in our own Universe, to peer toward universes theorized to lie adjacent to our own, and, perhaps, to produce sub-microscopic black holes that may, someday, be coaxed into birthing new universes on a lab bench. Shall we children of Earth then have at our disposal the power of a G-d? We’d better wise up fast, if that is in our future.
There is more, and a more pressing matter in view of our carelessness with the relatively feeble energies of fission and fusion. From the LHC is likely to come the production of significant quantities of antimatter. That stuff will make the power of Teller’s Super Bomb look like a Fourth of July sparkler if some careless lab tech bumps into a chair and drops the beaker on the floor. Matter and antimatter annihilate each other with a release of energy that, pound for pound, dwarfs even a giant star’s implosion and explosive radiance as a supernova.
Still, there is even more to awe ourselves into humility in the face of our inventiveness. Far below the almost unimaginable energies emerging from the birth of all we now can know is the subtle ebb and flow of simple, organic chemistry, and the fundamentals of life. Recently, Craig Venter’s team (the good folks who helped map the Human Genome) created the Earth’s first synthetic organism. It was just a very simple bacterium, and by no means the creation of life from scratch at Human hands. Nonetheless, the achievement was a signpost on the way to the day, coming soon, when our computers will control machine chemistry labs… and what will issue from those petri dishes will be Life born not of Nature directly, but through Human ingenuity and intention. We’ll be playing G-d, indeed.
So, here in the realm of The Living, we encounter the same issues as with nuclear tech and advanced physics. Can we be trusted with our own tech, or will it be our undoing? A generation or so ago, it was unthinkable that a teenager might have a computer capable of bringing a government agency such as NASA to its knees, or that a media-savvy hacker might steal away and then reveal secret State Department documents to the world. But, that is now happening, and we can see the results. So, what happens when some kid in Ukraine or Spokane, WA decides to make his own bugs? What happens when a maniacal tyrant figures that nukes are old fashioned, and would rather see a prolonged plague on his list of achievements than a swift annihilation of his perceived enemies?
Yup… it’s time for our species to grow up.
S

Wednesday, March 30, 2011

Fie on Both Their Mouses!

Folks,
The state of personal computing has come quite a way in its scant thirty or so years of existence. It’s remarkable how “civilized” and reliable these machines have become. They were once toys for dedicated hobbyists with soldering irons and ribbons of punched paper bearing barely serviceable code that more often than not failed to do anything, and at best did something as useful as tricking a hijacked plotter into printing a giant image of Mr. Spock in Xs and 0s on map paper. The college student who could coax a Mandelbrot set out of a gadget with computational abilities barely superior to a pile of paper clips clumped together with snot and ear wax was worshipped as a genius.
I am a bit of a geek, as was my Great Grandfather. He owned the first home telephone and personal automobile in his little town of Greenfield, MA. My Grandfather bought the first commercial photocopier, the “Haloid Xerox” company’s behemoth 1959 Xerox 914, to sell in his office supply store in Springfield, MA (once the Silicon Valley of the first Industrial Age). My Father owned the very first Polaroid Land Camera to hit the market. I have gadget lust in my genes, and thus early on became fascinated with computing, particularly personal computing.
But, I never gravitated toward coding. Rigorous logic and late nights sans intoxicants were not my youthful inclinations. The closest I’ve ever come to a hack was rescuing an early installation of Lotus Notes, which my boss had managed to crash by requiring a return receipt from all our three-hundred-thirty-four employees on the system while our single, much overworked and under-appreciated Sys Admin happened to be on vacation.
So, I'll confess, I am a "Mac Person"; have been since the first one rolled out in 1984. Over the years, I've bought nine Macs, and I've “sold” more of them than most people who are paid to do so. Relying on the meager tools bestowed by the authors of a superb GUI is my default state, and I’m happy to preach the religion of simplicity, usability, and usefulness in personal computing. Still, I've also used plenty of DOS (and related OSs) as well as Windows machines. I’ve bought several Amigas (remember those?), and actually used the Mac’s Apple predecessor, the Lisa. Once upon a time, you could find me poking my way through the syntax and command lines of Unix-based systems, and mashing up the desktops of various Linux variants.
You know what? None of these systems has worked as well as it should have. Worse, as they've become more powerful and feature-laden, they seem, subjectively at least, to have gotten all the more cantankerous, clunky and prone to misbehavior.
Maybe it's because the number of things that can go wrong has risen in proportion to the size of the system and application software. Maybe it's the fact that we now sit in front of these things all day long. Whatever the cause, we're all familiar with machines, Macs and PCs alike, that crash without apparent cause. Device drivers inexplicably change their settings or interfere with each other. The network has a little hiccup and our computers freeze. Then, when we restart them, they give us angry little warnings telling us that we've been bad users for not properly shutting them down. There is also the matter of application software. The manual for my new publishing application is about the size of the King James Bible. The newest version of Word has so many buttons, doohickeys and menus that there's little space left for text. I guess I've made the mistake of assuming that a word processor is supposed to help you write, not to run the equivalent of a print shop inside your computer.
Here’s an illustrative true-life story of what can go wrong with the best, most highly developed, and supposedly simple-to-use personal computer. I’m not talking about the iPad or other tablet computers and smart phones now on the market. They are still special cases, at this time (more on that in a bit). I’m talking about my trusty but now aging MacBook.
A few days ago, I booted the sleek little monster up only to find that the Date and Time setting had become corrupt and was locked up. It was now counting forward from 12:34:06AM PST, March 24, 1969, and I could not change it to the present day and moment. This may seem like a non-problem, as I’ve got plenty of clocks and calendars around me; my cell phone has one that gets accurate network time from an atomic clock in Colorado, and I rely on that when on the go.
But, on my personal computer an array of programs and functions rely on accurate timekeeping within the machine itself. Take my personal date book and contact programs, whose alarms depend on an accurate clock. It won’t do to show up forty-two years, three days, sixteen hours and fifty-four seconds late for a meeting with my literary agent. Neither will it be helpful to get an urgent and automated security patch to my OS some decades after nefarious black-hat hackers have made off with my passwords, credit card info, and other private data.
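To put the damage in plain numbers, here’s a toy sketch of the arithmetic in Python. The stuck timestamp is the one my Date & Time panel showed; the exact moment I sat down to fix the thing is an assumption.

```python
from datetime import datetime

# The moment the clock froze at (per the locked Date & Time panel)...
stuck = datetime(1969, 3, 24, 0, 34, 6)    # 12:34:06 AM, March 24, 1969
# ...and roughly when I noticed it (assumed for illustration).
now = datetime(2011, 3, 27, 17, 28, 0)

drift = now - stuck
print(drift)                                  # about 15343 days and change
print(round(drift.days / 365.25, 1), "years") # roughly 42 years of drift
```

Every alarm, certificate check, and update schedule on the machine would be off by that same four-decade gap.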
Perhaps an errant cosmic ray had bored through my magnetic hard drive and flipped a bit in this seemingly trivial but crucial software resource. The date and time that the clock was programmed to begin ticking from was likely born of mere coincidence. It might have been the moment, one early morning, when a post-doc at Bell Labs had completed his mundane chore of building the original clock code for the first iteration of what was to become the venerable, flexible and extensible Unix OS. Maybe it’s the birthday of the then kid who rewrote the resource for NeXTStep or OS X. Whatever. Software engineers are notorious for their bad documentation, and as with all invention, serendipity and handy, half-assed solutions arrived at in the wee hours are always abundant. I just knew I had to fix a stupid problem with the most elegant personal computing system presently sitting like a paperweight on my hotel room desk.
I happened to have handy the original install discs for the OS that came with the machine when I bought it. The simplest solution was to reinstall the entire system to its original state from those discs. That would take an hour or so, and some courage. I was careful to select the install settings to preserve my current data and program files. I made it so and waited for the hoped-for outcome. All went well. Now all I had to do was go through the time-consuming process of updating the OS to its most recent and secure form via the ‘net. This only took an hour and a half, while I enjoyed watching the dismal news coming out of Libya on CNN. At least the hotel cable TV was still working.
Next, I found that my most relied-upon productivity software was no longer working. Those old install discs had restored the wonderful word processing, presentation, and spreadsheet programs that came bundled with the OS to an earlier version than my most recent updates, and they no longer worked with the now updated operating system. My install discs for those programs were sitting in a box forty miles from my hotel room. Off I went to the Apple Store to spend two hours and more than $80 getting the most current versions of that essential software. At that price, it should have been a bargain, but for the time wasted and the expense of twice buying the same thing.
About five hours after my adventure began, now somewhat financially challenged and behind on an important deadline, I was again ready to enjoy the convenience of thoroughly modern personal computing.
But, what happened to the marvel of self-healing software, long promised, but still a tenuous gleam in the gauzy warp and weave of a computer scientist’s loom of logic? Will the dream of a world of reliable and super-fast delivery of software, security patches, and truly collaborative computing exemplified in primordial form by Google Apps soon appear? Will access to every sort of information sans physical media finally be as dependable as flicking a light switch and seeing your own bedroom appear before you in an instant and as though by magic? Will that software be as easy to use as it is to put on your slippers? Will the vaunted Cloud of cloud computing be so pillow soft that our minds can rest easy upon it to dream and create?
Fortunately, some help is on the way, and not a moment too soon. Cloud computing is starting to mature into a feasible tech and a realistic platform for commerce. Apple’s App Stores, Amazon’s various efforts, and Google’s audacious innovations and experiments all point to a not very distant future of utility computing.
The iPad and similar devices are implementing the long-sought grail of banishing confusing and non-intuitive file management systems, replacing them with a model that associates all data files with application programs, so you can click on an app and simply see only the files created by it. Or, perhaps you’d like to see your files stored along a timeline, in just the way you recall life, as a story in time. David Gelernter, of Yale, and others have been working on such intuitive systems. In such a model, the user, a Human, no longer has to do machine-like work and keep a memory that cleaves to a machine’s mode of “thinking”.
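To make the idea concrete, here’s a toy sketch of such a catalog in Python. It is not Apple’s or Gelernter’s actual design; the app names and document titles are placeholders, and it only illustrates how a system might hand you your stuff by app or by timeline rather than by folder path.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Item:
    title: str        # what the user calls it
    app: str          # the application that created it
    created: datetime # when it entered the user's "story in time"

catalog = [
    Item("Letter to my agent", "Pages", datetime(2011, 3, 24, 9, 15)),
    Item("Travel budget", "Numbers", datetime(2011, 3, 25, 14, 2)),
    Item("Column pitch", "Pages", datetime(2011, 3, 26, 20, 40)),
]

def by_app(app):      # "click on an app, see only its files"
    return [i for i in catalog if i.app == app]

def as_timeline():    # "your files along a timeline, the way you recall life"
    return sorted(catalog, key=lambda i: i.created)

print([i.title for i in by_app("Pages")])
print([i.title for i in as_timeline()])
```

Swap the list for a real index and the sort key for richer metadata, and you have the skeleton of a “story in time” browser; no folder paths required of the Human.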
Some decry this effort as wresting control from the Human operator by hiding the guts of their machine’s logic and memory inside a black box. Most Humans will likely find it liberating not to have to remember where they stored >ImSoMadAtObama< on their hard drive or Google Docs account, and which programs will open it. For those who want to dig into the guts of their machine’s “mind”, there will always be hacks and jailbreaks, and plenty of good fun to be had in the infinite space of imagination painted in digits and pixels.
In any case, whatever the near future brings, let’s hope our junk just works. We need hammers and crowbars, and so far, we’re still getting Rube Goldberg Machines.
S

Tuesday, March 29, 2011

Fashionable Computers Disappear and Things Start Thinking

Folks,
In the long ago days of 1998, my Mom bought her first computer, an iMac. At the time of her purchase, the fact that it looked cute and didn't require a mess of wires to hook up was more important than all that megahertz and RAM stuff that she still does not understand, nor cares to.
Gateway soon introduced a line of radically slimmed down desktop machines, as did Sony and other manufacturers. Intel was showing off prototype computers that looked like brightly colored Aztec pyramids or sleek modern sculpture. A company that made a popular line of web servers was packing their industrial electronics in a blue cube barely bigger than your hand. At the same time, AOL and Microsoft wanted to be everywhere and anywhere via a new generation of handheld and TV set-top devices. Palm Computing’s then current offering came in a sleek aluminum case that would not have been out of place on a Klingon Warrior, sheathed next to his Bat’leth.
Today, such design consciousness in digital devices, from computers to gadgets that had not existed in 1998 (WiFi hotspots and routers, digital music players, inexpensive consumer digital cameras, etc.), is the norm. As with an older tech, the automobile, consumers are making buying decisions based as much on a machine’s look and feel as on the technology inside the box.
So what? Well, there are smart folks, such as the MIT Media Lab’s near-futurist Andy Lippman, who believe this tells us that we are witnessing both the first and final two or so decades of the personal computer as an Everyman's status object and fashion statement. They contend that when a technology gains more attention and confers more status as a fashion statement than through its work-a-day purpose, it's probably about to disappear. That's "disappear", as in to be removed from view.
Confused? Let's look at an analogous situation. When was the last time you thought of the multi-gigawatt power plant on the other end of the wire that connects it to the motor you never hear in the compressor that you never think of in your fridge? All four pieces of technology just mentioned used to be big deals in the marketing of electric power, as well as in the industrial design of fridges. Remember those old machines with the fat, round compressors on top? If you've never seen one for yourself, look for one in the background next time you're watching a 1930s-vintage movie.
Today, the most important thing about picking a place to keep the beer cold is how well it disappears into the decor of our faux colonial kitchen. The last thing that we want from a reliable, ubiquitous technology is for it to call attention to itself. It should just be there, and be working the next time you feel like having some ice cream. Thus, we hear the prediction that after the current phase of computer and telecom product design, the devices will begin to fade into the background of our environment. Their services, though, will still be there, and more reliably, like the light that comes on when you open the fridge.
What will emerge from this reinvented model for computing and telecommunications? Individually, many services will seem trivial from our present point of view. Milk cartons may access the internet-grocery store when they get low, and order replacements for themselves. A necktie might tell a business associate’s electronic rolodex what your email address is, as you shake his hand. Through the same tech, tablet computers distributed freely across the office will know who is holding them and what documents will be required for the next damn meeting. Pages in electronic books and catalogues, made with electronic paper, will update themselves when new information is available. Web-based information will be accessible not only from hand-held devices, but from previously "dumb" objects such as the tread of your car's wearing tires.
Other services will seem less trifling. Your tee-shirt may contain processors woven into its fabric and a web-connected cardiac monitor to let the hospital know to send an ambulance when you’ve eaten more heart-arresting calzones than you can jog away. That shirt and its wireless connection will be subsidized by a changing assortment of ads for the hospital’s services displayed on flexible video screens over your chest and back. Your own exertions will supply the power for the “smart” shirt’s connectivity and computing.
Devices like mice and trackpads, even keyboards, will truly disappear, and not just from view. They will be replaced by an intelligent environment that knows where you are looking and understands your facial and other gestures, as well as speech via cameras, microphones, and machine smarts. You may be wearing a hat or headband that puts you in direct connection to your outboard “brain”. The ubiquitous screen that we’ve been peering at for almost seventy years will disappear into your contact lenses or stylish shades, and they will also be your computer interface.
We are at the threshold of some truly remarkable tech. The things around us will seem to think, and we will think little of that, as the machines fade into the background and do their work invisibly. This prospect brings both promise and peril, of course. Do we really want our very environment to know all of our comings and goings, where our eyes wander, what stimulates various sectors of our neocortex… the host of our self and self awareness? Will the last bastions of privacy fall with this new generation of hidden tech? Can a society and culture function when secrets can be kept from neither its denizens nor their own machines?
Interesting questions to ponder on the high speed ride to the near future!
S

Monday, March 28, 2011

What’s in a Name?

Folks,
If you're like me, or my father or mother for that matter, you probably had a little red wagon when you were a kid. It was probably called a Radio Flyer. Did you ever wonder why? What did radio have to do with a child's wagon? As for flying, it never went much faster than you could pull it over the sidewalk or lawn.
So, why call it Radio Flyer? Well, immigrant Antonio Pasin, inventor of the little red wagon formerly known as Model #18, needed something catchy and thoroughly modern sounding in a name. The time was the 1920s, and the two truly hot technologies coming to commercial prominence were radio broadcasting and human flight.
Other immigrants, such as Marconi, Tesla, and “General” David Sarnoff, were making distant sounds, and not inconsiderable money, appear out of thin air. People like Charles Lindbergh and Glenn Curtiss were finding some things to do with "aeroplanes" besides dropping grenades ineffectually onto the farm fields of WWI Europe. They were setting records, zipping across entire oceans in barely more than a day, and swiftly delivering mail across continents. People turned out at country fairs, and paid good money, to see "Barnstormers" perform death-defying aerial magic. What name could have been more trendy and cool than Radio Flyer? It even looks pretty good today at RadioFlyer.com. Check out their present logo, as retro as yesterday and as modern as tomorrow.
Now, every generation since the dawn of the industrial revolution has stood somehow transfixed by the latest technology to take off in the commercial market. At the dawn of television, the public was treated to programs with characters like Captain Video; a rocket pilot, not a camera man. A couple of generations before the renamed Model #18, writer Nathaniel Hawthorne remarked, "Is it a fact- or have I dreamed it- that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? Rather, the round globe is a vast head, a brain, instinct with intelligence!" He was referring, of course, to the telegraph. The year was 1851. Hawthorne, like Teilhard de Chardin, whose treatise “The Phenomenon of Man” appeared a century after the great writer’s heralding of a new age, foresaw the advent of today’s Internet.
Whatever. Today, just about anything with a dot-com in its name is apt to predispose us to a favorable view of it. Things cyber and virtual have become so trendy that the words have almost lost their original meaning. In fact, the word cyber practically has no meaning! It's borrowed from the word cybernetics, which mathematician Norbert Wiener coined in 1948, well before the advent of the microchip, as an adaptation of the Greek word for a helmsman, kybernetes. The term rose from Wiener's theories about the similarities between steering mechanisms for naval torpedoes and human mental processes.
Everything has a history. No technology emerges into the world as though from the Void. A company, technology or product with a name that contains dot-com, or for that matter, video, radio, steam or steel, is no more or less likely to be excellent or useful than one with the word ACME in it. But, names can sound cool. They can evoke emotions and spark notions. They can even offer a sense of dominion over what is named and pride in naming it. Naming things was, after all, one of the first gifts that G-d is alleged to have given to Man after He made the first Woman.
What’s in a name? 
S

Sunday, March 27, 2011

Great Grandmothers, Social Networks, Revolution!

Folks,

A long, long time ago, the late Victorian Era ladies and their daughters, born before the arrival of the Roaring Twenties, lived in a day when most "civilized" women of cultured society did not drive horseless carriages, nor leave the house without extensive preparation in grooming and attire. Even after the Dress Reform Movement and associated undergarment reform, it might take hours just to get ready to go to the market on the horse-drawn trolley down a poop-laden, smelly dirt street. This, after waking at dawn or before to see to it that her husband and children were fed and clothed. It was an ordeal, not very glamorous, particularly for the wife of a successful business person or public figure in her community.

Alas, the market, the general store, the apothecary, and so on, were the only places during the average day where she could meet and chat with friends and acquaintances, those not her immediate neighbors over the fence behind the garden or grape arbor. Communication, idle or otherwise, was no easier to engage in than transportation.

My great grandmother was one of the ladies of that day. She lived in Greenfield, MA in the 1910s. Her husband was a bit of a techie for his day, a gadget freak. He had one of the first private autos in town. Several folks had early trucks to haul goods to and from market and the local metal and textile mills, but my great grandfather's car was for hauling people, for such purposes as having a picnic up the old farm road by the mountain stream on Sunday. Gasoline powered vehicles were so rare in that day that he had to have his own big gas tank with attached hand-pump in what had been the carriage barn.

Around the same time that my great grandfather brought home that big car, he had the local phone company install two telephones. One was in the house that he, his wife, and my grandmother and granduncle lived in. The other was at his mill up by the Vermont border, in Turners Falls. Back at the turn of the last century, as the auto and telephone were ascending, and the region's textile and milling economy had yet to settle into utter decline, it was common for affluent owners of businesses to live a fair bit away from their going concerns. With no highways as we know them today, a telephone link from the owner's home to the plant or store was the way to ride herd on employees working nights and weekends. Likewise, pharmacists, doctors, shop owners and other business folk could call up their suppliers and commercial customers via the central operator (no dials or push buttons on phones yet).

But, what's this got to do with social networking and ladies who liked to gab with their turn o'the last century friends?

Well, remember how dirty, smelly, and time-consuming it was for middle-class and prosperous women to get down to the milliner's for a fancy hat and for the essential community news of the day with their friends about who and what was up or down, and for co-conspirators to discuss such things as the emerging Suffrage Movement?

My great grandmother and some of her colleagues noticed that they could pick up those phones and ring through Central to get in touch with each other while still dressed in their house coats, securely away from their husbands and children busy at work or school or toiling in the mills, and speak freely and intimately… even politically.


Today, girls in Tokyo, Seoul, Detroit, Toronto, Tel Aviv and elsewhere frantically text gossip and small news, while young women in the ferment of revolution in the Middle East lay their lives on the line in tele-communicated bits.


The beat goes on. It is interesting how some things do not change, yet they do evolve and iterate with tech and culture. Anyhow, thank you, Bubbie Fannie and your ilk, then and now and into the future, for your contributions.

S


Thursday, March 24, 2011

Folks,


When I was about six years old, my idea of geeky fun was to hide out in the dank basement of our family's house on Abbott Street, in Springfield, MA. There, I'd take apart old clocks, yank the vacuum tubes from ancient radios, and play jungle explorer with my father's night-watch kerosene lantern from his days defending Panama from WWII Japanese subs that never came.


But, my favorite activity was playing with a greasy, heavy wrought-iron monkey wrench. I was fascinated by its heft in my little hands, and the simple dexterous motion of the thumbscrew that coiled and uncoiled to adjust the vise atop its hand grip. That it had only two moving parts was aesthetically pleasing to the arts- and tech-inclined nerd that I was to grow up to be.


But, why was it called a monkey wrench? It didn't look like a monkey. Even my young mind knew that arboreal, knuckle-walking social primates had no fittings nor pipes to tighten. My all-knowing dad had not a clue to the moniker's origin, either. I soon moved on to other questions, and left this matter behind to ponder and execute the design of futuristic model cities out of balsa wood cubes, copper pipes, plastic safety razor holders, and Instamatic flashcubes recovered from the trash. It was now four years later, 1964, and I'd just seen the NYC-hosted World's Fair. The future was inhabiting my thoughts; a future where turbine powered cars would run on grain alcohol and people would make phone calls with video.


Some two decades later, however, I was looking toward the other end of history, rummaging through the fabulous Johnson's Used Book Store. This place is a treasure lost to the decay of downtown Springfield. It is so gone that it leaves not even a trace in the digital aether on Google or Wikipedia. In any case, on one sunny afternoon, I somehow found the mouldering scent of old periodicals more enticing than the scent of spring blossoms and fetching young ladies, and I browsed the racks of tattered magazines. There, I stumbled upon one whose name I have long forgotten. Within it, though, was an article about the origin of the term monkey wrench. Shazzam! My inner six year old got his long delayed prize.


Although most tech and tool nerds believe that a British guy named Moncky was the fellow who patented an adjustable wrench in 1858 and gave his name to the tool, the fact is that lexicographers have found the term monkey wrench dating back to at least 1840. Connecting the dots, the path to the tool's name comes to an origin in the hands of an iron miller named Monk who had grown tired of lugging around his heavy tool box full of fixed-size wrenches. He worked in a small metal fab plant in Springfield on the bend where Mill Street meets southern Main Street. The mill was then owned by the Bemis and Call Company, and they proceeded to produce Monk's innovative gripper/twister/puller and plumber's helper. That place on the corner still stands today, by the way. It's now a furniture and home decorating shop for the affluent.


Whatever. Some day a future father will be cranky as he goes down to the basement to bang on the fittings of the home's aging thin-film-photovoltaic-powered sodium/water heating system, and he will hand a wrench to a kid who will become fascinated with supple, useful, adjustable tech.


Keep an eye to the future, and an ear to the past…


S


Wednesday, March 23, 2011

A Modest Proposal…

Folks,

I've been blogging recently about where tech collides with politics, society, commerce and culture. Yesterday, it occurred to me that I could localize this stuff to be of particular interest to folks in Northampton, MA and the Pioneer Valley, as I tell simple stories of how tech hit the streets, worked its way into business, and entered our homes in the past, and continues to do so today. Some of the topics that immediately occurred to me are:
  • Why are the only industries remaining in the Pioneer Valley, once the Silicon Valley of the first Industrial Age, light metal machining, arms manufacturing, specialty plastic fasteners, and agricultural chemicals?
  • Why do we call a monkey wrench a monkey wrench? Clue: it has everything to do with the Valley, and nothing to do with monkeys.
  • How did William Pynchon, Springfield, MA's founder, pull off one of the great swindles of early British colonialism in America by using ancient technology to drain a swamp in what became Springfield's north end? And, where did the natives that he swindled go?
  • How and why was Florence, MA chosen as the home to a utopian community and worker owned textile mills?
  • Why did Rolls-Royce choose this region to build their vehicles outside the UK?
  • And, why is the Valley a rare hotbed of enthusiasm for NASCAR in the Northeast?
  • What are so many sensible but poor Hispanics doing in the Valley, here in New England, when they could be just as disadvantaged in someplace warm by coral beaches? Clue #2: It has to do with bombs and those agricultural chemicals.
  • How did Victorian Era ladies help build the telephone industry in Greenfield, circa 1910?
  • Did Mark Twain finance and own the world's first word processor while living in the region in the late 19th Century?
  • How do Northampton's Web development companies stay afloat, even prosper with an international clientele, in the face of competition from 'Net and telecom giants, cloud-based freeware, and industry consolidation?
  • Why was fiber-optic tech invented in the region at the dawn of the Internet and telecommunications revolution, but never commercialized by its inventor? How did Quaker Oats figure in this story?
So I'm pitching these ideas for an occasional column at Northampton Media, an excellent local news portal whose reporters and editors typically hustle harder and outperform their colleagues in regional print, television and radio. We'll see what happens.

S

Monday, March 21, 2011

There's Power and Then There's Power…

Folks,

Does nuke power have to be ugly and dangerous? Perhaps not. There is an alternative tech, thorium nuclear reactors, originally researched in the USA back in the '40s but never adopted, as it was not suited to generating plutonium for hydrogen bombs from the spent fuel of the Peaceful Atom Program. Thorium nuclear tech is now being explored by the Chinese for the purpose of clearing their air of fossil fuel pollution, and powering their booming economy into the future. It also has the benefit of producing far smaller amounts of the toxic waste that must be sequestered for longer than Humanity has had civilization, and perhaps even existed.

Meanwhile, other alternative energy sources remain to be harvested: turning the faces of our skyscrapers into power plants, tapping the tides and the warmth of the deep Earth, the breath of the wind, and the rain of photons from the Sun.

Any way we shake it, here in the West, if we want to cut our addiction to oil and allow the folks in the Middle East and Northern Africa to settle their affairs on their own as the big boys they really are (Israel has its own nuclear bombs, and Iran soon will, as well), we will likely have to embrace everything in our tech quiver to quench our thirst for power.

S

Sunday, March 20, 2011

More or Less in Line.

Folks,

Throughout Human history, across all aspects of our many cultures, an innovation tends to continue toward either its terminal evolution or its highest development along the line laid out in its initial state or conception, a path dependency. This trend can be seen in everything from religion and mythology to rocketry and space flight.


Right now, today, we can see this pattern manifest in another tech realm as Japan endeavors to stitch its electrical grid back together after it was brought to its knees by the triple whammy of earthquake, tsunami, and a nuclear disaster brought about by Human muddling and over-optimism. Their problem today is the product of decisions made more than a century ago, when the big issue to solve was electrically powering newfangled, durable Edison light bulbs and textile shops automated by central electric motors that drove thick, wide bands of leather to power looms and sometimes shredded and killed workers. Nuclear power was then more than half a century in the future, and nuclear radiation had yet to be discovered. The country was so busy industrializing and modernizing in 1895 that it hadn't even gotten the knack of numbering new buildings by orderly street addresses, and thus would later embrace the fax machine to send maps to each other over the telephone lines.


So, what's the point? I guess it's just that if you've got a bright idea, take a look to the Long Now. Your notion might have the legs to change and form the future in unanticipated ways.

S

Saturday, March 19, 2011

Hacking Toward Freedom…

Folks,

Tech folklore claims that the Internet routes information around obstacles, even a nuclear holocaust. As we've seen from ongoing and recent events in China, Egypt, Libya and elsewhere, including the United States, this ain't necessarily so. Humans engineered the Internet, and the forces of repression can always install devices to spy on Internet users, and even hit a "kill switch" to cut off access to all or part of their populations as they please.

But, here's an article from the Economist magazine. It describes how rebels can build tools out of bubble gum and baling wire to preserve their access to and from the 'Net, despite the efforts of tyrants to silence them and stop their ability to organize and make trouble for the powers that be. Any means of transport, whether for bullion or bytes, email or electrons, can be hijacked by clever people. That microwave oven (tech first repurposed from military radar to cook crappy meals) on your kitchen counter might someday be a tool for democracy as potent as the printing press was in the American colonies of the Eighteenth Century. Organizations such as Tactical Tech and the Tor Project, supported by the Electronic Frontier Foundation, can show you how.
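For the curious, here's the flavor of such a workaround in a few lines of Python, using only the standard library. It is just a sketch of the fall-back idea: the relay address is purely hypothetical, and in practice a Tor bridge, a friend's server abroad, or some more exotic repurposed hardware would stand in for it.

```python
import urllib.request

URL = "http://example.org/"
RELAY = "http://127.0.0.1:8080"   # hypothetical relay; swap in a real bridge or proxy

def fetch(url):
    """Try the open Internet first; if that path is blocked, fall back to the relay."""
    try:
        return urllib.request.urlopen(url, timeout=10).read()
    except OSError:
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": RELAY, "https": RELAY})
        )
        return opener.open(url, timeout=10).read()

print(len(fetch(URL)), "bytes received")
```

The same handful of lines works whether the relay is a rented server, a neighbor's satellite link, or something stitched together from that bubble gum and baling wire.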

S


Friday, March 18, 2011

Edison, Jobs, Kurzweil!

Folks,

Some folks refer to Apple's Steve Jobs as the Edison of our age, building useful playthings and workplace tools that disrupt incumbent industries and transform popular culture. For Edison, it was the phonograph and motion picture camera that put the kibosh on the player piano and got folks off the front porch and into movie theaters on Saturday afternoons. With Jobs it was first the Macintosh, then the iPhone and iPad.


With both of those guys, what they essentially did was take a pile of already existing inventions and materials off the lab bench (beeswax, a lathe, digital circuit boards and aluminum) and configure them in ways that provided a superior way to be entertained or to do work. Their inventions are not novel in an essential sense. But, there is a guy named Ray Kurzweil who actually has invented machines that do things that machines could previously not do, machines that extend Human capacities beyond what is possible, or at least practical, with our own senses and limbs.


If you've ever used an optical character recognition program to convert printed text to digital information, you can thank Kurzweil. When you listen to Stevie Wonder play, you are hearing electronic instruments controlled by Kurzweil's voice pattern recognition engines and sonic wave forms created by the circuits that he designed.


But there's more. Kurzweil has a remarkable record of predicting the future of Humanity and tech. Starting with his book, "The Age of Intelligent Machines", and then in his more recent "The Age of Spiritual Machines", he forecasts a day coming soon, when our inventions will embody characteristics that more than mimic our own capacities in dexterous thought. They will appear to us to be conscious.


Given the solid batting average that he's thus far demonstrated, both in prediction and in commercializing the fruits of his ruminations, and despite his seemingly nutty ideas on human life extension, this fellow demands attention. We have already seen, this year, a machine named Watson that appears to understand subtle jokes and puns. Computers are emerging from university and industry labs that can "calculate" the state of their user's emotions from facial expressions and postures. "Androids" are being created that closely mimic the human form, right down to appearing to breathe and perspire.


Yes, do stay tuned to what this old whack job is forecasting. When the iPod is as distant a memory to civilization as rotating cylinders that seem to sing, the products of Kurzweil's work may still be evolving.


S

Ray Kurzweil