I've been paying close attention to developments in technology for over half a century, which has led me to study the history of technology (which is much more interesting and makes more sense than the political history they teach you in high school). That means I'm able to take the long view of current events. I look at them as the latest instant in a half-million-year story beginning with some monkey-like ancestors who noticed that by banging two rocks together, they could produce a sharp edge useful for slicing their neighbors into bite-sized pieces.
That makes me a serious skeptic of the latest whizz-bang! gizmos and gadgets. It, of course, does not endear me to marketers who want everyone to get excited about what they've thought up to sell next.
Don't get me wrong. There's lots going on at the cutting edge of technology. You can't buy it, though. What you find proudly displayed at Best Buy is technology I covered twenty years ago when I was Chief Editor at Test & Measurement World.
A case in point is the ability to have a face-to-face conversation through your real-time-video-capable handheld mobile communication device (smartphone for short). This week BlackBerry was sending me (unwanted) messages pushing their latest smartphones as if I were dying to pony up a bunch of bucks to experience the "latest and greatest." One of the most impressive features (aside from 4G speed, which would be convenient, but is really only specsmanship) is the aforementioned face-to-face conversation.
From the long view, this is ancient technology. Science fiction writers predicted it in the 1940s and 1950s. It was test-marketed in the late 1960s. Called the "videophone" then, folks who had the equipment generally blocked the video portion. "Too much information," was the consensus. The important information in nearly all conversations could be exchanged by voice. The rest was of little value and potentially embarrassing. Yet, marketers constantly drag it out as the next great thing whenever they get the chance.
Another "innovation" that keeps getting promoted is the touch-screen keyboard. Folks who use a mechanical keyboard a lot wouldn't touch a touch-screen keyboard on a bet! (Pun very much intended.) The tactile feedback that naturally comes from a mechanical keyboard is critically important. You only use a touch-screen keyboard when you can't get an array of buttons. It's second tier at best, so don't try to convince me it's the latest and greatest!
Finally, let's look at the whole idea of Web-enabled mobile devices. Back in the 1970s we ("we" being scientists and engineers constantly being lacerated by the cutting edge of computer technology) had our own, personal, "minicomputers," and were using novel networking technology to connect them to the big brains of the mainframe computers owned by the university and shared by whoever had access. We always knew when Administration was running payroll because nobody else could get anything done.
There were also so-called "dumb terminals" available that couldn't do anything but send information back and forth to the mainframe. We used our personal computers to get actual work done, and store information that we wanted to keep to ourselves. We saved mainframe access for simulation programs that took forever to run, anyway.
Today, however, the latest and greatest seem to be things that just talk to the "cloud." They have little ability to store or process information locally -- and now we're worried about hackers stealing our important data! They're dumb terminals at best.
The dilemma I'm faced with in this column is what technology stories to cover. Do I write about the exciting cutting-edge technology advances being developed today, which just-folks can't get their hands on and wouldn't have a use for if they could, or do I cover the old, boring (to me) technology being flogged by mass marketers as the latest and greatest?