
The ghost in the iPad. Why tech becomes us


Newscaster Walter Cronkite, right, listens as Dr. J. Presper Eckert, center, describes the functions of the UNIVAC I computer he helped develop in the early 1950s. Modern computers are only a fraction of the size. (AP Photo)


In an early 1970s computer-science class, I learned to organize punch cards to run basic programs on an IBM computer so big and hot that it had its own supercooled building. The computer lab was one of the best places to hang out on 100-degree F. afternoons in Austin, Texas.

When the Fortran and COBOL were properly coded, the behemoth’s results emerged in minutes. Although laughably primitive by today’s standards, in 1972 the IBM mainframe felt like the future. Twenty-five years earlier, cybervisionary Alan Turing had pined for such a machine: “[B]eing digital should be of more interest than ... being electronic,” he said at a time when a “computer” was more commonly understood to be a gifted human with a pencil, paper, and instructions about what to calculate.

You know where this is going: Twenty-five years from now, iPads, BlackBerrys, and Android phones will be as quaint as the IBM mainframe or the defunct Macintosh 128K I found at the town dump last year. Moviemakers will use them as props to create a retro 2010 look. (“It was hilarious, Martha! He was using a ‘touch screen’ computer and a ‘virtual keyboard.’ Wha?”)
