Thursday, January 21, 2010

Sand to Silicon By Shivanand Kanavi, Internet Edition-4

NIRVANA OF PERSONAL COMPUTING

“The fig tree is pollinated only by the insect Blastophaga grossorum. The larva of the insect lives in the ovary of the fig tree, and there it gets its food. The tree and the insect are thus heavily interdependent; the tree cannot reproduce without the insect; the insect cannot eat without the tree; together they constitute not only a viable, but also a productive and thriving partnership. The purposes of this paper are to present the concept of and hopefully foster the development of man-computer symbiosis.”

— J.C.R. LICKLIDER, in his pioneering paper, ‘Man-computer symbiosis’ (1960).

“What is it [PC] good for?”

—GORDON MOORE, co-founder of Intel, recalling his response to the idea proposed by an Intel design team, of putting together a personal computer based on Intel’s chips in 1975.


In the last chapter we saw glimpses of the power of computers to crunch numbers and their universal ability to simulate any machine. We also looked at some applications of this power. However, they were engineering or business applications, the plumbing of today’s society, necessary and critical for running the system but hidden from most eyes. The users of those applications were a restricted set of academics, engineers, large corporations and government agencies.

Yet today there are nearly half a billion computers in the world and it is ordinary people who are using most of them. How did this happen and what new work habits and lifestyles have computers spawned? What new powers have they brought to our fingertips? How did machines that earlier intimidated users by their sheer size and price proliferate so widely and change our lives? These are some of the issues we will examine in this chapter.

GETTING PERSONAL

Instead of an apocryphal case study, let me narrate my own journey as a computer user. As a postgraduate student in physics, I took an undergraduate course in programming at IIT, Kanpur, more than thirty years ago. That was my introduction to computing and computers. The course looked interesting, but preoccupied as I was with classical and quantum physics, I looked at computers merely as an interesting new tool for numerically solving complicated equations in physics, especially when I could not find an exact solution through mathematical ingenuity.

What I did find was that computers were useful in dressing up my laboratory reports. A routine experiment, repeating the method devised by Robert Millikan (Nobel Prize for physics, 1923) to determine the charge of an electron, and another dealing with the vibrational spectra of diatomic molecules, needed some numerical calculations. I knew that the lab journal would look good if I added several pages of computer printout.

I used an IBM 1620 at IIT, Kanpur’s computer centre. It was an advanced computer for 1972-73, occupying an entire room and reading my FORTRAN program from a deck of punched cards. It took a whole day’s wait to get the printout and find out whether there was an error in the work or the results were good. My achievement of using the computer for a routine lab experiment looked forbiddingly opaque and impressive to the examiner. He smelt a rat somewhere but could not figure out where, though he remarked that the experimental errors did not warrant a result correct to eight decimal places. He did not become privy to the cover-up I had done inside the lengthy computer printout, and I got an A in the course. Moral of the story: computer printouts look impressive. Numbers can hide more than they reveal.

My next brief encounter with computers, this time a time-sharing system at graduate school in Boston (1974-77), was perfunctory. My research problem did not involve numerical computing, since I was investigating the rarefied subject of ‘supersymmetry and quantum gravity in eight-dimensional space’. But for the first time I saw some of my colleagues use a clacking electric typewriter in a special room in the department, with a phone line and a coupler to connect them to the main computer, a CDC 6600. They would type some commands and, seconds later, a brief response would manifest itself on the paper. For someone accustomed to daylong waits for printouts, this appeared magical.

On my return to India, I took up another research problem, this one at IIT, Bombay (1978-80). It was a more down-to-earth problem in quantum physics, and needed some numerical calculations. My thesis advisor, an old-fashioned slide-rule-and-pencil man, depended on me to do the computations. Though there was a Russian mainframe at the IIT campus, I did my initial calculations on a DCM programmable calculator in the department. Having proved our hunches regarding the results, we needed a more powerful computing device.

We discovered a small Hewlett-Packard computer in a corner of the computer centre. It needed paper tape feed and had blinking lights to show the progress in computation. The BASIC interpreter, which had to be loaded from a paper tape after initialising the computer, made it interactive—the errors in the program showed up immediately and so did the result when the program was correct. We were overjoyed by this ‘instant gratification’ and higher accuracy in our computation. We went on to publish our research results in several international journals. Clearly, interactivity, however primitive, can do wonders when one is testing intuition through trial and error.

Ten years and some career switches later, I had become a writer and was glad to acquire an Indian clone of the IBM PC, powered by an Intel 286 microprocessor. It could help me write and edit. The word processing and desktop publishing function immediately endeared the PC to me, and continues to do so till today. I think the vast majority of computer users in the world today are with me on this.

In the early 1990s I was introduced to Axcess, a pioneering e-mail service in India, at Business India, where I worked as a journalist. It became a handy communication medium. Then we got access to the World Wide Web in the mid-’90s, thanks to Videsh Sanchar Nigam Ltd (VSNL), and a new window opened up to information, making my job as a business journalist both easy and hard. Easy, since I could be on par with any journalist in the world in terms of information access through the Internet. Hard, because the speed of information services suddenly shot up, increasing the pressure to produce high-quality stories before someone else posted the same information on the Internet. The computer as a communication tool and an information appliance is another story, which we will deal with later.

The purpose of this rather long autobiographical note is to communicate the enormous changes in computing from central mainframes, to interactive systems, to personal computing. Older readers might empathise with me, recalling their own experience, while younger ones might chuckle at the Neanderthal characteristics of my story.

Nevertheless, it is a fact that the change has turned a vast majority of today’s computers into information appliances.

SYMBIOTIC VISION

One of the visionaries who drove personal computing more than forty years ago was J.C.R. Licklider. Lick, as he was fondly called, was not a computer scientist at all, but a psycho-acoustics expert. He championed interactive computing relentlessly and created the ground for personal computing. In a classic 1960 paper, Man-Computer Symbiosis, Licklider wrote, “Living together in intimate association, or even close union, of two dissimilar organisms is called symbiosis. Present day computers are designed primarily to solve pre-formulated problems, or to process data according to predetermined procedures. All alternatives must be foreseen in advance. If an unforeseen alternative arises, the whole procedure comes to a halt.

“If the user can think his problem through in advance, symbiotic association with a computing machine is not necessary. However, many problems that can be thought through in advance are very difficult to think through in advance. They would be easier to solve, and they could be solved faster, through an intuitively guided trial and error procedure in which the computer cooperated, showing flaws in the solution.”

Licklider conducted an experiment on himself, which he quoted in the same paper. “About eighty-five per cent of my thinking time was spent getting into a position to think, to make a decision, to learn something I needed to know. Much more time went into finding or obtaining information than into digesting it. My thinking time was devoted mainly to activities that were essentially clerical or mechanical: searching, calculating, plotting, transforming, determining the logical or dynamic consequences of a set of assumptions or hypotheses, preparing the way for a decision or an insight. Moreover, my choices of what to attempt and what not to attempt were determined to an embarrassingly great extent by considerations of clerical feasibility, not intellectual capability. Cooperative interaction would greatly improve the thinking process.”

Licklider left MIT to head the Information Processing Techniques Office of the Advanced Research Projects Agency, ARPA, attached to the US defence department. He funded and brought together a computer science community in the US in the early 1960s. He also encouraged the development of computer science departments for the first time at Carnegie Mellon, MIT, Stanford and the University of California at Berkeley.

“When I read Lick’s paper in 1960, it greatly influenced my own thinking. This was it,” says Bob Taylor, now retired to the woods of the San Francisco Bay Area. Taylor worked as Licklider’s assistant at ARPA and brought computer networks into being for the first time, through the Arpanet. But that is another story, which we will tell later. For the time being it is important to note that after he left Arpa, Taylor was recruited by Xerox to set up the computing group at the Palo Alto Research Centre, the famous Xerox Parc.

THE SPARK AT PARC

One can safely say that in the 1970s Xerox Parc played the same role in personal computing that Bell Labs had played in the history of communications. Besides articulating the ideas of interactive and personal computing, Parc pioneered the point-and-click interface, with its ‘mouse’ and layered windows, that was later used by the Apple Macintosh and Microsoft Windows, and it invented the laser printer. Parc also encouraged the design of graphics chips, which led to Silicon Graphics, and championed VLSI technology along with Carver Mead of Caltech. Smalltalk, an object-oriented language that heavily influenced C++ and Java, also originated there. The Ethernet was created at Parc to build a local area network, and so was the Bravo word processor, which led to Microsoft Word.

No other group can claim to have contributed so much to the future of personal computing.

Xerox made a couple of billion dollars from the laser printer technology invented at Parc, thereby more than recovering all the money it invested in open-ended research at the centre. However, as a document copier company faced with its own challenges, it could not win the PC battle. Xerox has been accused of “fumbling the future”, but interested readers can get a well-researched and balanced account of Xerox’s research in Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age by Michael Hiltzik.

The role played by Xerox PARC in personal computing shows once again that a spark is not enough to light a prairie fire: one needs dry grass and the wind too.

Two important factors fuelled the personal computing revolution: the well-recognised trend of hardware becoming cheaper and faster, and the development of software.

MICRO IN SIZE BUT MEGA IN POWER

It is educative to note that the development of the microprocessor, which is the central processing unit of a computer on a single chip, was not the result of a well-sculpted corporate strategy or market research, but of serendipity. An Intel design team led by Ted Hoff and Federico Faggin had been asked by a Japanese calculator maker to develop a set of eight logic chips. Instead of designing a different chip for every calculator, the Intel team decided to design a microprocessor that could be programmed to exhibit different features and functions for several generations of calculators. The result was the world’s first microprocessor— the Intel 4004.

While launching the 4004 in 1971, Intel pointed out in an ad in Electronic News that the new chip equalled the capacity of the original ENIAC computer though it cost less than $100. Gordon Moore went further and described the microprocessor as “one of the most revolutionary products in the history of mankind”. A prescient declaration no doubt, but it was mostly dismissed as marketing hype. The world was not ready for it.

A year later Intel designed a more powerful 8-bit processor for a client interested in building computer terminals. The Intel 8008 led to glimmers of interest that it could be used inside business computers and for creating programmable industrial controls. But programming the 8008 was complicated, so very few engineers incorporated it in their industrial programmable controllers. Hobbyists, however, thought it would be cool to use the chip to wire up their own computer, if they could.

Among these hobbyists were the sixteen-year-old Bill Gates and the nineteen-year-old Paul Allen. All their enthusiasm and ingenuity could not make the microprocessor support the BASIC programming language. So they instead made machines that could analyse traffic data for municipalities. They called their company Traf-O-Data. Several people were impressed by their machine’s capability, but nobody bought any.

Then, two years later, Intel introduced the Intel 8080, ten times more powerful. That kindled immediate interest. A company named MITS announced the first desktop computer kit, called the Altair 8800, at a price of less than $400. Featured on the cover of Popular Electronics magazine’s January 1975 issue, the Altair 8800 marked a historic moment in personal computing.

But, as Bill Gates and Paul Allen discovered, the machine did not have a keyboard or display or software, and could do no more than blink a few lights. True to form, the name Altair itself came from an episode of the then hugely popular sci-fi TV serial, Star Trek. Gates kick-started his software career by writing a version of BASIC for the Altair. Later he took leave from Harvard College and, along with Paul Allen, started a microcomputer software company, Microsoft, at Albuquerque, New Mexico.

BIG BLUE AND THE PC

Intel then came out with a 16-bit microprocessor, the 8086, and a stripped-down version, the 8088. At that point ‘Big Blue’, that is, IBM, got into the act. Using Intel’s processor and Microsoft’s MS-DOS operating system, IBM introduced the PC in 1981. It took some time for the PC to start selling, because enough software applications had to be written for it to be useful to customers, and a hard disk capable of holding a few megabytes of data had still to be attached. This happened in 1983, when IBM introduced the PC-XT. Spreadsheets such as VisiCalc and Lotus 1-2-3, database management software like dBASE, and word processing packages like WordStar were written for it.

Meanwhile, a fledgling company, Apple Computer, introduced the Macintosh, which used a Motorola chip. The Macintosh became instantly popular because of its graphical user interface, with its mouse and multi-layered windows.

The PC caught up with it when the text-based MS-DOS commands were replaced by a graphical operating system, Windows. Since then Windows has become the dominant operating system for personal computing.

What is an operating system? It is the software that comes into play after you ‘boot’ the PC into wakefulness. Few people today communicate directly with a computer’s hardware. Instead, they communicate via an operating system. The more user-friendly the operating system, the easier it is for people to use a computer. Although we talk of an operating system as if it were a single entity, it is actually a bunch of programs functioning as a harmonious whole: interpreting the user’s commands, managing the hard disk with a provision to amend the data and programs on it, sending results to the monitor or printer, and so on.

Without an operating system a computer is not much more than an assembly of circuits.
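For readers who like to peek under the hood, here is a toy sketch in the Python language of the kind of mediation an operating system’s command interpreter performs. It is purely my illustration, hypothetical and vastly simpler than any real operating system: it accepts a command from the user, calls on the machinery that manages files, and sends the result to the screen.

```python
# A toy command interpreter: a crude, hypothetical sketch of the mediation an
# operating system shell performs between the user and the machine.
# Real operating systems are, of course, vastly more complex.
import os

def shell():
    while True:
        command = input("> ").strip()
        if command == "dir":                    # list the files on the 'disk'
            print("\n".join(os.listdir(".")))
        elif command.startswith("type "):       # display a file's contents
            name = command[len("type "):]
            if os.path.exists(name):
                with open(name) as f:
                    print(f.read())
            else:
                print("File not found")
        elif command == "exit":                 # hand control back
            break
        else:
            print("Bad command or file name")

if __name__ == "__main__":
    shell()
```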

A MULTIFACETED DEVICE

The significance of the proliferation of inexpensive peripheral devices, such as dot matrix printers, inkjet printers, scanners, digital cameras, CD and DVD drives and multimedia cards, should never be underestimated. They have brought a rich functionality to the PC, moving it increasingly into homes.

Spreadsheets were the first innovation that caught the imagination of people, especially in accounts and bookkeeping. Bookkeepers and engineers have for long used the device of tabular writing to record accounts and technical data. While this was a convenient way to read the information afterwards, making changes in these tables was a tedious affair.

Imagine what happens when the interest rate changes in a quarter, and what a bank has to do. It has to recalculate all the interest entries in its books. Earlier this took ages. An electronic spreadsheet, however, allows you to define the relation between the cells of the various columns and rows. So, in the case of the bank above, a formula for calculating interest is written into the background of the interest column; when the interest rate is changed, the figures in the interest column are automatically recalculated.

Imagine yourself as a payroll clerk preparing cheques and salary slips and the income tax rates change, or you are a purchase officer and the discounts given by suppliers change. The whole spreadsheet changes by itself with the modification of just one value. Isn’t that magical?
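For the technically curious, the essence of this ‘magic’ can be caught in a few lines. What follows is a toy sketch in the Python language, purely my illustration and nothing like the internals of a real spreadsheet program: every entry in the interest column is a formula that refers to a single rate cell, so changing the rate recomputes every row.

```python
# A toy 'spreadsheet' column: each interest entry is a formula that refers
# to a single rate cell, so changing the rate recalculates every row.
# (Illustrative sketch only; real spreadsheet software works very differently.)

principals = [10_000, 25_000, 40_000]   # hypothetical deposit amounts
rate = 0.08                             # the single cell holding the interest rate

def interest_column():
    # the 'formula' written into the background of the interest column
    return [p * rate for p in principals]

print(interest_column())    # [800.0, 2000.0, 3200.0]

rate = 0.09                 # the quarter's rate changes...
print(interest_column())    # ...and every entry is recalculated: [900.0, 2250.0, 3600.0]
```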

FINANCIAL SECTOR DRIVES TECHNOLOGY

These electronic spreadsheets made the accountant’s life a lot easier. Computer use has since spread across the financial sector. Banks, stock exchanges, currency trading floors, investment bankers, commodity dealers, corporate finance departments and insurance companies have not only been early users of cutting-edge technology but even drive the creation of new technology. Database management, real-time event-driven systems, networking, encryption, transactions, disaster recovery, 24x7, aggregation, digital signatures, ‘publish and subscribe’ and so on are phrases that have come into the software engineer’s jargon thanks to demand from the financial markets.

The dealing room of an investment banker today looks very similar to an air traffic control tower or a space launch command centre – with banks of monitors, high-end software and failsafe infrastructure. One can get a very readable and graphic account of this in The Power of Now by Vivek Ranadive, whose event-driven technology has become part of the ‘plumbing’ across much of Wall Street.

Word processing and desktop publishing, or DTP, have come as a boon to everyone. After all, everybody writes, scratches off words and sentences, rewrites and prints. Isn’t it cool that you need not worry about your scribbled handwriting or an over-written letter that exposes the confusion in your mind? In a formal document, different fonts can separate the heading, sub-headings and text, and you can quickly incorporate charts, graphs and pictures as well.

These developments have had an impact well beyond the large office, and will continue to spread the benefits of computerisation.

THE BHASHA† EXPLOSION

If printing presses democratised knowledge to a great extent, then word processing and DTP have brought printing and publishing to our homes.
______________________
†An Indian language.

The development of Indian language fonts and the software for DTP have given a remarkable boost to publishing and journalism in Indian languages.

“It was not money which drove us but the realisation that languages die if scripts die. If we want to retain and develop our rich cultural heritage then Indian language DTP is a necessity,” says Mohan Tambey, who passionately led the development of the graphic Indian script terminal, GIST, at IIT, Kanpur, during his M.Tech and later at C-DAC, Pune.

During the late ’80s and early ’90s, GIST cards powered Indian language DTP all over the country. Today software packages are available off the shelf. Some, like Baraha 5, a Windows-compatible Kannada word processor, are being freely distributed through the Internet. A tough nut to crack is developing Indian language spell-check programs and character recognition software. “The latter would greatly advance the work of creating digital libraries of Indian literature, both traditional and modern,” says Veeresh Badiger of Kannada University, Hampi, whose group is involved in researching ancient Kannada manuscripts. It is a non-trivial problem due to the complexities of compounded words in Indian languages.

RAJA RANI DEKHO†

Meanwhile, English language users can add a scanner to their PC and, by using optical character recognition software, digitise the text of a scanned document and build their own personal digital library. They can even clean a scanned image or fill it with different colours before storing it.

The capability to add multimedia features with a CD or DVD player has converted the PC into a video game console or an audio-video device, making it a fun gadget.

Instead of blackboards and flip charts, people are increasingly using PC-based multimedia presentations. The users are not just corporate executives, but teachers and students too. “Many people do not know that PowerPoint is a Microsoft product. It has become a verb, like Xerox,” muses Vijay Vashee, the ex-Microsoft honcho who played a significant role in developing the product.
______________
†Form of rural entertainment for children with visuals of fascinating places and objects.

A NOMAD’S COMPANION

The advent of laptops added a new dimension to personal computing— mobility. Though more expensive than desktop PCs, and used mainly by nomadic executives, laptops have become an integral part of corporate life.

To make presentations, work on documents and access the Internet when you are travelling, a laptop with an in-built modem is a must, preferably with a compatible mobile phone. In a country like the US, where there are very few ‘cyber cafes’, a travelling journalist or executive would be cut off from his e-mail if he did not have his laptop with him.

The major technical challenge in developing laptops has come from display and battery technologies. Creating an inexpensive, high-resolution flat screen is one of the main problems. “People all over are working on it,” says Praveen Chaudhari, a thin-film solid-state physicist at IBM’s T.J. Watson Research Centre, Yorktown Heights. In 1995 Chaudhari won the National Medal of Technology for his contribution to magneto-optic storage technology, and he was recently named director of the prestigious Brookhaven National Laboratory. His own work in developing the technology for large and inexpensive thin-film displays might have a significant impact in this field.

“As for battery life, the benchmark for laptops in the US is 4.5 hours, since that is the coast-to-coast flight time,” remarks Vivek Mehra, who played a key role in Apple’s Newton II project. The Newton, a personal digital assistant, failed, but Mehra successfully applied what he learned there about consumer behaviour in the company he founded later: Cobalt Networks.

“In the case of all portables—laptops, PDAs or cellphones—a lightweight, powerful battery is the key,” says Desh Deshpande, well known for his enterprises in optical networking. Deshpande is also the chairman of a company that is commercialising nanotechnology developed at MIT to produce better batteries.

In the mid-’90s, when multimedia applications began to be used extensively on desktops, there was a scramble to include these features in laptops. Prakash Agarwal, a chip designer, took up the challenge and designed a new chip that combined logic with memory on a single piece of silicon. Memory and logic on one chip created magic and brought multimedia capabilities to laptops. Appropriately, Agarwal named his company NeoMagic. At one time his chips powered about seventy per cent of the laptops in the world.

Designing chips that work at lower and lower voltages is another problem. “Lower voltages lead to lower power consumption and less heat generated by the chip, which needs to be dissipated. But this is easier said than done,” says Sabeer Bhatia of Hotmail fame. Few people know that before he became the poster boy of the Internet, Bhatia was working hard to reduce the voltages in chips at Stanford University.
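Why this obsession with voltage? A rough textbook approximation, my addition and not something the engineers quoted here said, is that the dynamic power dissipated by a CMOS chip goes as

\[
P_{\text{dynamic}} \approx \alpha\, C\, V^{2} f
\]

where α is the fraction of gates switching at any instant, C the capacitance being charged and discharged, V the supply voltage and f the clock frequency. Because power grows as the square of the voltage, dropping the supply from, say, 5 volts to 3.3 volts cuts dynamic power by more than half, which is why designers chase ever lower voltages despite the difficulty.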

IN YOUR PALMS

Not many people know that Sam Pitroda, whose name is associated with the Indian telecom revolution, is also the inventor of the digital diary, that handy gizmo which helps you store schedules, addresses, telephone numbers and e-mail addresses. Gradually digital diaries became more powerful and evolved into personal digital assistants, or PDAs. With models available at less than $100, PDAs are fast proliferating among travellers and executives. They not only store addresses and appointments, they also contain digital scratch pads, and can access email through wireless Internet!

I came to know of another function of PDAs almost accidentally. An American software entrepreneur struck up a conversation with me as we waited outside Los Angeles airport. After picking my brain about the Indian software industry, he said at the end, “Shall we beam?” I had no idea what he was talking about. It turns out that ‘beaming’ is a new way of exchanging visiting cards. On returning from a conference or a business trip, it is a pain to input all the data from visiting cards into your computer or address book. A PDA can store your digital visiting card and, at the touch of a button, transmit the information over an infrared beam to a nearby PDA.
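What does such a ‘digital visiting card’ look like inside? Here is a minimal sketch in the Python language, in the spirit of the vCard text format many PDAs used; the exact format and the infrared transfer protocol vary from device to device, and all the names and numbers below are hypothetical.

```python
# A minimal sketch of a 'digital visiting card' as a structured text record,
# in the spirit of the vCard format. Illustrative only: real PDAs differ in
# both the card format and the infrared transfer protocol they use.

def make_card(name: str, phone: str, email: str) -> str:
    return "\r\n".join([
        "BEGIN:VCARD",
        "VERSION:2.1",
        f"FN:{name}",
        f"TEL:{phone}",
        f"EMAIL:{email}",
        "END:VCARD",
    ])

# What one PDA might 'beam' to another at the touch of a button:
print(make_card("A. Hypothetical Entrepreneur", "+1-555-0100", "someone@example.com"))
```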

No wonder a Neanderthal like me, using a dog-eared diary, was zapped. But that is what happens when the giant computers of von Neumann’s days become consumer appliances. People invent newer and newer ways of using them.

WORKHORSES

While PDAs and laptops constitute one end of personal computing, workstations constitute the other. Workstations are basically high-end PCs tuned to specialised use. Graphics workstations, for instance, are used in computer graphics and in feature-rich industrial design. Say you would like to see how a particular concept car looks: the workstation can show it to you in a jiffy, in three dimensions. Attached to a manufacturing system, such workstations can convert the final designs into ‘soft dies’ in die-making machines, to create prototype cars. The reason the Engineering Research Centre at Tata Motors was able to launch its popular Indica quickly was that it used such applications to reduce the concept-to-commissioning cycle time.

As we saw earlier in the chapter on microchips, it is workstations that help engineers design chips.

High-end workstations can do computational fluid dynamics studies to help in aerodynamic design, as they did in the wing design of the Indian light combat aircraft, or are doing in the design of the civilian transport aircraft, Saras, at the National Aerospace Laboratory (NAL), Bangalore.

Roddam Narasimha, a distinguished expert in aerodynamics, took the lead in building a computer called Flosolver at NAL, which could do complex computations in fluid dynamics. Of course, he did not use workstations; he built a parallel computer.

INDIAN AT THE OSCARS

Among the early users of graphics technology were advertising, TV and films. As a result, today’s heroes battle dinosaurs in Jurassic Park and ride runaway meteors in Armageddon, or an antique on the table turns into an ogre and jumps at the Young Sherlock Holmes.

As Harish Mehta, one of the founders of Nasscom (the National Association of Software and Services Companies) puts it, “The Indian computer software industry should work closely with the entertainment industry to produce a major new thrust into animation and computer graphics”.

Not many people know that during the Star Wars production in the 1970s, George Lucas, Hollywood’s special effects genius, used some of the technology developed by an Indian academic-turned-entrepreneur, Bala Manian. Pixar, another well-known computer graphics company, also used a piece of Manian’s technology of transferring digital images on to film—a technology he had developed in the ’60s for use by medical experts looking at X-ray films.

Manian was honoured for his contribution to Hollywood’s computer graphics technology with a technical Oscar in 1999. “The screen in the auditorium showed a clip from Adventures of Young Sherlock Holmes, one of the many films that used my technology, as they announced my name,” reminisces Manian, a shy academic with wide-ranging interests in optics, biomedical engineering and bio-informatics.

BREATHING LIFE INTO SILICON

Gordon Moore’s question at the beginning of the chapter—“What is it [the PC] good for?”, asked when an Intel brain trust suggested in 1975 that the company build a PC—has to be understood in its context. Though Intel had the chips to put together a PC, without the requisite software it would have been a curiosity for electronics hobbyists, not a winner.

Today an increasing amount of software, capable of doing diverse things, has breathed life into silicon. Before the advent of the PC, there was hardly any software industry. The birth of the PC went hand in hand with the birth of the packaged software industry. If programming languages like BASIC, FORTRAN and COBOL hid the complexities of the mainframe from the programmer and let him concentrate on the modelling task at hand, packaged software created millions upon millions of consumers who employ the computer as an appliance to carry out a large number of tasks.

The complexities of programming, and of the mathematical algorithms behind word processing, image processing or graphics software, are left to the software developers. A draftsman or a cartoonist need not worry about the Bézier curves or spline functions involved in a graphics package; a photo-journalist downloading images from a digital camera into his PC for processing and transmission need not worry about coding theorems, data compression algorithms or Fourier transforms; a writer like me need not know about the piece tables behind my word processor while editing.
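To give a flavour of what stays hidden, the smooth curves a drawing package sweeps through a handful of control points are typically cubic Bézier curves, given by the standard textbook formula (my illustration, not anything the draftsman ever sees):

\[
B(t) = (1-t)^{3}P_{0} + 3(1-t)^{2}t\,P_{1} + 3(1-t)t^{2}\,P_{2} + t^{3}P_{3}, \qquad 0 \le t \le 1,
\]

where P0 to P3 are the four control points. The package simply evaluates this for hundreds of values of t and joins the dots; the artist just drags the handles.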

SPIRALS

Let us step back a bit. Intel’s failure to realise the opportunity in PCs, or Xerox’s inability to commercialise the PC technology developed at its Palo Alto Research Centre under Bob Taylor’s leadership, should be viewed with circumspection.

Every decision needs to be looked at in its historical context, not with 20:20 hindsight. In real life, the future is never obvious. In every decision there is an element of risk; if it succeeds, others can look back and analyse what contributed to the success. But that does not guarantee a winning formula. Success is contextual, and the context is constantly changing. Also there are the unknown parameters we call luck.

Bill Gates describes positive and negative spirals in business successes and failures while analysing the runaway success of MS-DOS, Windows and Microsoft Office. His analysis shows that it is not the brilliance of one individual and his ‘vision’ that leads to success, but a host of factors acting together and propelling a trend forward.

Gates is sober enough to realise that he has been ‘lucky’ in the PC revolution and does not know whether he will be similarly successful in the Internet world, despite the tremendous resources, hard work, and focused research at Microsoft.

ECOSYSTEM OF A REVOLUTION

Clearly, all that we have discussed in this chapter shatters a popular romantic myth that long-haired school dropouts working out of garages in Silicon Valley developed the PC. The PC triumph was the result of a vision carefully articulated by a number of outstanding psychologists, computer scientists, engineers and mathematicians, supported by almost open-ended funding from the US defence department’s Advanced Research Projects Agency and from corporations such as Xerox, Intel and IBM.

The self-driven entrepreneurship of many individuals played a major role in the advancement of personal computing. To name a few prominent ones:

• Digital Equipment Corporation’s Ken Olsen, who created the PDP minicomputer;

• Apple’s Steve Jobs and Steve Wozniak;

• Microsoft’s Bill Gates and Paul Allen;

• Jim Clark of Silicon Graphics, famous for its computer graphics applications including animation and special effects;

• Andy Bechtolsheim, Bill Joy, Vinod Khosla and Scott McNealy of Sun Microsystems, which created workstations to fuel chip design and industrial design;

• John Warnock and Charles Geschke of Adobe Systems, who created software for desktop publishing, image processing and so on.

The contributions of several hardcore technologists cannot be ignored either:

• Wesley Clark, who developed the TX-2 at MIT, the first interactive small computer with a graphic display, in the 1950s;

• Douglas Engelbart (Turing Award 1997), who developed the mouse and the graphical user interface at the Stanford Research Institute;

• Alan Kay, who spearheaded the development of overlapping windows, ‘drag and drop’ icons and ‘point and click’ technologies to initiate action, with his object-oriented programming language, Smalltalk, at the Xerox Palo Alto Research Centre;

• Carver Mead, who propounded VLSI (very large scale integrated circuit) technology at Caltech and Xerox Parc, which is today testing the physical limits of miniaturisation in electronics;

• Gary Kildall, who created the first PC operating system, CP/M, along with the BIOS (basic input/output system);

• Dan Bricklin and Bob Frankston, with the first electronic spreadsheet, VisiCalc; similarly, the inventors of WordStar and dBASE, which made the first PCs ‘useful’;

• Tim Paterson, the creator of MS-DOS;

• Mitch Kapor and Jonathan Sachs, with their spreadsheet Lotus 1-2-3;

• Butler Lampson (Turing Award 1992) and Charles Simonyi, with their Bravo word processor at Xerox Parc;

• Gary Starkweather, with his laser printer at Xerox Parc, to name a few.

Then there are the thousands who were part of software product development: writing code, testing programs, detecting bugs, and supporting customers. Similarly, hardware design and manufacturing teams came up with faster and better chips, and marketing teams spread the gospel of personal computing.

And let us not forget the thousands who tried out their ideas and failed.

I am not trying to run the credits, as at the end of a blockbuster movie. What I want to emphasise is that any real technology creation is a collective effort. The story of the PC revolution, when objectively written, is not pulp fiction with heroes and villains but a Tolstoyesque epic. It involves a global canvas, a time scale of half a century and thousands upon thousands of characters. It is definitely not, as sometimes portrayed in the media, the romantic mythology of a few oracles spouting pearls of wisdom, or flamboyant whizkids making quick billions.

AND IT CONTINUES….

To illustrate the behind-the-scenes activity that fuels such a revolution, let me summarise a report from the February 2003 issue of the IEEE Spectrum magazine. Recently, about fifty software and hardware engineers and musicians from several companies shed their company identities and brainstormed for three days and nights at a Texas ranch. What were they trying to solve? The next quantum algorithm? A new computer architecture? A new development in nanotechnology that might extend the life of Moore’s Law?

No. They had gathered to solve the problems of the digital bits that translate into bongs, shrieks, beeps, honks and an occasional musical interlude when you boot your PC or when you blast a monster in a computer game. And they have been doing this, at Project BarbQ, for the last six years!

A movement or a quantum leap in technology is the result of an ecosystem. Individual brilliance in technology or business acumen counts only within that context.

FURTHER READING

1. The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal—M. Mitchell Waldrop, Viking, 2001.

2. The Road Ahead—Bill Gates, Viking-Penguin, 1995.

3. Dealers of Lightning: Xerox Parc and the Dawn of the Computer Age—Michael Hiltzik, HarperCollins, 1999.

4. Inside Intel: Andy Grove and the Rise of the World’s Most Powerful Chip Company—Tim Jackson, Dutton-Penguin, 1997.

5. Kannada Manuscripts and Software Technology (in Kannada)—Dr Veeresh S. Badiger, Prasaranga, Kannada University, Hampi, 2000.

6. Digital Image Processing—Rafael C. Gonzalez and Richard E. Woods, Addison-Wesley, 1998.

7. Computer Graphics—Donald Hearn and M. Pauline Baker, Prentice Hall of India, 1997.

8. Indians @ USA: How the Hi-tech Indians Won the Silicon Valley—Business India special issue, Jan 22-Feb 6, 2001.

9. Beyond Valuations—Shivanand Kanavi, Business India, Sept 17-30, 2001 (http://reflections-shivanand.blogspot.com/2007/08/tech-pioneers.html).

10. His Masters’ Slave—Tapan Bhattacharya, CSIR Golden Jubilee Series, 1992.

11. The Power of Now—Vivek Ranadive, McGraw-Hill, 1999.