The fact that COBOL is around and going
strong - even though it was obsolete
by 1965 or so [...] says something
about the power of the familiar.
Alan Kay1
The history of computers is often seen as a history of singular events, typically compiled into timelines3. The events most often cited may be the ENIAC, the PDP-11, the Apple II, the VISICALC spreadsheet. One must, however, remember that these events more often than not were manifestations of what was happening at the time; if Apple had not become successful with the first personal computer, someone else would have - maybe Cromemco, maybe Texas Instruments, maybe Lee Felsenstein with the SOL.
To find out what will happen in the near future, one could go to the research laboratories, look at what is generating interest there, and then with some certainty say when these inventions will be applied to real-world use and marketed commercially4. A roundup of the technologies in the labs as of today may give us neural networks, broadband ISDN, chaos theory, gallium arsenide microchips, groupware and massively parallel computers. All of these technologies will be available before 2000; in fact, most of them are available today in one form or another. Determining which of them will have the largest impact on organizations is more like informed guesswork than science, since the number of unidentifiable variables is enormous and the influence of future unrelated events is large and unpredictable. Predictions are therefore often based on extrapolations of previous technical developments, limited by today's technological hurdles, such as the "speed limit" on microprocessors caused by the departure from the statistically based behavior of electrons as processor speeds go up.
Yet the factors most influencing organizational structure and behavior are often determined outside the research laboratories: more often than not, the technically sophisticated loses out to the standardized, the understandable, and the available. In the world of information technology, Herbert Simon's administrative man lives and decides. Moreover, technical breakthroughs are like earthquakes: we know they will come, but the timing is a problem.
I believe that in the next three to five years, maybe even through the next decade, the main influence from technology on organizations will be the proliferation of computers itself: the availability of more powerful and cheaper computers that can be connected together will in itself spawn new ways to use them. I do not foresee any new "spreadsheet"; the main influence computers will have on organizations will be their ubiquity. Computers will be even cheaper, even smaller, easier to use, faster, and above all the need for specialized deployment knowledge will virtually disappear. The main directions will be towards standardization, towards systems extending beyond the traditional boundaries of the business enterprise, towards object orientation, and, on the consumer side, towards the computer as a commodity, a household item. The computer will, in most families, eventually occupy a place somewhere between the car, the telephone, the TV set and the stereo equipment.
The personal computer started out around 1974 as both a revolt against the traditional world of computers to bring "computers to the people"5, and as a hobby for computer professionals. Various companies (Apple with the Apple II, Tandy with the TRS-80, Commodore with the PET and later the VIC-20, Atari with the 400 and 800) made and successfully sold computers, but it took VISICALC and later the IBM PC and Lotus 123 to really get the message through to the business community. From then on it has largely been a battle between the IBM-compatible camp and the Apple Macintosh, with the minicomputers losing out to ever more powerful PCs and workstations. The mainframe is gradually being relegated to server status, and the main headache of the CTO is how to tie all these little things together again (and to hope that at least some of the users back up their data every once in a while).
The main issue in the technical debate for the last three years has been standardization. Standards come in two flavors:
I see a dramatic awakening to standards in the information technology industry. Manufacturers see that there is a need for them to adhere to standards, as witnessed by the scramble towards UNIX by all the previously proprietary-oriented minicomputer manufacturers. There is also an understanding that complete standardization is not possible. IBM's SAA, which was intended to be a focusing standard for IBM developments, has been expanded due to pressure from the installed customer base up to a point where it now includes almost everything IBM does (and has ever done). A more realistic strategy is to lower the ambitions a bit and aim for interoperability, where standardization is achieved by having translation functions in the systems, or by having standards for data formats rather than executables. A well known example of interoperability is the Microsoft Word word processor and the Excel spreadsheet: documents from these applications can be copied between IBM-compatible PCs and Macintoshes without conversion. There is a need for interoperability on a lower level (binary compatibility) as networks proliferate, but this has to my knowledge not been achieved except within applications from one single vendor (an example may be the Smalltalk development environment, which can exchange images among technological platforms without any conversion). Another form of interoperability lies in the definition of interfaces between applications, as is demonstrated with Apple Hypercard front-ends to databases residing on mainframes. Hardware forms of interoperability are found in the SCSI interface for computer peripherals, now finally adopted on a large scale by IBM7, and the 1.44 MB 3.5'' diskette, now adopted by both Apple and NeXT.
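To make the data-format idea concrete, here is a minimal sketch (written in Python with invented field names, purely as an illustration, not a description of any actual product): two programs with different internal structures can exchange a document as long as both read and write an agreed-upon interchange format, and neither ever needs to run the other's code.

    import json

    # Program A's internal representation of a document (invented structure).
    internal_a = {"title": "Q3 budget", "cells": [["revenue", 1200], ["cost", 800]]}

    def export_document(doc):
        # Write the document in the agreed-upon interchange format.
        return json.dumps({"title": doc["title"], "rows": doc["cells"]})

    def import_document(data):
        # Program B uses a different internal structure, but it can read the
        # interchange format; neither program ever runs the other's code.
        parsed = json.loads(data)
        return {"heading": parsed["title"],
                "table": {name: value for name, value in parsed["rows"]}}

    print(import_document(export_document(internal_a)))

The standard lives in the data, not in the programs; that is what makes it achievable across vendors and machines.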
The most powerful standards may be the ones proposed and developed by institutions that do not have any direct commercial interest in them. Examples here may be the X windowing standard or the Kerberos authentication system, developed at MIT. The "official" standard committees often take so long getting a standard out that the technological evolution has surpassed it (as seen with ISDN), and the intervendor organizations often get lost in competitive bickering over details (as seen in the case of the OSF versus the SUN/AT&T consortium). On the other hand, independently defined standards may be incomplete in certain areas of little interest to the developers.
Interorganizational systems may be looked upon as a form of standardization. By agreeing on a standard way to conduct business, organizations connect electronically with each other, saving time and money by designing their systems for the normal, uncomplicated business transaction and leaving the exceptions to be handled by humans. Such systems may be used to gain a tremendous, and not always fairly won, competitive advantage, as seen in the air travel industry9. Interorganizational systems may be initiated by either party to the business transaction, by an organization in the middle, by mutual consent, or, as is increasingly common, by a market provider organized specifically to provide a business arena, an electronic marketplace. Standards for the exchange of documents are emerging, notably the EDIFACT standard, which now dominates the EDI market.
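The principle can be sketched in a few lines of modern, purely illustrative Python; the field names and limits are invented and have nothing to do with actual EDIFACT syntax. The routine order is processed automatically, while anything out of the ordinary is set aside for a human:

    def process_order(order, catalogue, credit_limit=10_000):
        """Handle the normal, uncomplicated transaction automatically;
        anything out of the ordinary becomes an exception for a human."""
        if order["item"] not in catalogue:
            return ("exception", "unknown item: route to sales staff")
        total = catalogue[order["item"]] * order["quantity"]
        if total > credit_limit:
            return ("exception", "order exceeds credit limit: route to finance")
        return ("accepted", f"ship {order['quantity']} x {order['item']}, invoice {total}")

    catalogue = {"widget": 25, "gadget": 90}
    print(process_order({"item": "widget", "quantity": 10}, catalogue))   # routine case
    print(process_order({"item": "sprocket", "quantity": 5}, catalogue))  # exception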
I believe that the move towards an electronic extension of the enterprise will find a number of companies in dire straits, not so much because of technical inability as because of lack of imagination. Marketing electronically requires a different way of thinking about the customer than before: the successful companies will be the ones that exploit the technology's ability to add new value for the customer: value added through specialized configuration, through tailored products and procedures, and through extending the range of products offered based on how the customer behaves. There is much to be learned here from the otherwise staid mail-order business. A mail-order company always keeps track of which advertisement in which magazine gives what kind of response; this knowledge of the distribution channels is the core of the business. As we see electronic services being extended to the home, companies mastering this form of marketing will find new markets, at a time when access to products becomes an increasingly important part of the purchasing decision.
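As a toy illustration of that bookkeeping (invented data, Python used only as convenient notation), the core of it is no more than a tally of responses per advertisement and channel:

    from collections import defaultdict

    # Tally responses per (magazine, advertisement) so the company knows
    # which distribution channel actually produces orders.
    responses = defaultdict(int)
    for magazine, advert in [("Byte", "spring catalogue"),
                             ("Byte", "spring catalogue"),
                             ("PC World", "clearance sale")]:
        responses[(magazine, advert)] += 1

    for (magazine, advert), count in sorted(responses.items()):
        print(f"{magazine} / {advert}: {count} responses")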
Object oriented programming is especially suited for the complex world of graphical user interfaces. As many a Macintosh programmer has experienced, programming in an event-driven, graphically based environment is non-trivial because of the number of environmental variables that constantly have to be monitored. Object oriented programming organizes the whole environment into a hierarchy, where each object responds to a message either by executing it itself, or by borrowing the methods to do so from objects higher up in the hierarchy (inheritance). In this way, each object stores only the methods (that is, code) that differ from those of the objects above it in the hierarchy; for everything else, the code from the higher objects is used (code reusability). This makes object oriented development more a question of finding a pertinent existing class of objects and modifying it than of actually writing new code. As most object oriented programming systems come with an extensive set of standard objects, actually having to write a new object from scratch is relatively rare.
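The mechanism can be illustrated with a small Python sketch (Python is used here only as convenient modern notation; the class names are invented). The subclass stores only the method that differs; every other message is answered by code inherited from the class above it:

    class Window:
        """A general screen object providing default behaviour for all windows."""

        def draw(self):
            print("drawing a plain rectangular window")

        def respond(self, message):
            # The object handles a message with its own method if it defines one;
            # otherwise the lookup walks up the class hierarchy and reuses the
            # method defined higher up (inheritance / code reusability).
            getattr(self, message)()

    class DialogBox(Window):
        """Stores only the method that differs from Window; the rest is inherited."""

        def draw(self):
            print("drawing a window with an OK button")

    Window().respond("draw")     # uses the default defined in Window
    DialogBox().respond("draw")  # uses the overriding method in DialogBox

Writing a new kind of window thus amounts to overriding one method rather than rewriting the whole object.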
Object orientation is beginning to show up in user interfaces as well. Bill Gates, co-founder of Microsoft, has long been a proponent of what he calls "document-driven" (as opposed to "application-driven") computer usage. Under this paradigm, the user groups his files according to their content, not according to the applications that created them. A document may consist of text, drawings, calculations, pictures or data from a database. The idea is that the user should concentrate only on the content of the document, and that the document itself will know which application to invoke according to what the user wants to change. Moreover, the user can create groups of files, organized by content, which are dynamically linked to each other. Elements of this idea can be seen in the Macintosh file system, where the file knows which application created it, and in the Lotus Magellan file management program for DOS-based machines, where files are grouped across directories according to content.
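A small sketch of the document-driven idea (again in Python, with hypothetical names; this is not how any of the products mentioned actually work): the document carries its own content type, and a registry decides which application is invoked when the user opens it.

    # A registry mapping content types to the application that handles them
    # (names are invented for illustration).
    OPENERS = {
        "text":        "word processor",
        "drawing":     "paint program",
        "calculation": "spreadsheet",
    }

    class Document:
        def __init__(self, name, content_type):
            self.name = name
            self.content_type = content_type

        def open(self):
            # The document, not the user, decides which application to invoke.
            application = OPENERS.get(self.content_type, "generic viewer")
            print(f"{self.name}: launching the {application}")

    for doc in (Document("budget", "calculation"), Document("memo", "text")):
        doc.open()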
Today's computers have hardware capable of running incredibly complex software--software that has not been written yet, a situation referred to as the "software gap". The complexity of the new software lies not so much in more complex algorithms or more advanced functions, at least not for the business community. Instead, the powerful new hardware will be used to let the computer take over more of the complexity of the interaction between man and computer: graphical user interfaces, alternative input devices, transparent communication. I think object orientation, both in development and in user interfaces, gives an answer to the question of how to handle the incredible complexity arising from this shift in cognitive responsibilities.
It is tempting to view the IBM personal computer, introduced in 1981, as the Model T of the information age. It certainly had its idiosyncrasies (a crippled operating system, a multitude of screen standards of varying quality, and a bulky and heavy case which had the switches controlling the system configuration inside, secured with 5 screws). Just as the Model T did, it made the regular user technically literate: as the need for upgrades arose, people bought expansion cards and installed them themselves. Few original IBM PC users have not been inside their machines.
As more and more people get experience in the use of computers, we may expect the computer to turn into a standardized household item, almost a commodity, much like the automobile of today. The computer will become part of everyday life in the household to a much larger extent than it is today; we will see systems for paying bills (such as CheckFree) and communication services (like Compuserve or Prodigy) becoming ubiquitous. In France, the Minitel system has been relatively successful, mainly because a) getting the terminal was very cheap (initially it was freely distributed), and b) the telephone directory of France was offered electronically, which instantly gave the user an application of real value. Other attempts to establish a public communication system have not been as successful, although Compuserve now has 500,000 members and recently (Jan. 1, 1991) added a complete telephone directory for the US to its services. Certainly IBM has seen the market possibilities with its PS/1 home computer--still, there is a need for a product that spans more than the traditional uses of a personal computer: a multi-media machine capable of something more than word processing and remote bank account management.
There is an issue of accessibility, or "readiness-to-hand"13; for the computer to achieve the place the car now has in society (as pictured in Apple's Knowledge Navigator video), certain things must happen. Firstly, the computer must be portable. There is a powerful move towards portable computers gradually replacing the deskbound ones; even the newspaper trend journalists have understood that portable computers are IN14. Alan Kay has envisioned a computer called the Dynabook: something so versatile that the user would write the grocery shopping list on it, and be able to carry it together with two bags of groceries15. Secondly, new user interfaces such as voice or handwriting recognition must become practical. The popularity of cellular phones may, in my opinion, in part be credited to their simple user interface. Thirdly, access to information and communication services must be seamless and cheap enough to get the "Model T" effect rolling: there is a need for the consumer electronics industry to do the same thing that was done in the VCR market; initially take low profits to establish a standard and a distribution system, then earn money through large volume and spin-off business. (Another issue is that most consumer electronics items are far too complex for the average user; the design of programmable VCRs can leave even experienced computer programmers helpless16.)
Accessibility is also a question of previous knowledge: through education of the users, the need for a specialist is reduced or disappears. This has taken place in most commercialized technologies; around the turn of the century, having a car also meant having a chauffeur. I think the element of user education has been pushed a bit too far; to effectively use and understand a computer, the user must have an understanding of "why" as opposed to "how". Many users of computers, especially in production or transaction oriented environments, do not have the capability or the interest to develop this understanding. Moreover, the people training them have from the beginning had a "how" orientation; the result can be seen in the many handwritten lists of key sequences found taped to computer terminals. To overcome this, software developers, including those behind the large business systems, will have to not only adopt new user interfaces, but also create metaphorical systems in which the user can deduce what to do by referring to what he or she would have done in "real life".
My personal dream of the computer of the future is not one, but two systems: one is a portable computer much like the Knowledge Navigator depicted by Apple. The other is a desk-size LCD screen, tilted towards the user like an engineer's drawing table. On this desktop everything normally found in an office would be in digital form: the file cabinet, the telephone, the documents floating around. Applications would be available as the documents are accessed. Information repositories would be instantly and transparently on-line. The voice and video-based electronic mail system would have a filter shutting out sales pitches. In fact, the only physical things in this office would be the chair, the coffee mug--and the dust, since the owner would have better things to do than sit tied to a computer all day....