It would be boring indeed to detail the innumerable ways in which information has become important to economic activity and social cohesion. We have all been told so many times. If information really does perform so vital a function it must be very different from the disposable stuff which pours over us in an unending mish-mash of news, views and abuse. Facts, speculations and persuasion are smoothly blended – I almost wrote ‘blanded’ – as the trite, the trivial and the titillating are fleetingly presented as having as much claim on our attention as more important matters. We have no control over the flow, and no way of answering back. We are, of course, still allowed to turn it off, but like amputation that is a remedy of the last resort.
Perhaps the development of IT will let us select, question and compare. It is necessary to write ‘perhaps’ because IT merely enables: it does not compel and cannot guarantee. Strong commercial and political interests will continue to fish for our attention and strive to steer our responses. This they would be able to do all the more insidiously were we to permit our use of IT to persuade us that we were now in full control. What can come out of an IT system depends on what goes into it, and I see no rush to abandon control over the primary sources.
Marshall McLuhan could be lured by the prospect of an epigram into uttering a Delphic half-truth. Even so, his phrase, ‘the medium is the message’, does actually fit a large IT system, for its imposing façade confers an undeserved authority on its output. We charitably assume that so much blood, brains and treasure would not have been poured out unless the results commanded our instant and unquestioning acceptance. But it is not quite like that. Computing plus communications is an exceptionally powerful combination, and we need to consider why that should be so. For most people, computers are the mysterious part of IT. After more than a century's experience, rapid electrical communications over long distances are taken very much for granted.
With the widespread use of personal computers, everyone now knows that ‘hardware’ is the electronic and mechanical equipment, and ‘software’ the controlling programs which make it do what we require. Interchangeable control is not a particularly new idea. The drive mechanisms, electronics and loudspeakers of a record player are the hardware of a general-purpose musical instrument which can simulate a symphony orchestra or a soprano by playing the corresponding record. Records are easily changed, and there is no limit to the number of different ones that can be played. It is the same with computers and their programs.
1.1. Conduct research free from external misdirection in pursuit of short-range objectives set by industry or the state (8.3, below).
1.2. Develop an improved theoretical base for cooperative multiprocessor computing in parallel and network systems.
1.3. Inform the professions and the media fully and frequently about research in progress and its implications.
2. The IT industry
2.1. Continue free to develop and produce hardware, software and services to meet commercial objectives of own choosing, subject only to the trading controls generally applied to protect the rights of customers.
2.2. Participate in, and help to fund, the development of improved methods of technology assessment (4.4 and 8.5, below).
3. Users of IT systems
3.1. Continue free to develop and operate IT systems at will, scrupulously observing legislation to protect privacy, and subject only to the controls generally applied to maintain the rights of customers and employees.
3.2. Top management and governing boards: learn enough about IT to exercise firm strategic control over their organization's use of it, in order to redirect projects (too narrowly conceived by specialists) which could damage their customer or industrial relations (4.5, below).
3.3. Take potential consequences into account when designing systems, accepting some extra cost, delay and loss of efficiency where necessary to mitigate adverse effects (4.5, below).
3.4. Consult employees and customers liable to be affected by new proposals before these have been finalized, and while they can be modified to meet valid objections.
The effect of IT on employment requires a chapter of its own (see Chapter 5). Here, we consider some other economic consequences which bear on the formulation of policies by governments and large corporations whose actions impinge on us at many points.
The dismal science
We are bombarded with interpretations of events by amateur and professional economists who tend to agree in disliking current policies, although their reasons differ as widely as the remedies they propose. Economists have not yet formulated a common body of theory, and instead of a science we have a hubbub of competing schools. Two factors contribute to this sorry state of affairs: the frailty of economic data, and the impossibility of testing hypotheses by controlled experiments on real societies.
Refined statistical techniques – their use being often a sign of theoretical weakness – can sometimes be used to reduce the consequences of input error. IT may be able to help by gathering data where they arise, as when a cash register records item-by-item purchases as they are made. This ‘data capture at source’ will certainly increase the volume of raw material, but its quality will improve only if the data are the right ones, and representative of their kind, for they can never be complete.
Absence of theory is more troublesome, for not even a tonne of computation can make up for the lack of a gram of insight.
Politics is about power, information is an instrument of power and IT has given us the most effective information systems we have ever had. But they are not equally available to all, and could shift the locus of political power yet further away from ordinary citizens. ‘Democracy’ is a word with many different meanings: until this century it was often a term of abuse, meaning ‘mob rule’. Today, it is widely appropriated as a mark of approval – or of self-justification. I shall be using it in the restricted sense of ‘representative democracy’, and more particularly to denote the forms and styles of government current in Britain, in the USA, and in some other Western countries.
The will of the people
Representative democracy requires the consent of the governed, as expressed in the occasional election of representatives to local and national assemblies. Great emphasis is laid on this right to vote and on successive enlargements of the franchise to include women, those young enough for military service and the poor. Politicians congratulate themselves on giving so many of us the opportunity to participate in our own governing. Certainly we are able to make our views known to them by refusing to re-elect; but that method of protest has the defect of being sporadic, infrequent and less rewarding than we might hope.
A survey of the implications of information technology runs two risks. First, it may degenerate into a long and tedious list of the ways in which the use of IT bears upon our lives. Second, in seeking to fill the gaps left by enthusiastic advocates it may suggest that the consequences are wholly bad. In this book, beneficial effects are largely taken for granted, for those who design and introduce IT systems are rarely reticent about merits; attention is less often drawn to blemishes. But we must not forget that adverse consequences could follow from failing to use IT. The intricate pattern of economic and governmental activities that underpins daily life in industrialized countries now depends critically on rapid and effective exchanges of data: without IT that pattern could unravel into chaos. We have already reached the stage where even a temporary failure of a major IT system can have undesirable social and economic repercussions.
When faced with changes we neither understand nor like, we are tempted to choose a scapegoat to bear the blame, and IT has been singled out as the cause of unwelcome social and economic developments that would have happened anyway. Some of these have certainly been accelerated or intensified, but it would be dangerously blinkered and wholly naive to see IT as the principal agent of change. Life is rarely as simple as we suppose.
More words have been written about the effects of computers on employment than on almost any other topic related to IT. The subject is one that attracts statistics as management and labour each strive to heap up evidence to overwhelm the other; but other people's statistics breed suspicion, not trust. This chapter will not bury the subject under a mountain of obsolescent and disputable figures; it will attempt instead to present the principal factors involved.
No manager sets out with the prime aim of creating unemployment. The main reason for using IT is to reduce the unit cost of manufacturing some product or providing some service. When it is a matter of slimming a public bureaucracy, almost everyone speaks out fearlessly in favour of drastic economies. In other fields, senior managers find it prudent to present their cost-cutting proposals as ways of increasing productivity. That beguiling word makes it seem reactionary or irrational to oppose so eminently desirable an objective. Who can argue in favour of lower productivity?
Productivity is no more than a ratio of output to input, and measures how effectively some resource is used to create a product or a service. We can speak of the productivity of a raw material, or of a piece of machinery, but most often the word is used without a qualifying adjective and then refers to the use of labour.
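In symbols, and with purely illustrative figures:

```latex
% Labour productivity as an output-to-input ratio (hypothetical numbers):
\text{labour productivity} \;=\; \frac{\text{output}}{\text{labour input}}
  \;=\; \frac{1200\ \text{units}}{300\ \text{hours}}
  \;=\; 4\ \text{units per hour}
```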
The high fixed costs of producing a new design of silicon chip and of programming a new application imply that one of the more profitable outlets for microcomputers is the domestic mass market. Few of us are aware that every day we use 30 or more small electric motors in our homes and our cars: in the same casual way we are coming to rely on an increasing number of small computers. These we throw away and replace as required, for they have a single hard-wired program, and for that reason are often known as ‘microprocessors’ rather than as computers.
Household appliances such as cookers, washing machines, sewing and knitting machines are equipped with microprocessor control to sharpen their competitive edge. This does not make them ‘intelligent’, however hard advertisers may try to persuade us, but it does make them more effective and helpful – more competent, smarter. Household systems, too, are improved by microprocessor control. Central heating can be much more flexibly controlled than by a simple time-switch-cum-thermostat. With sensors in every room and remote control over individual radiators, independent heating cycles can be programmed for each part of a house to match its pattern of use, and to conserve energy overall. The room sensors can be more subtle than mere thermometers, measuring the balance between radiated and convected heat, and the movement and humidity of air.
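A rough sketch of the kind of per-room control loop described above; the zone names, schedules, set-points and hysteresis are invented, and a real controller would read its measured temperatures from room sensors and drive the radiator valves with the returned decision:

```python
from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    schedule: dict                  # hour of day -> target temperature (deg C)

def target_for(zone: Zone, hour: int) -> float:
    """Use the most recent scheduled set-point at or before this hour."""
    hours = sorted(h for h in zone.schedule if h <= hour)
    return zone.schedule[hours[-1]] if hours else 10.0   # frost-protection default

def valve_open(zone: Zone, hour: int, measured_temp: float, hysteresis: float = 0.5) -> bool:
    """Decide whether this zone's radiator valve should be open."""
    return measured_temp < target_for(zone, hour) - hysteresis

zones = [
    Zone("living room", {6: 20.0, 9: 16.0, 17: 21.0, 23: 12.0}),
    Zone("bedroom",     {6: 18.0, 8: 12.0, 21: 18.0, 23: 15.0}),
]
for z in zones:
    print(z.name, "valve open:", valve_open(z, hour=18, measured_temp=17.2))
```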
Computer simulation has become a valuable - even indispensable - tool in the search for viable models of the self-organizing and self-replicating systems of the biological world as well as the inanimate systems of conventional physics. In this paper we shall present selected results from a large number of computer experiments on model neural networks of a very simple type. In the spirit of McCulloch & Pitts (1943) and Caianiello (1961), the model involves binary threshold elements (which may crudely represent neurons); these elements operate synchronously in discrete time. The synaptic interactions between neurons are represented by a non-symmetric coupling matrix which determines the strength of the stimulus which an arbitrary neuronal element, in the ‘on’ configuration, can exert on a second neuron to which it sends an input connection line. Within this model, the classes of networks singled out for study are defined by one or another prescription for random connection of the nodal units, implying that the entries in the coupling matrix are chosen randomly subject to certain overall constraints governing the number of inputs per ‘neuron’, the fraction of inhibitory ‘neurons’ and the magnitudes of the non-zero couplings.
We are primarily concerned with the statistics of cycling activity in such model networks, as gleaned from computer runs which follow the autonomous dynamical evolution of sample nets. An aspect of considerable interest is the stability of cyclic modes under disturbance of a single neuron in a single state of the cycle.
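A minimal sketch of such a computer experiment, assuming illustrative values for the network size, the number of inputs per ‘neuron’, the inhibitory fraction and the thresholds (none of these are the parameters used in the study):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net(n=64, k=8, frac_inhib=0.3, j=1.0):
    """Non-symmetric random coupling matrix: each element receives k inputs,
    and a fraction frac_inhib of the source 'neurons' are inhibitory."""
    c = np.zeros((n, n))
    sign = np.where(rng.random(n) < frac_inhib, -1.0, 1.0)  # sign of each source neuron
    for i in range(n):
        inputs = rng.choice(n, size=k, replace=False)
        c[i, inputs] = j * sign[inputs]
    return c

def step(c, s, theta=0.0):
    """Synchronous discrete-time update of binary threshold elements."""
    return (c @ s > theta).astype(np.int8)

def transient_and_cycle(c, s0, max_steps=20000):
    """Iterate until a state repeats; return (transient length, cycle length, cycle states)."""
    seen, history, s = {}, [], s0.copy()
    for t in range(max_steps):
        key = s.tobytes()
        if key in seen:
            start = seen[key]
            return start, t - start, history[start:]
        seen[key] = t
        history.append(s.copy())
        s = step(c, s)
    return None, None, []

c = random_net()
s0 = (rng.random(64) < 0.5).astype(np.int8)
transient, period, cycle = transient_and_cycle(c, s0)
print(f"transient length {transient}, cycle length {period}")

# Stability of the cyclic mode: disturb a single neuron in a single state of the
# cycle and check whether the perturbed trajectory settles back onto the same cycle.
probe = cycle[0].copy()
probe[0] ^= 1
_, _, cycle2 = transient_and_cycle(c, probe)
same = {x.tobytes() for x in cycle} == {x.tobytes() for x in cycle2}
print("perturbed run returned to the same cycle:", same)
```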
During the last decade, a conspicuous theme of experimental and theoretical efforts toward understanding the behavior of complex systems has been the identification and analysis of chaotic phenomena in a wide range of physical contexts where the underlying dynamical laws are considered to be deterministic (Schuster, 1984). Such chaotic activity has been examined in great detail in hydrodynamics, chemical reactions, Josephson junctions, semiconductors, and lasers, to mention just a few examples. Chaotic solutions of deterministic evolution equations are characterized by (i) irregular motion of the state variables, and (ii) extreme sensitivity to initial conditions. The latter feature implies that the future time development of the system is effectively unpredictable. An essential prerequisite for deterministic chaos is non-linear response; and although there are famous examples of chaos in relatively simple systems (e.g. Lorenz, 1963; Feigenbaum, 1978), we expect this kind of behavior to arise most naturally in systems of high complexity. Since biological nerve nets are notoriously non-linear and are perhaps the most complex of all known physical systems, it would be most surprising if the phenomena associated with deterministic chaos were irrelevant to neurobiology. Indeed, there has been a growing interest in the detection and verification of deterministic chaos in biological preparations consisting of few or many neurons. At one extreme we may point to the pioneering work of Guevara et al. (1981) on irregular dynamics observed in periodically stimulated cardiac cells; and, at the other, to the recent analysis by Babloyantz et al. (1985) of EEG data from the brains of human subjects during the sleep cycle, aimed at establishing the existence of chaotic attractors for sleep stages two and four.
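The second of these characteristics is easily demonstrated with a far simpler deterministic system than a nerve net. The toy calculation below (not drawn from the papers cited) iterates the logistic map at a parameter value in its chaotic regime from two almost identical starting points and watches their separation grow:

```python
# Logistic map x -> r*x*(1 - x) at r = 4.0, a standard chaotic regime
# (cf. Feigenbaum, 1978).  Two trajectories started 1e-10 apart diverge
# until their separation is of the order of the attractor itself,
# illustrating extreme sensitivity to initial conditions.
r = 4.0
x, y = 0.2, 0.2 + 1e-10
for t in range(61):
    if t % 10 == 0:
        print(f"t={t:2d}  x={x:.6f}  y={y:.6f}  |x-y|={abs(x - y):.2e}")
    x = r * x * (1 - x)
    y = r * y * (1 - y)
```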
The brain and the computer: a misleading metaphor in place of brain theory
Contrary to the philosophy of the natural sciences, the brain has always been understood in terms of the most complex man-made technology of the day, for the simple reason of human vanity. Before and after the computer era, the brain was paraded in the clothing of hydraulic systems (in Descartes' time), and in the modern era as radio command centers, telephone switchboards, learning matrices or feedback control amplifiers. Presently, it is fashionable to borrow the language of holograms, catastrophes or even spin glasses. Comparing brains to computers, however, has been by far the most important and most grossly misleading metaphor of all. Its importance has been twofold. First, the early post-war era was the first and last time in history that such an analogy paved the way both to a model of the single neuron, the flip-flop binary element (cf. McCulloch & Pitts, 1943), and to a grand mathematical theory of the function of the entire brain (i.e., information processing and control by networks implementing Boolean algebra, cf. Shannon, 1948; Wiener, 1948). Second, the classical computer, the so-called von Neumann machine, provided neuroscience not only with a metaphor but also with a powerful working tool. This made computer simulation and modeling flourish in the brain sciences as well (cf. Pellionisz, 1979).
The basic misunderstanding inherent in the metaphor, nevertheless, left brain theory in eclipse, although the creator of the computer was the first to point out (von Neumann, 1958) that these living and non-living epitomes of complex organisms appear to operate on diametrically opposite structuro-functional principles.
The modeling of dendritic trees was carefully presented and discussed in earlier publications; only a few points will be summarized here. In Rall (1962) it was shown how the partial differential equation for a passive nerve cable can represent an entire dendritic tree, and how this can be generalized from cylindrical to tapered branches and trees; this paper also showed how to incorporate synaptic conductance input into the mathematical model, and presented several computed examples. In Rall (1964) it was shown how the same results can be obtained with compartmental modeling of dendritic trees; this paper also pointed out that such compartmental models are not restricted to the assumption of uniform membrane properties, or to the family of dendritic trees which transforms to an equivalent cylinder or an equivalent taper and, consequently, that such models can be used to represent any arbitrary amount of nonuniformity in branching pattern, in membrane properties, and in synaptic input that one chooses to specify. Recently, this compartmental approach has been applied to detailed dendritic anatomy represented as thousands of compartments (Bunow et al., 1985; Segev et al., 1985; Redman & Clements, personal communication).
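For reference, the passive cable equation referred to above can be written in its standard form (generic notation, with the space constant and membrane time constant as parameters, rather than the notation of the original paper):

```latex
% Passive cable equation for the membrane potential V(x,t):
% \lambda is the space constant, \tau_m the membrane time constant.
\lambda^{2}\,\frac{\partial^{2} V}{\partial x^{2}}
  \;=\; \tau_{m}\,\frac{\partial V}{\partial t} \;+\; V
```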
Significant theoretical predictions and insights were obtained by means of computations with a simple ten-compartment model (Rall, 1964). One computation predicted different shapes for the voltage transients expected at the neuron soma when identical brief synaptic inputs are delivered to different dendritic locations; these predictions (and their elaboration, Rall, 1967) have been experimentally confirmed in many laboratories (see Jack et al., 1975; Redman, 1976; Rall, 1977).
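A toy compartmental calculation in the same spirit, using a passive chain of ten compartments with a brief synaptic conductance delivered either near to or far from the ‘soma’; all parameter values are invented for illustration and are not those of Rall's model:

```python
import numpy as np

def soma_transient(input_comp, n=10, dt=0.01, t_end=25.0,
                   c_m=1.0, g_leak=0.05, g_axial=0.5,
                   g_peak=0.5, t_on=1.0, t_off=1.5, e_syn=60.0):
    """Passive compartmental chain (compartment 0 = 'soma').  A brief synaptic
    conductance with reversal potential e_syn is applied to one compartment;
    the somatic voltage transient is returned.  Forward-Euler integration."""
    steps = int(t_end / dt)
    v = np.zeros(n)                 # membrane potential relative to rest
    soma = np.zeros(steps)
    for k in range(steps):
        t = k * dt
        g_syn = g_peak if t_on <= t < t_off else 0.0
        i_axial = np.zeros(n)
        i_axial[:-1] += g_axial * (v[1:] - v[:-1])   # current from the next compartment
        i_axial[1:] += g_axial * (v[:-1] - v[1:])    # current from the previous one
        i_syn = np.zeros(n)
        i_syn[input_comp] = g_syn * (e_syn - v[input_comp])
        v += dt * (-g_leak * v + i_axial + i_syn) / c_m
        soma[k] = v[0]
    return soma

for comp in (1, 8):                 # proximal versus distal synaptic input
    v_soma = soma_transient(comp)
    peak_time = v_soma.argmax() * 0.01   # default dt
    print(f"input at compartment {comp}: soma peak {v_soma.max():.3f} at t = {peak_time:.2f}")
```

The distal input produces the smaller, later and broader somatic transient, which is the qualitative difference in shape that the ten-compartment computations predicted.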
I am the set of neural firings taking place in your brain as you read the set of letters in this sentence and think of me.
(D. Hofstadter, Metamagical Themas)
Neurobiological systems embody solutions to many difficult problems such as associative memory, learning, pattern recognition, motor coordination, vision and language. It appears they do this via massively parallel processing within and between specialized structures. The mammalian brain is a marvel of coordinated specialization. There are separate areas for each sense modality, with massive intercommunication between areas. There are topographic maps, many specialized neuron types, quasi-regular small-scale structures (columns and layers) which vary from area to area to accommodate local needs, and plasticity in the connections between neurons. Feedback occurs on many levels. This complexity is apparently necessary for the kind of multi-mode processing that brains perform, but it is not clear how much of this structure is necessary to perform isolated tasks such as vision or speech recognition; nor do we know whether nature's solutions are optimal. (See chapter 8 of Oster & Wilson (1978), for example, for an interesting discussion of optimization in biology.)
Regardless of whether the brain represents the optimal structure for cognitive processes, it is the only successful one we know of. By analyzing it and modeling it, we may learn the principles on which it operates, and presumably be able to apply these principles to computer technology.