The Bleeding Edge. Bob Hughes
…hierarchies and the profit-driven intellectual property (IP) regime that underpin giants like Apple. Richard Stallman, founder of the ‘Free Software’ movement, sees any attempt to take ownership of the process as an affront to humanity. Of intellectual property law, Stallman has said:
I consider that immoral… and I’m working to put an end to that way of life, because it’s a way of life nobody should be part of.5
Free-market hawks may sneer at such idealism, but their world would simply not exist without people like Stallman. Even if it did, it would not work very well without Stallman’s brainchild, the computer operating system known as GNU/Linux, and the global network of unpaid collaborators who have developed and continue to develop it. Google itself is built on GNU/Linux, as are Facebook and other social-media sites, and even the computers of the New York Stock Exchange: GNU/Linux is faster and more robust than the commercial alternatives.
Stallman launched GNU in 1983 as a free alternative to the older, established Unix operating system,6 with the difference that all of the code was freely available to anyone who wanted it, and could be changed and improved by anyone capable of doing so (hence ‘free and open source’, or FOSS). Stallman’s only stipulation was that nobody could own the code, and any modifications must be shared. GNU became ‘GNU/Linux’ after 1991, when a young fan of Stallman’s work, Linus Torvalds, started to circulate the code for the ‘kernel’ that allows GNU to run on different kinds of computers.7 This made it possible to use GNU/Linux (now generally known simply as Linux) on just about every kind of computing device that exists, including automobile engines, avionics, industrial equipment, power stations, traffic systems and household appliances.
The second paradox is that, while the new technologies are in principle supremely parsimonious, their environmental impact has turned out to be the exact opposite. Each wave of innovation needs fewer material inputs than its predecessor to do the same amount of work – yet in practice it consumes more resources. The Victorian economist William Stanley Jevons (see Chapter 5) was the first to draw attention to this paradox, which now bears his name – and it becomes even more striking when considering all the industries and activities that depend on or are mediated by electronics and computers. As economic activity has been computerized, it has become more centralized and its overall environmental impact has increased – as have capital’s control over labor and over people, the wealth gap between rich and poor, and the physical distances between them.
Is this mounting impact an inevitable ‘price of progress’, or is it the result of progress falling into the hands of people and a system that simply cannot deal with it responsibly?
WHAT IS TECHNOLOGY ANYWAY?
It is important to challenge two conventional assumptions that are often made about technology: first, that we have capitalism to thank for it; and second, that it follows a predetermined course, that the future is waiting to be revealed by clever minds and that progress ‘unfolds’ from Stephenson’s Rocket to the automobile, DVDs and the iPhone.
The economist Brian Arthur, who has made a lifetime study of technological change, argues that human technology is a true evolutionary phenomenon in the sense that, like life, it exploits an ever-widening range of natural phenomena (hydraulic, mechanical, electrical and so on) with ever-increasing efficiency. He defines technology as:
a phenomenon captured and put to use. Or more usually, a set of phenomena captured and put to use… A technology is a programming of phenomena to our purposes.8
Technology develops through greater and greater understanding of the phenomena: what they can be made to do, and how they can be coaxed into working together. Arthur uses the analogy of mining: easily accessed phenomena are exploited first (friction, levers), then ‘deeper’, less accessible ones (like chemical and electrical phenomena). As understanding of the phenomena deepens, their essential features are identified for more precise exploitation: the process is refined so that more can be done with less.
As the ‘mining of nature’ proceeds, what once seemed unrelated ventures unexpectedly break through into each other’s domains, and link up (as when magnetism and electricity were discovered to be aspects of the same phenomenon in the early 19th century). No technology is primitive; all of it requires bodies of theory, skill and experience; and it tends inexorably to greater and greater economy of material means. He describes how phenomena – for example, friction being used to make fire – are exploited with increasing efficiency as they are worked with, played with and understood.
The parallels with biology are striking. Technology is just like a biological process – and there is a tendency at this point (which Arthur goes along with somewhat) to start thinking of technology as ‘a new thing under the sun’ with a life of its own, and rhapsodizing about ‘where it is taking us’.
If you only look at the technologies themselves, in isolation, the parallels are there, including the tendency to see computer code as space-age DNA, and to sit back and be awed as some brave new world unfolds. But what really distinguishes human technology from biological evolution, surely, is that it all happens under conscious, human control – which implies some important differences.
Technologies, unlike living organisms, can inherit acquired traits, and features of unrelated technologies can, as it were, ‘jump species’, as when turbine technology migrated from power stations into jet engines, and punched-card technology for storing information spread from the textile industry (the Jacquard loom) into the music industry (the pianola) and then to computing. The eclectic human agency responsible for this cross-fertilization is well demonstrated by the Victorian computer pioneer Charles Babbage, who was continually investigating the arcane processes developed in different industries – and made the connection with the Jacquard loom at an exhibition in 1842, as did his future collaborator, Ada Lovelace.9
This is even more the case where electronics and computers are concerned – a point that Brian Arthur makes: ‘Digitization allows functionalities to be combined even if they come from different domains, because once they enter the digital domain they become objects of the same type – data strings – that can therefore be acted upon in the same way’.10 Digitization is, moreover, just one of the possible techniques for doing this, as will be explained later. The underlying and really powerful principle is that phenomena from utterly different domains of experience may share a deeper, abstract reality that can now be worked with as if it were a physical thing in itself.
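A minimal sketch may make this point about digitization concrete. In the fragment below (Python, with invented byte sequences standing in for an ‘image’ and a ‘sound sample’, and a function name, digest_and_compress, made up purely for illustration), a piece of text, an image and a sound are all handled by exactly the same operations once they have been reduced to data strings: the same hash and the same compression routine apply to each, whatever domain the data came from.

import hashlib
import zlib

def digest_and_compress(data: bytes):
    # Treat any digitized artefact purely as a string of bytes:
    # the same hash and the same compression apply, whatever it 'really' is.
    checksum = hashlib.sha256(data).hexdigest()
    compressed_size = len(zlib.compress(data))
    return checksum, compressed_size

# Stand-in data: a sentence, a tiny 'image' and a tiny 'sound sample'.
samples = {
    'text': 'Once digitized, everything is a data string.'.encode('utf-8'),
    'image': bytes([0, 255, 0, 255] * 16),                   # invented pixel values
    'sound': bytes(128 + (i * 7) % 100 for i in range(64)),  # invented waveform values
}

for name, data in samples.items():
    checksum, size = digest_and_compress(data)
    print(name, checksum[:12], size)

Nothing in digest_and_compress knows or cares whether its input began life as words, pixels or a waveform; that indifference is what allows functionalities from quite different domains to be combined so freely.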
Most importantly of all, technological evolution need never have dead ends – and this is where we come slap-bang up against the contradiction that is today’s technological environment, in which promising technologies can be ditched, apparently never to return, within months of their first appearance.
TECHNOLOGY SHOULD HAVE NO DEAD ENDS
In principle – and in practice for most of the millennia that our technological species has existed – ideas that have ‘had their day’ are not dead and buried for ever. Human culture normally sees to that. Technological improvements can and should be permanent gains – inventions should stay invented. They may lurk in human culture for decades or even centuries, and be resurrected to become the bases of yet more discoveries, so that technology becomes richer, more complex and more efficient.
In the past, discoveries have tended overwhelmingly to become general property, rapidly, via exactly the same irrepressible social process whereby songs and jokes become general property. The genie does not always go back into the bottle and can turn up anywhere – precipitating further discoveries, always making more and yet more efficient use of natural phenomena, and revealing more about those phenomena, which yet more technologies can then use.
Biological evolution proceeds blindly, as it must, over vast epochs via small changes and sudden catastrophes. It contains prodigious numbers of dead ends: species that die out for ever, taking all their hard-won adaptations with them. Living species cannot borrow from each other: mammals could not adopt the excellent eyes that molluscs developed (in the octopus); we had to develop our own eyes from scratch; so did the insects. Human technologies can and do borrow freely