It’s one of those clichés: life imitates art. Nowhere does the paradox crop up more often than in discussion of the relationship between science and fiction. To many it seems that Mary Shelley’s Frankenstein looms over much scientific progress today. The recent death of Neil Armstrong, the first ‘Man i’ the Moon’, reminds us that over 60 years before he made his ‘giant leap’ for Mankind, the fantasy writer Jules Verne, and then the pioneering cinematographer Georges Méliès, had put the idea into Mankind’s head, albeit in rather silly ways. So who got there first, the Americans — or the French?
Of course, the reality of the lunar mission was totally unlike the absurdities of fictional outings to our nearest neighbour in space, as deliciously pointed up in the animated film Wallace and Gromit: A Grand Day Out, where Wallace indulges his love of Wensleydale cheese by eating moon rocks on toast and the intrepid pair encounter an abandoned gas cooker with a love of winter sports. The way things are moving, it is not entirely fantastical to imagine those ideas being one day realised.
The late Ray Bradbury wrote a story in the 1950s in which two children lure their parents into a room-sized virtual African veldt, where they are eaten by the lions on the nursery walls. TV was then a new and very limited consumer technology, and there were no computers more powerful than an abacus. Last week the Korean company Samsung previewed a wall-sized TV at the CES technology show in Las Vegas, while a new virtual-reality interface called Oculus Rift enables the user to interact with a computer game as if from the inside. Do not be surprised, therefore, to find your teenage son all shot up on the bedroom floor, a jubilant, heavily armoured alien squatting triumphantly on the bed. Mummy! Life has a way of imitating art. New bendable thin screens will make display-capable wallpaper a reality within a decade. Often, it seems, the TV programme commissioners are ahead of the curve…
Countless speculative references within Sci-Fi to mind control, telepathy and so on are even now coming to pass: systems have been announced within the past year enabling disabled users to operate robotic machinery through controls electronically connected to the appropriate areas of their brains*. The fictional ‘human reduced to a brain in a jar’ trope comes a step closer. Watch out for Daleks! The computer touch-screen is being supplanted by screens that follow the movements of the eye, with thought control expected to be the next generation. And, while it has been known for years that a beam of light exerts a weak propulsive force, the discovery was announced in the past week of a part of the spectrum where the effect can be reversed and light used to attract particles, thus bringing the fictional Star Trek ‘tractor beam’ a step closer.
Other scientific miracles we have had to contend with this past year include the domestic 3-D printer, already available (for a price) in colour and capable of reproducing many consumer goods that will in future be manufactured on demand at home, from downloads, rather than delivered by courier from Amazon. Medical technology allowing biologists to culture human organs for replacement by ‘printing’ cells onto a matrix has been around for a few years, but this week saw the astonishing announcement that scientists in Cambridge have encoded the entire canon of 154 Shakespeare sonnets in strands of artificial DNA, potentially closing the loop between digital and biological memory.
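The principle behind that digital-to-biological trick can be sketched in miniature. The published scheme was more elaborate (among other things, it avoided long runs of identical bases, which are hard to sequence); the toy version below, with names of my own invention, simply maps each two-bit chunk of a byte to one of the four DNA bases.

```python
# Toy sketch of DNA storage: two bits of data per base.
# (Illustrative only; the real Cambridge/EBI encoding was more robust.)

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(text: str) -> str:
    """Turn text into a DNA-like string: each byte becomes four bases."""
    bases = []
    for byte in text.encode("utf-8"):
        for shift in (6, 4, 2, 0):          # four 2-bit chunks, high to low
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(dna: str) -> str:
    """Invert encode(): pack each run of four bases back into one byte."""
    data = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        data.append(byte)
    return data.decode("utf-8")

line = "Shall I compare thee to a summer's day?"
strand = encode(line)
assert decode(strand) == line    # the sonnet survives the round trip
```

At four bases per byte, all 154 sonnets would need well under half a million bases — a length nature manages routinely.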
This astounding development opens the way to a world in which information can be programmed into any system, stored, acted upon and retrieved — including other life forms: buy your pet dog pre-loaded with obedience training. Memory will become a utility, independent of brains and computers alike. Patients with brain damage or memory loss could have artificial replacement memories and skills inserted; as could non-human replicants, robots and other machines, as foreseen in Ridley Scott’s Blade Runner and elsewhere. People could record their memories and store them for access in old age, the way we store photographs now — but accurately, drawn directly from current experience. Would children need to spend 14 years in school, or fighter pilots be expensively trained, if the learning could be downloaded directly into their brains at a molecular level? DNA memory not only allows us to use computers; it allows computers to use us.
And in Scotland, the physicist Dr Di Falco is perfecting the ‘invisibility cloak’ of fictional stories, famously the Harry Potter saga — although the single-layer ‘metamaterial’ cannot yet be made to work flexibly. Invisibility has long been a goal of imaginative humanity, exerting a powerful fascination: hence The Invisible Man, a story by the frequently prophetic HG Wells that has spawned numerous films. Invisibility, the power to appear and disappear at will, is the stuff of myth and fairytale. A true superpower, it has hitherto been only partly achieved by imitating the chameleon and the cuttlefish, using camouflage: colouring objects to blend in with their background. An ‘invisible car’ was created for a recent Bond film, using video cameras to project the background scene onto the front surface of the object, fooling the onlooker. Special ‘stealth’ paints and surfaces reduce the reflectivity of military machines and installations. In all those cases, movement gives the game away. But the Holy Grail of invisibility is to use light itself to confound the viewer, and it seems this is now being achieved experimentally in the USA. Would we know? No sooner have we come to understand the physical principles of the universe, apparently, than they are used to dupe us.
Invisibility has its attractions, but of all the new inventions it is possibly the least responsible. Invisibility will enable the terrorist to get closer to his target; the tyrannical regime to carry out unlimited surveillance; the enemy to hide in plain sight. It is potentially a game-changing development, enabling epochal events like the assassinations of Julius Caesar, Archduke Ferdinand or JFK to become commonplace, with who knows what consequences for the future. It could have an upside: invisible cities, for instance, could be built on sites of outstanding natural beauty, and the traveller approaching the hidden ugliness of the city limits would see only mountains, lakes and trees, as in Terry Gilliam’s Brazil. But I am hoping that, when we do invent time travel, someone invisible will be sent back to shut this experiment down, for all our sakes.
This stuff is arriving thick and fast. It is as if we are living in a dream, where anything we can dream of is becoming possible. We are discovering, perhaps too soon, how to change not just a few things around us, but the universe itself; unpicking the woolly fabric of reality to knit new garments. Why? Why is this happening to us now? We are transiting so rapidly between technological paradigms that it is as if we have travelled to distant planets and met advanced scientific cultures without ever leaving Earth. We are becoming one of those higher civilisations ourselves, of the kind that Sci-Fi writers have been imagining for years, where everyone is immortal and Dr Who’s sonic screwdriver unlocks all doors in space and time. The new technology confounds us: we who were not born to live in this Brave New World can only gape like savages in awe of the magical and miraculous powers of the High Priests of science and their teenage acolytes.
For many of us, it looks like the endgame, as if we are rushing to complete something before everything dies. Our social, economic and political systems are fast revealing themselves to be threadbare, dysfunctional anachronisms. People waiting their turn hopelessly at the dry pumps of prosperity and progress are reverting to the old barbarities, the old superstitions. The medieval mindset has never really gone away. War looms, as we know, because we can imagine it, have imagined it in our art. And art, as we know, prefigures death on an unimaginable scale. Pilotless aircraft, driverless cars, powered exoskeletons… are we entering the time of The Terminator? Of an endless, unwinnable conflict whose cause has been forgotten for centuries, prosecuted by self-replicating machines equipped with artificial intelligence but no sense of irony?
A Canadian research project, announced within the past month, sets out to prove or disprove the idea that the universal ‘reality’ which we perceive is in fact a computer simulation. It’s an idea that has been imagined before in fiction, from Plato to The Matrix; even if it is not yet true, it soon might be. Was it not Shakespeare who wrote, ‘We are such stuff as dreams are made on’?
Yep, let’s blame the Bard.
*Much later… the story appeared in the news that Swedish researchers are working on technology to allow dogs to communicate directly with humans. They have called the technology ‘No More Woof’.
It is January 2017. Doctors have learned to communicate with people who are essentially non-functioning, trapped in completely immobile bodies by so-called ‘locked-in’ syndrome and apparently unable to signal any response to external stimuli. Four test subjects successfully answered 200 yes-or-no questions put to them by the researchers, who monitored chemical changes in their brains as they responded.
All four reported feeling happy with their condition.