My Top Ten Scientists Dominic Dulley
SF author and computer scientist (and software engineer)
Dom Dulley cites the scientists and engineers born
in the 20th Century who have influenced him
I’ve always preferred the term software engineer to computer scientist, because I think the kind of work I do as a developer is closer to engineering than to science. Having said that, I’ve never been a scientist so I could be wrong – I often am. This is all a long-winded way of warning you that I’m including engineers in this list as well as scientists, as the boundaries between the two are sometimes blurred. They appear in no particular order.
Kip Thorne
In 1993 Kip Thorne proposed a mind-bending thought experiment. Imagine, he suggested, that his wife Carolee was in a spaceship equipped with a wormhole linking it to Thorne’s front room. The spaceship travels at close to the speed of light on a 12-hour round trip back to Earth. Throughout the trip Thorne can see and speak to his wife through the wormhole, or even hold her hand. Through the wormhole he watches her ship return and land on the lawn, but when he looks out of the window it isn’t there. Due to time dilation, she has landed ten years in the future, where she can meet a version of Thorne ten years older. If Thorne steps through the wormhole he will be ten years in the future; alternatively, his future self could step through and travel ten years into his past.
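Out of curiosity, the time dilation in the thought experiment is easy to work out from special relativity. A minimal sketch in Python, assuming a 12-hour round trip measured aboard the ship against ten years elapsed on Earth (the numbers are the ones from the thought experiment; the Lorentz-factor formula is standard physics, not anything specific to Thorne's paper):

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Proper time experienced aboard the ship (12-hour round trip)
# versus coordinate time elapsed on Earth (ten years).
proper_time_h = 12.0
earth_time_h = 10 * 365.25 * 24  # 87,660 hours

# Lorentz factor: gamma = (Earth time) / (proper time)
gamma = earth_time_h / proper_time_h

# Speed required: v = c * sqrt(1 - 1/gamma^2)
v = C * math.sqrt(1 - 1 / gamma**2)

print(f"gamma = {gamma:.0f}")        # prints: gamma = 7305
print(f"shortfall below c: {1 - v / C:.2e}")
```

The ship would need a Lorentz factor of about 7,305 – a speed within roughly one part in a hundred million of the speed of light.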
This, like much of Thorne’s work, instantly sparked dozens of story ideas in my mind. Many of the Nobel laureate’s theories, particularly those involving the use of wormholes for interstellar or time travel, sound as if they’re straight out of science fiction, but are firmly rooted in physics.
As well as advising his friend Carl Sagan on wormhole travel for his novel Contact, Thorne also acted as scientific consultant on the 2014 film Interstellar, insisting that the science in the movie shouldn’t violate the laws of physics. He worked out the equations that allowed the visual effects team to trace light rays as they travelled around a black hole or through a wormhole, and the resulting high resolution rendering of the black hole Gargantua led to the publication of scientific papers on the gravitational lensing effects of spinning black holes.
Elon Musk
In April 2019 I was fortunate enough to watch the launch of a Falcon Heavy rocket from the bleachers at Kennedy Space Center. I was born the year after the Apollo 11 landing and grew up believing in a future of routine spaceflight that never materialised. This promised future is now finally starting to happen, and that is due in no small part to Elon Musk.
Flamboyant, impulsive and volatile, Musk is not a perfect person. My wife isn’t alone in believing him to be a potential Bond villain with machinations to take over the world, but the stated goals of SpaceX, Tesla and SolarCity are aligned with reducing global warming through sustainable energy production, and with establishing a colony on Mars to mitigate the risk of human extinction by making us a multi-planetary species.
In 2002, frustrated by the cost of buying a Russian ICBM (intercontinental ballistic missile), Musk decided to develop reusable rockets at a time when most experts in the space industry did not believe reusable rockets were even possible. In the same way that Tesla disrupted the car industry and made electric vehicles a viable alternative to internal combustion engine cars, Musk’s vision and determination heralded a new age of commercial spaceflight that is progressing at a rate that would have been unthinkable even ten years ago.
Tim Berners-Lee
The first time I opened a web page was on a Mosaic browser in my university library in 1994. The web had been around for only a few years and wasn’t much to look at, and I certainly did not foresee what it would become – or that it would provide a career for me.
Tim Berners-Lee invented the World Wide Web in 1989 (in his proposal he called it ‘Mesh’) and wrote the first web browser in 1990. Had he chosen to retain the rights to his invention, it has been estimated that he could now be the world’s richest man by a factor of ten, but instead he chose to give the web to society for everybody to use freely.
In his most recent annual open letter on the future of his invention, he expresses concerns about the direction in which the web and the apps that are built on top of it are moving, including criminal behaviour and online harassment, commercial clickbait and the viral spread of misinformation, and the “outraged and polarised tone and quality of online discourse”.
It is not the technology itself that’s the problem, he suggests, but the social dynamics of the users. He proposes a combination of protective legislation and commercial responsibility to combat the perceived decline of the system. Unfortunately, I suspect his efforts are doomed to failure.
Margaret Hamilton
Three minutes before the Apollo 11 lunar lander reached the surface of the Moon, a series of alarms triggered inside the lander. The computer was being overloaded by spurious data from the rendezvous radar and reported to Armstrong and Aldrin that it would be executing only the tasks necessary for the landing. If the software had not been programmed to function asynchronously, so that it could recognise and recover from such a situation, the landing would in all likelihood have been aborted. Not too shabby, considering the Apollo Guidance Computer (AGC) had a 1.024 MHz clock speed, 2K words of memory and 36K words of storage.
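The recovery relied on priority scheduling: when overloaded, the computer shed low-priority work and kept only the jobs that mattered. Here is a minimal sketch of that idea in Python – the task names, priorities and capacity are all invented for illustration, and this is of course nothing like the actual AGC Executive, which was written in assembly for very different hardware:

```python
import heapq

def run_schedule(jobs, capacity):
    """Run as many jobs as capacity allows, highest priority first.

    jobs: list of (priority, name) pairs; a lower number means
    more critical. capacity: how many jobs fit in this cycle.
    Anything that doesn't fit is shed, mimicking the AGC dropping
    low-priority work during the overload alarms.
    """
    heap = list(jobs)
    heapq.heapify(heap)
    executed, shed = [], []
    while heap:
        priority, name = heapq.heappop(heap)
        (executed if len(executed) < capacity else shed).append(name)
    return executed, shed

# Hypothetical task mix during the descent:
tasks = [(1, "guidance"), (1, "engine control"),
         (2, "display update"), (3, "rendezvous radar")]
done, dropped = run_schedule(tasks, capacity=3)
print(done)     # the critical tasks still run
print(dropped)  # low-priority work is shed
```

The point is the design choice rather than the code: because every job carried a priority and the scheduler could restart and re-rank at any moment, an overload degraded the system gracefully instead of crashing it.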
As a software engineer and space nerd, I am ashamed to say that I had never heard of Margaret Hamilton until I read an article about her only a few years ago. Director of the Software Engineering Division at MIT's Instrumentation Laboratory and lead programmer of the AGC team, Hamilton developed the flight software for the command and lunar modules, and later Skylab. She worked on every manned Apollo mission and many unmanned ones, developing and refining the code.
Early in the Apollo programme, Hamilton came up with the term ‘software engineering’ at a time when software development was considered neither an engineering discipline nor a science. I am glad to report that this is no longer the case, partly due to her efforts.
Stephen Hawking
Paralysed and able to communicate only by the twitching of his cheek muscle, Stephen Hawking wrote A Brief History of Time and all his other works, from books to lectures to scientific papers, at a rate of one word a minute. As a writer, I find his patience astonishing.
I don’t pretend to understand his work, but I do know that his research into black holes and the origins of the universe – not to mention his contributions to the fields of gravitation, cosmology, quantum theory, thermodynamics and information theory – were all done without being able to write anything down – by thought alone.
Hawking was a modern icon whose work and personal situation brought him fame and recognition, which he used to popularise science and make it more accessible through his books and public lectures. Although he never succeeded in coming up with his ‘theory of everything’, his prediction of the emission of Hawking radiation from black holes did, at least to some extent, bridge the two apparently incompatible theories of quantum mechanics and general relativity.
Beatrice 'Tilly' Shilling
In 1940, RAF pilots discovered a potentially lethal flaw in the Merlin engines that powered their Hurricane and Spitfire fighters. During combat the pilots were forced to roll inverted before a power dive to avoid a loss of power from a flooded carburettor. German fighters at the time were fuel-injected and did not suffer this problem, giving them an advantage in dogfights.
The problem was solved by engineer Beatrice Shilling, who surmised the cause of the power loss and devised a simple fix which was hugely popular with the pilots and became known as ‘Shilling’s Penny’ (or ‘Miss Shilling’s orifice’). It was not in fact the ‘Penny’ itself that earned her fame, but her clear grasp of the problem and her delivery of an immediate stopgap until a more effective solution could be developed. Without her insight many more pilots would have died, at a time when the RAF was already outnumbered by around four to one.
Bill Gates
I built my first PC (personal computer) in 1995, a 486DX2 running Microsoft Windows 3.1. I’m writing this on a Windows 10 laptop, on which I’ve also written several novels and many short stories, and my day job is as a Microsoft .NET developer.
Bill Gates co-founded Microsoft with Paul Allen in Albuquerque in 1975. Twelve years later, at the age of 31, he became the world’s youngest billionaire, and by 1995 he was the richest man in the world. In 2001 he started giving his money away.
The Bill & Melinda Gates Foundation is reportedly the largest private foundation in the world, holding over US$50 billion in assets which it uses to combat extreme poverty and disease, and for educational programmes.
Rosalind Franklin
James Watson and Francis Crick shared the 1962 Nobel Prize in Physiology or Medicine with Maurice Wilkins for modelling the structure of the DNA molecule, but one person notably absent from the stage that night was Rosalind Franklin.
While Watson and Crick were theorising the structure of DNA, Franklin was capturing X-ray diffraction images of DNA fibres at nearby King’s College, in the same lab as Maurice Wilkins. Working in a basement with antiquated X-ray equipment, she discovered there were two forms of DNA shown in the images: A and B. Noting that the B form appeared to show a definite helical structure, she labelled the image Photo 51 and set it aside to study later.
Relations between Franklin and Wilkins were strained, and in 1953 she decided to leave King’s College. During the move her research notes came into Wilkins’ possession, and he removed Photo 51 and showed the double helix to his friends Watson and Crick. It was this image that filled in a vital gap in their research and allowed them to complete an accurate model of the structure of DNA. They quickly published their findings in the journal Nature, where shorter articles by Wilkins and Franklin implied that their work merely confirmed Watson and Crick’s discovery rather than being integral to it.
Franklin died in 1958, four years before Watson, Crick and Wilkins shared the Nobel, and her work was barely mentioned. In his 1968 book The Double Helix, Watson describes her as a “belligerent, emotional woman unable to interpret her own data.” It is only recently that Franklin’s contribution has been acknowledged.
Tu Youyou
During China’s Cultural Revolution in the 1960s and ’70s, scientists and intellectuals were often executed or imprisoned in ‘re-education camps’. Tu Youyou’s husband, an engineer, was detained, but Tu was recruited to a secret military project to find an antimalarial for North Vietnamese Army soldiers fighting US forces in the Vietnam War. Inspired by a 1,600-year-old Chinese text, she discovered a compound (artemisinin) in sweet wormwood which proved an effective treatment for malaria, and volunteered to be its first human recipient.
Tu’s discovery was kept secret by the Chinese government for decades until relations between East and West thawed. Western pharmaceutical companies began mass production of the drug, which is estimated to have saved millions of lives globally, particularly in the developing world.
In 2015, Tu – who had continued to work in obscurity until 2011 – was awarded the Nobel Prize in Physiology or Medicine for a discovery that “led to the survival and improved health of millions of people.”
Alan Turing
In 2009 British Prime Minister Gordon Brown issued a personal apology to Alan Turing, writing, “on behalf of the British government, and all those who live freely thanks to Alan's work, I am very proud to say: we're sorry. You deserved so much better.”
During World War 2, Turing famously built a series of machines that enabled the Allies to crack German communications, potentially shortening the war by two to four years and saving millions of lives.
After the war he designed the ACE (Automatic Computing Engine) at the same time as John von Neumann was working on EDVAC. These were among the first designs for stored-program digital computers, a concept that underpins computing to this day.
In 1950 he proposed the Turing Test which, despite the rapid advances in artificial intelligence (AI) in recent years, no AI has yet passed. In William Gibson’s Neuromancer, the ‘Turing police’ have jurisdiction over AIs.
In 1952 Turing was convicted of gross indecency after admitting to being in a homosexual relationship, and chose to receive hormonal treatment rather than go to prison. His security clearance was revoked and he was unable to continue his cryptographic consultancy for GCHQ. The following year he was found dead of cyanide poisoning.
In 2013, Alan Turing was formally pardoned by Queen Elizabeth II, and in July 2019 the Bank of England announced that his portrait would appear on the next £50 note.
Dominic Dulley
Dominic Dulley is a software engineer who works as a contractor on enterprise web applications. His first two SF novels, Shattermoon (2018, Jo Fletcher Books) and Morhelion (2019, Jo Fletcher Books), are fast-paced space opera adventures involving a family of con-artists. He can be found on Twitter as @DominicDulley.
[Posted: 19.9.15]