Science and serendipity: famous accidental discoveries
Most scientific breakthroughs take years of research – but often, serendipity provides the final push, as these historic discoveries show.
This piece accompanies Marcus Chown's feature on the discovery of cosmic background radiation, from the Spring 2015 edition of New Humanist.
Perhaps the most famous accidental discovery of all is penicillin, a group of antibiotics used to combat bacterial infections. In 1928, Scottish bacteriologist Alexander Fleming took a break from his lab work investigating staphylococci and went on holiday. When he returned, he found that one Petri dish had been left open, and a blue-green mould had formed. The mould had killed off all the surrounding bacteria in the culture. It contained a powerful antibiotic, penicillin, that could kill harmful bacteria without having a toxic effect on the human body.
At the time, Fleming’s findings didn’t garner much scientific attention. In fact, it took more than a decade before the drug was available for use in humans. Retrospectively, Fleming’s chance discovery has been credited as the moment when modern medicine was born.
In 1967, astronomy graduate student Jocelyn Bell noticed a strange “bit of scruff” coming from her radio telescope. It was a regular signal coming from the same patch of sky, of a type that no known natural source would produce. Bell and her supervisor, Antony Hewish, ruled out sources of human interference – other researchers, television signals, satellites. None explained the signal, and the scientists wondered if they had detected a sign from aliens. This was ruled out when another signal was located in a different part of the sky: it seemed unlikely that two sets of aliens would simultaneously be trying to communicate with Earth.
In fact, it was the first discovery of a pulsar (pulsating radio star), a highly magnetised, rotating neutron star that emits a beam of electromagnetic radiation. Neutron stars had been predicted three decades earlier but never actually observed; later observations of pulsars in binary systems went on to provide the first indirect evidence for gravitational radiation.
French scientist Henri Becquerel was working on phosphorescent materials, which glow in the dark after exposure to light. The chance discovery came during an experiment involving crystals of a uranium salt. He believed that exposure to sunlight was what caused the crystals to burn their image onto a photographic plate.
One stormy day in 1896, he decided to leave the experiment and resume when the weather was better. A few days later, he took his crystals out of a darkened drawer. The image burned on the plate was “fogged” – the crystals had still emitted rays, despite the lack of sunlight. It was clear that there was a form of invisible radiation that could pass through paper, causing the plate to react as if exposed to light.
His research was continued by Pierre and Marie Curie, who named the phenomenon radioactivity. In the early years after the discovery, the dangers of radiation were not well understood. Today, its use is more closely monitored, and it has a range of uses in industry and medicine.
America in the 1830s was in the grip of “rubber fever”; factories had sprung up to meet the demand for goods made from this waterproof gum. But the craze ended abruptly – rubber froze hard in the winter and melted to glue in the summer.
Bankrupt, self-taught chemist Charles Goodyear spent years trying to make rubber more durable. In 1839 he was showcasing his latest experiment and accidentally dropped his rubber mixture on a hot stove. Instead of melting away, it charred into a leather-like substance with an elastic rim. It was still rubber but had transformed: it was vulcanised, or weatherproof. Goodyear insisted it wasn’t an accident, and that the hot-stove incident held meaning only for the man “whose mind was prepared to draw an inference”.
Sadly, Goodyear didn’t reap the benefits of his discovery and died $200,000 in debt. Vulcanised rubber is still in use today, notably in car tyres.
In 1895, German physicist Wilhelm Roentgen was working with a cathode ray tube. The tube was covered, but a fluorescent screen nearby would still glow when the tube was on and the room was dark: the rays were illuminating the screen. Roentgen tried blocking the rays, but most things he placed in front made no difference. However, when he placed his hand in front of the tube, he noticed he could see his bones in the image projected onto the screen. The tube was replaced with a photographic plate, and the first X-ray images were produced.
Ernest Rutherford was a physics professor at Manchester University, already well known for his studies of radiation. In 1911, under his supervision, German physicist Hans Geiger and physics undergraduate student Ernest Marsden observed how alpha particles scattered from a gold foil. Rutherford didn’t like to neglect any aspect of an experiment, no matter how unpromising, and told Marsden to check if any particles scattered backwards. He did so, writing later that he felt it was a test of his experimental skills, if nothing else.
But there was a highly unexpected result: some of the particles scattered backwards, rather than passing through the foil with little deviation from their existing path. Rutherford’s analysis was that the scattering was caused by a hard, dense core at the centre of the atom – the nucleus, where its positive charge and most of its mass are concentrated.
Chemistry graduate student Jamie Link was working on a silicon chip at the University of California, San Diego in 2003 when the chip shattered. But, as it turned out, it was not a disaster. Link and her supervisor discovered that tiny bits of the chip were still sending signals, operating as tiny sensors. They called the self-assembling particles “smart dust”. These microelectromechanical devices combine sensors with computational ability. Hailed as one of the top inventions of this century, smart dust is used to monitor the purity of water, detect harmful chemicals in the air, and locate and destroy tumours in the body.
The heating effect of a high-power microwave beam was discovered in 1945 by Percy Spencer, an American engineer working for the company Raytheon. He was working on a magnetron, a vacuum tube that generates high-power microwaves, when he noticed that a chocolate bar in his pocket had melted. Curious, he placed a bowl of popcorn kernels in front of the tube and they began to pop. Spencer then created a high-density electromagnetic field by feeding microwave power from a magnetron into a metal box from which it could not escape. When food was placed in the box, its temperature quickly rose. In October that year, Raytheon patented the technology, and it became available to the public in 1947.