Here's a fun piece from my new book, Welcome To Weirdsville, on scientific experiments that actually, really, might destroy us all.
Have Fun!
ON DESTROYING THE EARTH
We like scientists. We really do. After all, without them – and the scientific method – we'd still think lightning was Zeus hurling thunderbolts, the sun was an enormous campfire, and the earth itself was balancing on huge turtles. Without science we'd be ignorant troglodytes – too stupid to even know that we'd evolved from simpler life forms.
Yep, we love science – but that doesn’t mean it
doesn’t scare us. After all, when you’re dedicated to cracking the
secrets of the universe it’s kind of expected that sometimes, not often,
you might crack open something a tiny bit … shall we say … dangerous?
The poster child for the fear that science and engineering can give us – beyond Shelley's fictitious Frankenstein, of course – was born on July 16, 1945, in New Mexico. Not one to miss something so obvious, its daddy, the one and only J. Robert Oppenheimer ('Oppy' to his pals), thought of "I am become Death, the destroyer of worlds" from the Bhagavad Gita – but Kenneth Bainbridge, the test director, said it even better: "Now we are all sons of bitches."
Sure, the Trinity atomic bomb test – the event that began the so-called atomic age, leading to our now-constant terror that one day the missiles may start to fly and the bombs begin to fall – was the first, but since then there have been all kinds of new, if not as flashy, scientific investigations that could be ten times more destructive. In other words, we could be one dropped beaker away from the destruction of the earth.
Naturally
this is an exaggeration, but it’s still fun – in a shudder-inducing
kind of way – to think about all these wildly hypothetical doomsdays.
Putting aside the already overly publicized fears over the Large Hadron
Collider creating a mini black hole that immediately falls to the core
of the earth – eventually consuming the entire globe – some researchers
have expressed concern that some day we may create, or unleash, a
subatomic nightmare. The hunt for the so-called God particle (also known as the Higgs boson), for instance, has made some folks nervous: one
wrong move, one missing plus or minus sign, and we could do something as
esoteric and disastrous as discovering that we exist in a metastable
vacuum – a discovery made when one of our particle accelerators creates a
cascade that basically would … um, no one is quite sure but it’s safe
to say it would be very, very strange and very, very destructive.
Confusing? Yep. But that’s the wild, weird world of particle physics.
It's sometimes scary. Very, very scary.
A newer threat to everyone on the planet comes with the development of nanotechnology. If you've been napping for the last decade or so, nanotech is basically machines the size of large molecules: machines that can build (pretty much) anything at an atomic level. The question – and the concern – is what might happen if a batch of these microscopic devices gets loose. The common description of this Armageddon is "grey goo": the little machines would disassemble the entire world, and everything and everyone on it, until all that's left is a spinning ball of, you guessed it, goo.
Another concern for some folks is that, for
the first time, we’ve begun to seriously tinker with genetics. We’ve
always fooled with animals (just look at a Chihuahua) but now we can
REALLY fool with them. It doesn’t take a scientist to imagine – and worry
about – what happens when we tinker with something like ebola or,
perhaps even worse, create something that affects the reproduction of
food staples like corn or wheat. Spreading from one farm to another,
carried perhaps on the wind, this rogue genetic tweak could kill
billions via starvation.
And then there's us. What happens if the tweak – carried by a virus or bacterium – screws not with our food but with us where we're the most sensitive: reproduction? Unable to procreate, we'd be extinct in as few as a hundred years.

While it's become a staple of bad science fiction, some scientists see the rise of the machines as a natural progression: whether we like it or not, one day we will create a form of artificial intelligence that will surpass and replace us. Even putting aside the idea that our creations might be hostile, the fact that they could be better than us at everything means it would simply be a matter of time before they go out into the universe – and leave us poor throwbacks behind.
These are frightening possibilities, but keep this in mind: if something does happen and it looks like it's going to be the End Of The World As We Know It, there is going to be one, and only one, place to turn for help: the world of observation, hypothesis, prediction and experiment.
In other words, we'd have to turn to science. Scientists would have gotten us into it, and only they would be able to get us out.