Putting the life back in science fiction


Next up, the ammonia economy?

This is another spun-off “strange attractor” from Antipope. It had nothing to do with the thread it was on, but the topic is interesting enough–if you’re into futuristic science fiction–that I wanted to summarize it here.

The basic idea is using ammonia as an alternate, carbon-free fuel. This isn’t as weird as it sounds, and there are a bunch of industrial efforts out there that might well project us into an ammonia age rather than a hydrogen one. Unfortunately, ammonia isn’t a panacea, so switching from fossil fuels to ammonia synthesized using solar or wind energy won’t be problem free. For those looking for dramatic conflict, ammonia has it.

Anyway, the fundamentals. Ammonia is NH3. If like me you’re lazy, you can go to Wikipedia’s article on energy densities, and find out that liquid ammonia has about 11.5 MJ/L of energy, slightly better than compressed natural gas (9 MJ/L) and liquid hydrogen (8.5 MJ/L), and less than propane (25.3 MJ/L) or gasoline (34.2 MJ/L), among many others.
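Since the post leans on those numbers repeatedly, here's a quick Python sketch that lines them up (the values are just the Wikipedia figures quoted above):

```python
# Volumetric energy densities quoted above, in MJ per liter.
fuels_mj_per_l = {
    "gasoline": 34.2,
    "propane": 25.3,
    "liquid ammonia": 11.5,
    "compressed natural gas": 9.0,
    "liquid hydrogen": 8.5,
}

# Rank the fuels from most to least energy-dense by volume.
for fuel, mj in sorted(fuels_mj_per_l.items(), key=lambda kv: -kv[1]):
    print(f"{fuel:>24}: {mj:5.1f} MJ/L")
```

Ammonia lands in the middle of the pack: better than hydrogen and CNG by volume, well short of the hydrocarbons.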

As for making NH3, right now we make it in huge plants using the Haber-Bosch process, which makes ammonia using natural gas. Nitrogen is ubiquitous as N2 in the atmosphere, but N2 is a very stable molecule, and it takes a lot of energy to break it apart and turn it into NH3. Still, people are looking for better ways to do it. NH3 Canada is developing a miniature ammonia synthesizer that's about four cubic meters in size and can produce 500 liters of ammonia per day, with each liter of ammonia taking 2 liters of water and 7.5 kWh of electricity to produce. As a comparison, the average US home uses 909 kWh per month, or about 30 kWh per day, which is about what it would take to make a gallon of ammonia using NH3 Canada's technology. If it works.

To save you the math, that's a bit over 40% conversion efficiency (11.5 MJ of ammonia out for the 27 MJ in 7.5 kWh of electricity), which isn't bad. Ammonia synthesis could be used to store electricity from, say, wind turbines. The nice thing about NH3 Canada's approach is that they want to use small units and stack them in banks, while the older technology uses huge furnaces to get economies of scale.
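For the curious, the arithmetic is short enough to script. This is a back-of-envelope sketch using the 11.5 MJ/L figure quoted earlier; the exact efficiency depends on which heating value you assume for ammonia:

```python
# Electricity in: NH3 Canada's claimed 7.5 kWh per liter of ammonia.
KWH_TO_MJ = 3.6
electricity_in_mj = 7.5 * KWH_TO_MJ          # 27 MJ of electricity per liter

# Energy out: volumetric energy density of liquid ammonia (Wikipedia figure).
ammonia_out_mj_per_l = 11.5

efficiency = ammonia_out_mj_per_l / electricity_in_mj
print(f"electricity-to-fuel efficiency: {efficiency:.0%}")

# Sanity check on the household comparison: a 30 kWh day of electricity
# buys about a gallon (3.785 L) of ammonia.
liters_per_day = 30 / 7.5
print(f"liters per 30 kWh: {liters_per_day:.1f}")  # 4 L, about a gallon
```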

What can you do with ammonia? You can actually mix it with gasoline and use it to run your car, if you get the mix right, and other researchers are working on creating engines that can run on pure ammonia. While there’s less energy in ammonia than there is in propane, it can be handled similarly. Pure (anhydrous) ammonia is fairly dangerous stuff, but then again, so is liquid hydrogen, and so are giant batteries if they’re fully charged. Energy density makes things dangerous.

Of course, ammonia has many other uses. We all know of it as a cleaner and a fuel, and it was used in refrigerators before people switched to the much more efficient and dangerous CFCs [but see comments]. But its primary uses are as a fertilizer and in explosives, including gunpowder. Industrial nitrogen fixation underlies Big Ag all over the world, and it also underlies industrial warfare. Without huge amounts of gunpowder, things like machine guns don't work, because there isn't enough ammunition to make them fully functional killing machines.

Similarly, without huge amounts of fixed nitrogen, the huge amounts of corn, wheat, and soy required to feed all seven billion of us wouldn't exist. Some calculate that at least a third of us wouldn't exist without fixed nitrogen in our food. The US has taken full advantage of this, and forcing huge supplies of cheap food on the world has been a major part of our foreign policy since the Eisenhower Administration. It was one way of beating communism, and protecting our high-yielding corn from piracy by the likes of China is a matter of national security today.

I’m not a huge fan of Big Ag, even though I’d probably be dead without it. Still, if we want to switch from fossil fuels to renewables, adapting and expanding our existing fixed nitrogen infrastructure is a lot easier than trying to build the infrastructure needed to handle hydrogen.

That’s the good part. There are some downsides.

One is that when you burn ammonia in an engine, it produces NOx, which is a major source of air pollution. This can be fixed if there’s a catalytic converter on the exhaust pipe. I suppose, if you’re powering agricultural equipment, it might be possible to capture the NOx, convert it to nitrate or urea, and use it as fertilizer on the fields, thereby getting a second use from the fixed nitrogen.

One big problem with an ammonia economy is the same problem with renewables, which is that you’re capturing energy from the modern sun, and that’s all you get to play with. Fossil fuels use fossil sunlight from the last few hundred million years, and that’s a lot more energy. There’s no fossil ammonia, so we’d be stuck working in a lower energy environment. Currently, industrially fixed nitrogen takes about 1% of the global energy supply, but that’s a fixed 1%, and if it’s used for other purposes, people can starve. We’d have to ramp up NH3 production to store captured renewable energy, not depend on what’s already being made.

Still, I can envision a world where giant farms host an overstory of huge wind turbines, all hooked up to ammonia synthesizers. The farmer uses the ammonia to run his equipment, then uses nitrates captured from the exhaust to fertilize his fields. Aside from the scale and all the problems with nitrogen runoff and pollution, this isn’t a bad setup.

There are some interesting follow-ons.
–One is politics. If most of the world switches to synthesizing ammonia from sunlight or wind, the countries that depend on petroleum exports are out of luck. The only parts of the Middle East that would continue to matter to the US (and possibly China and Russia) are Egypt and Israel, due to the Suez Canal. This means that the burgeoning crises in the region would have to be dealt with semi-regionally, if at all. And that's bad for all those refugees. Russia is likely to be a hold-out in switching off fossil fuels, since they get so much power from oil and natural gas, but a switch to ammonia would change international politics as much as the switch to oil did in the early 20th Century.
–A second issue is fertilizer. It is feasible to synthesize huge amounts of ammonia, but other elements are essential for plant growth, and the world is starting to run short of minable phosphorus. We may well have phosphorus wars in the future, but the simpler solution is to recycle sewage and manure onto fields. This has all sorts of public health and disease vector implications, but it will keep people from starving. And there are menu implications–do you want to eat raw salad from a field that receives sewage? It's a common practice in the developing world.
–A third issue is air pollution. I can easily see people using ammonia to power things like home generators in areas where the power grid is failing, but if these machines don’t have decent filters on their exhaust, they will put out a lot of air pollution. The resulting smog will degrade the performance of any local solar panels, but it might be simpler than investing in huge batteries and a smart grid to provide power when the sun doesn’t shine.

And who will control the ammonia? A nitrogen-based economy has less energy in it than the current oil-based economy does, and when energy is scarce, it translates into power even more than it does now. Right now, we're seeing how Big Oil distorts politics all over the world. Small ammonia generators, like the NH3 Canada machine, change the current game dominated by a few huge producers, because they mean that small-scale producers can make small amounts of fuel, at least in the short term. Probably this means the advantage shifts from those who produce ammonia to those who build ammonia synthesizers and can best ship the ammonia from producers to consumers. Over time, I suspect that a few big ammonia producers will come to dominate the industry in any one area. They will be, quite literally, power brokers.

Still, switching to ammonia could slow down global warming, because the great advantage of NH3 is there’s no carbon emission from using it. It beats things like bio-diesel and biomass cold. Unfortunately, we’re seeing increasing methane emissions from the Arctic, so even if we get civilization’s carbon emissions under control, we may be passing the tipping point as you read this. We’ll see.

If you want to write an SF thriller set in the next few decades, you could do worse than to power the world with ammonia and make the Politics of N a centerpiece of the story. After all, ammonia isn't just a fuel; it's a cleaner, a fertilizer, and a refrigerant. Who wouldn't want to get rich off it? Something to think about.



Book plans, part deux
August 20, 2015, 5:24 am
Filed under: book | Tags: ,

Hi All,

Just a brief note. Several people asked if they could be beta readers, and I really, genuinely appreciate the interest.

Here's the deal: I'm planning on self-publishing the book this fall while I look for a conventional publisher. There are a number of reasons to do this, ranging from getting it out in 2015, before the political craziness of 2016 gets fully into gear, to field-testing what is a somewhat unusual book by an unknown author, to see how it does. This is the Off Broadway run, if you will. It's a practice that both an agent and a guide to non-fiction publishing recommended, and it makes sense.

So you will have ample opportunity to read this and give me feedback: by email, on an open comment thread, and in online reviews. I'll be actively soliciting readers to catch bugs and typos, and to give me feedback so that I can make it better. So long as it's self-published, I can issue corrected editions. If and when it gets picked up by a publisher, I'll do my best (subject to corporate editing) to acknowledge everybody's feedback. The commercially published version will perforce be a different book. Aside from wider sales, commercial publishing will help me get the book into schools and libraries. It's egotistical, but I'd like to have it where people can find it even after it goes out of print.

So yes, thank you for your interest, and watch this space for further announcements.



Tragedy of the…um, what?

This is a short post inspired by a comment train on Charles Stross’ Antipope.

The question at the time was whether Nobel prizes are sexist, with far fewer women than men getting the award. My anecdote was about the late Dr. Elinor Ostrom, who received the 2009 Nobel Economics prize for her work showing that commons could indeed work. I discussed this a couple of years ago in a post about whether markets could be managed as commons, which was a topic I was playing with at the time.

The previous post lists eight principles that Dr. Ostrom found allow members of a commons to successfully manage that commons. This goes against the idea of the Tragedy of the Commons (Wikipedia article link). I hear the idea invoked most often today as a reason why commons should be privatized and managed by market forces, because otherwise they're doomed. This isn't quite what Hardin meant, and later on, he noted that he should have called it "The Tragedy of the Unregulated Commons."

Getting at the question of whether the economics Nobels are sexist, I’d point to three lines of evidence:
1. To date, Dr. Ostrom is the only woman who has received an economics Nobel.
2. As I recall from the references, she caught a lot of flack when she received the award. She was derided as a sociologist, not a real economist, and some said that there were other men who were more deserving who were overlooked that year.
3. More to the point, men in particular still refer to the tragedy of the commons as if it’s a real thing. When confronted with Ostrom’s work, they insist that they mean something that’s real, which is a common defense against any such attack.

Here’s the thing: everyone agrees that unregulated commons can be looted. But this statement is also true for any unregulated market (think illegal drugs, human trafficking, poaching…), and it’s true for unregulated capitalism (think illegal drugs, human trafficking, poaching…). If we’re going to use the term “Tragedy of the Commons” as if it’s real, I’d argue that it’s only fair to talk about “The Tragedy of Capitalism” and “The Tragedy of Markets” as the reason why we should manage as many common resources as commons, rather than having them under private, inequitable control that runs them into the ground for the profit of the few. It’s just as true.

However, I don’t expect anyone to be fair, so the better option is to realize that the Tragedy of the Commons is a term that needs to be retired. The reason for retiring it is that self-regulated commons can work very well. Properly designed and regulated commons are a perfectly reasonable management system for everything from community forests to large scale groundwater basins, and eliminating the “TotC” phrase from our vocabulary frees us up to explore these management options where they’re appropriate. Given how important things like groundwater management are for keeping civilization running, I’d suggest that every good management system should be an acceptable option for managing them, and that includes commons.



That brief window
July 29, 2015, 9:35 pm
Filed under: deep time, futurism, Worldbuilding | Tags: , ,

Well, the book manuscript is done, and I've got some beta readers going over it while I figure out the strange world of non-fiction publishing. As I understand it, one is not supposed to write a non-fiction book on spec, but rather to have a contract to write it, based on how well you can convince the publisher it will sell to your audience. And simultaneous self-publishing is a thing too, apparently. Interesting business, especially when I write a book of 100% speculation about a climate-changed Earth, and it's called non-fiction.

So I have time to blog more regularly.

One of the things I've increasingly noticed is how bad we are with big numbers, and dealing with big numbers turned out to be a central feature of the book. In general, when we look at phrases like a few years, or a few decades, or a few centuries, or a few millennia, or a few million years, we fixate on the "few" and ignore whatever comes after it. As a result, we get weird phrases like the Great Oxygenation Event, which took a few billion years as the Earth switched from an anaerobic atmosphere to an aerobic one. It doesn't sound like much, until you realize that animals have been on land less than 500,000,000 years–less than half a billion. The Primitive Animals Invade the Land Event will end with the expanding sun making such life impossible on Earth long before our little event matches the length of the Great Oxygenation Event, yet some people still think that the Earth was oxygenated very suddenly, rather than incredibly gradually. All that happened was that people ignored all the zeroes, called a process an event, and confused themselves and their audience.

This applies to human history as well. If we take the Omo 1 skull as the oldest modern human, we’ve got at least 195,000 years of history to our species already.

We're young compared to most species, but we've still got a lot of history, and most of it is lost. Our documented history covers about the last 5,000 years, and the archaeological record becomes fragmentary not much further back. In other words, thanks to writing, we've got partial access to about 1% of our apparent history as a species. The conventional interpretation is that humans were basically boring for the first 99% of our history, then something changed, and we took off like gypsy moths, expanding into this outbreak of humanity we call civilization. Prior to that, we were peaceable-ish hunter-gatherers living in harmony with nature.

What changed? The more I read, the more I tend to agree with the archaeologist Brian Fagan. In The Long Summer, he postulated that civilization arose after the last ice age because the climate stabilized, not because humans changed in any real way. There's some evidence to back him up. Richard Alley, in The Two-Mile Time Machine, talks about Dansgaard-Oeschger (D-O) events in the glacial record. These are times when the global temperature bounced back and forth many degrees, thought to be due to ice from Hudson Bay glaciers disrupting global thermohaline circulation in a semi-periodic way. Basically, the climate at glacial maximum is stably cold, the climate in an interglacial is stably warm, and the times in between have the climate oscillating between cold, colder, and coldest in something like a 1,500-year pattern with lots of noise. In such a continually changing global environment, things like agriculture would be difficult to impossible, so it's no surprise that humans were nomadic hunter-gatherers. If there was anything before the D-O events, the evidence would be lost, and the absence of evidence would make us think that, until 5,000 years ago, we were primitive savages.

If you’ve been following the news, you know that evidence for agriculture 23,000 years ago turned up in Israel (link to article). The last glacial maximum happened from 26,000-19,000 years ago. If one believes that stable climates make things like agriculture possible, then it’s easy to believe that someone invented farming during the last glacial maximum, and that it was lost when the D-O events started up and their culture shattered.

So how often did humans go through this, discover and lose agriculture? We have no clue. Except for that fortuitous find in the Sea of Galilee, when a long drought temporarily revealed an archaeological site that is currently underwater again, there’s no other evidence for truly ancient agriculture.

The last interglacial was the Eemian, 130,000-115,000 years ago. Did the Neanderthals invent agriculture back then? There’s little undisputed fossil or archaeological evidence from that time, and who knows if any evidence still exists. What we do know is that the Eemian people did not smelt a lot of metal, for there were ample ore deposits waiting for us to find them on the surface. We know they didn’t use petroleum or coal for the same reason, and there’s no evidence that they moved massive amounts of Earth or built great pyramids, as we’ve done. Those kinds of evidence seem to last. But if they had small neolithic farming towns, especially in northern Europe, the evidence would have disappeared in the subsequent glaciation.

This pattern applies to our future too, especially if climate change collapses our civilization and forces the few survivors to be hunters and gatherers. Our civilization would lose continuity, our history would vanish, our flimsy concrete buildings would collapse into rubble, and coastal ruins would disappear under the rising sea. What would remain of us, except our earthworks and our descendants? My rough guess is that such an age of barbarism would last between 200 and 2,000 years before the climate stabilized and civilization became possible again. Would the people building their civilization on the other side think they were the first civilized people too, that their history began when they were created a few thousand years prior, as we used to think?

That may be the fate of future humanity on Earth, even if our species lasts a billion years. When the climate is stable for thousands of years, there will be outbreaks of humanity–what we call civilization, when we temporarily escape nature's constraints, grow fruitful, and multiply to fill the place. In between these outbreaks there will be far fewer of us, and we'll live in smaller, simpler societies. What we know will be a balance between what we've retained and (re)discovered, and what crisis, collapse, and continual change have caused us to lose. Our history, at any one time, will be that brief window of a few thousand years between discovery and loss, with only enigmatic artifacts, like those 23,000-year-old seeds, to tell us that we weren't the first to discover something. They'll be enough to hint at how much history we've lost, but not enough to let us recover it.



Three solutions to the Fermi Paradox
March 27, 2015, 4:23 pm
Filed under: science fiction, Speculation | Tags: , ,

Wow, didn’t realize I hadn’t posted in so long.  I got busy with life and writing.  Here’s something that I was originally going to put in the book, but it doesn’t really fit there.  It’s thoughts about how human experience might explain the Fermi Paradox.

Now, for one thing, I don’t think it’s much of a paradox.  After all, as XKCD explained in “Alien Astronomers”, it would be almost impossible for a radio telescope on Alpha Centauri to pick up our radio broadcasts.  Military and research radar beams, yes, but not our ordinary chatter.  One critical point is that broadcasting powerful radio signals takes a lot of energy, and that’s expensive.  If it’s more cost effective to be efficient, then we’ll do it (as we have with broadcasting and intercontinental cable) and that makes us more radio-invisible.  At our current level of technology, the galaxy could be brimming with civilizations, and we couldn’t see them, nor could they see us.  Being blind isn’t much of a paradox.

Of course, the question is, why aren't the aliens here already? If they've had even a million years' more civilization, shouldn't they have starships? Well, here's another answer: starships are expensive, because at high speeds, they're a drag. This came out of an arXiv paper (link), and the pop-sci version on Io9. The basic point is that a starship traveling at high speed runs into photons from the cosmic microwave background, and if it's traveling fast enough, those collisions deposit something like 2 million joules per second, acting like friction and slowing the ship down. So not only does a starship have to hit those high speeds, it has to continuously generate more thrust as photon collisions slow it down. You can't just accelerate a starship and coast to another star, except at really low speeds that would take thousands of years between stars. Do you know how to make a machine that functions continuously for thousands of years? That's a non-trivial challenge. So there's answer #2 for the Fermi Paradox: space isn't slick enough to coast. At high speeds, the CMB acts like an aether and causes friction, requiring continuous acceleration.

Answer #3 for the Fermi paradox is the one I was going to stick in my book, which is about what the Earth will look like if the worst predictions of climate change come to pass, and humans don’t go extinct.  This scenario could also explain the Fermi Paradox.  Basically, in the roughly 500 years of the Industrial Revolution (and yes, I know that it was much longer in the run-up), we’ll have burned through all our fossil fuels, our available radioactive elements, minable elements from aluminum to phosphorus, groundwater, and so forth.  After we use up all the cheap energy and readily available raw materials, we’ll be stuck recycling everything using solar and gravitational energy (or biofuels, PV, wind turbines, and hydropower, if you want mechanisms) for hundreds of thousands to millions of years, until the Earth can generate more fossil fuels. Perhaps we had a brief window in the 1970s when, if we’d gone for it and known what we were doing, we *might* have put a colony on the moon.  Highly unlikely, but possible, and the chances of that colony surviving would be fairly low.  We can’t get to Mars now (due to little problems like radiation in interplanetary space), and if we don’t get nuclear fusion to work real soon now (the 1970s would have been a good time for that breakthrough, too), we’re going to be downsizing civilization pretty radically in the coming century, rather than going to Mars or beyond.

Let's assume that humans are relatively normal for sapient species, in the sense that we got our rapid spurt of technological advance by using up all the surplus energy that our planetary biosphere had squirreled away over the last 300 million years. By the time we understood the true state of our world and the galaxy, we also realized we were in trouble, because we were already deep into overconsumption and too-rapid population growth. By the time we are technologically sophisticated enough to possibly colonize another planet, we won't have the resources to do so. Indeed, we'll be forced to use any terraforming techniques we work out on the Earth itself, just to keep it habitable. Once we've survived this peak experience, we'll be a mature civilization (or more likely civilizations), but we'll also be radio-quiet, highly resource-efficient, and totally incapable of interplanetary travel, let alone interstellar voyaging.

That's the #3 answer to the Fermi Paradox: scientific development marches in tandem with resource extraction, and it's impossible to become sophisticated enough to colonize another planet without exhausting the resources of the planet you're on. It's possible that the universe is littered with ancient sophisticated civilizations that have already gone through their peak resource crisis and are quietly going on with their lives, stuck on their planets, kind of like kids who went to college to change the world and got stuck with crushing college debts and jobs that weren't their dreams. In our case, we've still got a billion years or so before Earth becomes totally uninhabitable, so it's not horrible to be "stuck" here, on the one planet we evolved to live on. It's just sad for those of us who thought that Star Trek looked like a really cool way to live.



American Brontosaur

I haven't posted recently, because I've been busy with a book and life throwing things at me. Anyway, as part of research for the book (which explores the idea of what the deep future looks like if severe climate change comes to pass and humans don't go extinct), I wanted to find out how much energy the average American currently uses. So I did the usual Google search, and tripped over Cecil Adams' 2011 Straight Dope column about whether Americans use more energy than a blue whale (as asserted in a 2009 New York Times article). He (actually his brainy assistant Una) cranked the calculation and came up with the basic answer of "no." Just for thoroughness' sake, I decided to replicate part of it.

It turns out that, in 2012 (according to LLNL: https://flowcharts.llnl.gov/energy.html), the US used 97.1 quadrillion BTUs of energy (quads), of which 41.7 quads were actually used for something and 55.6 quads were lost in the system. As of December 31, 2012, there were 312.8 million people in the US. Grinding the numbers out, converting BTUs per year into watts and assuming the population was constant throughout 2012, I got that the US generated about 10,378 watts per person, of which about 4,457 watts were used and 5,943 watts were wasted.
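The unit conversion is the only fiddly part, so here's the replication as a short Python sketch (the 365.25-day year and constant population are simplifying assumptions):

```python
# Convert annual US primary energy (quads = quadrillion BTU) into average
# watts per person.
BTU_TO_J = 1055.06
SECONDS_PER_YEAR = 365.25 * 24 * 3600

quads_total = 97.1        # LLNL 2012 flowchart: total US primary energy
population = 312.8e6      # US population at the end of 2012

total_watts = quads_total * 1e15 * BTU_TO_J / SECONDS_PER_YEAR
watts_per_person = total_watts / population
print(f"{watts_per_person:,.0f} W per person")  # a bit under 10,400 W
```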

So Cecil (actually Una) was basically right in saying that Americans use about 11 kilowatts of power per capita. According to their research, a hundred-ton blue whale runs on about 65 kilowatts. So if this mythical average American isn't consuming the energetic equivalent of a 100-ton blue whale, we're still vaguely equivalent to a 15-to-20-ton blue whale (they exist too–they're called calves).

While I was wallowing around, trying to find the appropriate whale equivalent for this average American, it dawned on me that there's a whole other class of critters that large: sauropod dinosaurs. Of course, they're extinct, so their current metabolic rate is zero. However, it's not entirely silly to postulate that they had whale-like metabolisms back when they were alive. We don't know how much the large sauropods weighed either, but Brontosaurus (yes, I know it's Apatosaurus, I'll get back to that) is thought to have weighed in between 15 and 20 tons, if you believe Wikipedia.

In other words, the average American uses as much energy as an average brontosaurus.

Now, of course we can argue that Apatosaurus is not the right sauropod, that due to some metabolic model or reconstructed weight or other, another sauropod is a better metaphor than ol’ bronty. It’s an understandable but unwinnable argument, because the energy use of the average American is kind of a goofy concept too. A big chunk of that energy is used (and lost) transporting stuff around supposedly to benefit us, but we never see it. It is also averaged across everything from the energy use of a bum on skid row to that of a jet-setting star, and it’s a very uneven distribution. What does average mean? Who’s average? Whatever it means, the average human working an eight hour office day works pretty well on somewhere around 75 watts (resting metabolism), so we average Americans are using something like the energy of 150 humans just sitting around doing paperwork.
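The "150 office workers" comparison is just one division, but spelled out:

```python
# Per-capita US power draw vs. a resting human's metabolic rate.
us_per_capita_watts = 11_000   # ~11 kW, from the Straight Dope figures above
resting_human_watts = 75       # rough resting metabolism of an office worker

office_worker_equivalents = us_per_capita_watts / resting_human_watts
print(round(office_worker_equivalents))  # about 147 resting humans apiece
```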

So, let's just say that we are, on average, the brontosaurs of the energy world, using an outdated dinosaur name as a metaphor for how much energy we consume. We're not the biggest energy users by country, but we're pretty close.

Now you might think that this energy use means we’re going to go extinct like the brontosaurs, because such energy consumption isn’t sustainable. I think the truth is a little different. As humans, we can live on 75 watts, even 250 watts if we’re working hard and not sitting around. It’s our culture that constrains us to act like brontosaurs, and I’m pretty sure our culture is going to have to change radically if it doesn’t want to disappear. Ultimately, it’s a question of identity: when it’s no longer possible for us to be American brontosaurs, will it still be possible for us to be Americans, or are we going to have to find, join, or develop other cultures that are more energy efficient? Who can we be in the future? That’s one of the questions I’m working on.



Gen. Alexander and the Legacy System from Hell

Here I am venturing into something I know nothing about: the Internet. Recently, I read a 1999 quote from Stewart Brand, in The Clock of the Long Now (BigRiver link), that the internet could "easily become the Legacy System from Hell that holds civilization hostage. The system doesn't really work, it can't be fixed, no one understands it, no one is in charge of it, it can't be lived without, and it gets worse every year."

Horrible thought, isn’t it? What I don’t know about are the legions of selfless hackers, programmers, techies, and nerds who are valiantly struggling to keep all the internets working. What I do know some tiny bit about are the concerted efforts of the NSA, under General Keith Alexander (who’s due to retire this spring), to install effectively undocumented features throughout the Internets and everything connected to them, so that they can spy at will. Perhaps I’m paranoid, but I’m pretty sure that every large government has been doing the same thing. If someone wants to hack us, they can.

So what?

Well, what I’m thinking about is the question of trust, rather than danger. The idea that cyberspace is dangerous goes well back before the birth of the World Wide Web. Remember Neuromancer? Still, for the first decade of online life, especially with the birth of social media, there was this trust that it was all for the greater good. Yes, of course we knew about spam and viruses, we knew the megacorps wanted our data as a product, and anyone who did some poking or prodding knew that spy agencies were going online too, that cyberwarfare was a thing. Still, there was a greater good, and it was more or less American, and it pointed at greater freedom and opportunity for everyone who linked in.

Is that still true? We've seen Stuxnet, which may well have had something to do with General Alexander's NSA, and we've seen some small fraction of Edward Snowden's revelations about how the NSA has made every internet-connected device capable of spying on us. Does anyone still trust the US to be the good guys who run the Internet for the world? Even as an American, I'm not sure I do.

This lost trust may be the start of the Internets evolving into the Legacy System from Hell. Instead of international cooperation to maintain and upgrade the internet with something resembling uniform standards, we may well see a proliferation of diverse standards, all in the name of cyber security. It’s a trick that life learned aeons ago, that diversity collectively keeps everything from dying from the same cause. Armies of computer geeks (engineers by the acre in 1950s parlance) will be employed creating work-arounds across all the systems, to keep systems talking with each other. Countries that fall on hard times will patch their servers, unable or unwilling to afford expensive upgrades that have all sorts of unpleasant political issues attached. Cables and satellites will fail and not be replaced, not because we can’t afford to, but because we don’t trust the people on the other end of the link to deal fairly with us and not hack the systems they connect to.

I hope this doesn’t happen, of course, but I wonder. Once trust is lost, it’s difficult to regain. On a global level, can we regain enough trust to have someone run the internet as an international commons? A good place? Or is it too late for that? I’m quite sure that US, Chinese, and Russian cyberwarfare experts all will say that their expertise is defensive, designed to minimize damage, and they may even believe it. Still, in the face of so many soldiers and spies amassing online, why trust our lives to this battlefield? Anything we put online might be wiped out or compromised, victim to a battle we neither wanted nor approved of.

Even though I don't have a reason to like him, it would be sad if General Alexander's legacy was starting the conversion of the internet into a legacy system. It will be instructive, too: a lesson in how the buildup of military power can backfire (something I think even Lao Tzu commented on). Fortunately or unfortunately, any history written on a legacy system will most likely vanish when the last expert walks away and the owners pull the plug. That's the problem with legacy systems, you see. Their data can vanish very, very quickly.



