Putting the life back in science fiction


That brief window
July 29, 2015, 9:35 pm
Filed under: deep time, futurism, Worldbuilding | Tags: , ,

Well, the book manuscript is done, and I’ve got some beta readers going over it while I figure out the strange world of non-fiction publishing. As I understand it, one is not supposed to write a non-fiction book on spec, but rather to get a contract to write the book based on how well you can convince the publisher it will sell, based on your audience. And simultaneous self-publishing is a thing too, apparently. Interesting business, especially when I write a book of 100% speculation about a climate-changed Earth, and it’s called non-fiction.

So I have time to blog more regularly.

One of the things I’ve increasingly noticed is how bad we are with big numbers, and dealing with big numbers turned out to be a central feature of the book. In general, when we look at phrases like a few years, or a few decades, or a few centuries, or a few millennia, or a few million years, we fixate on the “few” and ignore whatever comes after that. As a result, we get weird phrases like the Great Oxygenation Event, which took a few billion years back when the Earth switched from an anaerobic atmosphere to an aerobic one. It doesn’t sound like much, until you realize that animals have been on land less than 500,000,000 years, or less than half a billion years. The Primitive Animals Invade the Land Event will end with the expanding sun making such life impossible on Earth long before our little event matches the length of the Great Oxygenation Event, yet some people still think that the Earth was oxygenated very suddenly, rather than incredibly gradually. All that happened was that people ignored all the zeroes, called a process an event, and confused themselves and their audience.
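To put the zeroes back in, the comparison is trivial arithmetic. A quick Python sketch, using the rough figures from the paragraph above (with “a few billion” taken as two billion purely for illustration, not as precise geology):

```python
# Rough round figures from the text above, in years.
YEARS_OXYGENATION = 2_000_000_000     # "a few billion years" of gradual oxygenation
YEARS_ANIMALS_ON_LAND = 500_000_000   # less than half a billion years so far

ratio = YEARS_OXYGENATION / YEARS_ANIMALS_ON_LAND
print(f"The oxygenation 'event' ran ~{ratio:.0f}x longer than animals have been on land")
```

Even with the most generous reading, our entire land-animal tenure is a small fraction of the “event.”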

This applies to human history as well. If we take the Omo 1 skull as the oldest modern human, we’ve got at least 195,000 years of history to our species already.

We’re young compared to most species, but we’ve still got a lot of history, and most of it is lost. Our documented history covers about the last 5,000 years, and the archaeological record becomes fragmentary not long before that. In other words, thanks to writing, we’ve got partial access to only about 3% of our apparent history as a species. The conventional interpretation of this is that humans were basically boring for the first 97% of our history, then something changed, and we took off like gypsy moths, expanding into this outbreak of humanity we call civilization. Prior to that, we were peaceable-ish hunter-gatherers living in harmony with nature.

What changed? The more I read, the more I tend to agree with the archaeologist Brian Fagan. In The Long Summer, he postulated that civilization arose after the last ice age because the climate stabilized, not because humans changed in any real way. There’s some evidence to back him up. Richard Alley, in The Two-Mile Time Machine, talks about Dansgaard-Oeschger (D-O) events in the glacial record. These are times when the global temperature bounces back and forth many degrees, and they are thought to be due to ice from Hudson Bay glaciers messing up global thermohaline circulation in a semi-periodic way. Basically, the climate at glacial maximum is stably cold, the climate in the interglacial is stably warm, and the times between those periods have the climate oscillating between cold, colder, and coldest in something like a 1,500-year pattern with lots of noise. In such a continually changing global environment, things like agriculture would be difficult to impossible, so it’s no surprise that humans were nomadic hunter-gatherers. If there were agriculture before the D-O events, the evidence would be lost, and the absence of evidence would make us think that, until 5,000 years ago, we were primitive savages.

If you’ve been following the news, you know that evidence for agriculture 23,000 years ago turned up in Israel (link to article). The last glacial maximum happened from 26,000-19,000 years ago. If one believes that stable climates make things like agriculture possible, then it’s easy to believe that someone invented farming during the last glacial maximum, and that it was lost when the D-O events started up and their culture shattered.

So how often did humans go through this, discover and lose agriculture? We have no clue. Except for that fortuitous find in the Sea of Galilee, when a long drought temporarily revealed an archaeological site that is currently underwater again, there’s no other evidence for truly ancient agriculture.

The last interglacial was the Eemian, 130,000-115,000 years ago. Did the Neanderthals invent agriculture back then? There’s little undisputed fossil or archaeological evidence from that time, and who knows if any evidence still exists. What we do know is that the Eemian people did not smelt a lot of metal, for ample ore deposits were still sitting on the surface, waiting for us to find them. We know they didn’t use petroleum or coal for the same reason, and there’s no evidence that they moved massive amounts of earth or built great pyramids, as we’ve done. Those kinds of evidence seem to last. But if they had small neolithic farming towns, especially in northern Europe, the evidence would have disappeared in the subsequent glaciation.

This pattern applies to our future too, especially if climate change collapses our civilization and forces the few survivors to be hunters and gatherers. Our civilization would lose continuity, our history would vanish, our flimsy concrete buildings would collapse into rubble, and coastal ruins would disappear under the rising sea. What would remain of us, except our earthworks and our descendants? My rough guess is that such an age of barbarism would last between 200 and 2,000 years before the climate stabilized and civilization became possible again. Would the people building their civilization on the other side think they were the first civilized people too, that their history began when they were created a few thousand years prior, as we used to think?

That may be the fate of future humanity on Earth, even if our species lasts a billion years. When the climate is stable for thousands of years, there will be outbreaks of humanity–what we call civilization, when we temporarily escape nature’s constraints, grow fruitful, and multiply to fill the place. In between these outbreaks there will be far fewer of us, and we’ll live in smaller, simpler societies. What we will know will be a balance between what we’ve retained and (re)discovered, and what crisis, collapse, and continual change have caused us to lose. Our history, at any one time, will be that brief window of a few thousand years between discovery and loss, with only enigmatic artifacts, like those 23,000-year-old seeds, to tell us that we weren’t the first ones to discover something. They’ll be enough to hint at how much history we’ve lost, but not enough to let us recover it.



Three solutions to the Fermi Paradox
March 27, 2015, 4:23 pm
Filed under: science fiction, Speculation | Tags: , ,

Wow, didn’t realize I hadn’t posted in so long.  I got busy with life and writing.  Here’s something that I was originally going to put in the book, but it doesn’t really fit there.  It’s thoughts about how human experience might explain the Fermi Paradox.

Now, for one thing, I don’t think it’s much of a paradox.  After all, as XKCD explained in “Alien Astronomers”, it would be almost impossible for a radio telescope on Alpha Centauri to pick up our radio broadcasts.  Military and research radar beams, yes, but not our ordinary chatter.  One critical point is that broadcasting powerful radio signals takes a lot of energy, and that’s expensive.  If it’s more cost effective to be efficient, then we’ll do it (as we have with broadcasting and intercontinental cable) and that makes us more radio-invisible.  At our current level of technology, the galaxy could be brimming with civilizations, and we couldn’t see them, nor could they see us.  Being blind isn’t much of a paradox.

Of course, the question is, why aren’t the aliens here already?  If they’ve had even a million years’ more civilization, shouldn’t they have starships?  Well, here’s another answer: starships are expensive, because at high speeds, they’re a drag.  This came out of an arXiv paper (link), and the pop-sci version on Io9.  The basic point is that a starship traveling at high speed runs into photons from the Cosmic Microwave Background, and if it’s traveling fast enough, those collisions transfer about 2 million joules/second, acting like friction and slowing the ship down.  So not only does a starship have to hit those high speeds, it has to continuously generate more thrust as photon collisions slow it down.  You can’t just accelerate a starship and coast to another star, except at really low speeds that would take thousands of years between stars.  Do you know how to make a machine that continuously functions for thousands of years?  That’s a non-trivial challenge.  So there’s answer #2 for the Fermi Paradox: space isn’t slick enough to coast.  At high speeds, the CMB acts like an aether and causes friction, requiring continuous acceleration.

Answer #3 for the Fermi paradox is the one I was going to stick in my book, which is about what the Earth will look like if the worst predictions of climate change come to pass, and humans don’t go extinct.  This scenario could also explain the Fermi Paradox.  Basically, in the roughly 500 years of the Industrial Revolution (and yes, I know that it was much longer in the run-up), we’ll have burned through all our fossil fuels, our available radioactive elements, minable elements from aluminum to phosphorus, groundwater, and so forth.  After we use up all the cheap energy and readily available raw materials, we’ll be stuck recycling everything using solar and gravitational energy (or biofuels, PV, wind turbines, and hydropower, if you want mechanisms) for hundreds of thousands to millions of years, until the Earth can generate more fossil fuels. Perhaps we had a brief window in the 1970s when, if we’d gone for it and known what we were doing, we *might* have put a colony on the moon.  Highly unlikely, but possible, and the chances of that colony surviving would be fairly low.  We can’t get to Mars now (due to little problems like radiation in interplanetary space), and if we don’t get nuclear fusion to work real soon now (the 1970s would have been a good time for that breakthrough, too), we’re going to be downsizing civilization pretty radically in the coming century, rather than going to Mars or beyond.

Let’s assume that humans are relatively normal for sapient species, in the sense that we got our rapid spurt of technological advance by using up all the surplus energy that our planetary biosphere had squirreled away over the last 300 million years.  By the time we understood the true state of our world and the galaxy, we also realized we were in trouble, because we were already in a time of overconsumption and too-rapid population growth.  By the time we are technologically sophisticated enough to possibly colonize another planet, we won’t have the resources to do so.  Indeed, we’ll be forced to use any terraforming techniques we work out on the Earth itself, just to keep it habitable.  Once we’ve survived this peak experience, we’ll be a mature civilization (or more likely civilizations), but we’ll also be radio-quiet, highly resource-efficient, and totally incapable of interplanetary travel, let alone interstellar voyaging.

That’s the #3 answer to the Fermi Paradox: scientific development marches in tandem with resource extraction, and it’s impossible to become sophisticated enough to colonize another planet without exhausting the resources of the planet you’re on.  It’s possible that the universe is littered with ancient sophisticated civilizations that have already gone through their peak resource crisis and are quietly going on with their lives, stuck on their planets, kind of like kids who went to college to change the world and got stuck with crushing college debts and jobs that weren’t their dreams.  In our case, we’ve still got a billion years or so left before Earth becomes totally uninhabitable, so it’s not horrible to be “stuck” here, on the one planet we evolved to live on.  It’s just sad for those of us who thought that Star Trek looked like a really cool way to live.



American Brontosaur

I haven’t posted recently, because I’ve been busy with a book and life throwing things at me. Anyway, as part of research for the book (which explores the idea of what the deep future looks like if severe climate change comes to pass and humans don’t go extinct), I wanted to find out how much energy the average American currently uses. So I did the usual Google search and tripped over Cecil Adams’ 2011 Straight Dope column about whether Americans use more energy than a blue whale (which was asserted in a 2009 New York Times article). He (actually his brainy assistant Una) cranked the calculation and came up with the basic answer of “no.” Just for thoroughness’ sake, I decided to replicate part of it.

It turns out that, in 2012 (according to LLNL: https://flowcharts.llnl.gov/energy.html), the US used 97.1 quadrillion BTUs of energy (quads), of which 41.7 quads were actually used for something and 55.6 quads were lost in the system. As of December 31, 2012, there were 312.8 million people in the US. Grinding the numbers out, converting BTUs per year into watts and assuming that the population was constant throughout 2012, I got that the US generated about 10,378 watts per person, of which about 4,457 watts were used and 5,943 watts were wasted.
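The conversion itself is simple enough to sketch in a few lines of Python. The unit conversions are standard (1 quad = 10^15 BTU, 1 BTU ≈ 1055 J); the quad figures below are the ones that reproduce the per-person watt numbers quoted above:

```python
# Convert an annual energy total in quads (quadrillion BTUs) to average
# continuous watts per person.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
JOULES_PER_QUAD = 1e15 * 1055.0  # 1 quad = 1e15 BTU; 1 BTU ~ 1055 J

def quads_to_watts_per_capita(quads_per_year, population):
    joules_per_year = quads_per_year * JOULES_PER_QUAD
    return joules_per_year / SECONDS_PER_YEAR / population

US_POP_2012 = 312.8e6  # as of the end of 2012
for label, quads in [("generated", 97.1), ("used", 41.7), ("wasted", 55.6)]:
    watts = quads_to_watts_per_capita(quads, US_POP_2012)
    print(f"{label}: {watts:,.0f} W per person")  # ~10,378 / ~4,457 / ~5,943
```

Which is where the roughly 10 kilowatts per capita comes from.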

So Cecil (actually Una) was basically right in saying that Americans use about 11 kilowatts of energy per capita. According to what they found in their research, a hundred-ton blue whale runs on about 65 kilowatts. So while this mythical average American isn’t consuming the energetic equivalent of a 100-ton blue whale, we’re roughly equivalent to a 15-to-20-ton blue whale (they exist too–they’re called calves).

While I was wallowing around, trying to find the appropriate whale equivalent for this average American, it dawned on me that there’s a whole other class of critters that large: sauropod dinosaurs. Of course, they’re extinct, so their current metabolic rate is zero. However, it’s not entirely silly to postulate that they had whale-like metabolisms back when they were alive. We don’t know how much the large sauropods weighed either, but Brontosaurus (yes, I know it’s Apatosaurus, I’ll get back to that) is thought to have weighed in at between 15 and 20 tons, if you believe Wikipedia.

In other words, the average American uses as much energy as an average brontosaurus.

Now, of course we can argue that Apatosaurus is not the right sauropod, that due to some metabolic model or reconstructed weight or other, another sauropod is a better metaphor than ol’ bronty. It’s an understandable but unwinnable argument, because the energy use of the average American is kind of a goofy concept too. A big chunk of that energy is used (and lost) transporting stuff around, supposedly to benefit us, though we never see it. It is also averaged across everything from the energy use of a bum on skid row to that of a jet-setting star, and it’s a very uneven distribution. What does average mean? Who’s average? Whatever it means, the average human working an eight-hour office day runs pretty well on somewhere around 75 watts (resting metabolism), so we average Americans are using something like the energy of 150 humans just sitting around doing paperwork.

So, let’s just say that we are, on average, the brontosaurs of the energy world, using an outdated dinosaur name as a metaphor for how much energy we consume. We’re not the biggest energy users by country, but we’re pretty close.

Now you might think that this energy use means we’re going to go extinct like the brontosaurs, because such energy consumption isn’t sustainable. I think the truth is a little different. As humans, we can live on 75 watts, even 250 watts if we’re working hard and not sitting around. It’s our culture that constrains us to act like brontosaurs, and I’m pretty sure our culture is going to have to change radically if it doesn’t want to disappear. Ultimately, it’s a question of identity: when it’s no longer possible for us to be American brontosaurs, will it still be possible for us to be Americans, or are we going to have to find, join, or develop other cultures that are more energy efficient? Who can we be in the future? That’s one of the questions I’m working on.



Gen. Alexander and the Legacy System from Hell

Here I am venturing into something I know nothing about: the Internet. Recently, I read a 1999 quote from Stewart Brand, in The Clock of the Long Now (BigRiver Link), that the internet could “easily become the Legacy System from Hell that holds civilization hostage. The system doesn’t really work, it can’t be fixed, no one understands it, no one is in charge of it, it can’t be lived without, and it gets worse every year.”

Horrible thought, isn’t it? What I don’t know about are the legions of selfless hackers, programmers, techies, and nerds who are valiantly struggling to keep all the internets working. What I do know some tiny bit about are the concerted efforts of the NSA, under General Keith Alexander (who’s due to retire this spring), to install effectively undocumented features throughout the Internets and everything connected to them, so that they can spy at will. Perhaps I’m paranoid, but I’m pretty sure that every large government has been doing the same thing. If someone wants to hack us, they can.

So what?

Well, what I’m thinking about is the question of trust, rather than danger. The idea that cyberspace is dangerous goes well back before the birth of the World Wide Web. Remember Neuromancer? Still, for the first decade of online life, especially with the birth of social media, there was this trust that it was all for the greater good. Yes, of course we knew about spam and viruses, we knew the megacorps wanted our data as a product, and anyone who did some poking or prodding knew that spy agencies were going online too, that cyberwarfare was a thing. Still, there was a greater good, and it was more or less American, and it pointed at greater freedom and opportunity for everyone who linked in.

Is that still true? We’ve seen Stuxnet, which may well have had something to do with General Alexander’s NSA, and we’ve seen some small fraction of Edward Snowden’s revelations, about how the NSA has made every internet-connected device capable of spying on us. Does anyone still trust the US to be the good guys who run the Internet for the world? Even as an American, I’m not sure I do.

This lost trust may be the start of the Internets evolving into the Legacy System from Hell. Instead of international cooperation to maintain and upgrade the internet with something resembling uniform standards, we may well see a proliferation of diverse standards, all in the name of cyber security. It’s a trick that life learned aeons ago, that diversity collectively keeps everything from dying from the same cause. Armies of computer geeks (engineers by the acre in 1950s parlance) will be employed creating work-arounds across all the systems, to keep systems talking with each other. Countries that fall on hard times will patch their servers, unable or unwilling to afford expensive upgrades that have all sorts of unpleasant political issues attached. Cables and satellites will fail and not be replaced, not because we can’t afford to, but because we don’t trust the people on the other end of the link to deal fairly with us and not hack the systems they connect to.

I hope this doesn’t happen, of course, but I wonder. Once trust is lost, it’s difficult to regain. On a global level, can we regain enough trust to have someone run the internet as an international commons? A good place? Or is it too late for that? I’m quite sure that US, Chinese, and Russian cyberwarfare experts all will say that their expertise is defensive, designed to minimize damage, and they may even believe it. Still, in the face of so many soldiers and spies amassing online, why trust our lives to this battlefield? Anything we put online might be wiped out or compromised, victim to a battle we neither wanted nor approved of.

Even though I don’t have a reason to like him, it would be sad if General Alexander’s legacy was starting the conversion of the internet into a legacy system. It would also be instructive, a lesson in how the buildup of military power can backfire (something I think even Lao Tzu commented on). Fortunately or unfortunately, any history written on a legacy system will most likely vanish when the last expert walks away and the owners pull the plug. That’s the problem with legacy systems, you see. Their data can vanish very, very quickly.



Are Markets Commons? Perhaps they should be managed that way?
December 19, 2013, 9:38 pm
Filed under: commons, economics, Speculation, sustainability | Tags: , ,

This isn’t my original idea. I’m reading John Michael Greer’s The Wealth of Nature: Economics as if Survival Mattered (Amazon link), and he makes the assertion that a free market, “in which buyers and sellers are numerous enough that free competition regulates their interactions,” is a form of commons, a resource that should ideally be free to all in a society. He goes on to point out that this is in contrast to those who think that all commons should be eliminated in favor of private ownership. The issue he’s getting at is that free markets cannot exist without regulation, something recognized even by Adam Smith, who noted in the Wealth of Nations that “people of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or some contrivance to raise prices” (reference).

I can see a long argument about how true this is, because it’s a provocative concept. Markets and commons are so diametrically opposed in traditional capitalist thinking that it’s hard to consider that they have anything in common. I’m happy to have that discussion, but there’s another related issue that, to me, is even more interesting: Can markets be managed as commons?

We don’t have any data on markets as commons, but the late Elinor Ostrom won the 2009 Nobel in economics for her studies of how commons are successfully managed. She found, through studying both successful commons (water districts, community forests, and the like) and unsuccessful ones, that there were eight “Design Principles” that distinguished the successful commons from the failures (Amazon link to reference).

Here are Dr. Ostrom’s eight design principles, as rewritten by David Sloan Wilson in The Neighborhood Project (Amazon link). I’m using Wilson’s version since it’s more general than the original.

1. Clearly defined boundaries. Members of the group should know who they are, have a strong sense of group identity, and know the rights and obligations of membership. If they are managing a resource, then the boundaries of the resource should also be clearly identified.

2. Proportional equivalence between benefits and costs. Having some members do all the work while others get the benefits is unsustainable over the long term. Everyone must do his or her fair share, and those who go beyond the call of duty must be appropriately recognized. When leaders are accorded special privileges, it should be because they have special responsibilities for which they are held accountable. Unfair inequality poisons collective efforts.

3. Collective-choice arrangements. Group members must be able to create their own rules and make their own decisions by consensus. People hate being bossed around but will work hard to do what we want, not what they want. In addition, the best decisions often require knowledge of local circumstances that we have and they lack, making consensus decisions doubly important.

4. Monitoring. Cooperation must always be guarded. Even when most members of a group are well meaning, the temptation to do less than one’s share is always present, and a few individuals might try actively to game the system. If lapses and transgressions can’t be detected, the group enterprise is unlikely to succeed.

5. Graduated sanctions. Friendly, gentle reminders are usually sufficient to keep people in solid citizen mode, but tougher measures such as punishment and exclusion must be held in reserve.

6. Fast and fair conflict resolution. Conflicts are sure to arise and must be resolved quickly in a manner that is regarded as fair by all parties. This typically involves a hearing in which respected members of the group, who can be expected to be impartial, make an equitable decision.

7. Local autonomy. When a group is nested within a larger society, such as a farmers’ association dealing with the state government or a neighborhood group dealing with a city, the group must be given enough authority to create its own social organization and make its own decisions, as outlined in items 1 and 6 above.

8. Polycentric governance. In large societies that consist of many groups, relationships among groups must embody the same principles as relationships among individuals within groups.

What’s interesting about these rules is that, superficially, it looks like these would be great rules for free markets as well. Look at the complaints such rules would solve:

–Markets should have boundaries. People get really uncomfortable when everything is for sale, whether they want it to be or not. There’s a general idea that some things should not be for sale, while markets are the appropriate venue for other things. Similarly, not everyone wants to participate in “the marketplace,” and the outsiders resent being forced in.
–Markets should be fair, the fabled level playing field. Most would agree that people should get special privileges only so that they can exercise special responsibilities, not because they have special connections. Similarly, corruption and gaming the system should be punished.
–Collective decision making. This one is tough, because everyone wants to constrain the fat cats, whether or not they’re in the market. Still, there are many complaints about top-down rulemaking, and with good reason. This is not to say that markets are all good at self-governing (and here I’m thinking about the body count in illegal drug-market disputes), but to the extent that a market is self-governing, having rules that everyone agrees are fair is not a bad thing.
–Monitoring: this one is a no-brainer. Corruption kills markets, and they always need to be monitored to keep people from gaming the system. Interestingly, monitoring in commons can come from within, from people hired to monitor the system, or from outside officials. Any and all of these can work, depending on the circumstances.
–Fast and fair conflict resolution: another no-brainer. Things work best when disputes can be settled fairly and quickly, either by a tribunal within the market or by higher authorities, so long as judgment is fast and fair.
–Local autonomy. This can be somewhat problematic when you think about Wall Street, but it’s the flip side of having collective decision making within a market. If the authorities are going to let a market make its own rules, they need to let the market govern itself. Note, however, that authorities can be intimately involved in both monitoring and conflict resolution, so long as the market grants that this is their legitimate role.
–Polycentric governance: This is the idea that the relationship between individuals and a market is mirrored between markets within a greater market, if such a hierarchy exists. I’m not sure how this would work in practice, but it does embody the same ideas of group decision making (on the level of member markets), monitoring, fast and fair dispute resolution, and so forth. That’s not a bad way to handle commerce on a large scale.

To me, this is the bigger point: even if markets aren’t exactly commons, it certainly looks like the principles that lead to successful commons might lead to successful free markets. Additionally, it’s not particularly driven by any market ideology: both progressives and libertarians could agree on these design principles. Even the big government proponents tend to agree (in my experience) that the best regulations are the ones that people think are fair and fairly enforced. Trying to get such regulations written can be very difficult, but it’s often a major goal of regulation. What also makes this interesting is that, if you accept that markets may be commons, it’s possible to have a free market under a wide range of conditions—so long as the market is properly monitored and managed according to rules.

A truly free market won’t work, but a market commons may well be viable. What’s sad is how far Wall Street currently is from most of these design principles. Perhaps our financial markets are a lot less successful and sustainable than we might wish for? Perhaps they need (shock, horror) more regulation, not less, to last?

What do you think?



Hobbits of the ATM?

No, I haven’t seen the latest offering from Peter Jackson yet, but I will soon. Still, in honor of the latest, erm, extension of The Hobbit onto the big screen, I thought I’d pitch out an interesting possibility for the future of at least some of our descendants.

First, a definition: ATM isn’t the money machine. Rather, it’s an acronym for Anthropocene Thermal Maximum, which we’ll hit sometime after we’ve burned all the fossil fuels we’re willing to dig up. If we blow off something like 2,500 gigatonnes of carbon or more, we’re going to be in the range of the PETM, the Paleocene-Eocene Thermal Maximum (Wikipedia link) about 55.8 million years ago, when global temperatures were as hot as they have been at any point in the last 60 million years. Our descendants’ future will be similar, if we can’t get that whole carbon-neutral energy economy working.

One of the interesting recent findings is that mammals shrank up to 30 percent during the PETM (link to press release). The reason given by the researchers is that increased CO2 causes plants to grow more foliage and fewer fruits (in the botanical sense, so we’re talking fruits, nuts, grains, and all the other things we like to eat). This poorer nutrition led to smaller animals. I think there’s another possible explanation for the decrease in animal size.

My thought was that, if civilization crashes due to radical climate change into a PETM-type world, humans will be at the mercy of the elements, so it’s quite likely that future people will be smaller. Perhaps 30 percent smaller? Sitting down with the BMI graph and making a few assumptions, I found that the 30%-smaller equivalent of a 71-inch-tall male weighing 160 lbs is approximately 60 inches tall. Now, this is an interesting height, because it is the upper limit of pygmy heights in an interesting 2007 study by Migliano et al. in PNAS (link to article). Their hypothesis was that the evolution of pygmies around the world is best explained by significant adult mortality, to which they adapted by shifting from growth to reproduction earlier in their lives. The researchers found that the average age at mortality in pygmies is 16-24, and few live into their 40s. The major cause of death is disease, rather than starvation or accidents.
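For the curious, the height figure falls straight out of the definition of BMI. A minimal Python sketch, assuming (as I did above) that body mass drops 30% while BMI stays constant, so that height scales with the square root of the mass ratio:

```python
import math

def scaled_height(height, mass_fraction):
    """Height after scaling body mass by mass_fraction while holding BMI
    constant. BMI = mass / height**2, so height scales with sqrt(mass)."""
    return height * math.sqrt(mass_fraction)

# A 71-inch, 160-lb male scaled down 30% in body mass:
new_height = scaled_height(71, 0.70)
print(f"{new_height:.1f} inches")  # ~59.4, i.e. roughly 60 inches
```

The units don’t matter as long as BMI is held fixed; only the mass ratio enters.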

While I don’t know of any evidence of increased animal disease during the PETM, there is good evidence for increased plant disease and predation by insects (link), so it’s not much of a stretch to hypothesize that the animal dwarfing could have been caused by increased disease, decreased lifespans, and a resulting shift towards smaller body size and early reproduction.

So, here’s the idea: if we blow too much carbon into the air, and our ATM rivals or exceeds the PETM, at least some of our descendants will be the size of pygmies, due to the harsher environment (more disease, less medical care) favoring people who mature earlier and have kids as teenagers. They probably won’t be hobbits unless a hairy-footed morph takes off somewhere (perhaps in the jungles of Northern California?), but they will technically be pygmies.

It’s not the most pleasant thought, but if short lives and short statures are troubling, the good news is that post-PETM fossils show that animal species regained their former sizes once the carbon was out of the air. And, according to Colin Turnbull’s The Forest People, life as a pygmy isn’t necessarily nasty or brutish, even if it’s short.



The Syrian Water War (?)

Not that I’m an expert on foreign policy or Syria (there’s an expert with the same last name, but we’re not related). The one thing I do understand, a little bit, is water politics, and that may well be one of the important drivers of the Syrian civil war. As Mark Twain reportedly said, “Whiskey’s for drinking, water’s for fighting over.” And good Muslims won’t drink whiskey. Since I’m interested in the deep future under climate change, this might be a portrait of things to come for other parts of the world, including where I live in the southwestern US.

Here’s the issue: between 2006 and 2011, the eastern 60 percent of Syria experienced “the worst long-term drought and most severe set of crop failures since agricultural civilizations began in the Fertile Crescent many millennia ago,” forcing 200,000 Syrians off the land (out of 22 million people in Syria) and causing them to abandon 160 towns entirely (source). In one region in 2007-2008, 75% of farmers suffered total crop failure, while herders in the northeast lost up to 85% of their flocks, which affected 1.3 million people (source). Assad’s policies exacerbated the problem: his administration subsidized water-intensive crops like wheat and cotton, and promoted bad irrigation techniques (source; I’m still looking for a description of what those bad techniques were).

These refugees moved to cities like Damascus, which were already dealing with over a million refugees from Iraq and Palestine. They dug 25,000 illegal wells around Damascus, lowering the water table and increasing groundwater salinity (source). The revolt in 2011 broke out in southern Daraa and northeast Kamishli, two of the driest parts of the country, and reportedly, Al Qaeda affiliates are most active in the driest regions of the country (source).

One thing that worsened the problem was Turkey. The Tigris and Euphrates Rivers flow out of the Kurdish region of Turkey into Syria (the Orontes, for its part, flows from Syria into Turkey). In a bid to modernize the Kurdish region, Turkey had built 22 dams on these rivers by 2010 under the Southeastern Anatolia Project. The dams have taken half the water out of the Euphrates, and used it to grow large amounts of cotton within Anatolia, doubling or trebling local incomes in that traditionally rebellious area.

So is drought destiny? Experts caution that it’s not that simple (source). In 2012, the American Midwest suffered a record drought. While that may have fueled Tea Party outbursts in the 2012 elections, it didn’t lead to armed insurrection. (As an aside, you can check how well the drought map correlates with the 2012 Presidential election map. Washington might one day take note of this…) Still, when you couple drought with poverty, bad governance, and a witch’s brew of historical grievances and systemic injustice, it can cause a civil war.

There are a couple of big problems here. The first is that the US didn’t see the revolt coming. Right up until the first protests started, analysts thought that Syria was immune to the Arab Spring (source). This isn’t all that surprising. Due to the War on Terror, the CIA and other agencies work closely with foreign governments’ intelligence agencies to hunt terrorists (source), and have little or no intelligence capability for learning what’s happening on the “Arab Street.” This led to the US missing the Arab Spring movement pretty much in its entirety. The US military has been talking about climate change for years, and they’re starting to get serious about preparing to deal with it (source), but they don’t seem to have a functional reporting system yet, let alone a good way to respond. To put it bluntly, no one in Washington or other capitals seems to be watching things like water supplies, crop reports, rural migration to cities, or even the price of bread. Or if they are, they’re not being listened to. Spikes in bread prices throughout North Africa helped prepare the ground for the Arab Spring uprisings, and the region is still a major wheat importer (source).

The second problem is that, so far, our leaders haven’t officially acknowledged that water’s a problem. Basically, during the drought, Syrian per capita water availability dropped by almost half. While a lot of this could be recovered through better management (growing different crops, convincing people not to eat bread in the place where wheat was first farmed, and so forth), there are probably too many people relative to the water supply, at least during droughts. Part of this is demographics: the population of the Middle East has quadrupled over the last 60 years, while the water supply, if anything, has shrunk (source). The brutal answer is to get rid of those people, which may be one reason why Assad was so willing to use chemical weapons. There are 1.851 million registered Syrian refugees outside the country at the moment, roughly eight percent of the pre-war population of 22 million. Assad (and whoever follows him) may not be interested in having them return, either. Syria would likely be more stable with fewer thirsty mouths.

What’s the solution? One important part is to get water on the negotiating table. Turkey officially helps Syria with water flows, but it’s not clear how diverting half a river is a friendly gesture, and the two countries are not on good diplomatic terms. If the Turks are using the Euphrates to water cotton, most of that water is lost to the air, rather than flowing back into the river where Syria can get it. Turkey could help stabilize Syria by letting more water out of its dams, but by doing so it would risk insurrection in Kurdistan, so I don’t think they will voluntarily give up that water. Since Turkey’s water sources are secure for the moment, I suspect that quite a few Syrians are going to be resettling there, just as Iraqis and Palestinians are (or were) living in Damascus. More countries should volunteer to permanently take in Syrian refugees, especially in the north (as Sweden has). Why not? It increases populations in areas that are experiencing population declines due to low birth rates, and it’s cheaper than trying to fight in the Middle East. Moving people to where there’s water is much less cruel than interning them in refugee camps in border deserts with inadequate resources and no hope.

One of the problems with climate change is that the northern edges of deserts are forecast to get drier, and the Middle East and the Mediterranean basin sit on one of those edges. If we want to avoid continual unrest in that region, it’s high time we all (in the international sense) started financing regional desalination plants in the Middle East and other dry areas. This has worked to secure water for Israel. Granted, it’s an energy-intensive solution, but a large-scale desalination plant is cheaper than a single day of all-out ground war, US style (source).

The other lesson here is that politics and politicians matter. Drought isn’t necessarily destiny, but bad water management choices can turn a chronic problem of scarce resources into a bloody war. If you want to know why I’m not a libertarian, this is why. It’s nice to have liberty, but it’s necessary to have water. Good politicians work to get you enough of both, and we need more of them at the moment.



