Friday 11 February 2022

Was Climate Change Inevitable?



Was humankind's journey of scientific discovery always destined to cause climate change?

The Earth's surface measures some 510 million square kilometres, of which roughly 149 million is land, and there are presently 7.9 billion people living on it. Even counting polar regions and deserts, where people can't easily live in large numbers, that works out at less than 0.02 square kilometres of land per person.
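For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch (the area figures are rounded approximations):

```python
# Back-of-the-envelope check of the per-person figures quoted above.
total_surface_km2 = 510e6  # Earth's total surface area, land and ocean
land_km2 = 149e6           # approximate land area
population = 7.9e9         # world population at the time of writing

print(f"Surface per person: {total_surface_km2 / population:.3f} km^2")  # ~0.065
print(f"Land per person:    {land_km2 / population:.3f} km^2")           # ~0.019
```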

Our distribution across the planet's surface is already disrupting its pre-existing global ecosystem. Assuming that this ecosystem should be in equilibrium, humans are now responsible for changes in climate, animal populations, atmospheric gases and a host of other things we could go on listing. By the very act of existing, we have done that.

Much is said about the lack of environmental foresight of world governments and private corporations. The discussion usually goes along these lines: years ago we could have done something about these things and arrested the changes we were making to the environment, but we never did. However, culturally we have always preferred to live in a world of increasing technological refinement rather than a static, unchanging one. We have, after all, allowed this to happen, and continue to do so through the governing decisions we take, or allow to be taken, in the world around us. Arguably, this isn't mere acceptance. In a sense, we have in fact decided this is how it's going to be. While there are many dissenting voices to this sort of "progress", ultimately it has occurred, and continues to occur. Was this truly avoidable, or would avoiding it have contradicted the very identity we have built a civilisation upon?

As a means of illustrating the question, we can consider population, on the basis that one of the hallmarks of our existence is a steadily increasing human population. Other hallmark aspects of our global civilisation could be examined similarly, but for simplicity's sake we can look at population alone here. Option A is that we stay our current course and continue to increase in population. We would then one day inhabit the planet at such density that the impact on the planet's ecosystem would be so profound that no truly independent network of life could exist alongside us. Every individual example of animal life and every natural chemical process would necessarily be interdependent with human beings in some way. Option B, the alternative, is of course that we never reach that point: that we somehow change the path we are on and manage a return to a planetary ecological equilibrium. Essentially, we can keep on doing what we are doing and drift farther from an ecological equilibrium, or we can turn our course around and move towards a more ecologically stable planet, so far as it is in our power to do so. But what would this require us to do?

While we can create devices to extract energy in less impactful ways, they are themselves the product of heavy industry. Industry that survives only through the existence of ever-increasing demand.
By © Hans Hillewaert, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=6361901


In some ways, this would require us to fundamentally change the way we exist in the world. That is to say, the very foundations of our civilisation would have to be uprooted and swapped out. The pursuit of security and plenty would have to come second in priority to the sustenance of our environment. There is no known example of a significantly populated civilisation ever performing this feat. There has never been a time, that we are aware of, when a large group of people felt so secure about their ability to survive that they gave up the pursuit of resources and instead focussed on managing their impact on the environment. Is that to say it can't be done? I think not. It certainly isn't the case that we view the past as a direct blueprint for how we should behave going forward. In general we believe the progress of ideas should improve the conditions of human existence. As time goes on, we have sought to elevate our ideas and practices for the benefit of humankind, benefit meaning a more reliable and secure existence. However, is there a paradox there? The very nature of progress in this context is at odds with the idea of a static, unchanging ecological equilibrium. As we improve the world and our conditions, we do not ask that fewer people be around to enjoy it, or that it should be the preserve of the few. In essence we make life on Earth easier, it becomes easier for more people to exist, and this contributes to the disruption we are causing. Or does it?

As we reflect upon our development over time, it certainly appears that every variety of human civilisation that currently exists has progressed towards creating more and more ecological impact. We may be better informed about the degree of this impact than we have been in the past, potentially putting us in a new position, but that is debatable. Regardless, if we can see a problem today, what can we do about it, and what would human civilisation look like in a world in which we free ourselves from these maladies?

The driving force behind all of our ecological impact appears to be resources. It is the quest for ever more resources that leads us to extract from, disrupt and alter the world around us. Every action has an energy cost, and that cost must be borne by something, something we see as expendable. To a certain degree, this is inevitable. All living things require energy to exist, but through evolution, the means by which living things extract this energy have become highly efficient. The means by which we extract energy to supply our civilisation are not as efficient, and we produce a lot of waste, which in turn requires energy to manage. As our civilisation grows, its energy demands increase, not only because of increasing population but also because of the more energy-intensive technologies we employ. So the relationship between "progress" and energy requirements is very non-linear. The obvious solutions are either to use and demand less energy or to extract it more efficiently. However, to arrest this process altogether we would have to establish an acceptable energy budget that could be borne by the world, and limit ourselves to that amount. We can continue to improve the efficiency with which we extract energy, but at its core the problem is one of demand. We need to limit our demands to the budget allocated to us by the ecosystem.
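As a purely illustrative sketch of that non-linearity, consider what happens when population and per-capita energy use both grow at modest compound rates (the growth rates below are invented for illustration, not measured data):

```python
# Total energy demand is population x per-capita use, so when both grow
# exponentially, demand grows at the combined rate. Illustrative numbers only.
pop_growth = 0.01  # assumed 1% per year population growth
use_growth = 0.02  # assumed 2% per year growth in energy use per person

for year in (0, 25, 50, 75, 100):
    population = (1 + pop_growth) ** year  # relative to year 0
    per_capita = (1 + use_growth) ** year  # relative to year 0
    print(f"year {year:3d}: population x{population:4.1f}, "
          f"total demand x{population * per_capita:5.1f}")
```

After a century of these modest rates, population has not quite tripled, yet total demand has grown almost twentyfold.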

Speaking as an engineer, the first thing that comes to mind is construction. We would naturally seek to construct things as efficiently and infrequently as possible, using materials that are ecologically inert and naturally occurring. For buildings and facilities, making things out of stone is an ecologically wise choice given the durability of the material. While there are a few such structures on Earth, most are actually less durable constructions that use rock or stone as a component building material. True monolithic stone architecture is, and would be, "forever". Or at least, it would endure so long that it would one day rest in the hands of a succeeding species of human being. As such it would necessarily be designed artistically, to remain as relevant as possible to the many cultural ideas and fashions it would outlive. The natures of art and science would bring a compromise between functional designs and aesthetically pleasing ones. They would be simple in nature, appealing to the core archetypes of the human soul, and as practical to make and coexist with as possible. We would necessarily begin to examine our core requirements and aesthetic desires in the process. The nature of stonework as a fringe skill would change, and refinements in the science and technology of stone working would follow. Achievements in that field would once again become part of the lasting legacy of the age, as was the case in the distant past.

Al Deir, Petra. Now almost two thousand years old, and carved from solid rock.
By Azurfrog, CC BY-SA 3.0 https://creativecommons.org/licenses/by-sa/3.0, via Wikimedia Commons

Similarly, to minimise the general consumption of resources we would seek to simplify and minimise our requirements. It would be logical to be as healthy as possible, reducing the need for the manufactured items and economies that surround the support of ill health. This concept of simplification would extend to war too. To eliminate one of the most resource-intensive areas of human existence, we would have to live in peace.

Along the same lines, we would probably need to integrate our way of life with the natural environment. The presence of animals and natural habitats would have to be part of our day-to-day life, as the only way to minimise our impact on these things is to shrink the zone in which we influence them, to the point where they exist natively on our doorstep.

With just a few cursory thoughts on the subject, the nature of the changes required appears profound. It would require a degree of adaptation that would necessarily mean cultural, philosophical, artistic and spiritual change. In essence, we would need to look at ourselves and the world in a totally different way. Look around and observe the sort of attitudes prevalent in the world today, and it becomes clear that this could not be achieved in a single lifetime; it would no doubt be a gradual, multigenerational process. So there is necessarily a phase, now, where we are aware of the problem but as yet unable to create meaningful mitigations for it.

If we are making progress towards this type of equilibrium-focussed civilisation, then it is happening so gradually that it doesn't appear perceptible in any real way at the moment. Flirtations with sustainable energy, recycling and other such endeavours are all based on the concept of maintaining and improving the way we presently live: continuing to increase resource requirements, but altering only the way in which some resources are sourced. Of course, this is to be expected, as the economics of our civilisation demand that something must be produced and sold in order to create the means to persist. In essence we are asking a system that depends upon growth to stop growing. That is to say, we have not yet found and integrated the cultural, philosophical, artistic and spiritual changes required for these acts to be meaningful. If that is the case, then all this was inevitable; we simply never had the means to avoid it.

Thursday 28 July 2016

What is Democracy?

Democracy. It’s a word often thrown about. An end that is used to justify, rationalise and ultimately convince people that an action is either right or wrong. But, what does democracy mean? How does democracy work? What is the mandate given to a democratically elected leader?

As it happens, there is no one answer to each of these questions. Democracy is an idea. How that idea is implemented has varied greatly over the course of history, and indeed how the democratic process has been empowered and enacted has varied greatly with it.


Last month, in the UK, a historic referendum took place. The people were told that they were being given the choice of whether the UK would remain in the EU or leave. The voters turned out, 72% of them. The vote went in favour of those who wished to leave, 52% to 48%. The Brexit (British Exit) campaigners stated that democracy had been enacted, and a democratic decision made. We are told that any sentiment that this vote should be ignored would contravene the very principles of the democracy we live in. A stunningly relevant example of the principles of democracy being employed, and of the principles of democracy being applied as an authority on interpreting the event. However, the key phrase in that sentence was “...the democracy we live in”. Indeed, just what sort of democracy is the UK? What sort of democracies exist in the world today? To understand the answer to these questions we must go through the initial queries posed at the start of this article.

What does democracy mean? Democracy is a form of governance. The word comes from Greek, and as the inception of democracy is generally attributed to the Greeks, we use a derivation of their word for it. Strong evidence exists that it was practised in other parts of the world earlier, but never mind that for now!

“Demos” means “the people”, and “kratos” means “power” or “rule”. As the name suggests, the word covers any system of government whereby the people have the power to rule. While this may seem to cover all systems of government to some, you don’t have to look very far to observe the alternatives. Monarchy, fascism, despotism and military dictatorships all currently exist as alternative forms of government in today’s world. The word democracy itself is not specific about how this form of government works, though. There are many forms of democracy in this world. Perhaps it’s best to reflect on this a little before we continue.

As mentioned earlier, the birthplace of democracy is generally held to be ancient Greece. Among the most powerful states of the ancient world, the Greek city states of the time led the world in science, philosophy and social engineering. We have abundant direct evidence of how they were run. Speeches were transcribed, books written, and laws passed and recorded, serving as detailed guides to their governance processes. In actuality, ancient Greece saw many diverse forms of governance applied. It was in Athens particularly that the principles of democracy were chosen to rule. In Athens, and several other minor city states, the fundamental concepts of democracy were put in place. It is from here, by the gradual development of these basic concepts, that all democratic forms of governance applied in the western world emanated. The central concept of Athenian democracy was that the people would be able to vote on the passing of legislation and executive bills, and that the most popular decision would be enacted. This was in contrast to many other Greek states, where laws would be passed by a ruler, sometimes with input from advisors and sometimes on that person's will and whim alone.



However, there were conditions to Athenian democracy. Most importantly, only men could vote. Adult men, specifically, and they had to own land, and they could not be slaves. In some cases, only men who had completed military training could vote. In fact, universal suffrage would not be seen in Athens until 1952, and by then the mechanisms of Greek democracy would have changed significantly, as we will see. Indeed, democracy, as it is commonly held in the west, was first implemented under conditions that disenfranchised between 80 and 90% of the total population, with between 30,000 and 50,000 Athenians eligible to vote out of a total population of between 250,000 and 300,000. Men held power over women, as they would do until modern times, by systematic means. Lack of effective birth control, physical strength, cultural dogma and all of the methods we are familiar with today were in effect in ancient Athens too. To empower slaves with the right to vote would of course have undermined the very roots of the society. Naturally, children were held to be too immature to cast a vote, and those without land could be said (by those in power) to have no stake in society. Naturally, once an individual has something, that person then has something to lose. If that something is land, then so be it; it still forms an effective means of obtaining obedience. Voting was only for the obedient.
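Those proportions follow directly from the figures just quoted; a quick illustrative check (using the rough population ranges cited above):

```python
# Sanity check of the disenfranchisement figures quoted above, using the
# rough population ranges cited for classical Athens.
voters_low, voters_high = 30_000, 50_000
pop_low, pop_high = 250_000, 300_000

best_case = voters_high / pop_low    # most inclusive combination
worst_case = voters_low / pop_high   # least inclusive combination

print(f"Enfranchised: {worst_case:.0%} to {best_case:.0%}")
print(f"Disenfranchised: {1 - best_case:.0%} to {1 - worst_case:.0%}")
# Roughly 80-90% of the population had no vote.
```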

So, democracy in ancient Greece was not at all the rule, and where it was the rule, it only loosely resembled democracy as we imagine it today. Nevertheless, casting a moral judgement is not appropriate when we have had over 2000 more years of history to arrive at our system of values (see footnote below). Let us not forget also that the Greek city states were under constant threat from outside aggressors. This was a system of government that needed to operate in a different time, with different technologies, under different conditions and with far more limited populations than we can easily appreciate today. In addition to the first functional democracy to be tested under fire, ancient Greece gave the world the foundations of western philosophy and the scientific method. The contributions made by its society to human progress are beyond measure. But democracy as it was understood in ancient Athens, enacted today, would reserve voting rights only for wealthy men. This is not the democracy that people hail when they encourage us to mandate wars and rush to the polls to support them. So what happened next?

Well, first let’s take a moment to examine the principles of democracy now that we have an example to work with. How does democracy work? In Athens it was enough that people could have a say in state decision making and law. Just having a say was novel enough. However, over the course of time the definition has evolved to more firmly define what a democracy is. Currently it is held that a democracy provides for the following:

  1. Elections to determine acts of law and governance
  2. Participation of the citizens in these elections
  3. Rule of law to protect the rights of these citizens to participate in these elections

Note that point 2 does not explicitly define how citizens may vote on acts of law or governance. At a basic level, that is it. Within this framework, many democratic forms of governance have been devised, a republic being the most noteworthy and the one most often put forward as a model of the democratic system. But then, how does a republic work?



A republic is a form of representative government. That means that representatives of the public enact laws and execute governance on behalf of the people. Provided that these representatives are elected by the citizenry, and that these elections are protected as free and fair by the rule of law, the government can rightly call itself democratic. That is not to say that it is a pure democracy. The voice of an individual voter may not be represented when the final decision is made in government. It means only that those elected by the people to represent them are given a fair chance to vote for legislation on their behalf. In principle this may appear to be a step away from the democratic principles ushered in by the ancient Greeks. After all, wasn’t democracy implemented to avoid a system whereby decisions are taken away from the public? The answer is yes, it was. This form of government has arisen as a pragmatic means to preserve the power of the people, or some may say the appearance of popular power, against the increasingly complex role of government. Both interpretations of republican principle can be observed by looking at recent history.

Firstly, let’s look at the United Kingdom in the year 1860. Why not, it’s as good a year as any. In 1860 the UK controlled a vast empire across the world: 460 million people and 33 million square kilometres. That's roughly one person in every five, and about one square kilometre in every sixteen on Earth. The UK at the time was a constitutional monarchy, meaning that the country had a head of state, being a king or queen, and a Parliament responsible for national government, with Parliament split into two houses, the House of Commons and the House of Lords. Now, it is interesting to note that the basic structure of this system of government has not changed since 1860, though the world at large has.

Citizens of the UK can (and could) vote for a local representative in the general election. This representative would hold a seat in the House of Commons and represent the best interests of their constituents. Leadership of a political party was decided by a vote among the members of that party. The choice of prime minister is influenced by the voter only inasmuch as that person can vote for a local representative. The political party in power in the House of Commons is whichever one has the majority of seats, and the leader of that party is the one with the most support from its members. The House of Lords is not democratically elected. Membership of the House of Lords is by appointment only, with appointments made by the King or Queen and by the House of Lords Appointments Commission. There is also representation from the Church of England and certain Lords with hereditary privileges. The House of Lords has the power to amend bills passed by the House of Commons before they enter into law.

So the citizen voter of the UK can elect a local official, who can occupy a seat in parliament alongside 649 others. That representative can represent (or not) the interests of any individual voter in a vote amongst the House of Commons. The unelected House of Lords can then modify that bill before it is passed. Subjects of the empire who were not UK citizens had no vote at all. This system of government has been widely accepted as, and meets the definition of, a democracy, yet under this governance an enduring class system was maintained that essentially allowed for the exploitation of the vast majority of its citizens and subjects. It was a system maintained by keeping the authority to pass a specific law largely independent of the voting power held by the citizenry. A divide which exists to the present day.

On the other hand, under a republic the requirement for the public to be well informed on the case specifics of each and every act of law or governance is relieved. In the days of ancient Athens, this might not have seemed necessary. However, the passage of time and the development of science, technology, and social and economic theory have led to increasingly complex issues falling under the jurisdiction of government. For example, laws pertaining to the funding of science and technology require an understanding of the long-term development plans of the nation. These in turn require an understanding of the state of technology and the general long-term ambitions of the scientific community across the world. As a specific example, should the UK have invested in fibre optic infrastructure in the year 1993? Few members of the public would have said yes at the time, but one would hope that a well informed civil servant would be in a position to make such forward-looking judgements. The same principle of informed decision making applies in all aspects of governance. A representative form of government allows the voter to focus on their own goals in life while allowing dedicated civil servants to do the necessary research to make informed decisions on their behalf.

The trend towards greater “people” power can only serve the good of society as a whole so long as the voting public are well enough informed on the subject they will be voting on. It seems that we have long passed the threshold where that is possible. Sir Isaac Newton famously studied only six books on mathematics prior to developing differential calculus. The sum of just six books turned an intelligent man into the greatest mathematician the world had ever seen. Today, the sum total of Isaac Newton’s contributions is taught to children studying maths by the time they are 16. Today, becoming the world’s greatest mathematician would require at least an advanced degree and a doctorate in some field of mathematics, not to mention decades of intensive study. Even becoming a subject matter expert would demand an investment of at least ten years of someone’s life. At that point they would perhaps be fit to make decisions on behalf of a society on that one subject, but what about everything else?

Moving in the opposite direction for a moment, we can consider the alternative: a society where the public are completely deprived of their voting rights. That is to say, an undemocratic society. Putting aside motivations towards petty-minded gain (something that no society has yet managed to do), we can hypothesise that this society would somehow maintain a clear direction towards raising the standard of living and level of knowledge possessed by all its citizens across the board. Such a society could draw upon an educated mass of people, the very best of whom would be trusted to make decisions on behalf of the others within their particular field of expertise. How would that actually work, though? Is it possible for a single choice to be evaluated as the “best” of all available? Basic practical reality informs us that most decision making is about compromise. Situations that offer a clear-cut solution are rare. We would arrive at a situation where decisions are made “for the greater good”, but whereby some aspects of society (be they groups of people, avenues of development or whatever else) are left to suffer as a result. Is that any better than a democracy? Perhaps if such a society had a particular objective in mind, such as rapid adoption of a new technology (industrialisation, say, or space travel), then a set of single-minded decision makers could go about reaching that end with minimal deviation from course. In fact, history has presented us with examples of such societies, and the price paid in terms of humanity has been both unacceptable and irreconcilable with the ends. It is interesting to note that the empires of the past often occupied territories in ways that essentially put in place such forms of governance, the ends favouring the empire and the price being paid by those occupied.

What about something between the two? A society where democratic privilege is restricted to those that possess the informed decision-making ability to exercise it for the greater good. This too is not a new idea. This form of elitism was a popular line of thinking in the first part of the 20th century. However, the shortcomings of such a system become clear very quickly. Who should decide who is worthy to make decisions, and how would that person use this power? Even if we could somehow produce people free from the petty weaknesses that afflict us all, how can the ability to make good decisions even be measured? Many judged Galileo and Tesla to be mad, but time has proven them to have been years ahead of their critics. Henry Ford reputedly captured the flaw in this system perfectly when he said “If I had asked people what they wanted, they would have said a faster horse.”



The lack of a clear right and wrong goes against the basic human compulsion to force decision making into a (false) dichotomy. It seems evident that only by a measured examination of current conditions can a decision be made on an informed basis. This decision may well be reflected upon poorly in the future, and may not align well with a moral compass inherited from the past. But only by freeing ourselves from dogma are we able to effect meaningful change. Forms of governance that rigidly confine the framework of what is acceptable and what is not seem to fare poorly over time. Alternatively, forms of governance that offer the broadest possible terms of citizenship and allow socioeconomic U-turns tend to last, albeit bearing little resemblance to their earlier states. From this it would be fair to say that progress is a trial and error process. By freeing ourselves to make mistakes, we are able to better inform ourselves. Change comes gradually, and without an end in sight. Progress is often brought about by slow, meandering steps towards a broadly accepted solution. Greece, having finally granted universal suffrage in 1952, is a handy example.

If we are to accept these facts as true, and history certainly seems to support that notion, then the mandate of a democratically elected leader must be to accept the responsibility to make decisions on behalf of the people, and to accept responsibility when those decisions are wrong. A democracy must allow its citizens the right to criticise their leadership, and the right of the leader and the people to make mistakes must be protected. The people must be well informed enough to make informed decisions about who shall represent them, and the government must have access to people well informed in every possible field. Most importantly, citizenship should be defined as broadly as possible, or else it shall sow the seeds of tomorrow's revolution among those it chooses not to recognise. Can any one country claim to have such a system in place? What would a world like this look like?


FOOTNOTES:
1. While the above statements regarding the application of democratic principles in ancient Athens are factually correct within the context of how suffrage evolved over time, they should not be taken to mean that Athens in its classical period was not a democratic state. Many books can be (and have been) written on the subject of Athenian democracy, but here it will suffice to say the following. At the start of its classical period, Athens was seen as a radical, extremist state because of its democratic principles. Democracy was seen as a danger to the established order of things, not only by other Hellenic states and their neighbours, but by some of Athens' own, and most esteemed, citizens. The Athenian state empowered the many when the whole world said it was wrong to do so, and protected and cultivated the fledgling democratic idea when it was most vulnerable. What followed from that point is the subject of all western philosophical thought. Over two millennia later the idea of democracy has become an accepted, and even desirable, form of governance, and yet the average citizen living in a democratic country today wields less influence over, and access to, the management of their state affairs than an Athenian would have enjoyed in 480 BC.

Tuesday 12 July 2016

Can a Paleo Diet Really be a Good Thing?


Recent times have seen a surge in popularity of the so-called “Paleo Diets”. These dietary guidelines are based on the premise that the human race has not yet developed the adaptations necessary for refined grains, or indeed any form of agriculturally reared crop, to be efficiently consumed as a food source. Advocates of these dietary regimes claim that consumption of such food items is linked to obesity and other diseases, and that only by reverting to pre-agricultural dietary regimes can we hope to avoid the diseases of the modern world.

It is an interesting thought. Advocates argue that the human race has developed a digestive system over the course of 6 million years or so, from the last common ancestor we share with the great apes, and that the advent of agriculture is essentially a disruptive force. Agriculture, they hypothesise, has resulted in us depending upon a food source that we are not equipped to consume. 

Two main criticisms can be levelled against these arguments. The first is that evolution is actually driven by disruptive events. The introduction of a new food source would provide evolutionary pressure towards a digestive system that could manage agriculturally sourced foods, if one were not already present. By avoiding such foods in the diet, the dieters would essentially be isolating themselves from this evolutionary pressure.

The second major criticism is that the Paleo Diet hypothesis ignores the fact that human civilisation only really came into being as a direct result of the development of agriculture. For the first couple of hundred thousand years that Homo sapiens were around (and for some two million years of the genus Homo before that), we existed as hunter gatherers, spending each moment of our lives either sleeping, eating, hunting, gathering or procreating. Agriculture was the breakthrough that first allowed Homo sapiens the luxury of plentiful food. This in turn allowed the highly developed human mind to be set free for the first time. For the first time we were able to apply our imagination, intellect, creativity and powers of reason to things other than the pursuit of food. Granted, agriculture must have been the first of these great breakthroughs, and so must itself have arisen from a food surplus generated by other means, combined, perhaps, with a moment of inspiration and good fortune. However, once this milestone was met, great advances came quickly. Astronomy, writing, mathematics, pottery and a series of other seminal leaps forward came in an avalanche beginning around 10,000 years ago, at the start of the agricultural era.


Placed into context, the human species survived without many significant advances from about 2,000,000 years ago until around 8,000 BCE. While we have no evidence of written language from before 4000 BCE, written language being the key advance that divides what we call history from prehistory, we do know from the dating of remains that early humans learnt to use stone tools and fire between 3 million and 1.5 million years ago. We also find evidence of cave paintings and burial rituals as the first modern humans enter the fossil record, within the last 100,000 years. All of these things point towards the development of cultures. Learning was achieved by one generation and passed down to the next, perhaps by some form of oral tradition, perhaps by example.

Then comes the Neolithic revolution, the “new stone age”. Here we have the development of agriculture. All of a sudden, food supplies become a matter of planning rather than a dependence upon chance and skill. Central food reserves become possible, and as a result, human beings begin to live in larger communities. There is a greater sharing of knowledge and information. Skills are born and refined out of this increased interaction. We learn to communicate more effectively, and develop written languages. We learn to navigate from place to place using the stars. An understanding of astronomy demands a basic understanding of mathematics. Number systems are developed, allowing the first clear examples of an abstract philosophy. The journey of our species begins to pick up speed at this point. The human population, which had remained at a steady 3-5 million individuals for over 50,000 years, suddenly jumps to over 100 million in just 8,000 years. Clear evidence of the benefits of an agricultural food source.
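To put that jump in perspective, here is a rough calculation of the implied average growth rate, using the mid-range figures quoted above:

```python
# Rough check of the population jump described above: from ~4 million
# (mid-range of the 3-5 million figure) to ~100 million over ~8,000 years.
start, end, years = 4e6, 100e6, 8000

annual_rate = (end / start) ** (1 / years) - 1
print(f"Implied average growth: {annual_rate * 100:.3f}% per year")
# ~0.040% per year: tiny by modern standards, yet revolutionary against
# a baseline of roughly zero over the preceding 50,000 years.
```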


Looking back, it’s not hard to argue that the move away from a “Paleo Diet” was what took the human race off its knees and put us firmly in charge of our environment. Every great advance since has been built upon the social and scientific developments that arose directly as a result of agriculture. To this day it remains one of the central occupations of the world's human population.

While there is truth to the idea that only with plentiful resources are problems such as obesity and type 2 diabetes possible, it is also true that these diseases become apparent only where famine, hunger and malnutrition have retreated. All who live are destined to die; a statistical analysis will, of course, show a 1:1 ratio between births and deaths. Observing 20th century trends in causes of death, it does not seem reasonable to attribute them, by direct causality, to an event that occurred 10,000 years (or 400 generations) ago. While it is true that 400 generations may not be enough to witness significant genetic change (though the domestication of animals shows clear evidence to the contrary), it seems unreasonable to assume that the development of agriculture had any disruptive effects, save for those that were overwhelmingly beneficial to the human species.

Put another way, humankind struggled for well over a million years (by any estimate) to break free of the shackles of hunting and gathering. When those chains were finally broken, we rose up and separated ourselves from all that had ever lived on Earth before, by becoming a highly technological civilisation. We rose quickly, and we continue to rise quickly. What has become evident in recent times is that the speed at which we are able to assimilate cultural and social change is not as swift as the rate at which we are able to effect it with our technology. Perhaps this is the mirror in which we should reflect upon ideas such as the Paleo Diet. The instinct to resist change. To disagree with new thinking and claim old ideas as an authority in a world that is moving too quickly for us to feel any sense of control. The desire to find solutions in ideas simple enough for us to understand easily. The basic need to feel relevant. Is this not the driving force behind so much of what gains popular appeal in today’s world? Such is my own dietary recommendation for thought.

Saturday 25 June 2016

Were We the First Civilisation on Earth?

Our civilisation rose from the species that is the common ancestor of both humans and chimpanzees, which lived just 6-8 million years ago. Once agriculture was developed, and we could set our minds to things other than hunting, it took only 10,000 years for us to make the world we see around us today. A total of just 8 million years separates a creature we would recognise as completely lacking sophisticated technology and culture from a species with high speed internet connections linking almost every home in the world. The history of life on Earth stretches back over 4 billion years. Is it possible that somewhere in that vast stretch of prehistory, there was another 10 million year period in which some other life form developed a sophisticated civilisation such as ours?

Were we the first to claim the refinements of higher culture? (arms not to scale)
It is commonly held, and not without good reason, that Homo sapiens are the architects of the first advanced civilisation that planet Earth has ever seen. We hold that we were the first to make tools from metal, the first to master agriculture and animal husbandry, the first to harness the power of atoms and the first to reach into space. It is a reasonable assumption to make. We certainly have no evidence of any other civilisation before ours. The human brain appears to have developed the sophistication required for civilisation only very recently (in planetary terms). Indeed, if there was an advanced civilisation before ours on Earth, would they not have influenced some aspect of the world for us to see today?

That is an important question. The ultimate hallmark of our civilisation has been our ability to shape the world around us. To alter our planetary signature. The evidence of our presence is very obvious when the Earth is observed from space. The gases in our atmosphere, the distribution of vegetation on the planet, the emission of highly ordered EM radiation from our planet’s surface. These indicators are so obviously the result of an advanced civilisation that they have been used as the basis of the methodology we use when looking for life on other worlds.

The Galileo space probe was launched in October 1989 and, on its way to Jupiter, performed a curious experiment. It used its sensors to scan for evidence of life, on Earth. The idea was that if we are going to use spectral analysis of a planet as a test for the presence of life, we should first calibrate that test to ensure it works. The calibration was performed and we got our first baseline set of data. We could now go about searching for similar signs of thermodynamic disequilibrium, gaseous oxygen and high absorption of red light in alien atmospheres on our quest to find another pale blue dot out there in space.

Spacecraft Galileo had one last job to do before it left on its way to Jupiter
The experimental basis is a sound one. We know of only one planet with life on it, and it forms our only frame of reference as we search for another. That is, for now at least. Such an experiment naturally looks out into the depths of space. Across distances so vast that even light takes millions of years to reach us. Such an experiment seeks to learn by observing the past. The farther away we look, the farther back we are seeing.

Here on Earth we are limited in our ability to look back into time. The light that left our world millions of years ago is forever lost to us. We can go only by the evidence that we have in our possession. But what is that evidence, and what does it tell us? We can approach this problem from another angle. Let us hypothesise for a moment that there had been an advanced technological civilisation here on this planet before us. How would we know about it today?

The most obvious suggestion would be fossil evidence. There might be fossils showing all of the adaptations we humans have come to rely upon: opposable digits, large brains, binocular vision, bipedal stance. There might also be evidence of organised burial rituals (assuming they can be considered a universal indicator of higher culture). Such forms of evidence are highly subjective, and anthropomorphic by even the most lenient standards. But just how much of a reflection of the past is the fossil record in the first place?

After all, we know that Tyrannosaurus rex existed, but fewer than 50 skeletons have ever been found, and none of those are even complete. The key thing to note is that for fossilisation to occur, a very specific sequence of events must take place. For example, sediment must cover the remains of whatever died, or the body will decompose and be lost forever. Today, we know that T. rex existed only because, of all the thousands, perhaps millions of T. rex that existed, fifty happened to fall dead near the water’s edge, were covered by sediment, and were one day recovered by human hands. The simple fact is that fossilisation is actually quite a rare phenomenon. There are aspects of the process that bias it towards certain life forms too, whether due to the makeup of their bodies, the location in which they lived or the environment in which they lived. Conventional wisdom holds that only around 2% of the species that have ever lived have been fossilised in some form, somewhere on Earth. This leaves a large percentage that we will simply never know about. Let’s also not forget that we have assumed this civilisation didn’t tend towards cremation or air burial of their dead for whatever reason. Such activities would of course completely eliminate any possibility of fossilisation. An otherwise empty cup of water drawn from a great ocean does not assure us that the ocean itself is empty.


Calcium-based structures lend themselves better to fossilisation
What about the Galileo test: atmospheric signatures created by industrial activity? Granted, human beings were civilised for about 10,000 years before we were able to significantly alter the atmospheric makeup of the planet. However, an atmospheric signature would meet the standard of evidence we have set in the search elsewhere in the universe. It would imply a thriving, long lasting civilisation. But how long after the fact could we expect this evidence to linger? Most greenhouse gases are thought to survive no more than a few centuries in our atmosphere (a few industrial fluorinated gases persist for millennia), slowly succumbing to the breakdown of their molecular structure by the sun’s rays. Thermal equilibrium is also restored within a few hundred thousand years following even the most extremely disruptive events, as is evidenced by geological records following the Yucatan meteor impact of some 66 million years ago. By most estimates, if the human race were to disappear today, our planetary signature would be gone within 100,000 years. As such, we cannot rely on this to provide evidence of our existence to future civilisations beyond that.

Monuments. Whether they are simple campfires, tools or huge skyscrapers, the human race has left monuments to its presence all over the planet. Surely this is another piece of evidence that we could rely upon to tell us if we were the first to civilise on this planet? We have built monuments of stone and corrosion resistant alloy steel. Great dams and bridges, roads, settlements and highly ordered distributions of waste material. These might all be impressive achievements to some, but in terms of longevity they leave us wanting. Looking across the abandoned buildings of the once great Detroit automotive industry, it is very clear how quickly vegetation reclaims the land. Within the space of just a decade, the structures are at serious risk of total collapse. Plant roots and weather erosion are reducing reinforced concrete to rubble at an alarming rate. Within 50 years little will be left beyond piles of rubble, and within 1000 years we can expect even the rubble to have been overrun by nature. The same basic forces work away at all of our monuments. The great pyramid complex of Giza has stood for 45 centuries, and stands today. The dry desert environment has limited the effects of water erosion to some extent. However, even this mighty monument of human achievement is not immune to the ravages of time, as can clearly be seen. Stone monuments such as the pyramids and Stonehenge are more resistant to the effects of neglect than most modern materials, such as steel. In essence they are formed of a more stable composition of matter. But given a few hundred thousand years, even stones will turn to gravel and then, eventually, to dust. The finest stainless steels, the miracle materials of the modern age, while strong and light, are not totally immune to corrosion. In even the most favourable of conditions, and considering the finest of steels, a millimetre is still lost to corrosion over the course of a thousand years. This means that if the pyramids had been made from two-inch stainless steel plate, and maintained as well as the stone pyramids have been, they would cease to exist within just 50,000 years.
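The arithmetic behind that last figure is easy to check (using the favourable-conditions corrosion rate quoted above):

```python
# Quick check of the corrosion arithmetic above: a plate losing ~1 mm of
# thickness per thousand years survives only as long as it has metal left.
plate_thickness_mm = 2 * 25.4     # a two-inch plate, in millimetres
corrosion_mm_per_year = 1 / 1000  # ~1 mm lost per thousand years

lifetime_years = plate_thickness_mm / corrosion_mm_per_year
print(f"Plate survives roughly {lifetime_years:,.0f} years")  # ~50,800
```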

Then there is the matter of the canvas itself. Over the course of millions of years, tectonic plates shift land masses across the globe. The Earth’s crust is constantly being consumed in some areas, and created anew in others. Seabeds become mountain tops, and great plains find their way to the bottom of oceans. The effects of the natural processes of our planet can eliminate not only the monuments we create, but can absolutely consume the very ground they were built upon.

In fact, therein lies the key factor. Assuming that a previous civilisation had existed, we would be hard pressed to find any evidence of them here on Earth at all. Between the various difficulties in creating enduring evidence and the highly volatile nature of the planet's surface, we cannot reasonably expect to find evidence today, even if it had once existed, except by an extraordinary stroke of luck. The 98% of dinosaur species that left no fossils speak to that effect. An observation of our own development also seems to indicate that upon reaching a certain technological level, a civilisation might move towards a more sustainable and ecologically sound existence, the very nature of which would serve to lighten its footprints. This significantly curtails the window of time in which a civilisation might leave lasting traces of itself.

Indeed, if we were to take our own example and look forward to what we might leave behind, then there are precious few achievements that can hope to carry our torch into the distant future. There are the Voyager and Pioneer space probes, the Voyagers each carrying a golden record and the Pioneers a plaque, ensuring that the words of our people and the music of Chuck Berry and JS Bach (amongst others) will outlast even our solar system. There is the ever so faint halo of radio waves we emanate, for a brief moment before we transition to more efficient communication technologies. Both the probes and the waves would not be available to future Earthbound civilisations, but might one day hope to meet scientists from other worlds.

The one nearby piece of lasting evidence of our existence would seem to be the small flags, footprints, landers and an electric car left on the Moon by the space programmes of our time. The Moon is free of weather and tectonic effects. While moonquakes do occur, they are very weak and pose little threat to the items we have left on its surface. The lack of atmosphere makes the surface of the moon an inert place. Indeed, we can reasonably assume our landers will remain intact until the sun transitions into a red giant, billions of years from now. Perhaps quite aptly, the most enduring evidence of our existence is likely to be our crowning achievement as a civilisation. However, the surface of the moon, while mapped, has not been explored in detail. While it is reasonable to assume that there is nothing there to be found, it is here that we are most likely to find evidence of any prior Earth civilisation, provided they managed to attain the same heights we had by the late 1960s.

Footprint on the Moon
The argument that we are the first because we can see no evidence of any others sounds remarkably similar to the argument that we are the only sentient life forms because we can see no others. The natural egocentric persuasions of the human mind lend themselves well to this type of thinking. However, it was proven that we are not the centre of the solar system, and it was proven that we are not the centre of the universe. Even when the evidence was in hand, it took many years to convince the world at large of these facts. This should be the most telling example of our prejudices with regard to any information that demotes our position in this world. I propose that we may not even hold the title of first and foremost on our very own soil. I can offer no evidence, and set forward only a hypothesis. One that can be tested, and certainly will be over the coming years. Until then, I can’t help but be curious about what might have happened on this planet before we arrived, and what might have happened such that we were given our chance on this Earth. If nothing else, it makes for great fantasy.

Saturday 18 June 2016

Could We Have Got to Mars in 1985?


In 1961 John F Kennedy committed the United States to landing a man on the moon, and returning him safely, before the decade was out. Four years earlier, on October 4th 1957, the Soviet Union had launched the first artificial satellite into space. The first powered and controlled flight by a human being had occurred just 54 years earlier, on December 17th 1903. The human race had gone from a twelve second flight over the hills of Kitty Hawk to landing upon another world within the span of a single lifetime.

However, soon after the Apollo Moon landings, political sentiment and support fell away from the space race. It's easy to look back and dismiss the next steps on our journey into space as being exponentially more difficult, or even impossible within that era. Political leaders have said that landings on Mars were beyond our abilities, beyond our resources, and beyond their scope for the last 40 years. This is usually accepted as a matter of fact, given that these and similar statements have been repeated for so long and by so many. It's also not hard to rationalise why. After all, there are countless hungry and dying in this world. People who would benefit infinitely more from a simple vaccination, or a course of antibiotics, than from the exploration of outer space. Certainly, if we had devoted equal resources to these endeavours for the last forty years, we could judge the course of our journey as a species to be a most noble one, and the decision to delay a Mars landing quite justified. However, history did not unfold that way. While many a great effort was made, the progress of mankind did not follow the curve we would have extrapolated from our history in 1969. Cold wars, jungle wars, oil wars, star wars and wars on terror have occupied taxpayer resources for the last five decades. Tremendous advances have been made in technology and social development, and yet the gap between rich and poor has continued to widen in the western world since the industrial revolution. Avarice and paranoia dominate political thinking, and mass media is used to re-evaluate the truth of ideas again and again, as required, leaving little room for scientific endeavour and human progress. Nations, we are told, can no longer afford such pursuits. So, how far could we have gone had we not turned down the gas on the space race, what would it have meant, and what would it have cost?

Just 66 years separates these two events
To gain perspective, we must first go back in time to 1961 and establish a context for what had just been achieved, and what was to be achieved in the next decade. Peace was no more a reality in 1961 than it is today. Two world wars had left Europe in ruins. The great empires of the 19th century were left all but bankrupt. Nuclear weapons and brinkmanship had left the world closer than ever to an apocalypse. News media had vastly gained in reach, and events from around the world were being shared by all people, everywhere. During World War II, mankind had developed the first sub-orbital rockets, carrying nothing more than bombs. Soon after the war had drawn to a close, the prospect of a new, and much greater, “nuclear war” threatened.

Just as the great empires of the 18th and 19th centuries had been those that controlled the terrestrial oceans, so it was believed that in the 20th and 21st centuries, control of space would be the ultimate strategic possession. The quest for mastery over the new ocean spurred on the new superpowers of planet Earth. Resources were committed and investments made to gather the greatest minds in the world and set them to unlocking the great new frontier of space. Rocketry had developed to the point where small satellites, and even human beings, could be launched into low Earth orbit, 170 miles above the Earth's surface.

The USSR had struck the first two milestones in this race into space: the first satellite, Sputnik 1, and the first astronaut (or cosmonaut), Yuri Gagarin. From this point forward the United States needed, and demanded, a dramatic surge to come back into the race. America needed an achievement to capture the public imagination. A propaganda victory that would demonstrate the superiority of American democracy over Soviet socialism. On these terms, an objective was sought that could carry the United States into the realm of space as its conqueror. The great minds were conferred with, and they replied that a landing on the moon would be possible within the decade, and that thereafter the rest of the planets would be reached. Here we are then, in 1961.

Wernher von Braun had taken a great interest in space travel from an early age. Looking up to the pioneers of rocketry, Oberth and Goddard, he had studied and built upon their ideas. He joined amateur rocketry clubs in his early 20s, where his knowledge of the subject was noted, and he soon found himself recruited into a German military initiative. By his late twenties he was running a team of over 1000 engineers working on rocket artillery for the German army. At the start of WWII, he began working on rockets for the Nazis, culminating in the development of the V-2, the first sub-orbital rocket system.

Following the war, the United States captured von Braun and his team of engineers as spoils of war, as part of Operation Paperclip. They were brought back to America to work on the Redstone missiles, a short-range ballistic weapon descended directly from the V-2. Von Braun, though, had long hoped that rockets would one day lead mankind into space, and open the doors of the universe. He saw the exploration of space as the natural course of human development. Just as life had once left the oceans, he believed, it would one day leave the Earth. It was he who had campaigned for the United States to initiate a space program, and it was he who had championed a landing on the moon, and Mars, as early as 1948. This is evidenced in his extraordinarily detailed book "The Mars Project", which presents a technical study into the design, construction, fuel load and every other practical detail of a mission to Mars. This was a technical book, not science fiction, and it was written in 1948, just a year after the sound barrier was first broken. His writings show an extraordinary level of understanding with regard to the challenges of operating in space, as later developments in space flight bear testimony.

The Mars Project, written in 1948, published 1953. Cover from 1962 edition shown
Von Braun would spend the 1960s as a director at NASA and chief architect of the Saturn V rocket. In landing on the moon, the NASA engineers had to develop technologies that were just fledgling curiosities at the project's initiation, but are today commonplace in everyday life. These include integrated circuits, flight computers and freeze dried foods, to name just a few. To blaze a trail from the Earth to the Moon, engineers opened the doors to entire fields of science, ones still being explored in the present time. Machinery had to be designed and built to work in an environment that nobody had ever seen or experienced. Machinery that had to work with perfect reliability, in spite of extreme temperatures and an array of unknown variables. To this day the Saturn V remains the only craft ever made to have transported human beings beyond Earth orbit. In 1961, despite the course of events in the 50 years prior, the final achievements of Project Apollo would have seemed impossible to all but a few. However, to those few, the moon was seen as just the beginning. In 1969, this 'impossible' goal was achieved.

Von Braun was one of those who had always believed, and most critically, one of those who had indeed made landing on the moon possible. His 1948 book stated that, without any extraordinary advances in technology, a landing on Mars might be possible within 30-40 years. He twice revised this book within his lifetime, most recently with a new foreword which noted that the science of rocketry and other related technologies had advanced to such a point that his earlier estimates had been rendered quite conservative. He also noted that the discovery of the Van Allen belts, amongst other things, meant that certain new considerations would have to be made that were not present in his original 1948 book.

Von Braun was also co-author of "History of Rocketry and Space Travel", along with Frederick L Ordway III. Ordway was then Assistant Director of the American National Air and Space Museum, and was another of space travel's great visionaries and authors. The 1975 edition of this book discusses, in its preface, the "Space Task Group", an advisory board for the United States' long-term goals in space exploration. The book describes how, in the political climate of the time, the recommendations of the Space Task Group were dismissed amid doubts as to the worth of the Apollo program. Within the advisory board's proposals was a roadmap to land a human being on Mars. The roadmap was deemed feasible, and in fact feasible on a sliding scale of investment. Depending on the investment, we could hope to walk on Mars between 1982 and the early 1990s. As to the cost: $7-10 billion a year for the early option, $4-6 billion a year for the slower paced route. Von Braun died in 1977. With its chief defender gone, the program wound down and any hopes of a manned mission to Mars were delayed indefinitely. The United States defence budget for that year was roughly $100 billion (unadjusted).


Charles Donlan, Robert Gilruth, Maxime Faget and Robert Piland, all of the Space Task Group. August 1960.
Given the scale of the challenges already attempted and overcome by the people involved, it's hard to believe in the impossibility of a program proposed at so modest a relative cost. It is not hard at all to imagine that it was a return to humanity's age-old weaknesses, rather than any technological or financial challenge, that killed off any hope of a Mars landing by 1985.

Now, 30 years on, hopes of a mission to Mars are heard again. This time not just from the United States, but from space agencies around the world, joined by the enormous resources of private industry. In the 21st century, space promises to be big business. Exploration, settlement and even commercial mining are all being discussed in public and private circles. It seems that mankind is once again ready to take another giant leap forward. But it's hard not to think of what might have been, had we taken this step 30 years ago.