Climate Change, Health, and the Anthropocene: Why We Must Study the Past

By Alexander More

Climate Change on a Human Scale

The alarming pace of climate change is cadenced by headlines repeating that this year, this summer, this month was the hottest on record, while polar ice reached its lowest documented extent last year. Twelve U.S. government agencies just released a report declaring in no uncertain terms that human activity has caused global warming and its associated catastrophic weather events. With such urgent concerns about current and future climate change and its impact on human health and survival, what purpose could the study of the past serve?

In one word: perspective.

In order to understand how human populations react to climate change, we must use historical records to anchor scientific climate data in the immediacy of human experience. Scientists often discuss climate change on geological timescales—tens of thousands to millions of years—or geographic scales—continents, hemispheres—that are too remote and too vast to be meaningful to the general public.

If instead we show the impact of climate change on human lives and health in a time scale that is familiar, with detailed descriptions by those who witnessed it, we do two things. We bring to light the experience of past populations, which is writing history. And we explain the present by studying the past, which is, again, writing history. With an eye to the future, we also provide evidence to inform policies that can address and, hopefully, prevent further climate change. This is the objective of the emerging, trans-disciplinary field of planetary health, and it is one of the aims of many scientists and historians who study past climate patterns and crises.[1]

Common objections to evidence of climate change—such as “there is no evidence,” “it’s just a phase,” “it will correct itself,” “a hundred years is not enough,” “the temperature record is not reliable”—dissolve in the face of tens of thousands of testimonies from hundreds of years of history, in consilience—that is, in independent agreement—with ever-expanding libraries of scientific data. The most compelling warnings about the dangers of current and future climate change come from past populations, who have left us descriptions of the less drastic changes that climatic crises wrought upon their lives. Their words contextualize scientific climate data and provide an invaluable point of comparison for the formidable scale of what we are facing today.

 

The Health of the Past as Benchmark for the Present

In using past human experience with environmental change as a point of comparison, we cannot limit ourselves to our most recent history. Think of all of human history as unfolding in a week-long lab experiment. Would a scientist ignore all data collected over the week in favor of only the last hour? Dismissing, or treating superficially, what came before the modern, industrialized world deprives us of invaluable data about past climate and its impact on human survival and adaptation.

Popular media, public policy and even textbooks of history are replete with references to the “pre-industrial world,” often depicted, or rather mythologized, as a mix between an idyllic landscape mostly unspoiled by human activity, and a world where people were too primitive, too busy killing each other, or too sick, to affect the environment in significant ways. All this ended when industry clouded the sky with pollution rising from the smokestacks of modernity (ca. 1800), or so the story goes.

Such descriptions are, of course, wildly inaccurate. Yet they remain pervasive and persuasive, like most myths. Our misunderstanding of the past is often rooted in the terminology we use to describe it, that is, in words that are quite plainly prejudiced and anachronistic, affected by what historians call presentism. Did people in “Antiquity” think of themselves as “ancient”? Would people of the “Middle Ages” be flattered to think of themselves as stuck in a thousand-year period that just happens to fall in between the only times when, ostensibly, the really important events happened? Historians might object that these terms, used technically, do not demean or dismiss, but their popular use persists, perniciously, and so do the myths.

The latest in this series is the new term “Anthropocene,” which scientists and news media increasingly apply to the last fifty years or, at best, the last two hundred years, i.e., the periods in which we can detect a human impact on the environment. Naturally, or not, in order to define what counts as a detectable human impact, we need to find a period in which human activity had seemingly no effect on the environment. This turns out to be much trickier than expected, because humans have been modifying the environment, polluting the air, water and soil, for much longer than the last fifty or even two hundred years. This runs contrary to popular assumptions and to the chronological definitions of the Anthropocene. Just as geologic timescales are too remote or abstract to offer meaningful points of comparison to the public, so too, by adopting a short view of history—the Anthropocene, stretching only as far back as the birth of industry—we fundamentally misrepresent the past, and as a result we misunderstand the present.

 

Air Pollution: The Most Underrated Killer

Pollution is a good example of how superficial assumptions about the past led to fictional history and to misguided policy based on it. Pollution also happens to be one of the greatest threats to human health today. The Lancet Commission on pollution and health recently found that pollution is the largest environmental cause of disease and premature death in the world today, killing some nine million people every year.[2] Air pollution is the biggest culprit.[3]

For decades, policymakers, industry advocates and even some scientists have argued that air pollution levels (especially lead pollution) declined to what could be considered “natural” or pre-industrial levels after the U.S. and most European countries enacted abatement policies in the early 1970s that, among other things, phased out leaded fuel. Recent evidence has shown that, though the clean-air reforms did have a significant effect, a return to “natural” levels of lead air pollution never happened.[4]

In fact, mining and smelting have caused elevated levels of lead in the air for the last two millennia. The only time when lead air pollution declined to natural levels was a five-year period when 40 to 50% of the European population died due to the largest documented pandemic ever to ravage Eurasia, the Black Death. During the pandemic, mining and smelting stopped because demand for lead (as well as some other mined metals) declined, while the labor force was decimated. The evidence in this case comes from a combination of historical, archaeological and scientific climate records, all independently telling the same story.[5]

An ultra-high-resolution ice-core study shows that lead air pollution dropped to undetectable, natural levels only once in the last 2,000 years, during the Black Death (1349-1353). Historical evidence independently confirms this, with records of a Europe-wide cessation of lead mining and smelting. This type of evidence shows that the “Anthropocene”—the time in which we can detect a significant human impact on the environment—started much earlier than the Industrial Revolution (ca. 1800). (Credit: Alexander More – GeoHealth, 1.4).

Incidentally, these findings support decades of medical and toxicological literature calling for ever-lower lead pollution levels. Despite such studies, lead smelters emitting as much as thirty tons of lead into the air each year continued to operate in the U.S. until 2013.[6] The cumulative environmental and public health damages of the last such smelter were settled with the U.S. Justice Department for $83 million.[7] Meanwhile, a growing body of literature links lead exposure to peaks in crime rates, plummeting fertility and serious neurological deficits.[8]

By underestimating pre-industrial lead pollution, policymakers, industry and some scientists continued to assume that modern smelters did not significantly change the “natural levels” of lead air pollution detectable today. The unfounded assumption that the pre-industrial world produced no significant pollution led them to underestimate current pollution levels for the last fifty years. “Pre-industrial” became conflated with “natural,” and we endured, and continue to endure, higher levels of lead pollution because of it, particularly in developing countries. Here lies a clear example of why we must understand and give due weight to history in order to understand our present moment and craft meaningful policies for the future. If Anthropocene is to be the name we give to our history of polluting the world, the term cannot refer to the last fifty or two hundred years. Try two thousand years.

What does all of this have to do with climate? Atmospheric circulation (prevailing wind patterns) moves pollution around for hundreds, sometimes thousands of miles. It does today, as it did in the past. Wind transports pollutants over time, depositing them in the soil, in lakes, in the ocean and on glaciers.[9] Deposited pollutants form layers that can be dated with ever-increasing accuracy. This is how we can detect and study pollution hundreds of miles from its origin, over thousands of years. Lead extracted from the earth by mining and aerosolized by smelting, or by engines using leaded fuel for decades, eventually deposits in the soil and remains a hazard. Today, due to climate change, we face rising temperatures and recurring long-term droughts.[10] Droughts cause the soil to dry up, while winds pick up the resulting dust, including lead and other heavy metals, and return them to the atmosphere, creating peaks of metal and dust pollution. The intensification of wildfires, due to climate change, also affects air quality, causing significant increases in heart and lung disease.[11]
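The logic of reading such a dated record is simple enough to sketch in code. The example below is purely illustrative: it uses synthetic data, an assumed detection limit, and a hypothetical function name, not the actual analysis pipeline behind the ice-core studies cited here. Given annual, dated lead concentrations, it looks for multi-year windows in which lead stays at or below the detection limit, the kind of “natural” baseline that, in the real record, appears only during the Black Death.

```python
# Illustrative sketch only: synthetic data and an assumed detection limit,
# not the actual method of the ice-core studies cited in this essay.
import numpy as np

# Hypothetical dated record: one lead concentration (ng/g) per annual layer, 1 CE to 2000 CE.
years = np.arange(1, 2001)
lead = 1.5 + np.random.default_rng(0).gamma(2.0, 0.5, years.size)  # synthetic values, well above the limit

DETECTION_LIMIT = 0.2  # ng/g, an assumed instrument threshold


def natural_baseline_windows(years, lead, limit=DETECTION_LIMIT, min_years=3):
    """Return (start_year, end_year) spans where lead stays at or below the limit for min_years or more."""
    spans, start = [], None
    for year, value in zip(years, lead):
        if value <= limit and start is None:
            start = year                      # a candidate "natural" window opens
        elif value > limit and start is not None:
            if year - start >= min_years:     # window was long enough to count
                spans.append((start, year - 1))
            start = None
    if start is not None and years[-1] - start + 1 >= min_years:
        spans.append((start, years[-1]))      # window runs to the end of the record
    return spans


# For this synthetic record the list is empty; in the real data the only such
# window falls during the Black Death, roughly 1349-1353.
print(natural_baseline_windows(years, lead))
```

The point of the sketch is simply that, once layers are dated, a claim such as “pollution returned to natural levels after the 1970s” becomes something to test against the record rather than an assumption about the pre-industrial baseline.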

The need to understand these patterns over time could not be greater. Scientific analyses, including the now-ubiquitous climate models, are all based on historical information. Such information is subject to context and contingencies that historians are trained to analyze meticulously. Collaborations between historians and scientists studying climate and the environment are therefore crucial to our understanding of the past and the present, as well as to the design of future climate policy.

 

Climate Changes Health

This year, the American Public Health Association appropriately adopted “climate changes health” as the motto for its outreach campaigns and its annual meeting. Climate, and its abrupt changes, affect all life on this planet. For recent examples we need only turn to the devastating effects of multiple hurricanes in Puerto Rico, the Caribbean and the Gulf of Mexico this past fall, which left populations struggling to maintain uninterrupted access to food and water. More distant, but equally disastrous, climate crises have caused famines and massive population movements in recent years. The question emerges again: why study similar crises in history?

A salient historical example comes from the “Great Famine,” a period of climatic crises that caused widespread harvest failures and food shortages, usually dated between 1315 and 1322.[12] Recent studies combining historical and scientific climate data have found that the famine was much more extensive than previously hypothesized, both chronologically and geographically.[13] Collections of written sources along with ice-core and tree-ring records now show that the Great Famine was the harbinger of nearly five centuries of lower temperatures and harsher winters. We call this subsequent period the “Little Ice Age.”[14] Looking at the present from the vantage point of the past, we can start to think about what the next centuries will look like, after several years of record temperatures and abrupt climate shifts.

The Great Famine (1315-1323 C.E. and later) engulfed all of Europe with harvest failures. It was the harbinger of nearly five centuries of severe climate deterioration, known as the “Little Ice Age.”

During the Great Famine, some of the communities affected most severely by harvest failures were the inhabitants of islands, which lacked extensive farmland and were thus dependent on what could be sourced locally. When local farms failed, so did the food supply. Perhaps the most celebrated example of an island nation in this period is the Republic of Venice, which was also one of the most densely populated cities of the pre-industrial world. Because of the lack of extensive agricultural land, the Venetians had to import everything they consumed. Due to its unique geographic location, ever-growing population, and commercial dominance in the Mediterranean, this early republic was able to harness established trade networks dynamically to continue feeding its citizens.[15]

In the face of widespread harvest failures during the Great Famine, Venice’s extensive diplomatic network allowed it to send its fleet to areas where food (mostly grain) remained available. Once food reached Venetian cities, strict regulations controlling inflation and quality ensured that the public was able to purchase it at affordable prices.[16] Today, as we face widespread harvest failures due to climate change, food security is a major concern, especially in developing countries which, not unlike pre-modern societies, rely almost exclusively on local agriculture for their survival.

Harvest failures have already caused mass migrations and widespread malnutrition in the Middle East and North Africa.[17] It is jarring to think that some pre-industrial societies were able to withstand severe climate crises—thanks to extensive food networks and government infrastructures—while some modern states, including the United States, cannot provide food, clean water and medical care to an island territory. These are radically different contexts, to be sure, but it is helpful to ask what, logistically, the Venetian navy could achieve in its vast island territories that modern American government resources could not, in the case of Puerto Rico and the U.S. Virgin Islands, for instance.

 

Climate Changes Disease

Plague-infected rat flea (Xenopsylla cheopis; Centers for Disease Control and Prevention). Rat fleas have been a major disease vector. Their diffusion is deeply affected by climate change.

A last example brings us back to the Great Famine and what followed it. Decades of increased precipitation, during the Great Famine and afterward, altered the environment, caused malnutrition and weakened the population of Eurasia on the eve of the greatest pandemic in documented history, the Black Death (1347-53 and later waves of Yersinia pestis). New interdisciplinary studies combining historical evidence with ancient pathogen DNA and epidemiological studies of disease vectors have shown that the climatic changes in the years preceding the Black Death likely exacerbated the pandemic’s mortality and rate of infection.[18] A weakened population was more susceptible to plague contagion. In addition, climate-driven changes in the environment created the perfect conditions for rats and fleas—the plague’s main vectors—to reproduce at much higher rates, resulting in a perfect storm that wiped out half of the population of Europe (and beyond) and returned in subsequent waves for centuries thereafter.

Could this possibly happen today? It already did.

Epidemiological studies coupled with climate data have recently identified climate change as a driver of the increased occurrence of infectious diseases such as dengue fever, which causes 50-100 million infections per year.[19] Scientists have also pointed to climate change as one of the main causes of two widespread outbreaks of disease in the past two years that nearly wiped out the entire population of saigas, a Central Asian antelope. Increased temperatures caused saiga migrations beyond the animals’ usual range, where they came into contact with a pathogen they were not equipped to fend off.[20] This climate-driven disease cycle could very easily affect human populations, opening new pathways for isolated disease reservoirs to reach urban areas, for example. Simultaneously, melting permafrost, due to global warming, has the potential to unearth diseases long forgotten or not yet identified, for which we have no immunity or vaccines.[21]

Due to recent climate change, saiga antelopes have migrated beyond their usual habitat and encountered pathogens they were not equipped to fend off. Two epizootics (widespread disease among animal populations) have nearly wiped out the species. They will not be the last victims of man-made climate change.

Pathogens can and do spread from animals to humans. The plague is, again, a good example of this. Plague has jumped from rats to fleas to people in the past and it continues to threaten human populations today, through animal vectors, in areas such as the American West and Sub-Saharan Africa. Scientists have already warned that climate change could result in increased occurrence of plague in these regions, as environmental conditions become more favorable to the replication and diffusion of its vectors.[22]

In the face of these global health threats, historians continue to call attention to the study of past pandemics—preferably in a global perspective—as a way to understand how present and future threats might occur or evolve.[23] There is every reason to expect that climate change will affect how pathogens will interact with, and impact, human populations. There are many historical examples of this, which have been studied and can still teach us a great deal, thanks to advances in research methods, new data, and cooperation across disciplines. As experts, we should also be mindful of what we do not yet know, and endeavor to produce knowledge that will equip us to face future challenges.

 

So what?

Historical processes are never simple. They are the result of context, characters, and contingencies. The environment—and more precisely, the functioning of the Earth’s climate system—is the broadest context available to us. Its study requires analysis of a broad array of sources, which is why this type of research benefits from a trans-disciplinary approach and from healthy skepticism toward established disciplinary and chronological boundaries, which often obfuscate rather than clarify. Studying the past dispassionately can give us clues about established patterns of environmental change and human behavior. Without this information, we would be walking blindly into an uncertain future.

Any prejudice toward any past period of history and its sources (conventional or not) is artificial. That does not mean we should not be skeptical: we should question sources and demand multiple, independent and consilient records to better understand the environment and health of past populations. The more information we gather the better, as long as historians and scientists vet it carefully and make every effort to communicate with audiences well beyond labs and ivy-covered walls. The challenges for this type of investigation are substantial: jargon, methods, categories and massive amounts of data all require collaboration and a great deal of humility.

The motivation for such an endeavor could not be more urgent, as the deterioration of our environment due to human activity represents the greatest challenge humanity has ever faced.

That is not hyperbole. That is history, and only by knowing history can we grasp and convey the full extent and implications of our current crisis.

 

Alexander More is Assistant Research Professor at the Climate Change Institute and a postdoctoral fellow in the History Department at Harvard University, where he also earned his PhD in History and History of Science. His research focuses on the impact of climate change on human health and the economy, environmental history, and the history of public health and medicine. He is a member of the Planetary Health Alliance and currently leads an interdisciplinary group of eleven historians and climate scientists in the Initiative for the Science of the Human Past at Harvard. He served without pay in the office of Senator Ted Kennedy as he worked on the first draft of the Affordable Care Act, and is currently the managing editor of Harvard’s digital historical atlas of Eurasia. His book, The Public Good: At the Origins of Welfare and Health Care Policy, focuses on the first public health care system in the Western world and will soon appear in print. His research has been covered by more than forty newspapers and media outlets worldwide, including The Guardian, The Washington Post, Forbes, Der Spiegel, Süddeutsche Zeitung, Popular Science and Smithsonian Magazine.

 

[1] Amalia A. Almada, et al., “A Case for Planetary Health/GeoHealth,” GeoHealth, 1.2 (2017): 75-78.

[2] Philip J. Landrigan, “The Lancet commission on pollution and health.” Lancet (2017) in press.

[3] Philip J. Landrigan, “Air pollution and health.” Lancet Public Health, 2 (2016): e3-e4.

[4] Alexander F. More, et al., “Next-Generation Ice Core Technology Reveals True Natural Levels of Lead in the Atmosphere: Insights from the Black Death,” GeoHealth (AGU), 1.4 (2017): 211-219.

[5] Ibid.

[6] Leah Thorsen, “Doe Run Smelter is Closing – End of an Era in Herculaneum.” St. Louis Post Dispatch, December 15, 2013, p. A1.

[7] U.S. Justice Department, “North America’s Largest Lead Producer to Spend $65 Million to Correct Environmental Violations at Missouri Facilities.” Press release, October 8, 2010. URL: https://www.justice.gov/opa/pr/north-america-s-largest-lead-producer-spend-65-million-correct-environmental-violations Accessed Oct. 5, 2017. Todd C. Frankel, “Doe Run verdict blames owners Children’s impaired health and future stem from negligence of lead smelter companies, not parents who worked there, jury finds,” St. Louis Post Dispatch, July 29, 2011. Leah Thorsen, “Appellate court overturns $240 million from Herculaneum smelter verdict,” St. Louis Post Dispatch, June 18, 2014, p. A2.

[8] Eloquent examples of recent literature on this subject include: Bruce P. Lanphear, et al., “Low-level environmental lead exposure and children’s intellectual function: An international pooled analysis,” Environmental Health Perspectives, 113 (2005): 894–899. Sven Hernberg, “Lead poisoning in a historical perspective,” American Journal of Industrial Medicine, 38 (2000): 244–254.

[9] Paul A. Mayewski, et al., “An ice record of atmospheric response to anthropogenic sulphate and nitrate,” Nature 346 (1990): 554-556. Erich Osterberg, et al., “Ice core record of rising lead pollution in the North Pacific atmosphere,” Geophysical Research Letters, 25 (2008). Kathy S. Law, and Andreas Stohl, “Arctic Air Pollution: Origins and Impact.” Science, 315 (2007): 1537-1540. Paul A. Mayewski, and Frank White, The Ice Chronicles: The Quest to Understand Global Climate Change, (Hanover: University of New Hampshire Press, 2002): 10-15 and passim.

[10] Sergio M. Vicente-Serrano, et al., “Evidence of increasing drought severity caused by temperature rise in southern Europe,” Environmental Research Letters 9 (2014): 044001. Miroslav Trnka, et al., “Adverse weather conditions for European wheat production will become more frequent with climate change,” Nature Climate Change 4 (2014): 637-643. Neil Berg and Alex Hall, “Anthropogenic warming impacts on California snowpack during drought,” Geophysical Research Letters (AGU), 44.5 (2017): 2511-2518. Gregory P. Asner, et al., “Progressive forest canopy water loss during the 2012-2015 California drought,” PNAS 113.2 (2016): 249-255. Amir AghaKouchak, et al., “Global warming and changes in risk of concurrent climate extremes: Insights from the 2014 California drought,” Geophysical Research Letters (AGU), 41 (2014): 8847-8852.

[11] Recent literature on the subject includes: Kathrin Reinmuth-Selzle, et al., “Air Pollution and Climate Change Effects on Allergies in the Anthropocene: Abundance, Interaction, and Modification of Allergens and Adjuvants,” Environmental Science and Technology, 51 (2017): 4119-4141. William Lassman, et al., “Spatial and temporal estimates of population exposure to wildfire smoke during the Washington state 2012 wildfire season, using blended model, satellite, and in-situ data,” GeoHealth, 1.3 (2017): 106-121. J. C. Liu, et al., “Particulate Air Pollution from Wildfires in the Western US Under Climate Change,” Climatic Change, 138 (2016): 655-666. Hans Orru, et al., “The Interplay of Climate Change and Air Pollution on Health,” Current Environmental Health Reports, October 2017.

[12] William C. Jordan, The Great Famine: Northern Europe in the Early Fourteenth Century, (Princeton: Princeton University Press, 1996). Philip Slavin, “Market failure during the Great Famine in England and Wales (1315-17),” Past and Present, 222 (2014): 9-49.

[13] Bruce M. S. Campbell, “Physical Shocks, Biological Hazards, and Human Impacts: The Crisis of the Fourteenth Century Revisited,” in Economic and Biological Interactions in Pre-Industrial Europe, from the 13th to the 18th Centuries, ed. Simonetta Cavaciocchi, (Florence: Firenze University Press, 2010): 13-32. Idem, “Panzootics, pandemics and climatic anomalies in the fourteenth century,” in Beiträge zum Göttinger Umwelthistorischen Kolloquium 2010-2011. (Göttingen: Universitätsverlag Göttingen 2011): 177-215. Alexander F. More, “Climate Change and the Health of Pre-modern Europeans,” in “2000 Years of European Climate: First Results From the Historical Ice Core Project,” conference, Harvard University, November 15, 2015.

[14] Brian Fagan, The Little Ice Age: How Climate Made History, 1300-1800. (New York: Basic Books, 2000).

[15] Alexander F. M. More, At the Origins of Welfare Policy: Law and the Economy in the Pre-modern Mediterranean, PhD Dissertation, Harvard University, 2014.

[16] Ibid.

[17] See again Trnka et al. “Adverse weather conditions for European wheat production,” 637-43, and Anthony J. McMichael, “Globalization, Climate Change, and Human Health.” New England Journal of Medicine, 368 (2013): 1335-1343. Colin P. Kelley, “Climate change in the Fertile Crescent and implications of the recent Syrian drought,” PNAS, 112 (2015): 3241-3246.

[18] Sharon N. DeWitte, and James W. Wood, “Selectivity of Black Death mortality with respect to preexisting health,” PNAS, 105 (2008): 1436-41. Boris V. Schmid, et al., “Climate-driven introduction of the Black Death and successive plague reintroductions in Europe,” PNAS, 112.10 (2015): 3020-3025. Nils C. Stenseth, et al., “Plague Dynamics are Driven by Climate Variation,” PNAS, 103 (2006): 13110-13115. R. R. Parmenter, et al., “Incidence of plague associated with increased winter-spring precipitation in New Mexico,” American Journal of Tropical Medicine and Hygiene, 61 (1999): 814-821. Lisa T. Savage, et al., “Climate, soils, and connectivity predict plague epizootics in black-tailed prairie dogs (Cynomys ludovicianus),” Ecological Applications, 21 (2011): 2933-2943. K. L. Gage, “Factors affecting the spread and maintenance of plague,” Advances in Experimental Medicine and Biology, 954 (2012): 79-94. Ann G. Carmichael, “Plague Persistence in Western Europe: A Hypothesis,” Medieval Globe, 1.1 (2014): 157-191, especially at 158.

[19] Raquel Martins Lana, et al., “The Introduction of dengue follows transportation infrastructure changes in the state of Acre, Brazil: A network-based analysis,” PLOS Neglected Tropical Diseases, 11 (2017): e0006070. Clement N. Mneya, et al., “Climate Change Influences Potential Distribution of Infected Aedes aegypti Co-Occurrence with Dengue Epidemics Risk Areas in Tanzania,” PLoS ONE, 11 (2016): e0162649. S. Hales, et al., “Potential effect of population and climate changes on global distribution of dengue fever: an empirical model,” Lancet, 360 (2002): 830–834. T. H. Jetten, et al., “Potential changes in the distribution of dengue transmission under climate warming,” American Journal of Tropical Medicine and Hygiene, 57 (1997): 285–297.

[20] N. J. Singh, et al., “Saiga antelope calving site selection is increasingly driven by human disturbance,” Biological Conservation, 143 (2010): 1770–1779. Robert Kock, et al., “Detection and Genetic Characterization of Lineage IV Peste Des Petits Ruminant Virus in Kazakhstan,” Transboundary and Emerging Diseases, 62 (2015): 470-479.

[21] Robinson Meyer, “The Zombie Diseases of Climate Change: What Lurks in the Arctic’s Thawing Permafrost?” The Atlantic Monthly, Nov. 6, 2017. URL: https://www.theatlantic.com/science/archive/2017/11/the-zombie-diseases-of-climate-change/544274/ accessed November 7, 2017.

[22] David A. Eads and David A. Hoogland, “Precipitation, Climate Change, and Parasitism of Prairie Dogs by Fleas that Transmit Plague,” Journal of Parasitology, 103 (2017): 309-319. Jonathan A. Patz, et al., “Disease Emergence from Global Climate and Land Use Change,” Medical Clinics of North America, 92 (2008): 1473-1491. Erica Goode, “Saiga Antelopes Are Struck Again by a Plague in Central Asia.” New York Times, February 8, 2017. Carl Zimmer, “More than Half of the Entire Species of Saigas Gone in Mysterious Die-Off.” New York Times, November 2, 2015.

[23] Monica H. Green, “Taking ‘Pandemic’ Seriously: Making the Black Death Global,” Medieval Globe, 1.1 (2014): 27-61. Eadem, “‘Medieval’ Madagascar: Plague and Inequality.” URL: http://www.thismess.net/2017/10/medieval-madagascar.html accessed Oct. 5, 2017.
