An oil painting by John Stanley depicting Native Americans hunting, 2013.
Scientists from University College London posited that the European colonization of the Americas, which resulted in the mass death of Native Americans, actually caused the Little Ice Age.
According to the study, the Native American genocide, often referred to as "The Great Dying," not only reduced the continent's population by tens of millions but subsequently allowed global temperatures to fall drastically.
“The Great Dying of the Indigenous Peoples of the Americas led to the abandonment of enough cleared land that the resulting terrestrial carbon uptake had a detectable impact on both atmospheric CO2 and global surface air temperatures,” said the study’s lead author, Alexander Koch.
The genocide of Native Americans, through contact with foreign diseases or murder at the hands of the settlers, purportedly left so much abandoned native agricultural land to be reclaimed by nature that it drew enough carbon dioxide from the atmosphere to cause the Little Ice Age, a period of global cooling between the 15th and 18th centuries.
“There is a marked cooling around that time which is called the Little Ice Age, and what’s interesting is that we can see natural processes giving a little bit of cooling, but actually to get the full cooling — double the natural processes — you have to have this genocide-generated drop in CO2,” said Koch.
“Landing of Columbus” by John Vanderlyn (1847).
The team reviewed all available population estimates for the Americas before 1492. They tracked those figures across time and incorporated historical factors and events ranging from disease and warfare to slavery and the eventual collapse of native societies.
The research showed a shocking reduction in population from 60 million at the end of the 15th century — around 10 percent of the world's population at the time — to five or six million within 100 years.
Burial of the dead after the massacre at Wounded Knee, 1891.
To link that data to carbon uptake, Koch's team had to assess how much Native American land had been abandoned and reclaimed by nature, then match that figure to our current understanding of global cooling during that period.
What they found was 56 million hectares, an area roughly the size of France, left untended after those who previously lived on it had died. The subsequent regrowth of trees and vegetation is said to have caused an atmospheric CO2 decrease of between 7 and 10 ppm (parts per million).
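The "size of France" comparison can be sanity-checked with a simple unit conversion. Note that the figure for metropolitan France used below (roughly 550,000 square kilometers) is an assumption for illustration, not a number given in the article:

```python
# Sanity check: 56 million hectares expressed in square kilometers,
# compared against an assumed area for metropolitan France (~550,000 km2).
hectares = 56_000_000
km2 = hectares * 0.01          # 1 hectare = 0.01 km^2
france_km2 = 550_000           # assumed figure; not stated in the article

print(km2)                     # 560000.0
print(km2 / france_km2)        # ~1.02, i.e. roughly the size of France
```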
“To put that in the modern context — we basically burn (fossil fuels) and produce about 3ppm per year,” said co-author Professor Mark Maslin. “So, we’re talking a large amount of carbon that’s being sucked out of the atmosphere.”
Nuclear cooling towers, 2010.
The Industrial Revolution, which began in the 18th century, has often been cited as the beginning of catastrophic, man-made climate change, but Reading University professor Ed Hawkins is adamant that additional factors must always be considered.
“This new study demonstrates that the drop in CO2 is itself partly due to the settlement of the Americas and resulting collapse of the indigenous population, allowing regrowth of natural vegetation,” he said. “It demonstrates that human activities affected the climate well before the industrial revolution.”
The study implies that reforestation and healthy vegetation alone can meaningfully affect global temperatures. This has left Hawkins — who studies climate change — curious about its potential applications. At the same time, it clarifies just how emissions-heavy our contemporary world has become.
“What we see from this study is the scale of what’s required, because the Great Dying resulted in an area the size of France being reforested and that gave us only a few ppm,” he said. “This is useful; it shows us what reforestation can do. But at the same time, that kind of reduction is worth perhaps just two years of fossil fuel emissions at the present rate.”
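Hawkins' "two years" comparison follows directly from the figures quoted earlier in the article: a 7 to 10 ppm CO2 drawdown set against roughly 3 ppm of fossil-fuel emissions per year. A quick sketch of that arithmetic (using the article's numbers, not the study's own data):

```python
# Back-of-the-envelope check of Hawkins' comparison, using the
# figures quoted in the article (not taken from the study itself).
drawdown_ppm = (7, 10)        # CO2 drop attributed to post-Columbian regrowth
emissions_ppm_per_year = 3    # approximate modern fossil-fuel emissions

for ppm in drawdown_ppm:
    years = ppm / emissions_ppm_per_year
    print(f"{ppm} ppm is about {years:.1f} years of current emissions")
```

The low end of the range (about 2.3 years) matches Hawkins' "perhaps just two years of fossil fuel emissions at the present rate."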
While challenging that present rate is arguably the most important task at this juncture, the University College London study certainly makes a strong argument for looking back at history for clues, warnings, and advice.