By: Aidan Das
Over the past few decades, the impacts of human activity on the environment, along with the hot topics of global warming and climate change, have become more prominent in the media and in government policymaking. But what do all these buzzwords actually mean, and how did concern for the Earth's changing climate begin?
To clarify, the term climate change describes shifts in global temperature and weather patterns, including the physical and biological changes of various environments around the world over long periods of time. This is different from global warming, a term that refers specifically to the long-term increase in global temperatures.
The earliest accounts of human causes of climate change come from ancient Greece, where people speculated that human activity, such as the removal of trees, could alter regional weather. As time went on, these particular speculations were proven wrong, but they marked the beginning of a continuous series of discoveries about how humans can affect the environment and global climate.
Fast forward to the 20th century, after the Industrial Revolution: Guy Stewart Callendar was one of the first scientists to draw a connection between carbon dioxide emissions and rising global temperatures. His work received little credit when it was first published in 1938, because people believed that humans couldn't possibly have such a large impact on the world's climate.
Another important point on the timeline of the discovery of climate change was the development of the Keeling Curve. The Keeling Curve is a graph of carbon dioxide concentrations in the atmosphere, based on measurements begun in 1958 by Charles David Keeling at the Mauna Loa Observatory. The curve shows both the seasonal fluctuations of atmospheric carbon dioxide within each year and the gradual rise in atmospheric carbon dioxide over time, a rise that mirrors rising global temperatures. From the graph, predictions were made showing global temperature rising by a small amount over the next century. When these predictions were made, such a small change over a long period of time was not considered a big concern.
Later, during the 1970s, as more information about climate change came to the surface, the contradictory nature of some findings left people confused about what to believe. More pollutants such as aerosols were being released into the air, and these pollutants can actually reflect sunlight, causing cooling. There was some evidence of cooling at the time, but the Earth ultimately continued to warm: carbon dioxide stays in the atmosphere for long periods, while the cooling pollutants remain for only a few weeks.
As global temperatures rose in the 1980s, the problems associated with this warming became prominent, including droughts, wildfires and other natural disasters. In 1988, the Intergovernmental Panel on Climate Change (IPCC) was established so countries could come together and use science to understand how the climate was changing.
One of the more recent reports from the IPCC, published in 2018, states that significant measures must be taken to halt rising global temperatures before the effects of global warming become irreversible. Even with reports like these, some officials remain skeptical and are unwilling to make drastic changes to reduce emissions, believing such changes would harm the economy.
Today, as humans become less connected to nature and more integrated with technology (especially in light of the recent coronavirus events), we must still remember how our actions and activities impact the environment. The Earth's climate is a complicated system that still needs to be researched and understood. As we acquire this knowledge, we must take the necessary steps to protect the environment and mitigate the harm being done by human activity.
Cover photo courtesy of Pixabay