From analyzing large data sets — or big data — we know that our planet lost the equivalent of 40 football fields of tree cover per minute last year. Big data can also help us tackle the problem, for example by locating harmful emissions or identifying pressure points along the supply chain.
When California Governor Jerry Brown announced in September that the US state would be launching “its own damn satellite” to monitor the effects of climate change, his promise was bold. California is taking local action on a global problem in the absence of federal leadership. The state will develop and eventually launch a satellite capable of detecting the “point source” of climate pollutants, monitoring leaks and other anomalies at specific locations.
The satellite, an initiative of the California Air Resources Board, will complement project partner Environmental Defense Fund’s MethaneSAT, scheduled for launch in 2021. The latter will provide broader, more frequent coverage, quantifying emissions roughly every four days from oil and gas fields producing at least 80% of global output. Other initiatives combine data captured via satellite imagery with artificial intelligence to monitor forests and land use, providing the ‘where, why, when and who’ of deforestation.
One example is the Trase platform, which connects independent data sources to reveal the trade flows for commodities such as beef, soy and palm oil, which are responsible for an estimated two thirds of tropical deforestation. Using existing data such as customs records and trade contracts, tax registration data, production data and shipping data, Trase pieces together a bigger picture of how exports are linked to agricultural conditions in the places where they are produced.
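To make the data-linkage idea concrete, here is a minimal sketch of the kind of join such a platform might perform. This is not Trase’s actual pipeline; the datasets, exporter names and field names below are all hypothetical, invented purely for illustration.

```python
# Illustrative sketch of linking export records to producing regions by
# joining independent datasets. All names and figures are hypothetical.
import pandas as pd

# Hypothetical customs records: each shipment names an exporter,
# a commodity and an exported volume.
customs = pd.DataFrame({
    "exporter": ["AgroCo", "AgroCo", "BeefCorp"],
    "commodity": ["soy", "soy", "beef"],
    "tonnes": [1200, 800, 500],
})

# Hypothetical tax-registration data mapping each exporter to the
# region where its supply is sourced.
tax_registry = pd.DataFrame({
    "exporter": ["AgroCo", "BeefCorp"],
    "region": ["Mato Grosso", "Pará"],
})

# Join the two sources so export volumes can be attributed to the
# regions where the commodities were produced.
flows = customs.merge(tax_registry, on="exporter")
by_region = (
    flows.groupby(["region", "commodity"])["tonnes"].sum().reset_index()
)
print(by_region)
```

The point of the sketch is that neither dataset alone reveals the link between trade and land use: only joining them attributes export volumes to specific producing regions.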
Most recently, the US climate change think tank Woods Hole Research Center has been using a satellite-based tool to create a new global carbon monitoring map, an approach it says is “poised to transform how the world measures and tracks changes in forest carbon”.