We have so much data that we could stack it end on end like Lego bricks.
In 2017, NASA was collecting 12.1 terabytes (TB) of data daily, adding to an archive of 24 petabytes (PB), or 24,000,000 GB of data.
One petabyte equals 2^50 bytes. In other words, it would take you over 2.5 years of nonstop binge watching to get through a petabyte's worth of 4K movies.
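That binge-watching figure can be sanity-checked with some back-of-the-envelope arithmetic. The per-movie size below is an assumption, not a figure from this article: roughly 100 GB for a two-hour Blu-ray-quality 4K movie, i.e. about 50 GB of video per hour.

```python
# Back-of-the-envelope check: how long to watch a petabyte of 4K video?
# Assumption: ~100 GB per two-hour 4K movie (Blu-ray quality) = ~50 GB/hour.
PETABYTE_BYTES = 2 ** 50        # 1 petabyte, as defined above
BYTES_PER_HOUR = 50 * 10 ** 9   # assumed 4K playback rate: 50 GB per hour

hours = PETABYTE_BYTES / BYTES_PER_HOUR
years = hours / (24 * 365)
print(f"{hours:,.0f} hours, or about {years:.1f} years of nonstop viewing")
```

Under these assumptions the total comes out a little over 2.5 years, which matches the claim above.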
And that’s just NASA. Commercial satellite providers are collecting hundreds of terabytes of data daily, 365 days a year. According to Maxar, if you were to collect all the data captured by their satellites and burn it onto DVDs, the stack would reach nearly 4 times the height of the Empire State Building.
The promise of satellite data
Since the launch of Sputnik 1 in 1957, some 8,900 satellites have been put into orbit. These man-made moons have captured powerful images and relayed critical data back to Earth on natural disasters, infrastructure projects, defense activities and, of course, the weather!
In an information-driven world, there is perhaps no other source that can match a satellite for the scale and frequency of information on the physical events taking place on Earth.
While earth observation data has predominantly been used for defense planning, internal security, infrastructure and governance, raw data from satellites will be heavily relied upon to track climate change in real-time and address the growing crisis.
In 2015, the environmental monitoring segment of the earth observation value-added services market was projected to grow at 21% by 2025. This is higher than the defense sector, which is poised to grow at 14%.
Making sense of all this data!
Before satellite data can be applied to climate mitigation, or any other use case, it needs to be examined. This is true of any big data, and today less than 1% of global data is analysed.
One reason for this is that many layers of processing are needed to make big data usable for the end consumer. And this processing is complex, not something everyone is equipped to do.
For raw satellite data, for example, the analysis can involve steps such as:
- 'Atmospheric correction' of the raw data, i.e. correcting the haziness in satellite imagery caused by atmospheric disturbances, cloud cover, etc. Machine learning plays a vital role here.
- Deriving insights from the imagery, such as forecasts, predictions and patterns, where AI comes in handy.
There are other steps too; the overall process is time-consuming and requires specific data-wrangling skills, presenting an opportunity in the downstream Earth observation market.
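As a rough illustration of what atmospheric correction can involve, here is a minimal sketch of dark-object subtraction, one of the simplest haze-correction techniques. The band values and function name are hypothetical, and real pipelines use far more sophisticated, often ML-driven, methods:

```python
def dark_object_subtraction(band):
    """Toy atmospheric correction: assume the darkest pixel in a band
    should be near zero, treat its value as atmospheric haze, and
    subtract that haze from every pixel in the band."""
    haze = min(band)  # darkest observed pixel value
    return [max(value - haze, 0) for value in band]

# Hypothetical raw digital numbers for one spectral band of an image
raw_band = [120, 95, 87, 203, 140]
corrected = dark_object_subtraction(raw_band)
print(corrected)  # the darkest pixel (87) becomes 0; the rest shift down
```

The idea is that a perfectly dark surface (deep shadow, clear water) should register near zero, so any signal it shows is attributed to haze and removed from the whole scene.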
A geospatial data refinery
Operating at this step of the value chain, Blue Sky Analytics is building a 'geospatial data refinery' that processes and analyses raw satellite data using ML and AI tools. In its first year, our refinery analysed 2 TB of satellite data and produced two datasets: BreeZo for air quality monitoring and Zuri for tracking global forest and farm fires. In 2021, it is set to analyse an additional 27 TB of data.
Our goal is to maximise our understanding of climate change via satellites and to make this knowledge usable for everyone. Capitalising on the opening up of the space sector and India's long-standing engineering prowess, Blue Sky Analytics is inching towards becoming India's leading downstream data shop for the world.
Satellites are currently used to capture and track information on a whole range of topics: from tracking tornadoes and flash floods to studying sea surface temperatures and measuring how much sea ice we have lost.
If you're interested in finding out more about how satellites do this, here's a link. We encourage you to read up on the true power of satellites.