A lot has been said and written about the massive advantages of data marketplaces for sensor owners — industries, smart cities, utility providers, the transportation sector, … — who will be able to recoup some of their hardware investments by monetizing the sensor data they generate daily, turning sunk costs into perpetual revenue streams.
On the other side of the market are the parties who can use these data to their benefit and are willing to pay good money for them. Aside from the data processors in the ecosystem, any company looking to commercialize a data-driven product is given the opportunity to develop it without having to invest in the hardware. One group of potential buyers is academics and researchers, who are always in need of accurate, reliable data to back up their scientific research.
The limits of data sharing
Academics can also obtain new data through data sharing, which is increasingly recommended as a means of accelerating science by facilitating collaboration, transparency, and reproducibility. It leads to open discussions that can boost research productivity and help scholars improve their work before submitting it to a journal. In a study spanning seven years (2010–2017), researchers concluded that “openly shared data can help academics to accelerate science, increasing the scale of scientific studies and enabling the involvement of scientists from different geographical areas and a broader range of disciplines”. These findings underline the transformative power of data sharing for accelerating science.
But although the benefits of data sharing are obvious, its use is still relatively limited, for a variety of reasons. Apart from privacy issues, proprietary concerns, and ethics, there is a lack of training in data sharing, and sharing data is not associated with credit or reward. But perhaps the most inhibiting factor is time. In most cases, researchers simply don’t have the time to wait for their colleagues to complete their research and share the data. What they need are up-to-date, and if possible real-time, data.
Real-time data feeds
Installing their own sensors would be a process that is not only very expensive but extremely time-consuming as well. The thing is that tons of useful data already exist; unfortunately, they aren’t accessible to academics and scientists, because they are locked away in the data silos of large enterprises. That is to say: they were. Thanks to sensor data marketplaces offering pollution, power grid, vehicle telematics, and other data feeds, academics can now purchase real-time data feeds from remote devices. Researchers who require weather data, for example, can now buy access to a feed from a weather sensor that is already in use instead of having to invest in sensors themselves.
An extra advantage of a data marketplace is the simplicity of the process. Imagine the administrative overhead if everybody had to contact each sensor owner separately: an absolute nightmare, and one likely never worth the resulting commercial contract.
A better world, thanks to accurate data
Thanks to data marketplaces like DataBroker DAO, academics get access to data from thousands of sensors all over the world and can buy it directly on the marketplace. This not only promises a boost in the number of potential spin-outs from academia, as projects no longer face the high startup costs of buying and deploying a network of sensors, but hopefully also a boost in scientific breakthroughs, and a better, cleaner, healthier world.