Are you looking for reliable sensor data to finalize your academic research, map out the perfect smart city, or make informed business decisions? Or do you happen to be sitting on a pile of valuable sensor data that you would like to monetize? DataBroker DAO is the (market)place to be, whether you want to buy, sell, or resell sensor data.
Individuals, companies, researchers, and governments spend hundreds of billions each year buying and maintaining IoT sensors. The growth of investment in and applications of IoT is staggering, and yet all the data captured by these devices is locked up in silos and walled gardens.
On its path from sensor to silo, all this data flows through gateway operators: telecom companies, networks, and even the control panels of the sensor manufacturers.
These sensor owners usually have a two-year business case for the placement of their sensors and the data they generate. The primary market for these sensors is expected to reach 1.2 trillion USD in 2019. Estimating that at least 10% of the data generated is of value to others (and can be resold one to three times), the value of the data locked away will reach 120 billion USD next year!
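As a back-of-the-envelope check of the figure above, using the assumptions in the text (a 1.2 trillion USD primary market, of which 10% of the generated data is of value to others):

```python
# Rough estimate of the value of locked-away sensor data. The figures are
# the assumptions quoted in the text, not audited market data.
primary_market_usd = 1_200_000_000_000   # projected 2019 primary sensor market
resale_share_pct = 10                    # share of data assumed valuable to others

locked_value_usd = primary_market_usd * resale_share_pct // 100
print(locked_value_usd)  # 120000000000, i.e. 120 billion USD
```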
Using a smart contract based marketplace on the Ethereum network, it becomes possible to unlock this value. Via their gateway operator, the sensor owners place the data generated by their sensors up for sale (while staking some of their DTX tokens), and buyers can discover and purchase access to the data using the same DTX tokens.
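The listing-and-purchase flow described above can be sketched as a simple model. This is an illustrative Python mock-up of the mechanics (stake on listing, pay in DTX to gain access), not the actual DataBroker DAO smart-contract API; all class and method names are assumptions.

```python
# Hypothetical model of the marketplace flow: a sensor owner stakes DTX
# tokens to list a data stream, and a buyer spends DTX to purchase access.
class Marketplace:
    def __init__(self):
        self.balances = {}   # address -> DTX balance
        self.listings = {}   # stream_id -> {"owner": ..., "price": ..., "stake": ...}
        self.access = set()  # (buyer, stream_id) pairs granted access

    def list_stream(self, owner, stream_id, price, stake):
        # The owner locks up `stake` DTX while the listing is live.
        assert self.balances.get(owner, 0) >= stake, "insufficient DTX to stake"
        self.balances[owner] -= stake
        self.listings[stream_id] = {"owner": owner, "price": price, "stake": stake}

    def purchase(self, buyer, stream_id):
        # The buyer pays the listed price in DTX and is granted access.
        listing = self.listings[stream_id]
        assert self.balances.get(buyer, 0) >= listing["price"], "insufficient DTX"
        self.balances[buyer] -= listing["price"]
        self.balances[listing["owner"]] += listing["price"]  # revenue split omitted here
        self.access.add((buyer, stream_id))

    def has_access(self, buyer, stream_id):
        return (buyer, stream_id) in self.access


m = Marketplace()
m.balances = {"owner1": 50, "buyer1": 100}
m.list_stream("owner1", "temp-001", price=20, stake=10)
m.purchase("buyer1", "temp-001")
print(m.has_access("buyer1", "temp-001"))  # True
```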
The gateway operator will run its own Ethereum mainnet node and run the open-source DataBroker DAO dAPI (distributed API) on top. Data generated by the sensors of its clients is sent (within the same datacenter) to the dAPI, which checks who has purchased access and sends the data directly to the location specified by each buyer at the time of purchase.
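The routing step the dAPI performs can be pictured as follows. This is a hypothetical sketch, not the real dAPI code: `deliver` stands in for an HTTP push to the buyer's endpoint, and the record field names are assumptions.

```python
# Hypothetical sketch of the dAPI routing step: an incoming sensor reading is
# forwarded only to buyers who purchased access to that stream, at the
# destination each buyer specified when purchasing.
def route_reading(stream_id, reading, purchases, deliver):
    """purchases: list of {"stream_id", "buyer", "destination"} records.
    deliver(destination, reading) performs the actual push (e.g. HTTP POST).
    Returns the list of buyers the reading was delivered to."""
    delivered = []
    for p in purchases:
        if p["stream_id"] == stream_id:
            deliver(p["destination"], reading)
            delivered.append(p["buyer"])
    return delivered


sent = []
purchases = [
    {"stream_id": "temp-001", "buyer": "alice", "destination": "https://alice.example/ingest"},
    {"stream_id": "hum-002", "buyer": "bob", "destination": "https://bob.example/ingest"},
]
result = route_reading("temp-001", {"value": 21.5}, purchases,
                       lambda dest, r: sent.append((dest, r)))
print(result)  # ['alice']
```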
Since the gateway operator is already set up to deal with the data its clients generate, routing purchased data streams to a location on the internet is a straightforward extension.
In return, the gateway operator is awarded 10% of each transaction in DTX tokens. The platform also earns 10%, and, most importantly, the lion's share goes to the sensor owner, who receives 80% of the proceeds.
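The 10/10/80 split above, in DTX token units. A minimal sketch assuming integer token amounts, with any rounding remainder credited to the sensor owner; the function name is illustrative.

```python
# The revenue split described above: 10% gateway operator, 10% platform,
# 80% sensor owner. Integer math; any rounding remainder goes to the owner.
def split_payment(amount_dtx):
    gateway = amount_dtx * 10 // 100
    platform = amount_dtx * 10 // 100
    owner = amount_dtx - gateway - platform  # ~80% plus remainder
    return {"gateway": gateway, "platform": platform, "owner": owner}


print(split_payment(1000))  # {'gateway': 100, 'platform': 100, 'owner': 800}
```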
To move from a local to a global marketplace, the DataBroker DAO implementation is replicated with a multitude of gateway operators around the world.
This grants access to a global market for data. Anyone in the world will be able to buy any available data anywhere, making it truly, “a global market for local data”.
With the availability and the ability to trade the DTX token, the platform can move to the mainnet.
One of the larger challenges for DataBroker DAO will be scaling the team fast enough to cope with market demand. Onboarding a new sales team and additional developers is a daunting task; this has been an active focus since September and will remain so for years to come.
The main road to mass adoption is integrating with gateway operators that enable the onboarding of millions of sensors in one go. The DataBroker DAO platform will be integrated with the gateways of these gateway operators. We will be working on both common standards and libraries to ease integration, and perform the initial integrations for the first operators in the DataBroker DAO Alliance.
The focus of DataBroker DAO is mostly on the dAPI: this is what the data streams flow through and, combined with the smart contracts, the core value of the platform. We expect quite a few gateway operators to use the dAPI to build a marketplace into their existing platforms. There will, however, be quite a few that do not have an integrated approach, or have no existing systems to integrate with. For that reason we will build a DataBroker DAO marketplace frontend and offer it in white-labeled form to gateway operators. At launch this interface will be a basic one; building a tier-one frontend will require a diverse set of skills (information architecture, design, web and mobile development) over the first two years. This will be a heavy focus for Q3 of 2018 to reach the next milestone.
Working with our alliance members has taught us a lot, including the fact that there is a large set of data in the market that companies want to sell either to anyone except a competitor, or only to a whitelisted group of companies. This capability will therefore be built into the smart contracts from the start, and work on the frontend side will continue into Q4 2018.
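The two selling modes above, in miniature. A hypothetical helper, not the actual smart-contract interface: a listing carries either a blacklist (sell to anyone except, say, competitors) or a whitelist (sell only to named companies); the field names are assumptions.

```python
# Hypothetical access check for the two selling modes described above:
# "anyone except a competitor" (blacklist) or "only a whitelisted group".
def may_purchase(buyer, listing):
    """listing may carry a 'whitelist' or 'blacklist' set (or neither)."""
    if "whitelist" in listing:
        return buyer in listing["whitelist"]
    if "blacklist" in listing:
        return buyer not in listing["blacklist"]
    return True  # open listing: anyone may buy


print(may_purchase("acme", {"blacklist": {"competitor-co"}}))  # True
print(may_purchase("competitor-co", {"blacklist": {"competitor-co"}}))  # False
```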
Having easy access to large amounts of data opens up a wealth of options for many startups across the globe. Enhancement and aggregation will yield data even more valuable than the raw data itself. The team will look for, and work with, partners in the Alliance to provide valuable services based on the raw data, offering libraries and integrations for commonly used tools. Integrations with AI tools (such as those offered by the large cloud providers, or TensorFlow) come to mind. Initial integrations will be made with the data processors in our Alliance in Q1 2019.