Frequently asked questions

How will sensors communicate with your platform? How do you verify the data? Technical questions, security questions, commercial questions, general questions … in this FAQ section we try to answer them all.

What is DataBroker DAO?

DataBroker DAO is a marketplace for buying and selling sensor data. As a decentralised marketplace for IoT sensor data, DataBroker DAO enables sensor owners to turn the data they generate into revenue streams. This opens up a wealth of opportunities for various industries, as data that would otherwise sit unused can be put to work.

Today companies make use of sensor data to optimize and monitor their operations. The result is a single purpose data landscape. DataBroker DAO drives the evolution towards truly ‘smart living’ by making this data easily accessible to cities, organizations and entrepreneurs at an affordable price. Check out our video to learn more.

In a sense, DataBroker DAO can be likened to a “secondary market” for IoT sensor data and has been referred to as an “eBay” or “Amazon” for IoT sensor data.

Why a token?

From a marketplace perspective, using the public Ethereum chain enables the use of a fully built out financial ecosystem, with a minimum of fees. Traditional fiat payment processors charge between 1 and 3% for money in and money out, while a purchase using the utility token costs around 0.003 USD in fees for purchases of any size.

Using a utility token instead of fiat currency also brings the advantage of 18 decimal places of divisibility. Combining these tiny fractions of a token with very low fees makes real microtransactions possible.

Choosing a token over fiat or ETH gives us the divisibility needed to handle microtransactions in a market with over half a trillion individual devices, many of which produce data every second. Having this abstraction layer on top of ETH also keeps the token from being subject to the large fluctuations of the ETH price. The volatility of the DTX token will be more limited and can be managed to some extent through market making with reserved funds and tokens.
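To make the divisibility concrete, here is a minimal sketch in TypeScript. It assumes 1 DTX is worth roughly 0.25 USD (the launch target discussed further below) and an illustrative per-reading price of 0.0001 USD; both figures are examples, not platform pricing.

```typescript
// Illustrative only: expressing a sub-cent data purchase in DTX base units.
// The 0.25 USD/DTX rate and the per-reading price are assumptions for this example.
const DECIMALS = 18n;
const ONE_DTX = 10n ** DECIMALS; // 1 DTX = 10^18 base units

// One reading at 0.0001 USD, with 1 DTX ≈ 0.25 USD, costs 0.0004 DTX.
const readingPriceBaseUnits = (ONE_DTX * 4n) / 10_000n;

console.log(readingPriceBaseUnits.toString()); // "400000000000000" base units
```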

What is the purpose of the DTX Token?

The DTX token is a utility token within the DataBroker DAO platform. It is an ERC20-compliant token with 18 decimals and serves as the credit used to buy and sell sensor data on the platform.
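Because DTX follows the standard ERC20 interface, it can be read with generic Ethereum tooling. A minimal sketch, assuming ethers v6; the RPC endpoint, token address and holder address are placeholders supplied by the caller:

```typescript
// Sketch only: reading the DTX token like any other ERC20 (ethers v6 assumed).
import { JsonRpcProvider, Contract, formatUnits } from "ethers";

const ERC20_ABI = [
  "function decimals() view returns (uint8)",
  "function balanceOf(address owner) view returns (uint256)",
];

async function printBalance(rpcUrl: string, tokenAddress: string, holder: string) {
  const provider = new JsonRpcProvider(rpcUrl);
  const token = new Contract(tokenAddress, ERC20_ABI, provider);

  const decimals = Number(await token.decimals()); // expected to be 18 for DTX
  const raw: bigint = await token.balanceOf(holder);

  console.log(`Balance: ${formatUnits(raw, decimals)} tokens`);
}
```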

How does DataBroker DAO stand out from its competitors?

DataBroker DAO takes a radically different approach from our competitors:

– It is built to go into production this year. This ensures both the use of proven technology and a first-mover advantage over the other players.

– It is built to interface with the large players in the ecosystem (manufacturers and gateway operators) in a non-invasive way. DataBroker DAO is not replacing anything or anyone, ensuring a low-friction environment, which is the only way to reach market scale fast enough.

– It focuses on being a complementary component in the current and future ecosystem, meaning that in the future DataBroker DAO can co-exist with, integrate with and complement IOTA, Streamr and OceanProtocol.

We wish each and every one of these projects all the success in the world, and the Team will happily integrate them into the ecosystem upon reaching critical mass.

How do you verify the data on the marketplace?

We do not validate the data itself; doing so would require a central party with access to all the data. Instead, we use a staking (by the sensor owner) and claiming (by buyers) system to deal with data quality. This is illustrated on page 21 of our whitepaper and in this explainer video.
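As a rough illustration of the idea (not the actual smart contract logic, which is described in the whitepaper), a stake-and-claim flow could be modelled like this; the field names and the forfeiture rule are invented for the example:

```typescript
// Toy model of the staking (sensor owner) and claiming (buyers) idea.
interface SensorListing {
  sensorId: string;
  owner: string;
  stake: bigint; // DTX locked by the sensor owner as a quality bond
}

interface Claim {
  sensorId: string;
  buyer: string;
  claimStake: bigint; // DTX a buyer puts up when claiming the data is bad
}

// Example rule (invented): the owner's stake is forfeited once the total
// claimed amount exceeds it; otherwise the stake is returned in full.
function resolveClaims(listing: SensorListing, claims: Claim[]): bigint {
  const totalClaimed = claims.reduce((sum, c) => sum + c.claimStake, 0n);
  return totalClaimed > listing.stake ? 0n : listing.stake;
}
```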

Can you elaborate on the development of ‘whitelabeled platforms’?

We have our own interface, and we provide the dAPI to gateway operators. These operators can build on the dAPI to integrate the marketplace into their own UI. They like this because their clients never need to deal with anyone except them, while DataBroker DAO works behind the scenes.

Some gateway operators will build that UI themselves. Others might not, and they would get a version of our UI, branded for them of course. An intermediate option, a client-side “SDK” that eases integration, is also planned.
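As a sketch of what an operator integration might look like, here is a hypothetical call to a dAPI endpoint; the base URL, path and response shape are assumptions for illustration, not the documented dAPI surface:

```typescript
// Hypothetical dAPI client call from a gateway operator's own UI.
interface SensorSummary {
  id: string;
  type: string;            // e.g. "temperature"
  pricePerWeekDtx: string; // price quoted in DTX
}

async function listSensors(dapiBaseUrl: string): Promise<SensorSummary[]> {
  const res = await fetch(`${dapiBaseUrl}/sensors?type=temperature`);
  if (!res.ok) throw new Error(`dAPI request failed: ${res.status}`);
  return (await res.json()) as SensorSummary[];
}

// Usage with a placeholder URL:
// listSensors("https://gateway.example.com/dapi").then(console.log);
```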

Who will benefit from DataBroker DAO?

Sensor owners
Are able to monetize their data, turning a sunk cost into a potential money maker or, at the very least, into an opportunity to recoup some of their investment in IoT sensors. The key role of Sensor Owners in DataBroker DAO is to sell the data from their sensors on the platform.

Network operators
Gain scale and speed in the adoption of their network, as connected telcos can present a win-back offer to their enterprise accounts, a clear USP. The key role of Network Operators in DataBroker DAO is to expose the gateways they operate, enabling sensor owners to sell their data on the platform.

Sensor manufacturers
Can escape the “race to the bottom” in production, pulling resources and capital out of manufacturing and reallocating them to profitable SaaS offerings.

Smart city initiatives
Can limit the upfront cost of populating the town with sufficient sensors and turn the expense into an investment with a 2-3 year payback period and a continuous income stream after that.

Agricultural sector
In the agricultural sector in Belgium today, around 10% of farmers are “techies”. They deploy sensors for wind, temperature, barometric pressure, humidity and soil pH. The platform gives them the possibility to recoup some of this cost.

Academics
Can get access to thousands of sensors and buy data directly and more cheaply on the marketplace, cutting out established data providers.

Data Buyers
Data Buyers are those stakeholders who will purchase data on the platform. The scope of this purchase could be to use the data in its raw form, for their own purposes, or to transform and enrich the raw data and resell it with added value via DataBroker DAO (see Data Processors below). Use of the purchased data can range from the straightforward, for instance buying temperature and rainfall data provided by a neighboring office building to get accurate local readings, to the more complex, such as buying data to train an AI model.

Data Processors
Data Processors are those Data Buyers who purchase data with the explicit intention of enriching the data and either reselling it or handling it for their clients. The enrichment may take many forms and Data Processors can be categorized by the level of insight provided:
• Simple data services are the most common. Data brokers collect data from multiple sources and offer it in collected and conditioned form — data which would otherwise be fragmented, conflicted, and sometimes unreliable.
• Smart data services provide conditioned and calculated data, with analytical rules and calculations applied to derive further insight from the collected data and aid the decision making process.
• Adaptive data services apply analysis to a customer’s request-specific data, combined with data in a context store. This is a more advanced form of service. It is estimated that there are more than 5,000 data processing companies worldwide relying on a vast array of open datasets, published by government agencies and non-governmental sources.

Any plans for the automotive industry?

We are talking to some truck manufacturers, but have no car-specific alliance partners yet. The project has had quite a lot of interest from the automotive sector, specifically in using DataBroker DAO as a solution to provide access to automotive sensor data for a selected number of whitelisted recipients/buyers. For instance, one interested party is a large global truck manufacturer that leases fleets of its transport trucks. Under its terms and conditions, it owns all data from onboard sensors (250+ readings every minute). It is very interested in selling this data to interested buyers but wants to stay in control of who can buy the data (i.e. no direct competitors). Enabling this form of selling, to a whitelisted group of buyers, is now on our product roadmap. If your firm would be interested in such a solution, please let us know.

What’s the revenue distribution?

10% goes to the platform
10% goes to the network operator
80% goes to the sensor owner
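As a worked example of that split (illustrative only, not the platform's settlement code), a 1 DTX sale divides as follows:

```typescript
// 10% platform / 10% network operator / 80% sensor owner, in DTX base units.
function splitRevenue(saleAmount: bigint) {
  const platform = (saleAmount * 10n) / 100n;
  const operator = (saleAmount * 10n) / 100n;
  const owner = saleAmount - platform - operator; // remainder (80%) to the sensor owner
  return { platform, operator, owner };
}

// A 1 DTX sale (10^18 base units) splits into 0.1 / 0.1 / 0.8 DTX.
console.log(splitRevenue(10n ** 18n));
```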

Who will sell data?

A number of data sellers have been identified, and Gartner's overview of the sectors already investing in sensors highlights the key potential sellers of data for the years to come. The diagram below identifies the two groups (business and consumer) and the sub-groups that constitute each one. It is clear that the business group is the main global driving force in sensor deployment.

What is the difference between dAPP and dAPI?

In the blockchain world, a large number of projects are building distributed applications, or dAPPs. These client-side applications interact directly with Ethereum or other blockchains. In many cases, for the sake of user experience, these applications run against remote shared nodes like the ones Infura provides.

While this is the only way to create user-friendly end-user peer-to-peer applications, it has serious drawbacks for some of our use cases:

• Single point of failure. During some of the recent token sales, client-side applications coupled with high demand brought these shared nodes to a halt. Not for lack of trying or skill, but because of the sheer number of RPC calls needed to perform certain functions on Ethereum smart contracts. In a high-stakes sector, such failures are not an option.
• Web interfaces and apps are nice, but the real value is in APIs. In the current SaaS and cloud boom, this is almost a given: you have no real product unless you also have an API for it. Slack, Zapier, GitHub, and the major CRM and ERP systems all attribute part of their success to their commitment to APIs.
• More apps, more problems. Adding an extra interface only makes things harder for the average user. Sensor owners already have an account with their operators; they have figured out how to work with them and are happy (and if not, they switch operators). That is why the Team adds what it calls a dAPI: just like a dAPP, it is an application deployed at each node, but one that exposes an API rather than a user interface.

What about data distribution and storage?

Billions of sensors generate huge amounts of data, and any company using IoT sensor data has its own systems for processing it and is most likely not inclined to replace them. This means the Team cannot impose a new data storage system on anyone. More importantly, it is not the goal of the platform to store all IoT sensor data for eternity.

Built into the dAPI are connectors that integrate with the leading IoT and big data storage vendors, leaving the buyer the choice of where their data is sent. There is, however, a valid use case for blockchain in the storage of the data: its immutability and timestamping capabilities are worth something. On the one hand, DataBroker DAO allows batches of data sent to non-blockchain repositories to be anchored on the Ethereum mainnet (using the Chainpoint spec).

On the other hand, as an extra offering, there are connectors that put the data directly into a hosted and shared Multichain (1,500 tx/s) or BigchainDB (up to 1 million tx/s) network. The current beta uses the Multichain connector exclusively to store the data.
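To illustrate the anchoring idea, the sketch below hashes a batch of readings into a single digest that could later be committed on-chain. It only shows the hashing step, uses Node's built-in crypto module rather than the actual Chainpoint tooling, and the reading fields are assumptions:

```typescript
// Simplified anchoring sketch: one SHA-256 digest per batch of readings.
import { createHash } from "crypto";

interface Reading {
  sensorId: string;
  timestamp: number; // Unix epoch, seconds
  value: number;
}

function batchDigest(readings: Reading[]): string {
  const payload = JSON.stringify(readings);
  return createHash("sha256").update(payload).digest("hex");
}

// The hex digest is what would be anchored on the Ethereum mainnet as proof
// that this exact batch existed at a given point in time.
```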

What is the DTX Token initial value?

The goal is to have 1 DTX token cover the average value of the data from one sensor for one week. This gives us enough granularity (at 18 decimals) to work with micropayments, even after significant growth and price increases. DataBroker DAO determines the corresponding price per token by looking at the market predictions for 2024: at that time, the Team projects 2.5 billion USD flowing through the platform for 225 million sensors.

The average sensor thus has a value of ~12 USD per year, ~1 USD per month, or ~0.25 USD per week, and the value of 1 DTX token should initially equate to that weekly figure. DataBroker DAO sets the maximum number of tokens issued at 225 million, the projected number of sensors on the platform in 2024.
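The arithmetic behind those numbers, using the projections quoted above (figures rounded, for illustration only):

```typescript
// Back-of-the-envelope check of the initial token value (illustrative).
const projectedUsdFlow = 2_500_000_000; // USD through the platform in 2024
const projectedSensors = 225_000_000;   // sensors on the platform in 2024

const usdPerSensorPerYear = projectedUsdFlow / projectedSensors; // ≈ 11.1 USD (~12 USD rounded)
const usdPerSensorPerWeek = usdPerSensorPerYear / 52;            // ≈ 0.21 USD (~0.25 USD rounded)

// With 1 DTX pegged to one sensor-week of data, 1 DTX ≈ 0.25 USD at launch.
console.log(usdPerSensorPerYear.toFixed(2), usdPerSensorPerWeek.toFixed(2));
```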

What is the business relationship between SettleMint and Databroker DAO?

SettleMint is the parent company: it developed DataBroker DAO and is now spinning it out into a separate company, in combination with the token sale to fund it. SettleMint is registered both in Belgium and in the UAE; DataBroker DAO is associated with the UAE branch.

DataBroker DAO was conceptualized at SettleMint in late 2016. The dynamics of the market and the opportunities it brings immediately sparked the development of the first proofs of concept. Since the first beta was completed in February, it has been taken on the road to trade shows, pitch competitions, and blockchain challenges across the world to test its market viability.

Right now the Team is focusing on DataBroker DAO. Getting the platform ready and onboarding new team members is a time-intensive job. As new people are hired, they will be assigned, depending on their skills, to either DataBroker DAO or SettleMint projects. There is some overlap now, but the goal is to move to fully independent teams, sharing only the board of directors.

About SettleMint:
SettleMint is a Belgium-based startup focused on creating tools that make building blockchain applications easy for any IT team. All its work and R&D is encapsulated in a distributed middleware called Mint, which consists of four SDKs: Notary, which deals with recording information on blockchains as well as on IPFS and Swarm; Provenance, for supply chain tracking; Ballot box, for voting; and last but not least, Marketplaces, for functionality ranging from tokens to exchanges of digitally traded products. Mint supports a wide range of public and private blockchain solutions such as Ethereum, Bitcoin, Multichain, BigchainDB, and the Hyperledger projects.

Mint is used in DataBroker DAO: the Marketplaces SDK and its smart contract templates are used for the marketplace part of the project, while the Notary SDK is used for archiving and sharing the data. A yearly license fee will be paid out of the platform's revenue as compensation.
