DataBroker DAO mid-year 19′ update by CEO Matthew Van Niekerk

ABSTRACT

These last months have been very exciting because the market demand for data monetization has been confirmed. At the same time, we did not have the resources to showcase these steps forward to our supporters. By Q4 2019 we plan to bring 18 months of commercial efforts and a completely new platform into production.

WORDS FROM THE CEO

Before getting to the updates, I would like to thank everyone in our community for supporting the DataBroker DAO project and for openly sharing your ideas and recommendations.

The evolution of the DTX exchange listings over the past several weeks, and the requests for transparency about our achievements and platform roadmap, made us realize that we have to work more closely and transparently with our community.

I take responsibility for letting this communication gap form and for not acting sooner to address our community’s expectations. Over the last 18 months, we have invested heavily in the development of the DataBroker DAO platform, hired additional people and prepared the platform for commercialization. We have been driving business growth, increasing the number of members in our alliance and measuring positive market traction. I’m very proud of the team’s achievements. Our top priority, set from the start, is to realize the value of DataBroker DAO by accelerating adoption by data buyers and data sellers in the market. Through this update, I want to address the community’s concerns and confirm that we are going to production with the DataBroker DAO platform in Q4 2019, and that we have a planned approach for an effective community communication strategy going forward.
Matthew van Niekerk, CEO

 

DTX EXCHANGE LISTING

Back in 2018, we applied to list the DTX token on 15 exchanges, including several top-10 exchanges. We succeeded in listing the DTX token on CoinFalcon, Tokenjar and Probit. The online application processes of some of the other exchanges are unfortunately very ineffective: most of the time, we get no response at all from the platform’s owners. Recently, CoinFalcon delisted the DTX token as part of their general removal of altcoins. Despite having a contractual relationship with their clients, CoinFalcon made this decision without any communication or notification.

Our lessons learned

We have learned that DataBroker DAO and its community have to work hand in hand, and that we have to convince the top-10 exchanges to list the DTX token. Exchanges listing the DTX token without communication and/or a formal agreement with DataBroker DAO are a risk to the long-term stability of the token.
We need to closely follow the quality and reputation of the exchanges listing DTX.

What is our immediate priority?

We are reaching out through our network to establish relationships with decision makers at reputable exchanges. This will take some time, but it should get us past the unresponsive online application processes.

Bit4You, the first and only fully regulated Belgian exchange with fiat on- and off-ramps, has been selected as a trustworthy, high-quality exchange and will list the DTX token within a few weeks. We do not expect immediate liquidity for the community, but it will bring trust to our clients. Secondly, we are analyzing the exchanges proposed by the community and will evaluate the most relevant, reputable and trustworthy ones for listing DTX. See below for the current status of the applications.

Current Exchange Application Status

Upcoming

  • Bit4you – expected listing in July

Ongoing

  • Yobit – application and fee sent, awaiting approval (estimated 1 week)
  • Idex – call scheduled this week
  • Probit – to keep being listed, need to invest in market making, contract negotiations in progress

Application in progress:

  • Bittrex – re-application in progress, they require a lot of new documents and verifications.

No response yet:

  • Beaxy – expect answer beginning of July
  • io – submitted before, no response, resubmitted now
  • Gate.io – submitted before, no response, resubmitted now
  • Bitmax – submitted all information, awaiting a response
  • Upbit – submitted all information, awaiting a response

No listing:

  • Kucoin – listing fee is unaffordable
  • Livecoin – listing fee too expensive for the size of the exchange

Support request to our community

We kindly ask our community to support the listing process by putting us directly in touch with the right contacts at these exchanges, for example at Huobi, where any help would be greatly appreciated.

 

ACHIEVEMENTS 2018

We delivered the first stable version of the DataBroker DAO platform and integrated support for DTX tokens. This version allowed us to test market interest at multiple worldwide events where we promoted the DataBroker DAO brand.

Platform

The dAPI and smart contracts have been completely re-engineered from the ground up with no visual impact for platform visitors. In March, we published the platform V2.0 Alpha 1 “Discovery Release” with support for a token-curated registry with DTX token staking to ensure the quality of the data being sold and purchased. We implemented a brand new dApp focused on the discovery of interesting data sets, allowing potential buyers to easily search for and find the data they need. The platform V2.0 Alpha 2 “Purchase Release” focused on improving the frontend interface, adding the possibility to search for data on maps (e.g. 30,000 sensors from Luftdaten) and to purchase sensor data with DTX tokens.

Visibility

We participated in global events to promote the DataBroker DAO brand and were blown away by the market interest. Articles were published in Techbullion, Jinse.com, Steemit and Crypto Rich, and we saw very positive reviews from Cryptoworld, Coinidol, Onyeri, Artyom, Rel Crypto and others. We published articles on the use of DataBroker DAO for Smart Cities, Agriculture and Smart Locations.

We organized roadshows in multiple countries and got top honors presenting DataBroker DAO in San Francisco, Abu Dhabi, Dubai, Luxembourg, Cayman Islands, New York, Belgium, Germany, Hong Kong, Sydney …

Alliance

In early 2018, we launched the DataBroker DAO Alliance, which consists of four types of stakeholders: sensor owners, data buyers, gateway operators and data processors. Starting with 7 members, including Yutix (agriculture and environmental data), Technilog and IdentityMind, we quickly grew to more than 20 members, including global gateway operators like Ericsson, Siemens, Panasonic, Sigfox and Actility.


 

ROADMAP 2019

Value proposition and business model

The DataBroker DAO business model was thoroughly reviewed in Q2 2019, bringing more value to our clients and opening additional market opportunities for our project:

  • White labeling: next to our own brand, major data sellers will be able to adopt our platform, using their own branding. All platforms will be interoperable and will use DTX tokens for sales;
  • Data Exchange Controller Platform (DXP): we have extended the platform with a new feature allowing companies to safely share data between multiple parties. We measured high market traction for it in the healthcare, connected-car and smart-city sectors.

Platform evolution and commercial release roadmap

In Q2 2019 we adapted the marketing website to generate data-buyer and data-seller leads (“Lead Generation”), which are handled by our sales team (“matchmakers”). Currently, we are capturing one potential data buyer and one data provider per day through the platform.


Q2 2019 – “Lead Generation”

The V3.0 “Pre-Release” scheduled for Q4 2019 will be the first commercial version in which data buyers and data sellers can find each other autonomously. Over the last weeks we dedicated our efforts to intensive workshops to create an ideal customer journey and a disruptive user experience on the platform. The newly defined UX wireframes will replace the current DataBroker DAO website and bring new functionalities:

  • Search Data, for searching data based on data sources, data tags and data providers
  • Communities, displaying the list of available data sources and data providers per community theme (e.g. environment, geographics …)
  • Data Use Cases, inspirational and informational articles explaining the benefits of using data (e.g. crop monitoring to optimize cultivation, data sharing for smart cities, tree monitoring to support climate adaptation and mitigation …)
  • Data Provider Adverts, allowing data sellers to promote the type of data they offer for sale
  • User Dashboard, with rich features like Follow, Matches, Deals, Notifications, Messages and Account management


Q4 2019 – V3.0 “Pre-Release”

Data Sellers onboarding

From July to September 2019, we will onboard 3 data sellers that will participate in a V3.0 pilot phase before going to production:

  • Data seller 1: Geolocalized data provider (Belgium)
  • Data seller 2: IoT Platform (France)
  • Data seller 3: Global environmental data provider

 

COMMERCIAL TRACTION

Regarding DataBroker DAO’s commercial traction: we have more than 300 opportunities in our CRM system and are actively working on more than 50 of them simultaneously. About 15 of these clients are currently at the proposal stage, and we are negotiating MoUs and contracts with several of them. For some, we have drafted and proposed joint press releases, but please understand that we will not communicate until these are agreed.

  • Opportunity 1: global car manufacturer for monetization of connected car data
  • Opportunity 2: white labeled data marketplace for a telecom operator in the Middle East
  • Opportunity 3: white labeled data marketplace for a Smart City in Europe
  • Opportunity 4: 5G Life campus / development of a data sharing and monetization solution for Smart Buildings with Ericsson / Securitas / Signify / Degetel

Whether or not these are covered by an NDA, we are not willing to communicate prematurely about these opportunities, for two reasons:

  1. We want to avoid creating false expectations in the community;
  2. The partner organizations are mostly large international players in manufacturing or telecommunications, which often do not want to communicate their plans with DataBroker DAO for competitive reasons.

 

TEAM UPDATE

In 2018, our team grew with Thomas, Silke and Peter-Jan. Our team currently consists of 12 people, having recently welcomed Adrien (developer) and Vincent (DataBroker DAO product owner).

We will onboard 3 additional members in July 2019 to accelerate our go-to-market and the commercialization of DataBroker DAO.

“My main objective is to get the platform to the next level. In the past, I worked on multiple large IoT projects, and the data marketplace story is not new to me: I have been in contact with worldwide organizations searching for exactly this solution. Early this year I met Matthew and Roderik, and I was amazed by the job they have done so far, so I decided to join the team and the community.” – Vincent Bultot (Client Solutions Manager)

“I’m a software engineer passionate about current web technologies. I joined SettleMint because I wanted to work with blockchain technology daily. I want to be part of the revolution that will make a decentralized, secure and transparent internet part of everyday life.”
– Adrien Blavier (Blockchain Developer)

 

We also hired a new community manager who will help us bring more transparency about our development and opportunities to our community.

“As one of the earliest supporters of DataBroker DAO, I have been following this project very closely since the beginning, staying in contact with the team and being active in the community. Knowing all the business work the team has done so far, I want to communicate transparently about the worldwide potential of this platform and satisfy our community by pushing DataBroker DAO to the next level.” – Afro (Arnel Dulbic)

 

PATH FORWARD

We are eager to work together with our community in a constructive and efficient way, with respect for the confidentiality agreements we have with our clients.

DataBroker DAO is on track. Over the last months, we have reviewed our business plan and generated multiple opportunities in various segments, which confirms the market traction and our unique market position.

We are very excited about the platform V3.0 launch in Q4 2019, which will provide a completely new and disruptive user experience with many of the functionalities we were missing to push the full commercialization of DataBroker DAO and help organizations monetize their data thanks to the DTX token.

DataBroker DAO now ready to rumble

At DataBroker DAO, we keep our word. As promised in our previous blog post, we are happy to announce that as of today our IoT sensor data marketplace is now up and running.

Are you tech-savvy and interested in the ins and outs of our platform? Then we warmly recommend this elaborate tech talk, expertly written by Peter-Jan Brone, one of our blockchain developers.

Meanwhile, we are feverishly onboarding our alliance members. This will soon result in a marketplace with valuable sensor data that can be bought and sold.

And to further promote our platform, we will be present at numerous fairs, expos and other events in the months to come. Should you be attending the Smart City Expo World Congress in Barcelona (13–15 November), which combines the latest solutions and technologies with inspirational keynotes on improving life in cities, you are very welcome to pay us a visit at the Flanders Investment & Trade booth at the Fira de Barcelona.

Join our Telegram channel for all information and questions or drop us a line: hello@databrokerdao.com

Enterprise Readiness

[Written by Peter-Jan Brone]

After months of hard work we are proud to announce that DataBroker DAO is now running on the new version of our Mint middleware solution. The new version is simply called Mint 2.0 because as we are developers we’d very quickly come up with crazy names that seem funny at first but are eventually just a nuisance to explain to new joiners. I kid you not: I once refactored a micro service and we ended up renaming it to ‘Natalie’, after Natalie Portman, for no reason other than the apparent one.

The diehard followers of our project will know that not so long ago we raised money through a token sale, and we are still very grateful to all the people backing our project and giving us the opportunity to keep working and building out our Data Exchange platform. In light of this we felt it was our responsibility and an opportune time to go a little deeper into the inner workings of the backend now that our End of Summer release is out. Over the past 3 months most work has sunk into re-engineering the tech on which our platform is built. I won’t go into detail about the numerous improvements to the frontend, which were discussed in previous posts already.

TL;DR: Mint 2.0 is bigger, badder and bolder; and DataBroker DAO’s Data Exchange platform is now running on it. And by the way you might say “boy this post is way too long”, but I looked it up and apparently I only overextended the ideal blog post length by 1.4%.

 

Some history

As background on what exactly Mint is, I’d like to refer you to the following overview of its functionalities. In short, Mint is a blockchain-agnostic layer that sits on top of the blockchain and acts as both an indexer and an adapter layer for all in- and outgoing traffic to and from the blockchain. Mint 1.0 was built using NodeJS and mostly relied on the well-known Ethereum JavaScript API web3.js, in combination with MongoDB 3.2 and IPFS as data storage layers. One of the main issues we faced with this setup was that we needed to maintain an enormous number of watchers in the web3.js library, one for every sensor deployed to the system. We discovered, however, that these watchers were suffering from a memory leak. For us, this was a good moment to mature the entire stack and move to newer, more cutting-edge technologies.

The new stack

One of our first decisions was to rewrite our entire codebase in TypeScript which, as many of you know, is a syntactical superset of JavaScript that adds static typing and thus type safety.

Many articles have been written about this already so it’s not very meaningful to go into great detail here. Suffice it to say that programming in TypeScript is bliss, especially for engineers with an object oriented programming background. Once you’ve worked in TypeScript, there’s just no going back, up to the point that even looking at plain Javascript hurts my eyes every time I see it.
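
For readers who have not used it: the gain is that shape errors surface at compile time instead of in production. A minimal, hypothetical sketch — the sensor-reading type below is illustrative, not one of Mint’s actual models:

```typescript
// Hypothetical sensor-reading type; names are illustrative, not Mint's models.
interface SensorReading {
  sensorId: string;
  value: number;
  timestamp: number;
}

// The compiler guarantees every reading passed in has the right shape.
function averageValue(readings: SensorReading[]): number {
  if (readings.length === 0) return 0;
  const total = readings.reduce((sum, r) => sum + r.value, 0);
  return total / readings.length;
}

// averageValue([{ sensorId: 1 }])  // <- would be a compile-time error
const avg = averageValue([
  { sensorId: "lux-001", value: 10, timestamp: 1536000000 },
  { sensorId: "lux-002", value: 20, timestamp: 1536000060 },
]);
```

In plain JavaScript the commented-out call would happily run and produce `NaN` somewhere downstream; in TypeScript it never compiles.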

Ethers

After being stuck in limbo for the past year between web3 v0.2, with all of its known issues, and v1, which is still in beta and thus potentially unstable, we were glad to see a new library gaining traction. When we discovered our watchers were leaking memory, we decided to move once and for all and rewrote the whole thing using the ethers.js library. After a couple of months of working with it, and having encountered numerous corner cases that made me dig deeper into its lower-level API, I can say it’s very readable and easily extendable. Kudos, by the way, to Richard Moore, who is behind the project and very actively tackles open GitHub issues on a daily basis.
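
The pattern that fixes a watcher leak is less about the library than about disciplined teardown: every registration hands back an unsubscribe function, so a sensor removed from the system also releases its watcher. A minimal sketch using Node’s plain EventEmitter (illustrative only, not ethers.js’ actual API):

```typescript
import { EventEmitter } from "events";

// Stand-in for the stream of on-chain events; illustrative only.
const chainEvents = new EventEmitter();

// Register a per-sensor watcher and return its teardown function, so the
// caller can never "lose" the handler reference needed to unregister it.
function watchSensor(
  sensorId: string,
  onReading: (value: number) => void
): () => void {
  const handler = (id: string, value: number) => {
    if (id === sensorId) onReading(value);
  };
  chainEvents.on("reading", handler);
  return () => chainEvents.removeListener("reading", handler);
}
```

With one watcher per deployed sensor, forgetting the teardown half of this pair is exactly how listener counts (and memory) grow without bound.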

RabbitMQ

Mint 1.0 relied on mostly synchronous behaviour, which means that when you interacted with the blockchain through our REST API you were bound by the default 60s Nginx timeout window before your calls ended up in gateway timeouts. We also felt very limited in the number of open connections our servers could handle at any point in time. If some requests stalled a little, this number became very unpredictable and the overall system ended up rather unstable during moments of heavy load. As we aim to become an enterprise-level data marketplace for worldwide IoT sensor data, this obviously was not an option for us.

By rewriting all our endpoints to handle requests asynchronously, meaning that POST requests immediately return a request identifier for which you can poll the response later on, we brought stability back to our systems, even when the platform is under stress. This comes in very handy, as not all blockchain applications are a continuous stream of transactions that need to be sent to the chain. On the contrary, that traffic is more often than not very bursty, and using this technique we managed not only to buffer incoming requests but to optimise the processing to the point where we can fit more transactions into a single block, thus increasing our overall TPS.
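
The submit-then-poll flow can be sketched in a few lines. This is a hypothetical, in-memory stand-in for the real queue and database, purely to show the shape of the pattern:

```typescript
import { randomUUID } from "crypto";

type RequestRecord = { status: "pending" | "done"; result?: string };

// In-memory store standing in for the queue + database; illustrative only.
const requests = new Map<string, RequestRecord>();

// POST-style submit: record the job and answer the caller immediately
// with an identifier, instead of holding the connection open.
function submitTransaction(payload: string): string {
  const id = randomUUID();
  requests.set(id, { status: "pending" });
  return id;
}

// A queue worker would call this once the transaction is actually mined.
function completeRequest(id: string, result: string): void {
  requests.set(id, { status: "done", result });
}

// GET-style poll: the caller checks back using the identifier.
function pollRequest(id: string): RequestRecord | undefined {
  return requests.get(id);
}
```

The caller never waits longer than a map lookup, so a stalled transaction can no longer tie up a connection until the gateway times out.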

Aside from enabling a rework of the codebase into a fully decentralised architecture, we ended up containerising every service, simply because it felt natural to do so. We also made synchronous communication between consumers possible by implementing RPC calls using simple direct replyTo queues. Obviously these should be treated with a little caution, but they are extremely useful for nonce tracking, for example. This allowed us to build a nonce-tracking microservice that keeps track of your nonce on multiple networks, or even when you are interacting with multiple Mint instances at a time, again increasing the overall TPS as we correctly predict nonces without running into too many invalid-nonce errors. For the most part, though, our workers are small snippets of code that execute on certain data and republish messages to a wide array of queues for further processing, which means we can scale horizontally.
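
Nonce tracking itself is conceptually simple: one monotonically increasing counter per (network, address) pair, handed out from a single place so parallel workers never reuse a value. A toy sketch — the names and seeding logic are illustrative, not our production service:

```typescript
// Toy nonce tracker: one counter per (network, address) pair. The real
// service sits behind an RPC replyTo queue; this sketch only shows the logic.
class NonceTracker {
  private nonces = new Map<string, number>();

  // Seed from the chain's transaction count the first time an address is
  // seen on a network, then hand out strictly increasing values.
  next(network: string, address: string, onChainCount = 0): number {
    const key = `${network}:${address.toLowerCase()}`;
    const current = this.nonces.get(key) ?? onChainCount;
    this.nonces.set(key, current + 1);
    return current;
  }
}
```

Because every worker asks the same tracker, two transactions for the same address can never be signed with the same nonce, which is exactly the class of invalid-nonce error this avoids.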

Another great technology we implemented, and one that goes hand in hand with AMQP, is MQTT. Coincidentally, this protocol was designed for communication between IoT devices, but we just use it for the slimmed-down pub/sub messaging protocol it is. The RabbitMQ plugin does little more than host a socket on which frontends can subscribe to topics they are interested in; we, however, ensure that our entire stack religiously communicates every interaction with the blockchain, tagged with your request id for example. Now, if you use your imagination and combine this with React and Redux, I’m sure you can see just how powerful this solution really is.
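
The selective part of MQTT is the topic filter, where `+` matches exactly one level and `#` matches all remaining levels. A minimal matcher sketch showing how a frontend subscription selects messages — this is not RabbitMQ’s implementation, just an illustration of the semantics, and the topic names are made up:

```typescript
// Minimal MQTT-style topic matcher: "+" matches one level, "#" matches
// everything from that level down. Illustrative, not RabbitMQ's code.
function topicMatches(filter: string, topic: string): boolean {
  const f = filter.split("/");
  const t = topic.split("/");
  for (let i = 0; i < f.length; i++) {
    if (f[i] === "#") return true;           // swallows all remaining levels
    if (i >= t.length) return false;          // topic ran out of levels
    if (f[i] !== "+" && f[i] !== t[i]) return false;
  }
  return f.length === t.length;               // no trailing topic levels left
}
```

A frontend subscribed to something like `requests/+/status` would then receive status updates for every request id without subscribing to each one individually.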

Mongo 4.0

What can I say … I freakin’ love Mongo. We decided to step away from ODMs like Mongorito or Mongoose and write all our queries raw using the incredibly awesome aggregation pipeline. I remember when WiredTiger was first introduced, I was amazed at the power of the engine, and with versions 3.6 and later 4.0 they have blown me away with the capabilities of their tech. One of the great additions in the latest version are change stream cursors, which enable very fine-grained listeners on all your Mongo collections, triggering the necessary queues and updating whatever is required.
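
To make the “raw queries” point concrete, here is what a small pipeline looks like, together with a plain-TypeScript equivalent of what its `$match` and `$group` stages compute. The collection and field names are made up for illustration; the real pipelines are of course richer:

```typescript
// Sketch of a raw aggregation pipeline as passed to collection.aggregate();
// collection and field names are illustrative.
const pipeline = [
  { $match: { status: "done" } },
  { $group: { _id: "$sensorId", purchases: { $sum: 1 } } },
];

// Plain-TypeScript equivalent of those two stages, for intuition:
interface Purchase {
  sensorId: string;
  status: string;
}

function countDonePurchases(docs: Purchase[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const d of docs) {
    if (d.status !== "done") continue;                         // $match
    counts.set(d.sensorId, (counts.get(d.sensorId) ?? 0) + 1); // $group + $sum
  }
  return counts;
}
```

The difference in practice is that Mongo runs the pipeline server-side over indexes, instead of shipping every document to the application as an ODM would.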

Webhooks

I think it was a Friday in August when Roderik opened up a medium prio ticket stating “Hey don’t you think it would be cool to add webhooks?”. Well by Saturday evening we had implemented three different types of webhooks, so users of Mint can now subscribe to:

  • Events that get triggered
  • Addresses that are interacted with
  • Timed hooks that relay the user’s wallet balance for example
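
Modelled as a discriminated union, the three kinds look roughly like this. The shapes are illustrative, not Mint’s actual webhook schema:

```typescript
// Illustrative shapes for the three webhook kinds; not Mint's real schema.
type Webhook =
  | { kind: "event"; eventName: string; url: string }
  | { kind: "address"; address: string; url: string }
  | { kind: "timed"; intervalMs: number; url: string };

// The switch is exhaustive: adding a fourth kind without handling it here
// becomes a compile-time error, which is the point of the union.
function describe(hook: Webhook): string {
  switch (hook.kind) {
    case "event":
      return `POST ${hook.url} when ${hook.eventName} fires`;
    case "address":
      return `POST ${hook.url} on interaction with ${hook.address}`;
    case "timed":
      return `POST ${hook.url} every ${hook.intervalMs} ms`;
  }
}
```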

ERC20 token bridge

I guess you can say this token bridge we’ve built is part of our new stack as well. We need it to let the marketplace scale while still maintaining a real-life use case for our DTX token on mainnet. It mitigates scaling issues because it enables the use of a private PoA network, where you decide the gas cost and block times yourself. Furthermore, it protects you against the congestion that we see happening on mainnet from time to time.

It’s actually pretty cool; I’ve pulled this schematic from somewhere on the internet describing the deposit flow. In short, it’s basically just 2 contracts on 2 different chains that communicate with each other through events, which a couple of validator nodes listen for and sign requests against, using seeds whose addresses the contract on the mainnet knows. Read that twice; it’s really not that difficult once you get how it works. Implementing it, however, is a real PITA, but that’s just because… blockchain.

What’s next

All of the above doesn’t even describe half of what we’ve been doing, but the rest is top secret and we will never tell you. Most of all we’re glad to have the ERC20 token bridge in production, allowing everyone to transfer DTX tokens from the mainnet onto our private PoA network and making the DTX token truly useful and usable on our platform. With that I’d like to welcome every data sensor owner or buyer to start interacting with it. Withdrawals are disabled for the time being.

Now I hear you all thinking … why is this blog post titled Enterprise Readiness? Well we, the DataBroker DAO team, feel that with this End of Summer release we are finally ready to welcome enterprise level players onto the system. If you have data just sitting there, collecting dust in your data silos, please don’t hesitate to contact our admins. We will guide you through the enlisting process and answer any questions you may have so you can turn that pile of bits into a big pile of gold, DTX tokens, whatever you want to call it!

Oh, and I was kidding about the withdrawals by the way, of course we made these possible too!

https://youtu.be/vAny5zIWk3s

Stay tuned, because winter is coming and we plan on working even harder, now there’s no real reason to go outside anymore.