2016: Our year in tech at Data Republic

What a year!

For me as CTO at Data Republic, this year felt like a whirlwind, one that ended with our platform established. With our Series A capital secured, we could pursue our dream and develop technology that makes it simpler and more secure for organizations to monetize their data assets, for data scientists to create amazing new products from that data, and for users to enrich their knowledge of their customers, their industry, and their business.

Looking back on the year that was, I couldn’t be more proud of what our product, development and infrastructure teams have accomplished. Each major component of the Data Republic platform had to be broken down into smaller scopes, user journeys, technical architectures, QA methodologies, and much more. On top of that, we had to make some serious decisions about underlying infrastructure, languages, cloud vendors, and the like. Without overloading you with detail about all the individual components of our platform, which number in the thousands, I wanted to share some of our major achievements and lessons from 2016:

What we built:

  • ‘Private by Design’ Personal Information extraction, encryption and tokenization technology, built alongside Westpac’s Databank. This PII (personally identifiable information) management technology allows our Data Contributors to securely contribute attribute-level data within the Data Republic environment without compromising consumer privacy.
  • Beta Release of the Data Republic platform – A first of its kind data exchange governance solution which provides the workflow technology and secure environment for our Data Contributors and users to communicate and manage data exchange projects together.
  • Data Brokerage Fulfillment Technology – Encompassing the technical implementation of our world-first multi-lateral legal framework for data exchange, our Open Data Marketplace and secure file transfer and management systems.
  • Secure Analytics Workspaces – We developed an Analytics Workspace solution, making ours the first data exchange that can transform data during transmission using nearly any tool in the analytics ecosystem (such as Hadoop, Spark, R, H2O, Talend, and various ML systems). This enables Analytics Infrastructure as a Service, Data Analytics as a Service, and what I like to lovingly call a Data DMZ: a place for two organizations to collaborate on data where neither organization can take the data out, while maintaining audit, lineage, and governance of that data.
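To give a feel for the ‘Private by Design’ tokenization mentioned above: one common approach (a minimal illustrative sketch only, not Data Republic’s actual implementation — all names here are hypothetical) is keyed-hash tokenization, where each PII field is replaced with a deterministic HMAC-derived token. Records can still be joined on the token, but the raw identifier cannot be recovered without the key.

```python
import hmac
import hashlib


def tokenize(value: str, secret_key: bytes) -> str:
    """Replace a PII value with a deterministic keyed-hash token.

    The same value always maps to the same token (so datasets can be
    joined on it), but the original value cannot be recovered without
    the secret key.
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()


def strip_pii(record: dict, pii_fields: set, secret_key: bytes) -> dict:
    """Return a copy of the record with PII fields tokenized,
    leaving attribute-level data untouched."""
    return {
        field: tokenize(value, secret_key) if field in pii_fields else value
        for field, value in record.items()
    }


# Example only: in practice the key would live in a secure key service.
key = b"example-only-key"
record = {"email": "jane@example.com", "postcode": "2000", "spend": 124.50}
safe = strip_pii(record, {"email"}, key)
# safe["email"] is now an opaque 64-character hex token; the non-PII
# attributes ("postcode", "spend") pass through unchanged.
```

Because the tokens are deterministic per key, two contributed datasets tokenized under the same key can still be linked on the token column without either party ever seeing the underlying identifiers.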

What we messed up and learnt from:

Team growth was both a necessity and a challenge for us in year one as our product, development and infrastructure teams worked together to establish team working processes and momentum. Feature development was slower than anticipated while we “failed fast” in the process of establishing our platform.

Coming off our minimum viable product (MVP) and attempting to use it as the base for our actual platform was our first major misstep. To head off development delays and ensure our infrastructure was scalable and ‘future-proof’, we rapidly moved to a microservices architecture, allowing the various components to be enhanced without a full rework of the code base.

After releasing our beta platform and landing on a scalable solution, we determined that it made the most sense to restructure teams around our customer and user needs. Recently, we’ve therefore vertically aligned the product management, development, and commercial sales teams to handle the unique requirements of those that are monetizing data (Data Contributors) and those that consume data (Partners and Users).

It’s always been our priority to deliver the right tools to users within our cloud analytics workspaces, but we ran into delays and unnecessary manual setup work on our first few data exchange projects. We’ve learned that scalable big data platforms (such as Hadoop, Spark, et al.) are the way to go, and we’ve taken steps to ensure this type of architecture is embedded beneath the governance and marketplace aspects of the overall platform.

2017, here we come!

By the end of next year, we aim to have exponentially expanded the feature capabilities of the platform, having laid the required architectural foundations in 2016. With the team now settled, we are entering our ‘storming’ era, and with a solid base to build on, I see feature enhancements, feature creation, and the user experience improving dramatically.

In 2017, I hope to spend more time with our customer base to ensure that we’re building technology which delights on functionality, experience and security.

If you have ideas or feedback, my door is always open.

Looking forward to working together next year.