In March of this year, the Inverse Finance DAO voted to launch the Analytics Working Group (AWG) to better support our growth objectives. The main goal of the AWG is to let members and users seamlessly access the data they are interested in and use it in the environment of their choice. We won't detail here how better data management leads to better-informed decisions, but given the range of tools and skills involved in a DAO and in crypto, achieving this can be a challenge.
Moving To a Decentralized Team Model
There are three prevalent models for building a data team: centralized, decentralized, and federated (you can read more on the Castor blog).
Organizations usually move through these three models as they mature and evolve in their relationship with data:
Centralized: usually leads to a centralized data "platform", where the data team has access to all the data and serves the whole organization across a variety of projects.
Decentralized: each department hires its "own" data people, with a centralized data platform.
Federated: data people are embedded in business units, but a centralized group that provides leadership, support, and training remains.
For a DAO, moving beyond the first model can be challenging because the required skill set is very broad: you need people who understand the mechanics of various protocols and crypto markets, on-chain data structures, and smart contracts; who can dedicate time to testing and mining data; and who can serve and advise a wide range of clients: holders, protocol users, DAO departments, researchers, media, and business partners.
The best way to overcome this skill-set limitation and better serve the DAO is to provide it with data warehouses where already-specialized users can access preprocessed data and bootstrap research or analysis on their own: in other words, moving to a decentralized model where users gain autonomy in accessing quality data.
Finally, the gap with the federated model can be bridged by providing complete documentation and by educating users on how to fetch and use the data, allowing them to grow and learn independently while owning the tools they use to process information.
Working with Data On-chain
For any researcher, a blockchain is an incredible source of data and insights available for free, provided you have the means to harness it.
A DAO needs more than just ownership of its data: the processes for acquiring and distributing that data must be resilient and accessible to very different profiles, ranging from technical users to DAO members or simply curious website visitors.
Dune Analytics (https://dune.com) does an amazing job of presenting on-chain data in a friendly manner through SQL queries and dashboards, which is legitimately why our analytics journey started there. We have used it to produce numerous dashboards monitoring Frontier protocol events, governance activity, and much more (https://dune.com/naoufel/inverse-dao), as well as to assess risks arising from new situations on the fly and to answer specific treasury questions.
Beyond this point, we realized there were other ways to make our data more widely available, notably by using The Graph: a protocol providing decentralized infrastructure that indexes on-chain data in a structured way. We have used it to produce two subgraphs exposing data from the Frontier protocol and from our on-chain governance activity. (https://inverse.finance/analytics)
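To give a flavor of how subgraph data can be consumed programmatically, here is a minimal sketch using only the Python standard library. The endpoint URL and the `proposals` entity and its fields are hypothetical placeholders for illustration, not the actual schema of our subgraphs:

```python
import json
import urllib.request

# Hypothetical subgraph endpoint -- replace with a real deployment URL.
SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/example/example-subgraph"

# GraphQL query: the `proposals` entity and fields are illustrative only.
PROPOSALS_QUERY = """
{
  proposals(first: 5, orderBy: startBlock, orderDirection: desc) {
    id
    proposer
    startBlock
  }
}
"""

def build_request(url: str, query: str) -> urllib.request.Request:
    """Wrap a GraphQL query in the JSON envelope subgraph endpoints expect."""
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Network call kept out of module scope so the sketch stays importable.
    with urllib.request.urlopen(build_request(SUBGRAPH_URL, PROPOSALS_QUERY)) as resp:
        print(json.loads(resp.read())["data"])
```

The same request shape works from any language or BI tool that can issue an HTTP POST, which is what makes a subgraph endpoint easy to plug into existing analytics stacks.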
Using this combination of tools, we first remain flexible in producing one-off analyses or presentations dedicated to a specific topic, and second can provide a powerful API endpoint that can be plugged into any data analytics software, not only by the team to provide analytics or display information on the UI but also by third parties. This benefits the DAO in three ways:
Improving transparency, allowing external researchers or users to investigate the data and oversee our activity;
Increasing protocol use, since programmatic access to protocol data in a standardized way incentivizes more people to use Frontier;
Optimizing productivity for the DAO as a whole, by providing a data source that members with different business use cases can access and mine.
Growing and Learning As a DAO
A way to leverage analytics beyond studying historical data is to monitor on-chain activity in real time, allowing a quick response to various operational or market risks. Our AWG has been building and operating an in-house alerting system that tracks specific transactions happening on designated smart contracts or oracles on-chain. It was designed in cooperation with the Risk Working Group to ensure it tracks the operations relevant to them. There are of course many ways to improve this system in terms of infrastructure, algorithms, the application of machine learning models, and so on. We are really excited about exploring those options in the future.
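To illustrate the general pattern, not our actual implementation, here is a minimal sketch of such an alerting pipeline: a pure predicate decides which observed events cross an alert threshold, a formatter builds a Discord webhook payload, and a sender posts it. The event fields, the threshold, and the webhook URL are illustrative assumptions:

```python
import json
import urllib.request

# Illustrative threshold: alert on transfers above this many tokens.
ALERT_THRESHOLD = 100_000

def is_alertable(event: dict, threshold: int = ALERT_THRESHOLD) -> bool:
    """Pure predicate: does this (hypothetical) event warrant an alert?"""
    return event.get("name") == "Transfer" and event.get("amount", 0) >= threshold

def format_alert(event: dict) -> dict:
    """Build a Discord webhook payload ({"content": ...}) for one event."""
    return {
        "content": (
            f"Large {event['name']}: {event['amount']:,} tokens "
            f"in tx {event['tx_hash']}"
        )
    }

def post_alert(webhook_url: str, payload: dict) -> None:
    """POST the payload to a Discord webhook (network side effect)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    # In practice events would stream from a node or log filter; this is a stub.
    events = [
        {"name": "Transfer", "amount": 250_000, "tx_hash": "0xabc..."},
        {"name": "Transfer", "amount": 10, "tx_hash": "0xdef..."},
    ]
    for ev in events:
        if is_alertable(ev):
            print(format_alert(ev)["content"])
            # post_alert("https://discord.com/api/webhooks/...", format_alert(ev))
```

Keeping the filtering and formatting logic pure makes the alerting rules easy to test and review, while the network side effects stay isolated in one function.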
We are striving to make our processes decentralized and resilient while ensuring business continuity by maintaining a Gitbook dedicated to our analytics efforts. It is much more than a README: you will find educational content such as cheat sheets on using tools like The Graph or Dune Analytics to fetch data and present charts in Google Data Studio, or on building an on-chain alerting system using Python and Discord: https://analytics.inverse.finance/
We are doing our best to make it ever more comprehensive and to keep it updated with the working group's latest activities, so don't hesitate to come along and join us on our learning journey!