Developer Case Study — Dsensor

MaidSafe · Published in safenetwork · Feb 8, 2017

Decentralized Mapping Protocol Project — Dsensor

Continuing our series of case studies highlighting early-stage application development on the SAFE (Secure Access For Everyone) Network, Dsensor is being developed by James Littlejohn. James explored various platforms to store and protect the data he would be collecting and decided to use the SAFE Network because it reflected his belief that a network should not be driven by economics, but should focus first and foremost on the data.

MaidSafe’s fully secure, decentralised approach supported James’ view that knowledge and data should be under the complete control of the user. While it is early days, Dsensor’s use of the SAFE Network APIs in its proof-of-concept form shows the network’s potential as a viable platform for the management of data. James was also attracted to the SAFE Network because of its strong encryption and its ability to break data into chunks before scattering them around the decentralised network of nodes. Combined with the decentralised architecture, which avoids offering hackers the central points of attack found in today’s centralised, server-based model, this ensures the highest possible security and privacy for users.

Being open source and supported by a strong community on the SAFE Network forum also means James has ready access to experts and potential partners who can help to build out the application and troubleshoot any technical issues. In the future James may also explore using safecoin to incentivise participation on Dsensor.

The Problem with Science

James Littlejohn has been involved in entrepreneurial projects since the dot-com boom, and while investigating opportunities around text mining he identified an opening for lifestyle-linking analytics, particularly in the area of wearable tech. In the course of his evaluation he recognised a broader application to data mining and analysis in the field of scientific and academic research. James assessed a number of drivers, including emerging technologies and changing economic conditions, which were beginning to affect the way research was conducted.

Firstly, walled-garden applications such as Facebook and wearable technologies were becoming more prevalent, and while they were a rich source of data on human activity, access to that information was restricted. At a time when the internet is supposed to be democratising many aspects of work and social life, this endangers an important source of information on lifestyle and health patterns that could benefit societies around the world.

Secondly, the sustained economic impact of the financial crisis was creating significant pressure on public funding for research at a time when it was needed more than ever. Technology and the availability of large amounts of data are creating opportunities for breakthroughs in a wide variety of academic and research fields, and if funding is not available via traditional public sources then there is an urgent need to find new forms of investment. The rise of alternative cryptocurrencies could potentially address this point, offering a new, fairer way to incentivise and reward individuals for participating in research projects. For example, James envisages a scenario where a grant funder might ‘tokenise’ a percentage of their funding and issue it via a science blockchain (like Dsensor). This would allow the funding to be traced directly, ensuring good governance of scientific research projects and fairer access to resources.

The final driver for a new model reflects an ongoing debate about the model of peer-reviewed scientific research. For a number of years there has been recognition of some fundamental weaknesses in the model, such as the replicability of research. In a poll conducted by Nature in May 2016, more than 70% of researchers admitted they had tried and failed to reproduce another scientist’s experiments, and more than 50% had failed to reproduce their own. This is in part due to the nature of frontier scientific research, which relies on trial and error, but there are clearly inefficiencies in the process.

Furthermore, there are questions about the efficiency of current research models. In 2009, Chalmers and Glasziou identified some key sources of avoidable waste in biomedical research, estimating the cumulative effect to be that about 85% of research investment (equating to about $200 billion of the investment made in 2010) is wasted. A blockchain provides a potential solution to this reproducibility crisis, as Dr. Sönke Bartling and Benedikt Fecher outline in their paper, “Blockchain for science and knowledge creation.” Although scientific research should be delivered at arm’s length from the individual contributors, it is ultimately reliant on individual scientists to gather and interpret data without bias. It is also often reliant on finite data sets, controlled samples or clinical trials, limiting the ability to cross-reference findings against other data sources.

Given the availability of data via the internet and the rise of automation technologies such as machine learning, James believes that if individuals have control of their information they can contribute it to research projects without the interference of third parties such as academics or technology providers. Using automation, scientists, academics and, more importantly, citizen scientists can draw data from anywhere in the world, beyond the confines of a specific controlled sample, and review it independently to produce a data-driven outcome.

Building A Blockchain for Science Research — A Truth Engine for Mankind

James’ investigation of text mining approaches led him to peer-to-peer models, which were enabling the owners of data to take control of how and with whom their information was shared.

This led to the development of Dsensor.org (Decentralized Mapping Protocol), a peer-to-peer network through which scientific knowledge can be investigated, discovered and shared. It is based on the principle of science “SenseMaking” and is designed to evolve peer review into a computational consensus model. Using Dsensor, a scientist who creates a thesis and wants to test it enters the hypothesis in computational form (called a Dmap in Dsensor speak). The Mapping protocol then automates the testing of the science, starting by trawling the Dsensor network for relevant data from other peers. That data is then sampled and ‘scored’ based on its predictive power to verify or challenge the thesis, until a computational consensus is established. Science attaining this status becomes ‘computationally active’ in the network, meaning any peer can tap into the collective knowledge, feed in their own sensor data and get the insights from the science working for them.
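As a rough illustration of that flow, the sketch below models a Dmap as a predictive function, scores it against data sampled from peers, and marks it ‘computationally active’ once an average-score threshold is reached. The names, the scoring metric and the 0.8 threshold are illustrative assumptions, not Dsensor’s actual API.

```python
# Minimal sketch of the Dmap evaluation loop described above.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class Dmap:
    """A hypothesis in computational form: a predictor plus a target variable."""
    name: str
    predict: Callable[[dict], float]   # computational model of the hypothesis
    target_key: str                    # observed value the prediction is tested against


def score_sample(dmap: Dmap, records: Iterable[dict]) -> float:
    """Score predictive power as the fraction of records the Dmap predicts
    within a 10% tolerance (a stand-in for whatever metric the protocol uses)."""
    records = list(records)
    if not records:
        return 0.0
    hits = sum(
        1 for r in records
        if abs(dmap.predict(r) - r[dmap.target_key]) <= 0.1 * abs(r[dmap.target_key])
    )
    return hits / len(records)


def evaluate(dmap: Dmap, peer_samples: List[List[dict]], threshold: float = 0.8) -> bool:
    """Average per-peer scores; the Dmap counts as 'computationally active'
    once the mean score across sampled peers reaches the threshold."""
    scores = [score_sample(dmap, sample) for sample in peer_samples]
    return bool(scores) and sum(scores) / len(scores) >= threshold
```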

James has the ambitious goal for Dsensor to become a “truth engine for mankind”, ensuring science is based on openness, transparency and reproducible results, essentially checking the accuracy of peer review. Dsensor intends to deliver this outcome by building a network of trustless peers whose inherent complexity makes it economically and technically very costly to corrupt. Dsensor, currently at the proof-of-concept stage, utilises the Ethereum blockchain, using its cryptography and a variant of the proof-of-work process to create a technological and mathematical state where, even with colluding computers, it is impossible to game the system. Rather than creating complexity through proof of work, Dsensor creates uncertainty through random sampling, in particular the selection of peers from the network and the data sampling techniques. The data sampling techniques are selected by each autonomous peer, while the network sampling is a property of the protocol for selecting peers from a Distributed Hash Table. In theory, once the network reaches a certain size, gaming it with false sensor data to support a fraudulent scientific computation becomes prohibitively expensive.
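The sketch below shows one generic way such unpredictable peer selection can work: a fresh random nonce is hashed into the DHT keyspace and the node IDs closest to that point (by XOR distance, Kademlia-style) are chosen, so no peer can know in advance whether it will be sampled. This is an assumption for illustration, not Dsensor’s actual protocol.

```python
# Illustrative sketch of unpredictable peer sampling from a DHT keyspace.

import hashlib
import os


def keyspace_point(nonce: bytes) -> int:
    """Map a random nonce to a point in a 256-bit DHT keyspace."""
    return int.from_bytes(hashlib.sha256(nonce).digest(), "big")


def sample_peers(node_ids: list[int], k: int = 5) -> list[int]:
    """Pick the k node IDs closest (by XOR distance) to a freshly random point,
    so the set of sampled peers cannot be predicted or pre-arranged."""
    target = keyspace_point(os.urandom(32))
    return sorted(node_ids, key=lambda nid: nid ^ target)[:k]
```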

Additional safeguards include the role of reproducibility in science. Each computation creates an immutable audit trail of “mapping accounting” entries that support the most truthful science available to the network. These records are called GaiaBlocks and are open to challenge by any peer on the network. Scoring of the science also provides a rating model for scientific research and for individual peers on the network: peers with poor outcomes will be demoted in favour of more highly rated scientific computations.
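A minimal, hypothetical sketch of such an audit entry and rating update is shown below; the field names, hash chaining and the exponentially weighted rating rule are assumptions for illustration only.

```python
# Hypothetical sketch of a hash-chained "mapping accounting" entry and a
# simple peer-rating update; field names and the rating rule are assumptions.

import hashlib
import json
import time


def make_entry(prev_hash: str, dmap_name: str, peer_id: str, score: float) -> dict:
    """Append-only audit record linked to its predecessor by hash, so altering
    an earlier entry is detectable when the chain is re-verified."""
    body = {
        "prev": prev_hash,
        "dmap": dmap_name,
        "peer": peer_id,
        "score": score,
        "time": time.time(),
    }
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body


def update_rating(old_rating: float, score: float, alpha: float = 0.2) -> float:
    """Exponentially weighted rating: peers with consistently poor scores drift
    down and are demoted in favour of higher-rated computations."""
    return (1 - alpha) * old_rating + alpha * score
```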
