
SAFE Network Development Summary – May 2017

We’ve had quite a few requests on social media and by email these past few days asking for updates on development progress. These messages remind us that not everyone has the time or the inclination to read the weekly development updates we post to the forum each Thursday. So many projects, so little time! The intention with this post is therefore to provide a summary of the most recent events and our hopes and expectations moving forward.

Image: Richard Tilney Bassett

Roadmap
The best place to start is our development roadmap, which we updated and published late last week. This web page tries to encapsulate all the complexities of development over time on one page, so it is pretty high level, but it is this snapshot view that most people seem to appreciate. You will notice that the roadmap outlines the major aspects of development and gives a rough indication of the order in which we anticipate tackling them.

You will also notice that we haven’t included timescales. In the past we have provided timescales for the ‘launch’ of the network, and despite our best efforts these have always been wrong. We have found it difficult to estimate timescales because so much of what we have been working on is brand new technology, sometimes completely bespoke and at other times building on the work of other projects. Testing is also interesting: it really helps us understand how the network fits together and how our community uses it, but it invariably leads to more tweaking and retesting, with previously unplanned and unknown rework and test durations.

We believe that publishing release dates with a high degree of uncertainty attached is not helpful to anyone and can cause more frustration than not publishing them at all. Network-related development is typically where the biggest unknowns lie, and as we move into incremental client-side development we anticipate that timescales will become more predictable.

Stable decentralised network
In late March we released test 15, a network that combined data-centre resources with user-run vaults. Within this release, users were also able to run the SAFE Browser, Launcher and demo app, which continue to facilitate the storage of private and public data, as well as the creation of public IDs and the publishing of SAFE websites.

After three days of running a stable network without any lost data we realised we had reached an important milestone. While we had done this extensively in private tests, it was fantastic to see it running publicly and to see the community’s reaction to it. Of course, life has a sense of humour, and shortly afterwards it became apparent that a script had been written that created fake accounts and filled the relatively small network with data, stopping the creation of new accounts and the uploading of new data. This was really helpful to us, as it showed what happens to the network when it reaches capacity in a real-world setting. The fact that it behaved as expected was reassuring, although we’d be lying if we didn’t admit to finding the spam attack a little frustrating. This is of course something that the integration of safecoin would stop: the requirement to ‘pay’ to store data would make the attack expensive, while the incentive of safecoin for farmers would lead to a significantly bigger network.

What now?
Looking forward, we are currently focussed on three main areas:

  • Catering for mobile devices.
  • Enabling greater user participation.
  • Improving the resilience and robustness of the network.

Mobile
The patience app developers have shown to this point is soon to be rewarded. The process of converting our APIs away from a REST paradigm to SDKs was essential to cater for mobile devices, as the session state maintained under the REST approach would not have worked well with mobile devices that disconnect and reconnect regularly. Users of the SAFE Network will gain access through the Authenticator, a secure gateway that protects user credentials from the application itself. The Authenticator is currently being bundled with the SAFE Browser and will enable users to securely authenticate themselves onto the network, or to browse publicly available data without logging in.
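
To make the hand-off concrete, here is a minimal sketch of the flow the Authenticator model implies. The names, types and permission strings below are illustrative assumptions rather than the shipped SDK API; the point is simply that the application describes itself and the access it wants, the user approves inside the Authenticator, and the app receives a scoped grant without ever seeing the account credentials.

```typescript
// Illustrative model of the Authenticator hand-off; names are assumptions,
// not the shipped SDK. Key property: the app describes itself and the
// permissions it wants, the user approves inside the Authenticator, and the
// app receives a scoped grant without ever seeing the account credentials.

interface AppInfo {
  id: string;      // e.g. "net.example.todo" (hypothetical app id)
  name: string;
  vendor: string;
}

interface AuthRequest {
  app: AppInfo;
  permissions: string[];   // e.g. ["READ", "INSERT"] on app-owned containers
}

interface AuthGrant {
  token: string;               // opaque token scoped to the granted permissions
  grantedPermissions: string[];
}

// Stand-in for the Authenticator bundled with the SAFE Browser.
class Authenticator {
  constructor(private accountSecret: string, private accountPassword: string) {}

  // The user sees the request and decides; credentials never leave this class.
  approve(request: AuthRequest, userAccepts: boolean): AuthGrant | null {
    if (!userAccepts) return null;
    return {
      token: `grant:${request.app.id}:${Date.now()}`,
      grantedPermissions: [...request.permissions],
    };
  }
}

const authenticator = new Authenticator("account secret", "account password");
const grant = authenticator.approve(
  {
    app: { id: "net.example.todo", name: "Todo", vendor: "Example" },
    permissions: ["READ", "INSERT"],
  },
  true, // the user tapped "Allow" in the Authenticator UI
);
console.log(grant?.grantedPermissions); // ["READ", "INSERT"]
```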

To implement the Authenticator, the team needed to add a new data type: Mutable Data. The new data type improves network efficiency, saves bandwidth, and provides the granular access control required by mobile platforms.
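
As a rough illustration of why this matters for mobile, the sketch below models Mutable Data as a named, typed key-value structure with per-application permissions. It is an in-memory toy under assumed names, not the network implementation; the useful property is that a reconnecting mobile client only needs its previously granted permissions, not any session state held on the network.

```typescript
// Minimal in-memory model of a Mutable Data structure (assumption-level sketch,
// not the network implementation): a named, typed key-value map whose
// mutations can be restricted per application.

type Permission = "READ" | "INSERT" | "UPDATE" | "DELETE";

class MutableData {
  private entries = new Map<string, string>();
  private permissions = new Map<string, Set<Permission>>(); // appId -> perms

  constructor(public readonly name: string, public readonly typeTag: number) {}

  grant(appId: string, perms: Permission[]): void {
    this.permissions.set(appId, new Set(perms));
  }

  insert(appId: string, key: string, value: string): void {
    this.assert(appId, "INSERT");
    if (this.entries.has(key)) throw new Error(`key exists: ${key}`);
    this.entries.set(key, value);
  }

  get(appId: string, key: string): string | undefined {
    this.assert(appId, "READ");
    return this.entries.get(key);
  }

  private assert(appId: string, perm: Permission): void {
    if (!this.permissions.get(appId)?.has(perm)) {
      throw new Error(`${appId} lacks ${perm} on ${this.name}`);
    }
  }
}

// A mobile client that drops off and reconnects later only needs its granted
// permissions, not a server-side session.
const md = new MutableData("example-container", 15000); // type tag is illustrative
md.grant("net.example.todo", ["READ", "INSERT"]);
md.insert("net.example.todo", "displayName", "alice");
console.log(md.get("net.example.todo", "displayName")); // "alice"
```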

With mobile devices so ubiquitous throughout the world, enabling mobile client access to the network through Mutable Data has been receiving significant focus. From a resource-provision perspective, both the alpha and beta versions of the network will rely on laptops, desktops and, in time, single-board computers to earn safecoin when it is released. Eventually we will look at enabling mobile devices to farm for safecoin when plugged into a power outlet and in range of WiFi; however, as we detail below, this is not a priority for now.

More alphas
Some of the example applications that have been created are currently being ported to the new data type and the new APIs. The team are updating the documentation and testing the applications against a mock network, and they appear far more stable than previous iterations, which is encouraging. We anticipate that alpha 2 will encompass the new Mutable Data type and Authenticator, the SAFE Browser DOM APIs and Node.js SDK, along with example apps, tutorials and documentation.

Image: Clint Adair

Alpha 3 will see our focus shift to enabling a greater number of users to run Vaults from home by integrating uTP. Presently, users must set up TCP port forwarding or enable UPnP on their routers, which requires a little configuration in some cases. Adding uTP support will make for a more seamless process for many, while making the network accessible to more users. uTP is used by some BitTorrent clients and, when implemented effectively, helps to mitigate poor latency and facilitates the reliable, ordered delivery of data packets.
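
The sketch below is a conceptual outline only, not MaidSafe's actual networking code, of the fallback order a home vault could attempt: a direct TCP connection where a port is already forwarded, a UPnP port mapping where the router supports it, and finally a uTP rendezvous/hole-punch for routers that support neither. All function names are placeholders.

```typescript
// Conceptual outline only: the order of attempts a home vault could make to
// reach a peer. All function parameters are placeholders, not real socket code.

type Transport = "direct-tcp" | "upnp-mapped-tcp" | "utp-hole-punch";

interface PeerEndpoint {
  host: string;
  port: number;
}

async function connectToPeer(
  peer: PeerEndpoint,
  tryDirectTcp: (p: PeerEndpoint) => Promise<boolean>,    // works if a port is forwarded
  tryUpnpMapping: () => Promise<boolean>,                  // ask the router to open a port
  tryUtpRendezvous: (p: PeerEndpoint) => Promise<boolean>  // hole-punch over UDP/uTP
): Promise<Transport> {
  if (await tryDirectTcp(peer)) return "direct-tcp";
  if ((await tryUpnpMapping()) && (await tryDirectTcp(peer))) return "upnp-mapped-tcp";
  if (await tryUtpRendezvous(peer)) return "utp-hole-punch";
  throw new Error("peer unreachable by any supported transport");
}
```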

During this phase we will also integrate node ageing, a feature that makes the network more resilient to consensus group attacks. The team will also implement the first part of data chains, a long-planned feature that we anticipate will ultimately enable the secure republishing of data should the network ever lose power, and provide validation that data has been stored on the network.
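
To illustrate the intuition behind node ageing (the details here are assumptions, not the final design), the sketch below picks a section's consensus group from its oldest nodes, so a freshly joined node cannot immediately influence group decisions.

```typescript
// Illustrative sketch of node ageing: only nodes that have proven themselves
// over time join the consensus group, so a brand new node cannot immediately
// sway group decisions. Details are assumptions, not the final design.

interface NodeInfo {
  id: string;
  age: number; // grows as the network relocates the node over time
}

// Pick a section's consensus group from its oldest (most trusted) nodes.
function selectElders(section: NodeInfo[], groupSize: number): NodeInfo[] {
  return [...section].sort((a, b) => b.age - a.age).slice(0, groupSize);
}

const section: NodeInfo[] = [
  { id: "a", age: 16 },
  { id: "b", age: 4 },
  { id: "c", age: 32 },
  { id: "d", age: 1 }, // a freshly joined (possibly hostile) node
];
console.log(selectElders(section, 2).map((n) => n.id)); // ["c", "a"]
```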

Looking ahead
Beyond alpha 3 we will focus on:

  • Data Chains, part 2.
  • Data republish and network restarts.
  • A security audit of the network.
  • Test safecoin.
  • Real-time network upgrades.
  • Network-validated upgrades.

As has been the case up to this point, we will continue to release multiple test networks between each alpha to prove the technology in a public setting and to guard against regressions in the code.

We continue to be grateful for the huge support of the people who take the time to run these networks and report back; you all know who you are!

The Power of the Crowd Series – Part One: The Problem


Prologue
This is the first in a series of articles offering a perspective on the internet today and its impact on society, economies and geopolitics. It is my belief that the internet is broken, but rather than engage in a proper debate about how we fix it, policy makers and regulators are simply trying to band-aid the existing infrastructure. Therefore it is up to us, the users of the internet, to take back control, because we recognise the power the open web can offer everyone on this planet. At a time when discord and disunity seem ever more commonplace, we need to champion the opportunities, accessibility and collaboration it was originally designed to offer.

Therefore, in the spirit of the Cluetrain Manifesto and the Declaration of Independence of Cyberspace, we should be looking to create a new set of guiding principles. We need to frame the debate about the future of the internet before its fate is decided for us by politicians and business people who do not share our vision. We first need to have a discussion about the problem child that is today’s worldwide web and air the challenges and difficult decisions ahead, because we must all accept that compromises will need to be made. The internet today is not the same as it was twenty-plus years ago. It will be important to hear arguments from all sides before we attempt to reach consensus.

Ultimately, it is my belief that we need some form of new social contract about the purpose and role the internet plays in our lives. We all see its potential, its ability to be a force for positive change, but we have all seen its dark side too. This discussion is an overly ambitious attempt to seek shared values about what we should expect from the internet in terms of our freedom, privacy, accessibility and opportunity. Today I’m asking the power of the crowd to join the debate and help find solutions.

The Power of The Crowd: The Problem
When Thomas Friedman wrote “The World is Flat” in 2005 it was hailed as one of the most influential assessments of the impact of the Internet. Although the dot com bubble had burst only a few years earlier, there was huge optimism about the Internet’s potential to level the playing field for everyone around the world in terms of access to information and opportunities to collaborate. However, Friedman also highlighted the many in-built inequalities in the existing social, economic and political structures that could potentially have an adverse effect. While he suggested that flattening the world would create new opportunities for those who had previously had little or no chance of social mobility, it would also create unpleasant consequences for established economies, such as fierce competition for jobs and downward pressure on incomes.

It could be said that he painted a less than rosy picture in which everyone could really only look forward to uncertainty and instability:

“…today’s workers need to approach the workplace much like athletes preparing for the Olympics, with one difference: they have to prepare like someone who is training for the Olympics but doesn’t know what sport they are going to enter…”
Thomas Friedman, The World is Flat, 2005

Wind the clock forward to 2016 and we are witnessing both the positives and negatives of the “Flat World.” Economies in the developing world have grown rapidly and the global middle class has expanded, leading to greater social mobility, longer life expectancy and better standards of living, but equally increasing the demands on resources and the environment. Economies in the developed world have slowed, productivity has continually declined and incomes have not grown in line with inflation. Of course, if I were looking at this purely from a technology and entrepreneurial perspective I could argue it has created huge wealth, especially for the first and second generations of internet companies, ranging from Amazon to Uber. Living standards have not declined in developed economies, and we have very much benefited from access to the cheaper goods and workforces of the developing nations.

Even so, the suggestion that technology, and more precisely the Internet, has broken down barriers, redefined social norms for the better and created a more equal, fairer society would be an overstatement of the facts. In reality a small handful of technology companies (with a few minor exceptions) hold all the power and control over the infrastructure we use – you need only look at the world’s rich list for proof that certain individuals and companies have done very nicely! As citizens and consumers we are dependent on them to give us access to services and communications tools which were originally designed to be accessible to everyone.

Indeed we would do well to remember the words of the Cluetrain Manifesto and the Declaration of Independence of Cyberspace, because it would seem we are a million miles away from their virtuous intentions.

“We have no elected government, nor are we likely to have one, so I address you with no greater authority than that with which liberty itself always speaks. I declare the global social space we are building to be naturally independent of the tyrannies you seek to impose on us. You have no moral right to rule us nor do you possess any methods of enforcement we have true reason to fear.”
John Perry Barlow, February 8, 1996


The Cluetrain Manifesto: 95 Theses

1. Markets are conversations.
2. Markets consist of human beings, not demographic sectors.
3. Conversations among human beings sound human. They are conducted in a human voice.
38. Human communities are based on discourse — on human speech about human concerns.
39. The community of discourse is the market.
40. Companies that do not belong to a community of discourse will die.
72. We like this new marketplace much better. In fact, we are creating it.
73. You’re invited, but it’s our world. Take your shoes off at the door. If you want to barter with us, get down off that camel!
78. You want us to pay? We want you to pay attention.
89. We have real power and we know it. If you don’t quite see the light, some other outfit will come along that’s more attentive, more interesting, more fun to play with.
The Cluetrain Manifesto, 2000

Today I believe the very thing that was supposed to break down physical and virtual barriers has failed. Nearly 50% of the global population is without effective access to the internet, never mind the skills and education to exploit its potential. Economically, the gap between the world’s richest and everyone else has continued to widen. Some commentators argue that traditional command-and-control organisations are dead because Millennials and subsequent generations will refuse to work in the same formal, traditional structures. Sure, in the rarefied atmosphere of Silicon Valley and other tech hubs around the world that may be true, but it is not generally the case. If anything, the internet has taken away opportunities for many in established industries as disruptive technologies built on it have turned business models upside down.

Politically, everyone pointed to the Arab Spring as a sign of “hope” that instant, unfettered communication could lead to significant political and economic change. We saw Obama become the first President to exploit social media to engage audiences, and everyone said that was a good thing for democracy. Likewise, some would argue Trump has used the same mechanism to give a voice to people who have been forgotten in the great tech rush. Others question whether we are now in a situation where (allegedly) student hackers in Macedonia can create a news agenda that decides a Presidential election, in return for a lucrative income from digital ad revenue.

Furthermore, the threat of cyber-attack, enabled by the internet, has encouraged governments around the world to adopt far more aggressive stances on national security. Cyber-spying is the latest fashion: any government with enough money can employ professional hackers to steal industrial secrets or bring down the national grid of a nation state it is not getting along with. And of course, if your government doesn’t like what you’re doing, it has probably just passed a law allowing it to spy on you without much in the way of legal oversight.

So, if this has left you feeling thoroughly despondent…good. We should all be pretty disappointed with how the internet has turned out and the effect it has had or not had on all our lives. Frankly it is time, now that the internet is more than 20 years old, that we sat down and had a proper conversation about where we want it to go next. It cannot carry on as it is, but we face difficult questions with no easy answers.

Of course we could just stick our heads in the collective sand, but I’m pretty certain that will only make things worse…

Developer Case Study – Dsensor

Decentralized Mapping Protocol Project – Dsensor

Continuing our series of case studies highlighting early-stage application development on the SAFE (Secure Access For Everyone) Network, Dsensor is being developed by James Littlejohn. James explored various platforms to store and protect the data he would be collecting and decided to use the SAFE Network because it reflected his belief that the network should not be driven by economics, but be focused first and foremost on the data.

MaidSafe’s fully secure, decentralised approach supported James’ view that knowledge and data should be in the complete control of the user. While it is early days, Dsensor’s use of the SAFE Network APIs in its proof-of-concept form shows the network’s potential as a viable platform for the management of data. James was also attracted to the SAFE Network because of its strong encryption and its ability to break data into chunks before scattering them around the decentralised network of nodes. Combined with the decentralised architecture, which avoids offering hackers central points of attack, as found in today’s centralised, server-based model, this ensures the highest possible security and privacy for users.

Being open source and supported by a strong community on the SAFE Network forum also means James has ready access to experts and potential partners who can help to build out the application and troubleshoot any technical questions. In the future, James may also explore using safecoin to incentivise participation on Dsensor.

The Problem with Science

James Littlejohn has been involved in entrepreneurial projects since the dot com boom and, while investigating opportunities around text mining, identified an opportunity for lifestyle-linking analytics, particularly in the area of wearable tech. In the course of his evaluation he recognised a broader application to data mining and analysis in the field of scientific and academic research. James assessed a number of drivers, including emerging technologies and changing economic conditions, which were beginning to affect the way research was conducted.

Firstly, walled-garden applications such as Facebook and wearable technologies were becoming more prevalent, and while they were a rich source of data on human activity, access to that information was restricted. At a time when the internet is supposed to be democratising many aspects of work and social life, this endangers an important source of information on lifestyle and health patterns which could benefit societies around the world.

Secondly, the sustained economic impact of the financial crisis was creating significant pressure on public funding for research at a time when it was needed more than ever. Technology and the availability of large amounts of data are creating opportunities for breakthroughs in a wide variety of academic and research fields. If the funding is not available via traditional public sources then there is an urgent need to find new forms of investment. The rise of alternative cryptocurrencies could potentially address this point, offering a new, fairer way to incentivise and reward individuals for participating in research projects. For example, James envisages a scenario where a grant funder might ‘tokenise’ a percentage of their funding and issue it via a science blockchain (like Dsensor). This would allow the funding to be traced directly, ensuring good governance of scientific research projects and fairer access to resources.

The final driver for a new model reflects an ongoing debate about the model of peer-reviewed scientific research. For a number of years there has been recognition of some fundamental weaknesses in the model, in areas such as the replicability of research. In a poll conducted by Nature in May 2016, more than 70% of researchers admitted they had tried and failed to reproduce the experiments of other scientists, and more than 50% had failed to reproduce their own experiments. Of course this is in part due to the nature of frontier scientific research, which relies on trial and error, but there are clearly inefficiencies in the process.

Furthermore, there are questions about the efficiency of current research models – in 2009 Chalmers and Glasziou identified some key sources of avoidable waste in biomedical research. They estimated that the cumulative effect was that about 85% of research investment – equating to about $200 billion of the investment made in 2010 – is wasted. A blockchain provides a potential solution to this reproducibility crisis, as Dr. Sönke Bartling and Benedikt Fecher outline in their paper, “Blockchain for science and knowledge creation.” Although scientific research should be conducted at arm’s length from individual contributors, it is ultimately reliant on individual scientists to gather and interpret data without bias. It is also often reliant on finite data sets, controlled samples or clinical trials, limiting the ability to cross-reference the findings against other data sources.

Given the availability of data via the internet and the rise of automation technologies such as machine learning, James believes that if individuals have control of their information they can choose to contribute it to research projects without the interference of third parties such as academics or technology providers. Using automation, scientists, academics and, more importantly, citizen scientists can draw data from anywhere in the world, beyond the confines of a specific controlled sample, and review it independently to produce a data-driven outcome.

Building A Blockchain for Science Research – A Truth Engine for Mankind

James’ investigation of text mining approaches led him to peer-to-peer models, which were enabling the owners of data to take control of how and with whom their information was shared.

This led to the development of Dsensor.org (Decentralized Mapping Protocol), a peer-to-peer network through which scientific knowledge can be investigated, discovered and shared. It is based on the principle of scientific “SenseMaking” and is designed to evolve peer review into a computational consensus model. Using Dsensor, a scientist who creates a thesis and wants to test it enters the hypothesis in computational form (called a Dmap in Dsensor speak). The Mapping protocol then automates the testing of the science, starting by trawling the Dsensor network for relevant data from other peers. That data is then sampled and ‘scored’ based on its prediction power, verifying or challenging the thesis, until a computational consensus is established. Science attaining this status becomes ‘computationally active’ in the network, meaning any peer can tap into the collective knowledge, feed in their own unique sensor data and get the insights from the science working for them.

James has the ambitious goal of making Dsensor a “truth engine for mankind”, ensuring science is based on openness, transparency and reproducible results, essentially checking the accuracy of peer review. Dsensor intends to deliver this outcome by building a network of trustless peers whose inherent complexity makes it economically and technically very costly and difficult to corrupt. Dsensor, currently at the proof-of-concept stage, utilises the Ethereum blockchain, using its cryptography and a variant of the proof-of-work process to create a technological and mathematical state in which, even with colluding computers, it is impossible to game the system. Rather than creating complexity through proof of work, Dsensor creates uncertainty through random sampling, in particular in the selection of peers from the network and in its data sampling techniques. The data sampling techniques are selected by each autonomous peer, and the network sampling is a property of the protocol for selecting peers from a Distributed Hash Table (DHT). In theory, once the network reaches a certain size, the economic cost of gaming it with false sensor data to support a fraudulent scientific computation becomes extremely high.
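
Putting the pieces above together, here is a hypothetical sketch of the Dmap testing loop: randomly sample peers for observational data, score how well the hypothesis predicts each sample, and declare computational consensus once the aggregate score clears a threshold. The names, scoring rule and threshold are illustrative assumptions rather than Dsensor's actual code.

```typescript
// Hypothetical sketch of the Dmap testing loop described above; names, the
// scoring rule and the consensus threshold are assumptions, not Dsensor's code.

interface Observation {
  input: number;
  output: number;
}

interface PeerSample {
  peerId: string;
  observations: Observation[];
}

// The hypothesis in computational form: given an input, predict the output.
type Dmap = (input: number) => number;

// Random peer selection stands in for sampling peers from the DHT.
function samplePeers(allPeers: PeerSample[], count: number): PeerSample[] {
  const shuffled = [...allPeers].sort(() => Math.random() - 0.5);
  return shuffled.slice(0, count);
}

// Score a peer's data by how well the Dmap predicts the observed outputs.
function score(dmap: Dmap, sample: PeerSample, tolerance: number): number {
  if (sample.observations.length === 0) return 0;
  const hits = sample.observations.filter(
    (o) => Math.abs(dmap(o.input) - o.output) <= tolerance
  ).length;
  return hits / sample.observations.length;
}

// Consensus: the mean score over a random sample of peers clears a threshold.
function reachesConsensus(
  dmap: Dmap,
  peers: PeerSample[],
  sampleSize: number,
  threshold: number
): boolean {
  const scores = samplePeers(peers, sampleSize).map((p) => score(dmap, p, 0.1));
  const mean = scores.reduce((a, b) => a + b, 0) / Math.max(scores.length, 1);
  return mean >= threshold; // if true, the Dmap becomes "computationally active"
}
```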

Additional safeguards include the role of reproducibility in science. This creates an immutable audit trail of “mapping accounting” entries that support the most truthful science available to the network. These records are called GaiaBlocks and are open to challenge by any peer on the network. Scoring of the science also provides a rating model for scientific research and for individual peers on the network: peers with poor outcomes will be demoted in favour of more highly rated scientific computations.

 

Glasgow University Students To Use The SAFE Network

Since its inception in 1451, Glasgow University has built a worldwide reputation as a centre for innovations that have had a profound effect on the world. Its famous alumni include John Logie Baird, Lord Kelvin and Adam Smith, whose impact has left a lasting impression on the world we live in today. Continuing that trend, Glasgow University computing science students will be exposed to the latest in decentralised networking technology as MaidSafe’s SAFE (Secure Access For Everyone) Network puts them at the forefront of research and development into next-generation Internet applications.

Students within the computing science department, under the guidance of Dr Inah Omoronyia, Lecturer in Software Engineering and Information Security, will work with the MaidSafe team, led by Scottish engineer David Irvine. MaidSafe will provide the students with guidance in building apps on top of the company’s platform, a secure and decentralised data and communications network that replaces the world’s data centres and servers with the spare computing capacity of the network’s users. This comes at a time of great debate about the future of the Internet, with leading academics, including the founder of the worldwide web Sir Tim Berners-Lee, seeking to improve the security and privacy offered to users.

The SAFE Network provides a zero-cost infrastructure for students, and the current APIs enable the creation of storage and email applications. This functionality is laid out for developers in tutorials created by the company, and it will be expanded over the next few months as MaidSafe releases tutorials every two weeks, providing increasingly complex functionality to application developers.

MaidSafe CEO David Irvine commented on the partnership: “We are delighted to be working with a university with such a rich heritage and we very much look forward to using the applications created by their students. Where better to push the envelope of evolutionary thinking than the country of which Voltaire opined, ‘We look to Scotland for all our ideas of civilisation’. Glasgow University has an excellent opportunity to be at the forefront of research and development in the field of Internet technologies, alongside the likes of MIT, which will further enhance its reputation – and that of Scotland – as a source of cutting edge innovation.”

Glasgow University has one of the leading computing science departments in the UK and is ranked amongst the top 100 in the world. Lecturer in Software Engineering and Information Security, Dr Inah Omoronyia confirmed “It’s a great opportunity for our students at Glasgow University to get hands-on experience with building apps for the SAFE network. Security and privacy functions is now core to modern day software systems; our students are really excited to learn new ways of building such systems using cutting-edge technology.”

MaidSafe Announces Equity Funding Round on BnkToTheFuture

We wanted to share some next steps for MaidSafe and the future of the SAFE Network. On September 12th we will be launching an equity fund-raising round with BnkToTheFuture, a leading global online investment platform for qualifying investors. MaidSafe is looking to raise between £1.75m and £2m during the 30-day campaign.

Next steps

This is an important step for the company and the future of the platform, coming on the back of the recent release of the SAFE alpha. We are delighted with the positive feedback we have received to date, and the strong support of our community has been hugely important to us. We see this funding round as critical as we move to the next stage of development, so we wanted to explain why we think this is the best way forward.

As you know launching the alpha is only one of many steps on our journey. There is a lot of work to do to build out functionality and create a robust platform that will attract users and application developers. That means we need to continue to recruit the best developer talent from around the world and build out a developer programme to support the growing interest in building on the SAFE Network. We will also be looking to develop our own applications, because we do see commercial potential for us as well as for the broader community.

Why BnkToTheFuture?

We have chosen BnkToTheFuture for a number of reasons. They have a large online community of qualified investors interested in the decentralised technology sector. They have a strong relationship with the crypto communities, which is important to us, and investors will be able to invest in traditional currency, bitcoin and even some of the more liquid altcoins such as Ether. Above all, it also means we will remain in control of our strategy and roadmap, which is really important to us, as we want the SAFE Network to be available for everyone to use and develop on. This is why the core code-base is open-sourced: to encourage the establishment of a strong community around the technology. Furthermore, we have transferred the core intellectual property (IP) to a Scottish charity (The MaidSafe Foundation) to help ensure that MaidSafe is not seen to benefit unfairly from ownership of the network (although we hold a licence in perpetuity allowing us to utilise and license the technology to partner companies).

We appreciate all your support so far and hope you will see this news as further evidence that the SAFE Network is on the fast track to becoming a viable alternative to today’s insecure, unreliable worldwide web. With this investment round we are seeking to accelerate progress further by adding functionality and working towards the beta version and beyond. I hope you will continue to contribute to this exciting journey, because we believe a secure, decentralised approach to the internet will present huge opportunities for users and developers alike.

SAFE Network Alpha Release

After five months of testing and eight test networks, it is our pleasure to announce the immediate availability of the alpha release of the SAFE Network. This represents another significant milestone on the way to creating a new, decentralised Internet.

Launcher and demo app

The release is available for Windows, Mac and Linux, and comes with two components, the Launcher and the Demo App, each with its own installer. You can download both from the alpha page of our website.

The Launcher enables users to create their own account and access the network without exposing their credentials (comprised of an account secret and an account password) to third-party applications. This secure gateway enables users to stay in control of their details.

The Demo App must be run alongside the Launcher and enables users to create their own SAFE webpage and public ID, store and retrieve private data, and share public content. Users can also upload and host existing websites on the SAFE Network without charge.

This alpha release is focused on use of the client software. In a couple of weeks we will provide Vault installers to enable users to contribute their own computing resources. Running two parallel networks will enable us to provide a more stable network for end users and app developers, while also enabling ongoing development of the vault, which has a much more active code base at present. Once we are happy with performance, we will revert to running a single network.

As with any alpha software, caveats apply to this release: users should be aware that data on the network will be wiped from time to time and that there is a possibility your data will be lost.

Not just for end users

The demo app is the first of several apps currently under development, and with an updated release of the SAFE Launcher APIs we anticipate that this list will grow in the coming weeks and months. We encourage any developers thinking about creating an app on the network to get in touch, and we will help in any way we can. In this vein, a new developer-focused SAFE Network forum will be launched next week; more details to follow…

Onward and upward

This is the first of several alpha releases that MaidSafe will make. Future versions will improve performance and increase stability, combine the Client and Vault networks, and add features such as contact management, messaging, test safecoin and safecoin wallets. More information on our rollout strategy is available here.

This is an incredibly exciting time for MaidSafe and the SAFE Network community. There is a growing movement to return to the original principles of the open web and decentralise the Internet. We believe our technology can contribute to this initiative, and that is why we stand by our vision to open source the technology so that all users and developers can benefit from it. We believe this is essential to fulfil our commitment to develop a decentralised network that gives users back control and offers far greater protection than today’s Worldwide Web.

Of course, we could not have got this far without the support of all our shareholders and community members. Thanks for all your help, we hope you enjoy the alpha and we look forward to hearing all your feedback! With your input we will iterate the network as quickly as possible to improve it and bring new innovations.

1.1 Billion Reasons Companies Should Encrypt Our Data

As the media pick through the details of the latest large, embarrassing and costly data theft, the current victim, TalkTalk, a UK public telecommunications company, are set for a difficult few months. With revenue of almost £1.8 billion, the company have had an as yet unknown number of their 4 million UK customers’ details stolen by a perpetrator reported to be anything from a 15-year-old boy to Islamist militants, if recent reports are to be believed.

While the post mortem that follows will likely establish the details, the company has already admitted that some of the stolen information was not encrypted. While this was clearly lax for a company that has been targeted by hackers three times in the last year, it seems that under the UK’s Data Protection Act they are not legally required to encrypt data. The specific wording of the act suggests that firms need only take ‘…appropriate technical and organisational measures…’.

Greg Aligiannis, senior director of security at Echoworx, advised: “The most concerning revelation from today’s news is the blasé approach to encrypting customer data. Security of sensitive information must be considered a priority by everyone, especially when the life histories of potentially millions of customers are at risk.”

TalkTalk are not alone: research by security specialist Kaspersky Lab suggests that 35% of companies worldwide don’t use encryption to protect data. This is surprising given the harsh penalties for breaches. IBM estimates that the average data breach costs $3.8 million, with an average cost of between $145 and $154 per record, not to mention the untold damage to the reputation of the companies affected. When we consider that an estimated 1.1 billion records were exposed during 2014, we can start to appreciate the extent of the problem.
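
As a rough back-of-the-envelope illustration (the per-record figure is derived from much smaller breaches, so it will not scale linearly to this volume): 1.1 billion records × roughly $150 per record ≈ $165 billion of theoretical exposure in a single year.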

With such significant repercussions for being hacked, one must question why encryption technology is not used more widely.

In almost all cases cost will be a factor. Encryption is not cheap. Procedures need to be implemented and maintained by specialist skilled staff and then rolled out through often very large organisations. Asset management, access controls, security incident management, compliance and so on will all drive up the cost, as will new hardware such as encryption servers. Complexity is another issue and raises many questions: How will the encryption keys be managed? Do we let employees bring and use their own devices in the workplace? Is the chosen encryption solution compatible with other systems? And what about mobile device management?

With the number of breaches rising every year and no legal obligation for companies to encrypt our data, it would seem that we as individuals need better solutions. For storing data with cloud providers, for example, client-side encryption has existed for some time, enabling users to encrypt their data before it leaves their computer, meaning that companies like Dropbox or Google can’t read your data (although they can still delete it). Similarly, the self-encryption component within the SAFE Network encrypts all network data before it leaves the user’s machine, and does so automatically as they upload a file. Providing encryption that is easy to use is surely the key to its wider adoption.
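
For the curious, the sketch below captures the idea behind self-encryption in simplified form; the real implementation is MaidSafe's open-source self_encryption library, and the chunk size, key derivation and cipher choice here are illustrative assumptions. A file is split into chunks, and each chunk is encrypted with a key derived from the hashes of its neighbouring chunks, so nothing leaves the user's machine unencrypted and no single chunk is readable on its own.

```typescript
// Simplified sketch of self-encryption; chunk size, key derivation and cipher
// choice are illustrative assumptions, not the self_encryption implementation.

import { createCipheriv, createHash } from "crypto";

function sha256(data: Buffer): Buffer {
  return createHash("sha256").update(data).digest();
}

function selfEncrypt(file: Buffer, chunkSize = 1024 * 1024) {
  // 1. Split the file into chunks.
  const chunks: Buffer[] = [];
  for (let i = 0; i < file.length; i += chunkSize) {
    chunks.push(file.subarray(i, i + chunkSize));
  }
  const hashes = chunks.map(sha256);

  // 2. Encrypt each chunk with a key derived from its neighbours' hashes,
  //    so no single chunk can be decrypted in isolation.
  const encryptedChunks = chunks.map((chunk, i) => {
    const prev = hashes[(i + hashes.length - 1) % hashes.length];
    const next = hashes[(i + 1) % hashes.length];
    const key = sha256(Buffer.concat([prev, next])); // 32-byte AES-256 key
    const iv = prev.subarray(0, 16);                 // 16-byte IV
    const cipher = createCipheriv("aes-256-cbc", key, iv);
    return Buffer.concat([cipher.update(chunk), cipher.final()]);
  });

  // 3. The "data map" of original chunk hashes is what the owner keeps in
  //    order to find and decrypt the chunks later.
  return { encryptedChunks, dataMap: hashes };
}

const { encryptedChunks, dataMap } = selfEncrypt(Buffer.from("hello SAFE Network"));
console.log(encryptedChunks.length, dataMap.length); // 1 1
```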

However, as good as tools like these are for the storage of our files, as things stand we are unfortunately still reliant on companies to look after our personal information and bank account details. Legislation needs to be tightened to push companies to be much more accountable and responsible with our data. It should demand not only that our data is encrypted, but that sufficient policies and procedures are put in place to maximise the effectiveness of that encryption; without these, even the strongest encryption can be rendered useless. Providing a high level of data security is simply the cost of doing business, not a nice-to-have feature.

Events like the TalkTalk hack should also remind us how nonsensical recent government suggestions to ban or weaken encryption are. Encryption is one of the best lines of defence against adversaries and, through its use in all types of commerce, underpins the global economy.

Image courtesy of David Castillo Dominici at FreeDigitalPhotos.net