In the recently released "Global Agenda Outlook 2013" from the World Economic Forum, one of the main topics on the agenda is titled "Thriving in a Hyperconnected World." The main premise is that the physical world and the digital world are merging rapidly, and institutions and leaders are not prepared to deal with it. Not only are the technologies evolving, but the amount of data being generated is completely unprecedented, and it will only grow.
Two of the major components of this "hyperconnectedness" that the WEF discusses are data and trust. Marc Davis of Microsoft's Online Services Division frames it nicely: "[Big data] is a question of the structure of the digital society and digital economy, what it means to be a person, who has what rights to see and use what information, and for what purposes might they use it."
Globally, countries and industries are working through the policy, economic, and regulatory structures (on top of the technical interoperability challenges) needed to control the flow and sharing of data, particularly personal data. Yet there is virtually nothing done today without a data component. The amount of data generated today brings huge societal benefits, as well as potentially enormous, even life-threatening, drawbacks if that data is not managed properly, is collected erroneously, or is inappropriately shared.
There are many reasons why we each give up data: to open a bank account, to purchase a vehicle, to get healthcare treatment, to find people to date, to unlock a badge on our favorite gaming site. In these instances, we make a conscious choice to give up certain pieces of data and information about ourselves.
But we don't know what the internal data quality practices are of the companies to whom we give that data; we don't know how they manage their cybersecurity; we don't know how their internal access and authentication controls are managed; we don't know whether they can tag data at the element level to fortify their privacy compliance protocols; we don't know to whom they resell our data; and we don't know whether their business partners with legitimate access to our data protect it with the same degree of integrity.
Knowing what I know about the actual limited capabilities of federal and state governments here in the U.S. to integrate and share data, I'm far less concerned with 'Big Brother' than I am with Amazon and Apple (both of whom seem to do a far more effective and efficient job of managing my data correctly) doing something creepy with my data (like recommending that I purchase a Justin Bieber CD).
Trust frameworks, transparency, policies, accountability: these are all steps on the right path to building trust. Engendering trust among people and society in how data is collected, managed, and used requires degrees of sophistication far beyond where many organizations and institutions are today, in technology, in policies and regulations, and in economic models. Unfortunately, policy will never keep up with the speed of technology innovation, so it may take a while to get to trust.
Most importantly, however, individuals need to take responsibility for their data: becoming educated about their data and how to control it, and being given more control over it (especially when it is in the hands of institutions). This part of the discussion is largely absent from the overall debate, and it needs to be given its due attention.
Any thoughts on how to move this individual-responsibility discussion forward?