
Will Data Storage Have An Ethical Dimension In The Digital Future Society?

The subject of this article was inspired by an event I attended as part of this year’s Mobile World Congress in Barcelona: the Digital Future Society seminar in conjunction with the DFS Think Tank. 

In its own words, the goal of the DFS project is:

“To build an inclusive, equitable and sustainable future in the digital era. To this end, DFS engages experts, policymakers, civic organisations and entrepreneurs to explore, experiment and exchange knowledge about how digital technologies and society evolve together.”

A new ethics for the digital age

One of the panel workshops I attended was entitled ‘A new data ethics for the digital age’, a subject that on the face of it appeared to focus on discussions around privacy: what rights do we have as citizens to own our data in a digital society, and what obligations do institutions and corporations have to honour their responsibilities as data processors?

As the discussion progressed, however, I found myself thinking more and more about the role of my own industry in these considerations: in particular, whether there is an ethical, or even moral, dimension to the use of storage technology as well.

The discussion began with both a question and a thought-provoking statement: who do we trust most to keep our data safe, given that governments are trusted no more than businesses?

One of the panellists, the CIO of a major Spanish bank, commented that, from a commercial perspective, customers would make choices based on evidence and bestow their trust on organisations that looked after their data responsibly, in line with practical regulations like GDPR.

This made me wonder: how will customers and citizens become so well informed? Aside from the coverage of well-publicised data breaches (which nowadays can happen to just about any company, however reputable or responsible), how can consumers really make informed choices about how their data is being managed?

That thought was developed by one of the academic contributors, who asserted that citizens need to be more savvy and knowledgeable: not passive users, but informed experts. The NGO representative on the panel argued that for this to happen, there needed to be a greater degree of openness, some aspects of which may conflict directly with powerful commercial interests that would prefer less transparency. And clearly, doing business with an audience that may be less knowledgeable or sophisticated than you raises unavoidable ethical considerations about the balance of power.

‘I consent’ versus ‘I just want you to work’

To illustrate some of these points, the speaker posed a question using Google as an example (although not to argue that Google is doing anything ‘wrong’): do we give consent, with full understanding, for our everyday use of search engines to be analysed (to the depth that it undoubtedly is) for Google’s commercial benefit? Do we truly give permission for our digital shadow to be left behind and interrogated after we have ‘left’? Although the majority of citizens express concern about privacy when asked in isolation, in the midst of our daily lives we all tend to want things to ‘just work’.

Of course, it’s not just business that faces these challenges. Governments may also be viewed with suspicion because they are simultaneously using and regulating technology, all while trying to keep pace with developments. As the DFS Think Tank itself comments, using data the ‘right’ way is not just about what is technically possible, but also about what is desirable from society’s point of view.

That paradox was explored in more detail by the final panellist, a professional data scientist, as he described how specialists create the data sets and algorithms they use to provide companies and organisations with insight. Big data undoubtedly has revolutionary potential. But even the most well-intended big data analysis can have unintended ethical consequences. Some of these - such as AI and algorithm-driven approaches to data analysis that reinforce real-world biases - are now entering public debate. But my takeaway was that the general public has insufficient opportunity to see inside this world, and even less ability to understand what happens there.

Big bad wolves and fairy godmothers

This brings me back to my key question: will data storage have an ethical dimension in the Digital Future Society? In my opinion, it most certainly will. Sometimes I feel that phrases like ‘transparency and accountability’ are in danger of losing their force in these discussions. On the one hand, we want to shine a light on organisations, their algorithms and their usage of our data. But we also want these private and public institutions to be our custodians, keeping us safe and secure from the bad actors who undoubtedly exist. Transparency can itself create risk if openness is forced at the wrong time, or in the wrong fashion, putting data within reach of the wrong forces.

It’s as if we want the guardians of our data, be they governmental or private, to be both the big bad wolf and the fairy godmother at the same time. In my opinion, this duality is neither credible nor sustainable.

Storage is ethical to me because, depending on what kind of technology you use to store your data, you make it more or less accessible; more or less secure; more or less sustainable and environmentally friendly. And those trade-offs can have real-world impacts in terms of gender or racial equality, career fulfilment or personal opportunity - vectors that are most certainly moral.

It’s important to note that regulations like Europe’s GDPR may not completely assist in this regard. GDPR says nothing very specific about what technology you should use to store data. It merely affirms the outcomes it expects: for data to be available, amendable, disposable and secure. Keeping data on a flash array, or in the cloud, may make the information highly available but more vulnerable to cybercrime. Putting it all on magnetic tape and stowing it away in a vault could make it impossible for data subjects - me, you, everyone - to get access to it in a hurry. Keeping more data on an inexpensive medium like tape might permit an organisation to invest more in protecting the content that resides closer to the front line. But individual files preserved on tape may not be as simple to locate as objects stored in the cloud, and may be more difficult to use for analysis that could benefit society. How often do we hear ‘if only we had known at the time’ in public life? As with any ethical dilemma, there are always trade-offs.
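
To make those trade-offs concrete, here is a minimal, purely illustrative sketch in Python of how an organisation might weigh storage media against one of GDPR’s expected outcomes. The tier names, retrieval times, costs and the meets_access_deadline helper are all hypothetical inventions for this example; only the roughly one-month window for answering a subject access request comes from the regulation itself.

```python
from dataclasses import dataclass

# Purely illustrative model of the trade-offs described above.
# Tier names, retrieval times and costs are hypothetical, not vendor figures.
@dataclass
class StorageTier:
    name: str
    retrieval_hours: float  # how long before a data subject could see their data
    online: bool            # online media are more exposed to cybercrime
    cost_per_tb: float      # indicative cost, arbitrary units

TIERS = [
    StorageTier("flash-array",  retrieval_hours=0.01, online=True,  cost_per_tb=200.0),
    StorageTier("public-cloud", retrieval_hours=0.5,  online=True,  cost_per_tb=25.0),
    StorageTier("tape-vault",   retrieval_hours=48.0, online=False, cost_per_tb=5.0),
]

def meets_access_deadline(tier: StorageTier, deadline_hours: float = 720.0) -> bool:
    """GDPR allows roughly one month (~720 hours) to answer a subject access
    request; an offline tier is only viable if a restore fits that window."""
    return tier.retrieval_hours <= deadline_hours

for tier in TIERS:
    print(f"{tier.name}: exposed={tier.online}, "
          f"meets access deadline={meets_access_deadline(tier)}")
```

Even this toy model makes the dilemma visible: the cheapest and least exposed tier is also the slowest to answer the data subject.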

Into the future: society or system?

I think these considerations will only accelerate as we enter the imminent 5G future, where ‘always on, always connected’ data gathering will be as pervasive as oxygen to the functioning of our world. When every moment of your daily life can be captured in 8K resolution by countless cameras; when all your interactions can be recorded by IoT devices and shared instantly across 5G networks; when AI can be used to make instant decisions based on what you do, or what you appear ready to do, perhaps purely because of your skin tone, age or gender; then the underlying fabric of this vastly intelligent system, including the storage, presents an ethical choice for companies. Our networks will be as closed or as open as we choose, or demand, to make them.

That also made me wonder whether, one day, we might no longer use the word ‘society’ to talk about our relationships and interactions, choosing the word ‘system’ instead. A world based upon a grand alliance of humanity and technology may cease to be entirely human, especially since, as seems likely, there will be millions of robots with varying degrees of intelligence living and working alongside us.

Who looks after the data privacy rights of AI machines, especially those capable of generating new ‘personal’ information through machine learning? Artificial creations such as software bots, physical robots and synthetic biological constructs are unlike any human character, and yet something like individual agency can be said to be evolving in them: consider the way in which AI can now master games or hold conversations. What happens when those artificial agents begin to ‘learn’ ethics and morality entirely independently? How do we value these qualities?

We have a tendency to talk about data in the digital future in the abstract. But it is huge: data on an unimaginable scale, 175 trillion gigabytes (175 zettabytes) by 2025. We’re talking about the Milky Way rather than the Pacific Ocean. The Digital Future Society seminar was a fascinating and enjoyable discussion around a series of questions and challenges that will become ever more pressing as we move towards the stupendous data-driven visions being shared just down the road at MWC.

“Because of the close relationships among the physical, biological and social aspects of any large-scale technological project, advanced large-scale technology cannot be one-sided, in the service of narrow interests, short-sighted, and beyond control: it is many-sided, socially oriented, farsighted, and morally bridled”,

Mario Bunge, ‘Towards a Technoethics’

Andrew Dodd

Worldwide Marketing Communications Manager at HPE Storage
