By: Adrian Cooper, Field CTO for NetApp Public Sector
Data is undoubtedly one of the buzzwords of the 21st century. In business especially, everything is data-driven; the daily headlines are filled with news of data breaches – and the eye-watering fines that nearly always follow – while almost every website seeks our permission to use our personal data in various ways.
In the wake of the pandemic, data’s importance reached even loftier, life-saving heights, with scientists using data to predict infection rates and the impact on our health service, while senior government officials stated that decisions of national importance would be made on the evidence of the data.
This growing dependence on data has only been reinforced by the UK Government’s recent release of its National Data Strategy: an ambitious, pro-growth strategy intended to help the UK build a world-leading data economy while ensuring public trust in data use. But what does this all mean for us?
Many people are happy to accept that data is now shaping our lives. Businesses too know the power of data in innovating, experimenting and driving a new era of growth. But at the same time, there’s a heightened awareness of the need to protect personal data against cybersecurity threats.
Ultimately, this requires us to find a balance between risk and opportunity – a task made ever harder by the increasingly sophisticated nature of these threats and the growing value attributed to electronic information, with data widely perceived as the new oil. But above all, it requires a deeper understanding of what we mean by ‘data’ – and how it’s truly shaping our world.
Data makes all the difference – but how?
In today’s data-driven world, the dialogue around data is increasingly business-led rather than technology-led, with organisations asking themselves: how can we use data to cut costs and improve customer experiences? Can we use data with machine learning to gain a competitive advantage? How can we protect data against theft to avoid reputational damage and the risk of financial penalties?
At the same time, businesses are looking to the cloud and exploring how data-intensive computing can elevate their operations, service levels and agility, while removing the burden of legacy technical debt. And in the public sector in particular, effective data sharing within and between government agencies is viewed as the key to improving services for citizens and businesses alike.
Indeed, the Data Saves Lives whitepaper, published in June 2021, states that “data made all the difference” in combatting the pandemic. It goes on to establish how data should be collected and stored, and how it should flow through the system in a usable way – which one senses will serve as the blueprint for the government going forward.
For the aforementioned scientists and government officials, data will largely be defined by statistics pertaining to events such as COVID-19. For businesses, it may well be personally identifiable data or information used to enhance products and services.
Despite all of this, however, one fundamental question keeps popping up in my mind when I hear and read about such topics: do we have a common understanding of what is meant by the term data?
I ask this because the term has become so ubiquitous, and yet it will have a very different meaning from one person to the next and from one business to another. This has led me to conclude that most of these conversations are really about information. While data is measured in bits and bytes that do not carry any specific meaning, information is presented in a meaningful context, in a way that informs decision-making. Ultimately, information depends on data, but data does not depend on information.
Or to put it another way, data is analogous to crude oil in that it needs to be refined (into petrol or diesel) and distributed to the point of demand in order for it to be truly valuable.
So where does this leave us?
Firstly, context is all-important. When commencing a conversation about data, we need to be aware that different individuals, organisations and sectors have varying views on its intended meaning. We should seek to establish a common understanding of the setting and of any refinement or transporting of data that needs to take place in order to fulfil a need or meet a business objective. Business leaders, in particular, care little about how data is stored or managed, but they do now have heightened expectations as to the way it can and should be utilised to drive business value.
Secondly, if we’re to accept that data sharing will be commonplace in the not-so-distant future, we must also accept that new access controls and security methods will be necessary to support control across complex cloud ecosystems. So, consider: how can immutable copies of data be used to achieve this without adding risk? And how can storage-efficient technology – such as cloning – reduce waste and mitigate the need to create multiple copies of data?
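By way of illustration only – and assuming a purely hypothetical storage REST endpoint rather than any specific vendor API – the sketch below shows how a data-sharing workflow might first request a retention-locked (immutable) snapshot of a volume and then hand a space-efficient clone of that snapshot to a downstream consumer, so the original data is never copied in full or exposed directly.

```python
import requests

STORAGE_API = "https://storage.example.internal/api/v1"  # hypothetical endpoint
TOKEN = "..."  # in practice, injected from a secrets manager


def create_immutable_snapshot(volume_id: str, retention_days: int) -> str:
    """Request a read-only, retention-locked snapshot of a volume."""
    resp = requests.post(
        f"{STORAGE_API}/volumes/{volume_id}/snapshots",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"immutable": True, "retention_days": retention_days},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["snapshot_id"]


def clone_for_consumer(snapshot_id: str, consumer: str) -> str:
    """Create a thin clone of the snapshot for a named consumer.

    The clone shares blocks with the snapshot, so no full copy is made.
    """
    resp = requests.post(
        f"{STORAGE_API}/snapshots/{snapshot_id}/clones",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"consumer": consumer, "read_only": True},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["clone_id"]


if __name__ == "__main__":
    snap = create_immutable_snapshot("vol-citizen-records", retention_days=90)
    clone = clone_for_consumer(snap, consumer="analytics-team")
    print(f"Snapshot {snap} locked; clone {clone} shared with analytics")
```

The point of the sketch is the pattern, not the endpoint names: the source data stays immutable, and each consumer works from a lightweight clone rather than another full copy.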
Finally, we should think about building a data fabric: an architecture and set of data services that provide consistent capabilities across endpoints, spanning hybrid multicloud environments. This presents the opportunity to decouple the dependencies between applications, data centres and the underlying storage – enabling more holistic approaches to managing the entire lifecycle of data.
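As a loose illustration of that decoupling – a sketch of the general idea, not NetApp’s actual data fabric implementation – the snippet below defines a single storage interface that an application codes against, with interchangeable on-premises and cloud backends behind it. Lifecycle decisions such as tiering, retention or replication can then be made at the fabric layer without touching application code. All names here are invented for the example.

```python
from abc import ABC, abstractmethod


class DataEndpoint(ABC):
    """Common interface the application codes against, regardless of location."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, payload: bytes) -> None: ...


class OnPremVolume(DataEndpoint):
    """Stand-in for a data-centre file or block store (kept in memory here)."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._store[key]

    def write(self, key: str, payload: bytes) -> None:
        self._store[key] = payload


class CloudObjectStore(DataEndpoint):
    """Stand-in for a public-cloud object store (also in memory for the sketch)."""

    def __init__(self) -> None:
        self._bucket: dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._bucket[key]

    def write(self, key: str, payload: bytes) -> None:
        self._bucket[key] = payload


def archive_report(endpoint: DataEndpoint, name: str, content: bytes) -> None:
    """Application logic is identical whichever backend the fabric selects."""
    endpoint.write(name, content)


# The same call works on-premises or in the cloud; moving the data behind
# the interface requires no change to the application.
archive_report(OnPremVolume(), "q3-report", b"...")
archive_report(CloudObjectStore(), "q3-report", b"...")
```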
Done well, a data fabric provides greater protection against cybersecurity threats, with zero-trust models safeguarding against wider data loss. Ultimately, it empowers people and organisations alike to maintain control of their data by leveraging the power and flexibility of the public cloud – accelerating their data-driven future in the process.
To learn more about accelerating and safeguarding your own data-driven future, visit NetApp.