Data sovereignty comes back on the agenda
Revelations from official monitoring of data in the US have brought the issue of data sovereignty – where data physically resides – high on many CIOs’ agendas. Len Padilla, VP of Product Strategy, NTT Communications in Europe, discusses why this is happening and how CIOs should respond.
8 October 2013 by Len Padilla
Growing numbers of CIOs and IT organisations are turning to the Cloud to help them gain a competitive edge for the business, cut costs and do more with less. Nearly three quarters of the UK CIOs we surveyed in May this year agreed they were using the Cloud in some capacity within their infrastructure. One of the key benefits of doing so, the study found, was enabling truly ubiquitous access to business applications and data – anywhere, and on any device. This is indeed the Cloud’s biggest advantage over in-house IT infrastructure – but, as recent events have demonstrated, is it also its greatest weakness?
Revelations about the US government’s PRISM surveillance programme by the whistle-blower Edward Snowden have prompted many organisations to rethink their investments in the cloud. Among the allegations Snowden made were claims that US-based technology firms colluded with the government to provide on-demand access to data held on their systems. Many providers have subsequently denied the allegations, but the damage has been done.
All of a sudden, data sovereignty – the physical location where data is stored and the kind of organisation that is storing it – matters to CIOs. Keeping data in data centres in a country where the authorities could access or monitor it without consent constitutes a significant business risk. The ability to choose where data resides – and even to transfer it from one jurisdiction to another on demand – is now a prime concern.
Policymakers in Europe have lent their voices to this concern. German Interior Minister Hans-Peter Friedrich has said: “Whoever fears their communication is being intercepted in any way should use services that don’t go through American servers.” And the allegations have hit US cloud providers hard, with predicted losses of $21bn to $35bn, according to the Information Technology and Innovation Foundation.
Anyone moving information out of their immediate control needs to accept a degree of lost control and a reliance on third parties. However, the question of whether you trust your supplier must now extend to its government.
Yet moving corporate applications and data back into localised clouds – or even back in-house altogether – is not the answer either. Cloud platforms do help firms become more agile, and do help foster technology innovation, even in the most risk-averse organisations. CIOs need to retain those benefits while also protecting the organisation, and the data it holds, from being compromised in any way.
Clearly, scrutinising cloud providers’ global network and data centre footprints, including where they are headquartered, is a crucial first step. Arguably as important is the ability to restrict or move data around on demand – to support new branch offices in new countries, for example. This is technically demanding, as it requires the entire network, server and storage infrastructure to be virtualised and automated to a large degree. Delivering such enterprise cloud services worldwide without touching any US-located infrastructure at all is even more difficult: routing across the Internet is automatic, so there is no way of predicting which path data will take, and it may traverse any number of routes – including ones that pass through the US.
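To make the idea of restricting data on demand concrete, here is a minimal sketch of how a customer might pin storage to a single jurisdiction, assuming the provider exposes an explicit region or location parameter. It uses the AWS S3 API via boto3 purely as an illustration; the bucket name and region are hypothetical, and other providers expose similar controls under different names.

```python
# Minimal sketch: pinning object storage to a single jurisdiction.
# Assumes a provider with an explicit region/location parameter; AWS S3
# via boto3 is used purely as an illustration, and the bucket name and
# region below are hypothetical.
import boto3

REGION = "eu-west-1"                  # keep the data inside the EU
BUCKET = "example-sovereign-data"     # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket in the chosen region only.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Ask the provider where it says the data actually resides.
location = s3.get_bucket_location(Bucket=BUCKET)
print("Data resides in:", location["LocationConstraint"])
```

The point is not the specific API but that, with the right provider, location becomes an explicit, checkable parameter rather than an implicit property of the infrastructure.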
So how does this affect ‘The Cloud’? Gone are the days when you could visit a data centre and be shown the very server where your information is stored. However, the Cloud’s versatility enables data to be placed wherever the supplier chooses to put it: restricted to one country (or several), to one data centre (or several), or even to a particular set of servers within a data centre. Wherever the data ends up, though, the supplier probably will not give you audit rights to check the security that has been implemented. Due diligence is therefore as important as ever when choosing a cloud provider, and each business moving to the Cloud will need to consider the following questions (a short verification sketch follows the list):
· Is your provider headquartered in the US, or does it have a US presence?
· Does your supplier have good security credentials?
· Can you restrict where your data is located within your supplier’s cloud?
· Does your supplier allow you to audit them?
· Does your supplier guarantee data destruction?
· Is your cloud supplier experienced at handling sensitive sectors such as financial services?
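As a complement to that checklist, the sketch below shows what a simple location check might look like in practice, again using AWS S3 via boto3 purely as an illustration; the approved-region list is a hypothetical policy choice, and the same idea applies to any provider that reports where each storage container lives.

```python
# Minimal due-diligence sketch: flag storage that sits outside an
# approved set of jurisdictions. Illustrated with AWS S3 via boto3;
# the approved-region list below is a hypothetical policy choice.
import boto3

APPROVED_REGIONS = {"eu-west-1", "eu-central-1"}  # hypothetical policy

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    # get_bucket_location returns None for the default us-east-1 region.
    region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
    status = "OK" if region in APPROVED_REGIONS else "OUTSIDE POLICY"
    print(f"{name}: {region} ({status})")
```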
The Cloud’s appeal has mushroomed as CIOs have embraced its promises of increased flexibility and cost control. Whilst the Cloud is not a guarantee against covert spying or the interception of traffic, it offers flexibility and a level of security that support businesses’ cost-effective expansion across territories. It is therefore a great shame that the national-security regime of one country is now causing this compelling idea to unravel somewhat.
We hope the PRISM affair turns out to be just a temporary setback to truly mass-market adoption of cloud computing. For now, however, CIOs must factor data sovereignty into every cloud computing decision.