
Data sovereignty comes back on the agenda

Revelations from official monitoring of data in the US have brought the issue of data sovereignty – where data physically resides – high on many CIOs’ agendas. Len Padilla, VP of Product Strategy, NTT Communications in Europe, discusses why this is happening and how CIOs should respond.

8 October 2013 by Len Padilla


Growing numbers of CIOs and IT organisations are turning to the Cloud to help them gain a competitive edge for the business, cut costs and do more with less. Nearly three quarters of the UK CIOs we surveyed in May this year agreed they were using the Cloud in some capacity within their infrastructure. One of the key benefits of doing so, the study found, was enabling truly ubiquitous access to business applications and data – anywhere, and on any device. This is indeed the Cloud’s biggest advantage over in-house IT infrastructure – but is it also its greatest weakness, as recent events have demonstrated?

Revelations about the US government’s PRISM surveillance program by the whistle-blower Edward Snowden have prompted many organisations to rethink their investments in cloud. Among the allegations Snowden made were claims that US-based technology firms colluded with the government to provide on-demand access to data held on their systems. Many providers have subsequently denied the allegations, but the damage has been done.

All of a sudden, data sovereignty – the physical location where data is stored and the kind of organisation that is storing it – now matters to CIOs. Keeping data in data centers in a country where the authorities could access or monitor it without consent constitutes a significant business risk. The ability to choose where data resides – and even transfer it from one jurisdiction to another on demand – is now a prime concern.

Policymakers in Europe have lent their voices to this concern. German Interior Minister Hans-Peter Friedrich has said: “whoever fears their communication is being intercepted in any way should use services that don't go through American servers.” And the allegations have hit US cloud providers hard – with predicted losses of $21bn to $35bn, according to the Information Technology and Innovation Foundation.

Anyone moving information from their immediate control needs to acknowledge a degree of loss of control and reliance on third parties. However, the consideration of whether you trust your supplier must now include their government.

Yet moving corporate applications and data back into localized clouds – or even back in house altogether – is not the answer either. Cloud platforms do help firms become more agile, and do help foster technology innovation, even in the most risk-averse organisations. CIOs need a way to retain those benefits in a way that also protects the organisation, and the data it holds, against being compromised in any way.

Clearly, scrutinising cloud providers’ global network and data center footprints, including where they are headquartered, is a crucial first step. Arguably as important is the ability to restrict or move data around on demand – to support new branch offices in new countries, for example. This is highly challenging technically, as it requires the entire network, server and storage infrastructure to be virtualised and automated to a large degree. Delivering such enterprise cloud services to the world without touching any US-located infrastructure at all is even more difficult. The routing of data travelling through the Internet is automated, so there is no way of predicting which path it will take, and there are any number of routes the data can follow.
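The “restrict or move data on demand” capability described above amounts to a jurisdiction-aware placement policy. A minimal sketch, assuming an invented inventory of data centers (the site names, country codes and function names are all illustrative, not any real provider’s API):

```python
# Hypothetical jurisdiction-aware data placement.
# The inventory below is invented for illustration only.
DATA_CENTERS = {
    "lon1": "GB",
    "fra1": "DE",
    "ash1": "US",
    "hkg1": "HK",
}

def eligible_sites(allowed_countries):
    """Return data centers located in one of the allowed countries."""
    return sorted(dc for dc, country in DATA_CENTERS.items()
                  if country in allowed_countries)

def place(dataset, allowed_countries):
    """Pick a site for a dataset, failing loudly if the policy cannot be met."""
    sites = eligible_sites(allowed_countries)
    if not sites:
        raise ValueError(f"no data center satisfies policy for {dataset!r}")
    return sites[0]
```

In a real deployment this policy check would sit in front of the provisioning layer, so that data can never be replicated to a site outside the allowed jurisdictions in the first place.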

So how does this affect ‘The Cloud’? Gone are the days when you could visit a data center and be shown the very server where your information is stored. However, the Cloud’s versatility enables data to be placed wherever the supplier wants to put it – for example, restricted to one country (or several), one data center (or several), or even a set of servers within a data center. And wherever the data is put, the supplier probably will not give you audit rights to check the implemented security. Due diligence is therefore as important as ever when choosing a cloud provider, and each business moving to the cloud will need to contemplate the following questions:

·        Is your provider headquartered in the US or does it have a US presence?

·        Does your supplier have good security credentials?

·        Can you restrict where your data is located with your supplier’s cloud?

·        Does your supplier allow you to audit them?

·        Does your supplier guarantee data destruction?

·        Is your cloud supplier experienced at handling sensitive industries such as financial services?
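The questions above can be encoded as a simple pass/fail screen when comparing candidate providers. A sketch, assuming a made-up record format (the field names are invented for illustration, not a standard schema):

```python
# Illustrative due-diligence screen for a candidate cloud provider.
# Field names are assumptions for this sketch only.
REQUIRED = {
    "security_certified": True,      # e.g. recognised security credentials
    "data_location_control": True,   # can you restrict where data is located?
    "audit_rights": True,            # does the supplier allow audits?
    "guaranteed_destruction": True,  # is data destruction guaranteed?
}

def assess(provider):
    """Return the list of failed checks for a candidate provider."""
    failures = [name for name, wanted in REQUIRED.items()
                if provider.get(name) != wanted]
    if provider.get("us_presence"):
        # Not automatically disqualifying, but flagged for legal review,
        # per the jurisdiction concerns discussed in the article.
        failures.append("us_presence")
    return failures
```

An empty result means the provider clears the checklist; anything returned is a point to raise with the supplier before signing.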

The cloud’s appeal has mushroomed as CIOs have embraced its promises of increased flexibility and cost control. While the cloud is no guarantee against covert spying or interception of traffic, it enables flexibility and a level of security that help businesses expand cost-effectively across territories. It is a great shame that the national-security regime of one country is now causing this compelling idea to unravel somewhat.

We hope the PRISM affair turns out to be just a temporary setback to truly mass-market adoption of cloud computing.  For now, however, CIOs must factor data sovereignty into every cloud computing decision.

- Nov 22, 2023, you can now follow development of Hiveware's built-in apps. Just go to the top Hiveware domains, then find and click on (DEV). This will show a PDF history of these projects' development from a GUI perspective.

- June 15, 2021, Presented CableLabs with Hiveware Inc. and Microsoft findings that their DOCSIS 3.1 gateway modem specifications have not led to ISP vendors implementing IPv6 endpoint-to-endpoint reachability. Local reachability succeeds, but both intra-ISP and inter-ISP cable modem reachability fail.

- Sept 15, 2020, Determined that ISPs that offer IPv6, like Cox and Comcast, are not inter-connectable. See my explanation; this means Microsoft's socket library, Winsock2, is not to blame.

- May 18, 2020, Hiveware IPv6-IPv6/IPv4-IPv4 connectability succeeded in both Debug and Release builds. This breaks the stranglehold NAT has on Hiveware residential deployability (but only for intra-ISP comms for now; e.g., XfinityWifi does not work, where the problem lies with either Microsoft, Xfinity or Cox).

- March 17, 2020, Hiveware for Ipv4Ipv6Comms initial hive offering opens, until June 19th, 2020.

- March 16, 2020, Hiveware for MyFiles private Digital Asset App Offering closed; on March 17th, 2020, the Hiveware for MyFiles public Digital Asset App Offering opens and will close again on June 19th, 2020.

- March 16, 2020, Hiveware BigBang Test 2-PC Basic succeeded again, but this time using IPv6. This is the '1' of the decentralized '3-2-1 persistence' model.

- December 17, 2019, Hiveware for MyFiles private Digital Asset App Offering began and closes March 16th, 2020.

- March 17, 2019, Hiveware for MyFiles public ICO began and ended June 16th, 2019.

- January 17, 2019, Hiveware BigBang Test 2-PC Basic succeeded. This is the '1' of the decentralized '3-2-1 persistence' model.

- October 1, 2018, Hiveware LittleBang preview running again, this time using production engine code

- August 17, 2018, Hiveware for MyFiles private ICO will begin

- July 17, 2018, Hiveware ICO ended. SoftCap not reached.

- Jun 3, 2018, first to file for Securities Act of 1933 compliance regarding HVW-generating dapp ownership ICO sale

- May 11, 2018, Microsoft delivers native MFC (C++) on ARM64, opening mobile devices and market up to Hiveware code

- April 17, 2018, Hiveware ICO began

- April 13, 2018, white paper published

- Dec 27, 2017, Hiveware engine (4th rewrite) POC done
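The IPv6 reachability findings above (the June 15, 2021 and Sept 15, 2020 entries) can be checked with a minimal TCP connect probe between two endpoints. A sketch using plain sockets; the host and port are whatever the tester controls at each end, and this is an illustration rather than Hiveware's actual test harness:

```python
import socket

def reachable(host, port, timeout=3.0):
    """Attempt a TCP connection to host:port.

    Returns True if the connection succeeds, False on timeout or error.
    Works for IPv4 and IPv6 literals or hostnames, since
    socket.create_connection resolves via getaddrinfo.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running this between listeners on two cable-modem endpoints – same ISP, then different ISPs – is how one would reproduce the local-succeeds / intra-ISP and inter-ISP-fail pattern described in the entries above.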
