
Moving to the clouds? Take a parachute


Cloud IT services are not as safe as they seem.
There are three risk areas you ought to pay attention to in order to protect your business.

You think you’re stable, but you’re not.  It’s just an illusion.  You’re standing on clouds, not solid ground.  And clouds can melt at any time.  You’re no more secure than we are.  You just don’t happen to have realized it yet.

Alex Shearer, The Cloud Hunters

Why clouds? Back in 1994, computer engineers used a cloud-shaped image on a diagram to represent a cluster of servers and services external to the pictured network. It was a loosely sketched cloud drawn around things that happened somewhere else and were somebody else's business. If this sounds bizarre to you, look at the official definition in an IBM Global Education white paper: "virtualization is a technique for hiding the physical characteristics of computing resources from the way in which other systems, applications or end users interact with those resources." It appears that data storage and processing, communication channels, software and other services are provided in the cloud under the slogan "this is none of your business."

As a practical example, when you check and send e-mail via Gmail, synchronize data or pull up a Google map on your phone, you are using virtualization. Delivering cloud computing as an internet service hides its physical realization: the user neither knows nor controls what networks and operating systems are used, where the data goes or where it is stored.

We have all experienced, to a greater or lesser extent, the charm of cloud computing: it is available, mobile and scalable. The users who fall victim to trusting the convenience of the cloud usually go unnoticed by the world, but the price they pay is unexpectedly high. One of the most recent examples is the blog of Dennis Cooper. More than 14 years of daily work, the blog, the reviews, the book and correspondence, all of it vanished overnight without warning or explanation. I am not a fan of Mr. Cooper's artwork, yet it is hard to believe it took two months of complaints and petitions from numerous fans and colleagues, on top of acerbic articles in The New Yorker, PEN America and The Guardian, for Google representatives to enter into negotiations with Dennis Cooper's lawyer. As a result, Google agreed to restore the correspondence and relaunch the blog on a new domain, uploading it post by post after censorship review.

  1. Control

The vanished blog of Dennis Cooper illustrates the main disadvantage of cloud computing: the user's total dependence on a monopoly provider. And I don't mean the gradual increase in payments, upgrades to costly service plans, pay-as-you-go data traffic rates, uncontrolled updates and other fine print. Any data entrusted to cloud technologies can be destroyed, curtailed or censored as a result of a change in company policy.

One of the latest examples: 25 million personal accounts on the most popular social media, e-mail and search services were blocked by providers in Ukraine as a result of sanctions banning Russian business in the country. For those who missed the news and would have a hard time pointing out Ukraine on a map: it is the largest country in Europe.

Since we have started talking about Google, let's stay with this dominant service provider, which enjoys the best reputation in the USA, to illustrate the case. In 2011, without notice, Google made its News Archives unavailable, with 60 million pages of scanned newspapers published over the last two centuries. A year ago, Google Groups lost its search-by-date function, making the archive useless for research purposes. Around the same time, Google hit us with the news that Blogger, which has stored its authors' publications since 1999, would start blocking adult content, so "private nude or sexually explicit images or videos" would become invisible to other users. In 2016, Picasa and Google Cloud Connect, which had served millions of users for more than a decade, were retired.

It looks like Richard Stallman, creator of GNU and Emacs, was right when he warned us back in 2008 in an interview with The Guardian. "One reason you should not use web applications to do your computing is that you lose control," he said. "It's just as bad as using a proprietary program. Do your own computing on your own computer with your copy of a freedom-respecting program. If you use a proprietary program or somebody else's web server, you're defenseless. You're putty in the hands of whoever developed that software."

  2. Keeping the data safe

The data in the clouds is physically stored on millions of servers in data centers. Therefore, even if a provider deploys cutting-edge technologies, it still faces the standard set of Internet-related problems: blackouts, connectivity issues, natural disasters and cyberattacks. Considering the workload of cloud storage, it is reasonable to assume that the risks grow with the number of users. Back to our case with Google: on August 13, 2015, four successive lightning strikes hit its new data center in Belgium, causing a downtime that left customers without access to their data for four days.

Cloud providers, like any other companies, can have dishonest employees, and nobody is immune to financial problems. Around four providers go out of business every year, and our data dies with them.

The owners of cloud services have all the resources to provide resiliency at least equivalent to that of traditional hosting. The world's leading providers operate an enormous number of servers; Google, for example, has more than a million. Ample resources let providers distribute services across multiple servers in different regions. However, as a rule, the backup procedure comes down to creating a simple copy in another file on the same box in the same data room, which would not help much in case of a fire, earthquake or similar disaster. This theory was borne out by the scandalous Amazon cloud crash, where a physical failure in one availability zone had a knock-on effect on the others.

The trend in cyberattacks is an even more underrated threat for cloud users. Whenever you leave data on the Internet, bear in mind it can leak, whether intentionally or by accident. Google's CEO Sundar Pichai, Facebook's CEO Mark Zuckerberg, Twitter's ex-CEO Dick Costolo and Twitter CEO Jack Dorsey are among the 10% of social media users whose accounts have been compromised. According to the Ponemon Institute, the cost of a data breach is $7 million per incident. The highest price, though, is the undermined trust of our clients, who, as we know, vote with their feet.

  3. Confidentiality

Just as we cannot guarantee the safety of our data, we cannot be sure it is promptly removed from cloud storage on our command. No provider gives such a guarantee. Files in the cloud are deleted using a garbage collection technique: first the data is marked for deletion (mark), and only later is it actually wiped out (sweep). The transition from one state to the other can be postponed for a considerable period of time. Facebook states that "it may take up to 90 days to delete data stored in backup systems," Microsoft mentions "a suspended state for 60 days," while Google, Dropbox and Amazon do not specify any period at all. After the largest data breach of 2016, which exposed 400 million users of Friend Finder Network, it turned out that 15.7 million accounts had been deleted by their owners yet were still kept on the network, some for as long as 20 years, in the format email@address.com@deleted1.com, which is obviously not accepted for registration.
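The mark-and-sweep deletion described above can be pictured with a toy model. This is only an illustrative sketch, not any real provider's implementation; the names `CloudStore` and `RETENTION_SECONDS` are invented for the example.

```python
import time

# Toy model of two-phase (mark-and-sweep) deletion in cloud storage.
# RETENTION_SECONDS mirrors claims like Facebook's "up to 90 days".
RETENTION_SECONDS = 90 * 24 * 3600

class CloudStore:
    def __init__(self):
        self._files = {}   # name -> data
        self._marked = {}  # name -> time the user "deleted" it

    def put(self, name, data):
        self._files[name] = data

    def delete(self, name):
        # Phase 1 (mark): the file merely becomes invisible to the user;
        # the bytes are still on disk and in backups.
        self._marked[name] = time.time()

    def sweep(self, now=None):
        # Phase 2 (sweep): data is physically wiped only once the
        # retention window has elapsed -- possibly months later.
        now = time.time() if now is None else now
        for name, marked_at in list(self._marked.items()):
            if now - marked_at >= RETENTION_SECONDS:
                self._files.pop(name, None)
                del self._marked[name]

    def exists(self, name):
        return name in self._files
```

The key point is the gap between `delete()` and `sweep()`: from the user's perspective the data is gone immediately, but physically it survives until the provider's sweep runs, which is exactly the window the Friend Finder accounts fell into.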

Not only can our data stay up in the clouds for years, it can also be used without our consent.

Large Internet companies have been repeatedly criticized for violating confidentiality. In 2013, Edward Snowden disclosed evidence that the American government paid Google, Yahoo, Microsoft and Facebook for tapping into their servers to track users in a surveillance program known as PRISM. In 2012, the Federal Trade Commission charged Google its biggest privacy penalty ever, $22.5 million, for violating a privacy settlement.

Who shouldn't be up in the clouds

Based on the cases listed above, it is not recommended to use clouds when working with sensitive data, the kind of data that, if disclosed, can result in a loss. This especially concerns healthcare, defense and law enforcement agencies, financial institutions, insurance companies and e-commerce, which deal with critical consumer records and proprietary corporate information.

"You can't effectively protect your data if you don't know where it resides," states the 2016 Data Breach Investigations Report. The best way to protect yourself is to sign an SLA with your provider and, preferably, choose a data center located away from overpopulated areas, where natural disasters and power outages are unlikely. Moreover, the provider has to take responsibility for data security and for recovery after both natural calamities and system failures.

It's time to review your cybersecurity policy. Breach prevention and data protection are still a must-have. At the same time, it is no longer a question of "if" a breach happens but of "when" it does. Just like people, the more connected computers are, the more exposed and vulnerable they become. Today there are more Internet-connected devices than people, and there are no safe neighborhoods on the Internet. The cyberattack statistics are depressing: five years ago, Kaspersky Lab detected and removed about 50 to 60 new viruses daily; today it processes 310,000 new malicious files every day.

Due to the upsurge in cyberattacks and the evolution of advanced persistent threats, for which no guaranteed prevention methods exist, cybersecurity specialists believe it is better to shift your focus from stopping an unpreventable attack to mitigating the damage instead. "What companies need to be doing is switching away from trying to prevent hackers from getting into their networks." Such a statement should not come as a surprise from Anton Chuvakin, a Research VP on Gartner's Security and Risk Management Strategies (SRMS) team. Here is his alternative suggestion: "Thinking about how they can slow hackers down so they can catch them is much more sensible. If hackers steal your encrypted data but then have to spend three days searching for your encryption keys then you have a much better chance of detecting them."

Define your priorities and build a multilevel security system that matches the sensitivity of the data you store. Most companies tend to accumulate a substantial amount of data they no longer need. The simplest yet most underestimated way to limit the consequences of a cyberattack is to identify the data that could trigger public concern and reduce its volume, methodically destroying the data that is no longer relevant.

How the leading market players fight cyber threats

It is fair to share a couple of success stories. A mega breach could have harmed the users of Weebly, one of the most popular free web hosting services: hackers got access to the names, e-mail addresses and passwords of all database accounts, about 40 million users. However, the intruders were in for a big disappointment: all of the passwords were hashed with bcrypt, an attack-resistant algorithm that incorporates a unique salt per password. Weebly therefore had plenty of time to initiate a password reset against any eventuality. Besides, Weebly did not store customer payment details on its servers, so no one was hurt.
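Why did the unique salt make the stolen database worthless? Because identical passwords produce different hashes, so precomputed lookup tables are useless and every account must be cracked separately. bcrypt itself is a third-party library, so this sketch illustrates the same principle with the standard library's PBKDF2; it is a simplified stand-in for what Weebly ran, not their actual code.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                 iterations=600_000)  # deliberately slow
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                    iterations=600_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare
```

The high iteration count is the other half of the defense: each guess costs the attacker hundreds of thousands of hash operations, which is the "three days searching" delay Chuvakin recommends buying yourself.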

Datadog, which monitors and analyzes metrics for clients such as Facebook, Adobe, Samsung, Airbnb and Warner Bros., was also attacked, in July 2016. The stolen passwords became a worthless trophy for the hackers, since they too were hashed with bcrypt using unique salts. The Datadog agent, an application that gathers data for Datadog from clients' servers, was not affected by the attack. Datadog chief security officer Andrew Becherer explained, "Agents are isolated from our own infrastructure, only ever communicating outbound from your instances to us via HTTPS. Our agents do not send local credentials to Datadog servers for storage."

What should we do?

The public cloud fails to meet overregulated sectors' requirements for data control, confidentiality and safety. It was a common belief that the future belongs to the public cloud, with 98% of applications expected to move there. Yet this summer, Dropbox founder and CEO Drew Houston announced that Dropbox was moving back to a hybrid cloud model due to issues of "performance flexibility and, most importantly, the security". According to the latest IDG survey, "nearly 40% of organizations with public cloud experience report having moved public cloud workloads back to on premises, mostly due to security and cost concerns."

Despite their incredible popularity, cloud technologies fall short of dedicated solutions in reliability and safety. If your business relies heavily on the speed, safety and reliability of its servers, you would be better off using traditional hosting, picking colocation or a dedicated server for storing sensitive data. Aside from cost benefits, the main advantage of a dedicated sysadmin team lies in its strategic approach. Such a team builds a scalable system from the start, one that can be transformed at minimal cost on short notice. Sysadmins also use monitoring systems that let them react to problems proactively, as well as automated backup systems that notify them of successful or failed runs.

If you are interested in cutting your IT expenses, you can assign server administration to an outsourced professional sysadmin team. If your business needs dedicated servers operating failure-free 24/7 while internal IT costs stay down, it might be wise to settle for IT outsourcing: a server management company will handle round-the-clock operation, and it usually provides security and performance tuning, squeezing the most out of both hardware and software.

While Google does not seem to worry about the media frenzy over a famous artist's deleted blog, an outsourced sysadmin team will always have a backup, an incident response and disaster recovery plan, and a willingness to compromise in order to sustain its own reputation and its clients'.

Eva Bragger

Marketing manager at Supportex

www.supportex.net