Going Rogue: How did that data get in the cloud?

How much of your corporate data is sitting on an unused virtual machine running on the infrastructure of a cloud service provider? “Ah, but Mike, I don’t have any VMs running ‘in the cloud’!” Oh really? Want an easy way to check? Go to your Finance organization and ask for a report of corporate credit card use at Amazon. You may be surprised.

Now, I’m not knocking Amazon at all. It’s a great company doing some really innovative stuff. They’ve made it so easy to start up a virtual machine that I worry about when my kids will start using it!

But for that very reason of ease of use, you need to know whether someone in your organization, frustrated with the response of “It’ll take IT a month to provision you that,” just went “rogue.” He just couldn’t wait a month for a web server to be provisioned, so he went over to EC2 and started spinning up instances, copying data, and taking credit cards because IT couldn’t do it fast enough.

It’s this type of scenario that’s contributing to why many organizations are looking at how they can provide the business with the same flexibility and speed of an EC2-style environment, but from within their own datacenter. This is the essence of “Private Cloud.” Combine that with the ability to link a Private Cloud to a Public Cloud and you have a Hybrid Cloud: the nirvana of being able to “burst” virtual machines off of my infrastructure and onto a service provider’s infrastructure, all the while maintaining security.

Yeah… I’m going to put a sensitive virtual machine or data out into “the cloud,” where I have less visibility and control than in my own datacenter? Really?

Well, maybe. But only after you do the next step.

Assess and Measure Risk

How can we, from a security standpoint, really make this work? As any good security person will tell you, it’s about assessment and measurement of risk. Just because you can, doesn’t mean you should. In the case of virtual machines and data, the VMs and the data that reside on them need to be assessed, measured for risk, classified, and tagged. As I point out in the slide on the left, we need to start calculating a risk score on a VM or the data, and based on that risk score, we either keep the VM or data in-house or allow it to move to a datacenter out of our control.

Note that I have only four variables on the slide:

  1. Data Sensitivity
  2. Workload IP (intellectual property)
  3. Business Requirements
  4. Job Security

Obviously, there are many more variables that can and should be considered.

  • What about the current threat levels that I get from tools like RSA NetWitness?
  • Is there a new piece of malware out there that attacks the technology I used to develop the application?
  • Is it near the end of the quarter, and someone is a little antsy and wants things kept in-house until it closes?

All these things and more should be considered when deciding whether stuff should run in your datacenter or a datacenter out of your control.
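To make that concrete, here’s a minimal sketch of what the scoring could look like, using the four variables from my slide. To be clear, the weights, the 0-to-10 scales, and the threshold below are numbers I invented for illustration, not anything prescriptive:

```python
# Hypothetical risk-score calculation for a VM. The weights, scales, and
# placement threshold are invented for illustration only.

# Each variable is scored 0 (no concern) to 10 (maximum concern).
WEIGHTS = {
    "data_sensitivity": 0.40,
    "workload_ip": 0.25,        # intellectual property in the workload
    "business_requirements": 0.20,
    "job_security": 0.15,       # yes, even that one
}

PRIVATE_ONLY_THRESHOLD = 5.0    # at or above this, the VM stays in-house


def risk_score(vm_attributes: dict) -> float:
    """Weighted sum of the risk variables for a single VM."""
    return sum(WEIGHTS[name] * vm_attributes.get(name, 0) for name in WEIGHTS)


def placement(vm_attributes: dict) -> str:
    """Decide where a VM is allowed to run, based on its risk score."""
    if risk_score(vm_attributes) >= PRIVATE_ONLY_THRESHOLD:
        return "private-datacenter"
    return "public-cloud-ok"


# A low-risk static web server vs. a high-risk credit card app server:
print(placement({"data_sensitivity": 1, "workload_ip": 1,
                 "business_requirements": 2, "job_security": 0}))  # public-cloud-ok
print(placement({"data_sensitivity": 9, "workload_ip": 6,
                 "business_requirements": 8, "job_security": 7}))  # private-datacenter
```

The point isn’t the math. The point is that once a score exists, the in-house-or-not decision becomes a policy a machine can enforce.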

For example, say I have two servers. One is a web server with a bunch of static images that’s just there to serve up the images in a catalog, and the other is the application server that falls under PCI because it deals with credit cards. As a simple exercise, we could tag the first as “Non-PCI” and the second as “PCI”.

Today, if you are doing this calculation exercise, it’s probably a manual process. But if you’re talking about cloud-scale, this will have to be an automated process.
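To picture what the automated version enforces, think of those tags driving a simple placement policy. Another sketch; the tag names and locations here are made up for the example, not pulled from any product:

```python
# Hypothetical tag-to-location policy. In a real deployment, the tags would
# come from your CMDB or cloud management platform, not a hard-coded dict.

ALLOWED_LOCATIONS = {
    "PCI":     {"private-datacenter"},                   # never leaves home
    "Non-PCI": {"private-datacenter", "public-cloud"},   # free to burst
}


def can_run_at(vm_tag: str, location: str) -> bool:
    """True if policy allows a VM with this tag to run at this location."""
    return location in ALLOWED_LOCATIONS.get(vm_tag, set())


assert can_run_at("Non-PCI", "public-cloud")       # the static image server
assert not can_run_at("PCI", "public-cloud")       # the credit card app server
```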

A Look to the Future of Automated Security

Think about this for a second. All sorts of threat info is coming into your Security Operations Center. Based on that information, the security tools kick off changes to the virtualization and cloud infrastructure (which is SO easy to automate), and VMs either move in or out of different locations or states based on the real-time data. The assessment and risk measurement isn’t a one-time thing. It needs to be a continuous process.
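In code terms, that continuous process is really just an event loop. In the sketch below, threat_feed, reassess, and move_vm are hypothetical stand-ins for whatever your SOC tooling and orchestration layer actually expose; nothing here is a real product API:

```python
# Illustrative continuous-assessment loop. threat_feed, reassess, and
# move_vm are placeholders for real SOC and orchestration integrations.

def automated_placement_loop(inventory, threat_feed, reassess, move_vm):
    """Re-score every VM as threat events arrive and relocate as needed."""
    for event in threat_feed:                    # blocks until the next event
        for vm in inventory:
            new_location = reassess(vm, event)   # risk score -> placement
            if new_location != vm["location"]:
                move_vm(vm, new_location)        # e.g., pull it back home
                vm["location"] = new_location
```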

In our server example above, if you want to step the classification process up a notch, your DLP solution scans the servers, and if PCI data is found, the classification or tag changes, resulting in the VM being pulled out of the public datacenter and back into the private datacenter.
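Wiring the DLP piece into that same loop might look something like this. Once more, dlp_scan and migrate are hypothetical stand-ins, not any real DLP vendor’s API:

```python
# Hypothetical DLP-triggered re-classification. dlp_scan and migrate
# stand in for a real DLP product and an orchestration API.

def enforce_dlp_findings(vm, dlp_scan, migrate):
    """Re-tag a VM when DLP finds regulated data, then re-place it."""
    findings = dlp_scan(vm)                      # e.g., ["credit_card_pan"]
    if "credit_card_pan" in findings and vm["tag"] != "PCI":
        vm["tag"] = "PCI"                        # the classification changes...
        if vm["location"] == "public-cloud":     # ...and the policy follows
            migrate(vm, "private-datacenter")
```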

Obligatory Star Trek Reference

How cool would that be? Just like Star Trek: sensors detect a threat, the shields come up automatically (I never could understand why someone had to give an order for that to happen!), the phasers start charging, and the klaxon goes off. You adjust your tunic with the Picard Maneuver and take command of the situation before the Romulan de-cloaks to fire her final, crippling shot! Yes, I just mixed my TOS/TNG references.

Isn’t that how it should be? No surprises. Pre-determined workflows kicking off to protect my assets. Computers doing what computers do best: the manual, tedious tasks we don’t want to do, so we can concentrate on the bigger issues, like how many people are following you on Twitter. (1,462, but who’s counting?)

So, as we come full circle, and you’re now considering running that report on Amazon purchases over the past year and catching up on Star Trek on Netflix, remember that these risk scores are not calculated by the guy with a corporate credit card and a need for a web server.

And I would hope you’d agree that doing this in the physical world is MUCH harder. The automation capabilities of the virtual/cloud infrastructure can really enable security to work in a more measurable, consistent and adaptive way. The bad guys are adapting all the time.

Thanks for reading. Please comment. I’d love to hear feedback. I’d especially like to hear dissenting views. After all, I’m not a dyed-in-the-wool security guy. I don’t wear a red shirt.

mike
(Not Lt. Expendable)

1 comment

    • John Baker on July 4, 2012 at 6:02 am

    I thought the Romulans were the first species the Federation found that did not need to de-cloak to fire weapons. Discuss!!

    Good post. Very relevant.
