Historically, federal agencies have been wary of using the public cloud due to security concerns. Yet the agility and cost savings offered by cloud infrastructure are proving to be major incentives, leading to a recent push for agencies to re-engage with public cloud providers.
Security remains a central issue, and many agencies are evaluating cloud service providers whose products take a cloud-first approach, treating them either as a firewall in the cloud or as a way to outsource security to the cloud.
However, no one is yet entirely clear about how this will work. Agencies are grappling with how to perform audits in the cloud, and how best to comply with National Institute of Standards and Technology (NIST) requirements and other regulatory standards for securing and encrypting data. How does an agency get encrypted data to the cloud? Once it's there, can it be read by the agency's applications? Even basic elements like authentication, SAML, and single sign-on in the cloud have not been completely worked out.
A factor that has both contributed to the popularity of the cloud and destabilized cybersecurity is the internet of things (IoT). Though it is still finding its legs in the consumer space, IoT is actually quite mature in the industrial realm because it offers a much more immediate financial incentive for the industrial, mining, and energy sectors. IoT is also becoming quite popular for the physical security of army bases and other high-security government facilities. Wherever a process needs to be automated, IoT is usually applied.
The danger is that, in doing so, these industries and facilities may be exposing themselves to risk. IoT devices often run a basic web server, with a user or smart-device interface connected to the cloud. This setup is highly exploitable.
There are many agency-specific best practices that focus on the types of applications and servers being used. In terms of large-scale policies, most agencies still rely on NIST, whether for compliance or for a cybersecurity framework. Agencies also use Security Technical Implementation Guides (STIGs), a cybersecurity methodology for standardizing security protocols within networks, servers, computers, and logical designs to enhance overall security. Each agency has its own specific standards as well.
Even with all these macro and micro standards, however, some agencies remain unaware of the extra steps required to audit cloud environments. For instance, an agency may be using a client-server application that employs encryption and two-factor authentication, with agents logging on to a website connected to an application hosted in Amazon. If that agency's notes contain sensitive, classified information that is not legally allowed to leave the agency, that is a significant issue.
It is good that federal IT admins are becoming comfortable with putting data in the cloud and encrypting the data at rest, and perhaps even the data in transit. However, for true data security, the data that is now part of the application must also be encrypted, and a back-end audit is needed for that system.
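To make the distinction concrete, application-layer encryption means a sensitive field is turned into ciphertext before it ever reaches the cloud-hosted application's data store, not just while it sits on disk or travels over the wire. The sketch below is illustrative only, using the third-party `cryptography` package; the record and field names are hypothetical, and in practice the key would live in a key-management service the agency controls, not alongside the data.

```python
from cryptography.fernet import Fernet

# Hypothetical: the key is generated here for the demo, but a real
# deployment would fetch it from an agency-controlled key manager.
key = Fernet.generate_key()
fernet = Fernet(key)

record = {"case_id": "A-1001", "notes": "sensitive field contents"}

# Encrypt the sensitive field at the application layer, so only
# ciphertext is stored in the cloud-hosted system.
record["notes"] = fernet.encrypt(record["notes"].encode())

# An authorized, audited back-end process can decrypt on demand.
plaintext = fernet.decrypt(record["notes"]).decode()
print(plaintext)
```

The point of the sketch is that the cloud provider's own at-rest encryption never sees this field in plaintext; decryption happens only where the agency holds the key, which is exactly the step a back-end audit should verify.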
There are three best practices that will help federal agencies regain control of security and compliance in the cloud.
First, when an agency is designing applications, especially custom applications, they need to build application controls into their APIs that allow them to audit those applications. So when an agency thinks about applications, it also needs to make sure the applications are cloud ready. This means that not only can the application run on Amazon or Microsoft, but also that it can be inspected deeply through APIs versus through just a gateway.
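One common way to build that kind of auditability into an application's own APIs, rather than relying on inspection at a gateway, is to emit a structured audit record for every call. This is a minimal sketch under stated assumptions: the decorator, the in-memory `AUDIT_LOG`, and the `read_record` endpoint are all hypothetical, and a real system would ship these records to an append-only log store.

```python
import functools
import json
from datetime import datetime, timezone

# Hypothetical in-memory audit trail; a real deployment would forward
# these records to an append-only, tamper-evident log store.
AUDIT_LOG = []

def audited(action):
    """Decorator that records a structured audit entry for each API call."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(user, *args, **kwargs):
            AUDIT_LOG.append({
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "user": user,
                "action": action,
                "params": json.dumps(kwargs, default=str),
            })
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@audited("read_record")
def read_record(user, record_id):
    # Hypothetical data access; the audit entry above is written no
    # matter where the application is hosted.
    return {"record_id": record_id, "owner": user}

result = read_record("analyst1", record_id=42)
print(result)
```

Because the audit hook lives in the application code itself, the same records are produced whether the application runs on Amazon, Microsoft, or on-premises, which is what makes the application "cloud ready" from an audit standpoint.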
Next, think about what the policies are, or should be, for auditing or penetration testing in the agency's own environment. Some government cloud service providers do not allow penetration testing under their service agreements; many people do not realize they are prohibited from testing their application's security or performance because such testing could disrupt the shared cloud environment. Providers that do allow testing often impose very specific parameters and may require signing up for specific plans.
A third best practice is to consider what needs to be moved to the cloud, and when. One agency had 1,800 applications it wanted to move to the cloud over a five-year span. That's fine, provided there is a plan to support them and to retrieve all that data. A concrete plan for auditing and testing must also be in place before making such a move.
As federal agencies consider the possibilities of the cloud, they must weigh the cost and efficiency gains against the security concerns. The horrors of past government breaches still echo down the corridors of government buildings. No other entity holds sensitive information of greater consequence than the federal government, so best practices must be in place before cloud migration is safe. Even with the NIST framework and agency-specific rules, IT leaders at government agencies must take the steps required to ensure the highest standard of security, remain compliant, and audit the cloud. The best practices outlined above should serve as a helpful guide to get started.
A version of this article originally appeared in Federal Times.