Confidential Computing
What is Confidential Computing?
Confidential computing refers to cloud computing technology that isolates sensitive data within a protected central processing unit (CPU) environment while it is being processed. That environment contains the data being processed and the code used to process it, and it is accessible only to specially authorized programming code. The CPU's protected resources are otherwise invisible and cannot be discovered by any other program or person, including the cloud provider.
Organizations are increasingly turning to hybrid and public cloud services, making data privacy solutions more important than ever. The main objective of confidential computing is to give companies greater confidence that their data is well protected and kept confidential, so they can feel comfortable moving it to a cloud environment.
This confidence is especially important for sensitive or business-critical workloads. For many companies, the move to the cloud involves trusting an unseen technology, which can raise difficult questions, particularly if unknown parties, such as the cloud provider, could gain access to their digital assets. Confidential computing seeks to allay these concerns.
Data encryption is not new to cloud computing. For years, cloud providers have encrypted data at rest, sitting in a database or on a storage device, and data in transit, moving through a network. Both have long been central aspects of cloud security. Confidential computing adds a third protection: data in use is also encrypted and isolated while it is being processed.
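As a simple point of reference, the sketch below illustrates the first of these established protections in Python: data is encrypted before it ever reaches a storage device, so only ciphertext is at rest. The cryptography package, key handling, and file name shown here are illustrative assumptions, not a prescribed implementation.

```python
# Minimal sketch of protecting data at rest: encrypt before anything is written
# to storage. Assumes the third-party "cryptography" package is installed.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, held in a key management service
cipher = Fernet(key)

record = b"customer_id=42,balance=1000"
with open("record.enc", "wb") as f:
    f.write(cipher.encrypt(record))  # only ciphertext ever touches the storage device

# Data in transit is typically wrapped in TLS; protecting data in use is the
# piece that confidential computing adds on top of these two protections.
```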
How Confidential Computing Works
Applications process data, and to do this, they interface with a computer’s memory. Before an application can process data, the data must be decrypted in memory. This leaves it briefly unencrypted, and therefore exposed, just before, during, and just after processing. It is vulnerable to threats such as memory dump attacks, which capture the contents of random access memory (RAM) written to a storage drive when an unrecoverable error occurs.
An attacker can deliberately trigger such an error to force the data onto disk. Data in use is also exposed to root user compromises, in which the wrong person gains administrative privileges and can therefore read data before, during, and after it is processed.
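The sketch below, continuing the illustrative example above, shows where this exposure window opens: the moment ciphertext is decrypted for processing, the plaintext sits in ordinary process memory. All names and values are hypothetical.

```python
# Sketch of the exposure window when data in use is handled without a TEE.
# Continues the illustrative example above; names and values are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)
ciphertext = cipher.encrypt(b"customer_id=42,balance=1000")

# To do any useful work, the application must decrypt into ordinary RAM.
plaintext = cipher.decrypt(ciphertext)

# From this point until the plaintext is discarded, it sits unencrypted in
# process memory. A crash dump written to disk, a debugger attached by a root
# user, or malware reading this process's memory could all capture it.
balance = int(plaintext.split(b"balance=")[1])
new_ciphertext = cipher.encrypt(str(balance * 2).encode())  # re-protected only afterward
```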
Confidential computing addresses this problem with a hardware-based architecture known as a trusted execution environment (TEE), a secure, isolated area within the CPU. The TEE is secured with embedded encryption keys, and embedded attestation mechanisms ensure that only authorized application code can access it. If malware or other unauthorized code attempts to access the encryption keys, the TEE denies the attempt and cancels the computation.
In this way, sensitive data stays protected while it is in memory. The data is decrypted and released for processing only when the application instructs the TEE to do so. While it is being decrypted and processed, the data is invisible to everything and everyone else, including the cloud provider, other compute resources, hypervisors, virtual machines, and even the operating system.
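To make the attestation idea concrete, the following is a simplified, software-only simulation of attestation-gated key release. Real TEEs perform the measurement and key release in hardware, and every name, value, and function below is hypothetical.

```python
# Simplified, software-only simulation of attestation-gated key release.
# Real TEEs perform measurement and key release in hardware; every name,
# value, and function here is hypothetical.
import hashlib
import hmac

TRUSTED_CODE = b"def process(data): return sorted(data)"   # code expected inside the TEE
EXPECTED_MEASUREMENT = hashlib.sha256(TRUSTED_CODE).hexdigest()

def attest(loaded_code: bytes) -> str:
    """Measure (hash) the code that was actually loaded."""
    return hashlib.sha256(loaded_code).hexdigest()

def release_key(measurement: str, data_key: bytes) -> bytes:
    """Hand over the data decryption key only if the measurement matches."""
    if not hmac.compare_digest(measurement, EXPECTED_MEASUREMENT):
        raise PermissionError("attestation failed: unauthorized or tampered code")
    return data_key

# Authorized code receives the key; tampered code is refused and the
# computation never proceeds.
key = release_key(attest(TRUSTED_CODE), b"secret-data-key")
try:
    release_key(attest(b"def process(data): exfiltrate(data)"), b"secret-data-key")
except PermissionError as err:
    print(err)
```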
Why is Confidential Computing a Breakthrough Technology?
Confidential computing is a breakthrough technology because it meets a need unique to cloud computing and in increasing demand: trustless security in a cloud computing environment. Cloud computing will likely remain the go-to solution for users who need to know that their software, computational workloads, and data are not exposed to cloud providers or to anyone else they do not want to have access to them.
Currently, if a bad actor successfully obtains or forges the credentials of a cloud provider, they can gain access to sensitive processes, data, and software. In a traditional, on-premises computing environment, unless the perimeter of the core infrastructure is left unsecured, the most direct way of accessing it is to carry out some type of in-person attack. An internal data center behind lock and key therefore gives users a sense of security.
Whether that confidence is justified or advisable is beside the point: the sense of control over the computing environment still engenders trust. The same level of trust can be achieved with confidential computing, but in a cloud environment where the digital assets may be thousands of miles away. This can pave the way for organizations to adopt the newest cloud technologies without worrying about data privacy or potential compliance issues.
Organizations subject to compliance regulations may feel far more comfortable shifting their workloads to the cloud, since even an accidental breach can leave a business facing stiff penalties or potential lawsuits. Without confidential computing, services like Google Cloud and Kubernetes can only provide so much confidence to those wary of cloud security. With options like Microsoft Azure confidential computing, however, sensitive data is isolated from unauthorized access, not just by people but by other applications and processes on the same machine.
Some of the benefits of confidential computing include:
- Protecting sensitive data while it is being processed: With confidential computing, data is protected not only at rest and in transit but also while it is being processed, so sensitive workloads can be handled by cloud assets. A lack of encryption during data processing was one of the biggest barriers for organizations that wanted to shift their computations to cloud environments. Confidential computing removes that barrier, paving the way for a departure from expensive in-house computing environments and a move toward a more flexible, agile, and cost-friendly cloud.
- Protecting sensitive intellectual property: In addition to protecting data, the TEE infrastructure can be implemented to guard business logic processes that companies want to keep secret. It can also be used to safeguard machine learning processes and the inner workings of entire applications. While data is always a priority for protection, the processes an organization uses in proprietary programs can be just as valuable, if not more so. This is particularly true when a process is integral to enhancing the efficiency of an organization or providing an offering with which another company cannot compete. The organization can deploy confidential computing in the cloud without worrying about an essential element of its offering being stolen by a competitor.
- Collaboration with partners in the creation of novel cloud solutions: One company may have sensitive data while another has a process they want to keep secret. However, the two companies want to combine forces to develop a new cloud solution. With confidential computing, they can share resources as they collaborate on the project without worrying about secrets leaking in the process. This can make it possible for even some of the biggest players in the industry to combine minds and resources to solve pressing problems.
- Giving customers more confidence as they choose a cloud provider: With confidential computing, a company can choose a cloud provider based on business and technical needs alone. It does not have to worry about how its data is stored and processed, or whether customer data may be intercepted as it enters or leaves memory, for example. Even if the cloud provider is also a competitor, the company can still feel comfortable doing business with it, because confidential computing removes the inherent conflict of interest.
- Protecting data processing that happens at the edge: If an edge computing architecture employs the use of cloud services, confidential computing can be the final piece in the puzzle for securing both the data and the edge computational workload.
The Confidential Computing Consortium
Some of the biggest names in technology, including the major CPU manufacturers, came together in 2019 to form the Confidential Computing Consortium (CCC). Its members included VMware, Tencent, Swisscom, Baidu, AMD, Alibaba, Google, IBM/Red Hat, Microsoft, Intel, and Oracle.
The goal of the CCC is to set industry standards that promote the open-source development of confidential computing resources. The consortium has already sponsored two projects, Red Hat Enarx and the Open Enclave SDK, which make it easier for developers to build applications that run across different TEE platforms without modification.
Members of the consortium had already started developing confidential computing tools before the organization was formed. These projects include IBM Cloud Data Shield and IBM Cloud Hyper Protect Services. Intel Software Guard Extensions (SGX), available on the Xeon CPU platform since 2016, also enables hardware-based TEEs. With these innovations already part of confidential computing’s foundation, the technology has a strong base on which to build and may become a go-to solution for cloud computing in the future.

How Fortinet Can Help
Built around the FortiGate-VM next-generation firewall (NGFW), Fortinet Public Cloud Security solutions support confidential computing, providing holistic protection for both data and workloads residing in the cloud. FortiWeb-VM and FortiWeb Cloud provide web application firewall (WAF) protection for cloud-based web software, mitigating attacks that target both known and unknown vulnerabilities. FortiMail delivers multi-layered protection for messaging systems against threats like spam, phishing, and Business Email Compromise (BEC). And FortiDDoS defends data centers from distributed denial-of-service (DDoS) attacks using a multi-layered approach.
Fortinet can also help organizations optimize cloud performance while maintaining confidential computing. The FortiADC application delivery controller provides unmatched application acceleration, load balancing, and web security. The DNS-based FortiGSLB Cloud global server load balancing tool can complement FortiADC for large global networks.
All these solutions can be tied together using the FortiCNP cloud-native protection tool, which natively integrates Cloud Service Providers’ (CSP) security features with the Fortinet Security Fabric. FortiCNP’s patented Risk Resource Insights (RRI)™ helps organizations manage cloud risk with actionable insights that help security teams prioritize their efforts.