Protecting Secret Zero No Longer Requires a Black Belt in Security

Protecting source code and data has added considerable time and complexity to development. Steve Van Lare, VP of Engineering at Anjuna Security, discusses how from-the-ground-up data and code security speeds development and frees innovation.

September 1, 2022

While pressures on developers continue to intensify, building security for code and data directly into the codebase no longer needs to be a concern, thanks to advances in confidential computing technology in the public cloud. The challenge with encryption lies in distributing keys rather than in the algorithms themselves. Previously, most systems still needed a secret to retrieve keys from a key manager, and that secret became the root of trust. The problem of protecting the code and operations' secrets was met by building secret zero, a master secret protecting all other secrets, directly into the code. Beyond the work involved in building such a system of cascading secrets and the lengths required to protect secret zero, the overall endeavor left a considerable single point of vulnerability or failure: if someone gains access to secret zero, the rest collapses like a house of cards. Such a scenario is not purely hypothetical; it is exactly what happened in the devastating SolarWinds breach.
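To make the bootstrap problem concrete, here is a minimal Python sketch of the pattern described above. The client class, URL, and environment variable are hypothetical stand-ins rather than any particular product's API; the point is that a single embedded credential gates every other secret.

```python
import os


class KeyManagerClient:
    """Hypothetical key-manager client, used only to illustrate the pattern."""

    def __init__(self, url: str, auth_token: str):
        self.url = url
        self.auth_token = auth_token  # secret zero: whoever holds this holds everything

    def get_data_key(self, name: str) -> bytes:
        # A real client would make an authenticated API call here; the point
        # is that every key request rides on the single bootstrap secret.
        return b"\x00" * 32  # placeholder key material


# Secret zero is typically baked into code, a config file, or an environment
# variable, so anyone who can read the deployment artifact can impersonate
# the application and unwind the entire chain of cascading secrets.
SECRET_ZERO = os.environ.get("APP_MASTER_TOKEN", "dev-placeholder-token")

km = KeyManagerClient("https://keys.example.internal", auth_token=SECRET_ZERO)
database_key = km.get_data_key("orders-db-encryption-key")
```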

Protecting Secret Zero

One way to protect secret zero is to make application code unreadable, and this is now more important than ever. It is also more complicated, as applications must be designed for mobile and cloud deployment, full scalability, and multi-cloud flexibility. This has primarily been attempted through obfuscation, but even when code is obfuscated, a dedicated hacker can still read it and find secret zero. Blockchain technologists have recognized this vulnerability and turned to Multi-Party Computation (MPC) to narrow the gap by distributing the private key over multiple computers, requiring an attacker to compromise each of them simultaneously to gain access. Even this approach has proven vulnerable, and organizations are moving MPC into the secure enclaves of confidential computing in public clouds to close the gap.
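Real MPC protocols are considerably more sophisticated (threshold signing schemes never reconstruct the key at all), but a minimal Python sketch of the underlying idea, splitting a key so that no single machine ever holds it, might look like this. All names here are illustrative.

```python
import secrets
from functools import reduce


def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def split_key(key: bytes, parties: int) -> list[bytes]:
    # n-of-n XOR sharing: the first n-1 shares are uniformly random, and the
    # last share is chosen so that XOR-ing all shares together yields the key.
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    shares.append(reduce(xor_bytes, shares, key))
    return shares


def recombine(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)


key = secrets.token_bytes(32)           # e.g. a wallet or signing key
shares = split_key(key, parties=3)      # one share per machine or party
assert recombine(shares) == key         # all three shares are required
assert recombine(shares[:2]) != key     # any subset looks like random noise
```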

Confidential computing adds new hardware security capabilities to help address these problems. Each processor now has a key built into it. Rather than embedding secret zero in code, developers can have the CPU generate a hash of their code and digitally sign it. When the signature is validated, it provides an attestation that the code has not been tampered with. A key release process based on hardware attestation then uses that attestation as the credential, releasing keys only to code that proves it is unmodified. This approach combines locking down the key with valuable attestation, so developers in essence gain two benefits from the same solution.
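As a rough illustration of that flow, the sketch below simulates attestation-gated key release in Python using the `cryptography` package's Ed25519 primitives. The measurement, the platform signing key, and the release policy are simplified stand-ins; real platforms (Intel SGX/TDX, AMD SEV-SNP, and the cloud services built on them) use vendor-specific report formats and certificate chains, but the control flow is similar.

```python
import hashlib
import hmac

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def measure(code: bytes) -> bytes:
    # Stand-in for the hardware-computed measurement of the workload;
    # real platforms compute this inside the CPU, not in application code.
    return hashlib.sha256(code).digest()


workload = b"application binary bytes ..."
expected_measurement = measure(workload)      # recorded at build/signing time

platform_key = Ed25519PrivateKey.generate()   # stand-in for the CPU's built-in key
attestation_report = measure(workload)        # what the hardware says is running
report_signature = platform_key.sign(attestation_report)


def release_key(report: bytes, signature: bytes) -> bytes:
    """Hand out the protected secret only if the attestation checks out."""
    platform_key.public_key().verify(signature, report)   # raises if forged
    if not hmac.compare_digest(report, expected_measurement):
        raise PermissionError("measurement mismatch: code was modified")
    return b"\x11" * 32                                    # the protected secret


try:
    secret = release_key(attestation_report, report_signature)
except (InvalidSignature, PermissionError):
    secret = None
```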

Confidential Computing Becoming Practical

Confidential computing environments are now widely deployed in public clouds, but using them previously required crafting or modifying code and processes. New technology makes their use transparent, so no special development is required. Some of this technology also makes deployments completely portable across the various confidential computing environments used by different public cloud infrastructures, so a multi-cloud deployment capability exists from the get-go and requires no additional work. This portability extends further: the same code or workload can use secure enclaves within private cloud and on-premises environments as well.

Just as microservices, libraries, development hubs, APIs, and API managers or gateways have assisted software developers, using confidential computing to ensure code security helps reduce developer workloads. These technologies foster innovation and keep developers focused more on features and functionality than on custodial mechanics. Of course, other forces and requirements add to developer stress and demand, so having tools that offset some of that pressure is a great benefit.

Addressing Encryption Gaps

Confidential computing also helps reduce complexity and work on the operational side. Right now, most organizations depend on encryption to protect data, and even code, while in storage (data at rest) or during transmission (data in motion). While these measures are secure, carry little to no penalty in performance or scale, and are relatively easy to enact, they face two primary issues. First and foremost, they protect only two of the three states of data and code: at rest and in motion. Although not widely discussed, data at runtime (during execution) is wide open to rogue insiders, third parties, or attackers, since data must be in the clear to be acted upon. This encryption gap is a growing concern, and its existence has not escaped the notice of more sophisticated attackers and hackers. Second, most organizations face a complex and potentially confusing jungle of encryption schemes. Because of this, it is sometimes difficult to know which encryption mechanism is protecting which assets. Questions such as how keys are maintained and protected and where the encryption terminates can be difficult, if not impossible, to answer. One age-old security adage is that complexity tends to undermine security: if security is too difficult, it will be undercut or rendered ineffective.
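To see the runtime gap concretely, consider a minimal sketch using the `cryptography` package's Fernet recipe. The field names and values are made up; the point is that encryption protects the stored bytes, but the moment the application needs to compute on a record it must decrypt it into ordinary process memory.

```python
from cryptography.fernet import Fernet

storage_key = Fernet.generate_key()
vault = Fernet(storage_key)

# Data at rest: what lands on disk or in object storage is ciphertext.
record = b"card=4111111111111111;limit=5000"
stored = vault.encrypt(record)

# Data in use: to act on the record, the application must decrypt it, so the
# plaintext sits in process memory, readable by anyone with root access or a
# memory-dump capability on the host. This is the runtime step that
# confidential computing keeps inside an encrypted enclave.
plaintext = vault.decrypt(stored)
limit = int(plaintext.split(b";")[1].split(b"=")[1])
```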

Confidential computing solves this problem in two ways. First, confidential computing locks down data and code during execution, so that no unauthorized party has access to what is in the CPU or memory, even with root access to the server. Besides protecting secret zero and ensuring immutability and attestation, confidential computing can prevent code from even being viewed. This is especially important for high-value, proprietary applications and those built on valuable AI or machine learning algorithms. Second, the companion technology that removes the need to modify code and processes for confidential computing can also extend runtime encryption to data in transit and at rest. The result is a unified encryption mechanism for all states of data and code, an approach that frees both development and operations from having to implement data and code protection themselves.


New Protection Critical for AI, Machine Learning and MPC

In the growing number of cases involving machine learning and AI algorithms, development requires testing and tuning on real data sets. Through the use of MPC in a confidential computing environment, AI and ML algorithms can be protected and kept completely under their owner's control. Partners or customers can let developers test and tune algorithms on their proprietary data without losing control of that data and without developers ever actually seeing it. Conversely, developers can open their code to various parties to run and tune it against real-world data without losing control of proprietary algorithms and without those parties having any ability to see the code.
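As a rough sketch of that arrangement, with the attestation and key-release machinery elided and every name hypothetical, the data owner seals the dataset and only code running inside the attested enclave ever sees the plaintext; everything outside the boundary receives only agreed outputs.

```python
from cryptography.fernet import Fernet

# Data owner side: the training data never leaves their control in the clear.
data_owner_key = Fernet.generate_key()
sealed_dataset = Fernet(data_owner_key).encrypt(b"1.2,3.4\n5.6,7.8\n")


def run_inside_enclave(sealed: bytes, released_key: bytes) -> dict:
    """Stand-in for the developer's code running in an attested enclave:
    the dataset is decrypted only here, and only aggregate results leave."""
    rows = Fernet(released_key).decrypt(sealed).decode().strip().splitlines()
    values = [float(v) for row in rows for v in row.split(",")]
    return {"rows": len(rows), "mean": sum(values) / len(values)}


# In a real deployment the data owner's key-release policy would hand
# data_owner_key only to an enclave whose attestation it has verified
# (as in the earlier key-release sketch); the developer never sees raw data.
metrics = run_inside_enclave(sealed_dataset, data_owner_key)
```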

Confidential computing environments in the public cloud can be a game-changer for development and operations, fostering greater innovation and greater availability of applications. Rather than bake security into the development—and deployment—process to ensure immutability and protect proprietary data and code, technologists can rely on the hardened security of confidential computing environments. The result can be better production environments and a fast-tracking of application innovation.

How do you think using confidential computing can revolutionize SecDevOps? Tell us on Facebook, Twitter, and LinkedIn.


Steve Van Lare

VP of Engineering, Anjuna Security

Steve is VP of Engineering at Anjuna. He has 30 years of experience in enterprise software and was formerly Vice President of Engineering at Automation Anywhere. In the security area, Steve was Vice President of Engineering at 41st Parameter, a leader in fraud detection. Previously, he was a VP at Oracle and at Agile Software. Steve has a BS and an MS in Electrical Engineering from Santa Clara University.