Cyber Security and Strategy

Sovereign Clouds And IT Security: Cyber Security in The Cloud Era

How can we develop trust in the security of critical digital infrastructures and sovereign clouds? And what is needed ultimately to justify this trust? A geostrategic analysis of cyber security in the cloud era.

by Michael Barth and Alexander von Gernler

With regulations such as NIS2, the Cyber Resilience Act and the cloud standard BSI C5, Germany and Europe are setting standards. What is frequently branded as EU prohibitionism has a multitude of positive effects.

Michael Barth, Head of Strategy Department

Geostrategic Perspectives And Cyberspace

by Michael Barth

The world has become a less safe place. Instead of the bloc confrontation of the Cold War, today different players operate on different fronts, in part with fluid alliances. By recognizing cyber as a further operational domain alongside the army, navy and air force, almost all of the world's armed forces have acknowledged just how decisive superiority in the information space is – for their own information and command capability, but also for disrupting the infrastructure and defense capability of an adversary.

A New Era For IT Security

IT security was once considered a niche area in which only limited damage could be done. Today, it is a matter of digital sovereignty and therefore of national security. Severe disruption of the civil information infrastructure behind power plants, energy networks, hospitals, pipelines, airports or railroads is automatically of great interest from a military perspective. This is evident not least from the war in Ukraine and from earlier hacker attacks on civil infrastructure.

Attacks for military purposes can quickly and unintentionally spill over into civil structures. At the same time, the cyber domain offers possibilities for manipulating and impairing infrastructures below the threshold of armed conflict. The risk of escalation for state actors is thus lower than in other areas of confrontation, which makes such operations a preferred means.

It is therefore logical that the EU wants to use its legislation to close these gaps in the cyber domain as far as possible in order to safeguard the provision of public services. The NIS2 directive and the Cyber Resilience Act are particularly noteworthy. It can no longer be assumed that security is achievable by purely technical means. Instead, instruments such as an ISMS, ISO 27001 and IT baseline protection (IT-Grundschutz) in line with specifications from the German Federal Office for Information Security (BSI) aim to reduce attack surfaces, strengthen organizational structures and establish forward-looking risk management, including plans for dealing with the worst-case scenario.

Germany And Europe Are Setting Standards For Cyber Security

Defense in depth – the principle of layered security mechanisms that take effect should the first one fail – is applied not only at a technical level, but also from a political perspective. Here, Germany and Europe are setting standards. In addition to NIS2, the Cyber Resilience Act and the AI Act at the European level, Germany is implementing, for example, the cloud standard BSI C5 at the national level.

What is frequently branded as prohibitionism has a multitude of positive effects. As with the GDPR, C5 has led multinational companies such as Amazon, Google and Microsoft to conform to European regulations. In terms of gross domestic product, the European Economic Area is more than a match for the USA. In this respect, Europe often undervalues itself and, owing to the diverse voices in the Council and the Commission, does not appear as homogeneous as the USA – where a quite specific paradigm such as zero trust can on occasion be decreed for federal authorities by executive order.

It is evident that central functions such as access to cloud services or the separation of client and information in the cloud need to be additionally hardened by means of trustworthy services to enable usage in the context of classified information.

Michael Barth

Sovereign Key Technologies For Trustworthy Sovereign Clouds

In recent years, we have seen not only the increasing relevance of zero trust for sovereign information processing, but also the desire to consolidate official information processing on cloud technologies, as has long been common in business and commerce. The cautious approach of the public sector is understandable: unlike a private enterprise, the State cannot simply proceed by trial and error, because the knock-on effects take a long time to deal with. Procurement law in its current form is, moreover, only partially suited to contracts whose technological outcome is uncertain.

At the same time, niche users such as authorities entrusted with processing classified information are asking whether and how they can benefit from the promises of the cloud. It must also be clear that the requirements in terms of security features will be higher than for classic administrative tasks. This is because the information processed is by definition such that, if misused, it can be detrimental to the continued existence of the Federal Republic of Germany.

For this reason, it is clear that central functions such as access to cloud services and the separation of client and information in the cloud need to be additionally hardened by means of trustworthy services before they can be used in the context of classified information. Such trust anchors are therefore indispensable key technologies, and state users should demand their use in order to retain control over critical data.

Technical Perspectives Regarding IT Security And Sovereign Cloud

by Alexander von Gernler

For a long time, the cloud was considered "just somebody else's computer". In other words: if I can't trust the cloud operator, I shouldn't use the cloud for computation and applications. Merely storing data is fine, as long as it is properly encrypted. Using the cloud purely as remote file storage, however, unlocks barely any of its benefits. What I don't have under my physical control can't, from a technical viewpoint, be considered secure. This is why, well before the advent of the cloud, the era of service level agreements and security by contract began: providers, it was assumed, would not ruin their own business by handling customer data carelessly.

From Perimeter to Zero Trust

Along with the move to the cloud, networks have also evolved in recent years. From the old perimeter paradigm (inside good, outside bad), they progressed via microsegmentation (firewalls between individual parts of the network), intrusion and exfiltration detection, and software-defined networking to the zero-trust assumption that local networks should be regarded as untrusted or already compromised. Where filtering once took place at packet level, based on technical characteristics such as IP address and port, the filter criteria became more powerful with each higher ISO/OSI layer. In today's zero-trust setups, the most important criterion is the identity of the user; all other criteria can additionally contribute to a defense-in-depth approach.
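This shift in filter criteria can be illustrated with a small sketch. The policy table, user names and resource names below are hypothetical; the point is only that the access decision is anchored to an authenticated identity, while classic network signals merely add depth:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_id: str            # authenticated identity, e.g. from an OIDC token or mTLS cert
    device_compliant: bool  # device-posture signal, used only as defense in depth
    source_ip: str          # the classic perimeter criterion, now merely supplementary
    resource: str

# Hypothetical policy table: identity is the primary criterion.
ALLOWED = {("alice", "payroll-db"), ("bob", "build-server")}

def authorize(req: Request) -> bool:
    """Zero-trust style decision: identity first, other signals add depth."""
    if (req.user_id, req.resource) not in ALLOWED:
        return False   # no implicit trust from network location
    if not req.device_compliant:
        return False   # defense in depth: device posture must also check out
    return True
```

Note that `source_ip` never grants access on its own: a request from "inside" the network with the wrong identity is rejected, while a legitimate identity is accepted from anywhere, subject to the additional posture check.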

One hope still unrealized in practice is computation on encrypted data – homomorphic encryption. It quickly reaches its limits as soon as more complex mathematical operations are required: the known methods scale poorly in terms of time and computing effort, and many have not yet seen enough real-world use to develop genuine trust.
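The basic idea can be shown with a toy implementation of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The parameters below are far too small for any real security, and the sketch also hints at the limitation mentioned above – addition comes almost for free, while arbitrary computation does not:

```python
import math
import random

# Toy Paillier cryptosystem -- illustration only, insecurely small primes.
p, q = 10007, 10009
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)        # Carmichael function of n
g = n + 1                           # standard choice of generator
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)  # precomputed decryption constant

def encrypt(m: int) -> int:
    while True:                     # pick a blinding factor coprime to n
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = 12345, 67890
c = (encrypt(a) * encrypt(b)) % n2  # multiply ciphertexts ...
assert decrypt(c) == a + b          # ... to add the plaintexts
```

A cloud could thus sum encrypted values without ever seeing them; anything beyond such simple operations is where the known fully homomorphic schemes become expensive.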

If I can't trust the cloud, then I shouldn't use it. To ensure supply reliability, we urgently need a functioning market of national providers and sovereign cloud solutions.

Alexander von Gernler, Head of Research and Innovation

New Hope: Confidential Computing

A promising candidate has recently appeared on the scene: confidential computing. Here, special features of common processors (e.g., from AMD or Intel) are used to keep the main memory of an enclave fully encrypted and to decrypt data only when it is loaded into the CPU. This can be verified cryptographically from the outside, similar to measured boot in trusted computing. It is then no longer necessary to trust the cloud provider, but only the processor manufacturer – which is acceptable, since the processor manufacturer had to be trusted anyway. Moreover, a manipulated CPU offers an attacker considerably less leverage than a manipulated cloud stack.
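The external verification step can be sketched in strongly simplified form. In reality, the reported measurement is signed by a key rooted in the CPU vendor and checked by an attestation service; the stripped-down version below, with a made-up image name and reference value, shows only the core comparison a verifier performs before releasing secrets to an enclave:

```python
import hashlib
import hmac

# Hypothetical known-good measurement: a hash over the enclave's initial
# code and data, as published by whoever built and audited the image.
REFERENCE_MEASUREMENT = hashlib.sha384(b"enclave-image-v1.2").hexdigest()

def verify_attestation(reported_measurement: str) -> bool:
    """Accept the enclave only if its reported measurement matches the
    known-good reference value (constant-time comparison)."""
    return hmac.compare_digest(reported_measurement, REFERENCE_MEASUREMENT)

genuine = hashlib.sha384(b"enclave-image-v1.2").hexdigest()
tampered = hashlib.sha384(b"enclave-image-evil").hexdigest()
```

Only if this check (plus the vendor signature chain omitted here) succeeds does the client hand over keys or data – which is precisely why trust shifts from the cloud provider to the processor manufacturer.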

At present, confidential computing still suffers from measurable performance deficits, and the question of its final architecture remains open. Also still missing is a compatible framework that can reliably and scalably verify whether the desired virtual machines or Kubernetes pods are really running in a secure enclave.

Three Requirements Regarding The Security of Sovereign Clouds

If we now turn our attention to secure and sovereign official information processing – including classified information – whether in a private or a public cloud, what should our requirements be?

  • First: The available confidential computing system should be functional, tested and performant.

  • Second: Data should not be processed arbitrarily, but in accordance with a plan from the relevant authority – in Germany, the Federal Office for Information Security.

  • Third: To ensure supply reliability, we urgently need a functioning market of national providers and sovereign cloud solutions.

For the purposes of controllability, available proprietary solutions – such as the M365 deployment planned by Delos in the national datacenter – require the possibility of inspection by the BSI and by users. And it is absolutely essential that we do not fall short of existing standards such as BSI C5.

An abridged version of this article also appeared in edition 1/2024 of the trade journal iX.