When the modern-day internet began emerging in the early 2000s, finding hosting services and resources to run the new wave of dynamic web applications was hard. You needed a database to store application data, and databases were slow, expensive, and unreliable, regularly bringing applications to a grinding halt when a single instance failed. You needed a server to run interpreted languages like PHP, Python, or Ruby, and servers were equally expensive, needed constant configuration, had security issues, and frequently ran out of memory or CPU, stalling applications just as badly.
For anyone on a small budget, running web 2.0-era applications meant constant configuration tweaking, careful performance tuning, and relentless cost-cutting, all within the typically tight confines of what a provider would even let you change and manage yourself.
In the years since those heady days, a growing patchwork of hosting providers has emerged to cope with the complexity and scale that web applications demand.
For the past 10 years, a significant proportion of applications have moved to a new generation of hosting known as “cloud hosting”. The term “cloud” is a bit vague, and there’s a popular (but not altogether accurate) phrase that says, “The cloud is just someone else’s computer”. In practice, the cloud abstracts away the complexity of managing the infrastructure mentioned above: instead of thinking about servers, you think about services and instances of services.
In the modern infrastructure world, when a database is struggling, you add another instance. If you have so many database and application instances that you’ve lost track of what’s happening, add another service or three for that, too.
Taking this abstraction to an extreme, “serverless” computing has surged in popularity over the past few years. This approach aims to reduce servers and services to something more like a function call. Of course, a server still handles all these function calls and responses behind the scenes, but the argument is that you shouldn’t need to worry about that and should focus only on sending and receiving data.
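To make that concrete, here is a minimal sketch of what a serverless function can look like, written as an AWS Lambda-style Python handler. The event shape and the greeting logic are illustrative assumptions, not a description of any particular production setup.

```python
import json

# A minimal AWS Lambda-style handler: the platform invokes this function on
# demand and bills per invocation; the developer never provisions a server.
# The event shape below (an API Gateway-style request body) is an assumption
# made for illustration.
def handler(event, context):
    # Read input data from the incoming request.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")

    # Do the actual work: here, just build a greeting.
    message = f"Hello, {name}!"

    # Return data; the platform turns this into an HTTP response.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": message}),
    }
```

Deploying something like this means uploading the function and letting the provider handle scaling, routing, and the servers underneath.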
More than 20 years later, web-based application developers’ lives are surely easier, aren’t they?
No, not really. There are many issues with developing and maintaining apps that run in the cloud. Thankfully, several European operators are trying to make developers’ lives easier again.
Before getting to them, here’s a quick terminology guide.
- Private cloud: Services used by only one customer.
- Public cloud: Services shared by more than one customer.
In both cases, customer data and details remain private, and everything could run in one or more locations. The main difference is that with a private cloud, the provider carves out a slice of infrastructure just for that customer. This is usually defined in software, but it could be dedicated hardware, and it could be a dedicated server running remotely or locally to the customer.
With that in mind, let’s dig into the problems in the cloud computing world.
The cloud is consolidated and monopolised
Cloud computing has hundreds of providers, yet most people only think of three: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), collectively known as the “hyperscalers” of the hosting industry.
The web is a big place, brimming with publicly and privately available sites, so precise numbers of what runs where are hard to come by. However, according to statistics from builtwith.com, about 12% of websites — approximately 86.8 million in total — run on AWS. The other two “only” host roughly another 12% combined. If you look only at hosting marketed as “cloud”, then according to techjury.net those shares rise to 32% for AWS, 23% for Azure, and 10% for GCP.
These statistics also depend on how you define a website. Hyperscalers offer hundreds of different services that developers use for one or more parts of an application, some of which perform crucial functions that break the application if they become unavailable.
This has caused problems in the past. Remember the various times when large swathes of online services became unavailable at once? That was probably one of these major companies experiencing an outage. Outages like these have led many developers to take a multi-cloud or hybrid-cloud approach, spreading risk by hosting services across multiple providers. This solves a technical issue, but it sends more revenue to the cloud providers and increases complexity.
This consolidation puts a tremendous amount of power into the hands of a few companies. If they change their policies, thousands of businesses could be left without a place to run their applications. More concerning is that the top three are all US companies, and of the top five, only China’s Alibaba is not.
The US already has data privacy, security, and law enforcement policies that concern many companies and jurisdictions, and while all the companies mentioned offer hosting in a variety of jurisdictions around the world, what if US politics no longer respected those digital borders? However unlikely that may seem, consolidation is always a risk.
Diversifying the cloud
Developers and their companies do not want to abandon the cloud completely. Rather, they are looking for options beyond the hyperscale hosts, especially in Europe, where increased regulation and uncertainty around relying on American services combine with a degree of nationalism encouraging people to use European services.
These trends create new global opportunities for alternative hosting providers, new and old, especially in Europe.
I spoke to three of the largest hosting providers in Europe to find out whether they are noticing the same trends and what they think the next 20 years of web hosting might look like. Two of them — France’s OVH (the host of around 4% of websites) and Germany’s Hetzner (around 5.5%) — have existed since the late 1990s, before the web 2.0 revolution and before “cloud” was a term. The third is the UK’s Civo, which is just seven years old but whose founders have many more years of experience helping companies bring their applications to the internet.


The cloud hasn’t lived up to its promise
As developers rushed to the cloud, it promised to make developing and running large, complex applications easier and more cost-effective. Anyone who has sat back to look at the myriad tools and processes they now have to use and maintain for a cloud-native application might wonder how true that is. The Cloud Native Computing Foundation (CNCF) landscape map has become so large that other tools and working groups are needed to help people navigate it.
While the hosts of yesteryear charged a reliable, steady amount per month, hyperscale cloud companies tend to charge by usage, which leads to unpredictable, spiky costs that are often hard to interpret and act upon. A recent report from CloudZero states that more than 20% of respondents have no clear idea of their cloud costs, and the bills behind them can run to thousands of line items. A report from IT support firm AAG found that 82% of respondents find managing cloud spending challenging.
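As a purely illustrative sketch of why usage-based billing feels unpredictable, the comparison below uses made-up prices and usage figures (none of them come from any real provider) to show how a traffic spike turns into a bill spike.

```python
# Illustrative comparison of flat-rate vs usage-based billing.
# All prices and usage figures are made-up assumptions.

flat_monthly_fee = 50.0          # a fixed-price server, billed the same every month

price_per_compute_hour = 0.08    # hypothetical usage-based rates
price_per_gb_stored = 0.02

monthly_usage = [
    {"compute_hours": 400, "gb_stored": 150},   # quiet month
    {"compute_hours": 1900, "gb_stored": 220},  # traffic spike
    {"compute_hours": 750, "gb_stored": 180},   # normal month
]

for month, usage in enumerate(monthly_usage, start=1):
    usage_bill = (usage["compute_hours"] * price_per_compute_hour
                  + usage["gb_stored"] * price_per_gb_stored)
    print(f"Month {month}: flat = {flat_monthly_fee:.2f}, usage-based = {usage_bill:.2f}")
```

The flat fee stays at 50.00 every month, while the usage-based bill swings from roughly 35 to over 150 in this example, which is exactly the kind of variance the reports above describe.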
The European difference
OVH, Hetzner, and Civo all said that the hyperscalers’ interfaces have become overly complex, with too many layers of abstraction needed to get started. People are now accustomed to more user-friendly interfaces; if AWS arrived today with its byzantine UI, it might struggle to attract as many users.
For Europe’s cloud competitors, this presents an opportunity. “With everything we do, we put ourselves in the shoes of a user,” said Dinesh Majrekar, CTO of Civo. “By using sensible defaults and clear cost indications, we aim to make it possible to scroll to the bottom of any form and create what you need.”
Hetzner’s approach is similar. As their spokesperson, Christian Fitz, told me, “We offer a straightforward user interface and an easy setup process, ideal for users who need cost-effective servers without complex administration.”
Cost efficiency
One of the biggest motivations in the current climate is cutting costs. The move to the cloud promised to save users money, or at least to ensure they only spend money on resources when needed, rather than paying for idle machines. However, as the AAG report mentioned earlier shows, companies are now paying more than ever to run their services. Granted, actual usage has likely also increased. For many, the issue is inconsistent monthly bills and the lack of transparency in the relationship between services, usage, and costs.
In recognition of how big this disconnect is, there is now an entire foundation, the FinOps Foundation, dedicated to maximising the business value of the cloud, and OpenCost, an open-source tool that helps show the cost of infrastructure decisions, was recently welcomed into the CNCF.
Around the time of KubeCon EU 2024, Broadcom had recently acquired VMware and announced significant pricing changes. VMware had been a popular option for running a private cloud, and while Broadcom rolled back some of the changes, the experience pushed many to look for more open, standards-based options for private cloud hosting. All three hosts emphasised the importance of letting customers switch between providers. The hyperscaler policy that most gets in the way of this is charging for egress, i.e. the cost of moving data out of a cloud service, which can make multi-cloud hosting prohibitively expensive.
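To see why egress pricing matters for multi-cloud setups, here is a rough back-of-the-envelope calculation. The per-gigabyte rate is an assumed ballpark figure for hyperscaler internet egress, not a quoted price from any provider.

```python
# Rough illustration of how egress charges add up when replicating data
# between clouds. The rate below is an assumed ballpark, not a real quote.

egress_price_per_gb = 0.09   # assumed USD per GB leaving the provider
data_replicated_tb = 20      # data copied to a second provider each month

monthly_egress_gb = data_replicated_tb * 1000
monthly_egress_cost = monthly_egress_gb * egress_price_per_gb

print(f"Moving {data_replicated_tb} TB/month out costs about "
      f"${monthly_egress_cost:,.0f} in egress fees alone")
# -> Moving 20 TB/month out costs about $1,800 in egress fees alone
```

At that assumed rate, simply keeping a second provider in sync costs thousands of dollars a year before any compute or storage has been paid for.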
OVH has pledged not to apply these charges. Yaniv Fdida, the company’s Chief Product and Technology Officer, said, “We have no fees on egress or traffic in and out. The cloud should be free. This enables our customers to balance workloads and have free choice. This is part of our tagline, and as far as tools are concerned, we use open standards.”
This is a growing trend amongst providers. As Majrekar notes, “We recently got rid of egress charges altogether. So that’s something you don’t have to worry about.”
A trustworthy cloud
Perhaps due to anxieties about keeping valuable data in other territories, national pride, or regulatory reasons, European companies often look to alternatives to the hyperscale hosts.
For OVH, that provides a chance to offer a more trustworthy service.
As an older company, OVH runs its own data centres and also builds much of its own hardware at locations in Europe and North America. The hyperscalers make a lot of their own hardware too, but the only hosts I could find that make hardware in Europe are OVH and Hetzner. This creates what OVH calls the “trusted cloud”: the company can guarantee higher data sovereignty thanks to its knowledge of the supply chain behind the hardware that processes customer data.
This doesn’t just lead to increased trust but also, as Fdida put it, an increased ability to innovate.
“Because we are not tied to any third-party suppliers, we can really accelerate our time to market and our durability and longevity,” he said.
A less wasteful cloud
While the hyperscale hosts rush to open power plants to meet the high energy demands for generative AI applications, smaller hosts are taking different approaches. Sustainability is one area where European providers (and global providers’ European operations) excel and have the potential to lead the world as other regions start to pull back from sustainability commitments.
Hetzner has made the issue a key selling point. As Fitz told me, “Our commitment to environmentally friendly hosting spans many years; in Germany, we power our data centres exclusively with hydropower, and in Finland, we also use wind power. Hosting that aligns with sustainability goals is becoming increasingly important.”
Civo, meanwhile, partners with the UK’s Deep Green to run the graphics processing units (GPUs) that many AI-heavy workloads rely on. Deep Green uses several methods to reuse the heat that servers generate; for example, it submerges servers in a special liquid and uses the captured heat to warm water for other purposes. While Deep Green already has 1,500 sites around the UK, it’s unclear how many of these Civo uses.
Deep Green and Civo aren’t the only ones trying this idea in Europe; Swiss host Infomaniak does something similar. These projects are a great example of how Europe’s typically denser cities can benefit from co-locating compute with buildings that can use its waste heat.
One often-missed aspect of making data centres sustainable is the embodied carbon of buying and decommissioning servers. OVH addresses this by recycling and reusing much of its hardware, keeping older machines running for less intensive use cases.
The future of the cloud
Looking back over nearly 30 years of web hosting, you can see many changes in demands, requirements, and the ways the industry handles them. If the sudden surge in demand for new AI tools is anything to go by, predictions are hard to make.
Fdida thinks the next challenge for providers will be quantum computing.
“Quantum will drastically change the way we look at workloads,” he said. “We are a pioneer in Europe and have been delivering simulators of quantum computing in our cloud for the last two years. And we have a real one hosted in our facilities with a company. We believe that it takes time to materialise how quantum can disrupt completely because it’s a new paradigm of looking at computing, right?”
Another issue for the future will be complexity. Mark Boost of Civo hopes that moving forward will also mean a return to simpler times.
“Before the cloud, you had hosting companies, you had FTP, and you could just move between providers,” he said.
“Amazon pretty much defined this new market in the early days and has created this world of complexity, which means that freedom of movement is so difficult. I’d love to see us get to that place, and I think the future will be that.”