This is a frequently asked question about the weather that meteorologists can answer fairly accurately these days. But this time it’s about computing clouds. Can we forecast how this technology will develop? Let’s see.
Will the landscape be full of clouds or cloudless?
Clouds are out there, everybody can see that. But the question about their development is whether their number will grow or fall. Water, temperature, and wind are responsible for clouds in nature. And what factors impact the development of computing clouds?
Data is water.
There’s more and more of it, because people keep creating content: pictures, videos, and so on. A growing number of IoT devices also generate data on their own.
Temperature is the need for processing and storing data.
We collect geolocation data from cars to know where the traffic is heavy. It allows us to get to our destination faster. We train artificial intelligence by feeding it huge amounts of data so it can help us make better decisions.
Data transfer is wind.
These days, the Internet is widely available even in the most remote areas. Currently, there are over 4.4 billion internet users, so in order to cover the entire human population this number would have to double. Apart from users, there are also a few billion IoT devices, and this number doubles every other year.
And what about the costs? As 5G and, later, 6G standards are added to transmission networks, data transfer prices will fall further. The price of data storage keeps dropping, down to a microcent per megabyte. The same trend can be observed in the cost of data processing.
Taking these trends into consideration, there can be only one conclusion: it’s going to get cloudy. The number of cloud providers, as well as the number and capacity of data centers, will keep growing. Gartner predicts that in 2021 most businesses will move the majority of their operations to the cloud, including advanced analytics. Already in 2020, users can expect noticeably more cloud solutions, and the market will grow by 15–20% in subsequent years.
Cumulus or Stratus?
There are different types of clouds, and cloud service providers differ as well. According to Synergy Research Group, currently the “sky” is dominated by three major players:
Amazon Web Services 39%
Microsoft Azure 19%
Google Cloud 9%
Other providers include Alibaba Cloud, IBM Cloud, and Oracle Cloud.
Newcomers include Salesforce, Rackspace, NTT Communications, and Fujitsu.
The foundations of public computing clouds are data storage, data processing, and data transfer. Their other benefits include very fast startup, global reach, easy scalability, security and reliability, and cost efficiency, since costs correlate directly with usage.
What direction will cloud solution personalisation take in the future?
Amazon Web Services (AWS)
Amazon has the advantage of being the first mover. The company is perceived as a provider of a mature, proven, and secure solution. Its strengths are a broad selection of functionalities and a continuous drive to implement pioneering solutions. As a consequence, its extensive customer portfolio ranges from startups, through mid-sized companies, to large corporations requiring the highest levels of security and availability. This makes AWS arguably the most versatile solution: a good fit for beginning your cloud operations, but also capable of handling strategic functions in an organisation. AWS is built on Amazon’s own needs and experiences, which means it’s naturally fine-tuned for e-commerce.
In the case of Microsoft, the resources at hand come primarily from the following environments: Windows, Office, Skype, and LinkedIn. And don’t forget about Xbox and the whole entertainment sector. Special attention should also be given to its very strong business line for managing enterprises (Dynamics) and facilitating data analysis (Power BI). Add strong support for programmers through Visual Studio, Teams, and GitHub, and you don’t have to be a great strategist to figure out that Microsoft Azure will most likely build on those products and their communities. Microsoft will develop services related to artificial intelligence, supporting decision-making processes and image analysis. On top of that, Microsoft services have for a considerable time been used by government entities in numerous countries, including the federal government in the U.S. This suggests that Azure will be the computing cloud favoured by the government sector.
Google specialises in software and, until recently, hadn’t paid much attention to hardware and servers. It adopted the PaaS model fairly quickly, but took much longer to offer IaaS as well. Google owns a fairly extensive ecosystem of its own consumer and business products: the Android operating system, experiments such as Android Auto, Wear OS, and Chrome, the G Suite office tools, and popular services like YouTube and Maps. Google also supports programmers with a growing set of dedicated tools. Given all this, Google Cloud might be the first choice for startups relying on the cloud. Its offer for corporate clients is far less attractive, however, when it comes to physical infrastructure, legal regulations, and data privacy. Going forward, with heavy investments in analytics and machine learning, Google shows that it aims to focus on solutions that will be very loosely connected with, or even independent from, the physical infrastructure of public cloud providers.
Is there a storm coming?
After carefully analysing the market, can we expect any sudden shifts? Let’s not forget that the remaining 20% of the market is held by a large group of smaller providers. Is it possible that the market will consolidate, leaving only a few major players? Consolidation would be a normal phenomenon, but several forces suggest otherwise.
Firstly, concerns about entrusting an organisation’s strategic functions to an outside company will remain. Instead of limiting oneself to a single provider, it seems reasonable to diversify the risk and divide various areas and functions between different providers.
Another factor is geography. The largest providers have data centers covering almost every region of the world, ensuring fast access to data. There are some exceptions, usually stemming from politics. In China, for instance, the three giants must work through Chinese intermediaries in order to provide such services, while the strongest position on that market is held by Alibaba Cloud. So for companies whose customers are mainly in China, Alibaba Cloud might be the best choice. Another example is the GDPR, which requires entities operating in Europe to store sensitive data in data centers located in Europe.
The third reason is the most important one. Apart from short-lived, local weather phenomena, we can also observe longer climate changes. And just as these recur in nature, technology is subject to similar cycles.
In the beginning, computing relied on powerful servers called mainframes. Client workstations were reduced to an absolute minimum: first punched cards and printers, later simple computers called terminals, with just a keyboard and a monitor.
The Second Era
The second era started with the personal computer, which had considerably enhanced capabilities: it could process and store data on its own. It became widely available in the form of the Spectrum, Commodore, Atari, Amiga, Mac, or PC. Connecting to a server was limited to situations where the machine couldn’t cope with complex calculations or with storing too much data. Apart from that, it could independently handle all the entertainment as well as office or creative work.
The Third Ice Age
It began not with the Internet, but with one of its services: the World Wide Web, which to many is still a synonym for the Internet. Web browsers emerged whose only purpose was to download data from a server and display it to the user. All data storage and processing was done on servers. The role of PCs shrank again, limited to entering data and displaying it in the form of a website.
Flash and Silverlight represented a “small glaciation”: tasks such as data processing, business logic, and rich visualisation were, yet again, supposed to be carried out by PCs. This attempt, however, was unsuccessful. Web servers expanded into server rooms, later into IaaS data centers, and then into PaaS services that could be called computing clouds.
Fourth Ice Age?
Alas, opposing forces have emerged again, attempting to reverse this trend and bring us into the fourth “ice age”.
Large corporations started doubting the privacy of data sent to public computing clouds. While the data may have been safer in the cloud than in corporate server rooms, people were wary of cloud service providers having too much access to their data.
That’s not all. Ensuring access to stored data was of paramount importance, and that required highly reliable internet connections, also on the organisation’s side. Reliable connections reduce the risk, but achieving very short access times to data may be physically impossible. For financial data used in transactions or production data used to control robots, even millisecond delays are unacceptable.
That’s why Hybrid Cloud solutions emerged.
In a hybrid cloud, companies partially benefit from the advantages of public cloud providers and partially rely on their own infrastructure. What brought these two worlds together? The answer is containerisation. It makes software development and management methods largely independent of the type of public or private cloud the software runs in. An example of this approach is Kubernetes, a solution supported by the largest cloud providers.
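To illustrate why containerisation bridges the two worlds: the same declarative description of an application can be deployed to a managed Kubernetes cluster in a public cloud or to an on-premises cluster. A minimal sketch of a Kubernetes Deployment manifest might look like this (the application name and image are placeholders, not a real product):

```yaml
# Minimal Kubernetes Deployment sketch: the same manifest can target
# a public cloud's managed Kubernetes or a private on-premises cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app                 # placeholder application name
spec:
  replicas: 3                    # three identical containers for availability
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
        - name: demo-app
          image: registry.example.com/demo-app:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Because the manifest describes a desired state rather than specific machines, the cluster it targets can live in AWS, Azure, Google Cloud, or a private server room.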
There’s more to our story. Currently, IoT devices generate large amounts of data that has no value in its raw form, so there is no need to send or store all of it. This is where the idea of Edge Computing comes in. With this approach, the burden of data processing shifts from the server side to the client side: data is initially processed and aggregated on the IoT device itself or in intermediate controllers.
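A minimal sketch of this idea, assuming a hypothetical device that samples temperature once a second: instead of transmitting every raw reading, the device aggregates a window of samples locally and sends only a compact summary to the cloud. The function and variable names here are illustrative, not a real API.

```python
# Edge-computing sketch: aggregate raw sensor readings on the device
# and transmit only a small summary upstream, instead of every sample.

def summarize(readings):
    """Reduce a window of raw samples to the few values worth sending."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# A window of per-second temperature samples stays on the device...
readings = [21.4, 21.5, 21.3, 21.6, 21.5]

# ...and only this small dictionary would be sent to the cloud.
payload = summarize(readings)
print(payload)
```

Five raw samples collapse into four numbers; at real IoT scale, the same pattern turns gigabytes of raw telemetry into a trickle of aggregates.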
A new trend emerges here: distributing the architecture once again. The idea is to keep using the functionalities of public computing clouds while running part of the cloud’s infrastructure on-premises. This gives organisations more control over their data and ensures quick, reliable access to it. The solution assumes that the organisation uses the same infrastructure as its cloud provider and bases all functional solutions on it. In effect, the company runs another micro-data center on behalf of its cloud provider. Although the company remains responsible for maintaining the infrastructure, the benefits of public solutions make this a very effective combination. Such solutions are already being implemented by the largest service providers, who call the approach “Distributed Cloud” and reflect it in products such as AWS Outposts, Azure Stack, Google Anthos, and Oracle Cloud at Customer.
Will the landscape be full of clouds or cloudless?
In conclusion, it now seems clear that the market will not consolidate into server rooms belonging to a handful of major cloud service providers. A fantastic opportunity has arisen for smaller and local cloud providers as well as telecom operators. Micro-data centers can change the cloud solution landscape. These companies will focus on hardware and network infrastructure, leaving the choice of platform to their customers or entering into partnerships with major service providers.
Following this reasoning, we can safely assume that clouds will spread out for the next few years. At least until the next ice age.