The year is coming to an end, and analysts and publishers are offering their predictions for the year to come and the major emerging trends. As every year, Informatique News dedicates this week of transition between two years to these significant trends. Let's start with the Cloud. Cloud computing has undoubtedly been the great business computing revolution of the past decade. But the Cloud is always on the move. Here we have grouped the main trends and predictions of the various market analysts.
The Hybrid, Again and Again
For years, everyone has agreed that the Cloud will be hybrid, with some elements kept on premises and others deployed in the Cloud. But beyond the speeches, 2019 was a year full of announcements aimed at offering uniform platforms stretching from the on-prem datacenter to the hyperscalers via the edge.
On the one hand, there are the approaches of the internal infrastructure specialists. By announcing Tanzu and Project Pacific and by buying Pivotal, VMware is reinventing vSphere around containers and what they can bring to infrastructure hybridization. For its part, Nutanix is making its hybrid vision a reality with an increasingly comprehensive approach that extends the very idea of hyper-convergence (the pillar of its private Cloud) to different clouds.
On the other hand, the large hyperscalers now let companies host their core services on their own premises. This is what Microsoft offers with Azure Stack (based on appliances) and Azure Arc (based on Kubernetes), Google with Anthos (based on Kubernetes), and AWS with Outposts (based on in-house appliances managed by AWS).
For Gartner, the approach is taking shape to the point that it will invite companies to completely rethink their Disaster Recovery: a well-designed hybrid approach can go so far as to make current DRaaS (Disaster Recovery as a Service) offers redundant. Gartner also sees a new field emerging in 2020: HDIM, Hybrid Digital Infrastructure Management, meaning solutions specialized in the governance and management of hybrid infrastructures.
Repatriation Of Certain Workloads From The Cloud
Companies discovered in 2018 and 2019 that some of the Cloud's economic promises were not being kept and that the Cloud sometimes cost them more. The reason is simple: putting non-cloud-native applications in the Cloud is often not profitable. Especially for legacy applications of medium criticality, and when companies have already begun evolving their internal IT toward more agility (in particular by adopting hyperconverged infrastructures).
Helped by the hybrid-cloud trend, many companies are considering repatriating cloud workloads deemed too expensive to their internal infrastructure, probably to place them back in the Cloud later, once they have been modernized in a more cloud-compatible spirit. This is also one of the stakes in transforming VMs into containers, as VMware envisages with Project Pacific and as Google already practices with GCP / Anthos Migrate.
The Era Of The Hyper-Distributed Omni-Cloud
The year 2019 saw two strong trends materialize: "multi-cloud" (the buzzword of the year) and the distributed Cloud. Multi-cloud stems from the idea that companies will not entrust everything (all of their workloads or all of their data) to a single hyperscaler or public Cloud. They will combine them to improve their resilience, optimize their costs, and promote competition. And the solutions offering a more unified view of this division of labor across different clouds have multiplied.
There are even tools, notably at Nutanix and HPE, which make it possible to assess the cost of a workload depending on whether it is placed on the internal infrastructure (the private Cloud) or on public cloud X, Y, or Z. The "Distributed Cloud," for its part, results from the proliferation of the large cloud operators' data centers: services are distributed to different locations, but governance remains centralized in one place. The Azure Stack / Arc, Anthos, Nutanix, and VMware hybrid approaches extend this principle.
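The kind of per-workload placement-cost comparison these tools perform can be sketched in a few lines. The rates and the two-factor cost model below are purely illustrative assumptions, not Nutanix's or HPE's actual formulas:

```python
# Illustrative sketch of comparing a workload's monthly cost across
# placement targets. Rates and target names are made up for the example.

def monthly_cost(vcpu_hours: float, gb_stored: float, rates: dict) -> float:
    """Toy cost model: compute usage plus storage footprint."""
    return vcpu_hours * rates["vcpu_hour"] + gb_stored * rates["gb_month"]

# Hypothetical price lists for a private cloud and two public clouds.
TARGETS = {
    "private": {"vcpu_hour": 0.030, "gb_month": 0.05},
    "cloud_x": {"vcpu_hour": 0.045, "gb_month": 0.02},
    "cloud_y": {"vcpu_hour": 0.040, "gb_month": 0.04},
}

def cheapest(vcpu_hours: float, gb_stored: float) -> str:
    """Return the placement target with the lowest modeled cost."""
    return min(TARGETS, key=lambda t: monthly_cost(vcpu_hours, gb_stored, TARGETS[t]))

# A storage-heavy workload favors the cheap-storage public cloud,
# while a compute-heavy one favors the private infrastructure.
print(cheapest(vcpu_hours=100, gb_stored=5000))   # storage-heavy
print(cheapest(vcpu_hours=10000, gb_stored=100))  # compute-heavy
```

The point of the real tools is the same trade-off made visible: the cheapest home for a workload depends entirely on its resource profile.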
But AWS offers an even more advanced vision of this "Distributed Cloud" with Outposts (AWS servers placed within company data centers and managed by AWS) and with its AWS Local Zones, mini data centers located closer to business areas and managed as extensions of an AWS Region, to better serve applications requiring low latency. The union of these two realities leads to a new perception of the Cloud, at once hybrid, multiple, and distributed, which some call the "Omni-cloud."
The Reign Of Serverless And Its Different Faces
G2 Research does not hesitate to title, in a deliberately provocative way, "hardware is dead." It is true that with computers now entirely defined by software, the role of hardware is losing visibility, if not prevalence. And the new "serverless" trends will not help matters. Of course, hardware and servers continue to exist, but they become entirely transparent, especially for developers, who no longer have to worry about infrastructure, scalability, plumbing, and so on.
But G2 Research could also have added, "Kubernetes is dead, long live Kubernetes." Now that the container orchestrator has established itself everywhere, including on vSphere (with Project Pacific), the battle will be fought elsewhere, or rather one layer above: developers no longer want to worry about Kubernetes clusters.
They want to place their containers on a "serverless entity" that takes care of everything for them: security, placement optimization, life cycle, scalability, and so on. The technologies to watch are called Knative, Iron.io, and Istio. This is the era of "CaaS," Container as a Service, as already practiced on Azure (ACI), AWS (Fargate), and GCP (Cloud Run), but which will now also be practiced on private clouds, whether run on VMware, Nutanix, Azure Stack, AWS Outposts, or Google Anthos.
But CaaS is not the only face of serverless. FaaS, Function as a Service, offers a vision in which developers no longer even deal with the notion of containers: they simply code each piece of functionality of their applications, triggered by all kinds of events, and put it into production as they go, with the infrastructure taking care of everything. Often presented as the 2.0 evolution of PaaS (Platform as a Service), FaaS revolutionizes the approach to development. What hinders FaaS today is its lack of unification: Azure Functions (open source), AWS Lambda, OpenWhisk (at IBM), OpenFaaS, and more. There are many platforms, and by the event-driven nature of FaaS programming, they depend on event-management building blocks that are just as diverse (each Cloud offers its own). But FaaS is so well suited to a world of microservices and IoT that it will eventually prevail. 2020 could be a pivotal year for serverless, as 2019 was for Kubernetes.
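To make the FaaS model concrete, here is a minimal, hypothetical Python sketch: the decorator and dispatcher stand in for a platform's event bus, and none of the names (`on_event`, `dispatch`, `file.uploaded`) correspond to any real provider's API. The developer's entire job is the small decorated function; everything else belongs to the platform:

```python
# Toy model of Function as a Service: business logic is a plain function
# registered against an event type; a dispatcher (the "platform") invokes it.

from typing import Callable, Dict

EVENT_HANDLERS: Dict[str, Callable[[dict], dict]] = {}

def on_event(event_type: str):
    """Register the decorated function as the handler for one event type."""
    def register(fn):
        EVENT_HANDLERS[event_type] = fn
        return fn
    return register

@on_event("file.uploaded")
def handle_upload(event: dict) -> dict:
    # The developer writes only this: scaling, placement and
    # lifecycle are the platform's problem, not the code's.
    return {"status": "processed", "name": event["name"]}

def dispatch(event_type: str, payload: dict) -> dict:
    """Stand-in for the cloud's event bus triggering the function."""
    return EVENT_HANDLERS[event_type](payload)

print(dispatch("file.uploaded", {"name": "report.pdf"}))
```

The fragmentation the article describes lives precisely in the parts this sketch fakes: each cloud defines its own event shapes, triggers, and registration mechanics.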
SDN Finally Materializes
With the rise of cloud computing on the one hand and the arrival of 5G on the other, global companies must rethink their connectivity to cloud services in a direct, secure, reliable (redundant), fast, cost-effective, and fully automated way. This is generally much easier said than done. Still, this SDN (Software Defined Network) promise is the last component of the SDI (Software Defined Infrastructure) that has yet to materialize. But this time, telecom operators, cloud operators, interconnection operators, and businesses are all in the same boat, and network services will have to be automated and unified for the promises of the hyper-distributed Omni-cloud to materialize.
The Quantum Cloud Is Taking Shape
In 2020, companies will be able to get to grips with quantum computing. At least those with forward-looking teams working on visionary projects, because the few rare quantum machines available worldwide are now accessible to companies via the Cloud. IBM has already been operating a quantum cloud service for more than two years; it was joined this year by AWS, Azure, and Rigetti. These "Quantum Computing as a Service" offers provide, for the moment, only minimal access, but they will open up to a greater number of companies in 2020.
However, quantum computing requires radically different algorithmic approaches, and training true quantum programming specialists will take time. So companies might as well start that training as soon as possible. And even if quantum computing does not come to fruition anytime soon, this approach to quantum programming can help today's developers reinvent the algorithms that drive our current machines, as Ewin Tang demonstrated with thesis work on recommendation algorithms.
Machine Learning And AI Are Cloud Services
Between massive volumes of data, heavy training infrastructures that must be commissioned and decommissioned at will, and the "fail fast" nature of data experiments, the Cloud is the preferred destination for all Machine Learning and Deep Learning projects.
In recent months, "Machine Learning as a Service" offerings have gained a lot in flexibility and ease of use with the appearance of very interactive interfaces for selecting, working on, and preparing data. This is particularly the case at Azure with Azure ML Studio, at AWS with Amazon SageMaker Studio, and at IBM with Watson Studio. But these cloud services also seek to make machine learning much more affordable with highly automated "AutoML" approaches, as offered by GCP with Cloud AutoML and by Azure and AWS (SageMaker).
All analysts agree that these services will take off in 2020, with more and more SMEs and very small businesses realizing the impact that AI can have on their economic development.
Cybersecurity, A Subject Of High Tension
Until now, the large public clouds have often been content to collect every security certification on the planet for their infrastructure and to provide encryption services, while telling companies that the cybersecurity of their workloads and data is their own responsibility and not that of the cloud providers. But times are changing, and business demand has intensified. Cloud providers can no longer be passive and must actively secure the businesses and workloads entrusted to them.
Microsoft and Google have understood this, launching Azure Sentinel and Chronicle Backstory respectively this year and signing partnerships with players such as Qualys (also integrated into the AWS Security Hub). The portfolios of solutions and services for better securing the workloads and data entrusted to clouds began to expand in 2019. The trend will intensify in 2020, particularly with the desire to define and deploy standard security policies both internally and across all clouds.
Unified Communications Go Through The Cloud
In line with the Salesforce announcements at Dreamforce 2019 and the partnership agreements signed with AWS, we should expect to see companies (small and medium-sized businesses especially) finally turning to "Unified Communications as a Service" (UCaaS) solutions, which are for the moment vastly underused. But companies not only have growing unified communications needs; they also need platforms that offer APIs to integrate SMS, social networks, calls, and video at the heart of business applications (CPaaS, Communications Platform as a Service).
For many observers, 2020 should accelerate the adoption of UCaaS / CPaaS platforms by companies.
These trends confirm that the Cloud remains a very mobile and active space where many challenges for companies play out. While some observers anticipate a slowdown in the growth of the large clouds in 2020 (growth that still comfortably exceeds 20%), the impression left by the latest AWS re:Invent 2019, Microsoft Ignite 2019, and Google Next '19 is that the Cloud is still in its infancy and that we have only scratched the surface of the richness and diversity of the services that AWS, Azure, Google, Alibaba Cloud, and others will offer us in the future.