How public clouds drive innovation in AI

Companies that do not have the resources to develop their own machine learning models are turning to the major cloud providers. […]

The three major cloud providers, Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), want developers and data scientists to build, test, and deploy machine learning models in their clouds. This is a lucrative business for them, because model experimentation often demands substantial infrastructure, and models in production often require high availability.

These services are lucrative for cloud providers and advantageous for their customers, but the providers do not want to compete for that business on infrastructure, service levels, and price alone. They are focusing on versatile on-ramps that make it easier for customers to adopt their machine learning capabilities.

Each public cloud offers multiple options for data storage, including serverless databases, data warehouses, data lakes, and NoSQL data stores, so you can develop your models close to where your data resides. They support popular machine learning frameworks, including TensorFlow and PyTorch, so their clouds can serve as a one-stop shop for data science teams that want flexibility. All three offer modelops and MLOps capabilities and a growing number of features to support the entire machine learning lifecycle.

A recent study shows that 78% of artificial intelligence (AI) and machine learning (ML) projects run in companies with a hybrid cloud infrastructure, so the public clouds still have plenty of room to grow. To capture that business, they must continue to innovate with new and differentiating capabilities.

That innovation falls into several categories, all designed to help companies perform machine learning at scale with more services and more user-friendly platforms. Here are some details.

Battle of the AI Chips

Machine learning experiments are growing ever larger and more complex, requiring training on enormous amounts of data. Microsoft and Nvidia recently announced a natural language model with 530 billion parameters, while Google says it trained a model with 1.6 trillion parameters earlier this year.

Training models of this size and complexity can take a very long time and become expensive, which is why the public clouds are innovating with AI chips and infrastructure options. AWS already has Inferentia and Trainium, and recently announced new EC2 instances powered by Habana Gaudi accelerators that offer 40% better price-performance compared with the latest GPU-based EC2 instances.

Meanwhile, Google announced TPU v4 in early 2021. Its fourth-generation Tensor Processing Unit shows an average performance improvement of 2.7x over TPU v3. Expect more hardware innovation with AI chips and accelerators from Cerebras, Graphcore, Nvidia, and SambaNova.

Chips are not the only AI-enabling infrastructure: all three public clouds offer edge computing platforms that help deploy machine learning models for the Internet of Things and other streaming applications.

Battle of AI Services

Most data science teams will not develop AI at that scale, but they do want to build and configure advanced machine learning models. All three cloud providers are expanding their machine learning services, and I expect that growth to accelerate over the next few years.

Below is a brief overview of the machine learning services offered on Azure, GCP and AWS:

  • Microsoft’s Cognitive Services include speech services, language services for sentiment analysis, and question-answering services that are commonly used in chatbots. Vision services include face recognition, and decision services support personalization and anomaly detection.
  • Microsoft recently announced the Azure OpenAI Service, which provides access to the GPT-3 natural language model and supports search, conversation, text completion, and other scenarios.
  • Google Cloud offers several AI services for document processing, including DocAI for general document processing and vertical solutions for lending, procurement, contact centers and contract management.
  • AWS machine learning services include Rekognition for computer vision, Textract for document processing, Lex for chatbots, CodeGuru for code reviews, and Personalize for web application personalization.
  • AWS also offers industry-specific AI solutions such as Amazon HealthLake for healthcare data predictions, Amazon Lookout for Equipment to detect abnormal equipment behavior, and Amazon Fraud Detector for financial services and other industries.
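To illustrate how services like these are typically consumed, here is a minimal sketch that filters labels from a hypothetical Amazon Rekognition detect-labels JSON response. The field names ("Labels", "Name", "Confidence") follow the documented response shape, but the values and the 90% confidence threshold are illustrative assumptions; a real call would go through an SDK such as boto3 rather than a hard-coded response.

```python
import json

# Hypothetical detect-labels response; field names follow the documented
# Rekognition shape, but the label values here are made up for illustration.
sample_response = json.loads("""
{
  "Labels": [
    {"Name": "Dog",    "Confidence": 98.1},
    {"Name": "Pet",    "Confidence": 97.4},
    {"Name": "Canine", "Confidence": 62.0}
  ]
}
""")

def confident_labels(response: dict, threshold: float = 90.0) -> list:
    """Return the label names whose confidence meets the given threshold."""
    return [label["Name"]
            for label in response.get("Labels", [])
            if label["Confidence"] >= threshold]

print(confident_labels(sample_response))  # prints ['Dog', 'Pet']
```

Post-processing like this (thresholding, mapping labels to business categories) is usually where application code meets the managed service: the provider handles model training and inference, and the consumer only handles request payloads and responses.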

Will we see more machine learning as a service (MLaaS) models from public clouds and other competitors? Dr. Kirk Borne, chief science officer at DataPrime, believes so. “We will see more MLaaS or model-as-a-service offerings as these models become more sophisticated and the cost of their training is correspondingly high. Fewer and fewer companies will want to invest the time and talent to create their own instances of these pre-trained models.”

Borne continues, “A large number of small and medium-sized enterprises that are engaged in ML and AI will find these XaaS offerings perfectly suited to their time, budget, and strategic needs. MLaaS also helps close the ubiquitous talent gap by taking advantage of pre-trained models as a service that use sophisticated and powerful algorithms.”

Fight for the accessibility of AI

The next challenge for public clouds is to make their machine learning and AI capabilities accessible to companies that do not have advanced software development and data science teams. This is done through low-code technologies that either have built-in machine learning capabilities or help developers create interfaces to other AI services.

AWS SageMaker Studio, an IDE for developing, testing, and deploying machine learning models, offers several advanced features, including Data Wrangler to help data scientists prepare data, a Feature Store to encourage collaboration and reuse across data science teams, and devops capabilities such as one-click deployment. SageMaker competes with data science platforms such as Alteryx, Dataiku, KNIME, and SAS.

Microsoft offers Azure Machine Learning Studio, a portal that combines no-code and code-first experiences for data scientists. Its low-code AI offering for citizen developers is AI Builder for the Power Apps platform, which lets low-code developers perform text classification, object detection, and form processing.

Google takes a similar approach with AutoML for training models, and its AppSheet low-code platform has built-in intelligence that includes trend forecasting, content classification, sentiment analysis, and other features. Here the public clouds compete with other low-code platforms that offer machine learning capabilities, including Creatio, OutSystems, Thinkwise, Vantiq, and others.

It will be interesting to see how the public clouds, start-ups, enterprise software vendors, chip manufacturers, infrastructure providers and open source platforms compete in artificial intelligence and machine learning innovations to support larger models, more services and easier on-ramps for application integration.

*Isaac Sacolick is President of StarCIO and the author of the Amazon bestseller Driving Digital: The Leader’s Guide to Business Transformation through Technology, which covers many practices, such as agile planning, devops, and data science, that are critical to successful digital transformation programs. Sacolick is a recognized top social CIO and influencer in the field of digital transformation. He has published more than 700 articles on his blog, Social, Agile, and Transformation, and other websites.
