
Where cloud meets AI

By taking advantage of cloud-based AI, businesses can integrate AI tools and resources.
By Tiana Cline, Contributor
Johannesburg, 08 Aug 2024
Eugene de Souza, Red Hat

As large language models (LLMs) continue to grow in size and complexity, the demand for powerful hardware has skyrocketed. Reflecting this demand, revenue from AI processors is projected to grow from $4 billion in 2020 to $38 billion by 2026. AI is also driving datacentre investment: Omdia’s latest Cloud and Datacentre Market Snapshot expects AI to surpass telecoms as the top server workload by 2027. Generative AI, with its vast datasets and billions of parameters to train and fine-tune, requires substantial compute, but managing AI infrastructure is complex and resource-intensive.

Running it requires specialised IT skills in areas such as GPU optimisation, model deployment and scaling. This is where AI cloud comes in. Also known as AI-as-a-Service (AIaaS), AI cloud is infrastructure designed specifically for AI and ML workloads. It removes the complexity of accessing advanced hardware such as GPUs or Tensor Processing Units (TPUs), which are prohibitively expensive and logistically challenging to deploy on-premises.

Even if you’re using the best AI model, you need the right data to get the outputs that you need to have a quality response from AI.

Linda Saunders, Salesforce

“Cloud-based AI services often operate on a pay-as-you-go model, which can be more cost-effective than maintaining in-house AI infrastructure,” says Eugene de Souza, regional cloud business lead, Red Hat. While AI doesn’t necessarily need the cloud, the cloud offers a much lower barrier to entry for AI-infused applications and services. De Souza says one of the biggest benefits of cloud platforms is scalable computing resources. With AI cloud, he says, businesses can adjust their AI infrastructure based on demand, allowing them to handle large volumes of data and complex AI models more efficiently, and to take advantage of best-of-breed AI services from cloud providers.

The AI cloud market will be worth $647.60 billion by 2030, says Grand View Research. This growth has sparked intense competition among the three leading hyperscalers: Amazon Web Services (AWS), Google and Microsoft. Through its exclusive partnership with OpenAI, Microsoft has gained a competitive advantage in GenAI. Microsoft invested $13 billion in the for-profit part of OpenAI, which gives it a 49% stake. The European Commission is now preparing an antitrust investigation into whether the partnership is harming competition.

Azure OpenAI supports secure access to its services through virtual networks (VNets) and private links.

Understanding AI models

AWS is focusing on three key services for generative AI – Amazon Bedrock, SageMaker JumpStart and Titan – but also has an AI-powered coding companion called CodeWhisperer for developer productivity. Google currently has four foundation models that are accessible through its Vertex AI platform: Codey, Chirp, Imagen and PaLM.

Designed with cloud-native principles in mind, such as scalability, resiliency, and automation, AI cloud platforms are purpose-built to handle large volumes of data. But before a business can start using AI, it needs to get its data in order. Linda Saunders, Salesforce’s head of solution engineering for Africa, says that enterprises often have complex data structures – islands of data, or data trapped all over the business – that need to be properly prepared before they can be used. “Data is what AI uses to create these inferences. It’s what it responds to these prompts with,” she says. “Even if you’re using the best AI model, you need the right data to get the outputs that you need to have a quality response from AI.”

Linda Saunders, Salesforce

Because the AI model landscape is evolving so quickly, it can be difficult for a business to choose the most suitable model. Different business units, or use cases within an organisation, often require models with different capabilities. Katharine Janisch, head of BlueSky’s Salesforce practice, says that data drives generative models, as well as predictive and segmentation models. “Having the right data, in the correct format, is important in ensuring the quality and operability of any AI model and framework,” she says.

THE CONTACT CENTRE OF THE FUTURE

BlueSky has built a number of AI solutions for enterprise clients across multiple industries. Its “contact centre of the future”, implemented for Capitec Bank, used Amazon Connect to improve customer engagement through generative and predictive AI. The contact centre also uses conversational AI solutions from both Amazon Lex and Salesforce Service Cloud, says Janisch. “This chatbot managed routine inquiries, allowing human agents to focus on complex issues.” Integrated with AWS Lambda, it provides real-time, personalised responses. Amazon Connect reduced average handling time by 30%, while increasing customer satisfaction scores by 25%.

“Sticking with a single model risks it becoming outdated or less effective over time,” says Saunders, adding that enterprises want flexibility to choose models based on their own needs and use cases, rather than being locked into a single vendor’s model choices. “Some of our customers are building their own models, some more successfully than others,” she says. “An out-of-the box AI model is going to get you a good bit of the way, but we still want to put the control back into the hands of the organisation.”
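The flexibility Saunders describes is often achieved in practice by hiding each vendor's model API behind a thin, common interface, so a model can be swapped without rewriting application code. The sketch below is illustrative only: `TextModel`, `EchoModel` and `answer` are hypothetical names, and `EchoModel` stands in for a real provider client.

```python
from abc import ABC, abstractmethod


class TextModel(ABC):
    """Minimal provider-agnostic interface for text-generation models."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...


class EchoModel(TextModel):
    """Stand-in for a real vendor client (e.g. a hosted LLM API wrapper)."""

    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"


def answer(model: TextModel, question: str) -> str:
    # Application code depends only on the interface, so changing
    # vendors means swapping one adapter class, not rewriting callers.
    return model.generate(question)
```

With this shape, an out-of-the-box model, a fine-tuned one, or a customer's own model each becomes one more adapter class behind the same interface.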

One of the ways Salesforce is addressing the need for flexibility is by providing an open and extensible platform through its metadata framework, which allows models to be customised with enterprise data. “If we look at our customer base, you see a lot of examples of large enterprises, but there are also NGOs using our technology,” says Saunders. Cloud AI services can help democratise AI by lowering barriers to entry and enabling smaller businesses or individuals to access these powerful technologies. “Everyone gets access to the same solution and the pricing model only changes according to the scale of your business. That creates feature and function accessibility to businesses from all walks of life.”

While other companies are launching AI co-pilots, Saunders says one of Salesforce’s differentiators for Cloud AI is its single user interface, which, she says, makes AI easier to use and reduces cost. BlueSky’s Janisch says that one of the main challenges she’s seen when deploying AI within cloud environments comes from misunderstandings around how cloud is utilised, which is very different from an on-premises environment.


What about privacy?

“In the enterprise world, all that trust you’ve earned with your customers, and your customer data, can’t vanish because you haven’t thought about privacy, security and role-based access,” says Saunders. “The first rule is to always ensure that the infrastructure and cloud environments adhere to basic best practices and least privilege access control for users and service accounts,” says Janisch. “Service accounts should be utilised as much as possible to ensure that users don’t have direct access to sensitive information that’s being accessed and processed programmatically. Rotate these keys often.”

She says that clients should have a clear understanding of the AI products they’re using, especially around re-use of training data for the cloud provider, as well as data sovereignty and legislative constraints that might apply to the data being utilised. 
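Janisch's advice about service accounts and frequent key rotation can be enforced with a simple audit that flags keys older than a rotation window. This is a minimal sketch, assuming key metadata has already been fetched from the cloud provider; the `AccessKey` record and the 90-day window are illustrative assumptions, not any provider's API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class AccessKey:
    """Hypothetical record of a service-account key: id plus creation time."""
    key_id: str
    created: datetime


def keys_due_for_rotation(keys, max_age_days=90, now=None):
    """Return the ids of keys older than the rotation window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [k.key_id for k in keys if k.created < cutoff]
```

Run on a schedule, a check like this turns "rotate these keys often" from a policy statement into an alert whenever a key outlives the window.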

SIX STEPS TO ENSURING SUCCESSFUL CLOUD AI DEPLOYMENT

Deploying AI in the cloud offers numerous advantages, but comes with its own set of challenges. Here are key steps to ensure successful cloud AI deployments from Eugene de Souza, Red Hat’s regional cloud leader for SSA.

1 Implement economic controls

Without proper economic controls, cloud AI can become costly. Pay-as-you-go pricing is an advantage, giving quick access to AI-optimised hardware in the cloud instead of having to buy it yourself for specialised use cases. But if you don’t implement economic guardrails, policies and budgeting effectively in your organisation, the cloud can become less desirable.

2 Address skill gaps

Implementing AI in the cloud requires specialised skills in areas such as data science, machine learning and cloud computing, and this talent is hard to find. Many companies struggle to attract and retain people with the necessary expertise, leading to delays in deployment and adoption.

3 Avoid vendor lock-in

Companies may become overly dependent on a single cloud provider for their AI infrastructure. This vendor lock-in can limit flexibility and increase costs in the long run, as migrating AI workloads between cloud providers can be complex and expensive.

4 Ensure data management and compliance

Businesses need to ensure sound model management, traceability and data integrity; managing the data pipeline properly is important for the successful deployment of AI cloud. Compliance with regulatory requirements, such as GDPR, HIPAA and PCI DSS, adds another layer of complexity to deploying AI in cloud environments.

5 Plan for integration

Integration with existing processes also needs to be considered. Where possible, integrate AI solutions into existing processes; offerings like Red Hat’s AI and MLOps tooling can help businesses move AI-infused applications between cloud providers or integrate them more effectively.

6 Focus on security

Ensuring the privacy and security of your data is paramount. Effective data management, traceability, and compliance with relevant regulations and standards are essential to avoid legal and financial consequences.
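The economic guardrails in step 1 can start as something very simple: classify pay-as-you-go spend against a monthly budget and alert before the budget is blown. The sketch below is illustrative; the function name and the 80% alert threshold are assumptions, and real deployments would feed it figures from the provider's billing exports.

```python
def budget_status(spend_to_date, monthly_budget, alert_threshold=0.8):
    """Classify month-to-date cloud AI spend against a monthly budget.

    Returns 'ok', 'warn' (past the alert threshold), or 'over'.
    """
    if monthly_budget <= 0:
        raise ValueError("monthly_budget must be positive")
    ratio = spend_to_date / monthly_budget
    if ratio >= 1.0:
        return "over"
    if ratio >= alert_threshold:
        return "warn"
    return "ok"
```

A 'warn' result would typically trigger a notification to the budget owner, while 'over' might gate the provisioning of further GPU capacity until the spend is reviewed.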

* Article first published on brainstorm.itweb.co.za
