- The guidelines show AWS wants to address customer questions about OpenAI, Microsoft, and Google.
- AWS salespeople are instructed to question OpenAI's security and customer support in pitches.
- AWS highlights its AI infrastructure, enterprise security, and cost efficiency over rivals.
OpenAI lacks advanced security and customer support. It's just a research company, not an established cloud provider. The ChatGPT-maker is not focused enough on corporate customers.
These are just some of the talking points Amazon Web Services' salespeople are told to follow when dealing with customers using, or close to buying, OpenAI's products, according to internal sales guidelines obtained by Business Insider.
Other talking points from the documents include OpenAI's lack of access to third-party AI models and weak enterprise-level contracts. AWS salespeople should dispel the hype around AI chatbots like ChatGPT, and steer the conversation toward AWS's strength of running the cloud infrastructure behind popular AI services, the guidelines added.
"For generative AI workloads, AWS will compete most often w/ Microsoft's Azure OpenAI Service, OpenAI (directly), and Google Cloud's Generative AI on Vertex AI," one of the documents stated. "Move beyond the hype with AI chatbots, and focus on the [Foundation Models] that power them and the cloud infrastructure needed to help enterprise customers safely create, integrate, deploy, and manage their own generative AI applications using their own data."
The guideline documents date from late 2023 through spring 2024. They reflect Amazon's urgency to aggressively counter the growth of AI rivals, especially OpenAI. The viral success of ChatGPT put OpenAI at the forefront of the AI pack, even though Amazon has been working on the technology for years.
The effort to criticize OpenAI is also unusual for Amazon, which often says it's so customer-obsessed that it pays little attention to competitors.
This is the latest sign that Amazon knows it has work to do to catch up in the AI race. OpenAI, Microsoft, and Google have taken an early lead and could become the main platforms where developers build new AI products and tools.
Though Amazon created a new AGI team last year, the company's existing AI models are considered less powerful than those made by its biggest competitors. Instead, Amazon has prioritized selling AI tools like Bedrock, which gives customers access to third-party AI models. AWS also offers cloud access to in-house AI chips that compete with Nvidia GPUs, with mixed results so far.
AI growth
Amazon's spokesperson told BI that AWS is the "leader in cloud" with projected revenue of more than $100 billion this year. Much of the growth has come from its new AI services, which are on pace to generate "multi-billion dollars" in revenue this year, the spokesperson added. Since 2023, AWS has announced more than twice as many AI features as its next three closest competitors combined, the spokesperson noted.
"It's still early days for generative AI, and with so many companies offering varied services, we work to equip our sales teammates with the information they need to help customers understand why AWS is the best, easiest, most performant place to build generative AI applications. To parse the language as anything more than that or mischaracterize our leadership position is misguided speculation," the spokesperson wrote in an email.
OpenAI's spokesperson declined to comment.
'Important moment'
The documents appear to acknowledge that Amazon is playing catch-up to OpenAI. Many AWS customers got started on AI projects with OpenAI technology, like ChatGPT and its GPT models, because of the startup's "timing in the market, ease of use, and overall model intelligence capabilities," Amazon explained in one of the guidelines.
But now is a good time to go after those customers to convert them to AWS services, particularly Bedrock, a tool that has partnerships with AI model providers including Anthropic, Meta, and Cohere, the document said. It also claimed that Anthropic's Claude model, in particular, had surpassed OpenAI's GPT models in terms of "intelligence, accuracy, speed, and cost."
The customers most likely to migrate to AWS are the ones who are already "All In" on AWS for the majority of their cloud-computing needs, but "who chose to evaluate OpenAI for their first generative AI workloads," it added.
"This is an important moment for the field to take action on," one of the documents said. "Amazon, in partnership with various foundation model providers, has now created a stronger value proposition for customers that should not only inspire them to migrate their generative AI workloads onto AWS, but also, choose AWS for their next GenAI projects."
Switching to AWS
Some of those efforts are starting to pay off, according to Amazon's spokesperson. They cited four AWS customers — HUDstats, Arcanum AI, Forcura, and Experian — that initially used OpenAI's products but switched to AWS's AI services after facing "limitations with flexibility and scalability."
"In Q2 2024, AWS had its biggest quarter over quarter increase in revenue since Q2 2022, and much of this growth is being fueled by customer adoption of generative AI," Amazon's spokesperson said. "Ultimately, customers are choosing AWS because we continue to be the significant leader in operational excellence, security, reliability, and the overall breadth and depth of our services."
Microsoft and Google
It's not just OpenAI that AWS is going after. The sales guidelines also share how AWS sales reps should respond to customer questions about Microsoft and Google.
If a customer talks about Microsoft's and Google's AI infrastructure and chips, AWS salespeople should say Amazon has more than five years of experience investing in its own silicon processors, including its AI chips, Trainium and Inferentia, the documents advised.
The guidelines also highlight AWS's better cost and energy efficiency compared to competing products, and note the limited availability of Microsoft's Maia AI chip. One of the guidelines also points out Google's limitations in the number of foundation models offered.
"We're flattered they're worried about us, but fiction doesn't become fact just because it's in talking points," Google spokesperson Atle Erlingsson told BI. "Not only do we offer more than 150 first, third and open-source models via Vertex AI, our AI infrastructure offers best overall performance, best cost performance, as well as uptime and security."
Microsoft's spokesperson declined to comment.
"Cut through the hype"
For customers who say Microsoft and OpenAI are at the "cutting edge" of generative AI, AWS wants its salespeople to "cut through the hype" and ensure customers understand how AWS has solutions "across the entire stack" of generative AI technology, from the underlying infrastructure to the AI applications used by end customers, the documents said.
In situations where Microsoft pitches its AI-powered analytics software Fabric to customers, AWS salespeople are instructed to say, "Microsoft Fabric is a new (unproven) offering." The guidance adds that Fabric doesn't offer many integration points with Azure's generative AI services, and that AWS's own analytics services "offer superior functionality" across diverse workloads.
Microsoft previously said 67% of Fortune 500 companies use Fabric.
'Misleading FUD'
The documents also share AWS "value propositions" that should be emphasized during sales pitches. These include AWS's ease of use, including "enterprise-grade security and privacy," and the ability to customize AI models using the customer's own data. The documents also stress AWS's price efficiency and broad set of AI chips, as well as its own AI-powered applications, like Amazon Q.
Customers typically consider nine criteria before choosing an AI model and service provider, one of the documents said: customization, personalization, accuracy, security, monitoring, cost, ease of use, responsible AI, and innovation.
Despite the competitive tone of the guidelines, AWS also tells salespeople to use caution and clarity when discussing what data its rivals use for model training. OpenAI, for example, publicly said that it may use customer data to train the consumer version of ChatGPT, but not the business data shared through its enterprise product.
"The APIs and the Enterprise chatbots from Microsoft, Google, and OpenAI all declare product terms specifying that customer data is not used for model training," one of the documents said. "Be careful to not use misleading FUD (Fear, Uncertainty, Doubt) by conflating competitors' enterprise solutions with consumer services."
Do you work at Amazon? Got a tip?
Contact the reporter, Eugene Kim, via the encrypted-messaging apps Signal or Telegram (+1-650-942-3061) or email (ekim@businessinsider.com). Reach out using a nonwork device. Check out Business Insider's source guide for other tips on sharing information securely.