More than seven in 10 cloud environments use managed AI services, according to a new analysis, and the prevalence of OpenAI usage – whether directly or through the Azure SDK – shows that generative AI tooling is “rapidly becoming ubiquitous” in the cloud.
The study, from cloud security platform provider Wiz, analyzed more than 150,000 public cloud accounts and found that Microsoft “is in the lead”: 70% of the Azure environments analyzed contain Azure AI Services instances, equivalent to roughly two-fifths (39%) of all cloud environments. The report adds that Azure OpenAI usage grew by 228% over a four-month period in 2023.
Azure’s lead may not be surprising given that Amazon Bedrock, AWS’ fully managed offering, only became generally available in September – a fact the report acknowledges. Even so, Amazon SageMaker’s 38% deployment rate trails Azure AI Services only slightly. Because Amazon Bedrock launched during the study period, it was not included in the analysis, but Wiz’s preliminary activity data suggests at least 15% of organizations are already deploying it.
Perhaps unsurprisingly, the largest share of managed AI service users are still at what Wiz defines as the experimental stage – 32% of all users. It is a close call, however, with 28% classed as active users and 10% as power users.
How did Wiz reach these conclusions? While instances cannot be classified by workload – development, production or otherwise – the analysis is based on the number of instances of a given service in each cloud environment. Power users are defined as those with 50 or more instances in their environment. That threshold may seem low, but there are limiting factors: training and fine-tuning are prohibitively expensive, and some providers enforce strict quotas on the number of AI service instances that can be deployed per customer.
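For illustration only, the sketch below shows how such an instance-count classification could work. The 50-instance power-user threshold is the one cited in the report; the boundary between experimenting and active users is a hypothetical placeholder, as the report excerpt does not state it.

```python
# Hypothetical sketch: classify cloud environments by how many managed AI
# service instances they contain. POWER_THRESHOLD reflects the report's
# stated 50+ definition; ACTIVE_THRESHOLD is an assumed placeholder.
POWER_THRESHOLD = 50   # from the report: 50 or more instances = power user
ACTIVE_THRESHOLD = 10  # assumption for illustration only

def classify_environment(instance_count: int) -> str:
    """Return a usage tier based on the number of AI service instances."""
    if instance_count >= POWER_THRESHOLD:
        return "power"
    if instance_count >= ACTIVE_THRESHOLD:
        return "active"
    return "experimenting"

# Example: a handful of environments and their resulting tiers
environments = {"env-a": 3, "env-b": 17, "env-c": 64}
tiers = {name: classify_environment(count) for name, count in environments.items()}
print(tiers)  # {'env-a': 'experimenting', 'env-b': 'active', 'env-c': 'power'}
```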
More than half (53%) of the analyzed cloud environments use OpenAI or the Azure OpenAI SDK, which allows integration with OpenAI models from GPT to DALL-E. Self-hosted artificial intelligence and machine learning software is also “very common” in the cloud, the report states: Hugging Face Transformers appears in 45% of analyzed environments, LangChain in 32%, and the TensorFlow Hub library in 22%.
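As a rough sketch of what that SDK integration looks like, the example below calls a chat model through the AzureOpenAI client in the openai Python package (v1.x). The endpoint, API version, and deployment name are placeholder assumptions, not values from the report.

```python
# Minimal sketch of calling an OpenAI model via Azure OpenAI using the
# openai Python package (v1.x). Replace the placeholder endpoint, API
# version, and deployment name with your own resource's values.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed API version
)

response = client.chat.completions.create(
    model="gpt-35-turbo",  # your Azure deployment name, not the model family
    messages=[{"role": "user", "content": "Summarize our cloud AI usage."}],
)
print(response.choices[0].message.content)
```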
Looking ahead, Wiz notes that the cost of training and inference (recall the prohibitive pricing mentioned above) will be a top priority for customers over the next 12 months. There will be forks in the road, however, as organizations actually get to grips with the technology.
“In 2024, many companies are likely to decide which experimental paths are worth investing in and which AI-based products and capabilities they will pursue,” the report concludes. “With so many organizations experimenting with generative AI in parallel, we expect it to become clear exactly whether and how this technology can improve efficiency and enable never-before-seen capabilities.”
You can read the full State of Cloud AI report on the Wiz website (email required).
Photo by Rafael Garcin on Unsplash
Want to learn more about cybersecurity and the cloud from industry leaders? Check out the Cyber Security and Cloud Expo in Amsterdam, California, and London. Explore other upcoming enterprise technology events and webinars powered by TechForge here.