Amazon launches AIGC suite, intensifying competition in the AI field.

Technology | Author: Notrice | May 04, 2023, 12:02 AM (GMT+8)

EqualOcean has learned that Amazon launched its AIGC suite on April 13, comprising a generative AI service called Bedrock and its own large language model, Titan.


Unlike the consumer-facing AI services from companies such as OpenAI, Google, and Microsoft, Amazon's AIGC suite is designed as a one-stop solution: an AI "pedestal" service that provides the infrastructure on which upper-layer application companies can build, helping developers and enterprises quickly build, deploy, and manage a variety of AI applications.

AIGC relies on machine learning models that are pre-trained on vast amounts of data, commonly referred to as foundation (or base) models. Because of this large-scale pre-training, foundation models are highly adaptable and can perform a wide range of downstream tasks.

The Bedrock service from AWS (Amazon's cloud computing IaaS and PaaS platform) is undoubtedly the most notable module in the AIGC suite. Bedrock's most important feature is that it lets developers easily customize models and build their own generative AI applications: users get API access to Amazon's own Titan models, including two new large language models, and can also call a range of third-party models from providers such as AI21 Labs, Anthropic, and Stability AI.
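To make the API-access idea concrete, the sketch below shows what invoking a hosted text model through a Bedrock-style runtime client could look like with boto3. The client name, model ID, and request schema here are assumptions for illustration only; the article describes the service at preview stage, and actual identifiers and payload formats may differ.

```python
# Minimal sketch of calling a hosted text-generation model through a Bedrock-style
# runtime API with boto3. Assumes AWS credentials are configured and the account
# has access to the (illustrative) model ID below.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Illustrative Titan model ID and request schema; real values may differ.
response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Summarize the key features of a generative AI service.",
        "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
    }),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```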

Bedrock also supports enterprise customization of the base models, allowing them to be customized and fine-tuned with only a small amount of data. Similar enterprise-grade services have been launched in China by Baidu's ERNIE Bot (百度文心一言) and Alibaba Cloud's Tongyi Qianwen (阿里云通义千问). Bedrock is also AWS's biggest foray into the generative AI market, which, according to estimates from Grand View Research, could be worth close to USD 110 billion by 2030.

In addition, AWS has launched two compute instances optimized for generative AI: EC2 Trn1n and EC2 Inf2. The training instance EC2 Trn1 is powered by Amazon's homegrown Trainium chip and can cut training costs by up to 50 percent, while EC2 Trn1n goes a step further with a 20 percent performance improvement. The inference instance Inf2 is based on the in-house Inferentia2 chip and offers up to a 4x increase in throughput and a 10x reduction in latency. AWS is the first of the cloud giants to launch instances dedicated to generative AI.
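For readers curious how such accelerated capacity is actually requested, the following is a minimal sketch of launching a Trainium-backed instance with boto3. The AMI ID is a placeholder, and the trn1.32xlarge size is used purely for illustration; this is not an official walkthrough.

```python
# Minimal sketch of requesting a Trainium-backed training instance via boto3.
# The AMI ID is a placeholder (in practice, a Deep Learning AMI with the Neuron SDK);
# inf2.* instance sizes would be the analogous choice for inference workloads.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID, for illustration only
    InstanceType="trn1.32xlarge",     # Trainium-based training instance size
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])
```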

The AI programming companion, CodeWhisperer, assists programmers as they write code and reportedly speeds up their tasks by as much as 57 percent. The preview version of CodeWhisperer is available free to all users. The product is similar to the GPT-4-based Copilot X on Microsoft's GitHub, but AWS says it is the first of its kind with a built-in security scan that can find hard-to-detect vulnerabilities and suggest remediation.

Swami Sivasubramanian, global vice president of database, data analytics and machine learning at AWS, revealed that Amazon's self-developed foundation model Titan will be one of the base models offered through Bedrock. The model is currently optimized for two scenarios: the first, similar to ChatGPT, provides text summarization, chat, and information extraction; the second, called Embeddings, is a large language model that converts text into numerical representations (embeddings), which can be used for a variety of purposes such as personalized recommendations and search.
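To illustrate the embeddings scenario, the sketch below assumes embedding vectors have already been obtained from such a model and ranks documents against a query by cosine similarity; the vectors themselves are invented for illustration.

```python
# Minimal sketch of embedding-based search: rank documents by cosine similarity
# between their embedding vectors and a query's embedding vector.
# The vectors are made up for illustration; in practice they would come from
# an embeddings model like the one described above.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend these 4-dimensional vectors came from an embeddings model.
doc_embeddings = {
    "return policy": np.array([0.9, 0.1, 0.0, 0.2]),
    "shipping times": np.array([0.1, 0.8, 0.3, 0.0]),
    "gift wrapping": np.array([0.0, 0.2, 0.9, 0.1]),
}
query_embedding = np.array([0.85, 0.15, 0.05, 0.1])  # e.g. "how do I return an item?"

# Rank documents from most to least similar to the query.
ranked = sorted(
    doc_embeddings.items(),
    key=lambda kv: cosine_similarity(query_embedding, kv[1]),
    reverse=True,
)
for title, _ in ranked:
    print(title)
```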

With the evolution of related technologies, demand for AIGC content is growing across industries, from marketing and customer service to news and entertainment. This is a major opportunity for providers of infrastructure services.

One of the key industry pain points the Amazon AIGC suite can address is the cost of training AI for enterprises. OpenAI, the current AI leader, reportedly spent billions of dollars training ChatGPT and used more than 30,000 Nvidia A100 GPUs to build its supercomputers. Data availability has also become a major problem in the training process: according to the New York Times, the web forum giant Reddit has begun planning to charge companies that use its platform data to train models, including Microsoft, Google, and OpenAI. As a foundation-model cloud service, Bedrock can address these pain points and help more enterprises customize AI models that meet their business needs at a lower cost.

Amazon's AWS once held a first-mover advantage in the AIGC space, but that lead is gradually being eroded by the advances of Microsoft and Google. Especially since Microsoft set off the AIGC race, Amazon has not stood by indifferently. Currently, both Microsoft and Google let developers and users access their language models through open APIs. Amazon, for its part, has brought two new AI language models to the Bedrock platform, developed by the AI research startups Anthropic and AI21 Labs, respectively. These models were developed and trained to compete directly with the language models of OpenAI and Google.

On the news, Amazon shares closed the day up 4.67% at USD 102.4 per share, lifting its market capitalization to USD 1.05 trillion.