The Rise of Generative AI: How Asia is Shaping the Future of Large Language Models

The rise of Generative AI in Asia, Meta AI rolls out in India, and pitfalls to avoid in processing data with LLMs.

The recent report from UK research firm Omdia reveals a stark contrast in AI development strategies between American and Asian companies. While US firms heavily invest in Artificial General Intelligence (AGI), Asian nations are prioritizing localization and the preservation of multilingual cultures, with cloud service providers and telecom companies playing a pivotal role.

“The global surge in interest sparked by OpenAI’s ChatGPT has prompted countries in Asia and Oceania to accelerate their AI initiatives.” - Omdia report

Asia’s Generative AI Strategy

The report, titled “Asia and Oceania: Local and Regional Responses to Generative AI,” examines the response of Asian and Oceanic countries to Generative AI. It highlights the region’s linguistic and cultural diversity, encompassing numerous developing nations. Omdia forecasts that from 2024 to 2028, Generative AI software revenue in this region will soar to US$18.3 billion, with a compound annual growth rate (CAGR) of 53%, far outpacing the 13% CAGR for predictive AI.
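The report gives only the 2028 endpoint and the growth rate, but the two figures together imply a starting point. As a rough sketch (assuming the 53% CAGR compounds over the four annual steps from 2024 to 2028; the report does not state the base figure):

```python
def cagr_projection(start_value: float, cagr: float, years: int) -> float:
    """Compound a starting value at a constant annual growth rate."""
    return start_value * (1 + cagr) ** years

# Working backward from the report's 2028 figure (US$18.3B at a 53% CAGR
# over four annual steps), the implied 2024 base is roughly US$3.3B.
base_2024 = 18.3 / (1 + 0.53) ** 4
print(f"Implied 2024 base: ~US${base_2024:.1f}B")
```

In other words, the forecast amounts to Generative AI software revenue in the region growing more than fivefold over the period.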

Despite this, predictive AI will continue to coexist with Generative AI, catering to different application scenarios. The global surge in interest sparked by OpenAI’s ChatGPT has prompted countries in Asia and Oceania to accelerate their AI initiatives. They are determined to prevent a handful of tech giants from monopolizing the development of large language models (LLMs) while ensuring stability and security in their AI advancements.

Localization and Multilingual Preservation

Most LLM training data is derived from English or other dominant languages. Likewise, LLM safety mechanisms are often rooted in Western cultural values, which may not suit diverse regional contexts. In response, many organizations are shifting toward localized data, infrastructure, and models for developing and deploying Generative AI.

China, South Korea, and Japan are notably proactive in promoting AI localization, ensuring their AI solutions reflect local identities and cultural needs. Prominent companies like Alibaba, Baidu, Huawei, and Tencent, along with SK Telecom (SKT) and Korea Telecom (KT), are creating their own AI chips, infrastructure, and related solutions.

“Nvidia co-founder and CEO Jensen Huang advocates for each country to develop sovereign AI, controlling its own data and models.” - Jensen Huang

Meta AI Rolls Out in India

Tech giant Meta has announced the availability of its artificial intelligence (AI) assistant in India on WhatsApp, Facebook, Messenger, Instagram, and meta.ai – built with the latest ‘Llama 3’ large language model (LLM). Millions of users in the country can use Meta AI in feeds, chats, and more across the apps to get things done, create content, and deep-dive on topics, without having to leave the app they are using.

“Ask Meta AI to give you ideas of places to stop on a road trip. Cramming for a test? Ask Meta AI on the web to create you a multiple-choice test.” - Meta


Pitfalls to Avoid in Processing Data with LLMs

As organizations in every industry rush to enrich their own private data sets with LLMs, the quest for more and better data is unfolding at a scale never seen before, stretching the limits of present-day infrastructure in new and disruptive ways. Yet the sheer scale of the data sets ingested by LLMs raises an important question: Is more data really better if you don’t have the infrastructure to handle it?

Training LLMs on internal data poses many challenges for data and development teams. It requires considerable compute budgets, access to powerful graphics processing units (GPUs), complex distributed-compute techniques, and teams with deep machine learning (ML) expertise.
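A back-of-the-envelope estimate makes the compute-budget point concrete. Under a common rule of thumb (not from the article): mixed-precision training with the Adam optimizer needs roughly 16 bytes of GPU memory per parameter for model state alone (half-precision weights and gradients plus full-precision master weights and two optimizer moments), before counting activations or batch data.

```python
def training_memory_gb(num_params: float, bytes_per_param: int = 16) -> float:
    """Rough GPU memory needed for model state when training with Adam
    in mixed precision (~16 bytes/parameter: fp16 weights and gradients
    plus fp32 master weights and optimizer moments). Activations and
    batch data come on top of this estimate."""
    return num_params * bytes_per_param / 1e9

# A 7B-parameter model needs on the order of 112 GB of model state --
# more than any single commodity GPU holds, which is why sharding the
# state across many devices becomes unavoidable.
print(f"7B params: ~{training_memory_gb(7e9):.0f} GB of model state")
```

Even a mid-sized model therefore forces teams into multi-GPU sharding and the distributed-compute complexity described above.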

“Don’t saddle your data scientists with infrastructure worries. Let them use these astounding innovations to test hypotheses and gain insights that can help you train and optimize your data models and drive value that can help differentiate your organization in the market and lead to the creation of new products.” - Erik Landerholm


To navigate this golden age of opportunity effectively, choose a platform that helps you focus on your differentiators while automating the foundational elements of building your AI stack. Look for a solution that gives you choice and flexibility in GPU usage and where you run your stack. Lastly, find an option that offers ephemeral environments that allow you to optimize costs by paying only for the resources you use.