Abu Dhabi-based Technology Innovation Institute Trains its Open Source Falcon 40B Model on AWS – Express Computer

Amazon Web Services announced that Technology Innovation Institute (TII), a leading global scientific research center in Abu Dhabi, trained its top-performing, open-source Falcon 40B model on AWS. Falcon 40B is a 40-billion-parameter large language model (LLM) available under the Apache 2.0 license that ranked #1 on Hugging Face’s Open LLM Leaderboard, which tracks, ranks, and evaluates LLMs across multiple benchmarks to identify top-performing models. Customers can now deploy Falcon 40B from Amazon SageMaker JumpStart, a machine learning (ML) hub that offers pre-trained models, giving them access to Falcon 40B’s state-of-the-art accuracy and industry-leading performance without having to build their own model from scratch.
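For readers who want to try this, a minimal sketch of a JumpStart-based deployment using the SageMaker Python SDK is shown below; the model identifier and instance type are illustrative assumptions and should be verified against the current JumpStart catalog.

```python
# Illustrative sketch only: deploying a Falcon 40B JumpStart model with the
# SageMaker Python SDK. The model_id and instance_type below are assumptions
# and should be checked against the current JumpStart catalog before use.
from sagemaker.jumpstart.model import JumpStartModel

# Assumed JumpStart identifier for the Falcon 40B model
model = JumpStartModel(model_id="huggingface-llm-falcon-40b-bf16")

# Deploy to a real-time endpoint; a 40B-parameter model typically needs
# a multi-GPU instance (instance size assumed here)
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",
)
```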

LLMs are a subset of machine learning (ML) models focused on language that can power a broad range of generative AI applications, from text processing and summarization to question answering. While these LLMs have the potential to transform industries, the process of building, training, and deploying an LLM can take weeks, if not months, and cost tens of millions of dollars, putting them out of reach of many companies. To drive better performance and cost efficiencies throughout the development process, numerous customers, including Stability AI, AI21 Labs, Hugging Face, and LG AI Research, rely on Amazon SageMaker, AWS’s end-to-end ML service, to build, train, and deploy their LLMs.

That is why TII turned to Amazon SageMaker to build its Falcon 40B model. Because SageMaker is a fully managed service, TII could focus on developing custom training mechanisms and optimizations instead of managing its ML infrastructure. To minimize training costs and reduce time to market, TII pursued several optimizations, including writing a custom matrix multiplication kernel to accelerate training. Throughout the training process, AWS also worked closely with TII to enhance resiliency on SageMaker, ensuring that training ran smoothly and reducing interruptions that required developer attention.

TII released its Falcon 40B model in May 2023 under the Apache 2.0 license. Since its release, Falcon 40B has outperformed similar contemporary models across various benchmarks, demonstrating exceptional performance without specialized fine-tuning. To make it easier for customers to access this state-of-the-art model, AWS has also made Falcon 40B available to customers via Amazon SageMaker JumpStart. Now, customers of every size and across every industry can quickly and easily deploy their own Falcon 40B model and customize it to fit their specific needs for applications such as translation, question answering, summarizing information, or identifying images.

“Falcon-40B’s open source release empowers organizations to harness its exceptional capabilities and drive advancements in AI-driven solutions. It represents a significant milestone in our commitment to fostering AI innovation and exemplifies the profound scientific contributions of the UAE,” said the executive director of the AI-Cross Center Unit and Project Lead for LLM Projects at TII. “By making Falcon LLM open-source, we aim to enable widespread access to its advanced tech capabilities and empower researchers and organizations worldwide. Next steps include contributing to further advancements in the field of AI and advanced technologies, with new models on the horizon, and promoting the utilization of advanced AI tech within UAE organizations and businesses.”

To help customers get started quickly with ML, the Falcon models can be deployed and used easily in SageMaker Studio or programmatically through the SageMaker Python SDK. Falcon 40B is generally available today through Amazon SageMaker JumpStart in US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Sydney), Asia Pacific (Seoul), Europe (London), and Canada (Central), with availability in additional AWS Regions coming soon.
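As a rough illustration of the programmatic path, the sketch below sends a prompt to an already deployed Falcon 40B endpoint through the SageMaker Python SDK; the endpoint name is hypothetical, and the request payload follows the common Hugging Face text-generation format, which should be confirmed for the specific model version in use.

```python
# Illustrative sketch only: invoking an already deployed Falcon 40B endpoint.
# The endpoint name is hypothetical, and the payload format (inputs/parameters)
# is the common Hugging Face text-generation convention, which should be
# verified for the specific JumpStart model version in use.
from sagemaker.predictor import Predictor
from sagemaker.serializers import JSONSerializer
from sagemaker.deserializers import JSONDeserializer

predictor = Predictor(
    endpoint_name="falcon-40b-endpoint",  # hypothetical endpoint name
    serializer=JSONSerializer(),
    deserializer=JSONDeserializer(),
)

# Send a prompt and print the generated text returned by the endpoint
response = predictor.predict({
    "inputs": "Summarize the benefits of managed machine learning services.",
    "parameters": {"max_new_tokens": 200, "temperature": 0.7},
})
print(response)
```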