Cloudflare and Meta have joined forces to make Llama 2 available worldwide. Llama 2 now runs on Cloudflare’s serverless infrastructure, offering “privacy-first, local inference” to everyone. As a result, the more than one million developers building on Cloudflare today have access to a leading LLM to improve their apps.
Developers creating artificial intelligence (AI) applications on Workers, Cloudflare’s developer platform, now have access to the Llama 2 open source large language model (LLM).
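As a rough sketch of what calling Llama 2 through Cloudflare’s API could look like, the snippet below builds an inference request against Cloudflare’s REST API. The model identifier and endpoint path follow Cloudflare’s public API conventions but are assumptions here, not taken from this article; consult the Workers AI documentation for the current values.

```python
# Sketch: calling Llama 2 via Cloudflare's REST API (endpoint shape
# and model name are assumptions -- verify against the Workers AI docs).
import json
from urllib import request

API_BASE = "https://api.cloudflare.com/client/v4"
MODEL = "@cf/meta/llama-2-7b-chat-int8"  # assumed model identifier


def build_request(account_id: str, api_token: str, prompt: str) -> request.Request:
    """Construct (but do not send) an inference request for the model."""
    url = f"{API_BASE}/accounts/{account_id}/ai/run/{MODEL}"
    body = json.dumps({"prompt": prompt}).encode()
    return request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Actually sending the request requires a real account ID and API token:
# req = build_request("your-account-id", "your-api-token", "What is an LLM?")
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

Keeping request construction separate from sending makes the sketch easy to adapt: the same payload shape works whether the call is made from a script, a backend service, or a Worker itself.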
Until very recently, calling a proprietary model was the only way to access an LLM. Training an LLM demands significant resources (time, compute, and money), which, Cloudflare notes, puts it beyond the reach of most developers. Meta’s release of Llama 2 changed that: the model is publicly available, so developers can deploy and run an LLM of their own. Doing so, however, still requires access to the infrastructure needed to run the model, along with the burden of managing it.
Privacy-first inference at the edge
The hyper-distributed edge network that Cloudflare is building will let developers anywhere construct applications on Llama 2. With Cloudflare’s Data Localization Suite, they can also choose where their data is processed. By guaranteeing that data used for inference is not used to train or enhance the LLM, Cloudflare’s privacy-first approach to application development can help businesses earn their customers’ trust.
“Giving every developer access to Llama 2, one of the most robust LLMs to date, with no configuration required, is going to propel generative AI forward in ways that we’ve only imagined so far,” said Matthew Prince, CEO and co-founder of Cloudflare. “Meta has been a great contributor to the open source community, and by bringing that same approach to AI, we are going to help ensure that powerful AI is accessible to all developers, and their communities, around the world, with data localization built in.”