Apple Releases 8 Small AI Language Models To Compete With Microsoft’s Phi-3

Seeing strength in numbers, Apple has made a strategic move in the competitive artificial intelligence marketplace by releasing eight small AI language models. Collectively called OpenELM, the compact models are designed to run on-device and offline, making them well suited to smartphones.

Published on the open-source AI community Hugging Face, the models come in pre-trained and instruction-tuned versions. The pre-trained models provide a base atop which users can fine-tune and develop. The instruction-tuned models are already trained to follow instructions, making them more suitable for conversations and interactions with end users.

While Apple hasn't suggested specific use cases for these models, they could be applied to run assistants that parse emails and texts or provide intelligent suggestions based on that data. The models were trained on publicly available datasets, and Apple is sharing both the code for CoreNet (the library used to train OpenELM) and the "recipes" for its models. In other words, users can inspect exactly how Apple built them.

The Apple release comes shortly after Microsoft debuted Phi-3, its own family of small language models built to run locally. Being open source and lightweight, Phi-3 Mini could potentially replace traditional assistants like Apple's Siri or Google's Gemini for some tasks, and Microsoft has already tested Phi-3 on an iPhone and reported satisfactory results and fast token generation.

While Apple has not yet integrated these new AI language model capabilities into its consumer devices, the upcoming iOS 18 update is expected to include new AI features that rely on on-device processing.

Apple hardware has an advantage in local AI use: Apple Silicon chips use unified memory, so system RAM and GPU memory draw from the same pool. This means a Mac with 32 GB of RAM (a common configuration for a PC) can devote that memory to running AI models much as a GPU would use its VRAM. By comparison, a PC with a discrete GPU is limited to the GPU's separate, and typically much smaller, VRAM.

However, Apple lags behind Windows and Linux in AI development. Most AI tooling is built around hardware designed by Nvidia, which Apple phased out in favor of its own chips. As a result, there is relatively little Apple-native AI development, and using AI on Apple products often requires translation layers or other workarounds.
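For readers who want to experiment, the sketch below shows one plausible way to load an OpenELM checkpoint from Hugging Face with the transformers library and run it on a Mac's unified memory through PyTorch's MPS backend. It is not an official Apple workflow: the model ID, tokenizer choice, and prompt are illustrative assumptions based on the public release.

```python
# A minimal sketch, not an official Apple example: load one of the eight
# OpenELM checkpoints and run it locally, using Apple Silicon's unified
# memory via PyTorch's MPS backend when available (CPU otherwise).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"  # smallest instruction-tuned variant (assumed ID)

# OpenELM does not ship its own tokenizer; the Llama 2 tokenizer is assumed here.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# On a Mac, the weights sit in the same unified memory pool the GPU uses.
device = "mps" if torch.backends.mps.is_available() else "cpu"
model = model.to(device)

prompt = "Suggest a short reply to this email: 'Are we still on for lunch Friday?'"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On a Mac with enough unified memory, the same script should handle the larger OpenELM variants by swapping in a different model ID, which is exactly the RAM-as-VRAM advantage described above.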