
Tuesday, 23 April 2024

Microsoft's lightweight Phi-3 Mini model can run on smartphones

Microsoft has unveiled Phi-3 Mini, a lightweight AI model designed to run on smartphones and other local devices, the company revealed in a new research paper. With 3.8 billion parameters, it's the first of three small Phi-3 language models the company plans to release in the near future. The aim is to provide a cheaper alternative to cloud-powered LLMs, allowing smaller organizations to adopt AI.

According to Microsoft, the new model handily outperforms its previous small model, Phi-2, and performs on par with larger models like Llama 2. In fact, the company says Phi-3 Mini provides responses close to the level of a model 10 times its size.

"The innovation lies entirely in our dataset for training," according to the research paper. That dataset is based on the Phi-2 model, but uses "heavily filtered web data and synthetic data," the team states. In fact, a separate LLM was used to do both of those chores, effectively creating new data that allows the smaller language model to be more efficient. The team was supposedly inspired by children's books that use simpler language to get across complex topics, according to The Verge


While it still can't match the results of cloud-powered LLMs, Phi-3 Mini outperforms Phi-2 and other small language models (Mistral, Gemma, Llama-3-In) in tasks ranging from math to programming to academic tests. At the same time, it runs on devices as simple as smartphones, with no internet connection required.

Its main limitation is breadth of "factual knowledge" due to the smaller dataset size, which is why it doesn't perform well on the "TriviaQA" test. Still, it should be a good fit for applications that only require smallish internal datasets. That could allow companies that can't afford cloud-connected LLMs to jump into AI, Microsoft hopes.

Phi-3 Mini is now available on Azure, Hugging Face and Ollama. Microsoft next plans to release Phi-3 Small and Phi-3 Medium, with significantly higher capabilities (7 billion and 14 billion parameters, respectively).
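For readers wanting to try the model locally, the instruct checkpoints published on Hugging Face and Ollama expect chat-formatted input. Here is a minimal sketch of building such a prompt by hand, assuming the <|user|>/<|assistant|>/<|end|> special tokens documented on the Phi-3-mini model card; `build_phi3_prompt` is a hypothetical helper written for illustration, not part of any library.

```python
def build_phi3_prompt(messages):
    """Render a list of {"role", "content"} dicts into a Phi-3-style
    chat prompt, ending with the assistant tag so the model knows
    where to begin generating."""
    parts = []
    for msg in messages:
        # Each turn is wrapped in a role tag and terminated with <|end|>.
        parts.append(f"<|{msg['role']}|>\n{msg['content']}<|end|>\n")
    parts.append("<|assistant|>\n")  # generation starts after this tag
    return "".join(parts)

prompt = build_phi3_prompt([{"role": "user", "content": "What is 2 + 2?"}])
print(prompt)
```

In practice you would pass the resulting string to whichever local runtime you use (for example, a llama.cpp or Ollama server); most Hugging Face tokenizers can also apply the model's chat template for you, making a hand-rolled formatter like this unnecessary outside of quick experiments.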

This article originally appeared on Engadget at https://ift.tt/g2RyNHb

