Microsoft's homegrown AI model can run on smartphones, laptops – Times of India


Google recently brought its AI model Gemini to the Pixel 8 Pro. The version that runs on the smartphone, Gemini Nano, is the lightest and most efficient model in the family, and now it appears to have a competitor: Microsoft.
Microsoft has released the latest version of its AI model, called Phi-2. The company said the model was trained only on high-quality data and is small enough to run locally on a laptop or mobile device.
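As a rough illustration of what running such a model locally might look like, the sketch below loads a Phi-2 checkpoint through the Hugging Face transformers library and generates a short completion. The article does not describe how Phi-2 is distributed, so the model identifier ("microsoft/phi-2"), the library choice, and the CPU-default settings are assumptions made for this example, not details from Microsoft's announcement.

# Minimal sketch of running a small language model such as Phi-2 locally.
# Assumptions (not taken from the article): the checkpoint is published on the
# Hugging Face Hub as "microsoft/phi-2", and the transformers and torch
# packages are installed; the defaults here run on CPU.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain in one sentence why smaller language models are cheaper to run."
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion; greedy decoding keeps the example deterministic.
output_ids = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))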
Phi-2 can ‘outperform’ Meta Llama 2
Microsoft also claimed that Phi-2 can outperform big AI models from companies such as Meta, and that it will be available through the Microsoft Azure cloud at a fraction of the cost. Phi-2 is a “small language model”, and Microsoft says it requires less computing power than models like OpenAI’s GPT-4 or Meta’s Llama 2. The AI model can carry out tasks such as generating text or describing images.
“With only 2.7 billion parameters, Phi-2 surpasses the performance of Mistral and Llama-2 models at 7B and 13B parameters on various aggregated benchmarks. Notably, it achieves better performance compared to 25x larger Llama-2-70B model on multi-step reasoning tasks, i.e., coding and math,” Microsoft said.
It went on to claim that Phi-2 matches or outperforms the recently announced Google Gemini Nano, despite being smaller in size.
Microsoft’s plan B
This may suggest that Microsoft is not relying entirely on OpenAI, the startup in which it has reportedly invested more than $10 billion. OpenAI’s GPT-4 model powers Copilot features in products such as Office 365. As per an earlier report by The Information, Microsoft wants small language models to help it cut back on costs.
The report noted that Peter Lee, who oversees Microsoft’s 1,500 researchers, directed several of them to develop conversational AI that may not perform as well as OpenAI’s models but is smaller and costs far less to operate.


