Microsoft Research has done it again. After outperforming Meta’s LLaMA with phi-1 in July, the researchers have now introduced phi-1.5, a cutting-edge language model with 1.3 billion parameters that outperforms Llama 2’s 7-billion-parameter model on several benchmarks. Microsoft has decided to open-source the model. The phi-1.5 model, comprising a staggering 1.3 […]