
Microsoft Research has done it once again. After outperforming Meta’s LLaMA with phi-1 in July, the researchers have now introduced phi-1.5, a cutting-edge 1.3-billion-parameter language model that outperforms Llama 2’s 7-billion-parameter model on several benchmarks. Microsoft has decided to open-source the model. The phi-1.5 model, comprising a staggering 1.3 […]

The post Microsoft’s 1.3 Billion Model Outperforms Llama 2 appeared first on Startup Reporter.


