To Data & Beyond
Textbooks Are All You Need: Microsoft PHI-1.5 Model with 1.3 Billion Parameters


Introduction to Microsoft’s 1.3-billion-parameter model, which outperforms Llama 2’s 7-billion-parameter model

Youssef Hosni
Oct 08, 2023

The recent wave of small Transformer-based language models was initiated by TinyStories, a 10-million-parameter model that can produce coherent English, and followed by phi-1, a 1.3-billion-parameter model with Python coding performance close to the state of the art. Phi-1 proposed using existing Large Language Models (LLMs) to generate “textbook quality” data as a way to enhance the learning process compared to traditional web data.

Microsoft followed the “Textbooks Are All You Need” approach, focusing this time on common sense reasoning in natural language, and created a new 1.3 billion parameter model named phi-1.5, with performance on natural language tasks comparable to models 5x larger, and surpassing most non-frontier LLMs on more complex reasoning tasks such as grade-school mathematics and basic coding. 

More generally, phi-1.5 exhibits many of the traits of much larger LLMs, both good (such as the ability to “think step by step” or perform some rudimentary in-context learning) and bad, including hallucinations and the potential for toxic and biased generations. Encouragingly, the absence of web data in training appears to improve the model on that front. Microsoft released phi-1.5 as an open-source model to promote further research on these urgent topics.
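Since phi-1.5 is openly released on the Hugging Face Hub under the id `microsoft/phi-1_5`, a minimal quick-start might look like the sketch below. This assumes the `transformers` and `torch` packages are installed; exact generation settings are illustrative, not prescriptive.

```python
# Minimal sketch: load phi-1.5 from the Hugging Face Hub and generate text.
# Assumes `transformers` and `torch` are installed; "microsoft/phi-1_5" is
# the model id Microsoft published on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5")
model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", torch_dtype="auto")

# A simple grade-school-math style prompt to exercise step-by-step reasoning.
prompt = "Alice has 3 apples and buys 2 more. Step by step,"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=60)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model has only 1.3 billion parameters, it can run on a single consumer GPU or even CPU (slowly), which is part of what makes it attractive for research on hallucination and bias.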

Table of Contents:

  1. From Phi-1 to Phi-1.5

  2. Phi-1.5 Overcoming Bias

  3. Phi-1.5 Benchmark

  4. Getting Started with Phi-1.5

  5. Limitations of phi-1.5

