🌱 Beginner — AI Fundamentals

Chapter 20 of 24

🔢 Chapter 20: Open Source & Parameters

Ollama, and what billions of parameters mean

Open-source models (Llama, Mistral, etc.) can be run locally with tools like Ollama. "7B" or "70B" means 7 or 70 billion parameters, the learned weights inside the model. More parameters usually mean better quality, but they also demand more RAM and GPU memory.

What does "7B parameters" mean?

Parameters = the learned numbers (weights) inside the model. 7B = 7 billion. More parameters usually = smarter but heavier (more RAM, slower).

Tiny (e.g. 1B)
Small (7B)
Medium (13B)
Large (70B)
XL (175B+)
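To get a feel for why bigger models need more memory, you can do the math yourself: each parameter is a number stored in memory. A minimal sketch, assuming 2 bytes per parameter for fp16 weights and 0.5 bytes per parameter for 4-bit quantized weights (the memory for weights only, ignoring activation and context overhead):

```python
# Rough weight-memory estimate for a model of a given parameter count.
# Assumes fp16 (2 bytes/param) by default; pass 0.5 for 4-bit quantization.
def weight_memory_gb(params_billions, bytes_per_param=2.0):
    # params_billions * 1e9 params * bytes each, converted to GB (1e9 bytes)
    return params_billions * 1e9 * bytes_per_param / 1e9

for size in [1, 7, 13, 70, 175]:
    fp16 = weight_memory_gb(size)
    q4 = weight_memory_gb(size, bytes_per_param=0.5)
    print(f"{size}B params: ~{fp16:.0f} GB fp16, ~{q4:.1f} GB 4-bit quantized")
# e.g. 7B: ~14 GB fp16, ~3.5 GB 4-bit
```

This is why a 7B model fits on a laptop once quantized (a few GB), while a 70B model wants tens of GB of RAM or VRAM even with aggressive quantization.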

Ollama runs open-source models locally (Llama, Mistral, etc.). You pick model size by parameter count: 7B fits on a laptop; 70B needs more GPU/RAM.

Ollama in one line

Download and run open models on your machine; pick a size (e.g. 7B for laptops, 70B for powerful machines) and chat or use an API.
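Besides the interactive chat (`ollama run llama3` after `ollama pull llama3`), Ollama exposes a local HTTP API on port 11434 by default. A minimal sketch of calling its documented `/api/generate` endpoint from Python using only the standard library; the model name `llama3` is an example, and the request only succeeds if an Ollama server is running locally with that model pulled:

```python
import json
import urllib.request

def build_generate_payload(model, prompt):
    # Request body for Ollama's /api/generate endpoint.
    # stream=False asks for a single JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, host="http://localhost:11434"):
    # Requires a running Ollama server with the model already pulled,
    # e.g. `ollama pull llama3` on the command line.
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama3", "Explain model parameters in one sentence."))
```

The same endpoint works for any model you have pulled; swap the model string to switch sizes (e.g. a 7B variant on a laptop, a 70B variant on a workstation).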