Llama4Scout
17B
Overview
Llama 4 Scout is an auto-regressive language model that uses a mixture-of-experts (MoE) architecture and incorporates early fusion for native multimodality.
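The MoE idea mentioned above can be illustrated with a toy sketch: a small router scores all experts per token, only the top-k experts run, and their outputs are mixed by the renormalized routing weights. This is a minimal illustration of the general technique, not Llama 4 Scout's actual implementation; all names and sizes here are made up for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class ToyMoELayer:
    """Toy mixture-of-experts layer (illustrative only).

    A linear router scores every expert for each token; only the
    top-k experts are evaluated, and their outputs are combined
    using the routing weights renormalized over the chosen experts.
    """
    def __init__(self, d_model, n_experts, top_k, seed=0):
        rng = np.random.default_rng(seed)
        self.router = rng.standard_normal((d_model, n_experts)) * 0.02
        # Each "expert" is just a linear map here, for simplicity.
        self.experts = [rng.standard_normal((d_model, d_model)) * 0.02
                        for _ in range(n_experts)]
        self.top_k = top_k

    def __call__(self, x):
        # x: (tokens, d_model)
        logits = x @ self.router                    # (tokens, n_experts)
        topk = np.argsort(logits, axis=-1)[:, -self.top_k:]
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            idx = topk[t]
            w = softmax(logits[t, idx])             # weights over chosen experts
            for j, wj in zip(idx, w):
                out[t] += wj * (x[t] @ self.experts[j])
        return out

layer = ToyMoELayer(d_model=8, n_experts=16, top_k=2)
y = layer(np.ones((4, 8)))
print(y.shape)
```

Because only k of the experts run per token, an MoE model can hold many more total parameters than it activates for any single token, which is the trade-off the architecture is built around.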
Files and versions
Parameters: 17B
Quantization: Q2_K
Download
Gaia CLI Command
Model Metadata
Architecture: -
Finetune: -
Parameters: 17B
Quantization: Q2_K
Prompt Template: -