Llama 4 Scout
17B
Overview
Llama 4 Scout is an auto-regressive language model that uses a mixture-of-experts (MoE) architecture and incorporates early fusion for native multimodality.
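To illustrate the mixture-of-experts idea mentioned above, here is a minimal sketch of top-k gating, the routing pattern MoE layers use to activate only a few experts per token. The expert count, k=2, and the renormalization scheme are illustrative assumptions, not Llama 4 Scout's actual configuration.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_route(gate_logits, k=2):
    """Select the top-k experts for one token and renormalize their
    gate weights so they sum to 1. Purely illustrative: real MoE
    routers operate on learned per-token logits."""
    probs = softmax(gate_logits)
    topk = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    total = sum(probs[i] for i in topk)
    return [(i, probs[i] / total) for i in topk]

# Hypothetical gate logits for a 4-expert layer: experts 1 and 3 win.
weights = moe_route([0.1, 2.0, -1.0, 1.5], k=2)
print(weights)
```

The token's output is then the weighted sum of just the selected experts' outputs, which is how MoE models keep only a fraction of their total parameters active per token.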
Files and versions
Parameters: 17B
Quantization: Q2_K
Model Metadata
Architecture: -
Finetune: -
Parameters: 17B
Quantization: Q2_K
Prompt Template: -