Llama 4 Scout
17B
Overview
Llama 4 Scout is an auto-regressive language model that uses a mixture-of-experts (MoE) architecture and incorporates early fusion for native multimodality.
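To make the mixture-of-experts idea mentioned above concrete, the following is a minimal toy sketch of an MoE feed-forward layer with top-k routing in PyTorch. It is illustrative only, not the Llama 4 Scout implementation; the dimensions, expert count, and top-k value are assumptions chosen for readability.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy mixture-of-experts feed-forward layer with top-k routing."""
    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int = 1):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.SiLU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); flatten to individual tokens for routing.
        tokens = x.reshape(-1, x.shape[-1])
        gate = F.softmax(self.router(tokens), dim=-1)    # routing probabilities
        weights, chosen = gate.topk(self.top_k, dim=-1)  # top-k experts per token
        out = torch.zeros_like(tokens)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e
                if mask.any():
                    # Only the selected tokens pass through this expert.
                    out[mask] += weights[mask, slot, None] * expert(tokens[mask])
        return out.reshape_as(x)

# Example: route a batch of token embeddings through 8 experts, 1 active per token.
layer = MoELayer(d_model=64, d_hidden=256, n_experts=8, top_k=1)
print(layer(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```

Because only the routed experts run for each token, an MoE model activates a fraction of its total parameters per forward pass, which is what allows a large expert pool at a modest per-token compute cost.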
Files and versions
Parameters: 17B
Quantization: Q2_K
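As an example of how a Q2_K GGUF build like this can be loaded locally, here is a minimal sketch using llama-cpp-python. The file path and prompt are placeholders, not the actual download location or Gaia CLI command from this page.

```python
from llama_cpp import Llama

# Placeholder path for wherever the downloaded Q2_K GGUF file is saved.
llm = Llama(model_path="./llama-4-scout-17b.Q2_K.gguf", n_ctx=4096)

# Simple chat-style completion against the quantized model.
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what a mixture-of-experts model is."}]
)
print(result["choices"][0]["message"]["content"])
```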
Gaia CLI Command
Model Metadata
Architecture: -
Finetune: -
Parameters: 17B
Quantization: Q2_K
Prompt Template: -