
InfiniLoom Documentation

InfiniLoom is a high-performance repository context generator for large language models. Transform your codebase into optimized formats for Claude, GPT-4, Gemini, Llama, Mistral, DeepSeek, Qwen, and other AI models.

Installation

# Using Cargo (Rust)
cargo install infiniloom

# Using Homebrew
brew tap Topos-Labs/infiniloom
brew install infiniloom

# Using pip (Python)
pip install infiniloom

# Using npm (Node.js)
npm install -g infiniloom
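
After installing through any of these channels, the infiniloom binary should be available on your PATH. A quick sanity check, assuming the CLI follows the conventional --version and --help flags (these flags are an assumption, not confirmed by this page):

# Confirm the binary is on PATH and check the installed version
# (--version is an assumed convention, not a documented flag)
which infiniloom
infiniloom --version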

Quick Start

# Pack a repository for Claude
infiniloom pack ./my-project --model claude --output context.xml

# Generate a repository map
infiniloom map ./my-project

# Scan for secrets
infiniloom scan ./my-project --security-check
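
A minimal end-to-end sketch built only from the commands shown above plus standard shell tools; the generated context.xml is what you then paste or upload to your model of choice:

# Pack the repository for Claude, then inspect the result before sending it
infiniloom pack ./my-project --model claude --output context.xml
wc -c context.xml         # rough size check against the model's context window
head -n 20 context.xml    # preview the generated context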

Commands

Output Formats

InfiniLoom supports multiple output formats, each optimized for different use cases and target models.

Supported Models

Token counting and optimization for: