Production-Ready RAG Platform

Enterprise Features for Modern RAG

Everything you need to build, deploy, and scale production RAG applications. Built on LlamaIndex and LangGraph with enterprise-grade security and flexibility.

Agentic Workflows

Powered by LangGraph, Ragpie breaks down complex queries into manageable sub-tasks, orchestrates tool usage, and reasons through multi-step processes. Perfect for knowledge-intensive workflows that require advanced reasoning and planning.

  • Query decomposition for complex questions
  • Multi-step reasoning with state management
  • Tool orchestration and function calling
  • Graph-based workflow execution
  • Automatic error handling and retries
  • Built-in observability and debugging
python
# Define an agentic RAG workflow
from ragpie import AgenticRAG, QueryPlanner

# Initialize with LangGraph
agent = AgenticRAG(
    llm="gpt-4",
    tools=["search", "calculate", "summarize"],
    max_iterations=5
)

# Complex query gets automatically decomposed
query = "Compare Q4 2024 revenue across regions and forecast Q1 2025"
result = agent.execute(query)

# Agent automatically:
# 1. Decomposes into sub-queries
# 2. Retrieves relevant data
# 3. Performs calculations
# 4. Synthesizes final answer

Agentic workflows handle complex multi-step reasoning automatically
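
The decompose-retrieve-synthesize loop above can be sketched in plain Python. This is an illustrative toy, not Ragpie's internals: `decompose`, `execute`, and the splitting rules are hypothetical stand-ins for the LangGraph-driven planner.

```python
# Toy sketch of query decomposition: split a compound question into
# sub-queries, answer each, then synthesize a final answer.

def decompose(query: str) -> list[str]:
    """Naively split a compound query on coordinating markers."""
    for marker in (" and ", ", then "):
        if marker in query:
            head, tail = query.split(marker, 1)
            return decompose(head) + decompose(tail)
    return [query.strip()]

def execute(query: str, answer_fn) -> dict:
    """Answer each sub-query, then stitch the answers together."""
    sub_queries = decompose(query)
    answers = {q: answer_fn(q) for q in sub_queries}
    return {"sub_queries": sub_queries, "final": " ".join(answers.values())}

result = execute(
    "Compare Q4 2024 revenue across regions and forecast Q1 2025",
    answer_fn=lambda q: f"[answer to: {q}]",
)
print(result["sub_queries"])
# ['Compare Q4 2024 revenue across regions', 'forecast Q1 2025']
```

A real planner would ask the LLM to propose the sub-queries; the pattern that carries over is the recursion of decompose, solve, and synthesize.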

Multi-Modal Processing

Extract insights from text documents, images, tables, code snippets, and structured data. Ragpie seamlessly handles document parsing, OCR, table extraction, and semantic understanding across multiple modalities.

  • Process PDFs, Word docs, PowerPoint, and spreadsheets
  • Extract text from images with OCR
  • Parse tables and preserve structure
  • Understand code syntax across languages
  • Handle handwritten notes and diagrams
  • Maintain document hierarchy and relationships
python
# Process multi-modal documents
from ragpie import DocumentProcessor

processor = DocumentProcessor()

# Upload mixed content
files = [
    "contract.pdf",          # Legal document
    "financial_report.xlsx", # Spreadsheet with tables
    "diagram.png",           # Architecture diagram
    "codebase.zip"          # Source code
]

# Automatically extracts and indexes all content
documents = processor.process(files)

# Query across all modalities (using the agent from the earlier example)
result = agent.query(
    "What are the key terms in the contract and "
    "how does the architecture support them?"
)

Seamlessly process and query across text, images, tables, and code
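
One way to picture the multi-modal pipeline is dispatch by modality: each file is routed to the extraction step its format needs. A minimal sketch, where the handler names and the `route` helper are hypothetical illustrations, not Ragpie's API:

```python
from pathlib import Path

# Toy dispatch table: map a file extension to the extraction pipeline
# it needs. A real processor would invoke PDF parsing, OCR, table
# extraction, or code-aware chunking here.
HANDLERS = {
    ".pdf": "text+tables",
    ".xlsx": "tables",
    ".png": "ocr+vision",
    ".zip": "code",
}

def route(files: list[str]) -> dict[str, str]:
    """Map each file to the pipeline that should process it."""
    return {f: HANDLERS.get(Path(f).suffix.lower(), "plain-text") for f in files}

plan = route(["contract.pdf", "financial_report.xlsx", "diagram.png", "codebase.zip"])
print(plan["diagram.png"])  # ocr+vision
```

The interesting part in a real system is what each handler emits: normalized chunks plus metadata (source file, page, table cell), so that one index can answer queries spanning all modalities.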

Customizable Retrieval

Fine-tune your retrieval pipeline with hybrid search, semantic re-ranking, custom chunking strategies, and advanced filtering. Optimize for accuracy, speed, or cost based on your specific use case.

  • Hybrid search combining semantic + keyword matching
  • Neural re-ranking for improved precision
  • Customizable chunking (fixed, semantic, recursive)
  • Metadata filtering and faceted search
  • Query expansion and synonym handling
  • Configurable top-k and similarity thresholds
python
# Configure advanced retrieval
from ragpie import RetrievalConfig, ChunkingStrategy

config = RetrievalConfig(
    # Hybrid search: 70% semantic, 30% keyword
    search_type="hybrid",
    semantic_weight=0.7,

    # Re-ranking with cross-encoder
    reranker="cross-encoder",
    rerank_top_k=20,

    # Custom chunking
    chunking=ChunkingStrategy(
        method="semantic",
        chunk_size=512,
        overlap=50
    ),

    # Metadata filtering
    filters={"department": "legal", "year": 2024}
)

retriever = agent.configure_retrieval(config)
results = retriever.retrieve("find GDPR compliance docs")

Customize every aspect of the retrieval pipeline for optimal results
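
Two of the knobs in the config above are easy to see in miniature: fixed-size chunking with overlap, and hybrid scoring that blends semantic and keyword relevance by a weight. A sketch under those assumptions (the function names are illustrative, not Ragpie's internals):

```python
def chunk(text: str, chunk_size: int = 512, overlap: int = 50) -> list[str]:
    """Split text into windows of chunk_size characters, each sharing
    `overlap` characters with its predecessor."""
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

def hybrid_score(semantic: float, keyword: float, semantic_weight: float = 0.7) -> float:
    """Blend normalized semantic and keyword scores (both in [0, 1])."""
    return semantic_weight * semantic + (1 - semantic_weight) * keyword

print(len(chunk("x" * 1000)))  # 3 windows at step 512 - 50 = 462
print(hybrid_score(0.9, 0.4))  # 0.7 * 0.9 + 0.3 * 0.4 ≈ 0.75
```

In practice the semantic score comes from embedding similarity and the keyword score from something like BM25, each normalized before blending; the 70/30 split mirrors `semantic_weight=0.7` above.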

Enterprise Security

Built for regulated industries with end-to-end encryption, granular access controls, comprehensive audit logging, and compliance coverage spanning SOC 2, GDPR, HIPAA, and ISO 27001.

  • End-to-end encryption (data at rest and in transit)
  • Role-based access control (RBAC) with fine-grained permissions
  • SSO/SAML integration with enterprise identity providers
  • Comprehensive audit logs for all operations
  • SOC 2 Type II, GDPR, HIPAA compliant infrastructure
  • Data residency controls and regional deployments
python
# Configure security and access controls
from ragpie import SecurityConfig, AccessPolicy

security = SecurityConfig(
    # Encryption
    encryption="AES-256",
    key_rotation_days=90,

    # Access control
    rbac_enabled=True,
    sso_provider="okta",

    # Compliance
    compliance_mode="hipaa",
    audit_logging=True,
    data_residency="eu-west-1"
)

# Define access policies
policy = AccessPolicy(
    role="legal_analyst",
    permissions=["read", "search"],
    resources=["legal_docs/*"],
    ip_whitelist=["10.0.0.0/24"]
)

agent = AgenticRAG(security=security, policies=[policy])

Enterprise-grade security and compliance out of the box
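
The `AccessPolicy` above can be read as a four-part predicate: matching role, permitted action, resource glob, and client IP inside the whitelist. A minimal sketch of that check using only the standard library (`is_allowed` is a hypothetical illustration, not Ragpie's enforcement code):

```python
import fnmatch
import ipaddress

def is_allowed(policy: dict, role: str, action: str,
               resource: str, client_ip: str) -> bool:
    """Allow only if every clause of the policy matches."""
    if role != policy["role"] or action not in policy["permissions"]:
        return False
    if not any(fnmatch.fnmatch(resource, pat) for pat in policy["resources"]):
        return False
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(cidr) for cidr in policy["ip_whitelist"])

policy = {
    "role": "legal_analyst",
    "permissions": ["read", "search"],
    "resources": ["legal_docs/*"],
    "ip_whitelist": ["10.0.0.0/24"],
}

print(is_allowed(policy, "legal_analyst", "read", "legal_docs/gdpr.pdf", "10.0.0.17"))   # True
print(is_allowed(policy, "legal_analyst", "delete", "legal_docs/gdpr.pdf", "10.0.0.17")) # False
```

Deny-by-default is the design choice worth noting: a request passes only when every clause matches, so a missing or malformed policy blocks access rather than granting it.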

Flexible Deployment

Deploy Ragpie on cloud, on-premises, or hybrid environments. Kubernetes-native architecture with Docker containers, infrastructure-as-code templates, and support for all major cloud providers.

  • Cloud deployment (AWS, GCP, Azure) with auto-scaling
  • On-premises installation for sensitive data
  • Hybrid deployments with data synchronization
  • Kubernetes-native with Helm charts
  • Infrastructure-as-code (Terraform, CloudFormation)
  • Multi-region and edge deployment support
yaml
# Deploy with Kubernetes
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ragpie-production
spec:
  replicas: 3
  template:
    spec:
      containers:
      - name: ragpie
        image: ragpie/server:latest
        env:
        - name: DEPLOYMENT_MODE
          value: "on-premises"
        - name: DATA_RESIDENCY
          value: "us-east-1"
        resources:
          limits:
            cpu: "2"
            memory: "4Gi"
        volumeMounts:
        - name: data
          mountPath: /data
---
# Or use docker-compose for local development
version: '3.8'
services:
  ragpie:
    image: ragpie/server:latest
    ports:
      - "8000:8000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}

Deploy anywhere with containers, Kubernetes, or traditional infrastructure

LLM Flexibility

Switch among OpenAI, Anthropic, and Cohere models, or bring your own LLM, all through a unified API. Optimize for cost, latency, or accuracy by routing queries to different models based on complexity.

  • Support for OpenAI (GPT-4, GPT-3.5), Anthropic (Claude), Cohere
  • Bring your own model (open-source or custom)
  • Automatic model routing based on query complexity
  • Cost optimization with model fallbacks
  • Fine-tuned models for domain-specific tasks
  • Unified API across all providers
python
# Configure multiple LLM providers
from ragpie import LLMConfig, ModelRouter

llm_config = LLMConfig(
    providers={
        "openai": {
            "api_key": "sk-...",
            "models": ["gpt-4", "gpt-3.5-turbo"]
        },
        "anthropic": {
            "api_key": "sk-ant-...",
            "models": ["claude-3-opus", "claude-3-sonnet"]
        },
        "cohere": {
            "api_key": "...",
            "models": ["command-r-plus"]
        }
    }
)

# Automatic routing based on query complexity
router = ModelRouter(
    simple_queries="gpt-3.5-turbo",  # Cost-effective
    complex_queries="gpt-4",          # High accuracy
    reasoning_queries="claude-3-opus" # Best reasoning
)

agent = AgenticRAG(llm_config=llm_config, router=router)

Mix and match LLM providers for optimal cost and performance
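
The routing idea reduces to a classifier from query to model tier. A toy sketch using cheap heuristics (the hint words, thresholds, and `pick_model` are illustrative assumptions; a production router would typically use a learned classifier):

```python
ROUTES = {
    "simple": "gpt-3.5-turbo",     # cost-effective
    "complex": "gpt-4",            # high accuracy
    "reasoning": "claude-3-opus",  # best reasoning
}

# Words that suggest multi-step reasoning (illustrative heuristic).
REASONING_HINTS = ("why", "explain", "compare", "forecast")

def pick_model(query: str) -> str:
    """Route a query to a model tier by crude complexity heuristics."""
    q = query.lower()
    if any(hint in q for hint in REASONING_HINTS):
        return ROUTES["reasoning"]
    if len(q.split()) > 20:  # long queries -> stronger model
        return ROUTES["complex"]
    return ROUTES["simple"]

print(pick_model("What is our refund policy?"))                 # gpt-3.5-turbo
print(pick_model("Compare Q4 revenue and forecast Q1 trends"))  # claude-3-opus
```

Adding a fallback chain (retry a failed or rate-limited call on the next cheaper tier) is the usual companion to routing for cost control.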

Built on Battle-Tested Open Source

Ragpie leverages industry-leading frameworks and is MIT licensed for maximum flexibility

LlamaIndex

Advanced data framework for LLM applications with powerful indexing and retrieval

LangGraph

Build stateful, multi-actor applications with LLMs using graph-based workflows

MIT Licensed

Open source core with commercial support for enterprise deployments

Trusted by developers at

Enterprise Co. · Tech Startup · Research Lab

Ready to Build Production RAG?

Start with our free tier and scale as you grow. No credit card required.