
New Features in v0.1.31

This document outlines the major features added in indusagi-coding-agent v0.1.31.

Quick Start

  • MCP Guide - Learn how to use Model Context Protocol for external integrations
  • Memory Guide - Learn how Memory helps the assistant remember context

Feature Summary

1. Model Context Protocol (MCP)

Connect your AI assistant to external tools and services.

What it enables:

  • Access external APIs and tools
  • Filesystem operations
  • GitHub integration
  • Web search and scraping
  • Database queries
  • Real-time web access

Quick setup:

# Create ~/.indusagi/mcp-servers.json
indusagi
# Tools are automatically available!

Examples:

User: "Read my README.md"
# Uses filesystem MCP

User: "Search GitHub for open issues"
# Uses GitHub MCP

User: "What's the latest Node.js update?"
# Uses web search MCP

Read Full MCP Guide →


2. Memory System

The assistant now remembers important context across sessions.

What it enables:

  • Persistent conversation history
  • Semantic understanding of past decisions
  • Context-aware responses
  • Faster problem-solving
  • Project knowledge accumulation

Automatic setup: Memory works out of the box. Just start using indusagi!

Examples:

Session 1:
User: "I prefer TypeScript for all projects"

Session 2 (days later):
User: "Create a new API"
Assistant: "I'll use TypeScript as you prefer..."
# Assistant remembers your preference!

Read Full Memory Guide →


Feature Comparison

Feature            | Before v0.1.31            | After v0.1.31
External Tools     | Limited to built-in tools | Full MCP ecosystem
GitHub Integration | Manual instructions       | Automatic via MCP
Web Access         | Web fetch only            | Web search, fetch, scraping
Context Memory     | Session-only              | Persistent across sessions
Semantic Search    | Not available             | Full vector-based search
Database Access    | No direct access          | Multiple DB systems supported

Configuration Files

MCP Configuration

Location: ~/.indusagi/mcp-servers.json

{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "$HOME"]
    }
  }
}
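To illustrate, the config above can be parsed into launch commands roughly like this (a minimal Python sketch; the "servers" → name → command/args schema is assumed from the example above, and the actual indusagi loader may differ):

```python
import json

# Minimal sketch of parsing mcp-servers.json. The schema is taken
# from the example config above; the real loader may differ.
config_text = """
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "$HOME"]
    }
  }
}
"""

config = json.loads(config_text)
for name, spec in config["servers"].items():
    # Each entry becomes a subprocess command the MCP client can spawn.
    cmd = [spec["command"], *spec.get("args", [])]
    print(f"{name}: {' '.join(cmd)}")
```

Each entry maps to one server process; adding a server is just adding another key under "servers".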

Memory Configuration

Location: ~/.indusagi/memory.json

{
  "enabled": true,
  "storage": "in-memory",
  "vectorStore": "in-memory",
  "maxMemoryItems": 1000,
  "similarityThreshold": 0.7
}
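The similarityThreshold setting suggests a memory is retrieved only when its embedding similarity to the query clears 0.7. A hedged sketch of how such a cutoff could work with cosine similarity (the vectors and memory texts are invented for illustration, not real embeddings):

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

SIMILARITY_THRESHOLD = 0.7  # mirrors "similarityThreshold" above

query = [0.9, 0.1, 0.0]  # toy embedding of the current question
memories = {
    "prefers TypeScript": [1.0, 0.0, 0.0],
    "likes pizza":        [0.0, 0.2, 1.0],
}

# Keep only memories whose similarity clears the threshold.
relevant = {
    text: cosine_similarity(query, vec)
    for text, vec in memories.items()
    if cosine_similarity(query, vec) >= SIMILARITY_THRESHOLD
}
print(relevant)
```

Raising the threshold makes retrieval stricter; lowering it pulls in looser matches.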

Common Use Cases

Use Case 1: Code Review with Context

User: "Review my TypeScript code. Remember, we prefer functional patterns."

Memory: Recalls your functional programming preference
MCP: Accesses your GitHub to fetch the code
Result: Context-aware, personalized review

Use Case 2: Project Management

User: "Update the project status"

Memory: Recalls previous project decisions
MCP: Accesses GitHub to update issues
Result: Consistent with past architectural decisions

Use Case 3: Research with Memory

Session 1: "We're building a real-time chat app"
Session 2: "What database should we use?"

Memory: Recalls the chat app context
MCP: Searches for latest solutions
Result: Recommendations tailored to your use case

Architecture

MCP Integration

indusagi
  ↓
MCP Client Pool
  ├─→ Filesystem Server
  ├─→ GitHub Server
  ├─→ Web Search Server
  └─→ Custom Servers
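Conceptually, the client pool maps a server name to a connection and forwards tool calls to it. A toy sketch (the class name, server names, and handler signatures are illustrative, not the actual indusagi internals):

```python
# Illustrative client pool: registers named servers and routes
# tool calls to the matching one. Not the real implementation.
class MCPClientPool:
    def __init__(self):
        self.servers = {}

    def register(self, name, handler):
        self.servers[name] = handler

    def call(self, server, tool, **kwargs):
        if server not in self.servers:
            raise KeyError(f"no MCP server named {server!r}")
        return self.servers[server](tool, **kwargs)

pool = MCPClientPool()
# A stub "filesystem" server that echoes the request it received.
pool.register("filesystem", lambda tool, **kw: f"fs:{tool}:{kw.get('path')}")
result = pool.call("filesystem", "read_file", path="README.md")
print(result)
```

Custom servers slot in the same way: register a name, and calls addressed to it are routed there.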

Memory Integration

Conversation
  ↓
Memory System
  ├─→ Store as vector embedding
  ├─→ Search for relevant context
  └─→ Inject into conversation
  ↓
Enhanced Response
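The store, search, and inject steps above can be sketched end-to-end; here simple word overlap stands in for real vector embeddings, purely for illustration:

```python
# Toy memory pipeline: store text, search by relevance, inject
# the best match as context. Word overlap replaces real embeddings.
memory_store = []

def store(text):
    memory_store.append((text, set(text.lower().split())))

def search(query, top_k=1):
    # Rank stored memories by how many words they share with the query.
    q = set(query.lower().split())
    ranked = sorted(memory_store, key=lambda m: len(q & m[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

def inject(query):
    # Prepend the retrieved context to the user's message.
    context = search(query)
    return f"Context: {context}\nUser: {query}"

store("user prefers TypeScript for all projects")
store("project is a real-time chat app")
prompt = inject("which language do we prefer for projects")
print(prompt)
```

In the real system the ranking step would use the vector store and the similarityThreshold described earlier.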

Environment Variables

MCP

INDUSAGI_DEBUG=1              # Enable MCP debug output
MCP_CONFIG_PATH=~/.indusagi/  # Custom config location

Memory

MEMORY_ENABLED=true           # Enable/disable memory
MEMORY_THRESHOLD=0.7          # Similarity threshold
OPENAI_API_KEY=sk-...         # For semantic embeddings
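These variables might be read with sensible defaults roughly as follows (an assumption about parsing, not indusagi's actual code; the variable names are taken from the lists above):

```python
import os

# Illustrative parsing of the variables listed above, with assumed
# defaults; the real indusagi code may behave differently.
debug = os.environ.get("INDUSAGI_DEBUG", "0") == "1"
memory_enabled = os.environ.get("MEMORY_ENABLED", "true").lower() == "true"
threshold = float(os.environ.get("MEMORY_THRESHOLD", "0.7"))
print(f"debug={debug} memory={memory_enabled} threshold={threshold}")
```

Exporting any of them in your shell profile makes the setting persistent across sessions.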

Integration with Other Features

GPT-5.4 Models

v0.1.31 includes full support for GPT-5.4:

  • gpt-5.4 - Base model
  • gpt-5.4-codex - Coding optimized (default for openai-codex)

These work seamlessly with MCP and Memory!

Clean CLI Output

MCP debug messages are suppressed by default for a clean terminal experience.

Enable with: INDUSAGI_DEBUG=1 indusagi


Getting Started

Step 1: Update to v0.1.31

npm install -g indusagi-coding-agent@0.1.31
indusagi --version
# Should show: 0.1.31

Step 2: Enable MCP (Optional)

# Copy example config
cp ~/.npm-global/lib/node_modules/indusagi-coding-agent/examples/mcp-servers.example.json ~/.indusagi/mcp-servers.json

# Edit to customize
nano ~/.indusagi/mcp-servers.json

Step 3: Set Environment Variables

export GITHUB_TOKEN="your_token"
export BRAVE_API_KEY="your_key"
export OPENAI_API_KEY="sk-..."

Step 4: Start Using!

indusagi

Memory and MCP are ready to use!


Documentation Index

Topic             | Document
MCP Setup & Usage | MCP.md
Memory System     | MEMORY.md
Installation      | ../README.md
Troubleshooting   | ../README.md#troubleshooting

Tips & Tricks

MCP Tips

  • Start with filesystem and web search for maximum value
  • Set up GitHub if you work with repos
  • Use environment variables for secrets

Memory Tips

  • Be explicit about preferences and constraints
  • Reference past decisions to strengthen memory
  • Start sessions with relevant context
  • Search memory with indusagi --search-memory "query"

FAQ

Q: Is my memory data stored online?
A: No! Memory is stored locally in ~/.indusagi/memory/. The only external call is to the OpenAI embeddings API, and only if semantic embeddings are enabled.

Q: Can I use MCP without Memory?
A: Yes! They work independently. Enable only what you need.

Q: Does Memory slow down indusagi?
A: The impact is minimal (under 5% overhead); memory retrieval typically completes in under 100 ms.

Q: Can I share Memory across machines?
A: Export and import: indusagi --export-memory and indusagi --import-memory file.json

Q: How often should I clear Memory?
A: You don't need to! Memory automatically manages old items. Clear only if privacy is a concern.


Getting Help

If you encounter issues:

  1. MCP Issues: See MCP.md Troubleshooting
  2. Memory Issues: See MEMORY.md Troubleshooting
  3. General Issues: Check ../README.md

Version: 0.1.31
Released: March 2026
Status: Production Ready