CLI Guide
The Gentoro OneMCP CLI is your command center for connecting APIs to AI models. This guide covers everything from installation to advanced workflows.
Installation
Quick Install (Recommended)
Run the install script, which by default installs the CLI globally via npm:
curl -sSL https://raw.githubusercontent.com/Gentoro-OneMCP/onemcp/main/cli/install.sh | bash
This method works on macOS, Linux, and Windows (WSL).
Alternative Installation Methods
System-wide Installation
Install to /usr/local/bin (requires sudo):
curl -sSL https://raw.githubusercontent.com/Gentoro-OneMCP/onemcp/main/cli/install.sh | \
ONEMCP_INSTALL_METHOD=system-wide bash
Local Installation
Install to ~/.local/bin (no sudo required):
curl -sSL https://raw.githubusercontent.com/Gentoro-OneMCP/onemcp/main/cli/install.sh | \
ONEMCP_INSTALL_METHOD=local-bin bash
Note: Ensure ~/.local/bin is in your PATH.
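If onemcp is not found after a local install, ~/.local/bin is probably missing from your PATH. A minimal fix, assuming a Bash shell (use the corresponding profile file for zsh or other shells):
# Make ~/.local/bin available in the current session and in future Bash sessions
export PATH="$HOME/.local/bin:$PATH"
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc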
Install from Custom Branch
For testing or development:
curl -sSL https://raw.githubusercontent.com/Gentoro-OneMCP/onemcp/main/cli/install.sh | \
ONEMCP_REPO_BRANCH=your-branch-name bash
Local Development
Clone and build from source:
git clone https://github.com/Gentoro-OneMCP/onemcp.git
cd onemcp/cli
npm install && npm run build && npm link
Core Workflows
Starting Your First Chat
The simplest way to get started:
onemcp chat
On first run, the setup wizard will:
- Ask you to choose an AI provider (OpenAI, Anthropic, or Gemini)
- Prompt for your API key
- Let you choose between the ACME Analytics example or your own API
- Start all required services automatically
Working with AI Providers
OneMCP supports multiple AI providers. You can configure and switch between them easily.
Set Up a Provider
onemcp provider set
This interactive command will:
- Let you choose a provider (OpenAI, Anthropic, Gemini)
- Prompt for your API key
- Save the configuration
Switch Between Providers
onemcp provider switch
Choose from your configured providers. No need to re-enter API keys.
List Configured Providers
onemcp provider list
Shows all providers you’ve configured and their API keys (masked).
Managing Handbooks
Handbooks are the core organizational unit in OneMCP. Each handbook represents a service or API you want to connect to AI.
Create a New Handbook
onemcp handbook init my-service
This creates a new handbook directory with the standard structure:
~/handbooks/my-service/
├── Agent.md # Instructions for the AI
├── apis/ # OpenAPI specifications
├── docs/ # Supplementary documentation
├── regression/ # Optional test definitions
└── state/ # Auto-generated runtime data
List All Handbooks
onemcp handbook list
Shows all handbooks in your handbooks directory.
Switch Active Handbook
onemcp handbook use my-service
Sets my-service as the current handbook. All subsequent onemcp chat commands will use this handbook.
Check Current Handbook
onemcp handbook current
Shows which handbook is currently active.
Validate Handbook Structure
onemcp handbook validate
Checks that your handbook has the required structure and files.
Chatting with Different Handbooks
You can chat with any handbook without changing the active one:
# Chat with the current handbook
onemcp chat
# Chat with a specific handbook
onemcp chat analytics-dashboard
# Switch handbooks during chat
# Type 'switch' in the chat interface
Connecting Your Own API
Here’s a complete workflow for connecting your own API:
1. Create a Handbook
onemcp handbook init my-api
onemcp handbook use my-api
2. Add Your OpenAPI Specification
Place your OpenAPI spec in the handbook’s apis/ directory:
cp path/to/your-api-spec.yaml ~/handbooks/my-api/apis/
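Any valid OpenAPI 3.x document works here. As a purely illustrative sketch (the title, server URL, and path below are placeholders, not something OneMCP requires), a minimal spec saved as apis/my-api.yaml could look like:
openapi: 3.0.3
info:
  title: My API
  version: 1.0.0
servers:
  - url: https://api.example.com/v1
paths:
  /customers:
    get:
      summary: List customers
      operationId: listCustomers
      responses:
        '200':
          description: A list of customers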
3. Configure Service Authentication
If your API requires authentication:
onemcp service auth my-api
This creates a service.yaml file with your authentication configuration:
service: my-api
header: Authorization
pattern: Bearer {token}
token: your-token-here
expiresAt: 2025-12-31T23:59:59Z
4. Add Documentation (Optional)
Place any relevant documentation in the docs/ directory:
cp path/to/docs/*.md ~/handbooks/my-api/docs/
5. Write Agent Instructions
Edit ~/handbooks/my-api/Agent.md to provide instructions for the AI:
# My API Agent
You are an assistant that helps users interact with My API.
## Capabilities
- Query customer data
- Generate reports
- Update records
## Guidelines
- Always validate input before making API calls
- Provide clear summaries of results
- Handle errors gracefully
6. Start Chatting
onemcp chat
The CLI will automatically index your OpenAPI spec and documentation.
Managing Service Authentication
Configure Authentication
onemcp service auth my-service
Prompts for:
- Service name
- Authentication header (e.g., Authorization)
- Token pattern (e.g., Bearer {token})
- Token value
- Expiration date
Renew a Token
onemcp service renew my-service
Updates the token and expiration date for an existing service.
List Configured Services
onemcp service list
Shows all services with authentication configured.
Service Management
Check Service Status
onemcp status
Shows the health status of all running services:
- OneMCP Server (port 8080)
- TypeScript Runtime (port 7070)
- Mock Server (port 8082, if enabled)
- OpenTelemetry Collector (port 4317, if enabled)
Stop All Services
onemcp stop
Gracefully stops all OneMCP services.
View Logs
# View all logs
onemcp logs
# View logs for a specific service
onemcp logs app
onemcp logs typescript
onemcp logs mock
# Follow logs in real-time
onemcp logs -f
# Show last N lines
onemcp logs -n 100
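If the CLI itself is unavailable, the same information is in the log files under ~/.onemcp/logs/ (see Logs Location below), which you can tail directly:
# Tail the server log file without going through the CLI
tail -f ~/.onemcp/logs/app.log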
Architecture
The CLI manages multiple services that run as child processes:
OneMCP Server (Port 8080)
The main Spring Boot application that:
- Exposes the MCP endpoint at /mcp (see the sketch after this list)
- Handles AI model interactions
- Manages the knowledge base
- Orchestrates tool execution
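To see what the endpoint speaks, you can poke it with curl. This is only a rough sketch that assumes the server follows the standard MCP JSON-RPC-over-HTTP transport on the default port; a real MCP client performs an initialize handshake first, and the exact headers and session handling the server expects may differ:
# Illustrative request listing the tools the server exposes (assumes default port 8080)
curl -X POST http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'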
TypeScript Runtime (Port 7070)
Executes TypeScript code snippets generated by the AI for:
- Data transformations
- API call orchestration
- Complex workflows
Mock Server (Port 8082)
The ACME Analytics mock server provides a complete example API for testing and learning. It includes:
- Sample sales analytics endpoints
- Realistic data
- Full OpenAPI specification
OpenTelemetry Collector (Port 4317)
Optional telemetry collection for:
- Distributed tracing
- Metrics collection
- Log aggregation
All services include automatic health checks and log management.
Configuration
Global Configuration
Location: ~/.onemcp/config.yaml
provider: openai # Default AI provider
apiKeys:
  openai: sk-... # Provider API keys
  gemini: your-gemini-key
  anthropic: your-anthropic-key
currentHandbook: my-service # Active handbook
handbookDir: ~/handbooks # Handbooks directory
logDir: ~/.onemcp/logs # Logs directory
Handbook Structure
Each handbook follows this structure:
handbook/
├── Agent.md # AI instructions
├── apis/ # OpenAPI specs
│   └── my-api.yaml
├── docs/ # Documentation
│   ├── overview.md
│   └── examples.md
├── regression/ # Test definitions
│   └── test-cases.yaml
└── state/ # Runtime data
    └── knowledge-base-state.json
Service Authentication
Location: ~/handbooks/{handbook}/service.yaml
service: my-api
header: Authorization # Authentication header
pattern: Bearer {token} # Token pattern
token: eyJh... # Token value
expiresAt: 2025-10-25T14:02:12Z # Expiration
Troubleshooting
CLI Won’t Start
Check system requirements:
onemcp doctor
This verifies:
- Node.js >= 20
- Java 21
- Maven installation
- Network connectivity
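You can also verify the prerequisites by hand, assuming the tools are on your PATH:
# Check prerequisite versions manually
node --version   # should report v20 or newer
java -version    # should report 21
mvn -version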
Services Not Healthy
# Check status
onemcp status
# View logs for errors
onemcp logs app
# Restart services
onemcp stop
onemcp chat
Token Expired
onemcp service renew my-service
Enter a new token and expiration date.
API Key Issues
# List configured providers
onemcp provider list
# Configure a new provider
onemcp provider set
# Switch providers
onemcp provider switch
Reset Everything
If you need to start fresh:
onemcp reset
This will:
- Stop all services
- Delete configuration files
- Re-run the setup wizard
Warning: This deletes all configuration. Your handbooks are preserved.
Port Conflicts
If ports 8080, 7070, or 8082 are already in use:
- Stop the conflicting service
- Or modify the port configuration in ~/.onemcp/config.yaml
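To find out which process is holding a port (macOS/Linux, assuming lsof is installed):
# Identify the process listening on port 8080 (repeat for 7070 and 8082)
lsof -nP -iTCP:8080 -sTCP:LISTEN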
Logs Location
All logs are stored in ~/.onemcp/logs/:
~/.onemcp/logs/
├── app.log # OneMCP server
├── typescript.log # TypeScript runtime
├── mock.log # Mock server
└── archived/ # Archived logs
Logs are automatically rotated and archived.
Updating
Keep your CLI up to date:
onemcp update
This will:
- Pull the latest version from GitHub
- Rebuild the CLI
- Preserve your configuration
Uninstallation
Automatic Uninstall
curl -sSL https://raw.githubusercontent.com/Gentoro-OneMCP/onemcp/main/cli/uninstall.sh | bash
Manual Uninstall
# Stop services
onemcp stop
# Remove based on installation method
# npm:
npm uninstall -g @gentoro/onemcp-cli
# system-wide:
sudo rm /usr/local/bin/onemcp
# local-bin:
rm ~/.local/bin/onemcp
# Remove configuration and source
rm -rf ~/.onemcp-src ~/.onemcp
# Optionally remove handbooks
rm -rf ~/handbooks
Best Practices
Organizing Handbooks
- One handbook per service: Keep each API or service in its own handbook
- Descriptive names: Use clear, descriptive names like customer-api or analytics-service
- Documentation: Always include comprehensive documentation in the docs/ directory
- Version control: Consider versioning your handbooks with Git (see the example below)
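For example, to put a handbook under Git (this is a suggestion, not something OneMCP requires; state/ holds auto-generated runtime data, so you will likely want to ignore it):
cd ~/handbooks/my-api
git init
echo "state/" >> .gitignore
git add .
git commit -m "Initial handbook"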
Writing Agent Instructions
- Be specific: Clearly define what the AI should and shouldn’t do
- Include examples: Show example queries and expected responses
- Set boundaries: Define any limitations or constraints
- Update regularly: Keep instructions current as your API evolves
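As an illustration of these points, an Agent.md might end with a short examples-and-boundaries section like the following (the queries and behaviors are placeholders for your own API):
## Examples
- "Show total sales for Q3": call the reporting endpoint and summarize the top-line numbers
- "Update the email for customer 42": confirm the new value with the user before calling the API
## Boundaries
- Never delete records without explicit confirmation
- Do not expose raw authentication tokens in responses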
Managing Multiple Environments
Create separate handbooks for different environments:
onemcp handbook init my-api-dev
onemcp handbook init my-api-staging
onemcp handbook init my-api-prod
Configure different service authentication for each environment.
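For example, you can switch to each handbook in turn and configure its credentials separately:
# Configure dev credentials
onemcp handbook use my-api-dev
onemcp service auth my-api-dev
# Repeat for the staging and prod handbooks with their respective tokens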
Security
- Protect API keys: Never commit API keys to version control
- Token expiration: Set realistic expiration dates for service tokens
- Regular rotation: Rotate tokens regularly using onemcp service renew
- Least privilege: Use tokens with minimal required permissions
Next Steps
- Explore Concepts: Learn about Architecture and Retrieval
- Command Reference: See the complete CLI Reference
- Add Tools: Follow the Adding a Tool guide
- Docker Deployment: For production, see Run with Docker