Configuration & Endpoints
Minimal settings to run locally or in Docker. Values shown reflect defaults in the image and application.yaml.
Environment Variables
- OPENAI_API_KEY – API key for OpenAI (optional if using other providers)
- GEMINI_API_KEY – API key for Gemini (optional)
- ANTHROPIC_API_KEY – API key for Anthropic (optional)
- OTEL_EXPORTER_OTLP_ENDPOINT – OTLP collector endpoint (default: http://localhost:4317)
- TS_RUNTIME_URL – TypeScript runtime URL (default: http://localhost:7070) – optional/placeholder
- FOUNDATION_DIR – Foundation directory (default: /var/foundation)
- INFERENCE_DEFAULT_PROVIDER – AI provider (openai, gemini, or anthropic)
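As an illustration, the variables above can be supplied directly to the published image. This is a sketch: the API key value and the host-side foundation path are placeholders, and the volume mount is an assumption about how a local foundation directory would be provided.

```shell
# Run the image with a provider selected and its key supplied.
# $OPENAI_API_KEY and the host ./foundation directory are placeholders.
docker run --rm -p 8080:8080 \
  -e INFERENCE_DEFAULT_PROVIDER=openai \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  -e FOUNDATION_DIR=/var/foundation \
  -v "$(pwd)/foundation:/var/foundation" \
  admingentoro/gentoro:latest
```

Unset provider keys can simply be omitted; only the provider named in INFERENCE_DEFAULT_PROVIDER needs a valid key.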
Process Arguments
- --process=validate – validates foundation data and exits
- --process=regression – runs the regression test suite and exits
- --process=standard (default) – starts the MCP server + orchestrator
These can be passed via APP_ARGS in Docker:
# Validation mode
docker run --rm -e APP_ARGS="--process=validate" admingentoro/gentoro:latest
# Regression mode
docker run --rm -e APP_ARGS="--process=regression" admingentoro/gentoro:latest
# Standard mode (default)
docker run --rm -p 8080:8080 admingentoro/gentoro:latest

Foundation Directory Structure
The process modes expect a foundation directory with the following structure:
foundation/
├── Agent.md                        # Required: Agent configuration
├── apis/                           # Optional: OpenAPI specifications
│   └── *.yaml
├── docs/                           # Optional: Documentation files
│   └── *.md
├── regression/                     # Optional: Regression test files
│   └── *.yaml
└── state/                          # Auto-generated: Knowledge base state
    └── knowledge-base-state.json   # Indexed docs, APIs, and content signatures

MCP Endpoint
- Path: /mcp (HTTP stream)
- Example URL: http://localhost:8080/mcp
Client connections must speak the Model Context Protocol over HTTP streaming.
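As a quick connectivity check, an MCP initialize request can be sent with curl. This is a sketch, not an official client: it assumes a server started in standard mode on port 8080, and the protocol version string and client name in the payload are illustrative values, not ones mandated by this application.

```shell
# Send a JSON-RPC 2.0 initialize request to the MCP endpoint.
# Streamable-HTTP MCP clients advertise that they accept both
# plain JSON and SSE responses via the Accept header.
curl -sS -X POST http://localhost:8080/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"probe","version":"0.0.0"}}}'
```

A successful handshake returns the server's protocol version and capabilities; a real client would follow up with an initialized notification before issuing tool calls.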