Overview (Alpha)
Enginy MCP lets you connect an AI assistant to your Enginy workspace so it can look things up, summarize what it finds, and help you prepare work inside Enginy.
If you are in sales ops, rev ops, or another non-technical role, start with Claude Code: it is the clearest setup path in the current alpha.
The hosted resource server is:
MCP Overview (Alpha)
Learn how the hosted server, OAuth flow, OpenAPI-derived tools, and workspace policy model fit together.
Connect Claude Code (Alpha)
Register the hosted server in Claude Code, complete OAuth, and verify the granted scopes.
Clients & interfaces (Alpha)
Get connection guidance for Claude, Codex, Cursor, VS Code, ChatGPT developer mode, and other remote MCP clients.
Client support matrix (Alpha)
Compare supported interfaces, setup shapes, and caveats before choosing the client you want to use.
Supported tools (Alpha)
See what Enginy MCP can do today, how the tool surface is generated, and what is not guaranteed yet.
Example prompts (Alpha)
Use practical prompts for Claude, Codex, Cursor, ChatGPT developer mode, and other MCP clients.
Scopes and policy (Alpha)
Review scope groups, policy ceilings, and how Enginy maps MCP permissions onto OpenAPI routes.
FAQ / known limitations (Alpha)
Get quick answers about the alpha, current limitations, and what assumptions are unsafe to make.
When to use MCP instead of direct HTTP (Alpha)
- Use MCP when you want an AI assistant to work inside Enginy for you, without building a custom integration.
- Use direct HTTP when an engineer is building a fixed product integration against specific Enginy API endpoints.
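For contrast with the MCP path, a direct HTTP integration is a fixed, engineer-built call against a specific endpoint. The sketch below builds an authenticated request using only Python's standard library; the host, route, and token values are placeholders for illustration, not part of any documented Enginy API.

```python
import urllib.request

# Placeholder values -- substitute your real workspace host and API token.
ENGINY_HOST = "https://api.example-enginy.test"
API_TOKEN = "YOUR_API_TOKEN"

def build_request(path: str) -> urllib.request.Request:
    """Build an authenticated GET request against a fixed Enginy-style route."""
    return urllib.request.Request(
        url=f"{ENGINY_HOST}{path}",
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Accept": "application/json",
        },
        method="GET",
    )

# Hypothetical route, shown only to illustrate the shape of a fixed integration.
req = build_request("/v1/records")
print(req.full_url)
```

The point of the comparison: a request like this is hard-coded to one route and one credential, which is exactly what you want for a product integration and exactly what you avoid by letting an MCP client discover tools and scopes at connect time.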
Fastest path for Claude Code (Alpha)
This is the recommended first setup for most teams during the alpha.
- In Enginy, enable MCP for the workspace and allow `http://localhost:3118/callback`.
- Add the hosted server:
- In Claude Code, open:
- Authenticate in the browser and approve the requested scopes.
- Run `mcp_whoami` to confirm the granted client ID, user, and scopes.
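The final verification step can also be done programmatically. The sketch below parses an `mcp_whoami`-style JSON payload and reports any expected scopes that were not granted; the payload shape, field names, and scope names are assumptions for illustration, not a documented Enginy response format.

```python
import json

# Example payload shaped like what an mcp_whoami-style call might return.
# Field names and scope names here are assumptions, not a documented format.
raw = """
{
  "client_id": "enginy-mcp-alpha",
  "user": "rev-ops@example.com",
  "scopes": ["records:read", "records:summarize"]
}
"""

def missing_scopes(payload: str, required: set) -> set:
    """Return the required scopes that are absent from the granted set."""
    granted = set(json.loads(payload)["scopes"])
    return required - granted

print(missing_scopes(raw, {"records:read"}))   # empty set: scope was granted
print(missing_scopes(raw, {"records:write"}))  # non-empty: scope is missing
```

If the returned set is non-empty, revisit the workspace's MCP settings and re-run the OAuth approval before assuming the assistant can perform those actions.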