## Quickstarts by interface (Alpha)
These are the main interface patterns you are likely to use with Enginy MCP today.
| Interface | How to connect Enginy MCP (Alpha) | What to expect |
|---|---|---|
| Claude Code | `claude mcp add --transport http --callback-port 3118 enginy https://openapi.enginy.ai/mcp` | Browser-based approval, then tool use inside Claude Code |
| Other Claude interfaces with remote MCP support | Add the same hosted MCP URL in that interface’s MCP settings | Similar approval flow, but the UI depends on that Claude interface |
| Codex CLI / IDE | `codex mcp add enginy --url https://openapi.enginy.ai/mcp` or add the server in `~/.codex/config.toml` | Likely to work if your Codex setup supports remote MCP |
| Cursor | Add an `enginy` entry to `.cursor/mcp.json` or `~/.cursor/mcp.json` | Likely to work if your Cursor setup supports remote MCP |
| VS Code / GitHub Copilot Agent mode | Add the server to `.vscode/mcp.json` with `type: "http"` | Likely to work in VS Code-based MCP workflows |
| ChatGPT developer mode | Add Enginy as a remote MCP app/server in settings | Useful for chat-style testing if your ChatGPT setup supports remote MCP |
| Other remote-MCP clients | Add `https://openapi.enginy.ai/mcp` and follow the client’s remote MCP instructions | Usually needs technical help unless the client already supports remote MCP well |
## Claude interfaces (Alpha)

### Claude Code
Use the hosted server directly: add it with the quickstart command from the table above, then run `/mcp` in Claude Code and complete the browser approval flow.
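For copy-paste convenience, this is the same command shown in the quickstart table:

```shell
claude mcp add --transport http --callback-port 3118 enginy https://openapi.enginy.ai/mcp
```

After the server is added, `/mcp` inside Claude Code lets you trigger and verify the approval.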
### Other Claude interfaces
If a Claude surface supports remote MCP directly, use the same hosted URL. Keep in mind:
- the workspace must allow the redirect URI
- the client completes OAuth in the browser
- the final tool set depends on the granted scopes
## Codex / OpenAI interfaces (Alpha)

### Codex CLI or IDE extension
If your team uses Codex and your version supports remote MCP, add Enginy with `codex mcp add enginy --url https://openapi.enginy.ai/mcp`, or configure it by hand in `~/.codex/config.toml`. Then run `codex mcp login enginy` and complete the Enginy browser approval flow when prompted.
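If you configure the server by hand, a minimal `~/.codex/config.toml` sketch might look like this. The `[mcp_servers]` table shape with a `url` key is an assumption about your Codex version; check your Codex documentation for the exact config schema:

```toml
# Hosted Enginy MCP server (remote, Streamable HTTP)
[mcp_servers.enginy]
url = "https://openapi.enginy.ai/mcp"
```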
### ChatGPT developer mode
If your ChatGPT environment supports remote MCP apps/connectors, add the Enginy MCP server there and complete the same browser approval flow. This is useful when you want to test Enginy tools in a chat-style interface instead of in an IDE or terminal agent.

## Cursor (Alpha)
If your Cursor setup supports remote MCP, it can read hosted MCP servers from `mcp.json`. Add a project-level entry in `.cursor/mcp.json`, or use `~/.cursor/mcp.json` for a global setup.
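A minimal project-level sketch, assuming Cursor's usual `mcpServers` config shape (verify against your Cursor version's MCP docs):

```json
{
  "mcpServers": {
    "enginy": {
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
```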
After restarting Cursor or refreshing MCP servers, let Cursor open the browser flow and approve Enginy.
## VS Code and similar agent UIs (Alpha)
For editors that use the VS Code MCP config shape, add Enginy to `.vscode/mcp.json` with `type: "http"`.
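A minimal sketch, assuming the VS Code `servers` config shape for `.vscode/mcp.json` (double-check against your editor's MCP documentation):

```json
{
  "servers": {
    "enginy": {
      "type": "http",
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
```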
## Advanced setup details (Alpha)
Everything below is mainly for people configuring clients in depth, debugging a connection, or building their own MCP experience.
### What a client needs in order to work with Enginy (Alpha)
An MCP client can integrate with Enginy if it supports:
- remote Streamable HTTP MCP
- OAuth 2.0 Authorization Code + PKCE
- public-client token exchange with no client secret
- bearer access tokens and refresh tokens
### Advanced: discovery model (Alpha)
Start from the hosted MCP URL:
- the MCP host advertises protected-resource metadata
- the authorization server publishes OAuth authorization-server metadata
- the client can then follow the discovered authorization, token, revocation, and JWKS endpoints
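As an illustration of the first discovery step: if Enginy follows the standard RFC 9728 path-insertion convention (an assumption, since the exact metadata location is not shown here), the protected-resource metadata URL can be derived from the hosted MCP URL like this:

```python
from urllib.parse import urlsplit

def protected_resource_metadata_url(resource_url: str) -> str:
    # RFC 9728 convention: the well-known segment is inserted at the
    # host root, with the resource's own path appended as a suffix
    parts = urlsplit(resource_url)
    return (f"{parts.scheme}://{parts.netloc}"
            f"/.well-known/oauth-protected-resource{parts.path}")

print(protected_resource_metadata_url("https://openapi.enginy.ai/mcp"))
# https://openapi.enginy.ai/.well-known/oauth-protected-resource/mcp
```

From that metadata the client learns the authorization server, whose own metadata in turn lists the authorization, token, revocation, and JWKS endpoints.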
### Advanced: OAuth behavior (Alpha)
Enginy uses:
- Authorization Code grant with PKCE (`S256`)
- bearer access tokens
- rotating refresh tokens
- no client secret for the hosted remote-client flow
- access tokens are short-lived
- refresh tokens rotate on every successful refresh
- reusing a refresh token after rotation returns `invalid_grant`; an already-rotated active token chain is preserved
- token refresh is checked against the current workspace policy, not only the original grant
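To illustrate the `S256` challenge a client must produce, here is a minimal PKCE pair generator. This is a generic RFC 7636 sketch, not Enginy-specific code:

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # RFC 7636: code_verifier is high-entropy, URL-safe, unpadded base64
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # S256 method: challenge = BASE64URL(SHA-256(verifier)), no padding
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
print(len(verifier), len(challenge))  # 43 43
```

The client sends the challenge on the authorization request and the verifier on the token exchange; the server recomputes the hash and compares.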
### Advanced: registration options (Alpha)
Enginy supports two useful patterns for public clients:
1. Resource-server registration
Post your client metadata to the hosted MCP server’s `/register` endpoint.
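A sketch of a typical registration payload: the field names below are the standard RFC 7591 dynamic-registration ones, and the values (client name, redirect URI) are hypothetical placeholders, so adjust both to your client and confirm the exact fields Enginy expects:

```json
{
  "client_name": "my-mcp-client",
  "redirect_uris": ["http://localhost:3118/callback"],
  "grant_types": ["authorization_code", "refresh_token"],
  "token_endpoint_auth_method": "none"
}
```

`"token_endpoint_auth_method": "none"` matches the public-client, no-client-secret model described above.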
2. Admin-managed registration
If your workspace pins trusted client IDs, an Enginy admin can pre-register the client from the Enginy side so the client ID is persisted in the workspace allowlist. Use this when you want a tighter production trust model than “any client ID is acceptable.”

If your workspace leaves the allowed client-ID list empty, Enginy accepts any client ID that otherwise passes the redirect-URI and scope checks. If you do pin client IDs, the client must match that allowlist even when using dynamic registration.
### Advanced: transport expectations (Alpha)
Enginy’s MCP transport is intentionally stateless:
- no long-lived server-side MCP session is required
- requests are handled over Streamable HTTP with JSON responses enabled
- the same bearer token is forwarded into the matching OpenAPI route call, along with the tool’s `pathParams`, `query`, and `body` inputs
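Because the transport is stateless, each tool call can travel as one self-contained JSON-RPC message per HTTP POST. A rough sketch of such a message body (the tool name and argument keys are hypothetical, and real clients would use an MCP SDK rather than hand-rolled JSON):

```python
import json

def tool_call_request(tool: str, arguments: dict, req_id: int = 1) -> str:
    # One complete JSON-RPC 2.0 message per POST; no server-side MCP
    # session needs to survive between calls
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

body = tool_call_request("list_engines", {"query": {"limit": 10}})
print(body)
```

The bearer token rides along as an `Authorization` header on the same POST.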
### Advanced: tool-generation rules (Alpha)
This is important if you are building a client UX around tool discovery.
Enginy generates tools from the public OpenAPI schema:
- summaries become tool names when possible
- if names would collide, Enginy falls back to method-plus-path names
- only JSON request bodies are surfaced
- required scopes come from the OpenAPI route description metadata
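The naming rule above can be sketched as a toy function. This is illustrative only, not Enginy’s actual implementation; the sanitization details are assumptions:

```python
import re
from collections import Counter

def tool_names(operations):
    """operations: list of (method, path, summary) from an OpenAPI schema."""
    summary_counts = Counter(op[2] for op in operations if op[2])
    names = []
    for method, path, summary in operations:
        # summaries become tool names when unique; otherwise fall back
        # to a method-plus-path name to avoid collisions
        raw = summary if summary and summary_counts[summary] == 1 else f"{method} {path}"
        names.append(re.sub(r"[^A-Za-z0-9]+", "_", raw).strip("_").lower())
    return names

ops = [
    ("get", "/engines", "List engines"),
    ("post", "/engines", "List engines"),   # summary collision
    ("get", "/status", "Get status"),
]
print(tool_names(ops))  # ['get_engines', 'post_engines', 'get_status']
```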
### Advanced: compatibility aliases (Alpha)
Some MCP clients assume shorthand OAuth paths. Enginy supports both the canonical and shorthand forms:
- authorization: canonical `/api/v1/mcp/oauth/authorize`, alias `/authorize`
- token: canonical `/api/v1/mcp/oauth/token`, alias `/token`
- revocation: canonical `/api/v1/mcp/oauth/revoke`, alias `/revoke`
## Client support matrix (Alpha)

Compare support levels and setup shapes across Claude, Codex, Cursor, VS Code, ChatGPT, and generic MCP clients.
## FAQ / known limitations (Alpha)
See the current alpha constraints before you assume a client or workflow should work.