Alpha: Enginy MCP is in active development. Use the client examples here as current patterns, not as a long-term stability guarantee.
If you are in sales ops, rev ops, or another non-technical role, start with Claude Code. The other interfaces are most useful when your team already has a preferred AI tool and knows it supports remote MCP.

Quickstarts by interface Alpha

These are the main interface patterns you are likely to use with Enginy MCP today.
Interface · How to connect Enginy MCP (Alpha) · What to expect:
  • Claude Code: run claude mcp add --transport http --callback-port 3118 enginy https://openapi.enginy.ai/mcp. Expect browser-based approval, then tool use inside Claude Code.
  • Other Claude interfaces with remote MCP support: add the same hosted MCP URL in that interface’s MCP settings. Similar approval flow, but the UI depends on that Claude interface.
  • Codex CLI / IDE: run codex mcp add enginy --url https://openapi.enginy.ai/mcp or add the server in ~/.codex/config.toml. Likely to work if your Codex setup supports remote MCP.
  • Cursor: add an enginy entry to .cursor/mcp.json or ~/.cursor/mcp.json. Likely to work if your Cursor setup supports remote MCP.
  • VS Code / GitHub Copilot Agent mode: add the server to .vscode/mcp.json with type: "http". Likely to work in VS Code-based MCP workflows.
  • ChatGPT developer mode: add Enginy as a remote MCP app/server in settings. Useful for chat-style testing if your ChatGPT setup supports remote MCP.
  • Other remote-MCP clients: add https://openapi.enginy.ai/mcp and follow the client’s remote MCP instructions. Usually needs technical help unless the client already supports remote MCP well.

Claude interfaces Alpha

Claude Code

Use the hosted server directly:
claude mcp add --transport http --callback-port 3118 enginy https://openapi.enginy.ai/mcp
Then open /mcp in Claude Code and complete the browser approval flow.

Other Claude interfaces

If a Claude surface supports remote MCP directly, use the same hosted URL:
https://openapi.enginy.ai/mcp
The exact UI differs by interface, but the Enginy side stays the same:
  • the workspace must allow the redirect URI
  • the client completes OAuth in the browser
  • the final tool set depends on the granted scopes

Codex / OpenAI interfaces Alpha

Codex CLI or IDE extension

If your team uses Codex and your version supports remote MCP, add Enginy with:
codex mcp add enginy --url https://openapi.enginy.ai/mcp
Then start the OAuth flow:
codex mcp login enginy
Or configure it directly in ~/.codex/config.toml:
[mcp_servers.enginy]
url = "https://openapi.enginy.ai/mcp"
With either setup method, run codex mcp login enginy afterward and complete the Enginy browser approval flow when prompted.

ChatGPT developer mode

If your ChatGPT environment supports remote MCP apps/connectors, add the Enginy MCP server there and complete the same browser approval flow. This is useful when you want to test Enginy tools in a chat-style interface instead of in an IDE or terminal agent.

Cursor Alpha

If your Cursor setup supports remote MCP, it can read hosted MCP servers from mcp.json. Project-level example in .cursor/mcp.json:
{
  "mcpServers": {
    "enginy": {
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
You can also place the same entry in ~/.cursor/mcp.json for a global setup. After restarting Cursor or refreshing MCP servers, let Cursor open the browser flow and approve Enginy.

VS Code and similar agent UIs Alpha

For editors that use the VS Code MCP config shape, add Enginy to .vscode/mcp.json:
{
  "servers": {
    "enginy": {
      "type": "http",
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
This pattern is useful for GitHub Copilot Agent mode and other VS Code-based clients that support remote MCP from project configuration.

Advanced setup details Alpha

Everything below is mainly for people configuring clients in depth, debugging a connection, or building their own MCP experience.

What a client needs in order to work with Enginy Alpha

An MCP client can integrate with Enginy if it supports:
  • remote Streamable HTTP MCP
  • OAuth 2.0 Authorization Code + PKCE
  • public-client token exchange with no client secret
  • bearer access tokens and refresh tokens
If your client only supports stdio or SSE, place a bridge in front of Enginy MCP and let that bridge handle the remote HTTP transport.
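One commonly used bridge is the community mcp-remote npm package, which presents a remote Streamable HTTP server to stdio-only clients. A minimal sketch, assuming your client reads a Claude-Desktop-style mcpServers config and that npx can fetch mcp-remote in your environment:

```json
{
  "mcpServers": {
    "enginy": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://openapi.enginy.ai/mcp"]
    }
  }
}
```

The bridge process handles the HTTP transport and the OAuth browser flow, while the client only sees a local stdio server.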

Advanced: discovery model Alpha

Start from the hosted MCP URL:
https://openapi.enginy.ai/mcp
Clients should use discovery instead of hardcoding OAuth endpoints:
  • the MCP host advertises protected-resource metadata
  • the authorization server publishes OAuth authorization-server metadata
  • the client can then follow the discovered authorization, token, revocation, and JWKS endpoints
Enginy also exposes standard authorization-server metadata at:
/.well-known/oauth-authorization-server
/.well-known/oauth-authorization-server/api/v1/mcp/oauth
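The path-based metadata convention above can be sketched as a small helper. This is illustrative client-side logic, not an Enginy API; it follows the usual RFC 8414 / RFC 9728 rule of inserting the well-known segment at the host root and appending the issuer or resource path after it:

```python
from urllib.parse import urlsplit


def well_known_url(base_url: str, kind: str) -> str:
    """Build a well-known metadata URL from an issuer or resource URL.

    The well-known segment goes at the host root, with the original
    path (if any) appended after it. This is why a path-based issuer
    like /api/v1/mcp/oauth yields
    /.well-known/oauth-authorization-server/api/v1/mcp/oauth.
    """
    parts = urlsplit(base_url)
    path = parts.path.rstrip("/")
    return f"{parts.scheme}://{parts.netloc}/.well-known/{kind}{path}"


# Protected-resource metadata advertised for the MCP host:
resource_meta = well_known_url("https://openapi.enginy.ai/mcp",
                               "oauth-protected-resource")
# Authorization-server metadata for the path-based issuer:
as_meta = well_known_url("https://openapi.enginy.ai/api/v1/mcp/oauth",
                         "oauth-authorization-server")
```

A client that starts from the hosted MCP URL and follows these two documents never needs hardcoded authorize, token, revocation, or JWKS endpoints.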

Advanced: OAuth behavior Alpha

Enginy uses:
  • Authorization Code grant with PKCE (S256)
  • bearer access tokens
  • rotating refresh tokens
  • no client secret for the hosted remote-client flow
Important runtime behavior:
  • access tokens are short-lived
  • refresh tokens rotate on every successful refresh
  • reusing a refresh token after rotation returns invalid_grant, while the already-rotated, still-active token chain is preserved
  • token refresh is checked against the current workspace policy, not only the original grant
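Because the hosted flow is a public client with no secret, possession is proven with PKCE. A minimal sketch of the standard RFC 7636 S256 construction (generic OAuth client code, not Enginy-specific):

```python
import base64
import hashlib
import secrets


def make_verifier() -> str:
    """Random high-entropy code_verifier: base64url of 32 bytes, no padding."""
    return base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()


def s256_challenge(verifier: str) -> str:
    """S256 code_challenge: base64url(SHA-256(verifier)) without padding."""
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()


verifier = make_verifier()
challenge = s256_challenge(verifier)
# The client sends challenge + code_challenge_method=S256 on the authorize
# request, then proves possession by sending the original verifier on the
# token exchange.
```

The verifier stays local to the client for the entire flow; only the challenge appears in the browser redirect.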

Advanced: registration options Alpha

Enginy supports two useful patterns for public clients:

1. Resource-server registration

Post to the hosted MCP server’s /register endpoint with:
{
  "client_name": "my-mcp-client",
  "redirect_uris": ["http://localhost:3118/callback"]
}
That returns a signed dynamic client ID tied to the redirect URIs you supplied. This works well for clients that expect dynamic client registration near the MCP resource server.
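The registration call above can be sketched with the standard library. The client name and redirect URI are illustrative, and the exact location of /register (relative to the MCP URL, as assumed here, or published in discovery metadata) should be taken from your client's discovery step:

```python
import json
import urllib.request


def build_register_request(mcp_base: str, client_name: str,
                           redirect_uris: list[str]) -> urllib.request.Request:
    """Build a dynamic-client-registration POST for the server's /register."""
    payload = json.dumps({
        "client_name": client_name,
        "redirect_uris": redirect_uris,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{mcp_base.rstrip('/')}/register",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_register_request("https://openapi.enginy.ai/mcp",
                             "my-mcp-client",
                             ["http://localhost:3118/callback"])
# urllib.request.urlopen(req) would return the registration response,
# including the dynamic client ID tied to the supplied redirect URIs.
```

Any authorize request made later with that dynamic client ID must use one of the redirect URIs registered here.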

2. Admin-managed registration

If your workspace pins trusted client IDs, an Enginy admin can pre-register the client from the Enginy side so the client ID is persisted in the workspace allowlist. Use this when you want a tighter production trust model than “any client ID is acceptable.”
If your workspace leaves the allowed client-ID list empty, Enginy accepts any client ID that otherwise passes the redirect-URI and scope checks. If you do pin client IDs, the client must match that allowlist even when using dynamic registration.

Advanced: transport expectations Alpha

Enginy’s MCP transport is intentionally stateless:
  • no long-lived server-side MCP session is required
  • requests are handled over Streamable HTTP with JSON responses enabled
  • the same bearer token is forwarded into the matching OpenAPI route call
For OpenAPI-derived tools, the input contract is normalized into these keys:
  • pathParams
  • query
  • body
Example:
{
  "pathParams": {
    "contactId": "123"
  },
  "query": {
    "limit": 10
  },
  "body": {
    "status": "ACTIVE"
  }
}
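A useful mental model is that the server expands these three keys into the matching OpenAPI route call. A client-side sketch of that expansion, using a hypothetical /contacts/{contactId} route for illustration:

```python
from urllib.parse import urlencode


def expand_call(base: str, method: str, path_template: str, args: dict):
    """Expand normalized pathParams/query/body keys into an HTTP call shape."""
    path = path_template
    for name, value in args.get("pathParams", {}).items():
        path = path.replace("{" + name + "}", str(value))
    url = base.rstrip("/") + path
    query = args.get("query", {})
    if query:
        url += "?" + urlencode(query)
    return method, url, args.get("body")


# Hypothetical route, matching the example payload above:
call = expand_call(
    "https://openapi.enginy.ai",
    "PATCH",
    "/contacts/{contactId}",
    {"pathParams": {"contactId": "123"},
     "query": {"limit": 10},
     "body": {"status": "ACTIVE"}},
)
```

The same bearer token that authenticated the MCP request is forwarded on the resulting route call, so scopes behave identically in both places.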

Advanced: tool-generation rules Alpha

This is important if you are building a client UX around tool discovery. Enginy generates tools from the public OpenAPI schema:
  • summaries become tool names when possible
  • if names would collide, Enginy falls back to method-plus-path names
  • only JSON request bodies are surfaced
  • required scopes come from the OpenAPI route description metadata
This means the OpenAPI docs are the source of truth for the MCP surface area. As Enginy adds public routes, the MCP server can expose more tools without inventing a second contract format.
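The summary-first naming rule with a method-plus-path fallback can be sketched as a pure function. The routes and summaries below are hypothetical, and this is a sketch of the rule as described, not Enginy's exact algorithm:

```python
import re


def tool_names(routes: list[dict]) -> list[str]:
    """Prefer slugged OpenAPI summaries as tool names; on collision (or a
    missing summary), fall back to method-plus-path names."""
    def slug(text: str) -> str:
        return "_".join(p for p in re.split(r"[^a-z0-9]+", text.lower()) if p)

    preferred = [slug(r["summary"]) if r.get("summary") else None for r in routes]
    names = []
    for route, name in zip(routes, preferred):
        fallback = slug(f"{route['method']} {route['path']}")
        # Colliding summaries would be ambiguous, so every collider falls back.
        names.append(fallback if name is None or preferred.count(name) > 1 else name)
    return names


names = tool_names([
    {"method": "GET", "path": "/contacts", "summary": "List contacts"},
    {"method": "GET", "path": "/v2/contacts", "summary": "List contacts"},
    {"method": "POST", "path": "/contacts", "summary": "Create contact"},
])
```

Keeping the naming deterministic over the OpenAPI schema is what lets the docs remain the single source of truth for the MCP surface area.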

Advanced: compatibility aliases Alpha

Some MCP clients assume shorthand OAuth paths. Enginy supports both the canonical and shorthand forms:
  • authorization: canonical /api/v1/mcp/oauth/authorize, alias /authorize
  • token: canonical /api/v1/mcp/oauth/token, alias /token
  • revocation: canonical /api/v1/mcp/oauth/revoke, alias /revoke
You should still prefer discovery and canonical metadata, but these aliases improve client compatibility when a tool makes stricter assumptions.

Client support matrix (Alpha)

Compare support levels and setup shapes across Claude, Codex, Cursor, VS Code, ChatGPT, and generic MCP clients.

FAQ / known limitations (Alpha)

See the current alpha constraints before you assume a client or workflow should work.