What this page covers (Alpha)
This page helps you choose the right AI interface for Enginy MCP.
If you want the simplest starting point, use Claude Code. If your team already prefers Codex, Cursor, VS Code, or ChatGPT, those paths may also work, but they are more dependent on the exact client version and setup.
If you need the architecture, start with Overview. If you need setup details, use Clients & interfaces. If you are debugging auth or transport issues, use Security and troubleshooting.
Support levels
- **Best choice** means this is the clearest current path for most teams.
- **Likely to work** means the client can often connect if it supports remote MCP, but the experience depends on that client's current MCP support.
- **Needs technical setup** means you will probably need a bridge, a proxy, or engineering help.
Support matrix
| Client / interface | Support level | Connection shape | Notes |
|---|---|---|---|
| Claude Code | Best choice | `claude mcp add --transport http --callback-port 3118 ...` | Best documented path for Enginy MCP today. |
| Other Claude interfaces with remote MCP support | Likely to work | Point the client at `https://openapi.enginy.ai/mcp` | Exact UI differs, but the same browser approval flow should apply. |
| Codex CLI / IDE | Likely to work | `codex mcp add ... --url https://openapi.enginy.ai/mcp` | Good option if your team already works in Codex. |
| Cursor | Likely to work | `.cursor/mcp.json` or `~/.cursor/mcp.json` | Good option if your team already uses Cursor MCP. |
| VS Code / GitHub Copilot Agent mode | Likely to work | `.vscode/mcp.json` with a remote HTTP server entry | Best suited to teams already using a VS Code MCP workflow. |
| ChatGPT developer mode | Likely to work | Add the hosted MCP server in developer settings | Useful for chat-style testing if your ChatGPT setup supports remote MCP. |
| Generic remote MCP clients | Needs technical setup | Use `https://openapi.enginy.ai/mcp` if the client supports remote MCP directly | If the client only speaks stdio or SSE, add a bridge or proxy first. |
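As a concrete illustration of the Claude Code row, the setup can be a single command. This is a sketch: the `enginy` server name is an arbitrary label chosen here, the `...` placeholder is carried over from the table, and the exact flags depend on your Claude Code version, so check `claude mcp add --help` before relying on it.

```shell
# Sketch only: register the remote Enginy MCP server with Claude Code.
# "enginy" is a hypothetical label; replace "..." per the table above.
claude mcp add --transport http --callback-port 3118 enginy https://openapi.enginy.ai/mcp ...
```

Running this should trigger the browser approval flow described below, with the callback arriving on port 3118.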
Practical notes
- Most successful setups today use remote MCP over HTTP and a browser sign-in flow.
- Many browser-based flows use a local callback like `http://localhost:3118/callback`.
- If a client only supports local stdio connections, it is not a direct fit for Enginy MCP.
- If you are choosing from scratch, Claude Code is usually the fastest way to get started.
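For stdio-only clients, the usual workaround is to run a local stdio-to-HTTP bridge as the client's "server command". The sketch below uses the community `mcp-remote` package as one example of such a bridge; this is an assumption, not an Enginy-specific recommendation, and any equivalent stdio-to-remote proxy should work the same way.

```shell
# Sketch: run a local stdio bridge that forwards MCP traffic to the
# remote HTTP endpoint. `mcp-remote` is a community package chosen here
# for illustration; substitute your preferred bridge.
npx -y mcp-remote https://openapi.enginy.ai/mcp
```

You would then point the stdio-only client at this command instead of at the URL directly.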
Best fit by workflow
- Use Claude Code when you want the clearest one-command setup and the most explicit browser callback flow.
- Use Codex when your team already works in Codex and wants MCP in that environment.
- Use Cursor or VS Code when your team already uses project-level `mcp.json` files.
- Use ChatGPT developer mode when you want to try the same MCP server in a chat-style interface.
- Use a bridge only when your preferred client cannot connect to remote MCP directly.
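For the Cursor and VS Code paths, the project-level config is a small JSON file. The sketch below shows a plausible `.cursor/mcp.json` entry: the `enginy` key is an arbitrary label, and the exact field names follow common MCP client conventions, so confirm them against your client's current documentation.

```json
{
  "mcpServers": {
    "enginy": {
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
```

A `.vscode/mcp.json` entry is similar in spirit, though VS Code uses its own schema for remote HTTP servers.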
Clients & interfaces (Alpha)
See copy-paste config examples for Claude, Codex, Cursor, VS Code, ChatGPT developer mode, and generic MCP clients.
Security and troubleshooting (Alpha)
Review the auth flow, callback handling, and common failure modes.