Alpha: client support can change as Enginy MCP evolves. Treat this page as a current compatibility snapshot, not a long-term guarantee.

What this page covers

This page helps you choose the right AI interface for Enginy MCP. If you want the simplest starting point, use Claude Code. If your team already prefers Codex, Cursor, VS Code, or ChatGPT, those paths may also work, but they are more dependent on the exact client version and setup. If you need the architecture, start with Overview. If you need setup details, use Clients & interfaces. If you are debugging auth or transport issues, use Security and troubleshooting.

Support levels

  • Best choice means this is the clearest current path for most teams.
  • Likely to work means the client can often connect if it supports remote MCP, but the experience depends on that client’s current MCP support.
  • Needs technical setup means you will probably need a bridge, a proxy, or help from an engineer.

Support matrix

| Client / interface | Support level | Connection shape | Notes |
| --- | --- | --- | --- |
| Claude Code | Best choice | `claude mcp add --transport http --callback-port 3118 ...` | Best documented path for Enginy MCP today. |
| Other Claude interfaces with remote MCP support | Likely to work | Point the client at https://openapi.enginy.ai/mcp | Exact UI differs, but the same browser approval flow should apply. |
| Codex CLI / IDE | Likely to work | `codex mcp add ... --url https://openapi.enginy.ai/mcp` | Good option if your team already works in Codex. |
| Cursor | Likely to work | `.cursor/mcp.json` or `~/.cursor/mcp.json` | Good option if your team already uses Cursor MCP. |
| VS Code / GitHub Copilot Agent mode | Likely to work | `.vscode/mcp.json` with a remote HTTP server entry | Best suited to teams already using a VS Code MCP workflow. |
| ChatGPT developer mode | Likely to work | Add the hosted MCP server in developer settings | Useful for chat-style testing if your ChatGPT setup supports remote MCP. |
| Generic remote MCP clients | Needs technical setup | Use https://openapi.enginy.ai/mcp if the client supports remote MCP directly | If the client only speaks stdio or SSE, add a bridge or proxy first. |
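As a concrete illustration of the Cursor row above, a minimal `~/.cursor/mcp.json` entry for a remote HTTP server might look like the sketch below. The server name `enginy` is illustrative, and the exact schema depends on your Cursor version, so check Cursor's current MCP documentation before relying on it:

```json
{
  "mcpServers": {
    "enginy": {
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
```

A project-level `.cursor/mcp.json` with the same shape scopes the server to a single repository instead of your whole machine.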

Practical notes

  • Most successful setups today use remote MCP over HTTP and a browser sign-in flow.
  • Many browser-based flows use a local callback like http://localhost:3118/callback.
  • If a client only supports local stdio connections, it is not a direct fit for Enginy MCP.
  • If you are choosing from scratch, Claude Code is usually the fastest way to get started.
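If your client only supports local stdio connections, one common workaround is to run a stdio-to-remote bridge as the server command. The sketch below assumes the community `mcp-remote` npm package as the bridge; the package name, the `mcpServers` schema, and the `enginy` server name are all assumptions to verify against your client's documentation:

```json
{
  "mcpServers": {
    "enginy": {
      "command": "npx",
      "args": ["mcp-remote", "https://openapi.enginy.ai/mcp"]
    }
  }
}
```

The client launches the bridge as a local stdio process, and the bridge forwards traffic to the hosted endpoint, including the browser sign-in flow where supported.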

Best fit by workflow

  • Use Claude Code when you want the clearest one-command setup and the most explicit browser callback flow.
  • Use Codex when your team already works in Codex and wants MCP in that environment.
  • Use Cursor or VS Code when your team already uses project-level mcp.json files.
  • Use ChatGPT developer mode when you want to try the same MCP server in a chat-style interface.
  • Use a bridge only when your preferred client cannot connect to remote MCP directly.
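For teams using project-level config files, a `.vscode/mcp.json` entry for a remote HTTP server might look like this sketch. It assumes VS Code's current `servers` schema with a `type` of `http`; the server name `enginy` is illustrative, and the schema may change as VS Code's MCP support evolves:

```json
{
  "servers": {
    "enginy": {
      "type": "http",
      "url": "https://openapi.enginy.ai/mcp"
    }
  }
}
```

Committing this file to the repository lets everyone on the team pick up the same server entry when they open the project.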

Clients & interfaces (Alpha)

See copy-paste config examples for Claude, Codex, Cursor, VS Code, ChatGPT developer mode, and generic MCP clients.

Security and troubleshooting (Alpha)

Review the auth flow, callback handling, and common failure modes.