Getting Started

OptiPrompt Setup & Configuration

Prerequisites

Before you begin, make sure you have the following installed and configured:

  • Node.js 18+
  • Git (needed to clone the repository in Step 1; alternatively, download the repository as a ZIP from GitHub)
  • An Anthropic API key (only if you wish to connect OptiPrompt with Claude Desktop)
  • A terminal (macOS, Linux, or Windows PowerShell)

The OptiPrompt code lives in the public repository: https://github.com/AntonReise/OptiPrompt

1. Download & Install OptiPrompt

Step 1 – Clone the repository

git clone https://github.com/AntonReise/OptiPrompt.git
cd OptiPrompt

Step 2 – Install dependencies

npm install

Step 3 – Configure the Anthropic API key (Claude Desktop only)

Note: This step is only required if you plan to connect OptiPrompt with Claude Desktop. If you're only using Cursor, you can skip this step.

Create a .env file in the project root with:

ANTHROPIC_API_KEY=YOUR_ANTHROPIC_API_KEY_HERE

Replace YOUR_ANTHROPIC_API_KEY_HERE with the key from the Anthropic dashboard.

This variable is loaded via dotenv.config() in src/index.ts, and the MCP server will refuse to run without it when connecting to Claude Desktop.
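As a rough sketch, the startup check reduces to something like the following (the helper name is ours, not the repo's; dotenv.config() populates process.env from the .env file, and here we read it directly):

```typescript
// Sketch of the API-key check performed at startup (helper name is hypothetical).
// dotenv.config() has already loaded .env into process.env at this point.
function requireApiKey(): string {
  const key = process.env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error(
      "ANTHROPIC_API_KEY is not set. Add it to .env or your environment before starting the server."
    );
  }
  return key;
}
```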

Step 4 – Build the TypeScript project

npm run build

This compiles src/index.ts to build/index.js as configured in tsconfig.json:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "Node16",
    "moduleResolution": "Node16",
    "outDir": "./build",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}

Step 5 – Start the MCP server

node ./build/index.js

This starts the OptiPrompt MCP server on stdio using the McpServer and StdioServerTransport from @modelcontextprotocol/sdk.

Optionally, because of the bin field in package.json, advanced users can install it globally (npm install -g .) and then run optiprompt directly.
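For reference, the relevant package.json wiring looks roughly like this (a sketch; check the repo's actual package.json for the exact fields and values):

```json
{
  "name": "optiprompt",
  "version": "1.0.0",
  "type": "module",
  "bin": {
    "optiprompt": "build/index.js"
  },
  "scripts": {
    "build": "tsc"
  }
}
```

Note that for the global optiprompt command to work on macOS/Linux, the compiled build/index.js must begin with a `#!/usr/bin/env node` shebang line.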

What OptiPrompt Provides

The OptiPrompt MCP server exposes the following:

  • MCP server name: "optiprompt"
  • Version: "1.0.0"
  • Single tool registered: "optimize-prompt"
  • Tool description: "Optimizes a user's prompt by refining it for clarity and effectiveness."
  • Input schema: one prompt string (validated with Zod)

Under the hood, it calls anthropic.messages.create with the model claude-3-haiku-20240307 and a long system prompt (OPTIMIZATION_SYSTEM_PROMPT) that combines five expert roles: Clarity, Code Quality, Security, Performance, and Test Coverage. The response is a structured answer with sections such as ## Clarity Refinement and ## Final Optimized Prompt.
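In outline, the handler assembles an Anthropic Messages API request like the following. This is a sketch: the helper name, the max_tokens value, and the abbreviated system prompt are our stand-ins, and the real OPTIMIZATION_SYSTEM_PROMPT in src/index.ts is far longer.

```typescript
// Abbreviated stand-in for the repo's OPTIMIZATION_SYSTEM_PROMPT (the real one is much longer).
const OPTIMIZATION_SYSTEM_PROMPT =
  "You combine five expert roles: Clarity, Code Quality, Security, Performance, Test Coverage. " +
  "Return structured sections, ending with ## Final Optimized Prompt.";

// Sketch of the payload the handler would pass to anthropic.messages.create
// (helper name and max_tokens are illustrative assumptions).
function buildOptimizationRequest(prompt: string) {
  return {
    model: "claude-3-haiku-20240307",
    max_tokens: 1024,
    system: OPTIMIZATION_SYSTEM_PROMPT,
    messages: [{ role: "user" as const, content: prompt }],
  };
}
```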

2. Connect OptiPrompt to Claude Desktop

Note: This section is for connecting OptiPrompt to Claude Desktop. If you're using Cursor, skip to the next section.

Claude Desktop uses a claude_desktop_config.json file to register MCP servers (on macOS it lives at ~/Library/Application Support/Claude/claude_desktop_config.json; on Windows, %APPDATA%\Claude\claude_desktop_config.json). Create or edit this file and add the following snippet:

{
  "mcpServers": {
    "optiprompt": {
      "command": "node",
      "args": ["ABSOLUTE_OR_RELATIVE_PATH_TO/OptiPrompt/build/index.js"],
      "env": {
        "ANTHROPIC_API_KEY": "YOUR_ANTHROPIC_API_KEY_HERE"
      }
    }
  }
}
  • Replace ABSOLUTE_OR_RELATIVE_PATH_TO/OptiPrompt/build/index.js with the actual path on your machine (for example, /Users/you/dev/OptiPrompt/build/index.js on macOS/Linux or C:\\Users\\you\\dev\\OptiPrompt\\build\\index.js on Windows).
  • Ensure the path points to the compiled file in build/index.js, not the TypeScript source.
  • Either set the API key here in env or rely on the .env file and system environment variables (but do not hard-code real secrets in shared configs).

After restarting Claude Desktop, the optiprompt server and its optimize-prompt tool should appear in the tools list.


3. Connect OptiPrompt to Cursor

Note: This section is for connecting OptiPrompt to Cursor. The Anthropic API key is not required for Cursor integration.

Cursor supports MCP servers via a JSON config file (for example .cursor/mcp.json in the workspace, or ~/.cursor/mcp.json for a global setup). Add the following snippet:

{
  "mcpServers": {
    "optiprompt": {
      "command": "node",
      "args": ["../OptiPrompt/build/index.js"],
      "env": {
        "ANTHROPIC_API_KEY": "YOUR_ANTHROPIC_API_KEY_HERE"
      }
    }
  }
}
  • The "args" path (../OptiPrompt/build/index.js) is just an example assuming the user's project and the OptiPrompt repo are sibling folders. Adjust this according to your folder structure.
  • The ANTHROPIC_API_KEY in the config is optional for Cursor. You can omit the env section entirely if you're only using Cursor.

To verify that Cursor has detected the MCP server, open the tools panel and look for optiprompt / optimize-prompt.

4. Global LLM Rule (System Message)

This is an optional but recommended text you can paste into the "system prompt" / "global instructions" area of Claude, Cursor, or other LLM tools:

When you need multi-step reasoning, complex planning, or careful prompt design, always call the MCP tool named "optimize-prompt" from the server "optiprompt" (also known as the Prompt Optimizer) before answering.

1. Send the user's raw request to "optimize-prompt".
2. Take the "Final Optimized Prompt" section from the tool's response.
3. Use that Final Optimized Prompt as your main instruction to complete the task.
4. Do not show the optimization internals to the user unless they explicitly ask.

Always follow this workflow whenever it would improve reasoning quality.

You can tweak the wording, but the idea is: use OptiPrompt first, then answer with the optimized prompt.

5. Example: Using OptiPrompt in Practice

Here's a simple workflow showing how OptiPrompt works:

  1. User types a messy prompt asking for a complex coding task.
  2. LLM calls the optimize-prompt tool of the optiprompt server.
  3. The tool returns the six structured sections, including ## Final Optimized Prompt.
  4. The LLM then uses that optimized prompt to produce the final answer.

Note: This workflow happens automatically once OptiPrompt is configured. The LLM will use the optimized prompt behind the scenes to provide better responses.
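If you ever script against the tool yourself, the ## Final Optimized Prompt section can be pulled out of the response with a few lines of string handling. A minimal sketch, assuming the section headings described above:

```typescript
// Extract the "## Final Optimized Prompt" section from the tool's structured response.
// Returns null if the section is not present.
function extractFinalOptimizedPrompt(response: string): string | null {
  const marker = "## Final Optimized Prompt";
  const start = response.indexOf(marker);
  if (start === -1) return null;
  const body = response.slice(start + marker.length);
  const next = body.indexOf("\n## "); // stop at the next section heading, if any
  return (next === -1 ? body : body.slice(0, next)).trim();
}
```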

Ready to Get Started?

Follow the steps above to set up OptiPrompt and start optimizing your prompts. If you run into any issues, open an issue on the GitHub repository.