How to Use CodivUpload's MCP Server with Claude or ChatGPT
Model Context Protocol (MCP) is the emerging standard that lets AI agents call external tools natively — no custom integration code, no per-tool API wrappers. Claude Desktop, Cursor, Zed, and (with the Connectors feature) ChatGPT Pro all speak MCP, which means an MCP server you point them at instantly becomes available as a tool palette inside any conversation. CodivUpload ships an MCP server that auto-generates tool definitions from the OpenAPI spec — every endpoint at api.codivupload.com is exposed as an MCP tool the LLM can invoke directly. Once configured, your AI agent can post to TikTok, schedule a YouTube Short, or pull engagement data without you writing a single line of integration code.
Prerequisites
- Claude Desktop, ChatGPT Pro, Cursor, or any MCP-compatible client
- A CodivUpload API key from Dashboard → API Keys
- Node.js 18+ on the machine running the MCP client
- A CodivUpload profile with social accounts connected
Step-by-step
1. Install the CodivUpload MCP server
It's a single npx command — no global install, no registry config, nothing to clone. The server is published as the codivupload-mcp npm package and runs as a local stdio process the MCP client launches on demand.
```shell
npx codivupload-mcp --help
```

2. Add it to Claude Desktop's config
Open Claude Desktop's config file at Settings → Developer → Edit Config. The file is claude_desktop_config.json (in ~/Library/Application Support/Claude/ on macOS, %APPDATA%\Claude\ on Windows). Add a new entry under mcpServers. The CODIVUPLOAD_API_KEY env var is how the MCP server authenticates — set it once here and every Claude conversation has access to your CodivUpload workspace.
```json
{
  "mcpServers": {
    "codivupload": {
      "command": "npx",
      "args": ["-y", "codivupload-mcp"],
      "env": { "CODIVUPLOAD_API_KEY": "YOUR_API_KEY" }
    }
  }
}
```

3. Restart Claude Desktop
Claude picks up new MCP servers on launch — there's no hot reload. After restart, you'll see codivupload tools in the tool list at the top of any conversation, alongside any other MCP servers you have configured. The list updates automatically if CodivUpload ships new endpoints; the MCP server reads the live OpenAPI spec at startup, so you don't need to update the npm package to gain new tools.
4. Try a prompt
"Schedule a TikTok video and an Instagram Reel for tomorrow at 10am Eastern. Use my profile 'my_brand', the video at https://cdn.example.com/clip.mp4, and a caption that hints at productivity." Claude reads your prompt, picks the create_post tool, fills in profile_name, platforms, scheduled_date, and description, and shows you the planned tool call before executing. Approve it and the post goes live in CodivUpload's queue.
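For reference, the approval preview for that prompt would look roughly like this. The argument names profile_name, platforms, scheduled_date, and description come from the step above; video_url, the date shown, and the exact envelope shape are illustrative assumptions, not a documented wire format:

```json
{
  "tool": "create_post",
  "arguments": {
    "profile_name": "my_brand",
    "platforms": ["tiktok", "instagram"],
    "video_url": "https://cdn.example.com/clip.mp4",
    "description": "30 seconds that will restructure your morning routine.",
    "scheduled_date": "2026-02-03T10:00:00-05:00"
  }
}
```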
5. Use with ChatGPT, Cursor, or any other MCP client
The same config pattern works in every MCP-compatible client — only the config file path differs. ChatGPT Pro: Settings → Connectors → Add MCP server (paste the same JSON). Cursor: ~/.cursor/mcp.json with the same shape. Zed: settings.json's context_servers field. The CodivUpload MCP server is client-agnostic; it speaks the standard MCP stdio protocol.
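For instance, Cursor's ~/.cursor/mcp.json takes the identical block you added to Claude Desktop:

```json
{
  "mcpServers": {
    "codivupload": {
      "command": "npx",
      "args": ["-y", "codivupload-mcp"],
      "env": { "CODIVUPLOAD_API_KEY": "YOUR_API_KEY" }
    }
  }
}
```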
Frequently asked
What tools does the MCP server expose?
Every public CodivUpload endpoint as a separate tool: create_post, get_post, list_posts, retry_failed, create_profile, list_profiles, get_analytics, create_webhook, and several dozen more. The list auto-updates as CodivUpload ships new endpoints because the MCP server reads the live OpenAPI spec at startup.
Is my API key sent to Anthropic or OpenAI?
No. The MCP server runs locally on your machine — Claude or ChatGPT only sees the tool's name, the arguments your prompt produced, and the JSON return value. The CODIVUPLOAD_API_KEY env var stays inside the local process; the LLM never sees credentials.
Can I run the MCP server in production for an autonomous agent?
Yes — it's stateless and stdio-based. Run it as a sidecar in your Docker stack and point your agentic LLM clients at it. For multi-tenant production use, generate a separate API key per tenant and inject it via env var per process.
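Because the transport is stdio, the agent spawns the server as a child process inside its own container rather than connecting to it over the network. A minimal sketch, assuming a Node-based agent (agent.js and the directory layout are hypothetical):

```dockerfile
# Agent image that launches the CodivUpload MCP server over stdio
FROM node:18-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
# agent.js spawns `npx -y codivupload-mcp` itself, per the MCP stdio transport.
# Inject each tenant's key at run time rather than baking it into the image:
#   docker run -e CODIVUPLOAD_API_KEY=sk_tenant_a my-agent
CMD ["node", "agent.js"]
```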
Does the MCP server consume CodivUpload quota?
MCP tool calls are normal API calls — they count toward whatever rate limits apply to your API key. Free plan: 30 posts/month. Paid plans: per the plan's limit. The MCP server itself has no overhead beyond the underlying API call.
Can I restrict which tools the MCP server exposes?
Yes. Pass --tools followed by a comma-separated allowlist of tool names:

```shell
npx codivupload-mcp --tools=create_post,get_post,list_posts
```

In the Claude Desktop config, append the flag to the args array. Useful when you want to give a Claude session only the read-only tools and keep destructive operations off the table.
Is the MCP server open source?
The npm package is published with full source visibility — you can inspect every byte of what runs on your machine. The server itself is intentionally simple: ~200 lines of TypeScript that read the OpenAPI spec and emit MCP tool descriptors.
What if I want my AI to post on a profile that doesn't exist yet?
The MCP server exposes create_profile and connect_account too. Your AI can chain create_profile → connect_account → create_post in a single conversation — though connect_account opens an OAuth window the user has to approve manually. There's no fully-headless way to add a new social account; OAuth always requires human consent.
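Sketched as a tool-call sequence — the argument names beyond those documented above are assumptions, and the middle step pauses while the human approves the OAuth window:

```json
[
  { "tool": "create_profile",  "arguments": { "profile_name": "new_brand" } },
  { "tool": "connect_account", "arguments": { "profile_name": "new_brand", "platform": "tiktok" } },
  { "tool": "create_post",     "arguments": { "profile_name": "new_brand", "platforms": ["tiktok"],
                                              "description": "First post from the new profile" } }
]
```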
Related guides
How to Automate Cross-Posting with Python
Cross-post to TikTok, Instagram, YouTube, X, LinkedIn, and 4 more from a single Python script using the CodivUpload SDK. Production-ready code with error handling.
How to Automate Social Media with n8n
Build n8n workflows that publish to 11 social platforms via CodivUpload. Triggers, RSS-to-social, Airtable-to-social, AI-generated captions, approval flows.
How to Schedule Instagram Posts via API
Schedule Instagram Reels, carousels, and feed posts via REST API using the CodivUpload profile + platforms model. Free tier, no Meta app review. Code samples in cURL, Python, JavaScript.
Ready to automate?
Free plan includes 30 posts/month across 11 platforms. No credit card required.
See pricing