AI Assistant Integration

Point any AI coding assistant — Claude Code, Claude Workspaces, Cursor, ChatGPT, Windsurf — at FlyMy.AI's docs and it will know how to create agents and run models, and where to grab the API key. No MCP server to install, no Python to run — just a single static file at:

https://docs.flymy.ai/llms.txt

This file follows the llms.txt standard: a short Markdown index of every documentation page with a one-line description, plus the two API base URLs and where to get a key. The assistant fetches it once, picks the relevant page links, and fetches those individually as needed.
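That flow needs nothing beyond an HTTP GET and a link scan. As a minimal sketch of what an assistant does under the hood, here it is in plain Python using only the standard library (the regex assumes the standard llms.txt Markdown link format, and the page choice is naive; a real assistant picks by relevance):

```python
import re
import urllib.request

INDEX_URL = "https://docs.flymy.ai/llms.txt"

def fetch(url: str) -> str:
    """Download a URL as text -- the only capability an assistant needs."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

# Step 1: fetch the index once.
index = fetch(INDEX_URL)

# Step 2: collect the per-page links; llms.txt lists them as standard
# Markdown links of the form "- [Title](url): one-line description".
links = re.findall(r"\[([^\]]+)\]\((https?://[^)\s]+)\)", index)
for title, url in links:
    print(f"{title} -> {url}")

# Step 3: fetch only the pages relevant to the current question
# (naively the first link here; an assistant would pick by relevance).
if links:
    print(fetch(links[0][1])[:300])
```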

Claude Code

Use the built-in WebFetch tool — no MCP plumbing:

fetch https://docs.flymy.ai/llms.txt

Now Claude Code has the doc map cached. Any follow-up question ("show me how to call nano-banana", "how do I freeze an agent and re-run with new inputs") makes it fetch the right linked page on its own.

For a project-wide setup, drop this snippet into your CLAUDE.md:

## FlyMy.AI documentation

When asked about FlyMy.AI agents, models, MCP, or the `flymyai` Python SDK,
start by fetching the docs index at https://docs.flymy.ai/llms.txt and
follow the linked pages for specifics. Two distinct endpoints:

- agents: https://backend.flymy.ai/api/v1/agents/...
- models: https://api.flymy.ai/api/v1/...

API keys: https://app.flymy.ai/profile

Claude Code reads CLAUDE.md automatically when present in the working directory.

Claude Workspaces / claude.ai

Paste the URL into the chat:

"Read https://docs.flymy.ai/llms.txt and remember the structure — I'll ask you questions about FlyMy.AI agents and models."

Cursor / Windsurf / other agentic IDEs

Add the URL to project rules or your .cursorrules / .windsurfrules file:

docs:
- https://docs.flymy.ai/llms.txt

The IDE's built-in fetch tool will pull the index and follow the links from there.

ChatGPT / generic LLM chats

The llms.txt file is plain Markdown — copy-paste it into the chat as system context. Ask follow-up questions about FlyMy.AI and the model will know exactly which page covers what.
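The same trick works when scripting against a model API instead of a chat window: pass the file as a system message. A sketch with the openai Python client, where the model name is a placeholder (any chat-capable model works):

```python
import urllib.request
from openai import OpenAI

# Pull the docs index and hand it to the model as system context.
llms_txt = urllib.request.urlopen(
    "https://docs.flymy.ai/llms.txt"
).read().decode("utf-8")

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any chat model works
    messages=[
        {"role": "system",
         "content": "FlyMy.AI documentation index:\n\n" + llms_txt},
        {"role": "user",
         "content": "Which base URL do I use to run models on FlyMy.AI?"},
    ],
)
print(resp.choices[0].message.content)
```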

Why a static file beats an MCP server here

  • Zero install. No Python, no Node, no `claude mcp add ...`. Any tool that can fetch a URL just works.
  • Always in sync. The file ships with the docs build, so updates land at the same time as the pages they describe.
  • No infrastructure. Served as a static asset behind CloudFront. Same uptime as the rest of docs.flymy.ai.

If you need richer integration later — full-text search, vector lookups, doc-scoped permissions — wrapping llms.txt (or the underlying Markdown) in a thin MCP server is straightforward, but for "teach my AI assistant about FlyMy.AI" this file is enough.
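For reference, here is roughly what that thin wrapper could look like, sketched with the official mcp Python SDK (the server name and tool names are illustrative, not a shipped FlyMy.AI package):

```python
import urllib.request
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("flymyai-docs")  # illustrative server name

INDEX_URL = "https://docs.flymy.ai/llms.txt"

def _fetch(url: str) -> str:
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

@mcp.tool()
def docs_index() -> str:
    """Return the FlyMy.AI llms.txt documentation index."""
    return _fetch(INDEX_URL)

@mcp.tool()
def docs_page(url: str) -> str:
    """Fetch a single documentation page linked from the index."""
    if not url.startswith("https://docs.flymy.ai/"):
        raise ValueError("only docs.flymy.ai pages are allowed")
    return _fetch(url)

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```

Run it with any MCP-capable client; registering it is the usual `claude mcp add` flow that the static file otherwise lets you skip.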