The AI tab brings together everything you need to integrate the Timely.ai documentation directly into your development workflow with AI. Instead of switching between the browser and the editor, you inject the right context where the language model is already running.

What you’ll find here

llms.txt

A complete context file (~75 KB) following the llmstxt.org standard. Everything a model needs to know about the Timely.ai API, in a single text file.

Cursor

Add the documentation as a @Docs source in Cursor and mention @Timely.ai in any prompt to get accurate answers about the API.

Claude

Use llms.txt in Claude Desktop skills, or load the context into CLAUDE.md with Claude Code so it persists across sessions.
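For the CLAUDE.md route, one pattern is to download the file into the repository and reference it from CLAUDE.md. The docs/ path and section wording below are our own choices, and the @-import syntax is a Claude Code feature; check its current documentation before relying on it:

```markdown
# CLAUDE.md

## Timely.ai API context

<!-- Downloaded from https://docs.timelyai.com.br/llms.txt -->
@docs/timely-llms.txt
```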

VS Code

Configure Continue.dev as a context provider pointing to https://docs.timelyai.com.br, or use context comments with GitHub Copilot.
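As a sketch, the Continue.dev side can be a docs entry in its config.json. The field names below follow Continue's configuration format as we understand it at the time of writing; verify them against Continue's own reference:

```json
{
  "contextProviders": [{ "name": "docs" }],
  "docs": [
    {
      "title": "Timely.ai",
      "startUrl": "https://docs.timelyai.com.br"
    }
  ]
}
```

With this in place, the indexed pages become available through the docs context provider in a Continue chat.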

Why this matters

Language models have a training cutoff date and don’t know about private APIs. When you work with Timely.ai, the model needs to know:
  • Which endpoints exist and what parameters they accept
  • How to authenticate with x-api-key
  • The webhook event types and their payloads
  • Rate limiting and pagination rules
The llms.txt delivers this context in a compact format. The guide for each editor shows how to inject it in the right place.
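As a minimal sketch of the authentication bullet above, here is how a request carrying the x-api-key header could be built. The host and path are illustrative placeholders, not documented endpoints; only the header name comes from this page:

```python
import urllib.request

API_BASE = "https://api.timelyai.com.br"  # hypothetical host, for illustration only


def timely_request(path: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a request carrying the x-api-key header."""
    return urllib.request.Request(f"{API_BASE}{path}", headers={"x-api-key": api_key})


req = timely_request("/v1/ping", "sk-example")
# urllib stores header names capitalized, so look it up as "X-api-key"
print(req.get_header("X-api-key"))  # → sk-example
```

The real endpoint paths, parameters, and webhook payloads are exactly what llms.txt supplies to the model.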

How llms.txt is generated

The file at https://docs.timelyai.com.br/llms.txt is generated automatically from the .mdx pages in this documentation. Whenever a page is updated, the file reflects the change. You can download the latest version with:
curl https://docs.timelyai.com.br/llms.txt -o timely-llms.txt

Choose your editor

Cursor: go to Settings → Features → Docs → Add and paste https://docs.timelyai.com.br, then use @Timely.ai in your prompts.