
Adding Webhooks and Callbacks Support in LLM Context Generation


We've extended LLM context generation to include Webhooks and Callbacks documentation. Both llms.txt and llms-full.txt now provide AI tools like ChatGPT and Claude with comprehensive information about your event-based API interactions.

What's New?

Previously, LLM context files only included standard API endpoints, leaving AI tools without visibility into Webhooks and Callbacks. This enhancement ensures developers can get accurate answers about event-based interactions when using AI assistants.

llms.txt

Direct links to Webhooks and Callbacks documentation are now included for quick discovery:

Example: Webhooks & Callbacks links in a generated llms.txt

This enables LLMs to locate and reference event-based endpoints and their guides.
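
As a rough illustration, the new links section in a generated llms.txt might look something like the following sketch (the section title, event names, and URLs are hypothetical, not taken from a real project):

```md
## Webhooks & Callbacks

- [order.completed webhook](https://docs.example.com/webhooks/order-completed.md): Fired when an order is finalized
- [payment.refunded callback](https://docs.example.com/callbacks/payment-refunded.md): Sent after a refund is processed
```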

llms-full.txt

Complete documentation for each Webhook and Callback is now included with:

  • Event descriptions and payload structures
  • Request payload schemas with data types
  • Usage examples in supported frameworks
  • Code samples for signature verification and event parsing

Example: Webhooks & Callbacks entries in a generated llms-full.txt
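
To give a sense of the shape, a single webhook entry in llms-full.txt might be rendered along these lines (the event name and payload fields below are illustrative, not from an actual API):

```md
### Webhook: order.completed

Fired when an order is finalized.

Payload schema:

| Field      | Type   | Description                  |
| ---------- | ------ | ---------------------------- |
| id         | string | Unique event identifier      |
| created_at | string | ISO 8601 timestamp           |
| data.order | object | The completed order resource |
```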

This provides LLM tools with full context when interpreting and explaining your event-based APIs.
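
The signature-verification samples mentioned above are a good example of what this extra context lets an assistant explain correctly. Here is a minimal sketch of that kind of check, assuming an HMAC-SHA256 signature scheme and a hypothetical signature header; your provider's header name and algorithm may differ:

```python
import hashlib
import hmac
import json


def verify_and_parse(payload: bytes, signature_header: str, secret: str) -> dict:
    """Verify an HMAC-SHA256 webhook signature, then parse the event payload.

    Assumes the provider sends a hex-encoded digest of the raw request body
    in an X-Signature-style header; adjust to your provider's actual scheme.
    """
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()

    # Constant-time comparison avoids leaking timing information.
    if not hmac.compare_digest(expected, signature_header):
        raise ValueError("Webhook signature mismatch")

    return json.loads(payload)


# Example usage with a hypothetical event body and shared secret.
if __name__ == "__main__":
    secret = "whsec_example"
    body = json.dumps({"type": "order.completed", "data": {"id": "ord_123"}}).encode()
    signature = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()

    event = verify_and_parse(body, signature, secret)
    print(event["type"])  # -> order.completed
```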

To learn more, see the LLM context generation documentation.