Internal Service: This is not an HTTP endpoint. It's called directly by the generate-all orchestrator.

Purpose

Generates /llms/{product-slug}.txt files for discovered products and deploys them to Vercel. Runs in GROUP 2c (parallel with GROUP 2a and 2b, after GROUP 1a + 1b + 1d complete).
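The group ordering above (GROUP 2c running in parallel with 2a and 2b, after GROUP 1 finishes) could be sketched with `asyncio.gather`. This is a hypothetical illustration, not the real orchestrator: the task names and coroutines are placeholders.

```python
# Hypothetical sketch of the group scheduling described above.
# run_task stands in for real tasks such as run_generate_product_llms_txt.
import asyncio

async def run_task(name: str) -> dict:
    await asyncio.sleep(0)  # placeholder for real work
    return {"name": name, "status": "success"}

async def orchestrate() -> list:
    # GROUP 1a + 1b + 1d must complete before GROUP 2 starts.
    group1 = await asyncio.gather(
        run_task("1a"), run_task("1b"), run_task("1d")
    )
    # GROUP 2a, 2b, and 2c (this task) then run in parallel.
    group2 = await asyncio.gather(
        run_task("2a"), run_task("2b"), run_task("generate_product_llms_txt")
    )
    return group1 + group2

results = asyncio.run(orchestrate())
```

`asyncio.gather` preserves argument order, so the GROUP 2c result is always the last entry here.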

Shared Service

This task uses the shared generate_product_llms_txt service:
src/app/shared/products/generate_llms_txt.py
The same service is used by:
  • Onboarding (this task, GROUP 2c)
  • Cron (discover-products)
  • Cron (generate-product-llms-txt standalone endpoint)

Function Signature

async def run_generate_product_llms_txt(
    url: str,
    org_slug: str,
    business_name: str,
    products: list,
    pages: list
) -> StepResult

Parameters

Parameter | Type | Description
url | str | The business website URL
org_slug | str | The Clerk organization slug
business_name | str | The business name
products | list | Products from GROUP 1d (may be [])
pages | list | Pre-scraped pages from GROUP 1b

Returns

{
  "name": "generate_product_llms_txt",
  "status": "success",
  "data": {
    "files_generated": 3,
    "files_deployed": 3,
    "products": ["Product A", "Product B", "Product C"]
  }
}
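A caller can consume the returned dict directly. The sketch below is a hypothetical consumer; the field names mirror the documented return value, but `summarize` itself is not part of the real codebase.

```python
# Hypothetical consumer of the StepResult dict documented above.
result = {
    "name": "generate_product_llms_txt",
    "status": "success",
    "data": {
        "files_generated": 3,
        "files_deployed": 3,
        "products": ["Product A", "Product B", "Product C"],
    },
}

def summarize(step: dict) -> str:
    # Error results carry an "error" key instead of "data".
    if step["status"] != "success":
        return f"{step['name']} failed: {step.get('error')}"
    data = step["data"]
    return (
        f"{step['name']}: deployed {data['files_deployed']} of "
        f"{data['files_generated']} product files"
    )

print(summarize(result))
```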

Execution Flow

Product LLMs Architecture

File | Location | In Sitemap | In index.js
Root llms.txt | /llms.txt | Yes | Yes
Product llms | /llms/{product-slug}.txt | Yes | No

This architecture scales to thousands of products:
  • Root llms.txt stays small (~3KB)
  • Each product gets its own file (~1-2KB)
  • Product files are fetched on-demand for AI article generation
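The per-product paths follow the /llms/{product-slug}.txt pattern. As an illustration only, slugs might be derived from product names along these lines; the real slug rules live in the shared generate_product_llms_txt service.

```python
# Illustrative only: deriving /llms/{product-slug}.txt paths from
# product names. Not the shared service's actual slug implementation.
import re

def slugify(name: str) -> str:
    # Lowercase, collapse non-alphanumeric runs to "-", trim edges.
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower())
    return slug.strip("-")

def product_llms_path(name: str) -> str:
    return f"/llms/{slugify(name)}.txt"

paths = [product_llms_path(p) for p in ["Product A", "Product B"]]
# paths -> ["/llms/product-a.txt", "/llms/product-b.txt"]
```

Because each file is generated independently, adding a product adds one small file rather than growing the root llms.txt.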

Code Location

src/app/shared/products/
├── __init__.py
├── discover.py           # Shared discover_products service
└── generate_llms_txt.py  # Shared generate_product_llms_txt service

src/app/apis/onboarding/generate_all/tasks/
└── product_llms.py       # run_generate_product_llms_txt task wrapper

Error Handling

{
  "name": "generate_product_llms_txt",
  "status": "error",
  "error": "AI site project_name not found"
}
If product llms.txt generation fails, onboarding continues: products are still saved and prompts are still generated.
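This fail-open behaviour could be implemented with a wrapper that converts exceptions into an error StepResult, so the orchestrator keeps running the remaining steps. The sketch below is hypothetical; `run_step` and `failing_task` are illustrative names, not the real code.

```python
# Sketch of the fail-open behaviour described above: a hypothetical
# wrapper that turns exceptions into an error StepResult dict so the
# rest of onboarding keeps running.
import asyncio

async def run_step(name: str, coro) -> dict:
    try:
        data = await coro
        return {"name": name, "status": "success", "data": data}
    except Exception as exc:
        # Onboarding is not aborted; the error is just recorded.
        return {"name": name, "status": "error", "error": str(exc)}

async def failing_task():
    # Simulates the documented failure mode.
    raise RuntimeError("AI site project_name not found")

result = asyncio.run(run_step("generate_product_llms_txt", failing_task()))
# result["status"] -> "error"; subsequent steps would still run.
```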