Internal Service — This is not an HTTP endpoint. It's called directly by the generate-all orchestrator.
Purpose
Generates /llms/{product-slug}.txt files for discovered products and deploys them to Vercel.
Runs in GROUP 3b (parallel with GROUP 3a product prompts, after GROUP 2b discover products).
Shared Service
This task uses the shared generate_product_llms_txt service:
src/app/shared/products/generate_llms_txt.py
The same service is used by:
- Onboarding (this task - GROUP 3b)
- Cron (discover-products-from-changes)
- Cron (generate-product-llms-txt standalone endpoint)
Function Signature
```python
async def run_generate_product_llms_txt(
    url: str,
    org_slug: str,
    business_name: str,
    products: list,
    pages: list
) -> StepResult
```
Parameters
| Parameter | Type | Description |
|---|---|---|
| url | str | The business website URL |
| org_slug | str | The Clerk organization slug |
| business_name | str | The business name |
| products | list | Products from GROUP 2b |
| pages | list | Pre-scraped pages from GROUP 1a |
Returns
```json
{
  "name": "generate_product_llms_txt",
  "status": "success",
  "data": {
    "files_generated": 3,
    "files_deployed": 3,
    "products": ["Product A", "Product B", "Product C"]
  }
}
```
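As a sketch, the orchestrator might call the task and consume its result like this. The stub below is hypothetical and only mirrors the documented signature and return shape; the real implementation lives in the shared service and task wrapper:

```python
import asyncio

# Hypothetical stub standing in for the real task wrapper in
# src/app/apis/onboarding/generate_all/tasks/product_llms.py.
# It returns the documented StepResult dict shape.
async def run_generate_product_llms_txt(url, org_slug, business_name, products, pages):
    names = [p["name"] for p in products]
    return {
        "name": "generate_product_llms_txt",
        "status": "success",
        "data": {
            "files_generated": len(names),
            "files_deployed": len(names),
            "products": names,
        },
    }

result = asyncio.run(run_generate_product_llms_txt(
    url="https://example.com",
    org_slug="acme",                 # example Clerk org slug
    business_name="Acme Inc",
    products=[{"name": "Product A"}, {"name": "Product B"}],  # from GROUP 2b
    pages=[{"url": "https://example.com/products"}],          # from GROUP 1a
))
print(result["data"]["files_deployed"])  # → 2
```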
Execution Flow
Product LLMs Architecture
| File | Location | In Sitemap | In index.js |
|---|---|---|---|
| Root llms.txt | /llms.txt | Yes | Yes |
| Product llms | /llms/{product-slug}.txt | Yes | No |
This architecture scales to thousands of products:
- Root llms.txt stays small (~3KB)
- Each product gets its own file (~1-2KB)
- Product files are fetched on-demand for boosted page generation
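To illustrate the per-product file layout, here is a minimal sketch of how a /llms/{product-slug}.txt path could be derived from a product name. The helper names are hypothetical; the real slug logic lives in the shared service:

```python
import re

def product_slug(name: str) -> str:
    # Hypothetical slug helper: lowercase, collapse runs of
    # non-alphanumeric characters into hyphens, trim edge hyphens.
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

def product_llms_path(name: str) -> str:
    # Path shape documented above: /llms/{product-slug}.txt
    return f"/llms/{product_slug(name)}.txt"

print(product_llms_path("Product A"))  # → /llms/product-a.txt
```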
Code Location
```
src/app/shared/products/
├── __init__.py
├── discover.py           # Shared discover_products service
└── generate_llms_txt.py  # Shared generate_product_llms_txt service

src/app/apis/onboarding/generate_all/tasks/
└── product_llms.py       # run_generate_product_llms_txt task wrapper
```
Error Handling
```json
{
  "name": "generate_product_llms_txt",
  "status": "error",
  "error": "AI site project_name not found"
}
```
If product llms.txt generation fails, onboarding continues: products are still saved and prompts are still generated.
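This non-fatal behavior can be sketched as follows. The orchestrator shape and stub below are hypothetical, assuming only the documented StepResult error shape:

```python
import asyncio
import logging

# Hypothetical failing stub, returning the documented error shape.
async def run_generate_product_llms_txt(**kwargs):
    return {
        "name": "generate_product_llms_txt",
        "status": "error",
        "error": "AI site project_name not found",
    }

async def orchestrate():
    # Sketch of the orchestrator's tolerance: record the step result,
    # log the failure, and keep going rather than aborting onboarding.
    result = await run_generate_product_llms_txt()
    if result["status"] == "error":
        logging.warning(
            "product llms generation failed (%s); continuing onboarding",
            result["error"],
        )
    # ...products remain saved and prompts are still generated here...
    return result

step = asyncio.run(orchestrate())
print(step["status"])  # → error
```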