POST /api/cron/regenerate-fresh-website
Performs a FULL rebuild of an existing AI site using the same flow as initial onboarding, but deploys to the EXISTING Vercel project.

Purpose

Use this endpoint when you want to:

- Regenerate content with fresh LLM output
- Fix issues with an existing AI site
- Test changes to the generation pipeline

This differs from update-site, which only performs incremental updates when the source website content changes.
Architecture
Request Body
| Field | Type | Required | Description |
|---|---|---|---|
| business_id | string | Yes | Org slug (e.g., `website-arena-1766312513`) |
| url | string | No | Source URL (fetched from DB if not provided) |
| max_pages | integer | No | Max pages to scrape (default: 5000) |
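As a sketch of how a caller might assemble the request body described above: field names and the `max_pages` default come from the table, but the helper itself is illustrative, not part of the actual service.

```python
import json

def build_request_body(business_id, url=None, max_pages=5000):
    """Assemble the JSON payload; business_id is the only required field."""
    if not business_id:
        raise ValueError("business_id is required")
    body = {"business_id": business_id, "max_pages": max_pages}
    if url is not None:
        # When url is omitted, the server fetches the source URL from the DB.
        body["url"] = url
    return json.dumps(body)

# Example payload using the org slug format shown above
payload = build_request_body("website-arena-1766312513")
```

Omitting `url` is the common case: the endpoint falls back to the stored source URL for the business.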
Pipeline Steps
| Step | Service | Description |
|---|---|---|
| 1 | scrape_website | Fresh scrape of source website |
| 2 | hash_pages | Generate content hashes for change detection |
| 3 | organize_with_llm | LLM organizes content (fresh, not update) |
| 4 | generate_ai_site | Generate all static files |
| 5 | deploy_to_vercel | Deploy to existing Vercel project |
| 6 | assign_domain | Update AI site record with URLs |
| 7 | store_page_hashes | Save hashes for future change detection |
| 8 | submit_urls_to_indexnow | Notify search engines |
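The eight steps above can be sketched as a sequential orchestrator. All service functions here are hypothetical stubs; only their names and call order come from the Pipeline Steps table.

```python
import hashlib

# --- Hypothetical stubs standing in for the real services ---
def scrape_website(url, max_pages):            # 1. fresh scrape
    return [{"path": "/", "html": "<h1>Home</h1>"}][:max_pages]

def hash_pages(pages):                         # 2. content hashes for change detection
    return {p["path"]: hashlib.sha256(p["html"].encode()).hexdigest()
            for p in pages}

def organize_with_llm(pages):                  # 3. fresh LLM organization (not update)
    return {"sections": [p["path"] for p in pages]}

def generate_ai_site(content):                 # 4. generate static files
    return [f"{path}index.html" for path in content["sections"]]

def deploy_to_vercel(business_id, files):      # 5. deploy to EXISTING project
    return f"https://{business_id}.vercel.app"

def assign_domain(business_id, ai_site_url):   # 6. update AI site record with URLs
    pass

def store_page_hashes(business_id, hashes):    # 7. save hashes for future runs
    pass

def submit_urls_to_indexnow(ai_site_url):      # 8. notify search engines
    pass

def run_pipeline(business_id, url, max_pages=5000):
    pages = scrape_website(url, max_pages)
    hashes = hash_pages(pages)
    content = organize_with_llm(pages)
    files = generate_ai_site(content)
    ai_site_url = deploy_to_vercel(business_id, files)
    assign_domain(business_id, ai_site_url)
    store_page_hashes(business_id, hashes)
    submit_urls_to_indexnow(ai_site_url)
    return {"status": "success", "ai_site_url": ai_site_url,
            "pages_scraped": len(pages), "files_generated": len(files),
            "pages_hashed": len(hashes)}
```

Note that the steps run strictly in order: hashing happens on the fresh scrape (step 2) but the hashes are only persisted after a successful deploy (step 7), so a failed run does not poison future change detection.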
Response Fields
| Field | Type | Description |
|---|---|---|
| status | string | "success" or an error message |
| ai_site_url | string | Deployed AI site URL |
| source_url | string | Source website URL |
| business_id | string | Business identifier |
| pages_scraped | integer | Number of pages scraped |
| files_generated | integer | Number of files generated |
| pages_hashed | integer | Number of page hashes stored |
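A caller can check a response against the field table above. The field names and types come from the table; the sample payload values and the validator itself are illustrative.

```python
import json

# Expected shape of a success response, per the Response Fields table.
EXPECTED_TYPES = {
    "status": str, "ai_site_url": str, "source_url": str,
    "business_id": str, "pages_scraped": int,
    "files_generated": int, "pages_hashed": int,
}

# Illustrative sample response (values are made up).
sample = json.loads("""{
  "status": "success",
  "ai_site_url": "https://website-arena-1766312513.vercel.app",
  "source_url": "https://example.com",
  "business_id": "website-arena-1766312513",
  "pages_scraped": 42,
  "files_generated": 57,
  "pages_hashed": 42
}""")

def validate_response(resp):
    """Return a list of field-level problems; empty means the shape matches."""
    problems = [f"missing {k}" for k in EXPECTED_TYPES if k not in resp]
    problems += [f"bad type for {k}" for k, t in EXPECTED_TYPES.items()
                 if k in resp and not isinstance(resp[k], t)]
    return problems
```

On a full rebuild, `pages_scraped` and `pages_hashed` should normally match, since every freshly scraped page is hashed in step 7.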