POST https://searchcompany-main.up.railway.app/api/cron/update-all-timestamps
Internal endpoint for the Cron service to update timestamps on ALL pages for a business.
This signals freshness to AI search engines like Bing, which favor recently updated content. Updates core AI site files (llms.txt, data.json, next.config.js) AND all boosted pages.
This is Job 2 in the cron sequence, running BEFORE new boosted pages are created. It ensures existing content appears fresh to crawlers.

What Gets Updated

Core AI Site Files

File              Update
llms.txt          "Last updated: [date]" at the bottom
data.json         dateModified field
next.config.js    Last-Modified and ETag HTTP headers
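As a rough illustration, refreshing the two data files might look like the sketch below. The helper name and file layout are assumptions, not the actual service implementation:

```python
import json
import re
from datetime import datetime, timezone

def update_core_timestamps(llms_txt: str, data_json: str) -> tuple[str, str]:
    """Hypothetical helper: refresh the freshness markers in llms.txt and data.json."""
    now = datetime.now(timezone.utc)
    stamp = now.strftime("%B %d, %Y")  # e.g. "December 24, 2025"

    # llms.txt: rewrite (or append) the "Last updated:" line at the bottom.
    if "Last updated:" in llms_txt:
        llms_txt = re.sub(r"Last updated: .*", f"Last updated: {stamp}", llms_txt)
    else:
        llms_txt = llms_txt.rstrip("\n") + f"\n\nLast updated: {stamp}\n"

    # data.json: bump the dateModified field to an ISO 8601 timestamp.
    data = json.loads(data_json)
    data["dateModified"] = now.strftime("%Y-%m-%dT%H:%M:%SZ")
    return llms_txt, json.dumps(data, indent=2)
```

The next.config.js change (Last-Modified and ETag headers) would be a similar text rewrite of the headers section of that file.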

Boosted Pages

For every existing boosted page:
Element                                    Update
<meta property="article:modified_time">    Current ISO timestamp
Year in title                              "2025" → "2026" (if the year changed)
Year in <h1>                               Same as title
Footer                                     "Last updated: December 24, 2025"
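A minimal sketch of such a per-page rewrite, assuming the page source is plain text and the updates are applied with regular expressions (the function name is hypothetical):

```python
import re
from datetime import datetime, timezone

def update_page_timestamps(page_source: str) -> str:
    """Hypothetical sketch: rewrite the freshness markers listed above in a boosted page."""
    now = datetime.now(timezone.utc)
    iso = now.strftime("%Y-%m-%dT%H:%M:%SZ")

    # <meta property="article:modified_time"> -> current ISO timestamp.
    page_source = re.sub(
        r'(<meta property="article:modified_time" content=")[^"]*(")',
        rf"\g<1>{iso}\g<2>",
        page_source,
    )

    # Year in <title> and <h1>: last year's number becomes this year's,
    # only when the page still carries the previous year.
    prev_year = str(now.year - 1)
    page_source = re.sub(
        rf"(<(?:title|h1)[^>]*>[^<]*){prev_year}",
        rf"\g<1>{now.year}",
        page_source,
    )

    # Footer "Last updated:" note -> today's date.
    stamp = now.strftime("%B %d, %Y")
    page_source = re.sub(r"Last updated: [^<]*", f"Last updated: {stamp}", page_source)
    return page_source
```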

Request Body

business_id
string
required
Business identifier (org_slug)

Response

status
string
"success" or "error"
core_files_updated
number
Number of core AI site files updated (llms.txt, data.json, next.config.js)
pages_updated
number
Number of boosted pages successfully updated
pages_failed
number
Number of boosted pages that failed to update
files
array
Array of updated files to deploy. Each file has path and content.
project_name
string
Vercel project name for deployment
ai_site_url
string
AI site URL (e.g. "https://nike.searchcompany.dev")
updated_paths
array
List of URL paths that were updated
curl -X POST https://searchcompany-main.up.railway.app/api/cron/update-all-timestamps \
  -H "Content-Type: application/json" \
  -d '{"business_id": "nike"}'
{
  "status": "success",
  "core_files_updated": 3,
  "pages_updated": 47,
  "pages_failed": 0,
  "files": [
    {"path": "public/llms.txt", "content": "..."},
    {"path": "public/data.json", "content": "..."},
    {"path": "next.config.js", "content": "..."},
    {"path": "pages/top-10-sneakers-2025.js", "content": "..."}
  ],
  "project_name": "ai-nike",
  "ai_site_url": "https://nike.searchcompany.dev",
  "updated_paths": [
    "/llms.txt",
    "/data.json",
    "/top-10-sneakers-2025"
  ]
}

Workflow

This endpoint returns files but does NOT deploy. The cron service handles deployment:
1. update-all-timestamps → Returns updated files
   └── Updates core files + all boosted pages

2. deploy-to-vercel → Deploys all updated files at once
   └── Single deployment with all changes

3. submit-indexnow → Notifies search engines
   └── Tells Bing, Yandex, etc. that pages were updated

Why This Matters

From Bing’s AI Search documentation:
"Bing's crawl frequency and its AI tend to favor fresh content. Update your content regularly – even minor tweaks with a 'Last updated' note can signal freshness. Bing might include a date in the citation (e.g., 'source: yoursite.com (Dec 2025)'), which can influence user trust."
By updating timestamps daily, your pages:
  • Appear fresh to AI crawlers
  • Get higher priority in crawl scheduling
  • Show recent dates in AI citations
  • Compete better against newer content