# Scheduled Lookouts

Recurring research tasks with change detection and webhook notifications.
Lookouts are recurring research tasks that run on a schedule, detect changes between runs, and optionally notify you via webhook. Set a query, pick a schedule, and Krawl keeps watching.
## How Lookouts Work
- Create a lookout with a research query, mode, and schedule
- Krawl runs the research automatically on the schedule
- Results are compared with the previous run to detect changes (LLM-powered diff)
- Webhooks fire at each lifecycle stage: started, progress, completed, error
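The lifecycle above can be sketched end to end as one run. This is a conceptual sketch only; the helper names (`run_research`, `compute_diff`, `send_webhook`) are hypothetical stand-ins for internal components, not part of the Krawl API:

```python
def run_lookout(lookout, previous_report, run_research, compute_diff, send_webhook):
    """One conceptual lookout run: research, diff against the previous
    report, then notify. All injected helpers are hypothetical."""
    send_webhook({"event": "started", "query": lookout["query"]})
    report = run_research(lookout["query"], lookout["mode"])
    diff = compute_diff(previous_report, report)
    send_webhook({"event": "completed",
                  "has_changes": diff["has_changes"],
                  "diff": diff})
    return report, diff
```

The injected helpers mirror the separation the docs describe: research, change detection, and webhook delivery are independent stages.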
## Create a Lookout

```bash
curl -X POST https://api.krawl.sh/lookouts \
  -H "Content-Type: application/json" \
  -H "X-API-Key: your-key" \
  -d '{
    "query": "Latest developments in AI agents",
    "mode": "deep",
    "schedule": "daily",
    "webhook_url": "https://your-webhook.example.com/notify"
  }'
```

Response:
```json
{
  "id": "550e8400-...",
  "query": "Latest developments in AI agents",
  "mode": "deep",
  "schedule": "daily",
  "active": true,
  "last_run": null,
  "last_result_id": null,
  "created_at": "2025-01-15T10:00:00Z",
  "webhook_url": "https://your-webhook.example.com/notify"
}
```

## Schedules
| Schedule | Frequency |
|---|---|
| `hourly` | Every hour |
| `daily` | Once per day |
| `weekly` | Once per week |
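The three schedule names map to fixed intervals. A minimal sketch of that mapping, assuming simple fixed-interval semantics (the scheduler's actual cron internals are not part of the public API):

```python
from datetime import datetime, timedelta

# Illustrative mapping of schedule names to run intervals.
SCHEDULE_INTERVALS = {
    "hourly": timedelta(hours=1),
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
}

def next_run(last_run, schedule):
    """Return when a lookout would next be due, given its last run time."""
    return last_run + SCHEDULE_INTERVALS[schedule]
```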
## Limits

| Setting | Default | Env Variable |
|---|---|---|
| Max active lookouts per user | 10 | `LOOKOUT_MAX_PER_USER` |
| Minimum interval between runs | 60 minutes | `LOOKOUT_MIN_INTERVAL_MINUTES` |
| Manual trigger rate limit | 2/minute | — |
## List Lookouts

```bash
curl -H "X-API-Key: your-key" \
  "https://api.krawl.sh/lookouts?offset=0&limit=50"
```

## Get a Lookout
```bash
curl -H "X-API-Key: your-key" \
  https://api.krawl.sh/lookouts/LOOKOUT_UUID
```

## Delete a Lookout
```bash
curl -X DELETE -H "X-API-Key: your-key" \
  https://api.krawl.sh/lookouts/LOOKOUT_UUID
```

## Manual Trigger
Force an immediate run (respects the cooldown interval):

```bash
curl -X POST -H "X-API-Key: your-key" \
  https://api.krawl.sh/lookouts/LOOKOUT_UUID/run
```

Returns `429` if the cooldown hasn't elapsed since the last run.
## Change Detection
When a lookout runs, Krawl compares the new report with the previous one:
- If the reports are identical → `has_changes: false`, no diff
- If they differ → an LLM-powered structured diff with:
  - `new_findings` — facts present in the new report but not the old
  - `removed_claims` — facts in the old report that disappeared
  - `changed_data` — metrics/data that changed (with old and new values)
  - `summary` — natural-language summary of the changes

If the LLM diff fails, Krawl falls back to a simple line-level diff.
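One way such a line-level fallback could look, using Python's standard `difflib` (a sketch of the idea, not Krawl's actual implementation):

```python
import difflib

def line_diff(old_report, new_report):
    """Naive line-level fallback: unified-diff lines between two reports."""
    return list(difflib.unified_diff(
        old_report.splitlines(),
        new_report.splitlines(),
        fromfile="previous",
        tofile="current",
        lineterm="",
    ))
```

Unlike the structured LLM diff, this only reports changed lines, with no semantic grouping into findings or claims.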
## Webhooks

When `webhook_url` is set, Krawl sends POST requests at each lifecycle stage.
### `started` Event
Fired when the lookout research begins.
```json
{
  "event": "started",
  "lookout_id": "...",
  "query": "Latest developments in AI agents",
  "timestamp": "2025-01-15T10:00:00Z"
}
```

### `progress` Event
Fired periodically during research with the current step count.
```json
{
  "event": "progress",
  "lookout_id": "...",
  "steps_completed": 12,
  "timestamp": "2025-01-15T10:01:00Z"
}
```

### `completed` Event
Fired when research finishes; includes the diff.
```json
{
  "event": "completed",
  "lookout_id": "...",
  "query": "Latest developments in AI agents",
  "result_id": "result-uuid",
  "has_changes": true,
  "diff": {
    "has_changes": true,
    "summary": "3 new findings about agent frameworks...",
    "new_findings": ["GPT-5 based agents announced", "..."],
    "removed_claims": [],
    "changed_data": []
  },
  "result_preview": "## AI Agents Report\n\nFirst 500 chars...",
  "steps_completed": 24,
  "timestamp": "2025-01-15T10:05:00Z"
}
```

### `error` Event
Fired if the research fails.
```json
{
  "event": "error",
  "lookout_id": "...",
  "query": "Latest developments in AI agents",
  "error": "Research pipeline error: ...",
  "timestamp": "2025-01-15T10:02:00Z"
}
```

## Webhook Security
- Webhook URLs are validated for format correctness
- Private/internal IPs are blocked (SSRF prevention)
- Webhooks fire asynchronously (non-blocking)
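The private-IP rule can be approximated with the standard library. This is a sketch only: a production SSRF check must also resolve DNS names and re-validate every resolved address at connect time, since a hostname can point anywhere:

```python
import ipaddress
from urllib.parse import urlsplit

def is_safe_webhook_url(url):
    """Reject obviously unsafe webhook URLs: non-HTTP schemes and
    literal private/loopback/link-local/reserved IPs."""
    parts = urlsplit(url)
    if parts.scheme not in ("http", "https") or not parts.hostname:
        return False
    try:
        ip = ipaddress.ip_address(parts.hostname)
    except ValueError:
        # Hostname, not a literal IP: a real check must resolve it
        # and validate the resulting addresses too.
        return True
    return not (ip.is_private or ip.is_loopback
                or ip.is_link_local or ip.is_reserved)
```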
## Lookout Data Model
```json
{
  "id": "string (UUID)",
  "query": "string",
  "mode": "deep|quick|crypto|...",
  "schedule": "hourly|daily|weekly",
  "active": true,
  "last_run": "ISO timestamp or null",
  "last_result_id": "UUID or null",
  "created_at": "ISO timestamp",
  "webhook_url": "URL or null"
}
```

## Scheduler
Lookouts are managed by an internal scheduler that starts on application boot. The scheduler:
- Loads all active lookouts from the database
- Registers cron jobs for each schedule
- Respects the concurrency semaphore (max 3 concurrent research sessions)
- Updates `last_run` and `last_result_id` after each run