Chronicle is a daily AI signal filter for builders. It clusters duplicate AI/ML links, classifies each item, scores novelty against a 30-day rolling history, and ranks by a composite signal score.
Static site, no backend. The Chronicle workflow runs the pipeline once a day,
commits two JSON files, uploads public/ as the GitHub Pages artifact, and
deploys GitHub Pages.
public/feed.json and data/history.json are intentionally committed by the
scheduled workflow. The feed gives GitHub Pages a static file to serve, and the
history file gives novelty scoring durable state without adding a database or
separate storage service.
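As a sketch, the workflow described above might look roughly like this. Every name, path, schedule, and step here is an assumption; only the overall shape (run pipeline, commit the two JSON files, upload public/, deploy Pages) comes from this README:

```yaml
# Illustrative shape only, not the repository's actual workflow file.
name: Chronicle
on:
  schedule:
    - cron: "0 6 * * *"   # once a day; the actual time is an assumption
  workflow_dispatch: {}    # allows the manual first run
permissions:
  contents: write          # commit feed.json / history.json
  pages: write
  id-token: write
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install && npm run run:pipeline
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
      - run: |
          git config user.name github-actions
          git config user.email github-actions@github.com
          git add public/feed.json data/history.json
          git commit -m "chronicle: daily refresh" || true
          git push
      - uses: actions/upload-pages-artifact@v3
        with:
          path: public
      - uses: actions/deploy-pages@v4
```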
```
sources → fetch → canonicalize → window-filter → cluster
                                                    ↓
                             novelty ← history ← classify (Haiku, one call)
                                ↓
                              score
                                ↓
                             feed.json
```
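The cluster step can be sketched as greedy single-pass clustering on title similarity. The metric below (Jaccard over word sets) and the threshold value are illustrative stand-ins; the real threshold is TITLE_THRESHOLD in src/pipeline/cluster.ts:

```typescript
// Greedy single-pass clustering sketch. The metric and threshold value
// are illustrative, not the real ones from src/pipeline/cluster.ts.
const TITLE_THRESHOLD = 0.6;

// Illustrative similarity: Jaccard over lowercase word sets.
function similarity(a: string, b: string): number {
  const wa = new Set(a.toLowerCase().split(/\s+/));
  const wb = new Set(b.toLowerCase().split(/\s+/));
  let inter = 0;
  for (const w of wa) if (wb.has(w)) inter++;
  return inter / (wa.size + wb.size - inter);
}

// Each incoming title joins the first cluster whose representative
// (first member) is similar enough, else starts a new cluster.
function cluster(titles: string[]): string[][] {
  const clusters: string[][] = [];
  for (const title of titles) {
    const home = clusters.find(c => similarity(c[0], title) >= TITLE_THRESHOLD);
    if (home) home.push(title);
    else clusters.push([title]);
  }
  return clusters;
}
```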
Canonicalization strips utm_* and fbclid parameters, /amp suffixes, fragments, default ports, and www., and normalizes arXiv pdf ↔ abs URLs. Classification returns kind, quality, and one_liner for each cluster. Novelty is 1 − the max trigram-Jaccard similarity against 30 days of history.

To run locally:

```
npm install
ANTHROPIC_API_KEY=sk-... npm run run:pipeline
npm run verify:feed
npx --yes serve public
```
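The canonicalization rules listed above can be sketched with the WHATWG URL API. This is a minimal illustration, not the pipeline's actual implementation:

```typescript
// Sketch of the canonicalization step, following the rules listed above.
function canonicalize(raw: string): string {
  const u = new URL(raw);
  u.hash = "";                                          // drop fragments
  u.hostname = u.hostname.replace(/^www\./, "");        // drop www.
  if (u.port === "80" || u.port === "443") u.port = ""; // drop default ports
  // Strip tracking parameters.
  for (const key of [...u.searchParams.keys()]) {
    if (key.startsWith("utm_") || key === "fbclid") u.searchParams.delete(key);
  }
  u.pathname = u.pathname.replace(/\/amp\/?$/, "");     // drop /amp suffix
  // Normalize arXiv pdf links to the abstract page.
  if (u.hostname === "arxiv.org") {
    u.pathname = u.pathname.replace(/^\/pdf\//, "/abs/").replace(/\.pdf$/, "");
  }
  return u.toString();
}
```

With this sketch, `https://www.example.com/post/amp?utm_source=x#frag` and `https://example.com/post` collapse to the same key, which is what lets duplicate links cluster together.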
Run without an API key to see the fallback path (uses kind_hint, marks
everything mixed).
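The novelty formula (1 − max trigram-Jaccard against 30 days of history) can be sketched as follows; helper names are illustrative:

```typescript
// Character trigrams of a lowercased string.
function trigrams(s: string): Set<string> {
  const t = s.toLowerCase();
  const grams = new Set<string>();
  for (let i = 0; i + 3 <= t.length; i++) grams.add(t.slice(i, i + 3));
  return grams;
}

// Jaccard similarity: |intersection| / |union|.
function jaccard(a: Set<string>, b: Set<string>): number {
  if (a.size === 0 && b.size === 0) return 0;
  let inter = 0;
  for (const g of a) if (b.has(g)) inter++;
  return inter / (a.size + b.size - inter);
}

// Novelty = 1 − max similarity to anything in the rolling history.
function novelty(title: string, history: string[]): number {
  const t = trigrams(title);
  let maxSim = 0;
  for (const h of history) maxSim = Math.max(maxSim, jaccard(t, trigrams(h)));
  return 1 - maxSim; // 1.0 means nothing similar was seen in 30 days
}
```

An exact repeat of a historical title scores 0; a title with no trigram overlap scores 1.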
To deploy: set the ANTHROPIC_API_KEY repository secret. The site lives on the pages branch. Run the Chronicle workflow manually for the first run. Scheduled workflows run from the repository default branch, so set the default branch to pages if you want the daily cron to run from this branch.
All knobs live in source. Eyeball output for a week, then adjust:
- Sources: src/sources/registry.yaml
- Cluster threshold: TITLE_THRESHOLD in src/pipeline/cluster.ts
- Score weights: W in src/pipeline/score.ts
- Time window: WINDOW_HOURS env (default 36)
- Feed size: MAX_OUTPUT env (default 60)

Cost: chunked Haiku calls per run, ~150 clusters/day, JSON tool output. Roughly $0.20–0.50/month at current Haiku pricing. The free tier of GitHub Actions covers the compute.
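As a sketch, the score step might combine the per-cluster signals into a weighted sum and cap the feed at MAX_OUTPUT. The field names and weight values below are assumptions; the real weight table is W in src/pipeline/score.ts:

```typescript
// Illustrative composite signal score. Fields and weights are assumed,
// not copied from src/pipeline/score.ts.
type ClusterItem = { novelty: number; quality: number; sourceTrust: number };

const W = { novelty: 0.5, quality: 0.3, sourceTrust: 0.2 }; // assumed weights

function signalScore(c: ClusterItem): number {
  return (
    W.novelty * c.novelty +
    W.quality * c.quality +
    W.sourceTrust * c.sourceTrust
  );
}

// Rank clusters by score, keeping at most maxOutput (cf. MAX_OUTPUT, default 60).
function rank(items: ClusterItem[], maxOutput = 60): ClusterItem[] {
  return [...items]
    .sort((a, b) => signalScore(b) - signalScore(a))
    .slice(0, maxOutput);
}
```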
Future work: track signal vs hype over time, then fold it into trust.