Chronicle

Chronicle is a daily AI signal filter for builders. It clusters duplicate AI/ML links, classifies each item, scores novelty against a 30-day rolling history, and ranks by a composite signal score.
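The novelty-versus-history step could be sketched like this. Everything here is illustrative: `noveltyScore` and the token-Jaccard similarity are assumptions, not the real metric, which lives in the pipeline source.

```typescript
// Hypothetical novelty scorer: compare a title against the 30-day
// history using Jaccard similarity over lowercase word tokens.
function tokens(title: string): Set<string> {
  return new Set(title.toLowerCase().split(/\W+/).filter(Boolean));
}

function jaccard(a: Set<string>, b: Set<string>): number {
  const inter = [...a].filter((t) => b.has(t)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : inter / union;
}

// Novelty = 1 minus the best match against the rolling history.
function noveltyScore(title: string, history: string[]): number {
  const t = tokens(title);
  let best = 0;
  for (const h of history) best = Math.max(best, jaccard(t, tokens(h)));
  return 1 - best;
}
```

An unseen title scores 1.0; an exact repeat of a history entry scores 0.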

Static site, no backend. The Refresh Chronicle workflow runs the pipeline once a day, commits two JSON files, uploads public/ as the Pages artifact, and deploys it to GitHub Pages.

public/feed.json and data/history.json are intentionally committed by the scheduled workflow. The feed gives GitHub Pages a static file to serve, and the history file gives novelty scoring durable state without adding a database or separate storage service.
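To keep data/history.json bounded, the workflow presumably prunes entries outside the 30-day novelty window before committing. A minimal sketch, assuming a hypothetical entry shape (the real schema is in the pipeline source):

```typescript
// Hypothetical shape for data/history.json entries.
interface HistoryEntry {
  title: string;
  firstSeen: string; // ISO date
}

// Drop entries older than the novelty window before committing.
function pruneHistory(
  entries: HistoryEntry[],
  now: Date,
  windowDays = 30,
): HistoryEntry[] {
  const cutoff = now.getTime() - windowDays * 24 * 60 * 60 * 1000;
  return entries.filter((e) => new Date(e.firstSeen).getTime() >= cutoff);
}
```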

Pipeline

sources → fetch → canonicalize → window-filter → cluster
                                                    ↓
                            novelty ← history ← classify (Haiku, one call)
                               ↓
                             score
                               ↓
                          feed.json
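As a rough sketch of the first half of the diagram, the canonicalize → cluster → rank steps might look like the following. These are stand-in implementations with made-up names, not the real stage functions:

```typescript
type Item = { url: string; title: string };
type Cluster = { items: Item[]; score: number };

// Canonicalize: strip query strings and fragments so URL variants
// of the same link collapse together.
const canonicalize = (items: Item[]): Item[] =>
  items.map((i) => ({ ...i, url: i.url.replace(/[?#].*$/, "") }));

// Cluster: group duplicate links by canonical URL.
const clusterByUrl = (items: Item[]): Cluster[] => {
  const byUrl = new Map<string, Item[]>();
  for (const i of items) byUrl.set(i.url, [...(byUrl.get(i.url) ?? []), i]);
  return [...byUrl.values()].map((items) => ({ items, score: 0 }));
};

// Rank: sort by composite signal score, highest first.
const rank = (clusters: Cluster[]): Cluster[] =>
  [...clusters].sort((a, b) => b.score - a.score);
```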

Local run

npm install
ANTHROPIC_API_KEY=sk-... npm run run:pipeline
npm run verify:feed
npx --yes serve public

Run without an API key to see the fallback path (uses kind_hint, marks everything mixed).
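One plausible reading of that fallback, sketched with hypothetical names (check the pipeline source for the actual behavior):

```typescript
// No-API-key fallback: trust each source's kind_hint when present,
// otherwise mark the cluster "mixed". Illustrative only.
type Classified = { kindHint?: string; kind: string };

function fallbackClassify(clusters: { kindHint?: string }[]): Classified[] {
  return clusters.map((c) => ({ ...c, kind: c.kindHint ?? "mixed" }));
}
```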

Deploy

  1. Settings → Pages → Source: GitHub Actions.
  2. Settings → Secrets and variables → Actions → add ANTHROPIC_API_KEY.
  3. Push the pages branch.
  4. Trigger Refresh Chronicle manually for the first run.

Scheduled workflows run from the repository default branch. Set the default branch to pages if you want the daily cron to run from this branch.
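The trigger block of the workflow file likely looks something like this (the cron expression and file contents here are illustrative; check the actual workflow under .github/workflows/ for the real schedule and job steps):

```yaml
name: Refresh Chronicle
on:
  schedule:
    - cron: "0 6 * * *"   # daily; GitHub runs this from the default branch
  workflow_dispatch: {}    # enables the manual trigger for the first run
```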

Tuning

All knobs live in the source. Eyeball the output for a week, then adjust.
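The knobs might be grouped like this. Every name and default below is hypothetical; the real constants are in the pipeline source:

```typescript
// Illustrative tuning knobs, not the real config object.
export const config = {
  windowHours: 24,     // window-filter: how far back to keep items
  historyDays: 30,     // novelty: rolling history length
  noveltyWeight: 0.4,  // composite signal score weights
  sourceWeight: 0.3,
  clusterSizeWeight: 0.3,
};
```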

Cost

Chunked Haiku calls per run, ~150 clusters/day, JSON tool output. Roughly $0.20–0.50/month at current Haiku pricing; the GitHub Actions free tier covers the compute.
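The per-run call count implied by those numbers, assuming a chunk size of 25 clusters per Haiku call (the chunk size is a guess, not the real setting):

```typescript
// Back-of-envelope: ~150 clusters/day in chunks of 25 → 6 calls/run.
const clustersPerDay = 150;
const chunkSize = 25; // assumption, not the pipeline's actual value
const callsPerRun = Math.ceil(clustersPerDay / chunkSize);
```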

v2 backlog (deferred on purpose)