… public posts, …. Set the assumptions yourself — read‑time, think‑time, typing speed, what counts as one session — and see what the data implies.
⚠ Lower bound, not a measurement. The dataset is one community scrape — best available, not guaranteed exhaustive. Deleted posts and missed scrapes are gone forever. Pure scrolling that didn't end in a post is invisible to this method. Whatever number the page shows, the real one is at least that much.
We start with every public post Elon has sent — … of them going back to 2010, each tagged with the exact moment it went out. Posts that came in close succession get bundled into a single session — our shorthand for "he was on the app right then." If two posts are more than half an hour apart, that's two separate visits. Across his timeline that adds up to about … distinct sessions.
For each session we add up time from two angles, and take whichever is bigger:
Add it up across every session and you get the daily, monthly, and lifetime totals in the cards.
Drag any slider and every chart redraws live. The only thing fixed when the data was prepared is the half‑hour gap rule that separates one session from the next — change it and a session like "9:00 then 9:35" merges or splits, which shifts the estimate. Drop a wider or tighter gap into the build and the page rebuilds with a new (but always defensible) number.
Heads up: many of his sessions are anchored by wall‑clock evidence — if he posted at 9:00 and 9:25 he was on the app for 25 min regardless of how fast he types. So a few of these knobs barely move the headline number. The breakdown card below shows where the time actually comes from.
The dataset stores timestamps as UTC. Buttons apply a fixed offset — no DST handling. PDT (−7) is a reasonable default for most of the year in California; flip to −8 for winter.
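That fixed-offset rule (no DST lookup, just an hour shift) amounts to a one-liner. A hedged Python sketch, not the page's actual browser code — `to_local` is a hypothetical helper name:

```python
from datetime import datetime, timedelta, timezone

def to_local(utc_iso: str, offset_hours: int) -> datetime:
    """Shift a UTC timestamp by a fixed hour offset, e.g. -7 for PDT, -8 for PST.
    Deliberately no DST handling, matching the page's buttons."""
    dt = datetime.fromisoformat(utc_iso).replace(tzinfo=timezone.utc)
    return dt.astimezone(timezone(timedelta(hours=offset_hours)))
```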
Hour in selected time zone. Length of bar = % of all tweets posted in that hour.
Darker = more posts. Mon top, Sun bottom. Selected time zone.
Replies dominate — every reply implies he had to read the parent tweet first.
Each row = one year. Cell darkness = share of that year's posts in that hour (selected time zone). Look for the dark band — the closest thing to sleep. The band shrinks after 2022.
| Started (UTC) | Length | Posts | Originals | Replies | Quotes | Posts/min | Chars typed |

|---|---|---|---|---|---|---|---|
Posts come from a public, community‑maintained scrape of @elonmusk's X.com timeline, snapshot dated 2025‑08‑15. Source repo: MagdalenaRomaniecka/Decompiling‑MuskOS CSV · 56 MB.
Raw rows: 67,978. After dedupe on id and dropping rows with no createdAt: … usable posts spanning …. The dataset stops at 2025‑04‑13 — anything more recent isn't in this build. Re‑run make fetch && make against a newer snapshot and the entire site rebuilds.
From each row we keep:
- id — primary key for dedupe
- createdAt — ISO timestamp, UTC
- fullText — the post body (we use length(fullText) for character count)
- isReply, isRetweet, isQuote — boolean flags from the upstream scraper. A post is "original" iff all three are false.
- Engagement counts (likeCount, replyCount, etc.) are stored but the time‑budget model doesn't use them.
A session is a maximal run of posts where every consecutive gap is ≤ session_gap_min minutes. Default is 30 minutes. Larger gap → fewer, longer sessions; smaller gap → more, shorter sessions and probably a smaller estimate.
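The gap rule is simple enough to sketch in a few lines. This is a hypothetical Python version for illustration only — the site's actual grouping runs in SQL at build time and in the browser afterwards:

```python
def sessionize(timestamps, gap_min=30):
    """Group sorted epoch-second timestamps into sessions: a new session
    starts whenever the gap since the previous post exceeds gap_min minutes."""
    sessions = []
    for t in timestamps:
        if sessions and t - sessions[-1][-1] <= gap_min * 60:
            sessions[-1].append(t)   # within the gap: same session
        else:
            sessions.append([t])     # gap exceeded: new session
    return sessions
```

Widening the gap merges sessions exactly as described: two posts 35 minutes apart are two sessions at the 30-minute default, but one session at a 40-minute gap.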
Sessionization runs entirely in your browser, on per‑tweet timestamps shipped with the page. The session_gap_min slider re‑groups all 55k posts each time you move it (~30 ms).
Each post gets a minimum time cost based on its kind. With knob symbols matching the panel above:
```
cost(original) = think_original + chars/type_cps + send
cost(reply)    = read_context + think_reply + chars/type_cps + send
cost(quote)    = read_context + think_quote + chars/type_cps + send
cost(retweet)  = think_rt + send
```
type_cps = 3 matches mobile thumb‑typing; 5–7 is typical desktop.

Each session's time is built from three components:
- end − start. If you posted at 9:00 and again at 9:25, you were on the app for at least 25 minutes regardless of how fast you think.
- Σ cost(post). For a tight burst (10 replies in 30 seconds), span is small but typing+thinking is long; this term dominates.
- 2·edge_pad. Opening the app + glancing at notifications, applied once at the start and once at the end of every session, regardless of which of the other two terms is bigger.

The first two are alternatives — only the bigger one represents actual time. Edge pad is added unconditionally:
session_time = max( span , Σ cost(post) ) + 2·edge_pad
Daily total = sum of session_time for every session that started that day.
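Those two formulas can be sketched together. A minimal Python sketch under one assumption: each post is a (epoch_seconds, cost_seconds) pair with cost already computed by the per‑kind model above:

```python
from collections import defaultdict

def session_time(posts, edge_pad=30):
    """posts: list of (epoch_seconds, cost_seconds), sorted by time."""
    span = posts[-1][0] - posts[0][0]        # wall-clock evidence
    work = sum(c for _, c in posts)          # typing + thinking floor
    return max(span, work) + 2 * edge_pad    # pad added unconditionally

def daily_totals(sessions, edge_pad=30):
    """Sum session_time keyed by the epoch day each session started."""
    totals = defaultdict(float)
    for s in sessions:
        totals[s[0][0] // 86400] += session_time(s, edge_pad)
    return totals
```

Note how max() encodes the "alternatives" rule: a 25-minute span with 40 seconds of typing still counts as 25 minutes plus the pad.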
| Knob | Default | What it represents |
|---|---|---|
read_context | 8 s | Time to read the parent of a reply or quote |
think_original | 30 s | Composing an original take from a blank slate |
think_reply | 10 s | Reacting to something you've already read |
think_quote | 15 s | Quote‑tweeting requires more setup than a plain reply |
think_rt | 2 s | See it, tap RT, done |
type_cps | 3 cps | Typing speed, characters per second |
send | 3 s | Tap, confirm, occasional edit |
edge_pad | 30 s | Buffer at start and end of every session |
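Putting the per‑kind formulas and the table's defaults together — a sketch for illustration (knob names mirror the panel; the values are the assumed defaults above, not measurements):

```python
KNOBS = dict(read_context=8, think_original=30, think_reply=10,
             think_quote=15, think_rt=2, type_cps=3, send=3)

def cost(kind: str, chars: int, k=KNOBS) -> float:
    """Minimum seconds attributed to one post of the given kind."""
    typing = chars / k["type_cps"]
    if kind == "original":
        return k["think_original"] + typing + k["send"]
    if kind == "reply":
        return k["read_context"] + k["think_reply"] + typing + k["send"]
    if kind == "quote":
        return k["read_context"] + k["think_quote"] + typing + k["send"]
    return k["think_rt"] + k["send"]   # retweet: no reading or typing cost
```

For example, a 60‑character reply costs 8 + 10 + 60/3 + 3 = 41 seconds at the defaults.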
We assume a 16‑hour waking day. career_avg = total_estimated_seconds / (calendar_days × 16 × 3600). The 2024‑specific number uses just 2024's posts and 2024's calendar days. Calendar days, not active days — empty days still count against the denominator.
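The utilization formula is plain arithmetic; a sketch with hypothetical totals to make the units explicit:

```python
def career_avg(total_estimated_seconds: float, calendar_days: int,
               waking_hours: float = 16) -> float:
    """Fraction of the waking-hours budget consumed.
    Denominator uses calendar days, so empty days still count."""
    return total_estimated_seconds / (calendar_days * waking_hours * 3600)
```

At 16 waking hours there are 57,600 budget seconds per day, so e.g. 576,000 estimated seconds over 10 calendar days is 100% utilization.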
Net: the headline number is a lower bound. Make every assumption optimistic for him (high typing speed, low think time, small edge pad) — and the lower bound is still huge.
The whole pipeline lives in a Makefile:
```
make fetch   # re-download upstream dataset
make         # build/musk.db → build/*.json → site/index.html
make serve   # local server on :8765
```
Steps: sql/load.sql (CSV → DuckDB tables, dedupe) → sql/sessions.sql (initial sessionization) → sql/exports.sql (parquet archive + columnar JSON, including per‑tweet timestamps) → scripts/build_site.py (inject JSON into web/index.template.html).
All knobs above run client‑side — including session_gap_min, which re‑sessionizes from the per‑tweet data array on the fly. Move the sliders, every chart updates instantly.