📊 Data Processing · Reddit

AI Jobs Globe in One Day: Cowork for Research, Scraping, Geocoding, and PRDs

The author used Cowork to turn a vague market hunch into a full data product: a curated company list, scraped job listings, geocoded office locations, logo assets, a Supabase schema, and three PRD documents, before handing implementation to Claude Code.

★★★ Advanced · 1 day · April 27, 2026
u/Similar-Kangaroo-223

Built an AI jobs globe using Claude Cowork for research and data preparation

📋 Scenario

The author wanted to answer where AI jobs actually are, but first needed company data, job listings, geocoded offices, visual assets, and product requirements.

💬 Prompt

Help me turn the idea 'nobody knows where AI jobs actually are' into a concrete data product. Curate a master list of AI companies by tier, scrape current AI job listings, geocode office locations, collect company logos, design the database schema, write frontend/logic/data PRDs, and prepare a clean handoff package for Claude Code.

Expected Result

Cowork produced a 1,802-company dataset, scraped 15,352 jobs, geocoded 4,682 office locations, collected 1,594 logos, wrote frontend/logic/data PRDs, and created Supabase schema, migration, and import scripts. Claude Code then built the 3D globe frontend.
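
To make the geocoding step concrete, here is a minimal Python sketch of how a few thousand scraped office addresses could be turned into coordinates. The post doesn't say which tooling Cowork used; this sketch assumes the geopy library with the free Nominatim geocoder, and the sample addresses, field names, and user agent string are illustrative, not the author's actual setup.

```python
# Minimal geocoding sketch (assumed approach; the post doesn't specify tooling).
# pip install geopy
from geopy.geocoders import Nominatim
from geopy.extra.rate_limiter import RateLimiter

# Hypothetical sample of scraped office addresses.
office_addresses = [
    "1 Hacker Way, Menlo Park, CA",
    "221B Baker Street, London",
]

geolocator = Nominatim(user_agent="ai-jobs-globe")  # placeholder user agent
# Nominatim's usage policy caps requests at ~1/second; RateLimiter enforces this.
geocode = RateLimiter(geolocator.geocode, min_delay_seconds=1)

offices = []
for address in office_addresses:
    location = geocode(address)
    if location is None:
        continue  # skip addresses the geocoder can't resolve
    offices.append(
        {"address": address, "lat": location.latitude, "lng": location.longitude}
    )

print(offices)
```

At one request per second, geocoding ~4,700 offices this way would take well over an hour, so a paid geocoding API would be the faster option at this scale.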

📝 Original Post · 2026-04-27

Cowork helped turn a vague AI-jobs hunch into a product plan: it curated 1,802 AI companies, scraped 15,352 jobs, geocoded 4,682 office locations, gathered 1,594 company logos, wrote a three-document PRD, created the database schema, ran migrations, and imported 21K+ rows. The finished data package was then handed off to Claude Code for implementation.
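
As a sketch of what the generated import scripts might look like, here is a hedged Python example that batch-inserts job rows into Supabase with the supabase-py client. The post only says a schema, migration, and import scripts were created, not how they work; the jobs table name, column names, environment variable names, and batch size below are all assumptions.

```python
# Batch-import sketch (assumed; not the author's actual script).
# pip install supabase
import os
from supabase import create_client

# Credentials from the environment; variable names are placeholders.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

# Hypothetical row shape for a simple jobs table keyed to companies and offices.
jobs = [
    {"title": "ML Engineer", "company_id": 42, "office_id": 7},
    {"title": "Research Scientist", "company_id": 42, "office_id": 9},
]

BATCH = 500  # chunk inserts so each request stays small
for i in range(0, len(jobs), BATCH):
    supabase.table("jobs").insert(jobs[i : i + BATCH]).execute()
```

Chunked inserts like this keep each HTTP request under payload limits, which matters when importing the 21K+ rows the post mentions.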