
Build Log #2: Day 1-7 — From Idea to First Signals

By Picking Solutions Admin · January 27, 2026 · 10 min read

Day one started with a blank Next.js project and a vague ambition: build a market intelligence platform covering energy, finance, and digital assets. By day seven, the first live data was flowing. Here's how.

The Stack Decision (Day 1)

First prompt to DeepAgent: "I need a blog and market intelligence platform. Next.js 14 with App Router, Prisma ORM, PostgreSQL, Tailwind CSS, NextAuth for authentication. Set it up."

The stack wasn't debated. Next.js gives you SSR, API routes, and React in one package. Prisma handles the ORM with type safety. PostgreSQL because it's battle-tested. Tailwind because life's too short to write CSS from scratch.

What surprised me: DeepAgent didn't just scaffold — it made architectural decisions. App Router over Pages Router. Server components by default. Prisma with a sensible schema that I'd iterate on for the next four weeks.

By end of day one: authentication working, admin panel skeleton, first blog post created manually.

Data Architecture (Days 2-3)

The core problem: how do you ingest, normalise, and serve intelligence from dozens of sources across three verticals?

We settled on a unified model: IntelligenceItem. Every piece of external content — RSS articles, API data, social posts — gets normalised into the same shape: title, summary, source, vertical, quality score, approval status.
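As a rough sketch of what that unified shape looks like, here's a hypothetical TypeScript version of IntelligenceItem and its normalisation step. The field and function names are illustrative, not the actual Prisma schema:

```typescript
type Vertical = "ENERGY" | "FINANCE" | "DIGITAL_ASSETS";

// Illustrative shape only -- the real Prisma model has more fields.
interface IntelligenceItem {
  id: string;
  title: string;
  summary: string;
  sourceUrl: string;            // canonical URL, also used for deduplication
  sourceName: string;           // e.g. "Reuters", "CoinDesk"
  vertical: Vertical;           // inherited from the source, not the item
  qualityScore: number | null;  // LLM relevance score, filled in later
  approved: boolean;            // trusted sources flip this automatically
  publishedAt: Date;
}

// Normalising every external item into one shape means the rest of the
// platform (review queue, daily feed, digests) deals with a single type.
function normalise(
  raw: { title?: string; description?: string; link: string },
  source: { name: string; vertical: Vertical; autoApprove: boolean }
): IntelligenceItem {
  return {
    id: `${source.name}:${raw.link}`,
    title: (raw.title ?? "Untitled").trim(),
    summary: (raw.description ?? "").trim().slice(0, 280),
    sourceUrl: raw.link,
    sourceName: source.name,
    vertical: source.vertical,
    qualityScore: null,          // scoring arrives in a later week
    approved: source.autoApprove,
    publishedAt: new Date(),
  };
}
```

The point of the single type is that everything downstream is source-agnostic: the feed renderer never needs to know whether an item started life as RSS, an API payload, or a social post.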

The key decisions that emerged:

    • Source-level trust: each source has an autoApprove flag. Reuters gets auto-approved; random RSS feeds don't.
    • Vertical tagging: Energy, Finance, Digital Assets — mapped at the source level, not the item level.
    • Quality scoring: LLM-based relevance scoring (added later, but the schema was ready from day one).
    • Free vs Premium: sources flagged with displayInFreeFeed for the public daily feed.

By day 3, the Prisma schema was already at 15+ models: Posts, Categories, Users, IntelligenceItems, IntelligenceSources, LumenDigests, DailyEditions, PulseSignals. It felt like over-engineering at the time. It wasn't.

RSS Pipeline (Days 4-5)

The first data pipeline: RSS feed ingestion. Sources from Reuters, Bloomberg Terminal Blog, Carbon Brief, CoinDesk, The Block — about 30 feeds covering the three verticals.

Architecture: a scheduled task runs every 6 hours, hits each RSS feed, deduplicates by URL, normalises the content, and writes to the IntelligenceItem table. Trusted sources get auto-approved. Everything else goes to a review queue.
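The core of that pass can be sketched in a few lines. This is a minimal in-memory version with feed parsing and the database stubbed out; the names are assumptions, not the production code:

```typescript
interface FeedEntry { title: string; link: string; summary: string }
interface Source { name: string; autoApprove: boolean }

// One ingestion pass for one source: dedupe by URL against what's already
// stored, then route each new entry either straight to the approved feed
// (trusted source) or into the review queue.
function ingest(entries: FeedEntry[], source: Source, seenUrls: Set<string>) {
  const approved: FeedEntry[] = [];
  const reviewQueue: FeedEntry[] = [];
  for (const entry of entries) {
    if (seenUrls.has(entry.link)) continue; // already ingested, skip
    seenUrls.add(entry.link);
    (source.autoApprove ? approved : reviewQueue).push(entry);
  }
  return { approved, reviewQueue };
}
```

In production this runs per-source on the 6-hour schedule, with `seenUrls` backed by a unique constraint on the URL column rather than an in-memory set.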

The messy reality: RSS feeds are wildly inconsistent. Some give full content, some give one-line summaries. Date formats vary. Character encoding breaks. We wrote a normalisation layer that handles the common edge cases and logs the rest for manual review.
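One slice of that normalisation layer, as a hedged sketch: date handling. RSS dates arrive as RFC 822, ISO 8601, or occasionally garbage, so we parse what we can and log the rest rather than letting one bad feed kill the run. The function name and log mechanism here are illustrative:

```typescript
// Parse an RSS pubDate if possible; otherwise record it for manual review
// and fall back to ingestion time so the pipeline never hard-fails.
function parsePubDate(raw: string | undefined, log: string[] = []): Date {
  if (raw) {
    const parsed = new Date(raw); // JS Date handles RFC 822 and ISO 8601
    if (!Number.isNaN(parsed.getTime())) return parsed;
    log.push(`unparseable pubDate: "${raw}"`); // surfaced in the review queue
  }
  return new Date(); // fallback: treat it as published now
}
```

The same pattern (best-effort parse, log, safe fallback) applies to encoding fixes and truncated summaries.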

By day 5: ~200 intelligence items in the database, auto-refreshing every 6 hours.

The Daily Feed (Days 6-7)

With data flowing, we needed a way to present it. The /daily-feed page: a clean, three-column layout showing today's approved items grouped by vertical.

Design principle: less is more. Maximum 7 items per vertical per day. No infinite scroll. No algorithmic ranking. Just the most relevant items from trusted sources, presented cleanly with source attribution and one-line summaries.
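The selection rule is simple enough to sketch directly: group today's approved items by vertical and cap each column at seven, preserving source order since there's no ranking. Types and names here are illustrative, not the actual query:

```typescript
type Vertical = "ENERGY" | "FINANCE" | "DIGITAL_ASSETS";
interface FeedItem { title: string; vertical: Vertical; approved: boolean }

const MAX_PER_VERTICAL = 7;

// Build the three-column daily feed: approved items only, grouped by
// vertical, first seven per column, no algorithmic reordering.
function buildDailyFeed(items: FeedItem[]): Map<Vertical, FeedItem[]> {
  const columns = new Map<Vertical, FeedItem[]>();
  for (const item of items) {
    if (!item.approved) continue; // review-queue items never surface
    const column = columns.get(item.vertical) ?? [];
    if (column.length < MAX_PER_VERTICAL) column.push(item);
    columns.set(item.vertical, column);
  }
  return columns;
}
```

Keeping the cap in one constant made "maximum 7 per vertical" an explicit product decision in the code rather than a side effect of layout.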

The daily feed became the anchor of the platform — the thing you check once in the morning with your coffee. Not a firehose. A curated briefing.

What Worked (and Didn't) in Week 1

Worked:

    • DeepAgent for scaffolding — 80% of the boilerplate code was generated correctly on the first try.
    • Schema-first thinking — designing the Prisma models before writing any UI saved massive refactoring later.
    • Automated data pipelines early — the platform felt "alive" from day 5.

Didn't work:

    • CSS perfectionism — spent 4 hours on header gradients that nobody would notice. Learned to ship rough and iterate.
    • Overscoping the admin panel — built features I wouldn't use for weeks.
    • Not testing dark mode from day one — this haunted us later (see GLB-010).

End of Week 1 State

Models: 15+ Prisma models. Data: ~200 intelligence items from 30 RSS sources. Pages: Home, Blog, Daily Feed, Admin Dashboard, Categories, About, Contact. Authentication: NextAuth with email/password. Deployment: live on pickingsolutions.tech via Abacus.AI hosting.

Total AI-assisted development hours: ~35. Total manual override hours: ~10. The ratio would shift as complexity grew.

Next: Week 2 — Tools & Calculators →

Tags:

build-log, ai, architecture, deepagent
