I’ve been using Claude daily for months. Hundreds of conversations. Thousands of messages. Code reviews, brainstorming sessions, debugging rabbit holes, late-night side project ideas. All sitting inside a JSON export that nobody ever looks at.
So I did something weird with it. I turned it into a city.
AI Town is a web app that takes your Claude conversation export and transforms it into a living, breathing pixel-art town. Every conversation becomes a building. Every message becomes a person walking the streets. The more you’ve chatted, the bigger your city gets.
And here’s the kicker: I built the entire thing using Claude Code as my coding partner.
The Idea
It started with a simple question: what does my relationship with AI actually look like?
I had exported my Claude data out of curiosity and was staring at a conversations.json file with hundreds of entries. Some conversations had 3 messages. Others had 300+. Some were from January 2024, others from last week. There was a shape to it, a pattern of how I use AI, but raw JSON doesn’t tell a story.
What if it could, though? What if every conversation was a building, sized by how deep the conversation went? What if a quick question was a small house, and a week-long debugging session was a skyscraper? What if little pixel characters walked around, representing the messages exchanged?
That’s AI Town.
What It Actually Does
The flow is dead simple:
- Go to claude.ai, navigate to Settings > Privacy, and export your data as a ZIP
- Drop that ZIP onto aitown-seven.vercel.app/upload
- Pick a username
- Watch your city generate in real-time
Your conversations become buildings. A quick 5-message exchange? That’s a small house. A 50-message deep dive? Medium building. A 200+ message marathon coding session? That’s a tower, and it dominates the skyline.
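The size mapping can be sketched roughly like this. The exact cutoffs are my guesses from the examples above, not the app's real thresholds:

```typescript
// Assumed thresholds, inferred from the examples in the text
type BuildingType = "house" | "medium" | "large" | "tower";

function buildingTypeFor(messageCount: number): BuildingType {
  if (messageCount >= 200) return "tower";
  if (messageCount >= 100) return "large";
  if (messageCount >= 30) return "medium";
  return "house";
}
```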
Each building gets a unique color derived from the conversation’s UUID (hashed into a hue value). The result is a colorful, varied cityscape that’s unique to every user.
Once your town is live, you can share it with anyone. They’ll see your buildings, your stats, little pixel people wandering around. But they won’t see your conversation titles or any content. More on that later.
The Tech Stack
Here’s what powers AI Town:
- Next.js 16 with App Router (yes, 16, with React 19)
- TypeScript everywhere
- HTML Canvas for the pixel-art rendering
- Tailwind CSS 4 for the UI
- Cloudflare R2 for storage (S3-compatible, globally distributed)
- Vercel for deployment
- JSZip for client-side ZIP parsing
- shadcn/ui for UI components
No database. No auth system. No user accounts. Just a username namespace backed by JSON files in object storage.
Building It With Claude Code
I want to be upfront about this: Claude Code was my primary development partner for this project. Not in the “I asked it to write a function” way. In the “I described what I wanted and iterated on the implementation together” way.
Here’s what that actually looked like in practice.
The Canvas Renderer
The pixel-art rendering engine was the most complex part of the project. It’s a custom Canvas 2D renderer that draws an entire city tile by tile, 60 frames per second.
I started by describing what I wanted: a top-down pixel city with buildings, roads, trees, street lamps, and little walking characters. Claude Code helped me architect the rendering pipeline as a layered system:
- Sky layer - the dark navy background
- Ground tiles - grass and roads, with roads automatically placed adjacent to buildings
- Environmental details - trees (8% spawn chance per grass tile) and street lamps on roads
- Buildings - walls, roofs, flickering windows, doors
- Peeps - the wandering NPCs drawn last so they appear in front of everything
Every environmental detail is generated using seeded randomness based on tile coordinates. This means the same town looks identical every time you load it, no server-side rendering state needed. The trees are always in the same spots. The lamps are always on the same roads.
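A minimal sketch of that idea; the hash constants here are illustrative, not the app's actual function:

```typescript
// Deterministic pseudo-random value in [0, 1) from tile coordinates.
// Math.imul keeps the multiplications in 32-bit integer space.
function tileHash(x: number, y: number): number {
  let h = Math.imul(x, 374761393) + Math.imul(y, 668265263);
  h = Math.imul(h ^ (h >>> 13), 1274126177);
  h ^= h >>> 16;
  return (h >>> 0) / 4294967296;
}

// e.g. an 8% tree spawn chance that never changes between loads
const hasTree = (x: number, y: number) => tileHash(x, y) < 0.08;
```

Because the value depends only on coordinates, nothing about the decoration layer needs to be stored or synced.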
The window flickering was a fun detail. Each window uses a sine-based flicker function that makes some windows glow gold (#ffd700) while others stay dark, and the pattern shifts subtly over time. It gives the whole town a sense of life without any complex animation system.
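A sine-based flicker along those lines might look like this; the phase constants and threshold are made up for illustration:

```typescript
// Each window derives a phase from its grid position, so the flicker
// pattern is stable per window but shifts as the frame count advances.
function windowLit(wx: number, wy: number, frame: number): boolean {
  const phase = (wx * 7 + wy * 13) % 17;
  return Math.sin(frame / 40 + phase) > 0.3; // lit windows get #ffd700
}
```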
The Spiral Layout Algorithm
When you have 50 conversations that need to become buildings of different sizes, how do you arrange them so they look like an organic city rather than a spreadsheet?
The answer was a spiral placement algorithm. Buildings are placed starting from the center of the grid and spiraling outward:
```typescript
// Simplified version of the spiral generator
function* spiralPositions(centerX: number, centerY: number) {
  yield { x: centerX, y: centerY };
  // Walk right, then down, then left, then up;
  // each layer adds one more ring around the center
  const directions = [[1, 0], [0, 1], [-1, 0], [0, -1]];
  let layer = 1;
  while (true) {
    // start at the top-left corner of this ring
    let x = centerX - layer;
    let y = centerY - layer;
    for (const [dx, dy] of directions) {
      for (let step = 0; step < layer * 2; step++) {
        yield { x, y };
        x += dx;
        y += dy;
      }
    }
    layer++;
  }
}
```
Each building checks if it can fit at the current position without overlapping other buildings, with a 1-tile gap between everything (which becomes the roads). Larger buildings (towers, large) get placed first since they need more room. The result is a city that grows naturally from the center, with bigger buildings near the core and smaller ones on the outskirts.
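The overlap test itself is just axis-aligned rectangle separation with the gap folded in. A sketch, with field names of my own choosing:

```typescript
type Rect = { x: number; y: number; w: number; h: number };

// True if `candidate` sits at least `gap` tiles away from every placed footprint
function fits(candidate: Rect, placed: Rect[], gap = 1): boolean {
  return placed.every(p =>
    candidate.x + candidate.w + gap <= p.x ||
    p.x + p.w + gap <= candidate.x ||
    candidate.y + candidate.h + gap <= p.y ||
    p.y + p.h + gap <= candidate.y
  );
}
```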
The Peep AI System
Every building spawns little pixel people based on its message count. One peep per 5 messages. They wander around their home building with a simple state machine:
- Idle state: Stand still for 30-150 frames, with a 5% chance per cycle to show a speech bubble ("…")
- Walking state: Pick a random point within a 3-tile radius of home and walk toward it at a speed between 0.3 and 0.7 tiles per frame
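The two-state machine can be sketched like this; the type and field names are assumptions, not the app's real code:

```typescript
type Peep = {
  x: number; y: number;
  state: "idle" | "walking";
  timer: number;                      // frames left in the idle state
  target: { x: number; y: number };
  speed: number;                      // 0.3-0.7 in the app
};

function tickPeep(peep: Peep, home: { x: number; y: number }): void {
  if (peep.state === "idle") {
    if (--peep.timer <= 0) {
      // pick a random point within a 3-tile radius of home
      peep.target = {
        x: home.x + (Math.random() * 6 - 3),
        y: home.y + (Math.random() * 6 - 3),
      };
      peep.state = "walking";
    }
  } else {
    const dx = peep.target.x - peep.x;
    const dy = peep.target.y - peep.y;
    const dist = Math.hypot(dx, dy);
    if (dist <= peep.speed) {
      // arrived: snap to the target and rest for 30-150 frames
      peep.x = peep.target.x;
      peep.y = peep.target.y;
      peep.state = "idle";
      peep.timer = 30 + Math.floor(Math.random() * 121);
    } else {
      peep.x += (dx / dist) * peep.speed;
      peep.y += (dy / dist) * peep.speed;
    }
  }
}
```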
The peeps are colored based on whether they represent human messages (warm skin tone, #e8a87c) or assistant messages (cool blue, #7cb8e8). When you look at a busy building, you can literally see the back-and-forth of conversation happening as warm and cool colored characters mill around.
Walking animation is just a sine wave applied to leg position. Simple, but it sells the illusion at 16px tile scale.
Camera System
The canvas supports two modes:
Interactive mode (on town pages): Click and drag to pan, scroll wheel to zoom (1x to 4x), click buildings to see their stats.
Cinematic mode (on the landing page): The camera slowly pans across the town with a subtle vertical bob, like a drone flyover. It ping-pongs left to right, giving visitors a preview of what their town could look like.
Screen-to-world coordinate conversion handles the math for click detection. When you click a building, it converts your screen pixel coordinates back to world tile coordinates, accounting for camera offset and zoom level.
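Assuming the camera stores a world-space offset plus a zoom factor, the conversion is two steps: undo the zoom, then add the offset and divide by tile size. A sketch:

```typescript
const TILE = 16; // px per tile at 1x zoom

// Hypothetical camera shape; the real renderer's fields may differ
type Camera = { x: number; y: number; zoom: number };

function screenToTile(sx: number, sy: number, cam: Camera) {
  const wx = sx / cam.zoom + cam.x; // world-space pixels
  const wy = sy / cam.zoom + cam.y;
  return { tx: Math.floor(wx / TILE), ty: Math.floor(wy / TILE) };
}
```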
Privacy: The Part I’m Most Proud Of
Here’s the thing about conversation data: it’s personal. I didn’t want to build something that requires people to upload their actual conversations to a server.
So I didn’t.
Your conversation content never leaves your browser. The ZIP file is parsed entirely client-side using JSZip. The app extracts only metadata: message counts, timestamps (reduced to month/year granularity), and a hashed color value derived from each conversation’s UUID.
What gets sent to the server:
```json
{
  "messageCount": 245,
  "humanMessageCount": 120,
  "assistantMessageCount": 125,
  "firstActive": "2024-01",
  "lastActive": "2024-12",
  "buildingType": "tower",
  "colorSeed": 32
}
```
What does NOT get sent to the server:
- Conversation titles
- Conversation UUIDs
- Any message content
- Any personally identifiable information
Conversation titles (which could contain sensitive project names or topics) are stored exclusively in the browser’s localStorage. Only the person who created the town can see them. If they clear their browser data, the titles are gone forever. The server never knew them.
The API also validates everything coming in with strict field whitelisting. It rejects any unexpected properties, validates date formats, checks numeric ranges, and sanitizes building types against the known enum. No room for injection of unintended data.
ZIP uploads have safety limits too: 100MB compressed, 50MB uncompressed, max 100 files. This prevents decompression bombs and memory exhaustion.
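A whitelist-style validator in that spirit might look like this. The field set matches the payload shown above, but the building-type enum and exact rules are my guesses:

```typescript
const BUILDING_TYPES = new Set(["house", "medium", "large", "tower"]);
const ALLOWED_KEYS = new Set([
  "messageCount", "humanMessageCount", "assistantMessageCount",
  "firstActive", "lastActive", "buildingType", "colorSeed",
]);
const MONTH = /^\d{4}-\d{2}$/; // e.g. "2024-01"

function validateBuilding(input: Record<string, unknown>): boolean {
  // reject any unexpected property outright
  if (!Object.keys(input).every(k => ALLOWED_KEYS.has(k))) return false;
  return (
    Number.isInteger(input.messageCount) && (input.messageCount as number) >= 0 &&
    MONTH.test(String(input.firstActive)) &&
    MONTH.test(String(input.lastActive)) &&
    BUILDING_TYPES.has(input.buildingType as string) &&
    Number.isInteger(input.colorSeed) &&
    (input.colorSeed as number) >= 0 && (input.colorSeed as number) < 360
  );
}
```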
The Username System
There’s no auth. No passwords. No OAuth. You pick a username, we check if it’s available via a HEAD request to R2, and if it is, it’s yours.
GET /api/towns/username/exists → 200 (taken) or 404 (available)
POST /api/towns/username → Creates the town (409 if race condition)
The username regex is strict: ^[a-z0-9][a-z0-9-]{0,28}[a-z0-9]$. Lowercase alphanumeric, dashes allowed in the middle, 2-30 characters. This prevents directory traversal attacks and keeps URLs clean.
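The regex is easy to sanity-check:

```typescript
const USERNAME_RE = /^[a-z0-9][a-z0-9-]{0,28}[a-z0-9]$/;

USERNAME_RE.test("pixel-mayor"); // valid
USERNAME_RE.test("a");           // rejected: minimum length is 2
USERNAME_RE.test("-nope");       // rejected: can't start with a dash
USERNAME_RE.test("../../etc");   // rejected: path characters never match
```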
Is this “secure” in the traditional sense? No. Someone could theoretically squat usernames. But for a fun visualization tool, the friction of requiring account creation would kill the experience. The goal was zero-friction: upload, name, done.
Dynamic OG Images
When someone shares their town on Twitter or Discord, I wanted it to look good. Not a generic preview. Their town.
Next.js has a brilliant feature for this: opengraph-image.tsx files. You export a React component that returns an ImageResponse, and Next.js generates the image server-side on demand.
For per-town OG images, the component fetches the town data from R2, then renders:
- The username in large gold text
- Four stat cards (conversations, messages, buildings, average messages per conversation)
- The date range of their Claude usage
- A miniature skyline of their actual buildings, colored by their real color seeds
The landing page gets its own OG image with a full-width skyline and the tagline.
One gotcha I ran into: the getTown() call in the OG image route needs a try/catch. If R2 throws an error (network timeout, rate limit, whatever), the entire image generation fails silently and social platforms show nothing. Wrapping it in try/catch with a fallback “Town not found” image fixed that.
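The guard itself is only a few lines. Here `getTown` is a stub forced to throw, standing in for the real R2 fetch:

```typescript
// Stub standing in for the real R2 fetch; throws to simulate a timeout
async function getTown(username: string): Promise<{ username: string }> {
  throw new Error(`R2 timeout fetching ${username}`);
}

// Returning null lets the route render a "Town not found" fallback image
// instead of failing the whole image generation
async function getTownOrNull(username: string) {
  try {
    return await getTown(username);
  } catch {
    return null;
  }
}
```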
The Landing Page
The landing page needed to sell the concept instantly. When someone lands on aitown-seven.vercel.app, they should understand what this is within 3 seconds.
The hero section features a live canvas running in cinematic mode, rendering either the creator’s actual town or sample data as a fallback. Floating pixel particles drift across the screen. The headline uses the Press Start 2P pixel font with a shimmer animation.
Below that, a simple 3-step “How It Works” section, a privacy callout (because people rightfully care about that), and a sponsors section.
The retro aesthetic is consistent throughout: navy backgrounds, gold accents, scanline overlays, vignette effects, noise grain texture. It all reinforces the pixel-art theme without being obnoxious about it.
Storage Architecture
I chose Cloudflare R2 over traditional databases for a few reasons:
- No schema needed. Each town is just a JSON blob at towns/{username}/town.json
- Globally distributed. R2 has edge locations everywhere
- S3-compatible. The AWS SDK works with zero modifications
- Cost-effective. Free egress, pennies for storage
- Simple operations. GET, PUT, HEAD. That’s the entire API surface
The Vercel API routes add a caching layer on top: Cache-Control: public, s-maxage=60, stale-while-revalidate=300. So after the first request, the town data is served from Vercel’s edge cache for 60 seconds, with stale-while-revalidate extending that to 5 minutes.
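In a Next.js route handler, that cache policy is just a header on the response. A sketch with the data fetch stubbed out (the real route exports `GET` from its route file):

```typescript
// Stub standing in for the real R2 lookup of towns/{username}/town.json
async function fetchTownJson(username: string): Promise<object | null> {
  return username === "demo" ? { username, buildings: [] } : null;
}

async function GET(
  _req: Request,
  { params }: { params: { username: string } }
) {
  const town = await fetchTownJson(params.username);
  if (!town) return new Response("Not found", { status: 404 });
  return Response.json(town, {
    headers: {
      "Cache-Control": "public, s-maxage=60, stale-while-revalidate=300",
    },
  });
}
```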
For a read-heavy app where data changes are infrequent (you create your town once), this is plenty. No connection pools, no ORM, no migrations. Just JSON in a bucket.
Interesting Technical Details
UUID to Color Hashing
Each conversation has a UUID. I hash it into a hue value (0-360) for deterministic color assignment:
```typescript
function uuidToColorSeed(uuid: string): number {
  let hash = 0;
  for (let i = 0; i < uuid.length; i++) {
    hash = (hash << 5) - hash + uuid.charCodeAt(i);
    hash |= 0; // Convert to 32-bit integer
  }
  return Math.abs(hash) % 360;
}
```
This means the same conversation always gets the same building color, and the distribution across the hue wheel is roughly uniform. Your town’s color palette is determined by your conversation IDs, making it truly unique.
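Turning the seed into an actual fill color is a one-liner; the saturation and lightness values here are arbitrary picks, not the app's:

```typescript
// Map a hue seed (0-359) onto a CSS color string
const buildingColor = (seed: number) => `hsl(${seed}, 70%, 55%)`;
```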
Building Animation
When your town first generates, buildings don’t just appear. They bounce in one by one, 100ms apart, using a bounce easing function. Combined with an auto-panning camera that follows the spiral outward, it creates this satisfying feeling of watching your city grow from nothing.
Canvas Optimization
Rendering a potentially large grid at 60fps requires some care:
- Tile culling: Only tiles visible within the current viewport are rendered. No point drawing what’s off-screen
- Seeded randomness: Environmental details (trees, lamps) are calculated from tile coordinates, not stored in memory
- Single draw pass: All rendering happens in one requestAnimationFrame callback, no intermediate buffers
- Device pixel ratio: The canvas accounts for high-DPI displays (Retina, etc.) for crisp rendering
Lessons from Building With Claude Code
A few things I learned from using Claude Code as my primary development tool:
Describe the “what”, iterate on the “how”. I got the best results when I described what I wanted the end result to look and feel like, rather than specifying implementation details upfront. “I want buildings that flicker their windows like a real city at night” produced better code than “implement a sine-based animation for window opacity.”
Read code before asking for changes. Claude Code is significantly better at modifying code it has already seen in context. When I pointed it to a file and said “improve this,” the results were targeted and appropriate. When I described changes without context, they sometimes missed existing patterns.
Let it handle the boilerplate, focus on the creative. Setting up R2 clients, API route validation, TypeScript interfaces, Canvas boilerplate? Let Claude Code handle it. Where I spent my time was on the creative decisions: what should the aesthetic look like? How should the buildings be arranged? What details make the town feel alive?
Ship fast, iterate. The first version of AI Town went from idea to deployed in a weekend. It wasn’t perfect. The OG images were basic, the landing page was minimal, the peep AI was simple. But it worked. Every session after that was an improvement pass, and Claude Code made those iterations fast.
What’s Next
There are a bunch of ideas I want to explore:
- Multi-provider support: Not just Claude, but ChatGPT, Gemini, and other exports
- Town comparison: Side-by-side visualization of different users’ AI usage patterns
- Time-lapse view: Watch your town grow chronologically as conversations happened over months
- More building variety: Different architectural styles based on conversation topics or length patterns
- Sound: Ambient chiptune background music and click sound effects
Try It
If you use Claude (and if you’re reading HackerNoon, you probably do), go export your data and see what your town looks like.
- Head to claude.ai/settings, go to Privacy, click Export
- Wait for the email with your ZIP
- Go to aitown-seven.vercel.app and drop it in
- Pick a username and watch it build
Your data stays private. Your town is public. Share it, compare it with friends, see who has the tallest towers.
The whole project is open source. Check it out, fork it, build on it. And if you build something cool with Claude Code, I’d love to hear about it.
Built with Next.js 16, Canvas 2D, Cloudflare R2, and a lot of late-night conversations with Claude Code.