Build Notes
How this site was built, what tools were used, and a running log of decisions. Each entry reflects something real.
How This Site Was Built
This site was developed using an AI-assisted workflow designed to move quickly from idea to working product.
Tools used
- Anthropic: Architectural reasoning, debugging, and implementation
- OpenAI: Copy editing, structural thinking, and prompt refinement
- GitHub: Version control and deployment management
- Vercel: Hosting and continuous deployment
Development pattern
1. Generate an initial working artifact quickly.
2. Step back and evaluate the architecture.
3. Rewrite sections intentionally rather than iterating blindly.
4. Use AI to assist with targeted improvements while maintaining control of the system design.
The goal wasn't just to generate code. It was to learn how to move effectively from idea to artifact to working product with AI in the loop.
Build Log
Site launched
Initial build with Next.js, TypeScript, and Tailwind CSS. Started from scratch rather than a template. Structure and copy written from a hiring manager's perspective.
Rebuilt + games added
Stripped the first version and rebuilt with a dark theme and cleaner layout. Added three embedded browser games: a Processing-style canvas, a calming Fishtank, and Shape Runner v2.
ANACOMICS
ANACOMICS is a comic series explaining Python packaging, environments, and the Anaconda ecosystem. Launched the Deployment Architecture module: 7 pages with custom image sequencing, serif typography, and an accessible panel layout.
Homepage redesign
Repositioned the site around product and systems thinking. Rewrote all copy, restructured every section, and replaced static lists with an accordion, a 2×2 card grid, a Toolkit, and this build log.
sebban.tech is live
Deployed the site end-to-end: code lives on GitHub, Vercel pulls from the master branch and rebuilds on every push, and Hostinger DNS routes sebban.tech to Vercel via A and CNAME records. Also shipped a full SEO pass: self-hosted fonts, dynamic OG image, favicon, sitemap, and robots.txt.
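The DNS setup described above can be sketched as zone records like the following. This is an illustrative fragment, not the live configuration: the A-record IP and CNAME target shown are Vercel's commonly documented defaults, and the real values should always be taken from the Vercel dashboard for the project.

```
; Illustrative Hostinger DNS zone entries routing sebban.tech to Vercel
; (values are assumptions based on Vercel's published defaults)
sebban.tech.        A      76.76.21.21            ; apex record points at Vercel
www.sebban.tech.    CNAME  cname.vercel-dns.com.  ; www subdomain aliases Vercel
```

With records like these in place, any push to the tracked branch on GitHub triggers a Vercel rebuild, and the domain resolves to the freshly deployed site without further DNS changes.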