AI-Assisted Product, Systems, and Full-Stack Builder

Building AI-assisted tools that feel intentional instead of theoretical.

I build internal tools, assistant systems, and Linux workflows, with AI as a practical multiplier.

I care most about tools that are actually usable: fast enough, clear enough, and maintainable enough to survive beyond the first demo.

Featured work

Four case studies that show how I use AI differently depending on the problem.

One project uses AI directly in the product. Another uses AI to manage a complex internal software roadmap. A third turns Linux customization into reproducible tooling. A fourth turns dockerized local models into a voice assistant with streaming speech and native runtime clients.

Shipped project

NewForm AI

Featured

A scheduled reporting app that turns ad-platform performance data into readable, AI-generated insight reports with web, email, and PDF delivery paths.

  • Prompting is limited to the metrics the user selected
  • HTML, email, and PDF outputs share the same reporting pipeline
  • Recurring schedules and manual reruns are both supported
Next.js 16 · TypeScript · Tailwind CSS
Read case study
Active internal product

TripInvoiceTracker

Featured

An internal transportation workflow app focused on centralizing trips, invoicing, payments, admin controls, and future intake and deployment planning.

  • A single trip entry can feed multiple downstream documents
  • Role-aware behavior for admin, office, and dispatch users
  • Backup and restore workflow documented for migration and recovery
Node.js · Express · MongoDB
Read case study
Active systems toolkit

CachyOS Config

Featured

A Linux desktop configuration repo that evolved into a reproducible restore kit with custom Waybar modules, popup tooling, and scoped backup management.

  • Custom control center, system monitor, clock, calendar, and AI-usage Waybar modules
  • bootstrap.sh restores curated packages and config targets
  • Backup manager GUI supports whole-app and component-level backup selection
Python · Bash · Waybar
Read case study
Active AI voice assistant

Jiggumps

Featured

A private AI voice assistant built around dockerized local models, streaming speech, and native always-on clients — turning Ollama into something you can actually talk to.

  • Docker Compose brings up Ollama, Redis, SearXNG, and the FastAPI assistant backend as one local stack
  • Native shell clients on Windows, macOS, and Linux share one always-on voice runtime
  • Whisper, Piper, wake-word detection, and streaming turn handling work together as one speech pipeline
Python · FastAPI · Ollama
Read case study