Portfolio Case Study · 2024–2025

Building GitPilot
from the Ground Up

A desktop app that solves the real frustration of juggling multiple Git identities — with an on-device AI that writes your commit messages for you, no cloud required.

Role
Solo Engineer — Full Stack
Timeline
2024 Q4 – 2025 Q1
Platforms
macOS · Windows
Status
Complete · Open Source
Tauri 2 · Rust · React + TypeScript · libgit2 · llama.cpp · SQLite · Zustand · GGUF Models
01 — Problem Statement

The identity confusion every developer ignores until it bites them.

Every developer with more than one job, client, or side project faces the same quiet tax: committing to a work repo under your personal email, or worse — pushing code under a GitHub handle that exposes you as the author of that questionable open-source project. It's embarrassing, hard to undo, and entirely preventable.

The existing solutions are inadequate. Git's conditional includes require you to structure your entire filesystem around .gitconfig rules that are easy to forget and impossible to audit at a glance. GUI clients like Sourcetree or GitKraken don't model identities at all — they just expose whatever git config returns, silently.

Meanwhile, AI commit message tools require either piping diffs to OpenAI or relying on Ollama being installed, running, and correctly configured as a separate daemon. There was no single tool that combined identity management, visual Git operations, and local AI — without external runtime dependencies.

The goal: a desktop app where switching Git context is zero-friction, AI commit messages are private and always available, and your entire repo landscape is visible and auditable at a glance.

02 — Solution Overview

One app. Identity-aware, AI-powered, entirely offline.

GitPilot is a Tauri 2 desktop application — a Rust backend exposing typed commands to a React/TypeScript frontend. The Rust layer handles all Git operations via libgit2 (a battle-tested library used across the Git hosting ecosystem), stores identity configs in SQLite, and manages a bundled llama-server sidecar for on-device AI inference.

The frontend is a dark-themed, animation-first React app using Zustand for state management and inline styles for rendering consistency across platforms. The result is a polished, identity-aware Git client that surfaces your 21 repos across two identities at a glance — with real-time status badges, mismatch warnings, and one-click AI commit generation.

GitPilot v0.1.0 — Dashboard
GitPilot main dashboard showing 21 repositories grouped across two Git identities
Fig 1. — Main dashboard: repos grouped by identity with live status. Work identity (ADCB) shows 17 repos; personal identity shows 4 including SpiderMac, android, krishnaraj-portfolio, and gitpilot itself.
03 — Technical Deep Dives

Architecture decisions that mattered.

Rust Backend: Tauri Commands + libgit2

The entire Git layer is implemented in Rust using the git2 crate, a high-level binding to libgit2. This gave GitPilot access to the full Git object model — commits, refs, diffs, index operations, remote transport — without shelling out to the git binary, which would've been brittle and path-dependent on Windows.

Tauri's command system bridges the Rust/JS boundary with minimal serialization boilerplate: each Git operation is a typed #[tauri::command] function whose return value serde serializes to structured JSON for the frontend. Identity configs and path rules persist in SQLite via the rusqlite crate, with migrations managed manually for simplicity and auditability.

src-tauri/src/commands/git.rs
use std::path::Path;
use git2::Repository;

// Stage a file and return updated diff stats
#[tauri::command]
pub async fn stage_file(
    repo_path: String,
    file_path: String,
) -> Result<DiffStats, String> {
    // git2 errors don't convert to String automatically,
    // so each fallible call maps the error explicitly.
    let repo = Repository::open(&repo_path)
        .map_err(|e| e.to_string())?;
    let mut index = repo.index().map_err(|e| e.to_string())?;
    index
        .add_path(Path::new(&file_path))
        .map_err(|e| e.to_string())?;
    index.write().map_err(|e| e.to_string())?;
    get_diff_stats(&repo)
}
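The "migrations managed manually" approach is nothing more exotic than an ordered list of SQL statements gated by a stored schema version. A std-only sketch of the version gate (the statements and the `user_version` gating are illustrative, not GitPilot's actual schema):

```rust
/// Ordered migration list: each statement is applied at most once,
/// gated by a stored schema version (e.g. SQLite's `user_version`
/// pragma). Statement contents here are illustrative.
const MIGRATIONS: &[&str] = &[
    "CREATE TABLE identities (id INTEGER PRIMARY KEY, name TEXT, email TEXT)",
    "ALTER TABLE identities ADD COLUMN signing_key TEXT",
];

/// Statements still pending when the stored version is `current`.
/// The caller runs each in order, then bumps the stored version.
fn pending_migrations(current: usize) -> &'static [&'static str] {
    &MIGRATIONS[current.min(MIGRATIONS.len())..]
}
```

Keeping migrations as plain indexed SQL makes the whole schema history auditable in one file — the trade-off the paragraph above describes.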

Local AI Pipeline: llama-server as Tauri Sidecar

The AI commit message feature was the most architecturally novel part of the project. Rather than requiring Ollama, the app bundles a compiled llama-server binary (from llama.cpp) as a Tauri sidecar — a child process the app owns, starts on launch, and terminates on exit.

On commit generation, the Rust backend POSTs the staged diff to the sidecar's http://localhost:8080/v1/chat/completions endpoint with stream: true. The SSE response is piped back to the frontend via Tauri events, enabling real-time token streaming in the UI. The active model path is resolved at runtime from SQLite, so users can switch models without restarting the app.
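The streaming loop boils down to scanning each SSE line for a token delta and forwarding it as a Tauri event. A std-only sketch of the per-line step (the helper name is mine, and the naive string search is for illustration — real code would use a JSON library and handle escaped quotes):

```rust
/// Pull the streamed token out of one SSE line from the sidecar's
/// OpenAI-compatible endpoint. Lines look like
///   data: {"choices":[{"delta":{"content":"fix"}}]}
/// and the stream terminates with `data: [DONE]`.
fn sse_token(line: &str) -> Option<String> {
    let payload = line.strip_prefix("data: ")?.trim();
    if payload == "[DONE]" {
        return None; // end-of-stream sentinel
    }
    // Naive extraction of the "content" field from the delta object.
    let key = "\"content\":\"";
    let start = payload.find(key)? + key.len();
    let end = start + payload[start..].find('"')?;
    Some(payload[start..end].to_string())
}
```

Each extracted token is then emitted to the frontend, which appends it to the commit message textarea as it arrives.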

AI Model Panel
GitPilot AI model selection panel showing Qwen 2.5 Coder models
Fig 2. — AI Model Panel: Qwen 2.5 Coder models from 0.5B to 7B, downloaded on demand. The 3B model is active (marked Ready) with a one-click Use/switch flow.

Identity Matching Engine

Each identity stores an ordered list of glob patterns (e.g., ~/work/adcb/**). On repo scan, the engine walks the pattern list in priority order and assigns the first match. SQLite caches the resolution to avoid rescanning on every launch. When a repo's actual git config user.email doesn't match the assigned identity, the dashboard surfaces a mismatch badge — immediately actionable with a one-click fix.

The dashboard groups repos by identity section headers — as shown in Fig 1, "Krishnaraj Nair" (work identity, 17 repos) and "Krishnaraj Rajendran Nair" (personal identity, 4 repos) are rendered as distinct visual groups. The identity matching runs on every rescan and on filesystem watcher events.
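A std-only sketch of the first-match walk (it handles only the `prefix/**` form with a supplied home directory; the real engine uses full glob matching, and the helper name is illustrative):

```rust
/// Resolve a repo path to the first identity whose pattern matches.
/// Patterns are checked in priority order; only the `prefix/**`
/// form is handled here, with `~` expanded to `home`.
fn resolve_identity<'a>(
    repo_path: &str,
    home: &str,
    patterns: &[(&'a str, &str)], // (identity, pattern), priority order
) -> Option<&'a str> {
    patterns.iter().find_map(|(identity, pattern)| {
        let expanded = pattern.replace('~', home);
        let prefix = expanded.strip_suffix("/**").unwrap_or(&expanded);
        repo_path.starts_with(prefix).then_some(*identity)
    })
}
```

First-match semantics make priority explicit: specific work patterns sit above a catch-all personal pattern, so a misordered list fails visibly rather than silently.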

Real-Time Features: Filesystem Watcher + Animated UI

The notify crate provides cross-platform filesystem events. GitPilot watches repo root directories and emits debounced Tauri events (250ms window) to prevent UI thrash on large saves or build outputs. The React side subscribes via listen() and triggers targeted Zustand state updates — only the affected repo card re-renders.

UI animations — sync modals, progress bars, staging transitions — are CSS keyframe animations triggered by state transitions. The frosted glass panels use backdrop-filter: blur() composited on Tauri's transparent window layer, requiring explicit WebView compositor flags per platform.

src-tauri/src/watcher.rs
// Debounced filesystem watcher → Tauri event (250ms window)
let (tx, rx) = channel();
let mut watcher = RecommendedWatcher::new(tx, Config::default())?;
watcher.watch(Path::new(&path), RecursiveMode::NonRecursive)?;

while let Ok(event) = rx.recv() {
    let mut paths = event.map(|e| e.paths).unwrap_or_default();
    // Absorb follow-up events arriving inside the debounce window
    while let Ok(Ok(e)) = rx.recv_timeout(Duration::from_millis(250)) {
        paths.extend(e.paths);
    }
    // Tauri 2 renamed v1's `emit_all` to `emit`
    app_handle.emit("repo:changed", &paths)?;
}
04 — Key Challenges & Solutions

The problems worth documenting.

Every non-trivial project has a set of moments where the standard approach breaks. Here are the four that cost the most time and taught the most.

⚠ Challenge

libgit2 diff callback ownership errors

Rust's borrow checker rejected attempts to capture mutable state inside libgit2's foreach callbacks, causing compile failures when accumulating staged diff summaries.

✓ Solution

Refactored so the callback's only job is pushing lightweight delta records into a pre-allocated Vec; all aggregation happens after the traversal returns, outside the callback closure entirely, so no summary state is mutably captured across the FFI boundary.
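A std-only sketch of that collect-then-process pattern (the real code drives git2's Diff::foreach; the line-origin markers and helper name here are illustrative stand-ins):

```rust
/// Phase 1: the "callback" only records a lightweight marker per
/// diff line. Phase 2: aggregation runs after the traversal, so no
/// aggregate state is mutably borrowed inside the closure.
fn count_line_changes(diff_lines: &[&str]) -> (usize, usize) {
    let mut origins: Vec<char> = Vec::with_capacity(diff_lines.len());
    diff_lines
        .iter()
        .filter_map(|l| l.chars().next())
        .for_each(|origin| origins.push(origin)); // collect only

    // Process outside the "callback":
    let added = origins.iter().filter(|&&c| c == '+').count();
    let removed = origins.iter().filter(|&&c| c == '-').count();
    (added, removed)
}
```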

⚠ Challenge

OpenSSL cross-compilation on Windows

libgit2's HTTPS transport depends on OpenSSL, which has a notoriously painful cross-compilation story on Windows — link errors, missing headers, wrong architectures.

✓ Solution

Switched to the vendored feature flag in openssl-sys, which compiles OpenSSL from source as part of the Cargo build. Larger binary, but fully reproducible with no system dependency.
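The fix amounts to a one-line dependency change — roughly (version numbers illustrative):

```toml
# Cargo.toml — compile OpenSSL from source during the Cargo build
# instead of linking whatever the host system provides.
[dependencies]
git2 = "0.18"
openssl-sys = { version = "0.9", features = ["vendored"] }
```

Cargo's feature unification means the vendored flag applies to every crate in the graph that links openssl-sys, including git2's HTTPS transport.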

⚠ Challenge

Tailwind CSS purging dynamic classes

Dynamic class names constructed with string interpolation (e.g. `bg-${color}-500`) were purged by Tailwind's tree-shaker in production builds, causing invisible UI elements.

✓ Solution

Replaced the entire styling system with inline styles. Verbose, but eliminated all build-time style dependencies and ensured pixel-perfect rendering across WebView versions.

⚠ Challenge

llama-server sidecar lifecycle on macOS

The bundled llama-server process would occasionally outlive the Tauri app on macOS when force-quit, leaving a zombie process holding the inference port.

✓ Solution

Registered a RunEvent::ExitRequested handler that explicitly kills the sidecar PID before exit, plus port-availability checking on startup with automatic stale-process cleanup.
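The startup check can be as simple as a bind probe — a sketch (the helper name is illustrative):

```rust
use std::net::TcpListener;

/// Startup probe: if the inference port can't be bound, a stale
/// llama-server from a force-quit session likely still holds it,
/// and the app cleans it up before spawning a fresh sidecar.
fn port_is_free(port: u16) -> bool {
    TcpListener::bind(("127.0.0.1", port)).is_ok()
}
```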

05 — Results & Metrics

By the numbers.

3.2K+
Lines of Rust
34
Tauri Commands
2
Platforms
~18MB
Binary Size

The app ships as a ~18MB binary on macOS (Universal) and ~16MB on Windows, with AI models downloaded on demand. Cold start to interactive UI is under 1.2 seconds on an M1 Mac. The llama-server sidecar initializes in 2–4 seconds for the 1.5B model — after which commit message generation streams the first token in under 800ms.

The dashboard indexes 21 repos across 2 identities in real time (visible in Fig 1), filterable through five status views: All Repos, Local Changes (15), Committed (0), Pushed (6), and Clean (0). The repo scanner runs recursively from a configured root path and completes in under 300ms for typical developer project directories.

06 — Reflections

What this project proved.

GitPilot was built to answer a specific question: can a solo frontend engineer ship a production-quality desktop app with a Rust core, local AI, and zero external service dependencies? The answer is yes — but the path required going significantly deeper into systems programming than any web project demands.

The most valuable lesson was the difference between bundling complexity and exposing it. The entire llama-server lifecycle, OpenSSL compilation, and libgit2 diff callbacks are invisible to the user — they see a button labeled "Generate" and get a commit message in under a second. Surface simplicity earned by backend depth.

The project also sharpened my perspective on local AI tooling. Ollama is excellent for developers who want a managed runtime, but it's a hard dependency for a consumer desktop app. Shipping the inference server as a sidecar binary — with all the cross-compilation friction that entails — is the only path to a truly self-contained product experience. That architectural choice is what makes GitPilot feel like a product rather than a developer demo.