Building Resume Tailor: Architecture of a Modern Electron + React Desktop App

Introduction
Resume Tailor is a desktop application that helps job seekers create tailored resumes and cover letters using AI. Rather than being a simple wrapper around a web app, it's built as a proper native application with file system access, multiple AI provider support, PDF generation, and application tracking.
This post walks through the key architectural decisions and why they were made.
Tech Stack Overview
The application uses the following technologies:
- Desktop Runtime: Electron for cross-platform support and native file access
- Frontend: React 18 + TypeScript for component model and type safety
- Styling: Tailwind CSS + shadcn/ui for rapid iteration
- Build Tool: Vite for fast HMR and ESM-first approach
- Validation: Zod for runtime validation + TypeScript inference
- PDF Generation: @react-pdf/renderer for rendering React components to PDF
- AI Integration: CLI spawning + API calls for flexibility across providers
Project Structure
The codebase follows a clear separation between Electron's main process (Node.js) and renderer process (browser). The main process handles app lifecycle, window management, IPC handlers, and business logic services. The renderer contains the React application with UI components and pages.
Key insight: Schemas, types, and shared utilities live outside both main and renderer, allowing them to be imported by either process without duplication.
Electron Security Model
Electron apps have a reputation for security issues. The app follows best practices, including context isolation, disabled node integration, and sandbox mode. The renderer process runs in a sandboxed browser environment with no direct access to Node.js APIs.
The preload script serves as the only bridge, exposing a typed, limited API to the renderer. No arbitrary IPC calls are possible; only the methods we explicitly define are reachable.
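A minimal sketch of that pattern. The method names and channel strings here are illustrative, not the app's actual API, and the factory is written against a generic invoke signature so the shape is clear: in the real preload script, `invoke` would be `ipcRenderer.invoke` and the result would be handed to `contextBridge.exposeInMainWorld`.

```typescript
// The renderer can only call what this object exposes; it cannot
// construct arbitrary IPC messages.
type Invoke = (channel: string, ...args: unknown[]) => Promise<unknown>;

// Factory over a generic invoke function so the shape is testable without
// Electron. In the actual preload: createRendererApi(ipcRenderer.invoke)
// passed to contextBridge.exposeInMainWorld("api", ...).
export function createRendererApi(invoke: Invoke) {
  return {
    readFile: (path: string) => invoke("file:read", path),
    saveSettings: (settings: Record<string, unknown>) =>
      invoke("settings:save", settings),
    tailorResume: (resumeId: string, jobDescription: string) =>
      invoke("ai:tailor-resume", resumeId, jobDescription),
  } as const;
}

// The renderer imports this type, so every call site is checked.
export type RendererApi = ReturnType<typeof createRendererApi>;
```

Because the API object is built once and typed, a renamed channel or changed argument list fails at compile time instead of silently at runtime.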
IPC Communication Patterns
The ipc-handlers.ts file organizes approximately 40 handlers into semantic groups: file operations, settings, AI operations, and application tracking.
For long-running AI operations that can take 10-30+ seconds, progress is tracked with IPC events, sending updates for status changes and progress percentages.
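The two ideas above can be sketched together. Channel names and handler bodies are illustrative; the registry and sender are written as minimal interfaces so the wiring is testable, where the real code would use `ipcMain.handle` and `webContents.send`.

```typescript
// Minimal stand-ins for ipcMain.handle and webContents.send.
interface HandlerRegistry {
  handle(channel: string, fn: (...args: any[]) => Promise<unknown>): void;
}
type ProgressSender = (
  channel: string,
  payload: { status: string; percent: number },
) => void;

// Handlers registered in semantic groups, mirroring ipc-handlers.ts.
export function registerIpcHandlers(
  ipc: HandlerRegistry,
  sendProgress: ProgressSender,
) {
  // --- file operations ---
  ipc.handle("file:read", async (path: string) => `contents of ${path}`);

  // --- AI operations: long-running, so they stream progress events ---
  ipc.handle("ai:tailor-resume", async (resumeId: string) => {
    sendProgress("ai:progress", { status: "analyzing", percent: 10 });
    sendProgress("ai:progress", { status: "generating", percent: 60 });
    sendProgress("ai:progress", { status: "done", percent: 100 });
    return { resumeId, tailored: true };
  });
}
```

The renderer subscribes to the progress channel through the preload bridge and updates a progress bar as events arrive.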
Multi-Provider AI Architecture
One of the most interesting parts is how it supports multiple AI backends (Claude CLI, Codex CLI, Gemini CLI, OpenAI API).
All providers implement a common interface with execute, executeWithRetry, isAvailable, and getStatus methods. A base provider class handles shared logic like exponential backoff retries and JSON parsing.
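A sketch of that interface and base class, assuming illustrative details like the backoff constants and the JSON-extraction regex:

```typescript
// Common interface implemented by every backend
// (Claude CLI, Codex CLI, Gemini CLI, OpenAI API).
interface AIProvider {
  execute(prompt: string): Promise<string>;
  executeWithRetry(prompt: string, maxRetries?: number): Promise<string>;
  isAvailable(): Promise<boolean>;
  getStatus(): { name: string };
}

// Base class holding the shared logic: exponential backoff and JSON parsing.
export abstract class BaseProvider implements AIProvider {
  constructor(
    protected name: string,
    // Injectable so tests can skip real delays.
    private sleep: (ms: number) => Promise<void> = (ms) =>
      new Promise((r) => setTimeout(r, ms)),
  ) {}

  abstract execute(prompt: string): Promise<string>;
  abstract isAvailable(): Promise<boolean>;
  getStatus() { return { name: this.name }; }

  async executeWithRetry(prompt: string, maxRetries = 3): Promise<string> {
    let lastError: unknown;
    for (let attempt = 0; attempt < maxRetries; attempt++) {
      try {
        return await this.execute(prompt);
      } catch (err) {
        lastError = err;
        await this.sleep(2 ** attempt * 250); // 250ms, 500ms, 1000ms, ...
      }
    }
    throw lastError;
  }

  // Models often wrap JSON in prose or code fences; extract and parse it.
  protected parseJson<T>(raw: string): T {
    const match = raw.match(/\{[\s\S]*\}/);
    if (!match) throw new Error(`no JSON object in output from ${this.name}`);
    return JSON.parse(match[0]) as T;
  }
}
```

Each concrete provider then only implements `execute` (spawn a CLI, or call an HTTP API) and `isAvailable`.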
The Claude provider spawns the CLI as a child process with specific flags to disable tools (saving tokens) and prevent loading project context. The provider registry allows graceful degradation: if Claude CLI isn't installed, the app falls back to Codex or Gemini.
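The fallback logic reduces to a small selection function. This is an illustrative sketch, assuming each provider's `isAvailable` check (e.g. probing for the CLI binary) is already implemented:

```typescript
// Registry walks an ordered preference list and returns the first
// provider whose availability check passes.
interface Provider {
  name: string;
  isAvailable(): Promise<boolean>; // e.g. checks that the CLI binary exists
}

export async function pickProvider(providers: Provider[]): Promise<Provider> {
  for (const p of providers) {
    if (await p.isAvailable()) return p;
  }
  throw new Error(
    "No AI provider available; install Claude, Codex, or Gemini CLI, or configure an API key",
  );
}
```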
Schema Validation with Zod
Zod provides runtime validation with TypeScript type inference. We define schemas once and get both validation and types. AI outputs are unpredictable, so we validate them strictly with schemas that include AI-specific additions like match scores and suggestions.
PDF Generation
We use @react-pdf/renderer which lets us write React components that render to PDF. Since blobs can't be serialized through IPC, we convert them to Uint8Array in the renderer and back to Buffer in the main process.
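The conversion is two small helpers, one per side of the IPC boundary (function names here are illustrative):

```typescript
// Renderer side: the Blob from @react-pdf/renderer can't be serialized
// through IPC, so convert it to a Uint8Array, which structured clone handles.
export async function blobToTransferable(blob: Blob): Promise<Uint8Array> {
  return new Uint8Array(await blob.arrayBuffer());
}

// Main side: rebuild a Node Buffer for fs.promises.writeFile.
export function transferableToBuffer(bytes: Uint8Array): Buffer {
  return Buffer.from(bytes.buffer, bytes.byteOffset, bytes.byteLength);
}
```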
AI Output Sanitization
LLMs produce artifacts that need cleaning for professional documents. We remove zero-width characters, normalize unusual spaces, collapse multiple spaces, and strip markdown formatting when needed.
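A sketch of such a cleanup pass; the exact character classes and markdown rules in the app may differ:

```typescript
// Cleanup pass for LLM output before it lands in a resume or cover letter.
export function sanitizeAiText(text: string, stripMarkdown = false): string {
  let out = text
    .replace(/[\u200B\u200C\u200D\uFEFF]/g, "") // zero-width characters
    .replace(/[\u00A0\u2007\u202F]/g, " ")      // unusual spaces -> regular
    .replace(/ {2,}/g, " ");                     // collapse runs of spaces
  if (stripMarkdown) {
    out = out
      .replace(/\*\*(.+?)\*\*/g, "$1")           // **bold**
      .replace(/\*(.+?)\*/g, "$1")               // *italics*
      .replace(/^#{1,6}\s+/gm, "");              // heading markers
  }
  return out.trim();
}
```

Zero-width characters are the nastiest case: they are invisible in the UI but survive copy-paste into application forms and can trip up ATS parsers.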
Service Layer Pattern
Business logic lives in services in the main process. All file I/O stays in one place, making it easy to test with mocks, providing a clear API for IPC handlers, and enabling caching for performance.
Key Takeaways
- Security first: Context isolation + preload script pattern keeps the renderer sandboxed while still enabling native functionality.
- Type safety end-to-end: Zod schemas are the single source of truth for data structures.
- Provider abstraction: Supporting multiple AI backends through a common interface allows flexibility and graceful degradation.
- Clean IPC patterns: Typed APIs in preload, organized handlers in main, progress tracking for long operations.
- Service layer in main process: Business logic stays in Node.js where it has full access to the file system and native APIs.
- Sanitize AI output: LLMs produce artifacts, so clean them before using the output.
- Separate TypeScript configs: Different module systems for renderer (ESM) vs main process (CommonJS).
Conclusion
Building a professional Electron app requires thoughtful architecture across process boundaries, security concerns, and the unique challenges of desktop development. Resume Tailor demonstrates patterns that scale well and remain maintainable as features grow.

Toms Veidemanis
I write about engineering, technology, and problem-solving from a practical, systems-driven perspective. My background spans construction, energy efficiency, and software projects, with a focus on turning complex ideas into decisions that actually work in the real world. This blog is where I document what I’m building, testing, and learning—without hype or shortcuts.
