- Local-first clinical software prototype combining on-device intelligence, anatomy-aware workflows, and an iOS-native EHR shell.
- Seeds and manages patient charts, medications, appointments, clinical photos, and longitudinal encounter history directly on device with SwiftData.
- Uses Apple Foundation Models for structured note generation, chart-grounded question answering, and patient-specific clinical assistance.
- Ships an iOS-native EHR layout with agenda, patient dashboard, inbox, intelligence tab, chart notes, prescriptions, and supporting workflows.
- Maps diagnoses, photos, and affected regions onto an interactive anatomical view so documentation and visual exam context stay connected.
- Includes service-layer scaffolding for pulling clinical records from HealthKit and normalizing them into the local chart model.
- Explores what dermatology and medical-device-adjacent software could look like if modern iOS capabilities replaced brittle legacy portals.
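The local chart model described above can be sketched with SwiftData's `@Model` macro and cascade relationships, so deleting a patient removes their dependent records. This is an illustrative sketch, not MedMod's actual schema; the type and property names here are assumptions.

```swift
import Foundation
import SwiftData

// Illustrative chart model — names and fields are hypothetical.
@Model
final class Patient {
    var fullName: String
    var dateOfBirth: Date
    // Cascade delete keeps dependent records from outliving the chart.
    @Relationship(deleteRule: .cascade) var medications: [Medication] = []
    @Relationship(deleteRule: .cascade) var encounters: [Encounter] = []

    init(fullName: String, dateOfBirth: Date) {
        self.fullName = fullName
        self.dateOfBirth = dateOfBirth
    }
}

@Model
final class Medication {
    var name: String
    var dosage: String
    var startDate: Date

    init(name: String, dosage: String, startDate: Date) {
        self.name = name
        self.dosage = dosage
        self.startDate = startDate
    }
}

@Model
final class Encounter {
    var date: Date
    var note: String

    init(date: Date, note: String) {
        self.date = date
        self.note = note
    }
}
```

Everything stays on device: a `ModelContainer` over these types gives the app its local store, which is what makes the local-first design possible without a backend.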
MedMod comes directly from my day job. I spend my professional life around medical workflows, fragmented systems, and tools that rarely feel designed for the people actually using them. This project is my attempt to reimagine that stack as something local-first, fluid, and genuinely intelligent on the device in your hand.
The current implementation includes a tabbed EHR shell, seeded patient records, appointment and medication models, clinical history, chart-note scaffolding, an intelligence workspace, and anatomy-aware visualization. It is not just a visual mockup; it already models real patient objects and real encounter flows inside a functioning SwiftUI app.
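The intelligence workspace's structured note generation can be approached with the Foundation Models framework's guided generation: a `@Generable` type constrains the on-device model's output to a typed schema. The sketch below is an assumption about how MedMod might wire this up; the `EncounterNote` fields and prompt wording are illustrative, not the app's actual code.

```swift
import FoundationModels

// Hypothetical structured-output schema for a chart note.
@Generable
struct EncounterNote {
    @Guide(description: "Chief complaint in the patient's own words")
    var chiefComplaint: String

    @Guide(description: "Objective findings from the exam")
    var findings: String

    @Guide(description: "Assessment and plan")
    var plan: String
}

func draftNote(from transcript: String) async throws -> EncounterNote {
    // Instructions scope the on-device model to the visit at hand,
    // which is one way to keep answers chart-grounded.
    let session = LanguageModelSession(
        instructions: "You are a clinical scribe. Summarize only what appears in the transcript."
    )
    let response = try await session.respond(
        to: "Draft a structured note for this encounter:\n\(transcript)",
        generating: EncounterNote.self
    )
    return response.content
}
```

Because generation happens through Apple's on-device model, no transcript or chart data has to leave the phone, which fits the local-first premise.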
MedMod bridges my clinical domain knowledge with my software work. Where LinkedOut proves full-stack AI product thinking and PlaudBlender proves Python systems architecture, MedMod shows I can translate healthcare reality into product architecture, interface design, and local AI workflows that feel native to Apple platforms.