Powering the Next Generation of AI Compute: EPIC Microsystems
AI compute is scaling at an unprecedented pace, making power delivery a first-order constraint on performance, cost, and deployment velocity—as discussed recently in our power in the data center post. Today, we are excited to announce our investment in EPIC Microsystems' $21M Series A. Seligman Ventures led the round, with participation from AICONIC Ventures, Cambium Capital, and returning seed investors A&E Investments and Nepenthe Capital.
AI chips have crossed a critical threshold. Modern AI accelerators like NVIDIA's B200 now exceed 1,000 watts per chip, and power needs keep increasing with each generation. The arithmetic is simple and unforgiving: for a modern 1,000W AI chip running at a 0.7V core voltage, the current is I = P / V ≈ 1,400 amps. Pushing such extreme currents laterally across a motherboard creates resistive losses and voltage droop that become harder to manage as load transients sharpen and power demands continue to climb.
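The back-of-the-envelope arithmetic above can be checked in a few lines. The 1,000W and 0.7V figures come from the text; everything else here is just Ohm's law:

```python
# Sanity-check the current draw quoted above: I = P / V.
# Inputs (1,000 W chip power, 0.7 V core voltage) are from the text.
def core_current(power_w: float, voltage_v: float) -> float:
    """Current drawn by a chip at a given power and core voltage."""
    return power_w / voltage_v

print(f"{core_current(1000, 0.7):.0f} A")  # ~1429 A, i.e. roughly 1,400 amps
```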
The Last Millimeter Matters
The industry has responded by transitioning from 12V to 48V rack architectures, but that solves only part of the problem. The real bottleneck lies in the final stretch: the path from the board to the silicon die itself. Resistive loss scales as P_loss = I²R, so at high current, even tiny path resistances become expensive in terms of heat and stability. This is pushing leading players to re-architect the final stages of power delivery—bringing voltage regulation closer to the die and shrinking delivery path length from centimeters down to millimeters via vertical power delivery.
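A quick sketch makes the I²R scaling concrete. The resistance values below are hypothetical round numbers, not measurements of any real board; they only illustrate why shortening the delivery path (and thus lowering R) pays off quadratically at ~1,400 amps:

```python
# Illustrative I^2 * R losses at AI-accelerator current levels.
# Resistance values are assumed round numbers for illustration only.
def resistive_loss_w(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated as heat in a delivery path: P_loss = I^2 * R."""
    return current_a ** 2 * resistance_ohm

current = 1400.0  # amps, from the arithmetic earlier in the post
for r_milliohm in (0.1, 0.05, 0.01):  # shorter path -> lower resistance
    loss = resistive_loss_w(current, r_milliohm / 1000)
    print(f"{r_milliohm} mOhm path -> {loss:.0f} W lost as heat")
```

At these currents, cutting path resistance from 0.1 mΩ to 0.01 mΩ drops the loss from roughly 196 W to about 20 W, which is the core argument for moving regulation from centimeters to millimeters away from the die.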
These are not cosmetic changes. Once a platform commits to vertical delivery and integrated regulation, the architecture becomes tightly coupled with decisions on packaging, thermals, qualification, and supply chain that are difficult to reverse later.
What EPIC is Building
EPIC's DC-DC power solutions enable vertical power delivery directly underneath the GPU/ASIC and offer an architectural transition path onto the GPU/ASIC chip package, dramatically reducing resistive losses while improving density, efficiency, and transient response in a thin form factor. EPIC delivers a complete power solution that fits beneath the chip package—allowing power delivery to scale alongside the chip while freeing up board area currently consumed by power components. Their differentiation is rooted in a hybrid switched-capacitor architecture, which offers higher power efficiency, greater current density, a lower profile, better thermals, and easier manufacturability than traditional inductor-based designs.
What convinced us to back EPIC was not just the technology, but the unique founder-market fit and the team's ability to navigate both the technical and commercial challenges of an evolving power delivery ecosystem. Combining co-founder and CEO Sabin Eftimie’s deep domain expertise with co-founder and COO Wonyoung Kim’s prior founder success—including bringing novel power architectures to high-volume production and leading Lion Semiconductor’s exit to Cirrus Logic—creates an exceptional founding team. Together, they possess the rare, nuanced understanding required to navigate go-to-market dynamics in a challenging and rapidly changing arena with long qualification cycles.
Looking Ahead
AI compute demand continues to grow exponentially. Without innovations in power efficiency and delivery, AI infrastructure risks hitting a wall—either through power availability constraints or through thermal limits that make further density increases impractical. With power delivery becoming a critical infrastructure enabler, power solutions for AI accelerators represent a massive market opportunity.
EPIC was built for this moment, with the technical approach, team, and market positioning to become one of the foundational players of this era, where each megawatt counts. We are thrilled to support EPIC Microsystems as they work to power the next generation of AI compute.



