
With support from the U.S. Department of Defense, EnCharge AI partners with Princeton University on groundbreaking AI chip technology

  • Writer: Emerging Technology Insider
  • Mar 8, 2024
  • 4 min read

New collaboration to yield a full-stack computing solution capable of supporting large-scale public- and private-sector advancements in AI through an $18.6 million grant



Today, EnCharge AI, the company commercializing next-generation AI accelerators, announced a partnership with Princeton University, supported by the U.S. Defense Advanced Research Projects Agency (DARPA), to develop advanced processors capable of running AI models more efficiently than previously thought possible.


DARPA's Optimum Processing Technology Inside Memory Arrays (OPTIMA) program is a $78 million effort to develop fast, power-efficient, and scalable compute-in-memory accelerators that can unlock new possibilities for commercial and defense-relevant AI workloads not achievable with current technology. As part of OPTIMA, DARPA has awarded an $18.6 million grant to a multi-year project proposed by Princeton University and EnCharge AI.


The rapid development of AI has created skyrocketing processing demands, currently met by massive server farms with high costs and power consumption. Broadening the adoption of AI from the cloud to real-time, mission-critical applications at the edge will require moving beyond existing processing technologies.


DARPA's OPTIMA program recognized the need for significant breakthroughs rather than optimization of existing GPUs and other digital accelerators. DARPA sought to fund "innovative approaches that enable revolutionary advances in science, devices, and systems," while using existing VLSI manufacturing techniques, and specifically excluded "research that primarily results in evolutionary improvements to the existing state of practice."


Key to this Princeton University-EnCharge AI project is Dr. Naveen Verma, Professor of Electrical and Computer Engineering at Princeton and co-founder and CEO of EnCharge AI. Many of the innovations the OPTIMA project seeks to further develop were discovered in Dr. Verma's Princeton lab, in part with previous funding from DARPA and the DoD.


The project will explore advancements in, and end-to-end execution of, AI workloads using the next generation of switched-capacitor analog in-memory computing chips pioneered by Dr. Verma's lab and commercialized by EnCharge AI. "The future is about decentralizing AI inference, unleashing it from the data center, and bringing it to phones, laptops, vehicles, and factories," Verma said. "While EnCharge is bringing the first generation of this technology to market now, we are excited for DARPA's support in accelerating the next generation to see how far we can take performance and efficiency."


Switched-capacitor analog in-memory computing techniques have been proven over several generations of silicon developed at Princeton to deliver order-of-magnitude improvements in compute efficiency compared to digital accelerators, while retaining precision and scalability that are not possible with electrical current-based analog computing approaches.
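The announcement does not disclose circuit details, but the core idea of switched-capacitor in-memory computing can be illustrated with a small, hypothetical simulation. The sketch below is a conceptual model only, not EnCharge's or Princeton's actual design; the function name charge_sharing_mac, the mismatch_sigma parameter, and the 1-bit cell assumption are all illustrative. It shows how charge sharing across an array of matched unit capacitors yields a dot product whose accuracy depends on capacitor matching (a geometry property) rather than on transistor current matching.

```python
import numpy as np

# Conceptual sketch of a switched-capacitor in-memory multiply-accumulate.
# Each cell stores a weight bit, multiplies it by an input bit by charging (or not
# charging) its unit capacitor, and connecting all capacitors on a column averages
# the per-cell voltages -- an analog dot product limited mainly by capacitor mismatch.

def charge_sharing_mac(inputs, weights, c_unit=1.0, mismatch_sigma=0.005, rng=None):
    """Simulate one analog column computing dot(inputs, weights) via charge sharing.

    inputs, weights: equal-length binary (0/1) vectors.
    c_unit: nominal unit capacitance; mismatch_sigma models capacitor mismatch.
    Returns the dot product estimate recovered from the shared-node voltage.
    """
    rng = rng or np.random.default_rng()
    n = len(inputs)
    # Each unit capacitor deviates slightly from nominal (manufacturing mismatch).
    caps = c_unit * (1.0 + rng.normal(0.0, mismatch_sigma, size=n))
    # 1-bit multiply: a cell charges its capacitor to Vdd only if input AND weight are 1.
    vdd = 1.0
    cell_v = vdd * (np.asarray(inputs) & np.asarray(weights))
    # Charge sharing: shorting all capacitors together yields the
    # capacitance-weighted mean of the cell voltages.
    v_out = np.sum(caps * cell_v) / np.sum(caps)
    # A digital backend rescales the voltage back to an integer dot product.
    return v_out * n / vdd

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=256)
w = rng.integers(0, 2, size=256)
print("exact:", int(np.dot(x, w)), " analog estimate:", round(charge_sharing_mac(x, w, rng=rng), 2))
```

In a real accelerator the shared-node voltage would typically be digitized by an ADC and multi-bit operands built up from several such 1-bit operations; the sketch only conveys why capacitor-based charge accumulation can retain precision where current-based analog summation is harder to control.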


More information on switched-capacitor in-memory computing can be found in a series of foundational peer-reviewed published papers co-authored by Dr. Verma as well as at www.enchargeai.com/technology.


EnCharge AI CTO Dr. Kailash Gopalakrishnan noted that constraints posed by current AI processing technologies motivated EnCharge AI to participate in OPTIMA. "EnCharge brings together leaders from Princeton, IBM, Nvidia, Intel, AMD, Meta, Google and other companies that have led computing into the modern era. At some point, many of us realized that innovation within existing computing architectures as well as silicon technology-node scaling was slowing at exactly the time when AI was creating massive new demands for computing power and efficiency. While GPUs are the best available tool today, we are excited that DARPA is supporting the development of a new generation of chips to unlock the potential of AI."


Participation in OPTIMA comes on the heels of EnCharge AI's recently announced $22.6 million funding round involving new investors VentureTech Alliance, RTX Ventures and ACVC Partners to develop next-generation full-stack AI compute solutions that will fundamentally transform how AI is used from edge to cloud.


About EnCharge AI

EnCharge AI is a leader in advanced AI solutions for deployments from edge to cloud. EnCharge's robust and scalable next-generation in-memory computing technology provides orders-of-magnitude higher compute efficiency and density compared to today's best-in-class solutions. These high-performance solutions will make the immense potential of AI accessible at scale in power-, size-, and weight-constrained applications. EnCharge AI launched in 2022 and is led by veteran technologists with backgrounds in semiconductor design and AI systems. EnCharge AI has raised $45 million to date.


For more information about EnCharge AI, please visit  https://enchargeai.com/


_______________________________________________


What are your company's business and financial objectives?
If your company has business or financial objectives, check out the advisory services for growing global technology companies provided by FC Global Strategies.


FC Global Strategies also sponsors Emerging Technology Insider.


Use our calendar link to schedule a complimentary call.


FC Global Strategies' regional directors:


Jeffrey Friedland - Based in the U.S.

(North America and Global)

+1 646 450 8909


Ross Swan - Based in Singapore

(Southeast Asia, Australia, Asia-Pacific)

+65 9181 9472


David Krutonog - Based in Israel

(Eastern Europe, Middle East and North Africa)

+972 50 974 3429


Claudio Hebling - Based in Brazil

(Latin America)

+55 19 99377 748


Vincent Paul Joseph - Based in India

(South Asia, India and Sri Lanka)

+91 962 620 9090


Olah Abiodun - Based in Nigeria

(Sub-Saharan Africa)

+234 812 029 8222



