Energy-based analog neural network framework

Published on March 3, 2023

Imagine you’re building a really intricate puzzle. You have all the small pieces laid out in front of you, and it’s up to you to connect them in just the right way. That’s roughly what scientists have been doing with neuromorphic systems and machine learning models: trying to build powerful networks that mimic the behavior of the human brain. But progress has been slow, because building and testing these systems is so laborious.

That’s where the new open-source framework EBANA comes in. It’s like a toolbox filled with all the pieces you need to build your own analog neural network. And the best part? It’s easy to use: with a Python interface and a Keras-like syntax, EBANA hides the complex details of the underlying analog simulations. It ships with common building blocks and can also be extended with new concepts and models. With EBANA, researchers and practitioners can explore different network designs, test them on various datasets, and see how they perform. And thanks to its parallelization capability, those experiments run quickly and efficiently. So if you’re curious about analog neural networks and want to dive deeper, check out the research behind EBANA!
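To give a concrete sense of what a "Keras-like syntax" means in practice, here is a minimal, standard Keras pipeline. This is shown only for comparison: it is not EBANA code, and EBANA's actual layer names and arguments for analog networks may differ.

```python
# A plain Keras model definition, shown to illustrate the style of API
# the post refers to. EBANA mirrors this kind of workflow for analog
# neural networks; the layers below are ordinary digital Keras layers.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=5)  # training data omitted here
```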

Over the past decade, a body of work has emerged showing the disruptive potential of neuromorphic systems across a broad range of studies, often combining novel machine learning models and nanotechnologies. Still, the scope of investigations often remains limited to simple problems, since the process of building, training, and evaluating mixed-signal neural models is slow and laborious. In this paper, we introduce an open-source framework, called EBANA, that provides a unified, modular, and extensible infrastructure, similar to conventional machine learning pipelines, for building and validating analog neural networks (ANNs). It uses Python as its interface language, with a syntax similar to Keras, while hiding the complexity of the underlying analog simulations. It already includes the most common building blocks and maintains sufficient modularity and extensibility to easily incorporate new concepts as well as new electrical and technological models. These features make EBANA suitable for researchers and practitioners who want to experiment with different design topologies and explore the various tradeoffs that exist in the design space. We illustrate the framework’s capabilities by elaborating on the increasingly popular Energy-Based Models (EBMs), used in conjunction with the local Equilibrium Propagation (EP) training algorithm. Our experiments cover three datasets with up to 60,000 entries and explore network topologies that generate circuits in excess of 1,000 electrical nodes, which can be benchmarked extensively, with ease and in reasonable time, thanks to EBANA’s native parallelization capability.
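For readers unfamiliar with Equilibrium Propagation, the sketch below illustrates its two-phase, local learning rule on a single free layer, following Scellier and Bengio's formulation: a free relaxation, a second relaxation with the outputs weakly nudged toward the target, and a weight update that contrasts the co-activations of the two phases. This is not EBANA code, and the network size, hard-sigmoid activation, and hyperparameters are illustrative assumptions only.

```python
# Minimal NumPy sketch of Equilibrium Propagation (EP) on one free layer.
# Illustrative only; not EBANA's implementation.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 4, 2
W = rng.normal(scale=0.1, size=(n_out, n_in))  # synaptic weights
b = np.zeros(n_out)                            # biases

def rho(s):
    # Hard-sigmoid activation commonly used with EP.
    return np.clip(s, 0.0, 1.0)

def relax(x, target=None, beta=0.0, steps=50, dt=0.1):
    """Relax the free units s toward a fixed point of the total energy
    F(s) = E(s) + beta * C(s); beta = 0 gives the free phase."""
    s = np.zeros(n_out)
    drive = W @ rho(x) + b
    for _ in range(steps):
        grad_E = s - ((s > 0) & (s < 1)) * drive          # dE/ds for the hard sigmoid
        grad_C = (s - target) if beta != 0.0 else 0.0     # dC/ds of a squared-error cost
        s -= dt * (grad_E + beta * grad_C)
    return s

# One EP learning step on a toy input/target pair.
x = np.array([1.0, 0.0, 1.0, 0.0])
target = np.array([1.0, 0.0])
beta, lr = 0.5, 0.05

s_free = relax(x)                       # phase 1: free relaxation
s_nudged = relax(x, target, beta=beta)  # phase 2: outputs weakly nudged toward target

# Local, contrastive update: compare co-activations of the two phases.
W += lr / beta * np.outer(rho(s_nudged) - rho(s_free), rho(x))
b += lr / beta * (rho(s_nudged) - rho(s_free))
```

Because the update only needs quantities available at each synapse's two endpoints in the two phases, it maps naturally onto the analog circuits that EBANA simulates.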

