
Workshop on Neuromorphic Learning

Toulouse, October 9-10, 2024

The brain, and the nervous system more generally, combines memory, computation, sensing and even emotion in a single unit. On the other hand, mimicking the functionalities of the brain is a crucial challenge for the computer industry, as a way to improve the performance and power efficiency of our computers, especially since the advent of AI. In parallel, a promising field of research is exploring possible connection strategies between biological and electronic systems.

What links can we create between electronic systems made of inorganic materials and nervous systems made of biological tissues? How can knowledge of one stimulate knowledge of the other?

The workshop aims to bring together experts in neural or neuromorphic learning for a day and a half, in a face-to-face, interdisciplinary format, with presentation times long enough to encourage discussion and a high degree of flexibility. Speakers will be physicists, chemists, engineers, neuroscientists and biologists. Particular attention will be paid to the pedagogy of the presentations, so that they are accessible to a non-specialist audience.

This workshop is aimed at any researcher interested in neuronal, neuroscientific or neuromorphic questions, from students to world-renowned experts. Its goal is to forge links and find a common language between disciplines that consider similar problems but do not necessarily have the opportunity to meet.

Practical details:

    Date: October 9-10, 2024
    Place: INSA Toulouse, Building 19, Amphi GEI 13 (ground floor, left)

    Organizers: Jérémie Grisolia and Simon Tricard
    LPCNO, INSA, CNRS, Université de Toulouse

Program

Wednesday October 9

Morning
9h00 - 9h30: Welcome
9h30 - 10h30: Félix Houard, Toulouse, FR (Chemist): Hybrid materials as neuromorphic models
10h30 - 11h00: Coffee break
11h00 - 12h00: Athanasios Vasilopoulos, Zurich, CH (Engineer): In-memory computing

Lunch
12h00 - 14h00: Buffet with all registrants

Afternoon
14h00 - 15h00: Christian Nijhuis, Twente, NL (Chemist): Molecular electronics
15h00 - 16h00: Amaury François, Montpellier, FR (Biologist): Neuronal circuits for touch and emotions
16h00 - 16h30: Coffee break
16h30 - 17h30: Christian Bergaud, Toulouse, FR (Engineer): Bioelectronic interface

Thursday October 10

Morning
9h00 - 10h00: Sijia Ran, Toulouse, FR (Physicist): Phase change memory junctions
10h00 - 11h00: Simon Thorpe, Toulouse, FR (Neuroscientist): Theory of neuronal computation
11h00 - 11h30: Coffee break
11h30 - 12h30: Simon Brown, Christchurch, NZ (Physicist): Electronics in granular materials

Lunch
12h30 - 14h00: Buffet with all registrants

List of speakers (alphabetic order)

Name | Origin | Scientific domain | Research field
Christian Bergaud | Toulouse, FR | Engineer | Bioelectronic interface
Simon Brown | Christchurch, NZ | Physicist | Computation in networks of nanoparticles
Amaury François | Montpellier, FR | Biologist | Neuronal circuits for touch and emotions
Félix Houard | Toulouse, FR | Chemist | Hybrid materials for neuromorphic learning
Christian Nijhuis | Twente, NL | Chemist | Molecular electronics
Sijia Ran | Toulouse, FR | Physicist | Phase change memory junctions
Simon Thorpe | Toulouse, FR | Neuroscientist | Theory of neuronal computation
Athanasios Vasilopoulos | Zurich, CH | Engineer | In-memory computing

List of abstracts

Soft bioelectronics for in vivo and in vitro neural probes

by Christian Bergaud, Laboratory of Systems Analysis and Architecture (LAAS), Toulouse, France

We will explore the development and application of soft bioelectronics in designing neural probes for both in vivo and in vitro studies. By using soft, biocompatible materials, these advanced probes interact seamlessly with neural tissue, minimizing damage and improving data acquisition. They can be used for stimulation, recording, and detection of neurotransmitters. Recently, their integration into 3D artificial bio-hybrid systems has further expanded their potential. These systems enable real-time analysis and adaptive responses, facilitating the creation of complex neural networks that can exhibit plasticity and adaptation, key features of natural neural systems. This innovative combination not only could enhance our understanding of neural processes but also holds significant promise for the development of advanced neuroprosthetics and treatments for neurological disorders.

Brain-like Computation with Percolating Networks of Nanoparticles

by Simon Brown, University of Canterbury, Christchurch, New Zealand

Self-assembled networks of nanoparticles have recently emerged as important candidate systems for brain-like (or neuromorphic) information processing. The essence of the approach is to take advantage of the intrinsic dynamical properties of these networks to implement brain-inspired approaches to computation. Our percolating networks of nanoparticles (PNNs) are self-assembled via simple deposition processes that are completely CMOS compatible, making them attractive for integration. The key to our approach is to terminate the deposition at the onset of conduction (the percolation threshold), when the electrical properties of the network are dominated by tunnel gaps between groups of particles. The memristive tunnel gaps turn out to have neuron-like properties, which means that PNNs can be viewed as networks of neurons. Both the structural and dynamical properties of PNNs have been shown to be brain-like and, in particular, avalanches of neuron-like spiking events have been shown to be critical. Criticality is a key feature of the biological brain that has been related to optimal information processing capability. We have explored brain-like computation with PNNs in two regimes, beginning with simulations that allow us to understand the processes and refine parameters, and then moving to experimental demonstrations. At low voltages, the devices are amenable to reservoir computing, and we have successfully demonstrated time series prediction, non-linear transformation and spoken digit recognition. In the high voltage regime, the spiking behaviour of the 'neurons' has been exploited to perform Boolean logic and MNIST classification, and, most recently, optimization tasks.
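For readers unfamiliar with reservoir computing, a minimal software analogue conveys the idea the abstract exploits physically: a fixed, randomly connected dynamical system transforms an input stream, and only a simple linear readout is trained. This is a generic echo state network sketch, not the authors' device or code; the network sizes, the spectral radius, and the sine-prediction task are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random "reservoir": plays the role of the untrained physical network.
N = 200
W_in = rng.uniform(-0.5, 0.5, N)              # input couplings
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 for stable dynamics

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 40, 0.1)
u = np.sin(t)
X = run_reservoir(u[:-1])
y = u[1:]                                      # target: the next sample

# Only the linear readout is trained (ridge regression).
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = X @ W_out
print(f"Prediction MSE: {np.mean((pred - y) ** 2):.2e}")
```

In the hardware setting, the random recurrent matrix is replaced by the physics of the percolating network; the appeal is that only the cheap readout needs training.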

How do we perceive our environment? Touch as an example of sensory integration in the central nervous system.

by Amaury François, Institute of Functional Genomics (IGF), Montpellier, France

How do biological organisms convert the physical information surrounding their bodies into information with precise characteristics and valences that can be used to direct behavior? In this talk, we will use the sense of touch as an example to explore recent advances in understanding the neuronal basis of sensory integration, from the molecular to the circuit level.

Hybrid materials as neuromorphic models

by Felix Houard, Laboratory of Physics and Chemistry of Nano-Objects (LPCNO), Toulouse, France

Hybrid materials are composed of an inorganic part and a molecular part. An argument in favor of hybrid materials is that they can combine the advantages of their different components. The aim of this presentation is to show that nanoscale structuring of such materials is a promising strategy for shaping systems that gather memory, computing and sensing capabilities in the same unit. We will first describe how to synthesize such materials, how to measure their properties, and what strategies exist to train them from a neuromorphic perspective, i.e. a perspective inspired by the functioning of the nervous system.

Molecular Switches that Change Switching Probabilities for Emulating Synaptic Behavior

by Christian Nijhuis, University of Twente, The Netherlands

Our brains constitute a molecular computer that is able to process enormous amounts of information with a tiny energy budget. Inspired by the energy efficiency of brains and the ever-increasing demand for miniaturised electronics, there is a drive to develop devices that mimic the dynamic character of neurons and synapses. We have been developing molecular switches that behave like synapses, with the aim of realizing spiking neural networks. I will introduce a new type of molecular switch that can remember its switching history. By coupling fast electron transport to slow proton addition steps via dynamic covalent bonds, the switches display time-dependent switching probabilities, which can be used for brain-inspired and reconfigurable electronics. These artificial synapses are promising for developing alternative neural networks and open new ways to design electronic devices by exploiting their inherent dynamical properties.

Microscopic Origins of the Intermediate Resistance States in Ge-rich GeSbTe based Phase Change Memory

by Sijia Ran, Center for Materials Development and Structural Studies (CEMES), Toulouse, France

Phase change memory (PCM) is a good candidate for solid-state neuromorphic devices. Indeed, the resistance of such a device can be tuned quasi-continuously by altering the amorphous-to-crystalline ratio in its active region. However, the ability to program these intermediate resistance states (IRS) is often seriously hampered by their limited stability over time and their limited thermal stability. In this presentation, we focus on PCM fabricated using Ge-rich GeSbTe (G-GST) alloys, which have demonstrated excellent data retention suitable for digital devices operating at high temperatures. We will show some specificities of G-GST cells through a combination of electrical, chemical, and structural studies. Owing to the off-stoichiometric nature of G-GST alloys, chemical segregation accompanies the phase transition during programming of G-GST based PCM. The forming operation, which consists of melting a dome of nanometric dimensions followed by its recrystallization, redistributes all the elements to form a composite structure made of GST grains inside the dome, pure Sb2 crystals close to the cathode, and an accumulation of Ge grains at the periphery of the dome. By tuning the amplitude of the RESET current, amorphous domes of different sizes and chemical compositions are generated, which provide IRS in G-GST cells. The compositions of these amorphous domes result from the mixing and homogenization of all the chemical elements initially found as grains of different stoichiometries in the region, but remain Ge- and Sb-rich, which ensures the high thermal stability of the IRS. This shows that, unlike conventional PCM based on Ge2Sb2Te5, G-GST cells store information not only in the crystallinity or amorphousness of the active region but also in its stoichiometry. Possible strategies to access these IRS will be presented and discussed.

Why is the brain so energy efficient? Simulating billions of neurons using off-the-shelf computational hardware.

by Simon Thorpe, Brain and Cognition Research Center (CerCo), Toulouse, France

While state-of-the-art AI systems are impressive, their power efficiency is a major problem. Simulating the entire human brain in real time using conventional approaches would require hundreds of PetaFlops of compute and around 1 million times more energy than the 20 watts required by the brain. Neuromorphic computing could provide the solution, and I will argue that spike-based computing is a vital part of the story. We have been developing a PetaBrain architecture that uses ultra-sparse activity and unary connections, allowing systems with billions of neurons to be simulated on conventional computing hardware. In such systems, learning can be implemented by modifying the connections between neurons on the basis of ongoing activity.
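A toy sketch can illustrate why ultra-sparse, spike-based activity is cheap to simulate: in an event-driven scheme, work is done only for the neurons that actually spike, not for the whole network. This is a generic illustration, not the PetaBrain architecture itself; the connectivity, threshold, and neuron indices are hypothetical.

```python
from collections import defaultdict

# "Unary" connections: a synapse is simply present or absent, with no weight.
connections = defaultdict(set)
connections[0] = {2, 3}
connections[1] = {3, 4}

THRESHOLD = 2  # a neuron fires when it receives at least 2 coincident spikes

def step(active):
    """Propagate one time step, touching only the neurons that spiked."""
    counts = defaultdict(int)
    for pre in active:               # loop over spikes, not over all neurons
        for post in connections[pre]:
            counts[post] += 1
    return {n for n, c in counts.items() if c >= THRESHOLD}

print(step({0, 1}))  # only neuron 3 receives two spikes, so only it fires: {3}
```

Because the cost per step scales with the number of spikes rather than the number of neurons, very large networks remain tractable when activity is ultra-sparse.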

In-Memory Computing for Deep Neural Network Inference

by Athanasios Vasilopoulos, IBM Research Europe, Zurich, Switzerland

State-of-the-art neural networks have demonstrated superhuman capabilities across a spectrum of tasks, and recently, generative models have led to a paradigm shift in the field of machine learning. However, the inherent inefficiencies of the hardware used to run such networks remain a major roadblock to the widespread adoption of this technology. In this talk, I will discuss why conventional AI accelerators face these inefficiencies and introduce a novel computing paradigm: In-Memory Computing using non-volatile memory devices. I will cover the current state of the field, the architectural innovations needed to develop competitive systems, and the software environment required to support modern applications.
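The central idea of in-memory computing can be shown with a short numerical sketch: in a crossbar of non-volatile memory cells, conductances store the weights, and applying input voltages yields output currents that realize a full matrix-vector product in one analog step (Ohm's law for each cell, Kirchhoff's current law along each column). The specific conductance and voltage values below are illustrative, not from the talk.

```python
import numpy as np

# Conductances of a hypothetical 3x2 crossbar: G[i, j] encodes weight w_ij.
G = np.array([[1.0, 0.5],
              [0.2, 0.8],
              [0.6, 0.1]])

# Input voltages applied along the three rows.
V = np.array([0.3, 1.0, 0.5])

# Each column current is the sum of I = G * V contributions from its cells,
# so the crossbar computes the matrix-vector product in a single step.
I = G.T @ V
print(I)  # approximately [0.8, 1.0]
```

The efficiency gain comes from avoiding the data movement between separate memory and compute units that dominates the energy cost of conventional accelerators.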

Registration Form

Registration is free but compulsory

Sponsor:
