
development of new architectures and algorithms. While error-driven methodologies prioritize performance optimization and high accuracy, local learning paradigms draw inspiration from the mechanisms of biological learning systems, aiming for enhanced efficiency and robustness. An intriguing strategy is to integrate the two approaches within a single network. For example, a spike-based hybrid plasticity model, combining a brain-inspired meta-learning paradigm with a differential spiking dynamics model featuring parameterized local correlation-driven learning, achieved significantly higher accuracy in several image classification tasks (99.50% vs. 95.00%), such as on the Modified National Institute of Standards and Technology (MNIST) dataset[95].
The common algorithms used in skin-like sensors are ANNs and SNNs. ANN algorithms learn by updating weight values, and the signals in such networks are typically continuous; ANN algorithms include deep learning models and CNNs. The primary distinction between ANNs and SNNs lies in signal processing: ANN computations involve continuous values, whereas SNN inputs consist of spikes that carry timing information[96] [Figure 6A].
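To make this distinction concrete, a continuous sensor reading of the kind an ANN consumes directly can be converted into a spike train whose timing carries the signal. Below is a minimal Python sketch using Poisson rate coding; the sensor values, rate scaling, and function name are illustrative assumptions rather than details from the cited works.

    import numpy as np

    rng = np.random.default_rng(0)

    def poisson_encode(values, n_steps=100, max_rate=0.5):
        """Rate-code continuous values in [0, 1] as binary spike trains.

        Each value sets the per-timestep firing probability, so larger
        inputs yield denser spike trains (timing carries the signal).
        """
        values = np.clip(np.asarray(values, dtype=float), 0.0, 1.0)
        probs = values * max_rate                          # spike probability per step
        return rng.random((n_steps, values.size)) < probs  # (time, neuron) booleans

    # Hypothetical normalized pressure readings from a skin-like sensor:
    spikes = poisson_encode([0.1, 0.5, 0.9])
    print(spikes.mean(axis=0))  # empirical rates, roughly proportional to the input

A larger reading produces a denser spike train, so downstream spiking neurons can recover stimulus intensity from spike timing statistics alone.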
In biological neurons, the flow of Na+ (sodium) and K+ (potassium) ions across the neuronal membrane is fundamental for action potential generation; these brief electrical impulses constitute neuronal pulses[97]. When a neuron is stimulated, Na+ channels open, allowing Na+ ions to flow inward, which depolarizes the membrane potential. If the depolarization reaches a certain threshold, an action potential is triggered, propagating the pulse along the axon. Subsequently, K+ channels open to repolarize the membrane, restoring the resting potential. This ionic exchange and the resulting action potentials are
               critical for neuronal communication and synaptic plasticity, the ability of synapses to strengthen or weaken
over time based on activity levels. This concept of membrane potential is modeled directly to emulate the electrical dynamics of biological neurons. Each artificial neuron within an SNN maintains an internal state
               that represents its membrane potential, which is analogous to the membrane potential in biological
               neurons. The illustration of biological synaptic plasticity mechanisms and neuronal dynamics is shown in
Figure 6B. The leaky integrate-and-fire (LIF) model is one of the most commonly used neuron models in SNNs; it imparts SNNs with brain-like neural dynamics, making them more energy-efficient and capable of higher computational performance, in close analogy to the functioning of the biological brain. Combined with hardware implementations such as memristors, LIF neurons play a crucial role in SNNs for advanced neuromorphic computing; unlike traditional CMOS implementations, LIF neurons can also be realized with memristors.
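The LIF dynamics just described reduce to a few lines of code: the membrane potential integrates input current, leaks toward rest, and emits a spike with a reset once it crosses threshold. A minimal discrete-time sketch follows; all constants and names are illustrative assumptions, not parameters from the cited works.

    import numpy as np

    def lif_neuron(input_current, tau=20.0, v_rest=0.0, v_th=1.0, v_reset=0.0, dt=1.0):
        """Discrete-time leaky integrate-and-fire (LIF) neuron.

        Integrates dV/dt = -(V - v_rest)/tau + I(t); emits a spike and
        resets V whenever the membrane potential crosses v_th.
        """
        v = v_rest
        spikes, trace = [], []
        for i_t in input_current:
            v += dt * (-(v - v_rest) / tau + i_t)  # leaky integration of input
            if v >= v_th:                          # threshold crossing -> spike
                spikes.append(1)
                v = v_reset                        # reset, akin to repolarization
            else:
                spikes.append(0)
            trace.append(v)
        return np.array(spikes), np.array(trace)

    # Constant drive: the neuron charges, fires, resets, and repeats.
    spikes, v = lif_neuron(np.full(100, 0.08))
    print("spike count:", spikes.sum())

The leak term is analogous to what a volatile memristor provides physically: its conductance relaxes on its own, so integration and decay happen in the device rather than in software.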
To improve the accuracy of SNNs, hybrid learning approaches that combine local spike-timing-dependent plasticity (STDP) learning with global error-driven learning have been employed[98] [Figure 6C]. STDP is a local learning rule in which the strength of a synaptic connection is adjusted based on the relative timing of pre- and postsynaptic spikes. This biologically inspired mechanism enables SNNs to learn temporal patterns without the need for global error gradients.
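The STDP rule can be written directly as a function of the timing difference Δt = t_post − t_pre: causal pairs (pre before post) potentiate the synapse, anti-causal pairs depress it. Below is a minimal sketch of the classic exponential pair-based form; the amplitudes and time constants are illustrative assumptions.

    import numpy as np

    def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0):
        """Pair-based STDP weight change for one pre/post spike pair.

        Pre-before-post (dt > 0) strengthens the synapse (potentiation);
        post-before-pre (dt < 0) weakens it (depression).
        """
        dt = t_post - t_pre
        if dt > 0:
            return a_plus * np.exp(-dt / tau_plus)    # potentiation window
        if dt < 0:
            return -a_minus * np.exp(dt / tau_minus)  # depression window
        return 0.0

    for dt in (-40, -10, 10, 40):
        print(f"dt = {dt:+d} ms -> dw = {stdp_dw(0.0, float(dt)):+.5f}")

Because the update depends only on the two spike times at a single synapse, it needs no global error signal, which is what makes it attractive for on-chip learning alongside gradient-based fine-tuning in the hybrid schemes above.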
In neuromorphic systems, volatile, symmetrically threshold-switching VO2 memristors have been employed, leveraging their dynamic behavior to create compact LIF and adaptive LIF neurons. These neurons were integrated into a long short-term memory SNN, enabling effective decision-making and accurate analysis of physiological data[99] [Figure 6D].
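An adaptive LIF neuron extends the basic model with a firing threshold that jumps after each spike and decays back toward baseline, producing spike-frequency adaptation under sustained input, the behavior such devices emulate in hardware. A minimal software sketch, with all constants as illustrative assumptions:

    import numpy as np

    def adaptive_lif(input_current, tau=20.0, v_th0=1.0, beta=0.2,
                     tau_adapt=100.0, dt=1.0):
        """LIF neuron with an adaptive threshold.

        Each spike raises the threshold by beta; between spikes it
        relaxes back toward the baseline v_th0, so the firing rate
        drops under constant drive (spike-frequency adaptation).
        """
        v, v_th = 0.0, v_th0
        spikes = []
        for i_t in input_current:
            v += dt * (-v / tau + i_t)                # leaky integration
            v_th += dt * (v_th0 - v_th) / tau_adapt   # threshold decays to baseline
            if v >= v_th:
                spikes.append(1)
                v = 0.0        # membrane reset
                v_th += beta   # threshold jumps after each spike
            else:
                spikes.append(0)
        return np.array(spikes)

    s = adaptive_lif(np.full(300, 0.08))
    print("spikes in first vs last 100 steps:", s[:100].sum(), s[-100:].sum())

Adaptation acts as a short-term memory of recent activity, one reason adaptive neurons pair naturally with long short-term memory-style SNNs for slowly varying physiological signals.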
Additionally, LIF neurons can be implemented using diffusive memristors. For example, an 8 × 8 1T1R memristive synapse crossbar was integrated with eight diffusive memristor-based artificial neurons [Figure 6E]. By integrating nonvolatile memristive synapses with diffusive memristors, a fully memristive neural network was implemented for pattern classification using an unsupervised learning method[100] [Figure 6F].
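Conceptually, the crossbar stores the synaptic weight matrix, the artificial neurons fire on the resulting column currents, and unsupervised learning nudges the winning column's conductances toward the input pattern. The following is a highly simplified software analogy, an 8 × 8 weight matrix trained with a winner-take-all Hebbian update; every detail here is an illustrative assumption, not the circuit or learning rule of ref. [100].

    import numpy as np

    rng = np.random.default_rng(1)
    W = rng.uniform(0.1, 0.9, size=(8, 8))  # crossbar conductances: 8 inputs x 8 neurons

    def wta_step(x, W, lr=0.1):
        """One unsupervised update: the most activated output column wins,
        and its weights move toward the input pattern (Hebbian-style)."""
        currents = x @ W                          # column currents, as in a crossbar read
        winner = int(np.argmax(currents))         # winner-take-all among output neurons
        W[:, winner] += lr * (x - W[:, winner])   # nudge conductances toward x
        return winner

    # Two hypothetical binary input patterns; repetition clusters them.
    patterns = [np.array([1, 1, 1, 1, 0, 0, 0, 0], float),
                np.array([0, 0, 0, 0, 1, 1, 1, 1], float)]
    for _ in range(50):
        for x in patterns:
            wta_step(x, W)
    print([wta_step(x, W, lr=0.0) for x in patterns])  # winner index per pattern

After training, each pattern typically activates its own column most strongly, which is the essence of unsupervised clustering in such hardware.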
Furthermore, an unsupervised learning method in a probabilistic neural network that utilized metal-oxide memristive devices as multi-state synapses was implemented to effectively cluster and interpret complex, unlabelled data in real-world scenarios[101]. As the spiking deep learning paradigm gained momentum,
however, traditional programming frameworks struggled to meet the growing demands for automatic differentiation, parallel computation acceleration, and efficient processing and deployment of neuromorphic datasets. To address these limitations, the SpikingJelly framework was proposed to optimize
the performance and scalability of SNNs[102] [Figure 6G]. In contrast to previous works, CNNs have been