Peer-Reviewed

Global Asymptotic Stability for a New Class of Neutral Neural Networks

Received: 28 August 2018     Accepted: 17 October 2018     Published: 9 November 2018
Abstract

Due to the complicated dynamic properties of neural cells, many dynamic neural networks are described by neutral functional differential equations, including neutral delay differential equations. Such networks are called neutral neural networks, or neural networks of neutral type. The differential expression involves not only the derivative term of the current state but also the derivative term of a past state. In this paper, the global asymptotic stability of a neutral-type neural network with time-varying delays is presented and analyzed. The network consists of a linear part, a non-linear part, a non-linear delayed part, time delays in the time-derivative states, and a part involving an activation function of the derivative. Unlike the networks considered in prior references, the last part involves an activation function of the derivative rather than multiple delays; this constitutes a new class of neutral neural networks. It is assumed that the activation functions satisfy Lipschitz conditions, so that the considered system has a unique equilibrium point. By constructing a Lyapunov-Krasovskii-type functional and using a linear matrix inequality (LMI) technique, a sufficient condition for the global asymptotic stability of this neural network is obtained. Finally, a numerical example is presented to show the effectiveness and applicability of the proposed approach.
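The class of systems described in the abstract can be illustrated with a small simulation sketch. The model form below (a generic neutral-type network x'(t) = -Ax(t) + Bf(x(t)) + Cf(x(t-τ)) + Dx'(t-τ) + u with a constant delay, Euler-discretized, and a Lipschitz activation f = tanh) and all matrices are illustrative assumptions, not the paper's model or numerical example:

```python
import numpy as np

def simulate(A, B, C, D, u, f, tau, h=1e-3, T=20.0, x0=None):
    """Euler simulation of a neutral-type network with constant delay tau.

    A history buffer stores both past states (for the delayed activation
    term) and past derivatives (for the neutral term D x'(t - tau)).
    The pre-history state and derivative are taken as constant/zero.
    """
    n = A.shape[0]
    steps = int(round(T / h))
    delay_steps = int(round(tau / h))
    x = np.zeros((steps + 1, n))
    dx = np.zeros((steps + 1, n))          # stored derivatives for the neutral term
    x[0] = x0 if x0 is not None else np.ones(n)
    for k in range(steps):
        kd = max(k - delay_steps, 0)       # index of the delayed sample
        rhs = -A @ x[k] + B @ f(x[k]) + C @ f(x[kd]) + D @ dx[kd] + u
        dx[k] = rhs
        x[k + 1] = x[k] + h * rhs
    return x

# Illustrative data: strong self-decay, weak couplings, small neutral term.
A = np.diag([2.0, 2.0])
B = np.array([[0.2, -0.1], [0.1, 0.2]])
C = np.array([[0.1, 0.1], [-0.1, 0.1]])
D = np.array([[0.1, 0.0], [0.0, 0.1]])    # neutral coefficient, spectral radius < 1
u = np.zeros(2)
f = np.tanh                               # Lipschitz activation (constant 1)

traj = simulate(A, B, C, D, u, f, tau=0.5)
print(np.linalg.norm(traj[-1]))           # small: trajectory settles at the equilibrium
```

Here the self-decay in A dominates the Lipschitz constants of the coupling terms and the neutral matrix D has spectral radius below 1, the kind of regime in which LMI-based sufficient conditions are typically feasible; the trajectory then converges to the unique equilibrium (the origin, since u = 0 and tanh(0) = 0).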

Published in International Journal of Applied Mathematics and Theoretical Physics (Volume 4, Issue 3)
DOI 10.11648/j.ijamtp.20180403.13
Page(s) 78-83
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2018. Published by Science Publishing Group

Keywords

Global Asymptotic Stability, Neutral Neural Network, Time-Varying Delay, Sufficient Condition

References
[1] J. J. Hopfield. Neurons with graded response have collective computational properties like those of two-state neurons. Proc. Natl. Acad. Sci. USA, 1984, 81: 3088-3092.
[2] S. Abe, J. Kawakami, K. Hirasawa. Solving inequality constrained combinatorial optimization problems by the Hopfield neural networks. Neural Networks, 1992, 5: 663-670.
[3] H. Tamura, Z. Zhang, X. S. Xu, M. Ishii, Z. Tang. Lagrangian object relaxation neural network for combinatorial optimization problems, Neurocomputing, 2005, 68: 297-305.
[4] R. L. Wang, Z. Tang, Q. P. Cao. A learning method in Hopfield neural network for combinatorial optimization problem. Neurocomputing, 2002, 48: 1021-1024.
[5] S. Rout, Seethalakshmy, P. Srivastava, J. Majumdar. Multi-modal image segmentation using a modified Hopfield neural network. Pattern Recognition, 1998, 31: 743-750.
[6] R. Sammouda, N. Adgaba, A. Touir, A. Al-Ghamdi. Agriculture satellite image segmentation using a modified artificial Hopfield neural network. Computers in Human Behavior, 2014, 30: 436-441.
[7] P. Suganthan, E. Teoh, D. Mital. Pattern recognition by homomorphic graph matching using Hopfield neural networks. Image and Vision Computing, 1995, 13: 45-60.
[8] N. Laskaris, S. Fotopoulos, P. Papathanasopoulos, A. Bezerianos. Robust moving averages, with Hopfield neural network implementation, for monitoring evoked potential signals. Electroencephalography and Clinical Neurophysiology/Evoked Potentials Section, 1997, 104: 151-156.
[9] D. Calabuig, J. F. Monserrat, D. Gómez-Barquero, O. Lázaro. An efficient dynamic resource allocation algorithm for packet-switched communication networks based on Hopfield neural excitation method. Neurocomputing, 2008, 71: 3439-3446.
[10] W. Zhang. A weak condition of globally asymptotic stability for neural networks. Applied Mathematics Letters, 2006, 19: 1210-1215.
[11] X. Li, Z. Chen. Stability properties for Hopfield neural networks with delays and impulsive perturbations. Nonlinear Analysis: Real World Applications, 2009, 10: 3253-3265.
[12] L. Wang, Y. Gao. Global exponential robust stability of reaction–diffusion interval neural networks with time-varying delays. Physics Letters A, 2006, 350: 342-348.
[13] X. Lou, Q. Ye, B. Cui. Parameter-dependent robust stability of uncertain neural networks with time-varying delay. Journal of the Franklin Institute, 2012, 349: 1891-1903.
[14] C. Marcus, R. Westervelt. Stability of analog neural networks with delay. Physical Review A, 1989, 39: 347-359.
[15] J. Wu. Symmetric functional-differential equations and neural networks with memory. Transactions of the American Mathematical Society, 1998, 350: 4799-4838.
[16] J. Wu, X. Zou. Patterns of sustained oscillations in neural networks with time delayed interactions. Applied Mathematics and Computation, 1995, 73: 55-75.
[17] K. Gopalsamy, X. He. Stability in asymmetric Hopfield nets with transmission delays. Physica D, 1994, 76: 344-358.
[18] P. van den Driessche, X. Zou. Global attractivity in delayed Hopfield neural network models. SIAM Journal on Applied Mathematics, 1998, 58: 1878-1890.
[19] H. Zhao. Global asymptotic stability of Hopfield neural network involving distributed delays. Neural Networks, 2004, 17: 47–53.
[20] R. Rakkiyappan, P. Balasubramaniam. Delay-dependent asymptotic stability for stochastic delayed recurrent neural networks with time varying delays. Applied Mathematics and Computation, 2008, 198: 526–533.
[21] H. Xu, Y. Chen, K. L. Teo. Global exponential stability of impulsive discrete-time neural networks with time-varying delays. Applied Mathematics and Computation, 2010, 217: 537-544.
[22] R. Luo, H. Xu, W. Wang, J. Sun, W. Xu. A weak condition for global stability of delayed neural networks. Journal of Industrial and Management Optimization, 2016, 12: 505-514.
[23] R. Samli, S. Arik. New results for global stability of a class of neutral-type neural systems with time delays. Applied Mathematics and Computation, 2009, 210: 564–570.
[24] X. Zeng, Z. Xiong, C. Wang. Hopf bifurcation for neutral-type neural network model with two delays. Applied Mathematics and Computation, 2016, 282:17–31.
[25] C. Bai. Global stability of almost periodic solutions of Hopfield neural networks with neutral time-varying delays. Applied Mathematics and Computation, 2008, 203: 72–79.
[26] Z. Orman. New sufficient conditions for global stability of neutral-type neural networks with time delays. Neurocomputing, 2012, 97: 141-148.
[27] S. Lakshmanan, J. H. Park, H. Y. Jung, O. M. Kwon, R. Rakkiyappan. A delay partitioning approach to delay-dependent stability analysis for neutral type neural networks with discrete and distributed delays. Neurocomputing, 2013, 111: 81-89.
[28] J. Ye, H. Xu, E. Feng, Z. Xiu. Optimization of a fed-batch bioreactor for 1,3-propanediol production using hybrid nonlinear optimal control. Journal of Process Control, 2014, 24: 1556-1569.
[29] R. Luo, H. Xu, W. Wang, X. Wang. A new stability criterion of neutral neural networks with time-varying delays. Pacific Journal of Optimization, 2016, 12: 487-496.
[30] H. Zeng, K. L. Teo, Y. He, H. Xu, W. Wang. Sampled-data synchronization control for chaotic neural networks subject to actuator saturation. Neurocomputing, 2017, 260: 25-31.
Cite This Article
  • APA Style

    Ricai Luo, Gang Lin, Dongdong Yin. (2018). Global Asymptotic Stability for a New Class of Neutral Neural Networks. International Journal of Applied Mathematics and Theoretical Physics, 4(3), 78-83. https://doi.org/10.11648/j.ijamtp.20180403.13


    ACS Style

    Ricai Luo; Gang Lin; Dongdong Yin. Global Asymptotic Stability for a New Class of Neutral Neural Networks. Int. J. Appl. Math. Theor. Phys. 2018, 4(3), 78-83. doi: 10.11648/j.ijamtp.20180403.13


    AMA Style

    Ricai Luo, Gang Lin, Dongdong Yin. Global Asymptotic Stability for a New Class of Neutral Neural Networks. Int J Appl Math Theor Phys. 2018;4(3):78-83. doi: 10.11648/j.ijamtp.20180403.13


  • @article{10.11648/j.ijamtp.20180403.13,
      author = {Ricai Luo and Gang Lin and Dongdong Yin},
      title = {Global Asymptotic Stability for a New Class of Neutral Neural Networks},
      journal = {International Journal of Applied Mathematics and Theoretical Physics},
      volume = {4},
      number = {3},
      pages = {78-83},
      doi = {10.11648/j.ijamtp.20180403.13},
      url = {https://doi.org/10.11648/j.ijamtp.20180403.13},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ijamtp.20180403.13},
      abstract = {Due to the complicated dynamic properties of neural cells, many dynamic neural networks are described by neutral functional differential equations, including neutral delay differential equations. Such networks are called neutral neural networks, or neural networks of neutral type. The differential expression involves not only the derivative term of the current state but also the derivative term of a past state. In this paper, the global asymptotic stability of a neutral-type neural network with time-varying delays is presented and analyzed. The network consists of a linear part, a non-linear part, a non-linear delayed part, time delays in the time-derivative states, and a part involving an activation function of the derivative. Unlike the networks considered in prior references, the last part involves an activation function of the derivative rather than multiple delays; this constitutes a new class of neutral neural networks. It is assumed that the activation functions satisfy Lipschitz conditions, so that the considered system has a unique equilibrium point. By constructing a Lyapunov-Krasovskii-type functional and using a linear matrix inequality (LMI) technique, a sufficient condition for the global asymptotic stability of this neural network is obtained. Finally, a numerical example is presented to show the effectiveness and applicability of the proposed approach.},
     year = {2018}
    }
    


  • TY  - JOUR
    T1  - Global Asymptotic Stability for a New Class of Neutral Neural Networks
    AU  - Ricai Luo
    AU  - Gang Lin
    AU  - Dongdong Yin
    Y1  - 2018/11/09
    PY  - 2018
    N1  - https://doi.org/10.11648/j.ijamtp.20180403.13
    DO  - 10.11648/j.ijamtp.20180403.13
    T2  - International Journal of Applied Mathematics and Theoretical Physics
    JF  - International Journal of Applied Mathematics and Theoretical Physics
    JO  - International Journal of Applied Mathematics and Theoretical Physics
    SP  - 78
    EP  - 83
    PB  - Science Publishing Group
    SN  - 2575-5927
    UR  - https://doi.org/10.11648/j.ijamtp.20180403.13
    AB  - Due to the complicated dynamic properties of neural cells, many dynamic neural networks are described by neutral functional differential equations, including neutral delay differential equations. Such networks are called neutral neural networks, or neural networks of neutral type. The differential expression involves not only the derivative term of the current state but also the derivative term of a past state. In this paper, the global asymptotic stability of a neutral-type neural network with time-varying delays is presented and analyzed. The network consists of a linear part, a non-linear part, a non-linear delayed part, time delays in the time-derivative states, and a part involving an activation function of the derivative. Unlike the networks considered in prior references, the last part involves an activation function of the derivative rather than multiple delays; this constitutes a new class of neutral neural networks. It is assumed that the activation functions satisfy Lipschitz conditions, so that the considered system has a unique equilibrium point. By constructing a Lyapunov-Krasovskii-type functional and using a linear matrix inequality (LMI) technique, a sufficient condition for the global asymptotic stability of this neural network is obtained. Finally, a numerical example is presented to show the effectiveness and applicability of the proposed approach.
    VL  - 4
    IS  - 3
    ER  - 


Author Information
  • School of Mathematics and Statistics, Hechi University, Yizhou, China

  • School of Design and Built Environment, Curtin University, Perth, Australia

  • School of Mathematics and Statistics, Hechi University, Yizhou, China
