New generalized Halanay inequalities and their applications to neural networks with variable delays

The asymptotic behavior of solutions for a new class of generalized Halanay inequalities is studied via the fixed point method. This research provides a new approach to the stability analysis of Halanay inequalities. To make the application of the fixed point method in stability research more flexible and feasible, we introduce suitable auxiliary functions to construct an operator according to the different characteristics of the coefficients. The results obtained in this paper are applied to the stability study of a neural network system, which is of high application value. Moreover, three examples with simulations are given to illustrate the results. The conclusions in this paper substantially improve and generalize the related results in the current literature.


Introduction
Delay dynamical systems have been applied in many fields, such as medicine, biology, neural networks, physics, electrical engineering, and other areas of engineering and science. Stability has always been the most widely studied topic in the theory of dynamical systems, and research on the stability of delay dynamical systems has consequently been very fruitful. Recently, as a generalization of dynamical systems, many authors have studied the stability of Halanay inequality systems.

Many authors have used stability results for Halanay inequalities to study the synchronization and stability of neural networks. When studying the stability of dynamical systems, most authors (such as [12,13]) used Lyapunov's direct method. Yet there are many problems that make this method invalid. To resolve the difficulties encountered with Lyapunov's direct method, Burton and other authors [14][15][16][17][18] investigated the stability of stochastic dynamical systems driven by Brownian motion using fixed point theory. Later, Shahram Rezapour and his collaborators [19][20][21] used the fixed point method to study properties associated with the solutions of stochastic fractional differential systems. The results showed that the fixed point method can overcome many problems in the study of the stability of dynamical systems.
However, when Halanay inequalities are used to discuss the stability of dynamical systems, the fixed point method is seldom employed. In this paper, we study the asymptotic stability of dynamical systems with variable delays via generalized Halanay inequalities by the fixed point method. In particular, the conclusions obtained improve and extend the results of several existing papers; see the examples in Sect. 4.
The remaining parts of the paper are organized as follows. The main theoretical conclusions are first proposed and then proved in Sect. 2. The conclusions of Sect. 2 are applied to study the global stability of neural networks in Sect. 3. Examples with numerical simulations are presented in Sect. 4. Conclusions are given in Sect. 5.
For ξ ∈ S and ϕ ∈ S, we have (2.12). Therefore, the operator defined above is a contraction mapping and, by the contraction mapping principle, it has a unique fixed point y(t) in S. This fixed point is a solution of (2.2) with y(s) = |ψ(s)| on [ψ(0), 0) and |y(t)| → 0 as t → ∞.
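The contraction mapping principle used here can be illustrated numerically: iterating any contraction converges to its unique fixed point from an arbitrary starting value. The following Python sketch uses T(x) = cos(x), a simple contraction on [0, 1] chosen purely for illustration (it is not the operator constructed in the proof above).

```python
import math

def fixed_point(T, x0, tol=1e-12, max_iter=1000):
    """Banach fixed-point iteration: repeatedly apply the contraction T
    until successive iterates differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        x_next = T(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# T(x) = cos(x) is a contraction on [0, 1] (|T'(x)| = |sin x| <= sin 1 < 1),
# so it has a unique fixed point, reached from any starting value in [0, 1].
x_star = fixed_point(math.cos, 0.5)
print(x_star)  # the unique solution of x = cos(x), roughly 0.739085
```

The same convergence is obtained from any other initial guess in [0, 1], which is exactly the uniqueness asserted by the contraction mapping principle.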
Conversely, assume that condition (iii) is not met. Then, by condition (i), there is a sequence {t_l}, t_l → ∞ as l → ∞, such that lim_{l→∞} ∫_0^{t_l} f(s) ds = p for some p ∈ ℝ. We can select a constant Q > 0 satisfying 0 < ∫_0^{t_l} f(s) ds ≤ Q for all l ≥ 1. For simplification, we define A(s) as follows: By condition (ii), we have This yields From the above, since {∫_0^{t_l} e^{∫_0^s f(v) dv} A(s) ds} is bounded, it has a convergent subsequence. For convenience, we may suppose that there exists some γ ∈ ℝ^+ such that Then we can find an integer k > 0 large enough such that, for all l ≥ k, where β = sup_{t∈[0,+∞)} e^{−∫_0^t f(v) dv} and θ > 0 satisfies 8θβe^Q + α < 1. Next, we discuss the solution y(t) = y(t, t_k, |ψ|) of system (2.2) with |ψ(t_k)| = θ and |ψ(s)| ≤ θ for s ≤ t_k. Then |y(t)| ≤ 1 for t ≥ t_k. We may select ψ such that From (2.6), we obtain However, provided g_i(t_l) → ∞ as l → ∞ holds. From condition (ii), we obtain an estimate for |y(t_l)| that contradicts (2.14). Therefore, condition (iii) is essential for the asymptotic stability of system (2.2). Thus, system (2.1) is asymptotically stable if and only if condition (iii) holds. The proof is complete.

Applications
Consider the Grossberg-Hopfield neural network with multiple time-varying delays as follows: Here, the self-inhibition terms a_{ij}(t), the interconnection weights b_{ij}(t), c_{ij}(t), and h_j(t), g_j(t), k_{ij}(t) are scalar integrable functions for t ∈ [0, +∞); the inputs I_i(t) : ℝ^+ → ℝ are continuous functions, i ∈ I_m. ψ(0) is defined as above.

Definition 3.1 (Gopalsamy [22]) The solution u(t) = (u_1(t), . . ., u_n(t)) of (3.1) is globally asymptotically stable if and only if every other solution v(t) = (v_1(t), . . ., v_n(t)) of (3.1) with v_i(0) > 0 (i ∈ I_m) is defined for all t > 0 and satisfies

Theorem 3.1 Let the functions h_j(t), g_j(t) satisfy the Lipschitz condition with Lipschitz constants L_j, P_j and be differentiable (j ∈ I_m). Assume that there is a positive constant α < 1 and some functions Then the neural network system (3.1) is globally asymptotically stable if and only if (iii) holds.

Proof For system (3.1), we know From the conditions of Theorem 3.1, we can obtain the following inequalities: For t ≥ 0, define y(t) := max{x_i(t), i ∈ I_m}. For all t ∈ [0, +∞), let i_t stand for the index such that y(t) = |x_{i_t}(t)|. So, for t ≥ 0, we have

y′(t) ≤ Σ_{j=1}^{m} (−a_j(t)) y(t) + Σ_{j=1}^{m} b_j(t) L_j y(t) + Σ_{j=1}^{m} c_j(t) P_j sup_{k(t)≤s≤t} y(s), t ≥ 0, and y(t) = sup_{k(0)≤s≤0} x(s), t ≤ 0.
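The stability asserted by Theorem 3.1 can be checked numerically on a concrete network. The Python sketch below integrates, by the forward Euler method, a two-neuron network of Hopfield type with a single constant delay; the coefficients, the tanh activation, and the delay are illustrative assumptions, not the specific data of system (3.1). Because the self-inhibition dominates the summed instantaneous and delayed weights, the state decays to the zero equilibrium.

```python
import math

def simulate_delayed_network(T=20.0, dt=0.01):
    """Forward-Euler simulation of a 2-neuron Hopfield-type network with a
    constant delay tau (an illustrative stand-in for a system like (3.1)):
        u_i'(t) = -a_i u_i(t) + sum_j b_ij tanh(u_j(t))
                              + sum_j c_ij tanh(u_j(t - tau)) + I_i."""
    a = [2.0, 2.5]                      # self-inhibition (assumed constant)
    b = [[0.3, -0.2], [0.1, 0.25]]      # instantaneous interconnection weights
    c = [[0.2, 0.1], [-0.15, 0.2]]      # delayed interconnection weights
    I = [0.0, 0.0]                      # external inputs
    tau = 1.0
    lag = int(tau / dt)                 # delay measured in time steps
    # constant initial function on [-tau, 0]
    u = [[1.0, -0.5] for _ in range(lag + 1)]
    for _ in range(int(T / dt)):
        cur, past = u[-1], u[-1 - lag]  # states at t and t - tau
        nxt = []
        for i in range(2):
            rate = -a[i] * cur[i] + I[i]
            for j in range(2):
                rate += b[i][j] * math.tanh(cur[j])
                rate += c[i][j] * math.tanh(past[j])
            nxt.append(cur[i] + dt * rate)
        u.append(nxt)
    return u

traj = simulate_delayed_network()
print(traj[-1])  # both components have decayed toward the zero equilibrium
```

Here each a_i exceeds the total Lipschitz contribution of the weights in its row (tanh is 1-Lipschitz), which is the kind of dominance condition the theorem requires, so the trajectory converges regardless of the initial function.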

Examples
In this section, we present some examples and numerical simulations to test and verify our main conclusions.
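As a warm-up to these examples, recall that the classical constant-coefficient Halanay inequality y′(t) ≤ −a y(t) + b sup_{t−τ≤s≤t} y(s), with constants a > b > 0, already forces exponential decay. The following Python sketch integrates the equality case with assumed illustrative constants (it is not one of the specific systems treated in this section) and shows the trajectory tending to zero.

```python
def halanay_trajectory(a=2.0, b=1.0, tau=1.0, T=30.0, dt=0.01):
    """Euler scheme for the scalar equality case of the classical Halanay
    inequality  y'(t) = -a*y(t) + b * sup_{t-tau <= s <= t} y(s).
    For constants a > b > 0 the solution decays exponentially."""
    lag = int(tau / dt)
    y = [1.0] * (lag + 1)                  # constant history on [-tau, 0]
    for _ in range(int(T / dt)):
        window_sup = max(y[-(lag + 1):])   # sup of y over [t - tau, t]
        y.append(y[-1] + dt * (-a * y[-1] + b * window_sup))
    return y

y = halanay_trajectory()
print(y[-1])  # a small positive value: exponential decay since a > b
```

With a = 2 and b = 1 the decay rate is the root of λ = a − b·e^{λτ}, roughly 0.44 here, so after T = 30 the state is far below its initial value.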
Remark 4.3 Because a_i(t), b_{ij}(t), c_{ij}(t) (i, j ∈ I_m) are unbounded, Theorem 3 in [9] and Proposition 3 in [10] cannot be applied to system (4.3). Moreover, because the delays are unbounded, Theorem 3 in [11] is also inapplicable.

Conclusion
In this note, we first used the fixed point method to study a new kind of generalized Halanay inequalities and obtained some sufficient conditions for asymptotic behavior. Then we applied our conclusions to the study of the asymptotic synchronization and convergence of neural network systems. Finally, we presented some examples and numerical simulations to test and verify our main conclusions. The conclusions in this note improve and generalize the related results in [4][5][6][7][8][9][10][11]. Also, to the authors' knowledge, the study of stochastic differential systems with time lag driven jointly by Brownian and fractional Brownian motions is rare, and only the existence, uniqueness, and convergence of solutions have been studied. The study of stochastic time-lagged partial differential systems jointly driven by Brownian and fractional Brownian motion is even rarer at present. Therefore, studying the properties associated with the solutions of such systems is our future research goal.

From Theorem 2.1, we can get the conclusion of Theorem 3.1. The proof is complete.

Example 4.1
Consider a delay dynamical system