JOURNAL OF SCIENCE OF HNUE
Natural Sci., 2008, Vol. 53, No. 5, pp. 9-19
NOVEL RESULTS ON THE GLOBAL EXPONENTIAL STABILITY
OF CELLULAR NEURAL NETWORKS WITH VARIABLE
COEFFICIENTS AND UNBOUNDED DELAYS
Tran Thi Loan and Duong Anh Tuan
Hanoi National University of Education
Abstract. In this article, we study cellular neural networks (CNNs) with time-varying coefficients and bounded and unbounded delays. By introducing a new Liapunov function to handle unbounded delays and using the technique of the Young inequality, we obtain some sufficient conditions which ensure the global exponential stability of CNNs with unbounded delays, without any assumption on the boundedness of the activation functions. The main results in this paper are new and complement previously known results.
1. Introduction
It is well-known that CNNs, proposed by L.O. Chua and L. Yang in 1988 (see [4]), have been extensively studied in both theory and applications. They have been successfully applied in signal processing, pattern recognition, associative memories and especially in static image treatment. Such applications rely on the qualitative properties of the neural networks. In hardware implementations, time delays occur due to the finite switching speed of the amplifiers and to communication time. Time delays may lead to oscillation and, furthermore, to instability of the networks [1]. On the other hand, it is also known that the processing of moving images requires the introduction of delays in the signal transmission among the networks [3]. Therefore, the study of the stability of neural networks with delays is of practical importance.
We know that fixed time delays in models of delayed feedback systems serve as a good approximation for simple circuits having a small number of cells. A neural network usually has a spatial nature due to the presence of various parallel pathways, so it is desirable to model it by introducing unbounded delays.
In this paper we consider general neural networks with variable and unbounded
time delays of the form
$$\frac{dx_i(t)}{dt}=-d_i(t)x_i(t)+\sum_{j=1}^{n}a_{ij}(t)f_j(x_j(t))+\sum_{j=1}^{n}b_{ij}(t)g_j(x_j(t-\tau_{ij}(t)))+\sum_{j=1}^{n}c_{ij}(t)\int_{-\infty}^{t}k_{ij}(t-s)h_j(x_j(s))\,ds+I_i(t),\quad(i=\overline{1,n}),\tag{1.1}$$
where $x_i$ is the state of the $i$th neuron $(i=\overline{1,n})$; $n$ is the number of neurons; $A(t)=(a_{ij}(t))_{n\times n}$, $B(t)=(b_{ij}(t))_{n\times n}$, $C(t)=(c_{ij}(t))_{n\times n}$ are the connection matrices; $I(t)=(I_1(t),\dots,I_n(t))^T$ is the input vector; $f_i$, $g_i$, $h_i$ are the activation functions of the neurons; $D(t)=\operatorname{diag}(d_1(t),\dots,d_n(t))$, where $d_i(t)$ represents the rate at which the $i$th unit resets its potential to the resting state in isolation when disconnected from the network; $k_{ij}(t)$ $(i,j=\overline{1,n})$ are the kernel functions; and $\tau_{ij}(t)$ $(i,j=\overline{1,n})$ are the delays.
Results on the stability of this model are still few. Moreover, most of those results have been derived for models with constant coefficients [5,7]. On the other hand, previous authors have assumed that the activation function $f$ is bounded (see [7,8]) or that $f(0)=0$ (see [8]).
In this paper, we use the Young inequality and construct a suitable Liapunov function to give some new sufficient conditions for the global exponential stability of the system (1.1). We require neither that the activation functions be bounded, nor that $f(0)=0$, nor that the system (1.1) have an equilibrium point. Moreover, the main results in [2,6] are, in some respects, special cases of the main results in this paper.
The rest of the paper is organized as follows. Section 2 presents some defini-
tions and assumptions. In Section 3, the global exponential stability is obtained. An
example is given in Section 4 to illustrate our results.
2. Definitions and assumptions
In this section, we give some assumptions that are used in the next section. We consider system (1.1) under the following assumptions.
(H1) The functions $d_i(t)$, $a_{ij}(t)$, $b_{ij}(t)$, $c_{ij}(t)$ and $I_i(t)$ $(i,j=\overline{1,n})$ are defined, bounded and continuous on $\mathbb{R}^+$. The functions $\tau_{ij}(t)$ $(i,j=\overline{1,n})$ are defined, nonnegative, bounded by a constant $\tau$ and continuously differentiable on $\mathbb{R}^+$, with $\inf_{t\in\mathbb{R}^+}(1-\dot{\tau}_{ij}(t))>0$, where $\dot{\tau}_{ij}(t)$ is the derivative of $\tau_{ij}(t)$ with respect to $t$.
(H2) The functions $k_{ij}:[0,\infty)\to[0,\infty)$ $(i,j=\overline{1,n})$ are piecewise continuous on $[0,\infty)$ and satisfy $\int_0^\infty e^{\varepsilon s}k_{ij}(s)\,ds=p_{ij}(\varepsilon)$, where the $p_{ij}(\varepsilon)$ are continuous functions on $[0,\delta)$, $\delta>0$, and $p_{ij}(0)=1$.
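As a concrete illustration (not part of the hypotheses themselves), the exponential kernel $k_{ij}(s)=5e^{-5s}$ used later in Section 4 satisfies (H2). The following sketch checks this numerically; the truncation bound, step count, and the closed form $p(\varepsilon)=5/(5-\varepsilon)$ noted in the comment are our own additions.

```python
import math

def p(eps, rate=5.0, upper=20.0, steps=200000):
    """Approximate p(eps) = int_0^inf e^{eps*s} * rate*e^{-rate*s} ds by the
    trapezoidal rule on [0, upper]; the tail beyond `upper` is negligible
    for eps < rate."""
    h = upper / steps
    total = 0.0
    for n in range(steps + 1):
        s = n * h
        v = math.exp(eps * s) * rate * math.exp(-rate * s)
        total += v if 0 < n < steps else v / 2.0
    return total * h

# Closed form: int_0^inf e^{eps*s} 5 e^{-5s} ds = 5/(5 - eps) for eps in [0, 5),
# so p is continuous on [0, delta) for any delta <= 5 and p(0) = 1, as (H2) asks.
print(p(0.0))  # ≈ 1.0
print(p(1.0))  # ≈ 5/4 = 1.25
```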
(H3) There are positive constants $H_i$, $K_i$, $L_i$ $(i=\overline{1,n})$ such that $0\le|f_i(u)-f_i(u^*)|\le H_i|u-u^*|$; $|g_i(u)-g_i(u^*)|\le K_i|u-u^*|$; $|h_i(u)-h_i(u^*)|\le L_i|u-u^*|$ for all $u,u^*\in\mathbb{R}$ and $i=\overline{1,n}$.
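For instance, the activation $f(u)=\frac12(u+\arctan u)$ used in the example of Section 4 satisfies (H3) with constant $1$, since $0<f'(u)=\frac12\big(1+\frac{1}{1+u^2}\big)\le1$. A quick spot check (the sample grid is an arbitrary choice):

```python
import math

def f(u):
    # activation from Section 4: f(u) = (u + arctan u) / 2
    return 0.5 * (u + math.atan(u))

# check |f(u) - f(v)| <= 1 * |u - v| on a grid of sample pairs
pts = [i * 0.37 - 5.0 for i in range(28)]
ok = all(abs(f(u) - f(v)) <= abs(u - v) + 1e-12 for u in pts for v in pts)
print(ok)  # True
```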
(H4) There are a positive constant $a$, bounded functions $h_{ij}(t)$, $l_{ij}(t)$, $p_{ij}(t)$, $q_{ij}(t)$, $m_{ij}(t)$, $\gamma_{ij}(t)$, $\beta_{ij}(t)$, $\omega_i(t)$ with $\inf_{t\ge0}\omega_i(t)>0$ $(i,j=\overline{1,n})$, and $r\ge1$ such that
$$\begin{aligned}
&-\dot{\omega}_i(t)+r\omega_i(t)d_i(t)-(r-1)\sum_{j=1}^n\omega_i(t)|a_{ij}(t)|^{\frac{r-h_{ij}(t)}{r-1}}H_j^{\frac{r-q_{ij}(t)}{r-1}}-\sum_{j=1}^n\omega_j(t)|a_{ji}(t)|^{h_{ji}(t)}H_i^{q_{ji}(t)}\\
&-(r-1)\sum_{j=1}^n\omega_i(t)|b_{ij}(t)|^{\frac{r-l_{ij}(t)}{r-1}}K_j^{\frac{r-p_{ij}(t)}{r-1}}-\sum_{j=1}^n\omega_j(\psi_{ji}^{-1}(t))\frac{|b_{ji}(\psi_{ji}^{-1}(t))|^{l_{ji}(\psi_{ji}^{-1}(t))}}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}K_i^{p_{ji}(\psi_{ji}^{-1}(t))}\\
&-(r-1)\sum_{j=1}^n\omega_i(t)\int_0^\infty|k_{ij}(s)|^{\frac{r-\gamma_{ij}(t)}{r-1}}|c_{ij}(t)|^{\frac{r-m_{ij}(t)}{r-1}}L_j^{\frac{r-\beta_{ij}(t)}{r-1}}\,ds\\
&-\sum_{j=1}^n\int_0^\infty|k_{ji}(s)|^{\gamma_{ji}(t+s)}\omega_j(t+s)|c_{ji}(t+s)|^{m_{ji}(t+s)}L_i^{\beta_{ji}(t+s)}\,ds\ge a\quad\text{for all }t\ge0.
\end{aligned}$$
In the case $r=1$, (H4) reduces to the following condition.
$(H_4^*)$ There are a positive constant $a$ and bounded functions $\omega_i(t)$ with $\inf_{t\ge0}\omega_i(t)>0$ such that
$$-\dot{\omega}_i(t)+\omega_i(t)d_i(t)-H_i\sum_{j=1}^n\omega_j(t)|a_{ji}(t)|-K_i\sum_{j=1}^n\omega_j(\psi_{ji}^{-1}(t))\frac{|b_{ji}(\psi_{ji}^{-1}(t))|}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}-L_i\sum_{j=1}^n\int_0^\infty k_{ji}(s)\omega_j(t+s)|c_{ji}(t+s)|\,ds\ge a\quad\text{for all }t\ge0,$$
where $\psi_{ij}^{-1}(t)$ is the inverse function of $\psi_{ij}(t)=t-\tau_{ij}(t)$.
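Note that the condition $\inf_{t\in\mathbb{R}^+}(1-\dot{\tau}_{ij}(t))>0$ in (H1) is exactly what makes $\psi_{ij}$ strictly increasing, hence invertible. As an illustrative sketch (using the delay $\tau(t)=1+\frac12\sin t$ from Section 4; the bisection scheme, iteration count, and bracketing interval are arbitrary choices), $\psi^{-1}$ can be computed numerically:

```python
import math

def psi(t):
    # psi(t) = t - tau(t) with the Section 4 delay tau(t) = 1 + 0.5*sin(t)
    return t - 1.0 - 0.5 * math.sin(t)

def psi_inv(x, lo=-100.0, hi=100.0, iters=200):
    """Invert psi by bisection; psi is strictly increasing because
    psi'(t) = 1 - 0.5*cos(t) >= 0.5 > 0, which is what inf(1 - tau') > 0 buys."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if psi(mid) < x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# round-trip error psi(psi_inv(x)) - x at a few sample points
err = max(abs(psi(psi_inv(x)) - x) for x in (-2.0, 0.0, 3.7))
print(err)  # essentially zero
```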
We denote by $BC$ the Banach space of bounded continuous functions $\phi:(-\infty,0]\to\mathbb{R}^n$ with the norm $\|\phi\|=\Big(\sum_{i=1}^n\sup_{s\le0}|\phi_i(s)|^r\Big)^{\frac1r}$.
The initial condition associated with (1.1) has the form
$$x(\theta)=\phi(\theta),\quad\theta\in(-\infty,0],\ \text{where }\phi\in BC.\tag{2.1}$$
We know that if hypotheses (H1) and (H2) are satisfied, then the system (1.1) has a unique solution $x(t)=(x_1(t),\dots,x_n(t))^T$ satisfying the initial condition (2.1) (see [11]).
Definition 2.1. The system (1.1) is said to be globally exponentially stable if there are constants $\varepsilon>0$ and $M\ge1$ such that for any two solutions $x(t)$, $y(t)$ of the system (1.1) with the initial functions $\phi$, $\psi$, respectively, one has
$$\|x(t)-y(t)\|=\Big(\sum_{i=1}^n|x_i(t)-y_i(t)|^r\Big)^{\frac1r}\le M\|\psi-\phi\|e^{-\varepsilon t}\quad\text{for all }t\in\mathbb{R}^+.$$
3. Global exponential stability
In this section, by constructing a suitable Liapunov function and using the
technique of Young inequality, we derive some sufficient conditions for the global
exponential stability of the system (1.1).
Theorem 3.1. If the hypotheses (H1), (H2), (H3) and (H4) are satisfied then the
system (1.1) is globally exponentially stable.
Proof. Let $x(t)$, $y(t)$ be two arbitrary solutions of the system (1.1) with initial functions $\phi$, $\psi$, respectively. Setting $z_i(t)=x_i(t)-y_i(t)$, we have
$$\begin{aligned}
\frac{dz_i(t)}{dt}=&-d_i(t)z_i(t)+\sum_{j=1}^na_{ij}(t)[f_j(x_j(t))-f_j(y_j(t))]\\
&+\sum_{j=1}^nb_{ij}(t)[g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))]\\
&+\sum_{j=1}^nc_{ij}(t)\int_{-\infty}^tk_{ij}(t-s)[h_j(x_j(s))-h_j(y_j(s))]\,ds.
\end{aligned}$$
We consider two cases: r > 1 and r = 1.
When $r>1$, define a Liapunov function as follows:
$$V(t,z_t)=V_1(t,z_t)+V_2(t,z_t)+V_3(t,z_t),$$
where
$$V_1(t,z_t)=\sum_{i=1}^n\omega_i(t)|z_i(t)|^re^{r\varepsilon t},$$
$$V_2(t,z_t)=\sum_{i=1}^n\sum_{j=1}^n\int_{t-\tau_{ij}(t)}^t\omega_i(\psi_{ij}^{-1}(s))\frac{|b_{ij}(\psi_{ij}^{-1}(s))|^{l_{ij}(\psi_{ij}^{-1}(s))}}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(s))}K_j^{p_{ij}(\psi_{ij}^{-1}(s))}|z_j(s)|^re^{r\varepsilon(s+\tau_{ij}(\psi_{ij}^{-1}(s)))}\,ds,$$
$$V_3(t,z_t)=\sum_{i=1}^n\sum_{j=1}^n\int_0^\infty\int_{t-s}^t|k_{ij}(s)|^{\gamma_{ij}(u+s)}\omega_i(u+s)|c_{ij}(u+s)|^{m_{ij}(u+s)}L_j^{\beta_{ij}(u+s)}e^{r\varepsilon(u+s)}|z_j(u)|^r\,du\,ds,$$
and $\varepsilon$ will be determined later on. Calculating the Dini derivative of $V(t,z_t)$, we get
$$\begin{aligned}
D^+V_1(t,z_t)\le e^{r\varepsilon t}\sum_{i=1}^n\Big\{&(\dot{\omega}_i(t)+r\varepsilon\,\omega_i(t))|z_i(t)|^r+r\,\omega_i(t)|z_i(t)|^{r-1}\Big[-d_i(t)|z_i(t)|\\
&+\sum_{j=1}^n|a_{ij}(t)||f_j(x_j(t))-f_j(y_j(t))|\\
&+\sum_{j=1}^n|b_{ij}(t)||g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))|\\
&+\sum_{j=1}^n|c_{ij}(t)|\int_{-\infty}^tk_{ij}(t-s)|h_j(x_j(s))-h_j(y_j(s))|\,ds\Big]\Big\}\\
\le e^{r\varepsilon t}\sum_{i=1}^n\Big\{&\big(\dot{\omega}_i(t)+r\varepsilon\,\omega_i(t)-rd_i(t)\omega_i(t)\big)|z_i(t)|^r+r\,\omega_i(t)|z_i(t)|^{r-1}\Big[\sum_{j=1}^n|a_{ij}(t)|H_j|z_j(t)|\\
&+\sum_{j=1}^n|b_{ij}(t)|K_j|z_j(t-\tau_{ij}(t))|+\sum_{j=1}^n|c_{ij}(t)|\int_{-\infty}^tk_{ij}(t-s)L_j|z_j(s)|\,ds\Big]\Big\}.
\end{aligned}$$
$$D^+V_2(t,z_t)=e^{r\varepsilon t}\sum_{i=1}^n\sum_{j=1}^n\Big(\omega_i(\psi_{ij}^{-1}(t))\frac{|b_{ij}(\psi_{ij}^{-1}(t))|^{l_{ij}(\psi_{ij}^{-1}(t))}}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(t))}K_j^{p_{ij}(\psi_{ij}^{-1}(t))}e^{r\varepsilon\tau_{ij}(\psi_{ij}^{-1}(t))}|z_j(t)|^r-\omega_i(t)|b_{ij}(t)|^{l_{ij}(t)}K_j^{p_{ij}(t)}|z_j(t-\tau_{ij}(t))|^r\Big).$$
$$D^+V_3(t,z_t)=e^{r\varepsilon t}\sum_{i=1}^n\sum_{j=1}^n\Big(\int_0^\infty|k_{ij}(s)|^{\gamma_{ij}(t+s)}\omega_i(t+s)|c_{ij}(t+s)|^{m_{ij}(t+s)}L_j^{\beta_{ij}(t+s)}e^{r\varepsilon s}|z_j(t)|^r\,ds-\int_0^\infty|k_{ij}(s)|^{\gamma_{ij}(t)}\omega_i(t)|c_{ij}(t)|^{m_{ij}(t)}L_j^{\beta_{ij}(t)}|z_j(t-s)|^r\,ds\Big).$$
By using Young inequality ab ≤ ap/p+ bq/q with a > 0, b > 0, p > 1, 1/p+ 1/q = 1,
we have
$$\begin{aligned}
\sum_{j=1}^nr|z_i(t)|^{r-1}|a_{ij}(t)|H_j|z_j(t)|
&=r\sum_{j=1}^n\Big(|a_{ij}(t)|^{\frac{r-h_{ij}(t)}{r-1}}H_j^{\frac{r-q_{ij}(t)}{r-1}}|z_i(t)|^r\Big)^{\frac{r-1}{r}}\Big(|a_{ij}(t)|^{h_{ij}(t)}H_j^{q_{ij}(t)}|z_j(t)|^r\Big)^{\frac1r}\\
&\le(r-1)\sum_{j=1}^n|a_{ij}(t)|^{\frac{r-h_{ij}(t)}{r-1}}H_j^{\frac{r-q_{ij}(t)}{r-1}}|z_i(t)|^r+\sum_{j=1}^n|a_{ij}(t)|^{h_{ij}(t)}H_j^{q_{ij}(t)}|z_j(t)|^r
\end{aligned}$$
and
$$\sum_{j=1}^nr|z_i(t)|^{r-1}|b_{ij}(t)|K_j|z_j(t-\tau_{ij}(t))|\le(r-1)\sum_{j=1}^n|b_{ij}(t)|^{\frac{r-l_{ij}(t)}{r-1}}K_j^{\frac{r-p_{ij}(t)}{r-1}}|z_i(t)|^r+\sum_{j=1}^n|b_{ij}(t)|^{l_{ij}(t)}K_j^{p_{ij}(t)}|z_j(t-\tau_{ij}(t))|^r,$$
$$\begin{aligned}
\sum_{j=1}^nr\int_{-\infty}^tk_{ij}(t-s)|c_{ij}(t)||z_i(t)|^{r-1}L_j|z_j(s)|\,ds
&=\sum_{j=1}^nr\int_0^\infty k_{ij}(s)|c_{ij}(t)||z_i(t)|^{r-1}L_j|z_j(t-s)|\,ds\\
&\le(r-1)\sum_{j=1}^n\int_0^\infty|k_{ij}(s)|^{\frac{r-\gamma_{ij}(t)}{r-1}}|c_{ij}(t)|^{\frac{r-m_{ij}(t)}{r-1}}L_j^{\frac{r-\beta_{ij}(t)}{r-1}}|z_i(t)|^r\,ds\\
&\quad+\sum_{j=1}^n\int_0^\infty|k_{ij}(s)|^{\gamma_{ij}(t)}|c_{ij}(t)|^{m_{ij}(t)}L_j^{\beta_{ij}(t)}|z_j(t-s)|^r\,ds.
\end{aligned}$$
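The estimates above use nothing beyond Young's inequality with the conjugate pair $p=\frac{r}{r-1}$, $q=r$ applied to the indicated factors. A small numerical sanity check of that form of the inequality (the sample values are arbitrary):

```python
import itertools

def young_holds(a, b, r, tol=1e-12):
    # Young's inequality a*b <= a**p/p + b**q/q with p = r/(r-1), q = r
    p = r / (r - 1.0)
    q = r
    return a * b <= a ** p / p + b ** q / q + tol

vals = [0.1, 0.5, 1.0, 2.0, 7.3]
all_hold = all(young_holds(a, b, r)
               for a, b in itertools.product(vals, vals)
               for r in (1.5, 2.0, 3.0))
print(all_hold)  # True
```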
From the above inequalities and (H4), we can choose $\varepsilon>0$ such that the following estimate holds:
$$\begin{aligned}
D^+V(t,z_t)\le e^{r\varepsilon t}\sum_{i=1}^n\Big(&\dot{\omega}_i(t)+\omega_i(t)[-rd_i(t)+r\varepsilon]+(r-1)\sum_{j=1}^n\omega_i(t)|a_{ij}(t)|^{\frac{r-h_{ij}(t)}{r-1}}H_j^{\frac{r-q_{ij}(t)}{r-1}}\\
&+\sum_{j=1}^n\omega_j(t)|a_{ji}(t)|^{h_{ji}(t)}H_i^{q_{ji}(t)}+(r-1)\sum_{j=1}^n\omega_i(t)|b_{ij}(t)|^{\frac{r-l_{ij}(t)}{r-1}}K_j^{\frac{r-p_{ij}(t)}{r-1}}\\
&+\sum_{j=1}^n\omega_j(\psi_{ji}^{-1}(t))\frac{|b_{ji}(\psi_{ji}^{-1}(t))|^{l_{ji}(\psi_{ji}^{-1}(t))}}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}K_i^{p_{ji}(\psi_{ji}^{-1}(t))}e^{r\varepsilon\tau}\\
&+(r-1)\sum_{j=1}^n\omega_i(t)\int_0^\infty|k_{ij}(s)|^{\frac{r-\gamma_{ij}(t)}{r-1}}|c_{ij}(t)|^{\frac{r-m_{ij}(t)}{r-1}}L_j^{\frac{r-\beta_{ij}(t)}{r-1}}\,ds\\
&+\sum_{j=1}^n\int_0^\infty|k_{ji}(s)|^{\gamma_{ji}(t+s)}\omega_j(t+s)|c_{ji}(t+s)|^{m_{ji}(t+s)}L_i^{\beta_{ji}(t+s)}e^{r\varepsilon s}\,ds\Big)|z_i(t)|^r\\
\le-\frac{a}{2}&\sum_{i=1}^n|z_i(t)|^re^{r\varepsilon t}\le0\quad\text{for all }t\ge0.
\end{aligned}$$
Therefore, setting $\omega=\min_{i=\overline{1,n}}\inf_{t\ge0}\omega_i(t)$, we have
$$\omega\sum_{i=1}^n|z_i(t)|^re^{r\varepsilon t}\le V(t,z_t)\le V(0,z_0),\quad\forall t>0.$$
Since
$$\begin{aligned}
V(0,z_0)&=V_1(0,z_0)+V_2(0,z_0)+V_3(0,z_0)=\sum_{i=1}^n\omega_i(0)|z_i(0)|^r\\
&\quad+\sum_{i=1}^n\sum_{j=1}^n\int_{-\tau_{ij}(0)}^0\omega_i(\psi_{ij}^{-1}(s))\frac{|b_{ij}(\psi_{ij}^{-1}(s))|^{l_{ij}(\psi_{ij}^{-1}(s))}}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(s))}K_j^{p_{ij}(\psi_{ij}^{-1}(s))}|z_j(s)|^re^{r\varepsilon(s+\tau_{ij}(\psi_{ij}^{-1}(s)))}\,ds\\
&\quad+\sum_{i=1}^n\sum_{j=1}^n\int_0^\infty\int_{-s}^0|k_{ij}(s)|^{\gamma_{ij}(u+s)}\omega_i(u+s)|c_{ij}(u+s)|^{m_{ij}(u+s)}L_j^{\beta_{ij}(u+s)}e^{r\varepsilon(u+s)}|z_j(u)|^r\,du\,ds,
\end{aligned}$$
we obtain
$$V(0,z_0)\le P\sum_{i=1}^n\sup_{t\in(-\infty,0]}|z_i(t)|^r,$$
where P does not depend on the solutions of the system (1.1).
When $r=1$, define a Liapunov function as follows:
$$\begin{aligned}
V(t,z_t)&=\sum_{i=1}^n\omega_i(t)|z_i(t)|e^{\varepsilon t}\\
&\quad+\sum_{i=1}^n\sum_{j=1}^nK_j\int_{t-\tau_{ij}(t)}^t\omega_i(\psi_{ij}^{-1}(s))\frac{|b_{ij}(\psi_{ij}^{-1}(s))|}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(s))}|z_j(s)|e^{\varepsilon(s+\tau_{ij}(\psi_{ij}^{-1}(s)))}\,ds\\
&\quad+\sum_{i=1}^n\sum_{j=1}^nL_j\int_0^\infty k_{ij}(s)\int_{t-s}^t\omega_i(u+s)|c_{ij}(u+s)|e^{\varepsilon(u+s)}|z_j(u)|\,du\,ds.
\end{aligned}\tag{3.1}$$
By $(H_4^*)$, we can choose a positive constant $\varepsilon$ such that
$$\omega_i(t)(d_i(t)-\varepsilon)-\dot{\omega}_i(t)-\sum_{j=1}^n\omega_j(t)|a_{ji}(t)|H_i-\sum_{j=1}^nK_ie^{\varepsilon\tau}\omega_j(\psi_{ji}^{-1}(t))\frac{|b_{ji}(\psi_{ji}^{-1}(t))|}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}-\sum_{j=1}^nL_i\int_0^\infty k_{ji}(s)\omega_j(t+s)|c_{ji}(t+s)|e^{\varepsilon s}\,ds\ge\frac{a}{2}\quad\text{for all }t\ge0.$$
Calculating the Dini derivative of $V(t,z_t)$, we get
$$\begin{aligned}
D^+V(t,z_t)\le e^{\varepsilon t}\Big\{&\sum_{i=1}^n[\dot{\omega}_i(t)+\omega_i(t)(\varepsilon-d_i(t))]|z_i(t)|\\
&+\sum_{i=1}^n\Big[\omega_i(t)\sum_{j=1}^n|a_{ij}(t)||f_j(x_j(t))-f_j(y_j(t))|\\
&+\sum_{j=1}^n\omega_i(t)|b_{ij}(t)||g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))|\\
&+\sum_{j=1}^n\omega_i(t)|c_{ij}(t)|\int_{-\infty}^tk_{ij}(t-s)|h_j(x_j(s))-h_j(y_j(s))|\,ds\\
&+\sum_{j=1}^nK_j\omega_i(\psi_{ij}^{-1}(t))\frac{|b_{ij}(\psi_{ij}^{-1}(t))|}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(t))}e^{\varepsilon\tau_{ij}(\psi_{ij}^{-1}(t))}|z_j(t)|-\sum_{j=1}^n\omega_i(t)K_j|b_{ij}(t)||z_j(t-\tau_{ij}(t))|\\
&+\sum_{j=1}^nL_j\int_0^\infty k_{ij}(s)\omega_i(t+s)|c_{ij}(t+s)|e^{\varepsilon s}|z_j(t)|\,ds\\
&-\sum_{j=1}^nL_j\int_0^\infty k_{ij}(s)\omega_i(t)|c_{ij}(t)||z_j(t-s)|\,ds\Big]\Big\}\\
\le e^{\varepsilon t}\Big\{&\sum_{i=1}^n[\dot{\omega}_i(t)+\omega_i(t)(\varepsilon-d_i(t))]|z_i(t)|+\sum_{i=1}^n\Big[\sum_{j=1}^n\omega_i(t)|a_{ij}(t)|H_j|z_j(t)|\\
&+\sum_{j=1}^n\omega_i(t)K_j|b_{ij}(t)||z_j(t-\tau_{ij}(t))|+\sum_{j=1}^n\omega_i(t)L_j|c_{ij}(t)|\int_{-\infty}^tk_{ij}(t-s)|z_j(s)|\,ds\\
&+\sum_{j=1}^nK_j\omega_i(\psi_{ij}^{-1}(t))\frac{|b_{ij}(\psi_{ij}^{-1}(t))|}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(t))}e^{\varepsilon\tau_{ij}(\psi_{ij}^{-1}(t))}|z_j(t)|-\sum_{j=1}^n\omega_i(t)K_j|b_{ij}(t)||z_j(t-\tau_{ij}(t))|\\
&+\sum_{j=1}^nL_j\int_0^\infty k_{ij}(s)\omega_i(t+s)|c_{ij}(t+s)|e^{\varepsilon s}|z_j(t)|\,ds\\
&-\sum_{j=1}^nL_j\int_0^\infty k_{ij}(s)\omega_i(t)|c_{ij}(t)||z_j(t-s)|\,ds\Big]\Big\}\\
\le-e^{\varepsilon t}&\sum_{i=1}^n\Big\{\omega_i(t)(d_i(t)-\varepsilon)-\dot{\omega}_i(t)-\sum_{j=1}^n\omega_j(t)|a_{ji}(t)|H_i\\
&-\sum_{j=1}^nK_i\omega_j(\psi_{ji}^{-1}(t))\frac{|b_{ji}(\psi_{ji}^{-1}(t))|}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}e^{\varepsilon\tau}-\sum_{j=1}^nL_i\int_0^\infty k_{ji}(s)\omega_j(t+s)|c_{ji}(t+s)|e^{\varepsilon s}\,ds\Big\}|z_i(t)|.
\end{aligned}$$
Thus, $D^+V(t,z_t)\le-\frac{a}{2}e^{\varepsilon t}\sum_{i=1}^n|z_i(t)|\le0$ for all $t\ge0$. Therefore, we obtain $V(t,z_t)\le V(0,z_0)$ for all $t\ge0$. From (3.1) we have $V(t,z_t)\ge\omega\sum_{i=1}^n|z_i(t)|e^{\varepsilon t}$ for all $t\ge0$, where $\omega=\min_{i=\overline{1,n}}\inf_{t\ge0}\omega_i(t)$. It is easy to see that
$$V(0,z_0)\le P\sum_{i=1}^n\sup_{t\in(-\infty,0]}|z_i(t)|,$$
where P does not depend on the solutions of (1.1).
Hence, we get
$$\Big(\sum_{i=1}^n|x_i(t)-y_i(t)|^r\Big)^{\frac1r}\le M\Big(\sum_{i=1}^n\sup_{s\in(-\infty,0]}|x_i(s)-y_i(s)|^r\Big)^{\frac1r}e^{-\varepsilon t}\quad\text{for all }t\ge0,\ r\ge1,$$
that is, $\|x(t)-y(t)\|\le M\|\phi-\psi\|e^{-\varepsilon t}$ for all $t\ge0$, where $M\ge1$ is independent of the solutions of (1.1). This completes the proof of Theorem 3.1.
Remark 3.1. To our knowledge, the conditions in the previous literature all require the parameters to be constants, which are difficult to find. In this article, we only need the parameters to be functions, which supplies us with more choices.
Remark 3.2. By this method, we can obtain similar results when we substitute $f_{ijl}$, $g_{ijl}$, $h_{ijl}$ for $f_i$, $g_i$, $h_i$.
Corollary 3.1. Assume that (H1), (H2), (H3) hold and there exist positive constants $\omega_i$, $h_{ij}$, $l_{ij}$, $p_{ij}$, $q_{ij}$, $m_{ij}$, $\gamma_{ij}$, $\beta_{ij}$ $(i,j=\overline{1,n})$ and $r>1$ such that
$$\begin{aligned}
&r\omega_id_i(t)-(r-1)\sum_{j=1}^n\omega_i|a_{ij}(t)|^{\frac{r-h_{ij}}{r-1}}H_j^{\frac{r-q_{ij}}{r-1}}-\sum_{j=1}^n\omega_j|a_{ji}(t)|^{h_{ji}}H_i^{q_{ji}}\\
&-(r-1)\sum_{j=1}^n\omega_i|b_{ij}(t)|^{\frac{r-l_{ij}}{r-1}}K_j^{\frac{r-p_{ij}}{r-1}}-\sum_{j=1}^n\omega_j\frac{|b_{ji}(\psi_{ji}^{-1}(t))|^{l_{ji}}}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}K_i^{p_{ji}}\\
&-(r-1)\sum_{j=1}^n\int_0^\infty|k_{ij}(s)|^{\frac{r-\gamma_{ij}}{r-1}}\omega_i|c_{ij}(t)|^{\frac{r-m_{ij}}{r-1}}L_j^{\frac{r-\beta_{ij}}{r-1}}\,ds\\
&-\sum_{j=1}^n\int_0^\infty|k_{ji}(s)|^{\gamma_{ji}}\omega_j|c_{ji}(t+s)|^{m_{ji}}L_i^{\beta_{ji}}\,ds\ge a>0\quad\text{for all }t\ge0.
\end{aligned}$$
Then the system (1.1) is globally exponentially stable.
Corollary 3.2. Assume that (H1), (H2), (H3) hold and there exist positive constants $\omega_i$ $(i=\overline{1,n})$ such that
$$\omega_id_i(t)-\sum_{j=1}^n\omega_j|a_{ji}(t)|H_i-\sum_{j=1}^n\omega_j\frac{|b_{ji}(\psi_{ji}^{-1}(t))|}{1-\dot{\tau}_{ji}(\psi_{ji}^{-1}(t))}K_i-\sum_{j=1}^n\int_0^\infty k_{ji}(s)\omega_j|c_{ji}(t+s)|L_i\,ds\ge a>0$$
for all $t\ge0$; then the system (1.1) is globally exponentially stable.
Corollary 3.3. Assume that (H1), (H2), (H3) hold and there exist positive constants $\omega_i$ $(i=\overline{1,n})$ such that
$$d_i\omega_i-\sum_{j=1}^n\omega_ja_{ji}H_i-\sum_{j=1}^nK_i\omega_j\frac{b_{ji}}{\inf_{t\ge0}(1-\dot{\tau}_{ji}(t))}-\sum_{j=1}^nL_i\omega_jc_{ji}\ge a>0.$$
Then the system (1.1) is globally exponentially stable, where $d_i=\inf_{t\ge0}d_i(t)$, $a_{ij}=\sup_{t\ge0}|a_{ij}(t)|$, $b_{ij}=\sup_{t\ge0}|b_{ij}(t)|$, $c_{ij}=\sup_{t\ge0}|c_{ij}(t)|$.
We consider the following autonomous neural networks:
$$\frac{dx_i(t)}{dt}=-d_ix_i(t)+\sum_{j=1}^na_{ij}f_j(x_j(t))+\sum_{j=1}^nb_{ij}g_j(x_j(t-\tau_{ij}))+\sum_{j=1}^nc_{ij}\int_{-\infty}^tk_{ij}(t-s)h_{ij}(x_j(s))\,ds+I_i,\quad(i=\overline{1,n}).\tag{3.2}$$
Corollary 3.4. Assume that (H1), (H2), (H3) hold and there exist positive constants $\omega_i$, $h_{ij}$, $l_{ij}$, $p_{ij}$, $q_{ij}$, $m_{ij}$, $\gamma_{ij}$, $\beta_{ij}$ $(i,j=\overline{1,n})$ and $r>1$ such that
$$\begin{aligned}
&r\omega_id_i-(r-1)\sum_{j=1}^n\omega_i|a_{ij}|^{\frac{r-h_{ij}}{r-1}}H_j^{\frac{r-q_{ij}}{r-1}}-\sum_{j=1}^n\omega_j|a_{ji}|^{h_{ji}}H_i^{q_{ji}}\\
&-(r-1)\sum_{j=1}^n\omega_i|b_{ij}|^{\frac{r-l_{ij}}{r-1}}K_j^{\frac{r-p_{ij}}{r-1}}-\sum_{j=1}^n\omega_j|b_{ji}|^{l_{ji}}K_i^{p_{ji}}\\
&-(r-1)\sum_{j=1}^n\int_0^\infty|k_{ij}(s)|^{\frac{r-\gamma_{ij}}{r-1}}\omega_i|c_{ij}|^{\frac{r-m_{ij}}{r-1}}L_j^{\frac{r-\beta_{ij}}{r-1}}\,ds\\
&-\sum_{j=1}^n\int_0^\infty|k_{ji}(s)|^{\gamma_{ji}}\omega_j|c_{ji}|^{m_{ji}}L_i^{\beta_{ji}}\,ds\ge a>0.
\end{aligned}$$
Then the system (3.2) is globally exponentially stable.
Corollary 3.5. Assume that (H2), (H3) hold and there exist positive constants $\omega_i$ $(i=\overline{1,n})$ such that
$$d_i\omega_i-\sum_{j=1}^n\omega_ja_{ji}H_i-\sum_{j=1}^nK_i\omega_jb_{ji}-\sum_{j=1}^nL_i\omega_jc_{ji}\ge a>0,\tag{3.3}$$
then the system (3.2) is globally exponentially stable.
4. An example
When $n=2$, the system (1.1) becomes
$$\frac{dx_i(t)}{dt}=-d_i(t)x_i(t)+\sum_{j=1}^2a_{ij}(t)f_j(x_j(t))+\sum_{j=1}^2b_{ij}(t)g_j(x_j(t-\tau_{ij}(t)))+\sum_{j=1}^2c_{ij}(t)\int_{-\infty}^tk_{ij}(t-s)h_j(x_j(s))\,ds+I_i(t),\quad(i=1,2),\tag{4.1}$$
where
$$D(t)=\begin{pmatrix}11+\frac{1}{t^2+1}&0\\0&11+e^{-t}\end{pmatrix},\quad A(t)=\begin{pmatrix}1-e^{-t}&\frac12\big(1+\frac{1}{t^2+1}\big)\\\frac{1}{t^2+1}&\sin t\end{pmatrix},$$
$$B(t)=\begin{pmatrix}\cos t&e^{-t}\\1-|\sin t|&\frac{2}{3t^2+2}\end{pmatrix},\quad C(t)=\begin{pmatrix}1+\sin t&2\sin^2t\\2\cos^2t&1-\sin t\end{pmatrix},$$
$k_{ij}(t)=5e^{-5t}$ $(i,j=1,2;\ t\ge0)$, $f_j(u)=g_j(u)=h_j(u)=\frac12(u+\arctan u)$ $(j=1,2,\ u\in\mathbb{R})$, and $\tau_{ij}(t)=\tau(t)=1+\frac12\sin t$. Hence, we have $d_i=\inf_{t\ge0}d_i(t)=11$, $a_{ij}=\sup_{t\ge0}|a_{ij}(t)|=1$, $b_{ij}=\sup_{t\ge0}|b_{ij}(t)|=1$, $c_{ij}=\sup_{t\ge0}|c_{ij}(t)|=2$, $\inf_{t\ge0}(1-\dot{\tau}(t))\ge\frac12$, $H_j=K_j=L_j=1$, $j=1,2$. If we choose $\omega_i=2$, $i=1,2$, then
$$d_i\omega_i-\sum_{j=1}^2\omega_ja_{ji}H_i-\sum_{j=1}^2K_i\omega_j\frac{b_{ji}}{\inf_{t\ge0}(1-\dot{\tau}_{ji}(t))}-\sum_{j=1}^2L_i\omega_jc_{ji}\ge1.$$
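The displayed bound can also be checked mechanically. A short computation with the example's data (all values taken from the text above; $\omega_i=2$ is the choice made there):

```python
n = 2
d = [11.0, 11.0]               # d_i = inf_{t>=0} d_i(t)
a = [[1.0, 1.0], [1.0, 1.0]]   # a_ij = sup_t |a_ij(t)|
b = [[1.0, 1.0], [1.0, 1.0]]   # b_ij = sup_t |b_ij(t)|
c = [[2.0, 2.0], [2.0, 2.0]]   # c_ij = sup_t |c_ij(t)|
H = [1.0, 1.0]; K = [1.0, 1.0]; L = [1.0, 1.0]
w = [2.0, 2.0]                 # the chosen weights omega_i
inv_tau = 0.5                  # inf_{t>=0} (1 - tau'(t))

margins = []
for i in range(n):
    m = d[i] * w[i]
    m -= sum(w[j] * a[j][i] * H[i] for j in range(n))            # activation term
    m -= sum(K[i] * w[j] * b[j][i] / inv_tau for j in range(n))  # delayed term
    m -= sum(L[i] * w[j] * c[j][i] for j in range(n))            # distributed term
    margins.append(m)

print(margins)  # [2.0, 2.0] — both margins exceed 1, so Corollary 3.3 applies
```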
According to Corollary 3.3, the system (4.1) is globally exponentially stable.
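To visualize this conclusion, one may integrate (4.1) from two different constant initial histories and watch the gap between the solutions decay. The following forward-Euler sketch is illustrative only: the step size, horizon, zero inputs $I_i\equiv0$, and the reduction of the distributed-delay term to the auxiliary ODE $y_j'=-5y_j+5h(x_j)$ (valid for the kernel $5e^{-5s}$) are our own modelling choices, not part of the paper.

```python
import math

def act(u):
    # f = g = h from the example: (u + arctan u) / 2, Lipschitz constant 1
    return 0.5 * (u + math.atan(u))

def simulate(phi, T=20.0, dt=0.001):
    """Forward-Euler integration of (4.1) with constant history phi on (-inf, 0]
    and I_i = 0. The distributed term int_{-inf}^t 5e^{-5(t-s)} h(x_j(s)) ds is
    tracked by the auxiliary variable y_j satisfying y_j' = -5 y_j + 5 h(x_j)."""
    steps = int(T / dt)
    x = list(phi)
    y = [act(phi[0]), act(phi[1])]     # kernel integrated against h(phi_j)
    hist = [list(x)]                    # x at t = 0, dt, 2*dt, ...
    for n in range(steps):
        t = n * dt
        tau = 1.0 + 0.5 * math.sin(t)   # common delay tau_ij(t)
        xd = phi if t - tau <= 0.0 else hist[int((t - tau) / dt)]
        D = (11.0 + 1.0 / (t * t + 1.0), 11.0 + math.exp(-t))
        A = ((1.0 - math.exp(-t), 0.5 * (1.0 + 1.0 / (t * t + 1.0))),
             (1.0 / (t * t + 1.0), math.sin(t)))
        B = ((math.cos(t), math.exp(-t)),
             (1.0 - abs(math.sin(t)), 2.0 / (3.0 * t * t + 2.0)))
        C = ((1.0 + math.sin(t), 2.0 * math.sin(t) ** 2),
             (2.0 * math.cos(t) ** 2, 1.0 - math.sin(t)))
        newx = [0.0, 0.0]
        for i in range(2):
            dx = -D[i] * x[i]
            for j in range(2):
                dx += A[i][j] * act(x[j]) + B[i][j] * act(xd[j]) + C[i][j] * y[j]
            newx[i] = x[i] + dt * dx
        y = [y[j] + dt * (-5.0 * y[j] + 5.0 * act(x[j])) for j in range(2)]
        x = newx
        hist.append(list(x))
    return x

xa = simulate((2.0, -1.0))
xb = simulate((-3.0, 4.0))
gap = sum(abs(u - v) for u, v in zip(xa, xb))
print(gap)
```

The printed gap should be small compared with the initial separation of the two histories ($\|\phi-\psi\|_1=10$), consistent with the exponential estimate of Theorem 3.1.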
5. Conclusions
In this paper, general neural networks with variable and unbounded time delays have been studied. By using the Young inequality and constructing a suitable Liapunov function, we have given some sufficient conditions for the global exponential stability of the system (1.1), without assuming boundedness of the activation functions or the existence of an equilibrium point of (1.1). The results obtained in this paper are new and complement previously known results.
REFERENCES
[1] Civalleri PP, Gilli M, Pandolfi L, 1993. On stability of cellular neural networks with delay. IEEE Transactions on Circuits and Systems-I, 40, pp. 157-164.
[2] M. Rehim, H. Jiang, Z. Teng, 2004. Boundedness and stability for nonautonomous cellular neural networks with delay. Neural Networks, 17, pp. 1017-1025.
[3] Roska T, Chua LO, 1992. Cellular neural networks with delay type template elements and non-uniform grids. International Journal of Circuit Theory and Applications, 20(4), pp. 469-481.
[4] L.O. Chua and L. Yang, 1988. Cellular neural networks: Theory. IEEE Trans. Circuits Syst., 35:10, pp. 1257-1272.
[5] J. Liang, J. Cao, 2007. Global output convergence of recurrent neural networks with distributed delays. Nonlinear Analysis: Real World Applications, 8, pp. 187-197.
[6] Boundedness and global stability for nonautonomous recurrent neural networks with distributed delays. Chaos, Solitons and Fractals, 30, pp. 83-93.
[7] X. Liao, Q. Liu, W. Zhang, 2006. Delay-dependent asymptotic stability for neural networks with distributed delays. Nonlinear Analysis: Real World Applications, 7, pp. 1178-1192.
[8] Ju H. Park, 2006. On global stability criterion for neural networks with discrete and distributed delays. Chaos, Solitons and Fractals, 30, pp. 897-902.
[9] L.V. Hien, T.T. Loan, D.A. Tuan, 2008. Periodic solutions and exponential stability for shunting inhibitory cellular neural networks with continuously distributed delays. Electron. J. of Diff. Equations, Vol