NONUNIFORM BEHAVIOR AND STABILITY OF HOPFIELD NEURAL NETWORKS WITH DELAY

ANTÓNIO J. G. BENTO, JOSÉ J. OLIVEIRA, AND CÉSAR M. SILVA

Abstract. Based on a new abstract result on the behavior of nonautonomous delayed equations, we obtain a stability result for the solutions of a general discrete nonautonomous Hopfield neural network model with delay. As an application we improve some existing results on the stability of Hopfield models.

1. Introduction

Due to their many applications in various engineering and scientific areas, such as signal processing, image processing and pattern classification (see [6, 7]), neural network models are nowadays a subject of active research. One of the most important goals in the study of neural network models is to establish conditions that assure the global stability of equilibrium states [15, 16, 17], of periodic solutions [10, 22] or, more generally, of a particular solution [11].

In the present work we consider a discrete-time nonautonomous neural network with time delay. The relevance of our setting is easily clarified. Although, theoretically speaking, neural networks should be described by continuous-time models, it is essential to formulate discrete-time versions that can be implemented computationally [17, 18]. It is also important to include delays when modelling neural networks, in order to reproduce the effect of the finite transmission speed of signals among neurons (this has a mathematical counterpart, since time delays may cause instability and oscillation [14]). The nonautonomy is associated with the change of parameters such as neuron charging time, interconnection weights and external inputs in the course of time. This can be translated not only by time-varying parameters, but also by time-varying delays [5, 12, 20, 23]. There are still few stability results in the context of nonautonomous nonperiodic neural network models [21].

The proof presented here to establish our global stability results differs from the usual one. The classical method used in [7, 9, 15, 17, 21] consists in proving that there is an equilibrium point or a periodic solution and then constructing a suitable Lyapunov function that assures the global stability of that particular solution. Here, instead, we regard our system as a sufficiently small perturbation of a nonuniform contraction and use Banach's fixed point theorem in a suitable complete metric space to obtain the global stability of the system. This approach allows us to dismiss the requirement of the existence of a stationary or, more generally, periodic solution, and additionally to consider a more general form for the nonlinear part of the model.

Date: February 3, 2020.

2010 Mathematics Subject Classification. 39A23; 39A30; 92B20.

Key words and phrases. Discrete neural networks; Nonautonomous delay equations; Asymptotic stability; Periodic solutions.


When we restrict to the particular case of a periodic Hopfield model, our conditions for the existence of a globally stable periodic solution generalize results in [22]. For other results on the stability of higher order difference equations see [2, 3, 4, 13, 19].

This work is organized in the following way. In section 2 we use the discretization technique in [17] to obtain a discrete version of a generalized neural network model that includes the well-known Hopfield neural network models considered in [17, 22] and the bidirectional associative memory neural network models studied in [10, 15]. Next, we state our main stability result and show that it is a consequence of the abstract result presented in section 3. As a corollary, we get global exponential stability for the models in [21] under hypotheses different from the ones assumed in that paper. Afterwards, for the periodic model, considering a Poincaré map, we obtain the existence of a periodic solution as a consequence of the global exponential stability. This result improves one of the main results in [22]: we present an illustrative example where our results apply but the results of Xu and Wu [22] do not, due to an extra hypothesis required in their work. Finally, in section 3, we consider general discrete-time delayed models that include our neural network models as particular cases and obtain the abstract global stability result used to prove the stability results of section 2.

2. Hopfield Models

As a generalization of the continuous-time Hopfield neural network models presented in [17, 22] we have
$$x_i'(t) = -a_i(t)x_i(t) + \sum_{j=1}^{N} k_{ij}\bigl(t,\, x_j(t - \alpha_{ij}(t))\bigr), \quad t > 0, \ i = 1, \ldots, N, \tag{1}$$
where $a_i \colon [0,+\infty[ \to [0,+\infty[$, $k_{ij} \colon [0,+\infty[ \times \mathbb{R} \to \mathbb{R}$, and $\alpha_{ij} \colon [0,+\infty[ \to [0,+\infty[$ are continuous functions with $\alpha_{ij}$ bounded and $k_{ij}$ Lipschitz in the second variable. Here $a_i(t)$ is the neuron charging time.

Following the ideas in [17], to obtain a discrete-time analogue of the continuous-time model (1), we consider the approximation
$$x_i'(t) = -a_i([t/h]h)\,x_i(t) + \sum_{j=1}^{N} k_{ij}\Bigl([t/h]h,\, x_j\Bigl([t/h]h - \Bigl[\tfrac{\alpha_{ij}([t/h]h)}{h}\Bigr]h\Bigr)\Bigr), \tag{2}$$
$i = 1, \ldots, N$, $t \in [mh, (m+1)h[$ for $m \in \mathbb{N}_0$, where $h$ is a fixed positive real number (the discretization step size) and $[r]$ denotes the integer part of the real number $r$. Clearly, for $t \in [mh, (m+1)h[$ we have $[t/h] = m$ and the model (2) takes the form
$$x_i'(t) = -a_i(mh)\,x_i(t) + \sum_{j=1}^{N} k_{ij}\Bigl(mh,\, x_j\Bigl(\Bigl(m - \Bigl[\tfrac{\alpha_{ij}(mh)}{h}\Bigr]\Bigr)h\Bigr)\Bigr),$$
which is equivalent to
$$e^{a_i(mh)t}x_i'(t) + a_i(mh)\,e^{a_i(mh)t}x_i(t) = e^{a_i(mh)t} \sum_{j=1}^{N} k_{ij}\bigl(mh,\, x_j((m - \tau_{ij}(m))h)\bigr), \tag{3}$$


where $\tau_{ij}(m) = \bigl[\alpha_{ij}(mh)/h\bigr]$. Integrating (3) over $[mh, t[$, with $t < (m+1)h$, we obtain
$$\int_{mh}^{t} \bigl[e^{a_i(mh)s}x_i(s)\bigr]'\,ds = \frac{e^{a_i(mh)t} - e^{a_i(mh)mh}}{a_i(mh)} \sum_{j=1}^{N} k_{ij}\bigl(mh,\, x_j((m - \tau_{ij}(m))h)\bigr),$$
which is equivalent to
$$x_i(t) = e^{a_i(mh)(mh-t)}x_i(mh) + \frac{1 - e^{a_i(mh)(mh-t)}}{a_i(mh)} \sum_{j=1}^{N} k_{ij}\bigl(mh,\, x_j((m - \tau_{ij}(m))h)\bigr).$$
Letting $t \to (m+1)h$, we obtain
$$x_i((m+1)h) = e^{-a_i(mh)h}x_i(mh) + \frac{1 - e^{-a_i(mh)h}}{a_i(mh)} \sum_{j=1}^{N} k_{ij}\bigl(mh,\, x_j((m - \tau_{ij}(m))h)\bigr). \tag{4}$$
Thus, identifying $x_i(mh)$ with $x_i(m)$, $a_i(mh)$ with $a_i(m)$ and $k_{ij}(mh, \cdot)$ with $k_{ij}(m, \cdot)$, and defining
$$\theta_i(m) = \frac{1 - e^{-a_i(m)h}}{a_i(m)}, \tag{5}$$
equation (4) becomes
$$x_i(m+1) = e^{-a_i(m)h}x_i(m) + \theta_i(m) \sum_{j=1}^{N} k_{ij}\bigl(m,\, x_j(m - \tau_{ij}(m))\bigr). \tag{6}$$
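For readers who wish to experiment with the scheme (5)–(6) numerically, the following minimal Python sketch iterates it; the particular rates, kernels and delays below are illustrative placeholders, not data from the paper.

```python
import numpy as np

# Minimal sketch of iterating the exact discretization (6); the functions
# a, tau and k below are illustrative placeholders, not data from the paper.
N, h, steps, tau_max = 2, 0.1, 200, 3

def a(m):                          # neuron charging rates a_i(m) > 0
    return np.array([1.0, 1.5])

def tau(m):                        # integer delays tau_ij(m) <= tau_max
    return np.array([[1, 2], [3, 0]])

def k(m, u):                       # entry (i, j) is k_ij(m, u[i, j])
    B = np.array([[0.2, -0.1], [0.05, 0.3]])   # hypothetical weights
    return B * np.tanh(u)

x = np.zeros((tau_max + steps + 1, N))   # row tau_max + m stores x(m)
x[: tau_max + 1] = 0.05                  # initial segment x(-tau_max), ..., x(0)

for m in range(steps):
    am = a(m)
    theta = (1.0 - np.exp(-am * h)) / am           # theta_i(m), equation (5)
    d = tau(m)
    u = np.array([[x[tau_max + m - d[i, j], j]     # u[i, j] = x_j(m - tau_ij(m))
                   for j in range(N)] for i in range(N)])
    x[tau_max + m + 1] = np.exp(-am * h) * x[tau_max + m] + theta * k(m, u).sum(axis=1)
```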

The model (6) can be rewritten in the following way:
$$x_i(m+1) = c_i(m)x_i(m) + \sum_{j=1}^{N} h_{ij}\bigl(m,\, x_j(m - \tau_{ij}(m))\bigr), \tag{7}$$
$i = 1, \ldots, N$, $m \in \mathbb{N}_0$, where $c_i \colon \mathbb{N}_0 \to {]0,1[}$ and $\tau_{ij} \colon \mathbb{N}_0 \to \mathbb{N}_0$ are bounded functions with $\tau := \max\{\tau_{ij}(m) : m \in \mathbb{N}_0,\ i,j = 1, \ldots, N\}$, and $h_{ij} \colon \mathbb{N}_0 \times \mathbb{R} \to \mathbb{R}$ are Lipschitz functions in the second variable, i.e., there exist $H_{ij} \colon \mathbb{N}_0 \to \mathbb{R}^+$ such that
$$|h_{ij}(m,u) - h_{ij}(m,v)| \leqslant H_{ij}(m)|u - v|, \quad \forall u,v \in \mathbb{R},\ m \in \mathbb{N}_0.$$

In this paper we consider the Hopfield neural network model (7), which generalizes some existing models in the literature [10, 15, 17, 22].

Before stating our main result, we need to introduce some notation. Let
$$\Delta = \bigl\{(m,n) \in \mathbb{Z}^2 : m \geqslant n \geqslant 0\bigr\}$$
and, given a set $I \subseteq \mathbb{R}$ and a number $r \in \mathbb{Z}^-$, define $I_{\mathbb{Z}} = I \cap \mathbb{Z}$. Consider the space $X$ of the functions
$$\alpha \colon [r,0]_{\mathbb{Z}} \to \mathbb{R}$$
equipped with the norm
$$\|\alpha\| = \max_{j \in [r,0]_{\mathbb{Z}}} |\alpha(j)|.$$


Given $N \in \mathbb{N}$, we are going to consider the cartesian products $X^N$ and $\mathbb{R}^N$ equipped with the supremum norm, i.e., for $\alpha = (\alpha_1, \ldots, \alpha_N) \in X^N$ and $y = (y_1, \ldots, y_N) \in \mathbb{R}^N$, we have
$$\|\alpha\| = \max_{i=1,\ldots,N} \|\alpha_i\| = \max_{i=1,\ldots,N} \Bigl(\max_{j=r,\ldots,0} |\alpha_i(j)|\Bigr) \quad \text{and} \quad |y| = \max_{i=1,\ldots,N} |y_i|.$$

Given $n \in \mathbb{N}_0$ and a function $x \colon [n+r, +\infty[_{\mathbb{Z}} \to \mathbb{R}^N$, we denote the $i$th component by $x_i$, i.e., $x = (x_1, \ldots, x_N)$. For each $m \in \mathbb{N}_0$ such that $m \geqslant n$, we define $x_m \in X^N$ by
$$x_m(j) = x(m+j), \quad j = r, r+1, \ldots, 0.$$
For each $n \in \mathbb{N}_0$ and each $\alpha \in X^N$, we denote by $x(\cdot, n, \alpha)$ the unique solution
$$x \colon [n+r, +\infty[_{\mathbb{Z}} \to \mathbb{R}^N$$
of (7) with initial condition $x_n = \alpha$.

We now state our main global stability result for the neural network model (7). This theorem furnishes a bound for the distance between solutions of (7), based on bounds for the products of consecutive coefficients $c_i(s)$ (the neuron charging terms), assuming that the Lipschitz constants of the nonlinear part of the model are sufficiently small. We will use it to obtain stability results for several neural network models.

Theorem 1. Consider model (7) and assume that there exists a double sequence $(a'_{m,n})_{(m,n)\in\Delta}$ such that
$$a^{(i)}_{m,n} := \prod_{s=n}^{m-1} c_i(s) \leqslant a'_{m,n} \tag{8}$$
for all $i = 1, \ldots, N$ and all $(m,n) \in \Delta$, and
$$\lambda := \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \frac{1}{a'_{m,n}} \sum_{k=n}^{m-1} a^{(i)}_{m,k+1}\, a'_{k,n} \sum_{j=1}^{N} H_{ij}(k) \right\} \right] < 1.$$
Then, for every $\alpha, \alpha^* \colon [r,0]_{\mathbb{Z}} \to \mathbb{R}^N$ and every $(m,n) \in \Delta$, we have
$$\|x_m(\cdot,n,\alpha) - x_m(\cdot,n,\alpha^*)\| \leqslant \frac{1}{1-\lambda}\, a'_{m,n} \|\alpha - \alpha^*\|.$$

Proof. Consider $n \in \mathbb{N}_0$ and $\alpha, \alpha^* \colon [r,0]_{\mathbb{Z}} \to \mathbb{R}^N$. The change of variables

$$y(m) = x(m, n, \alpha) - x(m, n, \alpha^*)$$
transforms (7) into the system
$$y_i(m+1) = c_i(m)y_i(m) + \sum_{j=1}^{N} \widetilde{h}_{ij}\bigl(m,\, y_j(m - \tau_{ij}(m))\bigr), \tag{9}$$
$i = 1, \ldots, N$, $m \geqslant n$, where
$$\widetilde{h}_{ij}(m, u) = h_{ij}\bigl(m,\, u + x_j(m - \tau_{ij}(m), n, \alpha^*)\bigr) - h_{ij}\bigl(m,\, x_j(m - \tau_{ij}(m), n, \alpha^*)\bigr).$$

Now, $y = 0$ is an equilibrium point of (9) and, by Theorem 5, we obtain, for every function $\beta \colon [r,0]_{\mathbb{Z}} \to \mathbb{R}^N$,
$$\|y_m(\cdot, n, \beta)\| \leqslant \frac{1}{1-\lambda}\, a'_{m,n} \|\beta\|$$


for all $m \geqslant n$. Letting $\beta = \alpha - \alpha^*$, we conclude that
$$\|x_m(\cdot,n,\alpha) - x_m(\cdot,n,\alpha^*)\| = \|y_m(\cdot,n,\alpha - \alpha^*)\| \leqslant \frac{1}{1-\lambda}\, a'_{m,n} \|\alpha - \alpha^*\|$$
for all $m \geqslant n$. □
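The hypothesis of Theorem 1 can be checked numerically on a finite window of $\Delta$. The sketch below is a heuristic check, not a proof; the data $c_i \equiv 0.5$, $H_{ij} \equiv 0.05$ and the majorant $a'_{m,n} = 0.7^{m-n}$ are placeholders. Feeding in the data of Example 1 below approximates the bound $\lambda \leqslant 1/2$ computed there.

```python
import numpy as np

# Heuristic check of the hypothesis of Theorem 1 on the finite window
# 0 <= n < m <= Mmax; c, H and a_prime are placeholder data.
Mmax, N = 40, 2
c = lambda i, s: 0.5                        # c_i(s) in ]0,1[
H = lambda i, j, k: 0.05                    # Lipschitz bounds H_ij(k)
a_prime = lambda m, n: 0.7 ** (m - n)       # majorant in (8): a^(i)_{m,n} <= a'_{m,n}

def a_i(i, m, n):                           # a^(i)_{m,n} = prod_{s=n}^{m-1} c_i(s)
    return np.prod([c(i, s) for s in range(n, m)])

lam = max(
    sum(a_i(i, m, k + 1) * a_prime(k, n) * sum(H(i, j, k) for j in range(N))
        for k in range(n, m)) / a_prime(m, n)
    for i in range(N) for n in range(Mmax) for m in range(n + 1, Mmax + 1)
)
print(lam)   # approx 0.5 with these placeholder data; Theorem 1 needs lambda < 1
```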

We stress that Theorem 1 covers situations that are nonuniform and even nonexponential. In fact, our setting is general enough to allow the neuron charging terms $c_i(m)$ to lead to a sequence $a'_{m,n}$ with a more general dependence on $m$ and $n$ than the usual uniform exponential behavior $a'_{m,n} = D\,e^{-\mu(m-n)}$.

Example 1. Choose
$$c_i(m) = e^{-\nu_i + m\varepsilon[1-(-1)^m]/2 - (m+1)\varepsilon[1-(-1)^{m+1}]/2},$$
where $\nu_i > 0$ for $i = 1, \ldots, N$ and $\varepsilon > 0$, and let $h_{ij} \colon \mathbb{N}_0 \times \mathbb{R} \to \mathbb{R}$ be Lipschitz functions in the second variable, i.e.,
$$|h_{ij}(m,u) - h_{ij}(m,v)| \leqslant H_{ij}(m)|u-v|, \quad \forall u,v \in \mathbb{R},\ m \in \mathbb{N}_0,$$
with
$$H_{ij}(m) = \frac{1 - e^{-\varepsilon}}{2N}\, e^{-\nu_i - \varepsilon(m+1)[1-(-1)^{m+1}]/2 - \varepsilon m}.$$
Then
$$\|\bar{x}_m(\cdot,n,\bar\alpha) - \bar{x}_m(\cdot,n,\bar\alpha^*)\| \leqslant 2\, e^{-\mu(m-n) + \varepsilon n} \|\bar\alpha - \bar\alpha^*\|.$$
In fact,
$$a^{(i)}_{m,n} = \prod_{s=n}^{m-1} c_i(s) = e^{-\nu_i(m-n) + \varepsilon n[1-(-1)^n]/2 - \varepsilon m[1-(-1)^m]/2} \leqslant e^{-\nu_i(m-n) + \varepsilon n} \leqslant e^{-\mu(m-n) + \varepsilon n} =: a'_{m,n},$$
where $\mu = \min_i \{\nu_i\}$. Taking into account that for $m \geqslant n \geqslant 0$ we have
$$\frac{a^{(i)}_{m,k+1}\, a'_{k,n}}{a'_{m,n}} = e^{(\mu-\nu_i)(m-k) - \varepsilon m[1-(-1)^m]/2 + \nu_i + \varepsilon(k+1)[1-(-1)^{k+1}]/2} \leqslant e^{\nu_i + \varepsilon(k+1)[1-(-1)^{k+1}]/2}$$
and
$$\sum_{j=1}^{N} H_{ij}(k) = \sum_{j=1}^{N} \frac{1 - e^{-\varepsilon}}{2N}\, e^{-\nu_i - \varepsilon(k+1)[1-(-1)^{k+1}]/2 - \varepsilon k} = \frac{1 - e^{-\varepsilon}}{2}\, e^{-\nu_i - \varepsilon(k+1)[1-(-1)^{k+1}]/2 - \varepsilon k},$$
it follows that
$$\lambda = \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \frac{1}{a'_{m,n}} \sum_{k=n}^{m-1} a^{(i)}_{m,k+1}\, a'_{k,n} \sum_{j=1}^{N} H_{ij}(k) \right\} \right] \leqslant \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \frac{1 - e^{-\varepsilon}}{2} \sum_{k=n}^{m-1} e^{-\varepsilon k} \right\} \right] = \frac{1 - e^{-\varepsilon}}{2} \sum_{k=0}^{+\infty} e^{-\varepsilon k} = \frac{1}{2}.$$


A nonexponential and nonuniform example is obtained choosing
$$c_i(m) = \left(\frac{m+2}{m+1}\right)^{-\nu_i} \frac{(m+1)^{\varepsilon[1-(-1)^m]/2}}{(m+2)^{\varepsilon[1-(-1)^{m+1}]/2}} \quad \text{and} \quad H_{ij}(m) = \frac{1 - e^{-\varepsilon}}{2N}\, (m+2)^{-\nu_i - \varepsilon[1-(-1)^{m+1}]/2}\, e^{-\varepsilon m}.$$
With this choice it follows that
$$\|\bar{x}_m(\cdot,n,\bar\alpha) - \bar{x}_m(\cdot,n,\bar\alpha^*)\| \leqslant 2 \left(\frac{m+1}{n+1}\right)^{-\mu} (n+1)^{\varepsilon} \|\bar\alpha - \bar\alpha^*\|.$$

Despite the generality of our main result, when we apply it to Hopfield neural network models existing in the literature, we are able to improve some of the known results.

In [22] the authors considered the following discretization of a nonautonomous continuous-time Hopfield neural network model, which is a particular case of (7):
$$x_i(m+1) = x_i(m)\,e^{-a_i(m)h} + \theta_i(m)\left[\sum_{j=1}^{N} b_{ij}(m) f_j\bigl(x_j(m - \tau(m))\bigr) + I_i(m)\right], \tag{10}$$
$i = 1, \ldots, N$, where $a_i, b_{ij}, I_i \colon \mathbb{N}_0 \to \mathbb{R}$ and $\tau \colon \mathbb{N}_0 \to \mathbb{N}_0$ are bounded functions with $a_i(m) > 0$ and $0 \leqslant \tau(m) \leqslant \tau$, $f_j \colon \mathbb{R} \to \mathbb{R}$ are Lipschitz functions with Lipschitz constant $F_j > 0$, $\theta_i(m)$ is given by (5), and $h > 0$ is the discretization step size. We are going to use the following notation:
$$a_i^- = \inf_m a_i(m), \quad b_{ij}^+ = \sup_m |b_{ij}(m)|, \quad \theta_i^+ = \sup_m \theta_i(m).$$

We have the following result establishing the global exponential stability of all solutions of (10).

Corollary 2. If
$$a_i^- > \sum_{j=1}^{N} b_{ij}^+ F_j \tag{11}$$
for every $i = 1, \ldots, N$, then model (10) is globally exponentially stable, i.e., there are constants $\mu > 0$ and $C > 1$ such that
$$\|x_m(\cdot,n,\alpha) - x_m(\cdot,n,\alpha^*)\| \leqslant C\,e^{-\mu(m-n)} \|\alpha - \alpha^*\|$$
for every $\alpha, \alpha^* \colon [-\tau, 0]_{\mathbb{Z}} \to \mathbb{R}^N$ and every $(m,n) \in \Delta$.

Proof. We will show that we are in the conditions of Theorem 1. Defining $\nu_i = a_i^- h$, by (11) there is a positive number $\mu < \min_i \nu_i$ such that
$$\frac{e^{\nu_i - \mu} - 1}{e^{\nu_i} - 1}\, a_i^- > \sum_{j=1}^{N} b_{ij}^+ F_j, \quad \forall i = 1, \ldots, N. \tag{12}$$
Putting $a'_{m,n} = e^{-\mu(m-n)}$, condition (8) is trivially satisfied because
$$a^{(i)}_{m,n} = \prod_{s=n}^{m-1} e^{-a_i(s)h} \leqslant e^{-a_i^- h(m-n)} = e^{-\nu_i(m-n)} \leqslant e^{-\mu(m-n)} = a'_{m,n}.$$
Since $\theta_i^+ = \dfrac{1 - e^{-\nu_i}}{a_i^-}$, we have by (12) that
$$\begin{aligned}
\lambda &= \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \frac{1}{a'_{m,n}} \sum_{k=n}^{m-1} a^{(i)}_{m,k+1}\, a'_{k,n}\, \theta_i(k) \sum_{j=1}^{N} |b_{ij}(k)| F_j \right\} \right] \\
&\leqslant \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ e^{\mu(m-n)} \sum_{k=n}^{m-1} e^{-\nu_i(m-k-1) - \mu(k-n)} \right\} \theta_i^+ \sum_{j=1}^{N} b_{ij}^+ F_j \right] \\
&< \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \sum_{k=n}^{m-1} e^{(\nu_i-\mu)(k-m)} \right\} e^{\nu_i}\, \frac{1 - e^{-\nu_i}}{a_i^-}\, \frac{e^{\nu_i-\mu} - 1}{e^{\nu_i} - 1}\, a_i^- \right] \\
&= \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \frac{1 - e^{(\nu_i-\mu)(n-m)}}{e^{\nu_i-\mu} - 1} \right\} \frac{(e^{\nu_i} - 1)(e^{\nu_i-\mu} - 1)}{(e^{\nu_i} - 1)\, a_i^-}\, a_i^- \right] \\
&= \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ 1 - e^{(\nu_i-\mu)(n-m)} \right\} \right] = 1,
\end{aligned}$$
and this proves the corollary. □

In the next corollary we slightly improve condition (11). To do that we need the concept of an $M$-matrix: a square real matrix is said to be an $M$-matrix if its off-diagonal entries are nonpositive and all its eigenvalues have positive real part.

Now consider the $N \times N$ matrix $M$ defined by
$$M = \operatorname{diag}(a_1^-, \ldots, a_N^-) - \bigl[b_{ij}^+ F_j\bigr]_{i,j=1}^{N}.$$
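For numerical experiments, this definition translates directly into a test: check the sign pattern of the off-diagonal entries and the spectrum. A small sketch (assuming numpy):

```python
import numpy as np

# Tests the definition above: nonpositive off-diagonal entries and all
# eigenvalues with positive real part.
def is_M_matrix(M, tol=1e-12):
    off = M - np.diag(np.diag(M))
    return bool(np.all(off <= tol) and np.all(np.linalg.eigvals(M).real > tol))

# e.g. the matrix of Example 2 below:
print(is_M_matrix(np.array([[7.0, -5.0], [-17.0, 19.0]])))   # True
```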

Corollary 3. If $M$ is an $M$-matrix, then model (10) is globally exponentially stable, i.e., there are $\mu > 0$ and $C > 1$ such that
$$\|x_m(\cdot,n,\alpha) - x_m(\cdot,n,\alpha^*)\| \leqslant C\,e^{-\mu(m-n)} \|\alpha - \alpha^*\|$$
for every $\alpha, \alpha^* \colon [-\tau, 0]_{\mathbb{Z}} \to \mathbb{R}^N$ and every $(m,n) \in \Delta$.

Proof. If $M$ is an $M$-matrix, then (see Fiedler [8, Theorem 5.1]) there is $d = (d_1, \ldots, d_N) > 0$ such that $Md > 0$, i.e.,
$$d_i a_i^- > \sum_{j=1}^{N} d_j b_{ij}^+ F_j. \tag{13}$$
The change of variables $y_i(m) = d_i^{-1} x_i(m)$, $m \in \mathbb{N}_0$ and $i = 1, \ldots, N$, transforms (10) into
$$y_i(m+1) = y_i(m)\,e^{-a_i(m)h} + \theta_i(m)\left[\sum_{j=1}^{N} \tilde{b}_{ij}(m) \tilde{f}_j\bigl(y_j(m - \tau(m))\bigr) + \tilde{I}_i(m)\right],$$
where
$$\tilde{b}_{ij}(m) = d_i^{-1} b_{ij}(m), \quad \tilde{f}_j(u) = f_j(d_j u), \quad \tilde{I}_i(m) = d_i^{-1} I_i(m),$$


for $m \in \mathbb{N}_0$ and $u \in \mathbb{R}$. As the $f_j$ are Lipschitz functions with constant $F_j$, the $\tilde{f}_j$ are also Lipschitz functions with constant $\tilde{F}_j = d_j F_j$. From (13) we have
$$a_i^- > \sum_{j=1}^{N} d_i^{-1} b_{ij}^+ d_j F_j, \quad \text{which is equivalent to} \quad a_i^- > \sum_{j=1}^{N} \tilde{b}_{ij}^+ \tilde{F}_j,$$
and the result follows from Corollary 2. □

Now we improve [22, Theorem 4.2], which establishes the existence and global stability of a periodic solution of the Hopfield neural network model (10) with periodic coefficients. Let $\omega \in \mathbb{N}$ and consider model (10) where $a_i, b_{ij}, I_i \colon \mathbb{N}_0 \to \mathbb{R}$ and $\tau \colon \mathbb{N}_0 \to \mathbb{N}_0$ are $\omega$-periodic functions.

Theorem 4. If $M$ is an $M$-matrix, then model (10) has a unique $\omega$-periodic solution, which is globally exponentially stable.

Proof. Let $n \in \mathbb{N}_0$. From Corollary 3, there are $\mu > 0$ and $C > 1$ such that
$$\|x_m(\cdot,n,\alpha) - x_m(\cdot,n,\alpha^*)\| \leqslant C\,e^{-\mu(m-n)} \|\alpha - \alpha^*\|, \tag{14}$$
for all $m \geqslant n$ and all $\alpha, \alpha^* \in X^N$. Now choose an integer $k \in \mathbb{N}$ such that
$$C\,e^{-\mu k \omega} < 1 \tag{15}$$
and define the map $P \colon X^N \to X^N$ by $P(\alpha) = x_{n+\omega}(\cdot, n, \alpha)$. For $\alpha, \alpha^* \in X^N$, we have
$$\begin{aligned}
\|P^k(\alpha) - P^k(\alpha^*)\| &= \|P(P^{k-1}(\alpha)) - P(P^{k-1}(\alpha^*))\| \\
&= \|x_{n+\omega}(\cdot, n, P^{k-1}(\alpha)) - x_{n+\omega}(\cdot, n, P^{k-1}(\alpha^*))\| \\
&= \|x_{n+\omega}(\cdot, n, x_{n+\omega}(\cdot, n, P^{k-2}(\alpha))) - x_{n+\omega}(\cdot, n, x_{n+\omega}(\cdot, n, P^{k-2}(\alpha^*)))\|,
\end{aligned}$$

and, as model (10) is $\omega$-periodic, from (14),
$$\begin{aligned}
\|P^k(\alpha) - P^k(\alpha^*)\| &= \|x_{n+2\omega}(\cdot, n, P^{k-2}(\alpha)) - x_{n+2\omega}(\cdot, n, P^{k-2}(\alpha^*))\| \\
&= \cdots = \|x_{n+k\omega}(\cdot, n, \alpha) - x_{n+k\omega}(\cdot, n, \alpha^*)\| \\
&\leqslant C\,e^{-\mu k \omega} \|\alpha - \alpha^*\|.
\end{aligned}$$

From (15), the map $P^k$ is a contraction on $X^N$. As $X^N$ is a Banach space, we conclude that there is a unique point $\varphi \in X^N$ such that $P^k(\varphi) = \varphi$. Noting that $P^k(P(\varphi)) = P(P^k(\varphi)) = P(\varphi)$, we have $P(\varphi) = \varphi$, which means $x_{n+\omega}(\cdot, n, \varphi) = \varphi$.

Finally, as $x(m, n, \varphi)$ is a solution of (10) with $a_i, b_{ij}, I_i, \tau$ $\omega$-periodic functions, we know that $x(m + \omega, n, \varphi)$ is also a solution of (10) and
$$x(m, n, \varphi) = x(m, n, x_{n+\omega}(\cdot, n, \varphi)) = x(m + \omega, n, \varphi)$$
for every $m \geqslant n$. Therefore $x(m, n, \varphi)$ is an $\omega$-periodic solution of (10) and, from (14), all other solutions converge to it at an exponential rate. □
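The proof also suggests a numerical procedure: iterate the period map $P$ until it stops moving. A minimal sketch, assuming a user-supplied routine `advance_one_period(alpha)` that integrates model (10) over one period from a given initial segment (such a routine is not part of the paper):

```python
import numpy as np

# Sketch: approximate the periodic solution as the fixed point of the
# period map P(alpha) = x_{n+omega}(., n, alpha). The solver
# `advance_one_period` is an assumed helper, not from the paper.
def find_periodic_segment(alpha0, advance_one_period, iters=500, tol=1e-12):
    alpha = np.asarray(alpha0, dtype=float)
    for _ in range(iters):
        new = advance_one_period(alpha)        # one application of P
        if np.max(np.abs(new - alpha)) < tol:  # P(alpha) ~ alpha: fixed point found
            return new
        alpha = new
    return alpha                               # best approximation after `iters` steps
```

Since $P^k$ is a contraction, this iteration converges geometrically from any initial segment.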


In [22, Theorem 4.2] the authors use the extra hypothesis
$$\min_{1 \leqslant i \leqslant N} \left[ \widehat{a}_i - F_i \sum_{j=1}^{N} \widehat{b}_{ji} \right] > 0, \tag{16}$$
where
$$\widehat{a}_i = \frac{1}{\omega} \sum_{m=0}^{\omega-1} a_i(m) \quad \text{and} \quad \widehat{b}_{ij} = \frac{1}{\omega} \sum_{m=0}^{\omega-1} b_{ij}(m).$$
Note that hypothesis (H3) in [22] is equivalent to $M$ being an $M$-matrix (see Fiedler [8, Theorem 5.1]).

Next we present an example, similar to the one of Xu and Wu [22], that does not satisfy (16).

Example 2. In model (10) with $N = 2$, let
$$\begin{aligned}
a_1(m) &= 25 + \cos\frac{m\pi}{\omega}, & b_{11}(m) &= 16 + \cos\frac{m\pi}{\omega}, & b_{12}(m) &= 4 + \cos\frac{m\pi}{\omega}, \\
a_2(m) &= 29 + \sin\frac{m\pi}{\omega}, & b_{21}(m) &= 16 + \sin\frac{m\pi}{\omega}, & b_{22}(m) &= 8 + \sin\frac{m\pi}{\omega}, \\
I_1(m) &= \cos\frac{m\pi}{\omega}, & I_2(m) &= \sin\frac{m\pi}{\omega}, & \tau(m) &= \bigl(1 + (-1)^m\bigr)/2,
\end{aligned}$$
$$f_1(u) = \arctan u, \quad f_2(u) = \tanh u, \quad \omega = 10.$$
Clearly $F_1 = F_2 = 1$ and since
$$M = \begin{pmatrix} a_1^- & 0 \\ 0 & a_2^- \end{pmatrix} - \begin{pmatrix} b_{11}^+ F_1 & b_{12}^+ F_2 \\ b_{21}^+ F_1 & b_{22}^+ F_2 \end{pmatrix} = \begin{pmatrix} 7 & -5 \\ -17 & 19 \end{pmatrix}$$
is an $M$-matrix (its eigenvalues are $2$ and $24$), by Theorem 4 this example has a unique $\omega$-periodic solution which is globally exponentially stable.

However, since
$$\widehat{a}_1 - F_1\bigl(|\widehat{b}_{11}| + |\widehat{b}_{21}|\bigr) = -7,$$
(16) is not satisfied and thus the result of Xu and Wu [22] cannot be applied to this example.
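These computations are easy to reproduce; the sketch below evaluates $M$, its eigenvalues and the quantity in (16), with the averages taken over a full period of the coefficients (which reproduces the value $-7$ above).

```python
import numpy as np

# Cross-check of Example 2: build M, its eigenvalues, and the quantity in (16).
m = np.arange(40)                        # enough samples to attain the inf/sup
a1, a2 = 25 + np.cos(m*np.pi/10), 29 + np.sin(m*np.pi/10)
b11, b12 = 16 + np.cos(m*np.pi/10), 4 + np.cos(m*np.pi/10)
b21, b22 = 16 + np.sin(m*np.pi/10), 8 + np.sin(m*np.pi/10)

M = np.diag([a1.min(), a2.min()]) - np.array([[np.abs(b11).max(), np.abs(b12).max()],
                                              [np.abs(b21).max(), np.abs(b22).max()]])
print(M)                                 # [[ 7. -5.] [-17. 19.]]
print(np.linalg.eigvals(M))              # eigenvalues 2 and 24 -> M is an M-matrix

p = np.arange(20)                        # averages over a full period of the data
a1_hat = (25 + np.cos(p*np.pi/10)).mean()
b11_hat, b21_hat = (16 + np.cos(p*np.pi/10)).mean(), (16 + np.sin(p*np.pi/10)).mean()
print(a1_hat - (abs(b11_hat) + abs(b21_hat)))   # -7.0, so (16) fails
```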

The following graphs illustrate our example with the initial conditions
$$x_1(-1) = x_2(-1) = x_1(0) = x_2(0) = 0,$$
$$x_1(-1) = x_2(-1) = x_1(0) = x_2(0) = 0.09,$$
$$x_1(-1) = x_2(-1) = x_1(0) = x_2(0) = -0.09,$$
$$x_1(-1) = x_1(0) = -0.09 \ \text{ and } \ x_2(-1) = x_2(0) = 0.09,$$
$$x_1(-1) = x_1(0) = 0.09 \ \text{ and } \ x_2(-1) = x_2(0) = -0.09.$$


[Figure: time series of $x_1$ and $x_2$ for $0 \leqslant t \leqslant 90$, and the phase portrait of $x_2$ versus $x_1$, for the five sets of initial conditions above; all trajectories approach the same periodic solution.]
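A simulation along the following lines reproduces such trajectories. This is a hedged sketch: the step size $h = 1$ is an assumption, since the example does not state the value used for the plots.

```python
import numpy as np

# Sketch of simulating model (10) for Example 2; h = 1 is an assumption.
h, T = 1.0, 90
A = lambda m: np.array([25 + np.cos(m*np.pi/10), 29 + np.sin(m*np.pi/10)])
B = lambda m: np.array([[16 + np.cos(m*np.pi/10), 4 + np.cos(m*np.pi/10)],
                        [16 + np.sin(m*np.pi/10), 8 + np.sin(m*np.pi/10)]])
I = lambda m: np.array([np.cos(m*np.pi/10), np.sin(m*np.pi/10)])
f = lambda u: np.array([np.arctan(u[0]), np.tanh(u[1])])
tau = lambda m: (1 + (-1)**m)//2

x = np.zeros((T + 2, 2))
x[0] = x[1] = [0.09, 0.09]                 # x(-1) and x(0); one of the initial segments
for m in range(T):
    am = A(m)
    theta = (1 - np.exp(-am*h))/am         # theta_i(m), equation (5)
    xd = x[m + 1 - tau(m)]                 # x(m - tau(m)); row m + 1 stores x(m)
    x[m + 2] = x[m + 1]*np.exp(-am*h) + theta*(B(m) @ f(xd) + I(m))
# rows 1.. of x now hold x(0), ..., x(T); plot the columns against m to redraw the figure
```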


3. Stability of nonuniform contractions

Let $Y$ be a Banach space and denote by $X$ the space of functions
$$\alpha \colon [r,0]_{\mathbb{Z}} \to Y$$
equipped with the norm
$$\|\alpha\| = \max_{j \in [r,0]_{\mathbb{Z}}} |\alpha(j)|,$$
where $|\cdot|$ is the norm in $Y$. Given $N \in \mathbb{N}$, we are going to consider the cartesian products $X^N$ and $Y^N$ equipped with the supremum norm, i.e., for $\alpha = (\alpha_1, \ldots, \alpha_N) \in X^N$ and $y = (y_1, \ldots, y_N) \in Y^N$, we have
$$\|\alpha\| = \max_{i=1,\ldots,N} \|\alpha_i\| = \max_{i=1,\ldots,N} \Bigl(\max_{j=r,\ldots,0} |\alpha_i(j)|\Bigr) \quad \text{and} \quad |y| = \max_{i=1,\ldots,N} |y_i|.$$

Given a function $x \colon [r, +\infty[_{\mathbb{Z}} \to Y^N$, we define, for each $m \in \mathbb{N}_0$, $x_m \in X^N$ by
$$x_m(j) = x(m+j), \quad j = r, r+1, \ldots, 0.$$
Let $f_m \colon X^N \to Y^N$ be Lipschitz perturbations such that
$$f_m(0) = 0 \tag{17}$$
for every $m \in \mathbb{N}_0$.

We are going to consider the nonlinear delay difference equation
$$x(m+1) = L_m x_m + f_m(x_m), \quad m \in \mathbb{N}_0, \tag{18}$$
where, for each $m \in \mathbb{N}_0$, $L_m \colon X^N \to Y^N$ is given by
$$L_m \alpha = \bigl(L_m^{(1)} \alpha_1,\, L_m^{(2)} \alpha_2,\, \ldots,\, L_m^{(N)} \alpha_N\bigr), \tag{19}$$
with $L_m^{(i)} \colon X \to Y$ a bounded linear operator for $i = 1, \ldots, N$. For each $n \in \mathbb{N}_0$ and $\alpha \in X^N$, we obtain a unique function
$$x \colon [n+r, +\infty[_{\mathbb{Z}} \to Y^N,$$
denoted by $x(\cdot, n, \alpha)$, such that $x_n = \alpha$ and (18) holds. Consequently, for each $(m,n) \in \Delta$, we can define the operator $F_{m,n} \colon X^N \to X^N$ given by
$$F_{m,n}(\alpha) = x_m(\cdot, n, \alpha), \quad \alpha \in X^N.$$

Associated with equation (18), we consider the linear difference equation
$$v_i(m+1) = L_m^{(i)}(v_{i,m}), \tag{20}$$
$i = 1, \ldots, N$, where $v_{i,m} \in X$ is defined, as usual, by $v_{i,m}(j) = v_i(m+j)$, $j = r, r+1, \ldots, 0$, and $L_m^{(i)}$ is given by (19). For each $n \in \mathbb{N}_0$ and $\alpha_i \in X$, we obtain a unique function $v_i \colon [n+r, +\infty[_{\mathbb{Z}} \to Y$, denoted by $v_i(\cdot, n, \alpha_i)$, such that $v_{i,n} = \alpha_i$ and (20) holds.

For each $(m,n) \in \Delta$ and $i = 1, \ldots, N$, we define the operator $A_{m,n}^{(i)} \colon X \to X$ by
$$A_{m,n}^{(i)} \alpha_i = v_{i,m}(\cdot, n, \alpha_i), \quad \alpha_i \in X.$$
We can easily verify that:

a) $A_{m,n}^{(i)}$ is linear for each $(m,n) \in \Delta$;

b) $A_{m,m}^{(i)} = \operatorname{Id}$;

c) $A_{l,m}^{(i)} A_{m,n}^{(i)} = A_{l,n}^{(i)}$ for $l \geqslant m \geqslant n \geqslant 0$.


It is easy to prove by induction on $m$ (see [1]) that
$$F_{m,n}(\alpha) = \bigl(F_{m,n}^{(1)}(\alpha), \ldots, F_{m,n}^{(N)}(\alpha)\bigr),$$
where, for $i = 1, \ldots, N$,
$$F_{m,n}^{(i)}(\alpha) = A_{m,n}^{(i)} \alpha_i + \sum_{k=n}^{m-1} A_{m,k+1}^{(i)} \Gamma f_k^{(i)}(x_k),$$
with $\alpha = (\alpha_1, \ldots, \alpha_N)$, $f_k(x_k) = \bigl(f_k^{(1)}(x_k), \ldots, f_k^{(N)}(x_k)\bigr)$, and $\Gamma \colon Y \to X$ defined, for all $u \in Y$, by
$$\Gamma u \colon [r,0]_{\mathbb{Z}} \to Y, \qquad \Gamma u(j) = \begin{cases} u & \text{if } j = 0, \\ 0 & \text{if } j < 0. \end{cases}$$
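In the scalar, delay-free case ($N = 1$, $r = 0$, $L_m$ multiplication by a number $c(m)$), this variation-of-constants formula can be verified directly; here is a small sketch with placeholder data $c$ and $f$.

```python
import numpy as np

# Checks the variation-of-constants formula in the scalar, delay-free case,
# where A_{m,n} reduces to prod_{s=n}^{m-1} c(s); c and f are placeholders.
c = lambda s: 0.6
f = lambda k, u: 0.1 * np.tanh(u)
A = lambda m, n: np.prod([c(s) for s in range(n, m)])

def F(m, n, alpha, x):                # F_{m,n} via the formula above
    return A(m, n)*alpha + sum(A(m, k + 1)*f(k, x[k]) for k in range(n, m))

n, M, alpha = 0, 20, 0.5
x = {n: alpha}
for m in range(n, M):                 # direct iteration of x(m+1) = c(m)x(m) + f(m, x(m))
    x[m + 1] = c(m)*x[m] + f(m, x[m])

print(abs(F(M, n, alpha, x) - x[M]))  # ~ 0: the two computations agree
```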

Theorem 5. Let $f_m \colon X^N \to Y^N$ be Lipschitz functions such that (17) is satisfied and consider equation (18). Let $\bigl(a_{m,n}^{(i)}\bigr)_{(m,n)\in\Delta}$, $i = 1, \ldots, N$, and $\bigl(a'_{m,n}\bigr)_{(m,n)\in\Delta}$ be double sequences such that
$$\|A_{m,n}^{(i)}\| \leqslant a_{m,n}^{(i)} \leqslant a'_{m,n}$$
for all $(m,n) \in \Delta$, where the $A_{m,n}^{(i)}$ are the evolution operators of equation (20) derived from equation (18). Assume that
$$\lambda := \max_{i=1,\ldots,N} \left[ \sup_{(m,n)\in\Delta} \left\{ \frac{1}{a'_{m,n}} \sum_{k=n}^{m-1} a_{m,k+1}^{(i)} \operatorname{Lip}(f_k^{(i)})\, a'_{k,n} \right\} \right] < 1.$$
Then
$$\|F_{m,n}(\alpha)\| \leqslant \frac{1}{1-\lambda}\, a'_{m,n} \|\alpha\|$$
for every $(m,n) \in \Delta$.

Proof. Given $n \in \mathbb{N}_0$ and $\alpha \in X^N \setminus \{0\}$, let $\mathcal{C}_{n,\alpha}$ be the space of functions $x \colon [n+r, +\infty[_{\mathbb{Z}} \to Y^N$ such that $x_n = \alpha$ and
$$|x|_{\mathcal{C}_{n,\alpha}} = \sup\left\{ \frac{\|x_m\|}{a'_{m,n}\|\alpha\|} : m \geqslant n \right\} < +\infty.$$
It is clear that $\mathcal{C}_{n,\alpha}$ is a complete metric space with the metric
$$d(x,y) = |x - y|_{\mathcal{C}_{n,\alpha}} = \sup\left\{ \frac{\|x_m - y_m\|}{a'_{m,n}\|\alpha\|} : m \geqslant n \right\}.$$
For every $x \in \mathcal{C}_{n,\alpha}$ we define
$$(Jx)_m = \begin{cases} \alpha & \text{if } m = n, \\ \bigl(\xi_m^{(1)}, \ldots, \xi_m^{(N)}\bigr) & \text{if } m > n, \end{cases}$$


where, for each $i = 1, \ldots, N$,
$$\xi_m^{(i)} = A_{m,n}^{(i)} \alpha_i + \sum_{k=n}^{m-1} A_{m,k+1}^{(i)} \Gamma f_k^{(i)}(x_k).$$

Since for $m > n$ we have
$$\begin{aligned}
\|(Jx)_m\| &\leqslant \max_{i=1,\ldots,N} \left\{ \|A_{m,n}^{(i)} \alpha_i\| + \sum_{k=n}^{m-1} \|A_{m,k+1}^{(i)}\| \|\Gamma f_k^{(i)}(x_k)\| \right\} \\
&\leqslant \max_{i=1,\ldots,N} \left\{ a_{m,n}^{(i)} \|\alpha_i\| + \sum_{k=n}^{m-1} a_{m,k+1}^{(i)} \operatorname{Lip}(f_k^{(i)}) \|x_k\| \right\} \\
&\leqslant \max_{i=1,\ldots,N} \left\{ a'_{m,n} \|\alpha_i\| + \sum_{k=n}^{m-1} a_{m,k+1}^{(i)} \operatorname{Lip}(f_k^{(i)})\, a'_{k,n} \|\alpha\|\, |x|_{\mathcal{C}_{n,\alpha}} \right\} \\
&\leqslant \bigl(1 + \lambda\, |x|_{\mathcal{C}_{n,\alpha}}\bigr)\, a'_{m,n} \|\alpha\|,
\end{aligned}$$
it follows that $Jx$ belongs to $\mathcal{C}_{n,\alpha}$ and
$$|Jx|_{\mathcal{C}_{n,\alpha}} \leqslant 1 + \lambda\, |x|_{\mathcal{C}_{n,\alpha}}. \tag{21}$$
Hence $J \colon \mathcal{C}_{n,\alpha} \to \mathcal{C}_{n,\alpha}$.

Now we prove that $J$ is a contraction. For every $x, y \in \mathcal{C}_{n,\alpha}$ and every $m > n$ we have
$$\begin{aligned}
\|(Jx - Jy)_m\| = \|(Jx)_m - (Jy)_m\| &\leqslant \max_{i=1,\ldots,N} \left\{ \sum_{k=n}^{m-1} \|A_{m,k+1}^{(i)}\| \|\Gamma f_k^{(i)}(x_k) - \Gamma f_k^{(i)}(y_k)\| \right\} \\
&\leqslant \max_{i=1,\ldots,N} \left\{ \sum_{k=n}^{m-1} a_{m,k+1}^{(i)} \operatorname{Lip}(f_k^{(i)}) \|x_k - y_k\| \right\} \\
&\leqslant \max_{i=1,\ldots,N} \left\{ \sum_{k=n}^{m-1} a_{m,k+1}^{(i)} \operatorname{Lip}(f_k^{(i)})\, a'_{k,n} \|\alpha\|\, |x - y|_{\mathcal{C}_{n,\alpha}} \right\} \\
&\leqslant a'_{m,n} \|\alpha\|\, \lambda\, |x - y|_{\mathcal{C}_{n,\alpha}},
\end{aligned}$$
and thus
$$|Jx - Jy|_{\mathcal{C}_{n,\alpha}} \leqslant \lambda\, |x - y|_{\mathcal{C}_{n,\alpha}}.$$

Since $\lambda < 1$, $J$ is a contraction and, by the Banach fixed point theorem, $J$ has a unique fixed point $x^*$. By (21) the fixed point $x^*$ verifies
$$|x^*|_{\mathcal{C}_{n,\alpha}} \leqslant \frac{1}{1-\lambda},$$
and this proves the theorem. □

Acknowledgments

António J. G. Bento and César M. Silva were partially supported by FCT through CMA-UBI (project PEst-OE/MAT/UI0212/2014).

José J. Oliveira was supported by the Research Centre of Mathematics of the University of Minho with Portuguese funds from the Fundação para a Ciência e a Tecnologia, through the project PEst-OE/MAT/UI0013/2014.


References

[1] L. Barreira, C. Valls, Stability in delay difference equations with nonuniform exponential behavior, J. Differential Equations 238 (2) (2007) 470–490.

URL http://dx.doi.org/10.1016/j.jde.2007.03.017

[2] L. Berezansky, E. Braverman, E. Liz, Sufficient conditions for the global stability of nonautonomous higher order difference equations, J. Difference Equ. Appl. 11 (9) (2005) 785–798. URL http://dx.doi.org/10.1080/10236190500141050

[3] E. Braverman, B. Karpuz, On global asymptotic stability of nonlinear higher-order difference equations, J. Comput. Appl. Math. 236 (11) (2012) 2803–2812.

URL http://dx.doi.org/10.1016/j.cam.2012.01.015

[4] E. Braverman, B. Karpuz, On stability of delay difference equations with variable coefficients: successive products tests, Adv. Difference Equ. (2012) 2012:177, 8.

URL http://dx.doi.org/10.1186/1687-1847-2012-177

[5] W.-H. Chen, X. Lu, D.-Y. Liang, Global exponential stability for discrete-time neural networks with variable delays, Physics Letters A 358 (3) (2006) 186–198.

URL http://dx.doi.org/10.1016/j.physleta.2006.05.014

[6] L. O. Chua, L. Yang, Cellular neural networks: applications, IEEE Trans. Circuits and Systems 35 (10) (1988) 1273–1290.

URL http://dx.doi.org/10.1109/31.7601

[7] L. O. Chua, L. Yang, Cellular neural networks: theory, IEEE Trans. Circuits and Systems 35 (10) (1988) 1257–1272.

URL http://dx.doi.org/10.1109/31.7600

[8] M. Fiedler, Special matrices and their applications in numerical mathematics, Martinus Nijhoff Publishers, Dordrecht, 1986, translated from the Czech by Petr Přikryl and Karel Segeth. URL http://dx.doi.org/10.1007/978-94-009-4335-3

[9] M. Gao, B. Cui, Global robust exponential stability of discrete-time interval BAM neural networks with time-varying delays, Appl. Math. Model. 33 (3) (2009) 1270–1284.

URL http://dx.doi.org/10.1016/j.apm.2008.01.019

[10] Z. Huang, S. Mohamad, Y. Xia, Exponential periodic attractor of discrete-time BAM neural networks with transmission delays, Comput. Math. Model. 20 (3) (2009) 258–277.

URL http://dx.doi.org/10.1007/s10598-009-9035-0

[11] Z. Huang, X. Wang, F. Gao, The existence and global attractivity of almost periodic sequence solution of discrete-time neural networks, Physics Letters A 350 (3-4) (2006) 182–191. URL http://dx.doi.org/10.1016/j.physleta.2005.10.022

[12] X.-G. Liu, M.-L. Tang, R. Martin, X.-B. Liu, Discrete-time BAM neural networks with variable delays, Phys. Lett. A 367 (4–5) (2007) 322–330.

URL http://dx.doi.org/10.1016/j.physleta.2007.03.037

[13] E. Liz, J. B. Ferreiro, A note on the global stability of generalized difference equations, Appl. Math. Lett. 15 (6) (2002) 655–659.

URL http://dx.doi.org/10.1016/S0893-9659(02)00024-1

[14] C. M. Marcus, R. M. Westervelt, Stability of analog neural networks with delay, Phys. Rev. A (3) 39 (1) (1989) 347–359.

URL http://dx.doi.org/10.1103/PhysRevA.39.347

[15] S. Mohamad, Global exponential stability in continuous-time and discrete-time delayed bidirectional neural networks, Phys. D 159 (3-4) (2001) 233–251.

URL http://dx.doi.org/10.1016/S0167-2789(01)00344-X

[16] S. Mohamad, K. Gopalsamy, Dynamics of a class of discrete-time neural networks and their continuous-time counterparts, Math. Comput. Simulation 53 (1-2) (2000) 1–39.

URL http://dx.doi.org/10.1016/S0378-4754(00)00168-3

[17] S. Mohamad, K. Gopalsamy, Exponential stability of continuous-time and discrete-time cellular neural networks with delays, Appl. Math. Comput. 135 (1) (2003) 17–38.

URL http://dx.doi.org/10.1016/S0096-3003(01)00299-5

[18] S. Mohamad, A. G. Naim, Discrete-time analogues of integrodifferential equations modelling bidirectional neural networks, J. Comput. Appl. Math. 138 (1) (2002) 1–20.

(15)

[19] M. Pituk, Global asymptotic stability in a perturbed higher-order linear difference equation, Comput. Math. Appl. 45 (6-9) (2003) 1195–1202, advances in difference equations, IV. URL http://dx.doi.org/10.1016/S0898-1221(03)00084-1

[20] S. Udpin, P. Niamsup, Global exponential stability of discrete-time neural networks with time-varying delays, Discrete Dyn. Nat. Soc. (2013) Art. ID 325752, 4.

URL http://dx.doi.org/10.1155/2013/325752

[21] X. Wei, D. Zhou, Q. Zhang, On asymptotic stability of discrete-time non-autonomous delayed Hopfield neural networks, Comput. Math. Appl. 57 (11-12) (2009) 1938–1942.

URL http://dx.doi.org/10.1016/j.camwa.2008.10.031

[22] H. Xu, R. Wu, Periodicity and exponential stability of discrete-time neural networks with variable coefficients and delays, Adv. Difference Equ. (2013) 2013:226, 19.

URL http://dx.doi.org/10.1186/1687-1847-2013-226

[23] J. Yu, K. Zhang, S. Fei, Exponential stability criteria for discrete-time recurrent neural networks with time-varying delay, Nonlinear Anal. Real World Appl. 11 (1) (2010) 207–216. URL http://dx.doi.org/10.1016/j.nonrwa.2008.10.053

A. Bento, Departamento de Matemática, Universidade da Beira Interior, 6201-001 Covilhã, Portugal

E-mail address: bento@mat.ubi.pt

J. Oliveira, Centro de Matemática, Universidade do Minho, Campus de Gualtar, 4710-057 Braga, Portugal

E-mail address: jjoliveira@math.uminho.pt

C. Silva, Departamento de Matemática, Universidade da Beira Interior, 6201-001 Covilhã, Portugal

E-mail address: csilva@mat.ubi.pt

URL: www.mat.ubi.pt/~csilva
