In this section, we perform a phase-plane analysis for system (1.1). Let \((x(t), y(t))=(x(t,x_{0},y_{0}), y(t, x_{0},y_{0}))\) be the solution of (1.1) satisfying the initial value
$$x(0, x_{0}, y_{0})=x_{0}, \qquad y(0, x_{0}, y_{0})=y_{0}. $$
We denote
$$G(x)= \int_{0}^{x}g(s)\,ds, \qquad F(y)= \int_{0}^{y}f(s)\,ds. $$
Lemma 2.1
Assume that conditions (\(h_{i}\)) (\(i=1,2,4\)) hold. Then every solution
\((x(t), y(t))\)
of (1.1) exists on the whole
t-axis.
Proof
In view of (\(h_{1}\)), there exists a constant \(c_{0}>0\) such that
$$ \operatorname{sgn}(x)g(x)>0 \quad\mbox{for } \vert x \vert \geq c_{0}. $$
(2.1)
Let us define two Lyapunov-like functions \(V_{\pm}:\mathbf{R}^{2}\to \mathbf{R}\):
$$V_{\pm}(x, y)=G(x)+F(y)\pm y h(x), $$
where \(h:\mathbf{R}\to\mathbf{R}\) is a continuously differentiable function such that \(h(x)=M \operatorname{sgn}(x)\) for \(\vert x \vert \geq c_{0}\), with M from (\(h_{4}\)). We will prove that \(V_{\pm}\) are coercive, that is, \(V_{\pm}(x, y)\to+\infty\) as \(\vert x \vert + \vert y \vert \to+\infty\). From (\(h_{1}\)) we know that \(\lim_{\vert x \vert \to+\infty}G(x)=+\infty\). From (\(h_{2}\)) we get that
$$\frac{\alpha}{2}\leq\liminf_{\vert y \vert \to+\infty}\frac{F(y)}{y^{2}}\leq \limsup_{\vert y \vert \to+\infty}\frac{F(y)}{y^{2}}\leq\frac{\beta}{2}. $$
Since \(h(x)\) is bounded, we further see that the inequalities
$$ \frac{\alpha}{2}\leq\liminf_{\vert y \vert \to+\infty}\frac{F(y)\pm yh(x)}{y^{2}}\leq\limsup_{\vert y \vert \to+\infty}\frac{F(y)\pm yh(x)}{y^{2}}\leq\frac{\beta}{2} $$
(2.2)
hold uniformly with respect to \(x\in\mathbf{R}\). From (2.2) we have that the limits
$$\lim_{\vert y \vert \to+\infty}\bigl(F(y)\pm yh(x)\bigr)=+\infty $$
hold uniformly with respect to \(x\in\mathbf{R}\). Therefore \(V_{\pm}(x, y)\to+\infty\) as \(\vert x \vert + \vert y \vert \to+\infty\).
Next, we show that every solution \((x(t), y(t))\) of (1.1) is defined on the interval \([0, +\infty)\). Set \(V_{+}(t)=V_{+}(x(t), y(t))\). We first prove that there exist constants \(c_{1}>0\) and \(c_{2}>0\) such that
$$ V_{+}'(t)\leq c_{1} V_{+}(t)+c_{2}. $$
(2.3)
For simplicity, we denote \(p_{1}(t)=p_{1}(t, x(t), y(t))\), \(p_{2}(t)=p_{2}(t, x(t), y(t))\). By (1.1) we have
$$\begin{aligned} V_{+}'(t)={}&\bigl(p_{1}(t)-h \bigl(x(t)\bigr)\bigr)g\bigl(x(t)\bigr)+y(t) h'\bigl(x(t)\bigr) \bigl(f\bigl(y(t)\bigr)+p_{1}(t)\bigr) \\ &+p_{2}(t) \bigl(f\bigl(y(t)\bigr)+h\bigl(x(t)\bigr)\bigr). \end{aligned} $$
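This identity can be checked by differentiating \(V_{+}\) along solutions: writing system (1.1) as \(x'=f(y)+p_{1}(t,x,y)\), \(y'=-g(x)+p_{2}(t,x,y)\) (the form consistent with the polar system (2.8) below) and suppressing arguments, we have
$$\begin{aligned} V_{+}'&=g(x)x'+\bigl(f(y)+h(x)\bigr)y'+y h'(x) x' \\ &=g(x) \bigl(f(y)+p_{1}\bigr)+\bigl(f(y)+h(x)\bigr) \bigl(-g(x)+p_{2}\bigr)+y h'(x) \bigl(f(y)+p_{1}\bigr) \\ &=\bigl(p_{1}-h(x)\bigr)g(x)+y h'(x) \bigl(f(y)+p_{1}\bigr)+p_{2}\bigl(f(y)+h(x)\bigr). \end{aligned} $$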
If \(\vert x(t) \vert \geq c_{0}\), then we infer from (\(h_{4}\)), the definition of \(h(x)\), and (2.1) that
$$\bigl(p_{1}(t)-h\bigl(x(t)\bigr)\bigr)g\bigl(x(t)\bigr)\leq0. $$
If \(\vert x(t) \vert \leq c_{0}\), then it follows from (\(h_{4}\)) and the continuity of \(g(x)\) and \(h(x)\) that there exists \(\alpha_{0}>0\) such that
$$\bigl(p_{1}(t)-h\bigl(x(t)\bigr)\bigr)g\bigl(x(t)\bigr)\leq \alpha_{0}. $$
In view of (\(h_{2}\)), we conclude that there exist \(l_{0}>0\) and \(\alpha_{1}>0\) such that
$$ \bigl\vert f(y) \bigr\vert \leq l_{0} \vert y \vert + \alpha_{1} \quad\mbox{for }y\in \mathbf{R}. $$
(2.4)
Since \(h'(x)=0\) for \(\vert x \vert \geq c_{0}\), we know that \(h'(x)\) is bounded, and then there exists \(\beta_{0}>0\) such that \(\vert h'(x) \vert \leq\beta_{0}\), \(x\in\mathbf{R}\). It follows from (2.4) that
$$\bigl\vert y(t)h'\bigl(x(t)\bigr)f\bigl(y(t)\bigr) \bigr\vert \leq\beta_{0} l_{0} y^{2}(t)+\alpha_{1} \beta_{0} \bigl\vert y(t) \bigr\vert . $$
Meanwhile, in view of (\(h_{4}\)), we have
$$\bigl\vert y(t)h'\bigl(x(t)\bigr)p_{1}(t) \bigr\vert \leq M\beta_{0} \bigl\vert y(t) \bigr\vert . $$
From (\(h_{4}\)) and (2.4) we get that
$$\bigl\vert p_{2}(t)f\bigl(y(t)\bigr) \bigr\vert \leq M l_{0} \bigl\vert y(t) \bigr\vert +M \alpha_{1}. $$
Since \(h(x)\) is bounded for \(x\in\mathbf{R}\), there exists \(\beta_{0}'>0\) such that \(\vert h(x) \vert \leq \beta_{0}'\), \(x\in\mathbf{R}\). Consequently, we have
$$\bigl\vert p_{2}(t) h\bigl(x(t)\bigr) \bigr\vert \leq M \beta_{0}'. $$
Therefore, we obtain
$$ \begin{aligned}[b] V_{+}'(t)\leq{} &\beta_{0} l_{0} y(t)^{2}+(\alpha_{1}\beta_{0}+M \beta_{0}+Ml_{0}) \bigl\vert y(t) \bigr\vert +\bigl( \alpha_{0}+M\alpha _{1}+M\beta_{0}' \bigr) \\ \leq{} & \beta_{1} y(t)^{2}+\beta_{1}' \end{aligned} $$
(2.5)
with \(\beta_{1}=\beta_{0} l_{0}+\frac{1}{2}\) and \(\beta_{1}'=\frac{1}{2}(\alpha_{1}\beta_{0}+M\beta_{0}+Ml_{0})^{2}+(\alpha _{0}+M\alpha_{1}+M\beta_{0}')\), where we have used the elementary inequality \(a \vert y \vert \leq\frac{1}{2}y^{2}+\frac{1}{2}a^{2}\) with \(a=\alpha_{1}\beta_{0}+M\beta_{0}+Ml_{0}\). From (2.2) we know that there exist \(l_{1}>0\) and \(\beta_{2}>0\) such that
$$ y^{2}\leq l_{1}\bigl(F(y)+yh(x)\bigr)+\beta_{2} \quad\mbox{for } (x, y)\in\mathbf{R^{2}.} $$
(2.6)
Combining (2.5) and (2.6), we get
$$V_{+}'(t)\leq\beta_{1}l_{1}\bigl(F\bigl(y(t) \bigr)+y(t)h\bigl(x(t)\bigr)\bigr)+\beta_{1}\beta_{2}+ \beta_{1}'. $$
Since \(\lim_{x\to+\infty}G(x)=+\infty\), there exists \(G_{0}>0\) such that \(G(x)+G_{0}\geq0\) for \(x\in\mathbf{R}\). We conclude that
$$V_{+}'(t)\leq \beta_{1}l_{1}\bigl(G\bigl(x(t) \bigr)+F\bigl(y(t)\bigr)+y(t)h\bigl(x(t)\bigr)\bigr)+\beta_{1}l_{1}G_{0}+ \beta_{1}\beta _{2}+\beta_{1}'. $$
Set \(c_{1}=\beta_{1}l_{1}\) and \(c_{2}=\beta_{1}l_{1}G_{0}+\beta_{1}\beta_{2}+\beta_{1}'\). We get that \(V_{+}'(t)\leq c_{1}V_{+}(t)+c_{2}\). Then, for any finite \(\omega>0\), we have
$$V_{+}(t)\leq V_{+}(0)e^{c_{1}\omega}+\frac{c_{2}}{c_{1}}\bigl(e^{c_{1}\omega}-1 \bigr) \quad\mbox{for } t\in[0, \omega). $$
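This bound is the standard Gronwall-type estimate: multiplying (2.3) by \(e^{-c_{1}t}\) gives
$$\bigl(V_{+}(t)e^{-c_{1}t}\bigr)'=\bigl(V_{+}'(t)-c_{1}V_{+}(t)\bigr)e^{-c_{1}t}\leq c_{2}e^{-c_{1}t}, $$
and integrating over \([0, t]\) yields
$$V_{+}(t)\leq V_{+}(0)e^{c_{1}t}+\frac{c_{2}}{c_{1}}\bigl(e^{c_{1}t}-1\bigr), \quad t\in[0, \omega). $$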
Since \(V_{+}\) is coercive, \((x(t), y(t))\) does not blow up on any finite interval \([0, \omega)\). Therefore, \((x(t), y(t))\) exists on the interval \([0, +\infty)\).
To prove that \((x(t), y(t))\) exists on the interval \((-\infty, 0]\), we consider the other Lyapunov-like function \(V_{-}(x, y)\). Set \(V_{-}(t)=V_{-}(x(t), y(t))\). Using the same methods as before, we can show that there exist two positive constants \(d_{1}\), \(d_{2}\) such that
$$ V_{-}'(t)\geq -d_{1}V_{-}(t)-d_{2}. $$
(2.7)
Then, for any \(\omega>0\), we have
$$V_{-}(t)\leq V_{-}(0)e^{d_{1}\omega}+\frac{d_{2}}{d_{1}}\bigl(e^{d_{1}\omega}-1 \bigr) \quad\mbox{for }t\in(-\omega, 0]. $$
Since \(V_{-}\) is coercive, \((x(t), y(t))\) also does not blow up on any finite interval \((-\omega, 0]\). Therefore, \((x(t), y(t))\) exists on the interval \((-\infty, 0]\). The proof is complete. □
Since the uniqueness of the solutions to Cauchy problems of (1.1) is assumed, we can define the Poincaré map \(P: \mathbf{R}^{2}\to\mathbf{R}^{2}\) as follows:
$$P:(x_{0}, y_{0})\to(x_{1}, y_{1})= \bigl(x(2\pi, x_{0}, y_{0}), y(2\pi, x_{0}, y_{0})\bigr). $$
It is well known that the Poincaré map P is an area-preserving homeomorphism. The fixed points of P correspond to the 2π-periodic solutions of (1.1).
On the basis of the global existence of solutions of (1.1), we can get the elasticity property of solutions of (1.1) by using a classical result (Theorem 6.5 in [15]).
Lemma 2.2
Assume that conditions (\(h_{i}\)) (\(i=1,2,4\)) hold. Then, for any
\(T>0\)
and
\(R_{1}>0\), there exists
\(R_{2}>R_{1}\)
such that:

(1)
If
\(x_{0}^{2}+y_{0}^{2}\leq R_{1}^{2}\), then
\(x(t)^{2}+y(t)^{2}\leq R_{2}^{2}\), \(t\in[0, T]\).

(2)
If
\(x_{0}^{2}+y_{0}^{2}\geq R_{2}^{2}\), then
\(x(t)^{2}+y(t)^{2}\geq R_{1}^{2}\), \(t\in[0, T]\).
From Lemma 2.2 we know that if \(x_{0}^{2}+y_{0}^{2}\) is large enough, then \(x^{2}(t)+y^{2}(t)\neq0\), \(t\in[0, T]\). Thus we can take the polar coordinate transformation
$$x(t)=r(t)\cos\theta(t), \qquad y(t)=r(t)\sin\theta(t). $$
Under this transformation, system (1.1) becomes
$$ \left \{ \textstyle\begin{array}{l} \frac{dr}{dt}=-g(r\cos\theta)\sin\theta+f(r\sin\theta)\cos \theta+p_{1}(t,r,\theta)\cos\theta +p_{2}(t,r,\theta)\sin\theta,\\\frac{d\theta}{dt}=-\frac{1}{r}g(r\cos\theta)\cos\theta-\frac {1}{r}f(r\sin\theta)\sin\theta -\frac{1}{r}p_{1}(t,r,\theta)\sin\theta+\frac{1}{r}p_{2}(t,r,\theta )\cos\theta, \end{array}\displaystyle \right . $$
(2.8)
where \(p_{1}(t,r,\theta)=p_{1}(t, r\cos\theta,r\sin\theta)\) and \(p_{2}(t,r,\theta)=p_{2}(t,r\cos\theta,r\sin\theta)\). Let us denote by \((r(t), \theta(t))=(r(t, r_{0}, \theta_{0}), \theta(t, r_{0}, \theta_{0}))\) the solution of (2.8) satisfying the initial value
$$r(0,r_{0}, \theta_{0})=r_{0}, \qquad\theta(0, r_{0}, \theta_{0})=\theta_{0} $$
with \(x_{0}=r_{0}\cos\theta_{0}\), \(y_{0}=r_{0}\sin\theta_{0}\). We can rewrite the Poincaré map P in the polar coordinate form \(P: (r_{0}, \theta_{0})\to(r_{1}, \theta_{1})\),
$$r_{1}=r(2\pi, r_{0}, \theta_{0}), \qquad \theta_{1}=\theta(2\pi, r_{0}, \theta_{0})+2l\pi, $$
where l is an arbitrary integer.
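For completeness, the polar system (2.8) follows from the identities \(r'=x'\cos\theta+y'\sin\theta\) and \(\theta'=\frac{1}{r}(y'\cos\theta-x'\sin\theta)\): writing system (1.1) as \(x'=f(y)+p_{1}(t,x,y)\), \(y'=-g(x)+p_{2}(t,x,y)\) and substituting \(x=r\cos\theta\), \(y=r\sin\theta\), we obtain
$$\begin{aligned} r'&=-g(r\cos\theta)\sin\theta+f(r\sin\theta)\cos\theta+p_{1}\cos\theta+p_{2}\sin\theta, \\ \theta'&=-\frac{1}{r}g(r\cos\theta)\cos\theta-\frac{1}{r}f(r\sin\theta)\sin\theta-\frac{1}{r}p_{1}\sin\theta+\frac{1}{r}p_{2}\cos\theta. \end{aligned} $$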
Lemma 2.3
Assume that conditions (\(h_{i}\)) (\(i=1,2,4\)) hold. Then, for any
\(T>0\), there exists a constant
\(R>0\)
such that if
\(r(t)\geq R\), \(t\in[0, T]\), then
$$\theta'(t)< 0, \quad t\in[0, T]. $$
Proof
It follows from (\(h_{1}\)) that there exists \(a_{1}>0\) such that
$$\operatorname{sgn}(x)g(x)>M \quad\mbox{for } \vert x \vert \geq a_{1}, $$
which, together with (\(h_{4}\)), implies that
$$\operatorname{sgn}(x) \bigl(g(x)-p_{2}(t, x, y)\bigr)>0 \quad\mbox{for } \vert x \vert \geq a_{1}, t, y\in \mathbf{R}. $$
From (\(h_{2}\)) and (\(h_{4}\)) we know that there exist two constants \(\gamma>0\) and \(a_{2}>0\) such that
$$\frac{f(y)+p_{1}(t,x,y)}{y}\geq\gamma\quad\mbox{for } \vert y \vert \geq a_{2}, t, x\in\mathbf{R}. $$
Therefore, if \(\vert x(t) \vert \geq a_{1}\) and \(\vert y(t) \vert \geq a_{2}\), then we have
$$ \frac{d\theta}{dt}=-\frac{1}{r}\bigl(g(r\cos\theta)- p_{2}(t,r, \theta)\bigr)\cos\theta-\frac{1}{r}\bigl(f(r\sin\theta)+ p_{1}(t,r,\theta)\bigr)\sin\theta< 0. $$
(2.9)
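Indeed, since \(\operatorname{sgn}(\cos\theta)=\operatorname{sgn}(x)\) and \(y=r\sin\theta\), both terms in (2.9) have a definite sign:
$$\bigl(g(r\cos\theta)-p_{2}(t,r,\theta)\bigr)\cos\theta=\operatorname{sgn}(x) \bigl(g(x)-p_{2}(t,x,y)\bigr) \vert \cos\theta \vert >0 \quad\mbox{for } \bigl\vert x(t) \bigr\vert \geq a_{1}, $$
and
$$\bigl(f(r\sin\theta)+p_{1}(t,r,\theta)\bigr)\sin\theta=\frac{f(y)+p_{1}(t,x,y)}{y}\,r\sin^{2}\theta\geq\gamma r\sin^{2}\theta>0 \quad\mbox{for } \bigl\vert y(t) \bigr\vert \geq a_{2}, $$
so \(\frac{d\theta}{dt}<0\).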
If \(\vert x(t) \vert \leq a_{1}\) and \(\vert y(t) \vert \geq a_{2}\), then we have
$$\frac{1}{r}\bigl(f(r\sin\theta)+p_{1}(t,r,\theta)\bigr)\sin \theta\geq \gamma\sin^{2}\alpha, $$
where \(\alpha=\arctan\frac{a_{2}}{a_{1}}\). On the other hand, there exists \(A_{1}>0\) such that
$$\bigl\vert g(x) \bigr\vert + \bigl\vert p_{2}(t,x,y) \bigr\vert \leq A_{1} \quad\mbox{for } \vert x \vert \leq a_{1}, t,y\in \mathbf{R}. $$
It follows that if \(r(t)\) is large enough and \(\vert x(t) \vert \leq a_{1}\), then
$$\biggl\vert \frac{1}{r}\bigl(g(r\cos\theta)- p_{2}(t,r, \theta)\bigr) \biggr\vert \leq\frac{A_{1}}{r(t)}\leq \frac{\gamma}{2} \sin^{2}\alpha. $$
Consequently, if \(r(t)\) is large enough and \(\vert x(t) \vert \leq a_{1}\), \(\vert y(t) \vert \geq a_{2}\), then we get
$$\begin{aligned}[b] \frac{d\theta}{dt}&\leq \biggl\vert \frac{1}{r}\bigl(g(r\cos\theta)- p_{2}(t,r,\theta)\bigr)\cos\theta \biggr\vert -\frac{1}{r} \bigl[f(r\sin\theta)+ p_{1}(t,r,\theta)\bigr]\sin\theta\\&\leq -\frac{\gamma}{2}\sin^{2}\alpha.\end{aligned} $$
(2.10)
Finally, we know that there exists \(A_{2}>0\) such that
$$ \bigl\vert f(y) \bigr\vert + \bigl\vert p_{1}(t,x,y) \bigr\vert \leq A_{2} \quad\mbox{for } \vert y \vert \leq a_{2}, t,x\in \mathbf{R}. $$
(2.11)
If \(\vert y(t) \vert \leq a_{2}\) and \(r(t)\) is large enough, then we have that \(\vert x(t) \vert \) is also large enough. Therefore we get from (\(h_{1}\)), (\(h_{4}\)), and (2.11) that, for \(r(t)\) large enough,
$$\begin{gathered} \bigl[g(r\cos\theta)-p_{2}(t,r,\theta)\bigr]\cos\theta+\bigl[f(r\sin \theta )+p_{1}(t,r,\theta)\bigr]\sin\theta \\\quad\geq g(r\cos\theta)\cos \theta-(M+A_{2})>0.\end{gathered} $$
Furthermore,
$$ \frac{d\theta}{dt}=-\frac{1}{r}\bigl\{ \bigl[g(r\cos\theta)-p_{2}(t,r, \theta )\bigr]\cos\theta+\bigl[f(r\sin\theta) +p_{1}(t,r,\theta)\bigr] \sin\theta\bigr\} < 0. $$
(2.12)
Combining (2.9), (2.10), and (2.12), we get the conclusion of Lemma 2.3. □
Lemma 2.4
Assume that conditions (\(h_{i}\)) (\(i=1,\ldots,4\)) hold. Let
m
be a positive integer. Then, for any given integer
\(n\geq1\), there exists a constant
\(\varrho_{n}>0\)
such that, for
\(r_{0}\geq\varrho_{n}\),
$$\theta(2m\pi, r_{0}, \theta_{0})-\theta_{0}< -2n \pi. $$
Proof
From conditions (\(h_{1}\)) and (\(h_{2}\)) we know that there exists \(d>0\) such that
$$ \operatorname{sgn}(x)g(x)\geq M,\quad \vert x \vert \geq d \quad\mbox{and}\quad \operatorname{sgn}(y)f(y)\geq M,\quad \vert y \vert \geq d. $$
(2.13)
From Lemma 2.3 (applied with \(T=2m\pi\)) and Lemma 2.2 we know that there is a constant \(c_{m}\geq d\) such that, for \(r_{0}\) sufficiently large,
$$r(t)\geq c_{m}, \quad t\in[0, 2m\pi] $$
and
$$\theta'(t)< 0, \quad t\in[0, 2m\pi]. $$
Then the solution \((r(t), \theta(t))\) moves clockwise in the region
$$D=\bigl\{ (x,y)\in\mathbf{R}^{2}:x^{2}+y^{2}\geq c_{m}^{2}\bigr\} . $$
We now decompose the set D into eight regions as follows:
$$\begin{aligned}& D_{1}=\bigl\{ (x,y)\in D: \vert x \vert \leq c_{m}, y>0\bigr\} , \\& D_{2}=\bigl\{ (x,y)\in D:x\geq c_{m}, y\geq d\bigr\} , \\& D_{3}=\bigl\{ (x,y)\in D:x\geq c_{m}, \vert y \vert \leq d\bigr\} , \\& D_{4}=\bigl\{ (x,y)\in D:x\geq c_{m}, y\leq -d\bigr\} , \\& D_{5}=\bigl\{ (x,y)\in D: \vert x \vert \leq c_{m}, y< 0\bigr\} , \\& D_{6}=\bigl\{ (x,y)\in D:x\leq -c_{m},y\leq -d\bigr\} , \\& D_{7}=\bigl\{ (x,y)\in D:x\leq -c_{m}, \vert y \vert \leq d\bigr\} , \\& D_{8}=\bigl\{ (x,y)\in D:x\leq -c_{m}, y\geq d\bigr\} . \end{aligned}$$
Next, we will estimate the time needed for the solution \((x(t), y(t))\) to pass through the regions \(D_{i}\) (\(i=1,\ldots, 8\)), respectively. Without loss of generality, we assume that \((x_{0}, y_{0})\in D_{1}\). Then, \((x(t), y(t))\) rotates following the cycle
$$D_{1}\to D_{2}\to D_{3}\to D_{4}\to D_{5}\to D_{6}\to D_{7}\to D_{8}\to D_{1}. $$
For each k (\(k=1,\ldots, 8\)), let \([t_{1}, t_{2}]\subset[0, 2m\pi]\) be such that
$$\bigl(x(t), y(t)\bigr)\in D_{k},\quad t\in[t_{1}, t_{2}], $$
and
$$\bigl(x(t_{i}), y(t_{i})\bigr)\in\partial D_{k} \quad(i=1,2). $$
We first treat the case \((x(t),y(t))\in D_{1}\), \(t\in[t_{1}, t_{2}]\). It follows from (\(h_{2}\)) that there exist constants \(\beta_{0}\geq \alpha_{0}>0\) and \(M_{0}>0\) such that
$$ \alpha_{0} y-M_{0}\leq f(y)\leq\beta_{0} y+M_{0}, \quad y\geq 0. $$
(2.14)
Therefore, if \((x(t), y(t))\in D_{1}\), then we have
$$x'(t)=f\bigl(y(t)\bigr)+p_{1}\bigl(t, x(t), y(t)\bigr) \geq\alpha_{0} y(t)-M_{1} $$
with \(M_{1}=M_{0}+M\). From Lemma 2.2 we know that, for any constant l (\(>\sqrt{c_{m}^{2}+\frac{M_{1}^{2}}{\alpha_{0}^{2}}}\)), there is a constant \(l_{0}>l\) such that, for \(r_{0}\geq l_{0}\),
$$r(t)\geq l, \quad t\in[0, 2m\pi]. $$
As a result, we get that, for \(r_{0}\geq l_{0}\) and \((x(t),y(t))\in D_{1}\), \(t\in[t_{1}, t_{2}]\),
$$y(t)=\sqrt{r^{2}(t)-x^{2}(t)}\geq\sqrt{l^{2}-c_{m}^{2}}. $$
Consequently,
$$x'(t)\geq\alpha_{0}\sqrt{l^{2}-c_{m}^{2}}-M_{1}>0, $$
which implies that, for any sufficiently small \(\varepsilon>0\),
$$ t_{2}-t_{1}\leq\frac{2c_{m}}{\alpha_{0}\sqrt{l^{2}-c_{m}^{2}}-M_{1}}< \varepsilon, $$
(2.15)
provided that l is sufficiently large. According to Lemma 2.2, we further know that (2.15) holds when \(r_{0}\) is sufficiently large.
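The estimate (2.15) expresses that \(x(t)\) crosses the strip \(\vert x \vert \leq c_{m}\), of width \(2c_{m}\), with speed bounded from below: since \(x'(t)>0\) on \([t_{1}, t_{2}]\),
$$t_{2}-t_{1}= \int_{x(t_{1})}^{x(t_{2})}\frac{dx}{x'}\leq\frac{x(t_{2})-x(t_{1})}{\alpha_{0}\sqrt{l^{2}-c_{m}^{2}}-M_{1}}\leq\frac{2c_{m}}{\alpha_{0}\sqrt{l^{2}-c_{m}^{2}}-M_{1}}, $$
and the right-hand side tends to 0 as \(l\to+\infty\).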
Similarly, we have that if \((x(t), y(t))\in D_{5}\), \(t\in[t_{1}, t_{2}]\), then
$$t_{2}-t_{1}< \varepsilon, $$
provided that \(r_{0}\) is sufficiently large.
We next treat the case \((x(t), y(t))\in D_{2}\), \(t\in[t_{1}, t_{2}]\). Let us define \(W_{+}:\mathbf{R^{2}}\to\mathbf{R}\) as follows:
$$W_{+}(x, y)=F(y)+G(x)-M(x-y). $$
Set
$$W_{+}(t)=F\bigl(y(t)\bigr)+G\bigl(x(t)\bigr)-M\bigl(x(t)-y(t)\bigr). $$
If \(x(t)\geq c_{m}\) and \(y(t)\geq d\), then we get from (\(h_{4}\)) and (2.14) that
$$\begin{aligned} W_{+}'(t)={}& f\bigl(y(t)\bigr) \bigl(p_{2}(t,x,y)-M\bigr)+g\bigl(x(t)\bigr) \bigl(p_{1}(t,x,y)-M \bigr)+ M\bigl(p_{2}(t,x,y)-p_{1}(t,x,y)\bigr) \\ \leq{}& f\bigl(y(t)\bigr) \bigl(p_{2}(t,x,y)-M\bigr)+g \bigl(x(t)\bigr) \bigl(p_{1}(t,x,y)-M\bigr)+ M\bigl(M-p_{1}(t,x,y) \bigr) \\ \leq{}& f\bigl(y(t)\bigr) \bigl(p_{2}(t,x,y)-M\bigr)+\bigl(g \bigl(x(t)\bigr)-M\bigr) \bigl(p_{1}(t, x, y)-M\bigr)\leq0, \end{aligned} $$
which implies that \(W_{+}(t)\) is decreasing when \((x(t), y(t))\) lies in the region \(D_{2}\). Hence we get that, for \(t\in[t_{1}, t_{2}]\),
$$W_{+}(t)\geq W_{+}(t_{2}). $$
Consequently,
$$\begin{aligned}[b] &F\bigl(y(t)\bigr)+G\bigl(x(t)\bigr)-M\bigl(x(t)-y(t)\bigr)\\&\quad\geq F \bigl(y(t_{2})\bigr)+G\bigl(x(t_{2})\bigr)-M \bigl(x(t_{2})-y(t_{2})\bigr),\quad t\in[t_{1}, t_{2}].\end{aligned} $$
(2.16)
Since \(y(t_{2})=d\), there is a constant \(B>0\) such that \(\vert F(y(t_{2})) \vert \leq B\). It follows from (2.16) that
$$ F\bigl(y(t)\bigr)+My(t)\geq\bigl(G\bigl(x(t_{2})\bigr)-G\bigl(x(t) \bigr)\bigr)-M\bigl(x(t_{2})-x(t)\bigr)-M_{1},\quad t \in[t_{1}, t_{2}], $$
(2.17)
where \(M_{1}=B+Md\). According to (2.14), we have that, for \(y\geq0\),
$$ F(y)\leq\frac{1}{2}\beta_{0} y^{2}+M_{0}y. $$
(2.18)
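Inequality (2.18) follows by integrating the upper bound in (2.14):
$$F(y)= \int_{0}^{y}f(s)\,ds\leq \int_{0}^{y}(\beta_{0} s+M_{0})\,ds=\frac{1}{2}\beta_{0} y^{2}+M_{0}y, \quad y\geq0. $$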
Hence, for \(t\in[t_{1}, t_{2}]\), we infer from (2.17) and (2.18) that
$$\beta_{0} y^{2}(t)+2(M+M_{0})y(t)\geq 2\bigl(G \bigl(x(t_{2})\bigr)-G\bigl(x(t)\bigr)\bigr)-2M\bigl(x(t_{2})-x(t) \bigr)-2M_{1}. $$
Let us take \(\eta>\beta_{0}\) such that, for \(y\geq d\),
$$\eta y^{2}\geq\beta_{0}y^{2}+2(M+M_{0})y+2M_{1}. $$
Then we obtain
$$ \eta y^{2}(t)\geq2\bigl(G\bigl(x(t_{2})\bigr)-G\bigl(x(t) \bigr)\bigr)-2M\bigl(x(t_{2})-x(t)\bigr). $$
(2.19)
Using the mean value theorem, we get
$$\begin{gathered} G\bigl(x(t_{2})\bigr)-G\bigl(x(t)\bigr)-M\bigl(x(t_{2})-x(t) \bigr)\\\quad=\frac {1}{2}\bigl(G\bigl(x(t_{2})\bigr)-G\bigl(x(t)\bigr) \bigr)+\biggl(\frac{1}{2}g(\xi)-M\biggr) \bigl(x(t_{2})-x(t) \bigr),\end{gathered} $$
where \(\xi\in[x(t), x(t_{2})]\). Since \(x(t)\geq c_{m}\), \(t\in[t_{1}, t_{2}]\), we can take \(c_{m}\) large enough such that \(g(\xi)\geq2M\). Therefore, we obtain that, for \(t\in[t_{1}, t_{2}]\),
$$G\bigl(x(t_{2})\bigr)-G\bigl(x(t)\bigr)-M\bigl(x(t_{2})-x(t) \bigr)\geq\frac{1}{2}\bigl(G\bigl(x(t_{2})\bigr)-G\bigl(x(t) \bigr)\bigr), $$
which, together with (2.19), implies that
$$ y(t)\geq\sqrt{\frac{1}{\eta}\bigl(G\bigl(x(t_{2})\bigr)-G\bigl(x(t) \bigr)\bigr)}. $$
(2.20)
Since \(x'(t)=f(y(t))+p_{1}(t,x(t),y(t))\), we infer from (2.14) and (2.20) that
$$x'(t)\geq \frac{\alpha_{0}}{\sqrt{\eta}}\sqrt{G\bigl(x(t_{2}) \bigr)-G\bigl(x(t)\bigr)}-(M_{0}+M). $$
Let us take a fixed positive constant L. Then we have that, for \(x(t)\in[c_{m}, x(t_{2})-L]\),
$$G\bigl(x(t_{2})\bigr)-G\bigl(x(t)\bigr)\geq G\bigl(x(t_{2}) \bigr)-G\bigl(x(t_{2})-L\bigr)=g\bigl(\xi'\bigr)L\to+ \infty, \quad x(t_{2})\to+\infty, $$
which implies that there exists a positive constant \(\eta_{0}<\frac{\alpha_{0}}{\sqrt{\eta}}\) such that, for \(x(t)\in [c_{m}, x(t_{2})-L]\),
$$\frac{\alpha_{0}}{\sqrt{\eta}}\sqrt{G\bigl(x(t_{2})\bigr)-G\bigl(x(t) \bigr)}-(M_{0}+M)\geq \eta_{0}\sqrt{G\bigl(x(t_{2}) \bigr)-G\bigl(x(t)\bigr)}. $$
Consequently, for \(x(t)\in[c_{m}, x(t_{2})-L]\), we get
$$x'(t)\geq\eta_{0}\sqrt{G\bigl(x(t_{2}) \bigr)-G\bigl(x(t)\bigr)}. $$
Let \(t_{*}\in[t_{1}, t_{2}]\) be such that \(x(t_{*})=x(t_{2})-L\). Then we have that, for any sufficiently small \(\varepsilon>0\),
$$ \begin{aligned}[b] t_{*}-t_{1}\leq{}& \frac{1}{\eta_{0}} \int_{c_{m}}^{x(t_{2})-L}\frac{dx}{\sqrt {G(x(t_{2}))-G(x)}} \\ \leq {}&\frac{1}{\eta_{0}} \int_{0}^{x(t_{2})}\frac{dx}{\sqrt {G(x(t_{2}))-G(x)}} \\ ={}& \frac{\sqrt{2}}{\eta_{0}}\tau\bigl(x(t_{2})\bigr)< \frac{\varepsilon}{2}, \end{aligned} $$
(2.21)
provided that \(x(t_{2})\) is large enough. Consequently, we have that \(t_{*}-t_{1}<\frac{\varepsilon}{2}\), provided that \(r_{0}\) is sufficiently large. We next estimate \(t_{2}-t_{*}\). If \(t\in[t_{*}, t_{2}]\), then we have
$$\begin{aligned} y(t)={}&d+ \int_{t}^{t_{2}}g\bigl(x(s)\bigr)\,ds- \int_{t}^{t_{2}}p_{2}\bigl(s,x(s), y(s)\bigr) \,ds \\ \geq{}& d+\nu\bigl(x(t_{2})\bigr) (t_{2}-t)-M(t_{2}-t), \end{aligned} $$
where \(\nu(x(t_{2}))=\min\{g(x):x(t_{2})-L\leq x\leq x(t_{2})\}\). Obviously, \(\nu(x(t_{2}))\to+\infty\) as \(x(t_{2})\to+\infty\). On the other hand, it follows from (2.14) that
$$x'(t)=f\bigl(y(t)\bigr)+p_{1}\bigl(t, x(t), y(t)\bigr)\geq \alpha_{0}y(t)-(M+M_{0}). $$
Therefore, we get that, for \(x(t_{2})\) large enough,
$$\begin{aligned} L={}& \int_{t_{*}}^{t_{2}}x'(s)\,ds\geq \alpha_{0} \int_{t_{*}}^{t_{2}}y(s)\,ds-(M+M_{0}) (t_{2}-t_{*}) \\ \geq{}& \alpha_{0}\biggl[d(t_{2}-t_{*})+ \frac{1}{2}\nu\bigl(x(t_{2})\bigr) (t_{2}-t_{*})^{2}- \frac {1}{2}M(t_{2}-t_{*})^{2}\biggr]-(M+M_{0}) (t_{2}-t_{*}) \\ \geq{}& (\alpha_{0}d-M-M_{0}) (t_{2}-t_{*})+\frac{1}{4}\alpha_{0}\nu \bigl(x(t_{2})\bigr) (t_{2}-t_{*})^{2}, \end{aligned} $$
which, together with \(\nu(x(t_{2}))\to+\infty\) as \(x(t_{2})\to+\infty\), implies that, for any sufficiently small \(\varepsilon>0\),
$$ t_{2}-t_{*}< \frac{\varepsilon}{2}, $$
(2.22)
provided that \(x(t_{2})\) is large enough or \(r_{0}\) is large enough. From (2.21) and (2.22) we know that, for any sufficiently small \(\varepsilon>0\),
$$t_{2}-t_{1}< \varepsilon, $$
provided that \(r_{0}\) is large enough.
Similarly, we have that, if \((x(t), y(t))\in D_{i}\), \(i=4, 6, 8\), \(t\in[t_{1}, t_{2}]\), then
$$t_{2}-t_{1}< \varepsilon, $$
provided that \(r_{0}\) is large enough.
We now consider the case \((x(t), y(t))\in D_{3}\), \(t\in[t_{1}, t_{2}]\). In this case, we have
$$ \bigl\vert y(t) \bigr\vert \leq d, \quad t\in[t_{1}, t_{2}]. $$
(2.23)
Integrating both sides of \(y'=-g(x(t))+p_{2}(t, x(t), y(t))\) over \([t_{1}, t_{2}]\) and using \(y(t_{1})=d\) and \(y(t_{2})=-d\), we get
$$\begin{aligned} 2d={}& \int_{t_{1}}^{t_{2}}g\bigl(x(s)\bigr)\,ds- \int_{t_{1}}^{t_{2}}p_{2}\bigl(s,x(s),y(s)\bigr) \, ds \\ \geq{}& (\mu_{*}-M) (t_{2}-t_{1}), \end{aligned} $$
where \(\mu_{*}=\min\{g(x(t)): t_{1}\leq t\leq t_{2}\}\). From (2.23), (\(h_{1}\)), and Lemma 2.2 we get that \(\mu_{*}\to+\infty\) as \(r_{0}\to\infty\). Therefore, we have that, for any sufficiently small \(\varepsilon>0\),
$$t_{2}t_{1}< \varepsilon, $$
provided that \(r_{0}\) is large enough.
Similarly, we have that, if \((x(t), y(t))\in D_{7}\), \(t\in[t_{1}, t_{2}]\), then
$$t_{2}-t_{1}< \varepsilon, $$
provided that \(r_{0}\) is large enough.
From the previous conclusion we get that, for any sufficiently small \(\varepsilon>0\), there is \(\varrho_{1}>0\) such that if \(r_{0}\geq \varrho_{1}\), then \((x(t), y(t))\in D\), and if
$$\theta(s_{2})-\theta(s_{1})=-2\pi, $$
then
$$0< s_{2}-s_{1}< 8\varepsilon. $$
Consequently, the motion \((x(t),y(t))\) completes each full clockwise turn in time less than 8ε. Choosing \(\varepsilon<\frac{m\pi}{4n}\), we get more than \(\frac{2m\pi}{8\varepsilon}>n\) turns on \([0, 2m\pi]\). Therefore, for any integer \(n\geq 1\), there is \(\varrho_{n}>0\) such that, for \(r_{0}\geq\varrho_{n}\), the motion \((x(t), y(t))\) rotates clockwise more than n turns during the time interval \([0, 2m\pi]\).
The proof of Lemma 2.4 is thus complete. □