
4.4 Efficient estimator of correlations

4.4.3 Conclusions

Figure 4.12: The decay functions in the case of the persistent random walk, sampled with Weibull sampling intervals. The autocorrelations decrease slowly. (a) Decay of the autocorrelation ⟨r^A_t0(t) r^A_t0(t+x∆t0)⟩; (b) decay of the cross-correlation ⟨r^A_t0(t) r^B_t0(t+x∆t0)⟩.

independently with sampling intervals drawn from a Weibull distribution (again with parameters a = 20 and b = 0.7). This construction generates slowly vanishing correlations (which would decay exponentially without the asynchronous sampling). Again we generate 50 pairs of time series, each 25000 steps long, with persistency α = 0.999. Figure 4.12 shows two of the decay functions (the decays of the two autocorrelations are identical, so we show only one of them). Here we set ∆t0 = 50.
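The construction above can be sketched in a few lines. This is a minimal illustration, not the exact code used in the thesis: we take a persistent random walk (each step repeats the previous one with probability α) and sample it asynchronously at times whose increments are Weibull distributed; the mapping of a = 20 to the Weibull scale and b = 0.7 to the shape is our assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def persistent_walk_steps(n, alpha, rng):
    """Steps of a persistent random walk: each step repeats the
    previous one with probability alpha, otherwise a fresh +/-1 is drawn."""
    steps = np.empty(n)
    steps[0] = rng.choice([-1.0, 1.0])
    for t in range(1, n):
        if rng.random() < alpha:
            steps[t] = steps[t - 1]
        else:
            steps[t] = rng.choice([-1.0, 1.0])
    return steps

def weibull_sampling_times(t_max, scale, shape, rng):
    """Asynchronous sampling times with Weibull-distributed intervals."""
    times = []
    t = 0.0
    while t < t_max:
        t += scale * rng.weibull(shape)  # numpy's weibull has unit scale
        times.append(t)
    return np.array(times[:-1])          # drop the point beyond t_max

n = 25000
steps = persistent_walk_steps(n, alpha=0.999, rng=rng)
walk = np.cumsum(steps)

# sample the walk at the (asynchronous) Weibull times
sample_t = weibull_sampling_times(n - 1, scale=20.0, shape=0.7, rng=rng)
sampled = walk[np.floor(sample_t).astype(int)]
```

Each of the two series A and B would be sampled with its own, independent set of Weibull times, which is what slows the decay of the measured correlations.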

In Figure 4.13(a) we show the results for the correlation coefficients on different time scales. The decomposition method gives good results in this case too. The ratio of the standard deviations at ∆t = 1000 is close to 3.5, signalling that, in order to obtain the same precision, we need roughly one order of magnitude more data points for direct measurement than for the decomposition method. Figure 4.13(b) shows the extrapolation to the asymptotic value of the correlation, using piecewise cubic Hermite interpolation. The extrapolated curve intercepts the y-axis at the value 1.002. Applying the extrapolation to the endpoints of the error bars and comparing the direct measurements with the decomposition results, we find a factor of 20 improvement in precision.
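The extrapolation step can be sketched as follows. The correlation is plotted against 1/∆t, and the asymptotic value is the intercept of the interpolating curve at 1/∆t → 0. The data values below are illustrative placeholders (not the thesis's measurements), and the use of SciPy's PchipInterpolator is our choice of implementation for the piecewise cubic Hermite interpolation:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Hypothetical correlation coefficients measured on increasing time
# scales, approaching an asymptote (illustrative values only).
dt = np.array([100.0, 200.0, 400.0, 600.0, 800.0, 1000.0])
rho = np.array([0.62, 0.75, 0.85, 0.90, 0.93, 0.95])

# Work on the 1/dt axis; PCHIP needs strictly increasing abscissae.
x = 1.0 / dt
order = np.argsort(x)
pchip = PchipInterpolator(x[order], rho[order])  # extrapolates by default

# The intercept at 1/dt = 0 estimates the asymptotic correlation.
rho_infinity = float(pchip(0.0))
print(f"extrapolated asymptotic correlation: {rho_infinity:.3f}")
```

Applying the same interpolation to the endpoints of the error bars gives the uncertainty of the extrapolated value, as done in the text.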


Figure 4.13: (a) Comparison of the directly measured correlation coefficients and the coefficients determined through the decomposition method, in the case of the persistent random walk with Weibull sampling. In blue we show the average and standard deviation of the direct measurements, computed over the ensemble of 50 time series pairs; in red, the average and standard deviation obtained by the decomposition method over the same ensemble.

(b) The correlation as a function of 1/∆t for the persistent random walk with Weibull sampling. The circles show the correlations determined using the decomposition method, and the (red) curve is obtained by piecewise cubic Hermite interpolation. The extrapolation gives an asymptotic correlation value of 1.002.

good statistics. We demonstrated our method on generated data sets, showing that the error of correlations determined by our method is much smaller than the error of correlations measured directly using long time windows.

Extrapolating from the determined correlation values to the asymptotic correlation increases the accuracy of the estimate of the underlying correlation by almost an order of magnitude. A very important question in the estimation of the asymptotic correlation value is the determination of the shortest meaningful time scale, ∆t0, on which we measure the decay of correlations. The asynchronicity of the signals slows down these decays. We showed that the decomposition gives a good estimate of the asymptotic correlation value also in the case of a non-trivial decay of correlations [TK08].

Dynamics of the order book around large price changes

In this Chapter we study the dynamics of the limit order book of the London Stock Exchange around large price changes. Analysing the limit order book allows us to look at the market at the level of single orders, and in this way to connect the microscopic dynamics to macroscopic measurables and possibly to human behaviour. Our aim is to analyse large price changes that can happen relatively often, perhaps every month in the case of liquid stocks. It is important to stress that we are not interested in market crashes or bubbles (when such changes happen throughout the market) but in large intra-day price changes specific to a particular stock.

[ZAK06] analysed the post-event dynamics of large price changes appearing on short time scales on the New York Stock Exchange and Nasdaq. They found sharp peaks in the bid-ask spread, volatility and traded volume at the moment of the events, with slow decays to normal values that in some cases could be characterised by a power law. Several studies dealt with the analysis of the structure of the limit order book preceding a large price change [FGL+04, WR06, PLM06, JLGB08]. Their results show that – in contrast to earlier belief – the volume of market orders plays a minor role in the creation of large price jumps. Instead, it is the disappearance of liquidity in the limit order book that results in extreme price changes. [PLM06] studied the relaxation of the bid-ask spread after large spread variations. They found a slow relaxation to normal values, characterised by a power law with exponent around 0.4–0.5. [JLGB08] cross-correlate high-frequency time series of stock returns with different news feeds, showing evidence that news cannot explain the price jumps: in general, jumps are followed by increased volatility, while news is followed by lower volatility levels.

We mainly focus on the post-event dynamics and the relaxation of the different measures. We also show results concerning the origin of large events: we find that the distribution of the first gap on the two sides of the book in pre-event periods differs from that of usual periods, suggesting that the number of these unoccupied price levels (the granularity of the limit order book) may be associated with large price changes. This reinforces the previous results of Refs. [FGL+04, WR06, PLM06, JLGB08] showing that it is usually liquidity crises that cause large price jumps, not huge market orders “eating up” one side of the book. We study the dynamics of the volatility and the bid-ask spread near large events. We find that both have a peak at the moment of the price change. Their relaxation is slow and can be characterised by a power law. Analysing the behaviour of market participants, we show results on the bid-ask imbalance, the number of standing limit orders in the book, the aggregated number and aggregated volume of arriving limit orders, the aggregated number of cancelations and the relative rates of different order types. We find that the shape of the book and the relative imbalance change very strongly, with a peak at the event and a slow decay afterwards. The activity of both arriving and canceled limit orders increases and, after a peak at the event, decays according to a power law. Surprisingly, we find that the relative rates of limit orders, market orders and cancelations do not vary strongly in the vicinity of price jumps.
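To make the "first gap" concrete: it is the distance, in ticks, between the best quote and the next occupied price level on the same side of the book. A minimal sketch, with a purely hypothetical snapshot of occupied ask levels:

```python
import numpy as np

# Hypothetical snapshot of one side of a limit order book: the occupied
# price levels (in ticks) holding standing limit orders. Values are
# illustrative, not taken from the data set studied in the text.
ask_levels = np.array([1001, 1004, 1005, 1009, 1010])

def first_gap(levels):
    """Distance (in ticks) between the best quote and the
    second-best occupied price level on the same side."""
    levels = np.sort(levels)
    return int(levels[1] - levels[0])

print(first_gap(ask_levels))  # 3 ticks for this snapshot
```

Comparing the distribution of this quantity in pre-event windows against ordinary windows is what reveals the granularity effect described above.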

For the relaxation of most of the above measures we find power laws with similar exponents, around 0.3–0.4; this similarity of the exponents is remarkable.
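Such relaxation exponents are commonly estimated by a straight-line fit on log-log axes. A minimal sketch on synthetic data (the decay form, exponent and noise level are illustrative, not the measured ones):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical post-event relaxation f(t) ~ t^(-beta), with beta = 0.35
# and multiplicative noise, mimicking e.g. the spread after a price jump.
t = np.arange(1, 1001, dtype=float)
beta_true = 0.35
signal = t ** (-beta_true) * np.exp(0.05 * rng.standard_normal(t.size))

# On log-log axes a power law is a straight line; its (negative)
# slope is the relaxation exponent.
slope, intercept = np.polyfit(np.log(t), np.log(signal), 1)
beta_est = -slope
print(f"estimated relaxation exponent: {beta_est:.3f}")
```

In practice the fit window matters: the region immediately at the event and the late-time noise floor are usually excluded before fitting.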

To further study the possible origin of the relaxation of the volatility and the bid-ask spread, we construct a zero-intelligence multi-agent model mimicking the actual trading mechanism and order flow. When introducing large price jumps into the model, we find power-law relaxations in both quantities. This suggests that the slow relaxations can be reproduced without complicated behavioural assumptions. We also show analytic results on the relaxation of the spread in the zero-intelligence model.
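The flavour of a zero-intelligence order-book model can be conveyed by a minimal sketch. This is not the thesis's model: the event rates, the uniform placement band and the unit order size are our illustrative assumptions; the point is only that all order flow is random, with no strategic behaviour.

```python
import random
from collections import defaultdict

random.seed(2)

class ZeroIntelligenceBook:
    """Minimal zero-intelligence limit order book on an integer price grid:
    limit orders placed uniformly in a band inside the opposite best quote,
    market orders removing the best quote, and random cancelations."""

    def __init__(self, band=20):
        self.bids = defaultdict(int)   # price -> standing volume
        self.asks = defaultdict(int)
        self.band = band
        for p in range(990, 1000):     # seed the book around price 1000
            self.bids[p] = 1
        for p in range(1001, 1011):
            self.asks[p] = 1

    def best_bid(self):
        return max(self.bids)

    def best_ask(self):
        return min(self.asks)

    def spread(self):
        return self.best_ask() - self.best_bid()

    def step(self, p_limit=0.5, p_market=0.1):
        u = random.random()
        side = random.choice(("bid", "ask"))
        book = self.bids if side == "bid" else self.asks
        if u < p_limit:                        # new limit order, no crossing
            if side == "bid":
                book[self.best_ask() - random.randint(1, self.band)] += 1
            else:
                book[self.best_bid() + random.randint(1, self.band)] += 1
        elif u < p_limit + p_market:           # market order hits best quote
            best = self.best_bid() if side == "bid" else self.best_ask()
            if book[best] > 1 or len(book) > 1:  # never empty a side
                book[best] -= 1
                if book[best] == 0:
                    del book[best]
        else:                                  # cancel a random standing order
            if len(book) > 1:
                p = random.choice(sorted(book))
                book[p] -= 1
                if book[p] == 0:
                    del book[p]

book = ZeroIntelligenceBook()
for _ in range(5000):
    book.step()
print("spread after 5000 steps:", book.spread())
```

A large price jump can then be imposed by hand (e.g. wiping out one side of the book down to some depth) and the subsequent relaxation of the spread tracked over time, which is the experiment described in the text.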
