
From the document: The DART-Europe E-theses Portal (pages 51-55)

2.2 Qualitative observations and context


Figure 2.1: Schematic view of the pro-rata Limit Order Book.

This pro-rata microstructure is in use in some derivatives markets (e.g. the London International Financial Futures and Options Exchange, or the Chicago Mercantile Exchange), and will be the subject of a whole chapter of this thesis.

2.2.3 Issues faced in high-frequency trading industry

In this subsection, we sum up the main industrial issues to which high-frequency trading applies. We focus on the strategic stakes of high-frequency trading, and we put aside technological issues such as latency minimization, direct market access or hardware speed improvement, which are nonetheless crucial aspects of high-frequency trading practice.

Indeed, our aim is to provide coverage of several distinct uses of high-frequency trading strategies, which are listed and summarized below.

Indirect trading costs minimization

Indirect trading costs minimization consists in obtaining the highest possible price for a sell trade, or the lowest possible price for a buy trade.

This problem naturally arises when the traded volume is large, due to the finite liquidity offered in the LOB (see the above section): indeed, a large single transaction at market price can unbalance the LOB by consuming several levels at once. For example, if an investor sends a market order to buy 200 shares in the book represented in Table 2.1, the result of that transaction is:

• 80 shares at 50.01

• 53 shares at 50.02

• 67 shares at 50.03

Therefore, the ask price at the end of this transaction is 50.03, with an offered volume of 14.

The Volume Weighted Average Price of this single transaction is then (80×50.01 + 53×50.02 + 67×50.03)/200 = 50.0193, which is about one tick greater than the ask price before the transaction, and leads to a loss of about 2 bp. This effect is known as market impact. As a comparison point, a strategy that trades on a daily basis and is expected to make a 5% return a year has a daily expected return of 2 bp, which is entirely wiped out by this market impact. Moreover, several other costs, such as the cost of crossing the spread, brokers' fees or latency-related issues, can penalize a single trade. We therefore see that it is of crucial importance for portfolio managers to ensure the best possible execution of their trades.
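The walk through the book can be reproduced with a short sketch (in Python; the price levels and quantities are those of the example above, while the function name is ours):

```python
# Ask side of the LOB from Table 2.1: (price, displayed volume).
asks = [(50.01, 80), (50.02, 53), (50.03, 81)]

def market_buy(book, qty):
    """Walk the ask levels, consuming liquidity until qty is filled.
    Returns the list of fills and the volume-weighted average price."""
    fills, remaining = [], qty
    for price, volume in book:
        take = min(volume, remaining)
        if take > 0:
            fills.append((price, take))
            remaining -= take
        if remaining == 0:
            break
    vwap = sum(p * q for p, q in fills) / qty
    return fills, vwap

fills, vwap = market_buy(asks, 200)
impact_bp = (vwap - asks[0][0]) / asks[0][0] * 1e4

print(fills)      # → [(50.01, 80), (50.02, 53), (50.03, 67)]
print(vwap)       # ≈ 50.0193, about 2 bp above the pre-trade ask
print(impact_bp)
```

The third level is left with 81 − 67 = 14 shares, matching the post-trade book described in the text.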

Actors involved in indirect trading costs optimization are both investors, such as large hedge funds or investment banks, which develop their proprietary solutions to this problem, and brokers, who typically have a large daily volume to trade on behalf of their clients.

Brokers are moreover bound by the MiFID regulations in Europe, and by Reg NMS in the US, which force them to operate best-execution algorithms. Some estimate that about 70%-80% of the European equities traded volume [34] is done by execution algorithms and other algorithmic trading.

Classical solutions to this problem can be classified around two central ideas: space-optimization methods and time-optimization methods.

The space-optimization procedure has received little focus from the academic literature, although some works are available, e.g. [48]. The idea underlying this method is to profit from the fact that an asset can often be traded on several distinct marketplaces. Therefore, by splitting a large parent order into smaller child orders and dispatching them on several marketplaces, the investor is able to take more liquidity at the same time, and hence to be less exposed to market impact. This technique is known as smart order routing (SOR), and is extensively implemented by numerous brokers in the industry. The optimization procedure in such tools typically involves latency considerations [49], along with high-frequency trading tools to update the trade schedule quickly.
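As a purely illustrative sketch of the routing idea (not of the optimizers cited above), a parent order can be split in proportion to the displayed liquidity on each venue; the venue names and sizes below are invented:

```python
# Hypothetical displayed liquidity at the best ask on three venues.
venues = {"Venue A": 120, "Venue B": 60, "Venue C": 20}

def route(parent_qty, displayed):
    """Split a parent order across venues in proportion to displayed size,
    capping each child order at the venue's displayed liquidity."""
    total = sum(displayed.values())
    children, remaining = {}, parent_qty
    for venue, size in displayed.items():
        child = min(size, round(parent_qty * size / total))
        children[venue] = child
        remaining -= child
    if remaining > 0:
        # Any rounding leftover goes to the deepest venue.
        deepest = max(displayed, key=displayed.get)
        children[deepest] += remaining
    return children

print(route(100, venues))  # → {'Venue A': 60, 'Venue B': 30, 'Venue C': 10}
```

A production SOR would also weigh fees, latency to each venue and fill probability, which this sketch deliberately ignores.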

On the contrary, the time-optimization procedure has received extensive academic coverage, for example [3], [31] or [35]. The idea underlying this method is to split a large parent order into smaller child orders, and to pass the child orders over an extended time period.

One can see the optimization procedure here as finding a balance in the following trade-off: if the investor trades quickly, they will face little market risk but a large market impact; on the contrary, if they trade slowly, they will face a large market risk, due to price movements, but a reduced market impact. Several solutions to this problem have been proposed under different assumptions, and the general technique is to trade according to a predefined schedule (optimal trading pattern) that arises when balancing the above-mentioned trade-off under simplifying assumptions. We will give many more details on this topic in the following sections.
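The trade-off can be made concrete with a stylized cost function in the spirit of the schedule-based models cited above: a temporary-impact term that shrinks as the trading horizon grows, plus a risk term that grows with it. All parameter values below are invented for illustration:

```python
# Stylized cost of liquidating X shares uniformly over T periods.
# eta: hypothetical linear temporary-impact coefficient;
# sigma: per-period price volatility; risk_aversion: penalty on variance.
X, eta, sigma, risk_aversion = 10_000, 1e-6, 0.02, 1e-6

def expected_cost(T):
    impact = eta * X**2 / T                           # fast trading -> high impact
    risk = risk_aversion * sigma**2 * X**2 * T / 2    # slow trading -> high risk
    return impact + risk

# The optimal horizon balances the two terms.
best_T = min(range(1, 500), key=expected_cost)
print(best_T)  # ≈ 71 periods under these toy parameters
```

Shrinking `eta` (cheaper impact) or raising `risk_aversion` shortens the optimal horizon, which is exactly the balance described in the text.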

Finally, from an industrial perspective, some issues remain in this topic. Firstly, the detectability of trade optimization techniques is central to brokers and portfolio managers.

Indeed, the massive use of execution algorithms is known to be a source of autocorrelation in trade signs (see [18]), or of lagged correlation in the trade data of the same asset on two distinct marketplaces. Such algorithms are therefore very sensitive to the response of the LOB they trade on, and are less efficient when easily detected by competitors. Secondly, mixed market/limit order execution strategies have so far received less focus from the academic literature (see [67] or [37]), although limit order trading is much cheaper than market order trading, and is therefore extensively used in the industry in optimal execution strategies.

Pure alpha strategies

Now, let us focus on pure alpha strategies, a jargon term that refers to profit-maximisation strategies whose performance is largely irrespective of market conditions. This category includes the following strategies:

• Market-making strategies. This class of strategies is based on the idea that, using limit order trading, one can buy at the bid price and sell at the ask price, and therefore gain the bid/ask spread. Such a strategy typically involves continuously providing bid and ask quotes, along with optimally choosing the prices and quantities of these quotes. The market maker aims at balancing their inventory, i.e. keeping their position in the risky asset close to zero at all times, thereby reducing their market risk.

• Statistical arbitrage strategies. This class of strategies is based on the idea that one can exploit the statistical relationships between asset prices (e.g. the cointegration structure of a market sector, or the relationship between an index and its components) to profit from transient inefficiencies. Such strategies are typically data-intensive; they are directional over a short-term horizon, and they repeat the same bet a large number of times in order to reduce the variance of the outcome. Very often, such strategies are aggressive strategies, meaning that they take liquidity in the LOB (hit orders).

They are also critically dependent on the latency of the trading infrastructure, due to competition between actors running the same strategy.

• Mixed strategies, that are the combination of the two above strategies classes.

Actors involved in such strategies include investment banks, hedge funds, proprietary trading firms and dedicated market-makers. The advantage of running these types of strategies is that their performance is very stable across market conditions, and therefore the investor is not exposed to market risk. On the contrary, the shortcomings of running pure alpha strategies are of two kinds: first, the absolute performance of the strategy is bounded most of the time, because arbitrage opportunities are rare; second, the operational risk is high, since technological performance is of crucial importance in this activity.

This class of strategies has been studied in the academic literature, with an emphasis on market-making strategies.

Firstly, market-making strategies have been successfully presented as an inventory management problem since the pioneering works of Amihud and Mendelson in 1980 [5] and Ho and Stoll in 1981 [42], an approach modernized in the work of Avellaneda and Stoikov in 2008 [7]. The underlying idea is to take a risk/reward approach: the market-maker's objective is to make the spread, i.e. to buy an asset at the bid price and sell it at the ask price, and therefore to gain the bid/ask spread as revenue. When doing so, the market-maker is subject to market risk, i.e. the risk of holding a non-zero position in the risky asset while its price changes. Therefore, the limit order trading operated by the market-maker has two opposite goals: on one hand, they seek to maximize the number of trades in which they participate, in order to maximize the revenue from making the spread; on the other hand, they need to keep their position in the risky asset close to zero at all times, in order to keep the market risk low, and this constraint leads to offering a more aggressive price at the ask when they hold a long inventory, and conversely. This subject has recently received sustained interest in academic works, for example [16], [35] and [37].
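The inventory constraint can be caricatured in a few lines: the quotes are skewed against the current inventory, so that a long position prices the ask more aggressively (and a short position the bid). This is a toy sketch with invented parameters, not the model of the works cited:

```python
def quotes(mid, spread, inventory, skew_per_unit=0.001):
    """Quote a bid/ask pair around a reference price shifted against
    the current inventory (long inventory -> both quotes lowered,
    making the ask more aggressive and the bid less so)."""
    reference = mid - skew_per_unit * inventory
    bid = reference - spread / 2
    ask = reference + spread / 2
    return round(bid, 4), round(ask, 4)

print(quotes(100.0, 0.02, 0))    # flat inventory: symmetric quotes around mid
print(quotes(100.0, 0.02, 50))   # long 50 units: both quotes shifted down
```

Inventory-based models such as [7] derive the size of this skew from the market-maker's risk aversion and the order-arrival intensities, rather than fixing it as a constant.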

Secondly, statistical arbitrage strategies have received less academic interest despite their wide popularity among high-frequency traders. The general idea of such strategies is to build a predictive price indicator based on the observation of market phenomena, and then to trade accordingly. Let us illustrate this principle with two examples. In the work [6], the authors developed a generalized pairs trading approach: they perform a principal component analysis on stock returns, and thereby obtain a market portfolio that explains the stock returns. The main idea is then to assume that the residual between a single stock and the market portfolio should revert to its mean, and to trade accordingly. Another example is the work [21], where the authors propose a simple statistical arbitrage strategy to illustrate the relevance of a predictive price indicator based on a Poissonian model of the LOB. Based on the current state of the LOB, they are able to compute the probability of the price going up or down in the next milliseconds, and they propose a HF strategy to exploit this information. Finally, in chapter 6, we propose a way to include such a predictive price indicator in a mixed limit/market order strategy.
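The mean-reversion signal behind generalized pairs trading can be illustrated with a toy z-score on an invented residual series (this is merely the idea, not the PCA construction of [6]; data and thresholds are made up):

```python
# Invented price series: one stock vs. a "market" reference it should track.
stock  = [10.0, 10.2, 10.1, 10.4, 10.5, 11.2]
market = [10.0, 10.1, 10.2, 10.3, 10.4, 10.5]

# Residual of the stock against the market, standardized into a z-score.
residuals = [s - m for s, m in zip(stock, market)]
mean = sum(residuals) / len(residuals)
var = sum((r - mean) ** 2 for r in residuals) / len(residuals)
z = (residuals[-1] - mean) / var ** 0.5

# A large positive z means the stock is "rich" vs. the market: bet on
# reversion by selling the stock (and buying the market), and conversely.
signal = "sell" if z > 1.5 else "buy" if z < -1.5 else "flat"
print(z, signal)
```

In the last observation the stock has run 0.7 above the market while its residual historically hovers near 0.15, so the z-score exceeds the threshold and the sketch signals a reversion bet.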

The next section is devoted to outlining the main results of this thesis.
