The trading environment of the future, part one: Order Management Systems (OMS)/Order Execution Management Systems (OEMS) and algorithms (algos). Adding value through the correct algo choices?
The history of stock trading is long and, remarkably, many of the concepts and instruments still in use appeared rapidly once structures permitting formal trading existed. Classical Roman times purportedly “boasted companies with transferable shares... but these were usually not enduring endeavours and no considerable secondary market existed”. The first brokers may have been the French “courtiers de change”, who in the 12th century managed and regulated agricultural communities’ debts on behalf of banks. The Van der Beurze family held informal meetings to trade commodities in Bruges and Antwerp in the 13th century. The Dutch East India Company (VOC), founded in 1602, was the first company to issue bonds and shares to the general public. Its shares traded continuously on the Amsterdam Exchange (considered to be the first formal stock exchange, even if the first trading room was a simple roofless courtyard). Derivative trading started rapidly thereafter, including instruments such as options and repos. Short selling also dates back to this time, as does its contentious nature: it was banned by the Dutch authorities as early as 1610. The United Kingdom’s trading practices were first formalised by the opening of the Royal Exchange by Elizabeth I in 1571 (the London Stock Exchange opened on 3 March 1801). The Buttonwood Agreement of 1792 gave rise to organised trading, the precursor to the founding of the New York Stock Exchange. A human behavioural element has also stood the test of time: apparently, during the 17th century, “stockbrokers were not allowed in the Royal Exchange due to their rude manners”.
If the concepts and certain elements of trading behaviour are not necessarily new, the means by which trades are conducted have evolved. Trading floors were densely populated; the physical gestures and sounds of traders were codified, comprehensible to the initiated and bewildering to the uninitiated. The community gathered around a central, unique marketplace. Interestingly, in a pandemic such as the one under way at the time of writing, social distancing would have been extremely difficult to respect on such a floor – raising the question of whether worldwide markets could have continued to function acceptably for the several months that the COVID-19 pandemic has so far lasted.
Trading via machines constitutes a physical delocalisation of trading floors, or their virtualisation: bringing the floor to the participant rather than vice versa. This is a sharp contrast with the centuries-old history of trading that preceded the electronic era, and it has had profound effects in a relatively short timeframe. Also noteworthy are the changes in market rules which either permitted the rise of electronic trading or came about as a result of the new horizons opened by technological progress. Furthermore, human behaviours deemed anti-competitive have also advanced the cause of electronic trading (here read ‘unable to be tampered with by humans’ on the order’s path from order-giver to execution platform). A quick journey to the origins of electronic trading permits a fuller comprehension of the progress made and challenges faced. Some key dates and stages in the US are as follows:
- 1969: creation of the first Electronic Communication Network (ECN), Instinet, whose purpose was to enable trading outside of regular market hours.
- 1971: birth of the NASDAQ (National Association of Securities Dealers Automated Quotations) market – the v1 of electronic stock markets. Functionality was limited: bids and offers were posted electronically but updated just once a day, and orders were received by telephone and filled by humans until the 1980s. The proclaimed benefit was nonetheless that this served “to bring down the bid-ask spread and thus help lower the cost of trading”.
- 1975: abolition of fixed commissions by the SEC, creating opportunities for those prepared to trade at a discount.
- 1976: introduction of the NYSE’s DOT (Designated Order Turnaround) system, which allowed brokers to route 100-share orders directly to specialists on the floor, thereby bypassing floor brokers. These trades were nonetheless still matched by humans.
- 1984: the adoption of SuperDOT by the NYSE, a more sophisticated system allowing orders up to 100,000 shares to be routed directly to the floor (bypassing more floor brokers).
- 1987: the market crash. A portion of the blame for this is regularly attributed to electronic trading activity having driven prices above reasonable levels and vice versa. During the crash, however, broker-dealers were found to have ignored their ringing telephones – giving an incentive for the expansion of the SOES (Small Order Execution System). This permitted small trades to be entered electronically rather than by telephone and, provided the trades were compliant, market-makers had to accept them. The number of orders placed by telephone declined drastically from this point onwards. The system may have improved liquidity in smaller stocks, but it also fell victim to adverse action by some “SOES bandits” who found ways to play (game) the system.
In the late 1990s, internet traffic increased dramatically, personal computers became more powerful, ECNs (electronic communication networks) permitted more volume and faster access to bids and offers, online brokerages came into being and proliferated, commissions dropped rendering online trading more profitable and a bull market for technology shares served as a backdrop to incite general and popular interest in the stock markets. “Day trading” was made possible for smaller, so-called “non-professional” players.
- 2001: stock trading in pennies began, destroying the profitability of the previous system, under which the minimum price increment – and hence the minimum cut – was a sixteenth of a dollar. The NYSE introduced Direct+, which provided immediate automatic execution of limit orders up to 1,099 shares. Hardware was also faster, enabling the birth of algorithms whose purpose was to make trading decisions involving the timing, pricing and quantity of orders placed. This paved the way for the technological advances required for HFT (High-Frequency Trading).
Regulators, in their quest to maximise the potential for competition, forced the NYSE to go electronic with the Reg NMS rule in 2005, whose key element was the trade-through rule, obliging participants to respect the best bid or offer prices. A number of changes have followed: some ECNs have become exchanges; exchanges have grown, merged with competitors and even changed statutes and goals (essentially towards profit-making, even for the formerly incumbent stock exchanges).
In Europe, electronic trading arrived within similar timeframes. In London, for example, the regulatory ‘big bang’ of 1986 abolished fixed commissions, ended the distinction between stockjobbers and stockbrokers on the London Stock Exchange, and saw screen-based trading replace open outcry. In 1997, the faster and more efficient SETS (Stock Exchange Electronic Trading Service) was launched. In Paris, floor trading for equities was replaced by the CAC electronic system in 1987; Xetra was introduced in Germany in 1997. Naturally, the means used by brokers to access the central stock exchanges – and the alternative markets which have come to life following numerous successive regulatory changes – have become more and more sophisticated; a few decades is, in cyber terms, effectively quite a long time.
Electronic and automated trading – like most technological advances – is the subject of debate. Proclaimed advantages include: spread reduction (and therefore a reduction in overall trading costs); avoidance of some adverse behaviours biased towards broker-dealer profitability (e.g. not answering phones during market crashes); increased competition; decreased market inefficiencies; instant and accurate order placing; correct trade timing; reduction of manual errors; automated checks on multiple market conditions; and the idea that markets have been rendered more liquid and trading more systematic by ruling out the impact of human emotions on trading activities. Disadvantages include the fact that gaming is possible (and appears to have existed from the beginning) and the capacity to “exacerbate the market’s negative tendencies by causing flash crashes and immediate loss of liquidity” (an example being a market-maker withdrawing his quotes in fast or uncertain markets, leaving the order book bare and thus fragile and easy to move). Nonetheless, palliative measures have been introduced by regulators, such as circuit breakers and limit-up/limit-down interruptions to trading. As is often the case, new technology provides new opportunities but brings new risks.
Irrespective of one’s opinion of the process, algo trading is here, is well established and is unlikely to go away. A plethora of statistics is continually generated in attempts to determine the proportion of total trading undertaken by algos (especially those involved in HFT). There is no general consensus beyond the fact that it is likely to be well over half of the US equity market, with similar proportions likely elsewhere worldwide. Algos are even being progressively used to trade liquid debt markets. So, embracing the concept and looking forward, what attributes are needed in an algorithm, and how can a trader make a difference? It makes sense to remind ourselves of the basics before becoming wide-eyed at the technological prowess. A trade is the result of an investment decision, which usually forms part of a wider investment strategy. Once a decision to trade has been made, a further decision as to how to trade is required (assuming the quantity to be traded is larger than the usual bid/offer size). The choice of trading strategy will reflect the broader investment strategy and may be influenced by specific features of the marketplaces in question. A proprietary trader will have different needs from those trading on an agency basis. Taking two extremes as examples: a scalping or spread-playing HFT strategy entails extremely short-term ownership of an instrument – sometimes just milliseconds – and can only be carried out on relatively liquid stocks via robust and sophisticated trading systems; arbitrage and pair-trading strategies require similar base parameters. Conversely, a long-term buy-and-hold investment decision might tolerate a fraction less precision in the stock’s initial acquisition price, but the size of the trade might engender liquidity issues, potentially moving the purchase price adversely.
Traders’ benchmarks (which permit a judgement of the quality of trade execution) thus vary and, most importantly to traders, this determines how the traders themselves are judged as a consequence.
The suites of algorithms which have been developed since inception initially reproduced manually-executed trading strategies. The ‘classics’ include:
- Percentage of Volume (POV) – the algorithm participates at the volume-rate determined by the trader.
- Volume-weighted Average Price (VWAP) – this breaks up the order into chunks in order to participate most when volumes are likely to be heaviest – usually according to historical data – so as to replicate the VWAP as closely as possible.
- Time-weighted Average Price (TWAP) – this spreads an order evenly throughout the trading period selected.
- Implementation Shortfall (IS) – Andre Perold defined this as “the difference in price between the time a portfolio manager makes an investment decision and the actual price achieved. Another component is the opportunity cost of any quantity unexecuted during the implementation”. In essence, this type of algorithm will try to capture liquidity where the price is favourable and decrease participation when it is not. In a similar vein, algorithmic strategies can define several price levels deemed favourable for order acceleration or deceleration. Another approach is to adopt algorithms which seek liquidity at the IS price or better on sources other than the primary exchanges, in order to remain anonymous.
- Algorithms which attempt to trade passively, such as those active up to the mid-spread price.
- Algorithms which follow trends and initiate trades when these are deemed favourable. These use technical trading indicators such as moving averages and breakouts of price channels.
- Algorithms for pair trading and arbitrage. These permit the simultaneous execution of trades to capture gains from market inefficiencies.
- Index Fund Rebalancing algos via program trades when index changes are under way (where multiple lines are traded and trade timing and volume participation are customised to a certain degree). The “target close” option, which endeavours to execute the trade as close as possible to the anticipated closing price, often forms an integral part of program trading algo tools.
- Algorithms permitting the trading of multiple instruments simultaneously (such as options and their hedging, delta neutral strategies).
- Mean reversion/trading range algorithms, which activate when the instrument moves in and out of its usual trading range.
- Systematic Internaliser (SI) algorithms, which fill orders from liquidity sources internal to the trader, removing the need to go to market.
- Gaming (unsurprisingly not usually declared as such). These algorithms attempt to detect or “sniff out” large orders and such information, if correctly ascertained, can indicate front-running opportunities for principal traders…
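To make the first two ‘classics’ above concrete, the child-order scheduling behind TWAP and VWAP can be sketched in a few lines. This is a simplified illustration under assumed inputs – the function names and the U-shaped intraday volume profile are hypothetical, and a production scheduler would also handle lot sizes, participation caps and real-time volume updates:

```python
from typing import List

def twap_schedule(total_qty: int, n_slices: int) -> List[int]:
    """Split an order evenly across n time slices (TWAP)."""
    base, rem = divmod(total_qty, n_slices)
    # Spread the remainder over the first slices so quantities sum exactly.
    return [base + (1 if i < rem else 0) for i in range(n_slices)]

def vwap_schedule(total_qty: int, volume_profile: List[float]) -> List[int]:
    """Split an order in proportion to a historical intraday volume profile (VWAP)."""
    total_vol = sum(volume_profile)
    raw = [total_qty * v / total_vol for v in volume_profile]
    sched = [int(x) for x in raw]
    # Assign leftover shares to the slices with the largest fractional parts.
    leftover = total_qty - sum(sched)
    order = sorted(range(len(raw)), key=lambda i: raw[i] - sched[i], reverse=True)
    for i in order[:leftover]:
        sched[i] += 1
    return sched

# Hypothetical U-shaped intraday volume profile (heavy open and close).
profile = [3.0, 1.5, 1.0, 1.0, 1.5, 3.0]
print(twap_schedule(10_000, 6))    # even child orders across the period
print(vwap_schedule(10_000, profile))  # children weighted towards open/close
```

The contrast is visible in the output: TWAP produces near-equal slices, while VWAP concentrates participation where (historical) volume is heaviest.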
Bearing in mind the number of market participants and their differing objectives, it is clear that “one size” cannot possibly fit all. How can an algo suite give a trader a competitive edge? The basic prerequisites for algorithmic trading should be considered in order to appreciate the opportunities for advancement:
- Access to real-time market data and price quotes – and possibly earnings and company ratio data
- Market connectivity – directly or indirectly via vendors
- Acceptable latency – the time needed for data to transit between the necessary points. A price quote must travel from the marketplace to the trader’s terminal; it must then be processed, analysed and translated into a trading decision; the corresponding order must then be placed with the broker, who must forward it to the marketplace in question. Latency has decreased dramatically: in the US in 1998, trade execution time was a few seconds; by 2010 this had been reduced to a few milliseconds; today it is one-hundredth of a microsecond, and it is still going down.
It appears logical that benefitting from algo enhancements will be impossible should any of these prerequisites be substandard – although new solutions are appearing for some aspects, such as the firmware model (for latency), which integrates hardware and software.
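The latency chain described above can be made concrete as a budget: total latency is the sum of every network hop and processing step on the quote-to-order path, and the slowest links dominate. The stage names and figures below are illustrative assumptions, not measurements:

```python
# Illustrative (hypothetical) latency budget for the quote-to-order path:
# market data in, decision, order out. Figures are invented for illustration.
stages_us = {
    "market data feed -> trader":  50.0,  # network transit
    "feed handler processing":      5.0,
    "signal/strategy evaluation":  10.0,
    "order construction":           2.0,
    "trader -> broker":            40.0,  # network transit
    "broker risk checks":          15.0,
    "broker -> exchange":          30.0,  # network transit
}

total_us = sum(stages_us.values())
for stage, us in stages_us.items():
    print(f"{stage:30s} {us:7.1f} us ({us / total_us:5.1%})")
print(f"{'total':30s} {total_us:7.1f} us")
```

Viewed this way, it is clear why shaving a processing stage matters little while network hops dominate – and why co-location and hardware/firmware integration target the largest items in the budget.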
Market behaviour necessarily reflects market participants’ beliefs and anticipations. Algos which can be customised and configured by the user can put traders’ ideas into practice. The capacity to back-test algorithms’ behaviour on historical data is highly recommended. Naturally, past events will not repeat themselves exactly, but many aspects of market behaviour are recurrent variations on themes. Real-time analysis of trades under way – and, where relevant, across multiple instruments – would also provide a real advantage, with the capacity to make real-time strategy adjustments.
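As a minimal illustration of back-testing a configurable rule on historical data, the sketch below applies a simple mean-reversion rule to a price series. The rule, thresholds and prices are invented for illustration; a real back-test would also model transaction costs, slippage and market impact:

```python
# Minimal back-test sketch: signal generation for a mean-reversion rule
# on historical closes. All parameters and data are illustrative.

def moving_average(prices, window):
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def backtest_mean_reversion(prices, window=3, band=0.02):
    """Flag BUY when price dips `band` below its moving average and
    SELL when it rises `band` above; return (index, action) signals."""
    signals = []
    ma = moving_average(prices, window)
    for i, avg in enumerate(ma, start=window - 1):
        if prices[i] < avg * (1 - band):
            signals.append((i, "BUY"))
        elif prices[i] > avg * (1 + band):
            signals.append((i, "SELL"))
    return signals

# Hypothetical daily closes.
closes = [100, 101, 99, 95, 96, 103, 104, 100]
print(backtest_mean_reversion(closes))
```

The value of such a harness is less the toy rule itself than the ability to rerun the same configurable parameters (window, band) over many historical regimes before risking capital.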
Other areas for exploration include algorithms which trade upon receipt of new company information via formal news streams. This approach entails the capacity to analyse disseminated news automatically and in a timely (practically instantaneous) manner. Company earnings, dividends, profit warnings – a lot of information, practically a traditional broker’s full offering – might be processed by the algorithm so that it can react instantly. A similar approach would be the automatic screening of trading ideas or opinions on stock performance posted on less formal networks such as social media, which could be integrated into the algorithms’ “thought processes”. It is noteworthy that adopting this type of algo, as opposed to a scalping HFT strategy, can have additional benefits, such as requiring less latency performance capacity. Multiple strategies are also likely to be employed, whereby algos post passively and sweep aggressively if certain criteria are fulfilled. It may well be that algos whose performance has been consistent in times of low volatility need tweaking to encompass the newly rediscovered higher volatility levels of recent times. Looking at the wider investment process, an edge may well be found in improving straight-through processing, starting upstream with fund management regulatory functionality and going through to back-office processes. Orders can thus be generated and approved in a speedier fashion, giving a head start to the trading process. In this respect, considering the technical expertise required, it is highly recommended to collaborate with those capable of providing customisable solutions rather than trying to go it alone.
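A toy sketch of the news-driven approach described above follows. The event phrases, weights and threshold are invented for illustration; production systems rely on structured machine-readable news feeds and far more sophisticated language models rather than keyword matching:

```python
# Toy sketch of screening a news stream for tradeable events.
# Phrases, weights and the threshold are hypothetical illustrations.

EVENT_WEIGHTS = {
    "profit warning":  -3,
    "dividend cut":    -2,
    "earnings beat":   +2,
    "dividend raised": +1,
}

def score_headline(headline: str) -> int:
    """Sum the weights of every known event phrase found in the headline."""
    text = headline.lower()
    return sum(w for phrase, w in EVENT_WEIGHTS.items() if phrase in text)

def react(headline: str, threshold: int = 2) -> str:
    """Map a headline score to a (toy) trading reaction."""
    s = score_headline(headline)
    if s >= threshold:
        return "BUY"
    if s <= -threshold:
        return "SELL"
    return "HOLD"

print(react("ACME issues profit warning after weak quarter"))  # SELL
```

Note that, consistent with the point made above, this style of algo is driven by event arrival rather than tick-by-tick quotes, so its latency requirements are far less demanding than those of a scalping strategy.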
FIA praises CFTC for adopting Electronic Trading and Bankruptcy rules
Electronic Trading: For more than a decade, FIA and FIA PTG have taken a leadership role in identifying risks and strengthening safeguards related to electronic trading in the futures markets globally. Since April 2010, FIA has published six papers proposing industry best practices and guidelines related to these important topics. In addition, FIA has submitted comprehensive responses to numerous CFTC discussions and rulemaking initiatives and participated in multiple CFTC-hosted meetings and industry roundtables. For example, in October 2019, Alicia Crighton, managing director at Goldman Sachs and a member of the FIA board of directors, presented to the CFTC’s Technology Advisory Committee FIA’s best practices for exchange risk controls related to automated trading systems. FIA commends the CFTC for its deliberate approach to this rulemaking and welcomes the opportunity to continue to work with the CFTC and market participants to promote safe, reliable, and vibrant electronic markets. Bankruptcy Reform: FIA commented in July on the CFTC’s recent update to Part 190 of the agency’s regulations dealing with the bankruptcy of FCMs and clearinghouses. In its response, FIA generally supported the commission’s recommendations, but cautioned that a new section (subpart C) would establish a new regime to govern the bankruptcy of a clearinghouse and should be studied further before being approved. FIA appreciates the changes to the proposal and the commission’s willingness to work with industry to improve the product.
Stringham, Edward Peter: Private Governance: Creating Order in Economic and Social Life (Oxford University Press, 2015, ISBN 9780199365166), p. 42, accessed via https://en.wikipedia.org/wiki/Stock_market, Dec 1 2020
https://en.wikipedia.org/wiki/Stock_market accessed Dec 1 2020
https://en.wikipedia.org/wiki/New_York_Stock_Exchange accessed Dec 1 2020
https://en.wikipedia.org/wiki/London_Stock_Exchange accessed Dec 1 2020
Staff Report on Algorithmic Trading in U.S. Capital Markets, as required by Section 502 of the Economic Growth, Regulatory Relief, and Consumer Protection Act of 2018, August 5, 2020
Other sources, from which direct quotes have not been made:
Ernst & Young. Capital Markets: innovation and the FinTech landscape. How collaboration with FinTech can transform investment banking.