As of March 1, 2009, The MTTLR Blog is migrating to http://www.mttlrblog.org. All new updates will be posted at the new location.

Wednesday, October 29, 2008

Will Co-location Kill the Stock Exchange* or, Is Too Much Tech Bad for Business?

by: Elina Druker, Associate Editor, MTTLR

Image: “Stacked Servers” by redjar.
Used under a Creative Commons BY-SA 2.0 license.

A Brief Introduction to Relevant Developments

Technology is changing the course of investing. According to Ivy Schmerken of Advanced Trading, about 85% of all trading on U.S. exchanges is automated.2 Computer algorithms (algorithmic trading) enter trading orders for over 30% of trades on traditional stock exchanges, and possibly as many as 80% in American equity markets.3 Trading algorithms are “a series of calculated steps strung together to … buy and sell large blocks of stock… [which] can be tuned to execute almost any strategy” in order to take advantage of momentary opportunities.4
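As a purely hypothetical illustration of the kind of algorithm described above (not any firm's actual strategy), a simple time-slicing routine might split a large block order into near-equal child orders to be submitted at intervals, reducing the market impact of the bulk trade:

```python
# Hypothetical sketch of a time-slicing ("TWAP"-style) execution step:
# a large parent order is broken into near-equal child orders that a
# trading system would submit at regular intervals. Illustrative only.

def twap_slices(total_shares: int, num_slices: int) -> list[int]:
    """Split a block order into num_slices child orders differing by at most one share."""
    base, remainder = divmod(total_shares, num_slices)
    # Spread any remainder across the first `remainder` slices.
    return [base + (1 if i < remainder else 0) for i in range(num_slices)]

print(twap_slices(100_000, 8))  # eight child orders of 12,500 shares each
```

Real execution algorithms layer pricing logic, venue selection and randomization on top of a schedule like this, which is where execution speed becomes decisive.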

The success of an algorithm depends not only on the formula’s strategy, but also on the speed at which information is input and the trade is executed. Algorithmic traders are constantly pushing trading systems to be faster. The cost of speed is astounding. According to Joel Clark, of Waters, a recent Tabb Group study “estimates that reducing the latency of transaction processing to gain a microsecond of improvement costs a firm approximately $250. Reducing latency by 6 milliseconds, then, equates to 6,000 microseconds and a possible cost of $1.5 million.”5
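The quoted figures reduce to simple arithmetic, which can be checked directly:

```python
# Back-of-the-envelope check of the Tabb Group figures quoted above.
cost_per_microsecond = 250              # dollars per microsecond of improvement
improvement_ms = 6                      # target latency reduction, milliseconds
improvement_us = improvement_ms * 1000  # 6 ms = 6,000 microseconds

total_cost = cost_per_microsecond * improvement_us
print(f"${total_cost:,}")  # $1,500,000
```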

As a result of this ‘need for speed,’ exchanges are in competition not only with each other,6 but with other, faster trading methods such as Alternative Trading Systems (ATSs). Unlike exchanges, which have been around long enough to have outdated infrastructures, new ATSs have the newest technology available, enabling them to “complete trades at up to ten times the speed of older rivals.”7 Popular ATSs, such as Electronic Communication Networks (ECNs) and, most recently, Crossing Networks and Dark Pools, have numerous additional advantages over exchanges, including minimized market impact of bulk trades and anonymity. “What’s troubling overall for exchanges is how much market share they are losing to the other … innovative technology venues,” says Brad Bailey, senior analyst with the consulting firm Aite Group.8 Traditional exchanges’ market share dropped from 86% to 73% between 2007 and 2008, and is expected to keep falling. Alternative Trading Systems now process 13% of all matched trades.9 ATSs may be open to gaming, through pinging and front-running, and may fragment the market, but possible legal implications (and the S.E.C.) haven’t gotten in the way.10 One method of eliminating latency is called proximity hosting.

Telecommunication companies, such as Savvis, place their hardware in or near an exchange’s data center. They then sell rack space close to the trading venue, or access to trading servers, the venue’s gateway, or software.11 Subscribers to proximity hosting receive low-latency connections to one or more closely located trading venues.12

Implications of Automated Trading

To stay competitive, exchanges have had to develop their own technologies. They have had to update physical infrastructure, develop technology to move trades from the floor to electronic trading, disseminate market data faster and create “low touch/ no touch trading strategies.”13 These pressures have also changed the nature of the exchange business. Hundreds of independent exchanges have begun a mass consolidation. BNY ConvergEx Managing Director Joe Cangemi expects much more consolidation, predicting that “there will be three or five survivors.”14

Exchanges already wear many hats and charge for their services every step of the way. They charge listed companies “listing fees” and commissions; investors pay membership and admission fees,15 fees for physical seats on the floor,16 and per-transaction fees17 for executing and updating trades; and market-data providers pay for publishing online and for subscriptions18 to real-time information.

Recently, exchanges have entered the telecommunications business. They have started selling physical co-location of investor servers at the exchange’s data center, competing with the technology firms that offer low-latency proximity hosting. Co-location is like proximity hosting, except that the exchange hosts the subscriber’s box in its own data center. It is closer to the information feed, and thus faster, than proximity hosting. Co-location can diminish latency between a client and the exchange to below 64 microseconds.19 In October 2008, TradElect, the trading platform behind the London Stock Exchange’s co-location service, will double its capacity to 20,000 continuous messages per second, with end-to-end execution latency reduced to three milliseconds.20
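How would a subscriber verify latency claims like these? As a rough sketch (with a no-op callable standing in for the actual round trip to a venue gateway), a firm might sample many round trips with a high-resolution clock and report the median, which is less distorted by outliers than the mean:

```python
# Hypothetical sketch of how a co-location subscriber might benchmark
# its link to a venue gateway. The `send` callable is a placeholder for
# the real round trip; here a no-op, so only clock overhead is measured.
import time

def measure_latency_us(send, trials: int = 1000) -> float:
    """Return the median round-trip time of `send`, in microseconds."""
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter_ns()
        send()
        t1 = time.perf_counter_ns()
        samples.append((t1 - t0) / 1000)  # nanoseconds -> microseconds
    samples.sort()
    return samples[len(samples) // 2]     # median, robust to outliers

median_us = measure_latency_us(lambda: None)
print(f"median round trip: {median_us:.2f} us")
```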

Co-location, however, is a dangerous strategy for all parties.

First, it’s expensive for subscribers. That means that institutional investors, nearly all of whom use algorithms, can buy an advantage. Larger, well-established investor groups can afford to co-locate while smaller boutiques21 cannot. This could create a barrier to market entry for small institutional investors. A few weeks ago, there were six major investment banks. Suddenly, the landscape has changed, and co-location systems may prevent new market players from filling the spots left by the crisis.

Second, exchanges do not have the infrastructure to prevent jitter in latency.22 Jitter and fluctuations in information can create unpredictable results. Unlike third-party telecommunication companies, which can maintain some minimal amount of technology risk management, algorithms co-located with an exchange may act on unfiltered, possibly flawed data. Technology companies are in the business of keeping their software and hardware up to date. Exchanges wear too many hats to keep ahead of the technology curve. They simply are not in the best position to manage the flow of information.
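To make the jitter point concrete, consider two hypothetical feeds with identical average latency (the numbers below are invented for illustration): an algorithm on the second feed acts on far less predictable timing, even though the mean looks the same.

```python
# Illustration of latency jitter: two feeds with the same mean latency
# can behave very differently if one fluctuates widely. Invented numbers.
import statistics

steady  = [64, 65, 64, 66, 65, 64]    # microseconds, low jitter
jittery = [20, 130, 15, 140, 25, 58]  # same mean, wild swings

for name, samples in [("steady", steady), ("jittery", jittery)]:
    mean = statistics.mean(samples)
    jitter = statistics.stdev(samples)  # std deviation as a jitter measure
    print(f"{name}: mean {mean:.1f} us, jitter {jitter:.1f} us")
```

Both feeds average about 64.7 microseconds, but the standard deviation of the second is dozens of times larger; a strategy tuned to the mean alone would mis-time its orders on the jittery feed.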

So, there are risks. So what? Well, if large institutional investors can block small investors from entering the modern algorithmic trading market, and can make risky decisions instantly, maybe someone should revive risk management and competition. Regulations need to prevent exchanges from wearing the technology-provider hat. Algorithmic trading already creates plenty of room for error and there is no room for additional error on the exchange side. Besides, should access to the best information really be the factor separating large and small investors?

* The Buggles, Video Killed the Radio Star, 1979.
2 Ivy Schmerken, Exchange Consolidation Wave Is Expected to Continue in 2008, Advanced Trading, Nov. 1, 2007.
3 Wikipedia, Algorithmic Trading.
4 Mara Der Hovanesian, Cracking The Street's New Math, Businessweek.com, Apr. 18, 2005.
5 Joel Clark, Still the Need for Speed, Waters, Apr. 1, 2008.
6 Wikipedia, Algorithmic Trading.
7 The battle of the bourses, The Economist, May 29, 2008.
8 Schmerken, supra note 2, at page 4.
9 The Economist, supra note 7.
10 Video game? Dark pools battle pingers, gamers in unregulated markets, Financial Week, Apr. 22, 2008.
11 Proximity Hosting: Plug’n’Trade or Pay’n’Wait?, Automated Trader, 2008.
12 Check your speed, The Trade News, Oct. 16, 2007.
13 Id.
14 Schmerken, supra note 2, at page 4.
15 Fees Calculator, London Stock Exchange, 2008.
16 Hillary Wicai, The Marketplace Report: Pricey Seats on the NYSE, National Public Radio, Jul. 26, 2005.
17 NYSE to cut back trading floor, costs, LA Times, September 13, 2007, at print edition C-4.
18 Nasdaq OMX launches free real-time market data service, Finextra.com, Jun. 2, 2008.
19 Rich Miller, Proximity Hosting: When Microseconds Matter, Data Center Knowledge, Dec. 3, 2007.
20 Penny Crosman, London Stock Exchange Offers Collocation, Wall Street & Technology, September 2, 2008.
21 Admittedly, some small investors may gain access to co-location though their brokers, but at a high cost far exceeding what they would have had to pay for data in the past as emerging market players.
22 Sun Microsystems, Inc., Low Latency: Eliminating Application Jitter with Solaris™, May 2007.



Anonymous tito said...

Interesting and well-researched post.

"Regulations need to prevent exchanges from wearing the
technology-provider hat."

I'm not sure this is either desirable or possible. Modern exchanges are
fundamentally transaction processing businesses. They are all about
technology. They are also, happily, highly regulated. Who better to
provide these critical services?

"should access to the best information really be the factor separating
large and small investors?"

Alas, this has always been and, I expect, always will be a key factor
separating large and small investors. From the pricey Bloomberg
terminal sitting on the institutional trader's desk to the tick data
feeds, to the specialized DBs for managing those feeds, to the tens of
thousands of dollars a month subscriptions to machine-readable news
sources, and on and on - the "edge" might change over time but it is
invariably costly. How else could it be?

December 2, 2008 at 4:11 AM  
