Here at Navistaff, we view algorithmic trading as the hottest area in finance right now... but what is it? And where is it going?
To answer these questions, here is an article by my colleague Jerry Tierney.
Algorithmic trading. The term is becoming more prevalent in the financial world by the day. Most of you reading this article are probably familiar with algorithmic trading on some level, but we will start with a general understanding of what it is. From there, we will look at its current role in the marketplace, where it is headed, and its technological impact on the “trading system” community.
Algorithmic trading, sometimes loosely grouped under the handle “electronic trading”, is carried out by a system that collects market data, analyzes the information, and executes trades based on an established set of trading strategies. Basically, the system places a buy or sell order of a defined quantity, with a quantitative model automatically generating the timing and size of orders based on goals specified by the parameters and constraints of the algorithm. To give an example, a trading firm may employ a strategy under which, when the spread between Microsoft and Oracle in the streaming market data exceeds a certain level, the algorithmic trading system is alerted to buy Microsoft and sell Oracle. Some commonly used strategies are VWAP (Volume Weighted Average Price), TWAP (Time Weighted Average Price), and pairs trading.
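To make the Microsoft/Oracle example concrete, here is a minimal sketch of that spread rule in Python. The trigger level, order size, prices, and the send_order interface are all hypothetical; a real system would sit on a live feed handler and an order gateway.

```python
ENTRY_SPREAD = 2.50  # hypothetical trigger level, in dollars

def on_quote(msft_price, orcl_price, send_order):
    """Apply the spread rule described above to each incoming quote pair."""
    spread = msft_price - orcl_price
    if spread > ENTRY_SPREAD:
        # Defined quantities, per the parameters of the quantitative model
        send_order("BUY", "MSFT", 1000)
        send_order("SELL", "ORCL", 1000)

# Usage with a stub order gateway that just prints the orders:
on_quote(27.10, 24.20, lambda side, sym, qty: print(side, sym, qty))
```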
For the algorithm to result in a successful trade, massive quantities of real-time market data must stream through these systems, and the primary concern of traders is the speed of that data. High throughput. Low latency. These are the two key terms in the ongoing battle for faster market data. Realistically, a matter of milliseconds (1/1000 of a second) can be the difference between a successful trade and an unsuccessful one. Market data that is slow by even a few hundred milliseconds means that the next guy, whose system is running only thousandths of a second faster, is executing successfully while you and your firm are losing opportunities to profit.
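As a rough illustration of what being “a few hundred milliseconds behind” means in practice, a firm can compare a message’s send timestamp against the local clock on arrival. This sketch assumes the feed stamps each tick and that the two clocks are synchronized (say, via NTP); both are assumptions made purely for illustration.

```python
import time

def on_tick(tick):
    """Return how stale this tick is, in milliseconds, when we see it."""
    return (time.time() - tick["sent_at"]) * 1000.0

# Simulated tick that left the source 150 ms ago
tick = {"symbol": "MSFT", "price": 27.10, "sent_at": time.time() - 0.150}
print(f"{on_tick(tick):.0f} ms behind the source")
```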
According to the Financial Information Forum, a centralized information bureau for U.S. equities and options market data run by the Securities Industry Automation Corp. (SIAC), message traffic is growing very quickly. In November 2005, the sustained one-minute peak for market data was 121,000 messages per second, up 116% from the November 2004 rate of 56,000 messages per second. With the peak message rate increasing so dramatically, the ceiling for traffic must be raised constantly to ensure systems are equipped to handle sudden spikes in data. OPRA, the Options Price Reporting Authority, provides quote and trade data from the six U.S. options exchanges. As of the summer of 2006, anyone taking a direct OPRA feed must be able to handle a peak messaging rate of 173,000 messages per second, or 1.3 billion messages daily; that number has risen from 53,000 per second at the end of 2004. In short, market data messaging infrastructures need to become more and more robust.
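Some quick arithmetic on those OPRA figures shows why the ceiling keeps rising: the quoted peak rate runs roughly three times the day’s average, so infrastructure must be provisioned for spikes far above typical load. The 6.5-hour regular session used below is an assumption for illustration.

```python
peak_rate = 173_000              # messages per second at peak (quoted above)
daily_total = 1.3e9              # messages per day (quoted above)
trading_seconds = 6.5 * 3600     # assumed regular-session length

avg_rate = daily_total / trading_seconds
print(f"average rate:  {avg_rate:,.0f} msg/s")               # ~55,600 msg/s
print(f"peak headroom: {peak_rate / avg_rate:.1f}x average") # ~3.1x
```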
So, back to the term low latency. The goal of every successful algorithmic trading firm is zero, or near-zero, latency, and everyone is looking for ways to reduce delays in the transmission of information. One way to accomplish this is to eliminate the middleman. Consolidated market data providers like Comstock, Reuters, and Thomson are continually working to lower latency. Still, if an algorithmic platform provider such as Flextrade can get the feed directly from the source, aggregate it, and provide that to its customers, that model will always be faster, since the data makes one less stop on its journey. And latency, according to Vijay Kedia, president of Flextrade Systems, “is as important an issue as the data itself. Anyone who gets data straight from the source finds an immediate shortcut.” So Flextrade now gets all of its feeds straight from the sources: the New York Stock Exchange, NASDAQ, and the ECNs.
The next challenge is for these firms to achieve near-zero latency in the internal messaging platforms that push data around the firm. These systems usually operate under a “publish/subscribe” model, in which internal applications and systems receive only the data relevant to them. Algo trading firms put a huge load on their messaging infrastructures, given both the speed required and the vast volumes of data processed. The industry leader in messaging products has been Tibco’s Rendezvous. According to Trader’s Magazine, a rising star in this middleware messaging area is a three-year-old, Chicago-based company named 29West, which is casting itself as the “David to Tibco’s Goliath” and claims to have “a higher performing messaging product to Tibco’s RV”. The 29West product is called LBM, which stands for “latency busters messaging”. The upstart is positioning itself to eventually make a run at Tibco’s dominance in the marketplace, but it is also aware that no large bank will simply swap out its messaging infrastructure overnight; the cost and risk associated with such a move are just too high. A more likely scenario is that a global investment bank experiments with a new messaging product in one area and, provided that goes well, gradually integrates it into more and more business units. So over the course of two, or more likely five to ten, years, we could see 29West dramatically grow its market share.
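For readers less familiar with the model, here is a toy publish/subscribe bus in Python. It illustrates the pattern only; it says nothing about how Rendezvous or LBM is actually built, and the topic names and messages are made up.

```python
from collections import defaultdict

class Bus:
    """A minimal publish/subscribe bus: subscribers see only their topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Each application registers only for the data it cares about
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Only this topic's subscribers ever receive the message
        for handler in self._subscribers[topic]:
            handler(message)

bus = Bus()
bus.subscribe("quotes.MSFT", lambda m: print("algo engine received", m))
bus.publish("quotes.MSFT", {"bid": 27.09, "ask": 27.11})  # delivered
bus.publish("quotes.ORCL", {"bid": 17.50, "ask": 17.52})  # no subscribers, dropped
```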
Eventually, with the intense competition for market data speed, we will hit the ceiling: everyone will be operating at top speed. So, where do we go from there? At that point, the focus may shift to the quality of data. According to Mary Knox, research director in investment services research at Gartner, Inc., “That (speed) gives you a competitive edge for a while. But at some point you hit the speed of light and things just don’t get any faster. If a firm is not competing on sheer speed, then it’s competing on what it does with the market data information and the kinds of filters or decisioning rules that are applied to it.” So quality and reliability of data also loom as two huge issues for algorithmic trading firms moving forward. We are certainly still in the “need for speed” phase, but companies are already looking ahead to the ability of algorithms to recognize and make decisions on more complex patterns. Once systems are receiving data at light speed, we will begin to aggressively discuss better ways these systems can process and analyze information, as well as the overall importance of quality market data.
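As one small example of the “filters or decisioning rules” Knox describes, a firm might screen obviously bad ticks before they ever reach the algorithm. The 5% single-tick move threshold below is purely hypothetical.

```python
def clean_tick(tick, last_price, max_move=0.05):
    """Discard ticks that look like bad prints; return good ticks unchanged."""
    if tick["price"] <= 0:
        return None                                    # obviously corrupt
    if last_price and abs(tick["price"] / last_price - 1) > max_move:
        return None                                    # implausible one-tick jump
    return tick

print(clean_tick({"price": 27.10}, last_price=27.05))  # passes through
print(clean_tick({"price": 2.71},  last_price=27.05))  # filtered out (None)
```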