Miles from the Curb

IT recruiting on Wall Street.

Algorithmic Trading

Here at Navistaff, we view algorithmic trading as the hottest area in finance right now...but what is it?  Where is it going?

To answer these questions, here is an article by my colleague Jerry Tierney.

Algorithmic trading is a term becoming more prevalent in the financial world every day. Most of you reading this article are probably familiar with algorithmic trading on some level, but we will start with a general understanding of what it is, then look at its current role in the marketplace, where it is headed, and its technological impact on the trading-system community.

Algorithmic trading, sometimes loosely grouped under the broader heading of "electronic trading", is built around a system that collects market data, analyzes it, and executes trades based on an established set of trading strategies. The system places buy or sell orders of a defined quantity using a quantitative model that automatically determines the timing and size of each order according to the parameters and constraints of the algorithm. For example, a trading firm might employ a strategy in which, whenever the spread between Microsoft and Oracle exceeds a certain level in the streaming market data, the system buys Microsoft and sells Oracle. Some commonly used strategies are VWAP (volume-weighted average price), TWAP (time-weighted average price), and pairs trading.
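
To make that Microsoft/Oracle example concrete, here is a minimal sketch of that kind of spread trigger. The tickers, the threshold, the order size, and the sendOrder() hook are illustrative placeholders, not any firm's actual system.

    // A toy version of the Microsoft/Oracle rule described above. The tickers,
    // the threshold and sendOrder() are illustrative placeholders, not a real system.
    public class PairsSpreadMonitor {

        enum Side { BUY, SELL }

        private final double entryThreshold;   // spread level that triggers the trade

        public PairsSpreadMonitor(double entryThreshold) {
            this.entryThreshold = entryThreshold;
        }

        // Called on every streaming quote update for the two legs.
        public void onQuotes(double msftPrice, double orclPrice) {
            double spread = msftPrice - orclPrice;
            if (spread > entryThreshold) {
                // The spread has exceeded the configured level: buy Microsoft, sell Oracle.
                sendOrder("MSFT", Side.BUY, 1000);
                sendOrder("ORCL", Side.SELL, 1000);
            }
        }

        // Stand-in for a real order gateway.
        private void sendOrder(String symbol, Side side, int quantity) {
            System.out.printf("%s %d %s%n", side, quantity, symbol);
        }

        public static void main(String[] args) {
            PairsSpreadMonitor monitor = new PairsSpreadMonitor(0.50);
            monitor.onQuotes(27.40, 26.75);   // spread of 0.65 exceeds 0.50, so orders fire
        }
    }

A production system would of course also manage position limits, exits, and order routing; the point here is just the shape of the rule: market data in, a condition evaluated, orders out.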

For an algorithm to result in a successful trade, massive quantities of real-time market data must stream through these systems, and the primary concern of traders is the speed of that data. High throughput. Low latency. These are the two key terms in the ongoing battle for faster market data. Realistically, a matter of milliseconds (1/1000 of a second) can be the difference between a successful trade and an unsuccessful one. Market data that arrives even a few hundred milliseconds late means the next guy, whose system is running only thousandths of a second faster, executes successfully while you and your firm lose the opportunity to profit.
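
As a rough illustration of what "low latency" means in practice, here is a minimal sketch of measuring the lag between an exchange timestamp and the moment a strategy acts on it. The exchange timestamp field, the 5 ms alert threshold, and the runStrategy() hook are assumptions made for the sake of the example.

    // A minimal sketch of measuring the lag the paragraph above worries about.
    public class LatencyProbe {

        // Called once per incoming market data tick.
        public void onMarketDataTick(long exchangeTimestampMillis) {
            long receivedAt = System.currentTimeMillis();
            long feedLatencyMs = receivedAt - exchangeTimestampMillis;   // delay getting the tick here

            long decisionStart = System.nanoTime();
            boolean shouldTrade = runStrategy();                         // the algorithm itself
            long decisionMs = (System.nanoTime() - decisionStart) / 1_000_000;

            long totalLagMs = feedLatencyMs + decisionMs;
            if (totalLagMs > 5) {
                // A few hundred milliseconds of lag, as noted above, is enough to lose the trade.
                System.out.println("Slow path: total lag " + totalLagMs + " ms");
            }
            if (shouldTrade) {
                // sendOrder(...) -- order routing is omitted from this sketch
            }
        }

        // Placeholder strategy hook; a real strategy would evaluate the tick.
        private boolean runStrategy() {
            return false;
        }
    }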

According to the Financial Information Forum, a centralized information bureau for U.S. equities and options market data run by the Securities Industry Automation Corp. (SIAC), message traffic is growing very quickly. In November 2005, the sustained one-minute peak for market data was 121,000 messages per second, up 116% from the November 2004 rate of 56,000 messages per second. With the peak message rate increasing so dramatically, the ceiling for traffic must constantly be raised to ensure systems can handle sudden spikes in data. OPRA, the Options Price Reporting Authority, provides quote and trade data from the six U.S. options exchanges. As of the summer of 2006, anyone taking a direct OPRA feed must be able to handle a peak messaging rate of 173,000 messages per second, or 1.3 billion messages daily; that number has risen from 53,000 per second at the end of 2004. In short, market data messaging infrastructures need to become more and more robust.
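
The year-over-year figures above work out as a simple calculation, and the same arithmetic is what a capacity planner would use to size headroom. The straight-line projection in the sketch below is purely illustrative, not a forecast from the article.

    // A quick check of the peak message rate figures quoted above, plus a naive
    // straight-line projection for illustration only.
    public class MessageRateGrowth {
        public static void main(String[] args) {
            double nov2004Peak = 56_000;    // messages per second, November 2004
            double nov2005Peak = 121_000;   // messages per second, November 2005

            double growthPct = (nov2005Peak - nov2004Peak) / nov2004Peak * 100;
            System.out.printf("Year-over-year growth: %.0f%%%n", growthPct);   // ~116%

            double projectedNov2006 = nov2005Peak * (1 + growthPct / 100.0);
            System.out.printf("Naive projection for Nov 2006: %.0f msgs/sec%n", projectedNov2006);
        }
    }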

So, back to the term low latency. The goal of every successful algorithmic trading firm is zero, or near-zero, latency, and everyone is looking for ways to reduce delays in the transmission of information. One way to accomplish this is to eliminate the middleman. Consolidated market data providers like Comstock, Reuters, and Thomson are continually working to lower latency. Still, if an algorithmic platform provider such as Flextrade can take the feed directly from the source, aggregate it, and deliver it to customers, that model will always be faster, since the data makes one less stop on its journey. Latency, according to Vijay Kedia, president of Flextrade Systems, "is as important an issue as the data itself. Anyone who gets data straight from the source finds an immediate shortcut." So Flextrade now gets all of its feeds straight from the sources: the New York Stock Exchange, NASDAQ, and the ECNs.

The next challenge is for these firms to achieve near-zero latency in their internal messaging platforms, which push data around inside the firm. These systems usually operate on a "publish/subscribe" model, in which internal applications and systems receive only the data that is relevant to them. Algo trading firms put a huge load on their messaging infrastructures because of the speed required and the vast volumes of data the systems process. The industry leader in messaging products has been Tibco's Rendezvous. According to Trader's Magazine, a rising star in this middleware messaging space is a three-year-old, Chicago-based company named 29West, which is positioning itself as the "David to Tibco's Goliath" and claims to have a higher-performing messaging product than Tibco's RV. The 29West product is called LBM, which stands for "latency busters messaging". 29West is positioning itself to eventually make a run at Tibco's dominance in the marketplace, but it is also aware that no large bank will simply swap out its messaging infrastructure overnight; the cost and risk of such a move are just too high. A more likely scenario is that a global investment bank experiments with a new messaging product in one area and, provided that goes well, eventually integrates it into more and more business units. So, over the course of two, or more likely five to ten, years, we could see 29West dramatically grow its market share.
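
For readers outside the middleware world, here is a toy, in-process sketch of the publish/subscribe idea: subscribers register interest in a topic and receive only messages on that topic. Real products like Rendezvous or LBM work across processes and machines with far more sophistication; this bus, and its topic names, are purely illustrative.

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.CopyOnWriteArrayList;
    import java.util.function.Consumer;

    // A toy, in-process publish/subscribe bus illustrating that subscribers
    // receive nothing but the topics they asked for.
    public class ToyMessageBus {

        private final Map<String, List<Consumer<String>>> subscribers = new ConcurrentHashMap<>();

        public void subscribe(String topic, Consumer<String> handler) {
            subscribers.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>()).add(handler);
        }

        public void publish(String topic, String message) {
            // Only handlers registered for this topic ever see the message.
            subscribers.getOrDefault(topic, List.of()).forEach(handler -> handler.accept(message));
        }

        public static void main(String[] args) {
            ToyMessageBus bus = new ToyMessageBus();
            bus.subscribe("quotes.MSFT", msg -> System.out.println("Algo engine received: " + msg));

            bus.publish("quotes.MSFT", "MSFT 27.15 x 27.16");   // delivered to the subscriber
            bus.publish("quotes.ORCL", "ORCL 12.80 x 12.81");   // no subscriber, silently dropped
        }
    }

The appeal of the model for trading firms is exactly what the paragraph above describes: each application sees only the slice of the firehose it cares about, which keeps both the network and the consuming systems from drowning in irrelevant ticks.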

Eventually, with the intense competition for market data speed, we will hit the ceiling: everyone will be operating at top speed. So, where do we go from there? When we reach that point, the focus may shift to the quality of the data. According to Mary Knox, research director in investment services research at Gartner, Inc., "That (speed) gives you a competitive edge for a while. But at some point you hit the speed of light and things just don't get any faster. If a firm is not competing on sheer speed, then it's competing on what it does with the market data and the kinds of filters or decisioning rules that are applied to it." So quality and reliability of data also loom as two huge issues for algorithmic trading firms going forward. We are certainly still in the "need for speed" phase, but companies are already looking ahead to the ability of algorithms to recognize and act on more complex patterns. Once systems are receiving data at light speed, the conversation will turn aggressively to better ways these systems can process and analyze information, and to the overall importance of quality market data.
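
One simple form the "filters or decisioning rules" mentioned above might take is a sanity check on incoming ticks before the algorithm acts on them. The half-second staleness limit and the 10% price-jump limit below are made-up thresholds used only for illustration.

    // A minimal sketch of a data-quality filter: drop ticks that are stale or
    // wildly out of line before the algorithm acts on them.
    public class TickQualityFilter {

        private double lastGoodPrice = Double.NaN;

        public boolean accept(double price, long exchangeTimestampMillis) {
            long ageMs = System.currentTimeMillis() - exchangeTimestampMillis;
            if (ageMs > 500) {
                return false;   // stale: more than half a second old
            }
            if (!Double.isNaN(lastGoodPrice)
                    && Math.abs(price - lastGoodPrice) / lastGoodPrice > 0.10) {
                return false;   // suspicious: more than a 10% jump from the last good tick
            }
            lastGoodPrice = price;
            return true;        // clean tick, safe to hand to the algorithm
        }
    }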

April 20, 2006 | Permalink | Comments (5) | TrackBack (0)

When Trading Technology Goes Awry

Tokyo Stock Exchange Inc., operator of one of the world's largest stock markets, is likely to face further scrutiny over the state of its IT systems after it was forced to halt trading 20 minutes earlier than normal on a day in mid-January because its computer system was close to capacity.

The trading halt at 2:40 p.m., local time, came after the exchange warned during the market's lunchtime break that trading would end early should volumes reach 4 million during the afternoon session. At lunchtime, the number of trades had already reached 2.32 million and they hit 4 million at 2:25 p.m. The system is designed to handle 4.5 million trades per day.

The heavy volume came on the back of fallout from allegations of wrongdoing at Livedoor Co. Ltd., a major domestic Internet portal, and poor results from Intel Corp. and Yahoo Inc. The benchmark Nikkei 225 index closed down 3 percent at 15,341 points, which was its biggest one-day fall in almost a year.

The shutdown is the latest in a string of IT systems-related problems to have hit the exchange in recent months.

In December the bourse's software was called into question after an erroneous order to sell 610,000 shares of J-Com Co. Ltd., a newly listed company, at US$0.009 each was accepted, despite the stock trading at around ¥610,000 at the time and the quantity being 40 times the number of issued shares. Mizuho Securities Co. suffered billions of yen in losses as a result of the mistaken trade, which should have been an order to sell a single share, and the exchange was also criticized because it was unable to cancel the transaction even after the problem was discovered.
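
As an aside, the kind of pre-trade check that catches an order like that is straightforward in principle: flag any order whose quantity dwarfs the shares outstanding or whose price sits far from the last trade. The 20% price band and the method names in this sketch are illustrative assumptions, not the exchange's or Mizuho's actual controls.

    // A minimal sketch of a pre-trade sanity check of the kind that would have
    // flagged the J-Com order: a quantity above the shares outstanding, or a
    // price far away from the last traded price.
    public class FatFingerCheck {

        public boolean looksSuspicious(long orderQuantity, double orderPrice,
                                       long sharesOutstanding, double lastTradedPrice) {
            boolean quantityTooLarge = orderQuantity > sharesOutstanding;
            boolean priceFarFromMarket =
                    Math.abs(orderPrice - lastTradedPrice) / lastTradedPrice > 0.20;
            return quantityTooLarge || priceFarFromMarket;
        }
    }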

A month earlier, trading was suspended for half a day after a software patch was incorrectly applied to the market's trading system, causing it to crash. System vendor Fujitsu Ltd. took responsibility for that mistake, and in mid-December, after the J-Com debacle further shook confidence in the market, Tokyo Stock Exchange President Takuo Tsurushima said he would resign.

"In Japan all the exchanges have very limited experience regarding online trading," said Junichi Saeki, a research vice president at IDC in Tokyo. "In the U.S. market they have more than 5 years of experience, so similar types of issues have been dealt with. In Japan it's just in the last year that online trading has been popular."

The exchange's inability to deal with the increasing number of orders points to IT management problems within the organization, according to Saeki.

"Unfortunately, the stock market organization doesn't have a very capable IT manager," he said "It doesn't have a lot of money for IT (and) they do not think about the importance of IT. Even the top management doesn't know that."

The exchange was also criticized last year by authorities and market participants, and Japan's Financial Services Agency instructed the exchange to submit a business improvement plan by the end of January.

Link:  http://www.computerpartner.nl/article.php?news=int&id=2377   

February 15, 2006 | Permalink | Comments (0) | TrackBack (0)

The Trade Floor Reigns Supreme

To people who aren't overly familiar with what I do, here it is: my professional life is devoted to locating and delivering world-class engineers and software developers to my first-tier global investment banking clients. In simplest terms, these are the people who develop the infrastructure behind the systems traders execute trades on. If those systems are slower than a competitor's, even by a fraction of a second, an optimal stock price can be snatched up by a rival firm and fortunes can be lost. Well, that may be a bit dramatic, but there is little doubt that the Wall Street trading floor is evolving from the yelling and screaming of traders to the quiet hum of computer-based trading models.

For this reason most financial firms rely heavily on the speed, performance, and reliability of their systems.  Such dependence on technology has made the Wall Street trading floor one of the most exciting, cutting-edge, and lucrative areas for developers to be in.  If you don't believe me, have a read through Phil Albinus' article in Waters this past month (below):

Snapshot"The trading floor still rules.  Even after all the talk and all the steps taken to reach regulatory compliance and increase back-office effeciency, the trading floor still commands the attention of each and every CIO.  Let's face it:  That's where the action is.

Like anything in the global capital markets, it all comes down to making money.  Cleaning up your back office is important and any new amount of trading efficiency will save money down the road, but these projects don't increase the heart rate.  There are gleams in the eyes of tech staffers when they talk about the trading floor.  That's where the important things happen, where trades are made and deals are brokered.  On the other hand, if you've seen the back office, you've seen them all.  Sure, one firm might have more space thanks to sleek blade servers or entire functions outsourced to Ireland or India, but the trading floor commands attention.

The irony of the trading-floor-as-king concept is that the reign is slowly but surely dissolving. The same wave of automation we have seen in the back office - where, once, actual people kept things humming - is now happening on the trading floor. Traders can't process microsecond market data ticks and execute at exactly the right instant. A computer can."

December 12, 2005 | Permalink | Comments (0) | TrackBack (0)

Hot December on Wall Street

Two weeks back, I was actually considering a post about the usual December slowdown in the job market. I'm glad I didn't. On a macro level, economic growth for the third quarter in the US was recently revised to 4.3%. That is impressive on any level, but to think it came after record-high oil prices and two massive natural disasters is flat-out incredible.

With respect to our little niche, IT hiring on Wall Street is back to near 1999-2000 levels. The lead IT recruiter at one of the top investment houses on the street recently told me that this December will be the busiest month - not just the busiest December - he's had in all his years at the firm. He alone has 31 full-time, permanent roles that need to be filled by the end of the month. So, while everyone else on the street waits until the new year to start chasing top-shelf positions, get proactive and begin the search now. By the time the interview process plays out and you set a start date, you will have already received your 2005 bonus.

The areas in hottest demand for me right now are equities direct market access developers and application support specialists, Java developers within prime brokerage, and C++/Java/C# developers on any and all sorts of derivatives products desks. My two major clients are absolute world-class investment firms, both coming off of record 2005 profits and bonuses. If you know me, you know which firms I'm talking about. If not, please send me a quick note and I'd be happy to fill you in.

December 03, 2005 | Permalink | Comments (4) | TrackBack (1)

Derivatives, Derivatives, Derivatives.

As Mike Biederman's guest article below noted, developers are in great demand in the equity derivatives space. One of my major investment banking clients is doing some serious hiring in this area, with 12 openings to be filled within the next few months. Ideally, they are looking for Java/C++ and/or C# developers with two to five years of experience. While financial systems development experience is ideal, if you are a bright, well-educated developer who'd love to work on a top trading floor...please send me your resume.

Overview
This expansion is essentially a technology play: derivatives growth will follow from the capacity and flexibility of our pricing, modeling, market-making, trading, risk management and control systems.

  • To provide fast and consistent pricing, quoting and scenario-based risk analysis, we are accelerating development of massively scalable calculation servers.
  • To support increasing volumes of issuance and market making and increasingly complicated structured products, we are re-engineering the data architecture behind our booking and distribution systems.
  • To trade and risk-manage products which depend on new kinds of market data, we are building tools to manipulate and analyze these data, and frameworks to study them historically.
  • To make all this information useful to traders, salespeople and other customers, we are crafting rich GUI clients.
  • In all these efforts we are building on a successful ongoing partnership with the respective trading desks and the business unit as a whole.


General Requirements

Candidates should have a keen interest in the modeling, pricing and risk management of equity derivatives. They should have excellent problem-solving skills, including the ability to teach themselves on the job. They should have good presentation and communication skills as all of the positions require much client interaction, and good teamwork skills as collaborative efforts are the rule. They should also be adept at setting and resetting their priorities throughout the course of a typically busy day. Technically, expertise in one of C++, C# and Java is required and knowledge of a second of these is strongly preferred; candidates will demonstrate their language expertise in both a written test and a technical interview. Two to four years of relevant work experience is ideal. Knowledge of derivatives is a plus, as is knowledge of Perl.


Range of Experience

We are looking to hire people across a broad range, from recent graduates through to senior VP level.


Specific Roles

GUI Developer
C# a requirement, as development is primarily in .NET. Joins a unified team building user interfaces for risk management, data analysis, derivatives pricing and trading. The team's challenge is to design components that can be combined flexibly for these very different purposes and provide a responsive interface to a service-oriented architecture.

Server-side Developer
Strong C++ a requirement. Joins a team architecting and building our pricing and risk management servers. In addition to the obvious challenges of efficiency, scalability and robustness, we need to design systems that can change over time in response to changing demands without becoming unmanageable.

Database/Distributed Developer
Strong DB skills (Sybase, DB2) required, preferably including optimization and data warehousing. Experience in one OO language (C#, Java, C++) required, as the role will stretch to application development. Joins a team maintaining the division's credit risk, market risk, collateral management and customer valuation systems. The challenge is maintaining the integrity and flexibility of the division's critical risk warehouses whilst catering to a wide range of front, middle and back office clients injecting a steady stream of new requirements. Excellent opportunity to acquire or build on broad equity product and processing knowledge.

Quantitative Developer
Strong C++ a requirement, as is knowledge of partial differential equations and numerical analysis. Joins a team maintaining the division's core analytics across both applications and trading desks. The work is a challenging mixture of self-contained fixes, subtle enhancements to production systems, and open-ended design problems. The successful candidate will have a sharp eye for detail and an ongoing commitment to testing.

November 10, 2005 | Permalink | Comments (0) | TrackBack (0)
