Market Infrastructure

NMS II: A Strange Way to Fix a Two-Tiered Market

We’ve said it before: The U.S. consolidated quote is the envy of the fragmented world. But based on the level of industry complaints it seems we may take for granted the benefits it provides.

To the surprise of many, even the SEC has been spurred into action.

The SIP was once such a unifying product, consolidating the whole industry around a single best price, and protecting investors at those levels from missed fills or worse prices, even if they traded off exchange.

New SEC proposals include major rule changes that fragment not only the quote, but also the SIP providers. Instead of a single National Best Bid and Offer (NBBO), there will be a multiverse of Many Best Bids and Offers (MBBOs). Getting there also requires the industry to duplicate fixed costs.

That will create a new latency arms race while moving data costs outside the SEC’s jurisdiction—two problems the changes were intended to fix.

SEC approves rules to streamline and simplify

One complaint about the SIP is that, with three tapes and two SIPs, it is unnecessarily complex and duplicates resources. So on May 6, the SEC announced that it approved a new rule to streamline the SIP into one plan. But that’s where the streamlining stops. The only real consolidation is with the administrator. The SIP technology providers will still exist in their current locations.

The SEC also tries to fix a conflict of interest. Currently, SROs vote on the prices at which the SIP plans sell their data. The SEC decided to give customers votes, and to give smaller competitors a greater share of the vote, in the operating committee that governs the SIP. That just creates a different conflict of interest: now the buyers of data get to vote on the prices they want to pay, without being restricted from using that data to compete with the data producers. And this comes after those revenues have already been declining.

Chart 1: SIP revenues have been falling since NMS was introduced, and have been reduced from 29% of SEC revenues to just 17%; Nasdaq direct feed costs are also flat over this time

SEC vs SIP vs Nasdaq baseline data revenue

Despite the fact that SIP revenues have actually fallen since Reg NMS was adopted (Chart 1), many still complain that the SIP costs too much. There is also a concern that the way SIP revenues are allocated back to exchanges helps start-up exchanges, resulting in increased fragmentation.

We agree it’s not perfect, and have proposed enhancing the revenue share formula by reducing payments to small exchanges that don’t add much to best prices as well as rewarding visible liquidity that actually results in trades. That would ensure that almost all SIP revenues continue to be a reward for price discovery: a market good.

Importantly though, SIP costs now help offset a free riding problem. Ironically, those free riders are among the entities being given more votes on the prices they will pay.

Finally, the SEC’s new proposal suggests that data providers are “utilities” and should get paid on a cost-plus basis, but then sets up a “competitive consolidator” model with 12 or more Reg SCI entities distributing exchange data. Hard to see how that won’t add new fixed costs the industry must bear.

Although these changes generally reduce the SEC’s control of data distribution and resale prices, other changes, which add depth to the SIP, vastly extend the government’s role in what investors will need to buy. But that’s a topic for another day.

SEC proposes rules to complicate and multiply

The bigger question is whether competing consolidators fix any other problems. One problem the model seems designed to tackle is geographic latency.

We recently discussed how the additional hop that SIP data needs to make (going back to a single consolidator to create the NBBO) means the SIP will always be slower than direct feeds. That delay is known as geographic latency. It’s a subtle distinction:

  • For a human, the geographic latency makes no difference at all (it is de minimis).
  • For an algorithmic or electronic trader, the SIP will never be as fast (it is a race condition).
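To see why the race condition matters to machines but not humans, here is a minimal sketch of one-way transmission time. The ~60 km Carteret-to-Mahwah distance, the fiber refractive index and the microwave speed factor are illustrative assumptions, not official measurements:

```python
# Geographic latency sketch: one-way transmission time between two data
# centers. Distance and media speeds below are illustrative assumptions.
C_KM_PER_S = 299_792                  # speed of light in a vacuum

SPEEDS_KM_PER_S = {
    "microwave": C_KM_PER_S * 0.99,   # near line-of-sight vacuum speed
    "fiber": C_KM_PER_S / 1.47,       # light travels ~32% slower in glass
}

def one_way_us(distance_km: float, medium: str) -> float:
    """One-way transmission time in microseconds."""
    return distance_km / SPEEDS_KM_PER_S[medium] * 1e6

# Assumed ~60 km straight-line path, Carteret to Mahwah
for medium in SPEEDS_KM_PER_S:
    print(f"{medium:>9}: {one_way_us(60, medium):5.0f} µs one way")
```

Both figures are a fraction of a millisecond, invisible to a human, but the roughly 90 µs gap between fiber and microwave decides every race an algorithm enters.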

Many complain this creates a “two-tier” market: the haves (fast data) and the have-nots (SIP data). But it’s also important to consider who is harmed by “slower” SIP prices. There are two distinct use cases for prices:

  1. A cheap and reliable NBBO: The SIP works fine for “slower” human traders, who won’t benefit from additional expensive infrastructure.
  2. Solutions for more sophisticated professional traders with race conditions: Each sophisticated trader has different opportunity costs, meaning they also have different optimal investments in speed. Something we’ve seen from their purchases is that most find custom solutions more cost-effective than all being forced to buy the same products.

Importantly, for the majority of traders, the “two-tier” market adds no cost.

The “problem” arises when sophisticated users buy the SIP data to save money. Even then, that decision might be optimal, but proving that involves some complicated opportunity cost math, and quantifying (another) conflict of interest between brokers (who pay for infrastructure) and their buy-side customers (who get better fills). But we digress.

Can we minimize race conditions?

Let’s be clear: Race conditions are impossible to solve. Even if you win by only a picosecond, you are still first.

If we really tried to neutralize speed, we would need to create a truly EQUAL playing field, with all prices, trades and fills arriving simultaneously for all investors. As our time is relativity study showed, this would be much easier if all trading took place in one location (we would suggest Carteret).

Unfortunately, even if you did all that, hardware and software set-ups would still create differences in response times. Races would still have winners.

It’s kind of irrelevant anyway, because what the SEC has proposed is close to the opposite of this.

Can competing consolidators eliminate geographic latency?

The SEC’s Valentine’s Day proposal includes a competing consolidator model: a “free market” solution to geographic latency and BBO construction.

One thing this will do is remove the “hop” that non-primary-market trades need to make to get onto the SIP and then back to investors (see below).

However it will also:

  • Create a multiverse of many BBOs (MBBOs).
  • Each consolidator will define its BBO in its own space-time (still delayed by transmission times).
  • So at the exact time of a fill, the MBBOs will all be different (and stale).
  • It makes locked and crossed markets even more likely, because in-flight orders won’t have to wait to see if the market-wide bid or offer refills after a trade.
  • All of which makes it (even) harder to know whether you traded through a better quote or missed quantity at a different venue.

In short, although distributed SIPs remove most of the geographic latency, the hop to customers means they don’t equalize everyone. And the competing SIPs will, by definition, create a whole new speed race.

A worked example of competitive SIP BBOs

This example takes the scenario in our time is relativity study, with a trade starting at Carteret and liquidity in AAPL on Nasdaq and NYSE. We add competing consolidators (BBOs) at each of the three market locations (Carteret, Secaucus and Mahwah). This shows how quotes change as trades travel between venues.

At 100µs, a Carteret BBO would obviously know about a trade on Nasdaq. So too would a consolidator using microwave in Secaucus. However, a consolidator using fiber would not (yet) know that Nasdaq had already traded.

Chart 2: The market for AAPL at Time = 100µs

BBO chart

At 200µs, all trades are complete. In reality, no market is offering $325.01 anymore.

But only a microwave Mahwah BBO would know that, and would be updated to $325.02, the best quote now on Nasdaq. It works that way because NYSE is the last market to be cleared in the sweep.

All other BBOs would be “stale” as the report from NYSE would still be in transit. So they would show liquidity still offered at $325.01.
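The staleness in this example can be sketched as a toy model. The 150 µs NYSE clearing time and the 90/180 µs report-transit times below are round-number assumptions chosen to match the geography of the charts, not measured values:

```python
# Toy model of competing-consolidator staleness: each consolidator flips
# its displayed best offer from the stale $325.01 to $325.02 only once the
# final NYSE trade report has physically reached it. All times assumed.
NYSE_TRADE_US = 150            # assumed: last $325.01 offer cleared at Mahwah
REPORT_DELAY_US = {            # assumed one-way microwave times from Mahwah
    "Mahwah": 0,
    "Secaucus": 90,
    "Carteret": 180,
}

def displayed_offer(consolidator: str, now_us: int) -> float:
    """Best offer this consolidator displays at time now_us (µs)."""
    report_arrived = now_us >= NYSE_TRADE_US + REPORT_DELAY_US[consolidator]
    return 325.02 if report_arrived else 325.01   # 325.01 is stale post-trade

for t in (100, 200, 400):
    view = {site: displayed_offer(site, t) for site in REPORT_DELAY_US}
    print(f"t={t:3d}µs  {view}")
```

At t = 200 µs, only the Mahwah consolidator shows the true $325.02 market; Secaucus and Carteret are still quoting the vanished $325.01, consistent with Chart 3.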

Chart 3: The market for AAPL at Time = 200µs

BBO chart

Clearly a competitive BBO located nearer to a customer is faster to update than the SIP. But:

  • This won’t eliminate the two-tier market.
  • It will add fixed costs.
  • The investment won’t benefit retail or human traders.
  • It won’t even stop BBOs being stale prices.
  • It still won’t be faster than direct feeds.

The question is: Will it be worth the cost?

Investing for the 1%?

The cost-benefit question is complicated by asking who this is designed to help. Our data shows that direct feed customers make up less than 1% of all SIP customers. And since direct feeds will still be faster, the beneficiaries of these 12 new consolidators are a fraction of that 1%.

Whatever costs this creates will benefit a relatively small group of new SIP customers and it’s not very fair if the 99% of current SIP users have to pay for it.

Chart 4: SIPs are for the masses; direct feed customers are a fraction of all investors

SIP customers vs Nasdaq depth customers

A new arms race?

At its core, much of this debate is predicated on the claim that “SIPs are slow.” But the charts above show the same race condition (and best-ex compliance concerns) will also exist with consolidators.

  • There will undoubtedly be faster and slower competitive consolidators.
  • Many investors will (still) be compelled to buy the fastest consolidator.
  • The fastest consolidator would be able to charge more for its data feed (and will have higher costs too).

That creates a new speed arms race.

Worse, the SEC might lose its ability to approve consolidator mark-ups.

Currently, data prices are regulated by the SEC, which as we’ve shown, has kept our “same store” rate of growth in line with inflation (Chart 1).

Are we engineering a solution worse than the problem?

This is an extensive and dramatic change to market rules, which tries to tackle a number of “problems” (things people complain about). Recent 606 changes took years to implement, and were changed significantly at the request of commenters, but these much bigger changes seem to be on a much shorter deadline.

The costs here are not immaterial. Gone is the golden-source NBBO that protects investors and (mostly) synchronizes our fragmented markets. Required instead is latency arbitrage to keep markets in line. Studies from other countries with similarly fragmented and unprotected quotes suggest that latency arbitrage profits in the U.S. could rise by almost $3 billion.

So what are the benefits? Some have noted this is about, at most, a “few hundred million dollars” that the 1% pay to exchanges.

Seems a high price to pay given most of the “problems” in the market are far from solved.

Table 1: Tracking the problems NMS II is designed to solve

SIP Proposal problem chart

Source: Nasdaq Economic Research (*: Somewhat)

Careful what you don’t comment on…

Unfortunately, these changes are directed at sophisticated investors: the “1%” of customers that manage over 50% of assets and even more of the trading. They require head traders and market experts to weigh in and think critically.

Although most of them are dealing with COVID-19, the SEC didn’t wait to approve the January proposal and has indicated it is not open to extending the deadline on the Valentine’s Day proposals.

Comments on the rest of these new rules are due on May 26.

If you don’t comment now, don’t complain for the next decade if trading costs you more, liquidity is fragmented and high frequency trading firms are making more profits.

Phil Mackintosh


Phil Mackintosh is Chief Economist and a Senior Vice President at Nasdaq. His team is responsible for a variety of projects and initiatives in the U.S. and Europe to improve market structure, encourage capital formation and enhance trading efficiency. 
