SEC Consultation on Proposed Rule on Market Data Infrastructure
Norges Bank Investment Management (“NBIM”) appreciates the opportunity to comment on the Notice of Proposed Rule on Market Data Infrastructure under the Securities Exchange Act. We welcome this initiative by the Securities and Exchange Commission (“SEC”), and recognise the need to reform and modernise the market data infrastructure of US equity markets.
NBIM is the investment management division of the Norwegian Central Bank (“Norges Bank”) and is responsible for investing the Norwegian Government Pension Fund Global (the “fund”). NBIM is a globally diversified investment manager with assets valued at NOK 10,088 billion (USD 1.149 trillion) as of December 31, 2019, of which NOK 2,842 billion (USD 324 billion) was invested in US listed equities. We have a vested interest in a regulatory environment that promotes well-functioning markets in financial instruments, facilitates the efficient allocation of capital and risk, and promotes long-term economic growth. Such an environment requires balancing the interests and incentives of various types of market participants, ensuring a level playing field in financial markets.
Market data is a particularly important component of well-functioning markets. It facilitates and reflects the price discovery process by market participants on trading venues. This makes it an essential and central part of the market infrastructure. Its centrality means that any reform or evolution of market data needs to be considered in the broader context of the whole market infrastructure.
We are in broad support of the SEC’s proposed reform of market data infrastructure and the provision of consolidated market data. We believe that the proposed rule addresses many of the issues with the current model of consolidated data provision, including those we have previously shared with the SEC.
However, we believe it is important to consider the proposed rule in the broader context of the whole market infrastructure, its users and how it is paid for. The rule may have a significant impact on all of these, beyond that outlined in the economic analysis in the SEC’s notice. In the next section, we outline our view of the role of market data in the broader market infrastructure, and of the impact of current market data pricing models. We then present NBIM’s use cases and needs for market data, representing a large, global asset manager. Finally, we discuss the potential longer-term impact of the SEC’s proposed rule on broad market structure, and the need for further economic consideration and analysis.
Market Data and Infrastructure
Market data is an essential, integral part of market infrastructure, as it both facilitates and reflects the price discovery process by market participants on trading venues. In our view, market data cannot be considered in isolation. Most market participants must consume market data in some form, the costs of which are part of doing business. Trading venues produce and disseminate market data as a result of their trade matching; it is not always straightforward to attribute trading venues’ operating costs separately to the matching engines and to market data. In addition, exchanges have a uniquely central and important role in equity markets, and clearly contribute to the well-functioning of markets[1].
The market power of trading venues, and especially exchanges, differs significantly across their business lines – trade matching, listing services, market data, and co-location and connectivity. While the markets for trade matching and listing services are generally competitive, exchanges enjoy pricing power for their market data products as well as for co-location and connectivity. For the latter, the pricing power is driven by geographical proximity and latency minimisation.
For market data, the exchanges’ pricing model stems from the fact that market data from different venues are imperfect substitutes for some market participants, while they are complements for others. For liquidity providers, higher-turnover investors and broker/dealers, a portion of revenue comes from earning the bid/ask spread – either directly, or indirectly through differentiation in market impact cost. This makes market data from different exchanges a complementary good. For other market participants, including ourselves, different exchanges’ market data may be substitute goods. The level of substitutability depends on the approximate no-arbitrage bounds on prices across exchanges that are enforced by higher-frequency market participants.
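For illustration only, the sketch below shows one way to quantify this substitutability: measuring how often two exchanges’ quotes for the same security diverge by more than an approximate arbitrage cost. All prices, fees and thresholds in the sketch are our own assumptions, not actual market data.

```python
# Illustration only: quantify how substitutable two exchanges' quote feeds are
# for a lower-frequency user, by checking how often their mid prices stay within
# an approximate no-arbitrage bound (assumed taker fees plus half of each quoted
# spread). All values below are assumptions, not actual market data.

def mid(bid, ask):
    return (bid + ask) / 2.0

def within_no_arb_bound(quote_a, quote_b, fee_per_share=0.003):
    """quote_a, quote_b: (bid, ask) snapshots for the same stock at the same time."""
    bound = (2 * fee_per_share
             + (quote_a[1] - quote_a[0]) / 2
             + (quote_b[1] - quote_b[0]) / 2)
    return abs(mid(*quote_a) - mid(*quote_b)) <= bound

# Paired quote snapshots from two venues (illustrative values).
snapshots = [
    ((100.00, 100.02), (100.00, 100.03)),
    ((100.01, 100.03), (100.05, 100.07)),   # short-lived dislocation
    ((100.02, 100.04), (100.02, 100.04)),
]

share = sum(within_no_arb_bound(a, b) for a, b in snapshots) / len(snapshots)
print(f"Fraction of snapshots within the no-arbitrage bound: {share:.0%}")
```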
The revenue from the exchanges’ various business lines funds critical market infrastructure, which market participants use to execute trades and source liquidity. In line with many other modern technologies, much of the cost of this infrastructure is fixed, while the marginal cost per trade approaches zero.
The existing exchange pricing model mirrors the exchanges’ cost structure. It combines relatively high fixed subscription costs, through data packages, connectivity and co-location, with very low marginal access fees. On the positive side, this has led to relatively ample and deep liquidity in US equity markets, compared to other markets globally. It has also enabled technology investments that have provided remarkable resilience and scalability, as evidenced on recent high-volume days.
These desirable features have not come for free. We observe increasing concentration in the financial industry – in the asset manager space, the broker/dealer community, and in the liquidity provider/market maker space. Barriers to entry arise from the scale needed to absorb the fixed costs of infrastructure, market data and connectivity.
The challenge is to weigh these benefits against these costs through economic analysis. We view the discussion around the consolidated tape as part of this cost-benefit analysis. There can be little doubt that technological advances allow for considerable modernisation of consolidated data provision. There is no technological reason to limit the data provided on the consolidated tape to what we have now; there is capacity to include odd lots, depth-of-book information and indicative auction data. The discussion also needs to consider the broader question of how the equity market infrastructure should be paid for.
Consolidated data and NBIM
NBIM is a consumer of consolidated trade and quote data for US equities, primarily for reference and research purposes. However, we find that consolidated data by itself is insufficient for our needs, on both a pre- and post-trade basis. On a pre-trade basis, we consume direct feeds to gather depth-of-book information that guides our trading decisions. On a post-trade basis, the NBBO as defined by the consolidated tape processors, co-located at the main listing exchanges’ data centres, may differ from the physical reality experienced by market participants whose order routers are located elsewhere. While this difference manifests itself during only a minute fraction of the trading day as measured in calendar time, it remains significant when measured in trade time. Evaluating the trading performance of the broker/dealers whom we employ as agents to execute our trades requires us to use an aggregation of direct feeds that reflects the physical reality of their location.
We need to use direct feeds – or an aggregation of direct feeds – for two reasons: to gain information that is not contained in the consolidated tapes as they stand today, and to reflect the physical reality of the broker/dealers we are evaluating. The insufficiency of the consolidated tapes for our purposes is not due to their latency relative to direct feeds. However, empirically we find that algorithmic executions by broker/dealers cannot in general be competitive if they do not use direct feeds. Most of the time, the view of the market state based on the consolidated tapes, or on a feed from a third-party aggregator co-located with the broker/dealer, will be identical to that resulting from in-house processing of direct feeds. However, it is the times when the views do not coincide that have the greatest impact on algorithm performance.
In our experience, therefore, broker/dealers that do not undertake data aggregation in-house, and do not use the fastest connectivity available, will in general not be consistently competitive. This does not preclude using third-party technology to do the data aggregation, as long as it is done in-house to avoid incremental latency.
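To make the effect of physical location concrete, the simplified sketch below (using assumed transmission delays and synthetic quotes, not a description of any production system) shows how the best bid observed at one data centre can briefly differ from the best bid observed at another, simply because each location receives each exchange’s updates after different delays.

```python
# Simplified illustration (assumed delays, synthetic quotes): the best bid an
# observer sees depends on where that observer sits, because each exchange's
# updates arrive at each location after a different transmission delay.

# Quote updates as (send_time_us, exchange, bid), in chronological order.
updates = [
    (0,   "ExchA", 100.00),
    (0,   "ExchB", 100.01),
    (500, "ExchB", 99.99),    # ExchB lowers its bid at t = 500 microseconds
]

# Assumed one-way delays (microseconds) from each exchange to each observer location.
latency = {
    ("ExchA", "DC_A"):   5,  ("ExchA", "DC_B"): 180,
    ("ExchB", "DC_A"): 180,  ("ExchB", "DC_B"):   5,
}

def best_bid_seen(location, observe_time):
    """Best bid across exchanges, using only updates that have reached `location`."""
    latest = {}
    for send_time, exchange, bid in updates:
        if send_time + latency[(exchange, location)] <= observe_time:
            latest[exchange] = bid            # later updates overwrite earlier ones
    return max(latest.values()) if latest else None

for t in (200, 520, 700):
    print(f"t={t}us  DC_A sees {best_bid_seen('DC_A', t)}  DC_B sees {best_bid_seen('DC_B', t)}")
# Around t = 520us the two locations disagree: DC_B has already seen ExchB's
# bid reduction while DC_A has not.
```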
Based on these use cases for ourselves, and for the broker/dealers that we employ as agents to execute trades for us, the SEC’s proposed reform of market data infrastructure will have a differentiated impact.
We believe it unlikely that broker/dealers’ algorithmic offerings would be competitive using only consolidated feeds, even after the reform. The additional latency inherent in third-party aggregation is sufficient to ensure this. We expect this latency disadvantage of consolidated data for broker/dealers and other higher-turnover market participants to continue, and possibly even increase, in the future. ‘Smart order router’ (SOR) technology continues to evolve and to provide a richer set of strategies. One aspect of future SOR technology might be a physically distributed approach, which does not require knowledge of the full consolidated market state to trigger order placement or cancellation activity. In addition, we expect latency minimisation to continue to be a performance differentiator for broker/dealer algorithmic offerings, particularly for more opportunistic strategies. In our view, this will likely limit the market opportunity for consolidated tape offerings at institutional broker/dealers.
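As a purely hypothetical illustration of the physically distributed approach mentioned above, the sketch below shows a small per-venue component that reacts to its local direct feed alone, cancelling a resting child order when the local quote moves away, without waiting for a consolidated view of the whole market. The class names, threshold and logic are illustrative assumptions, not a description of any existing system.

```python
# Hypothetical sketch of a physically distributed smart order router component:
# a small agent co-located at one venue reacts to that venue's direct feed alone,
# cancelling a resting child order when the local bid moves away, without waiting
# for a consolidated view of the whole market. Names and threshold are illustrative.

from dataclasses import dataclass

@dataclass
class ChildOrder:
    side: str          # "buy" or "sell"
    limit_price: float
    active: bool = True

class LocalVenueAgent:
    """Runs next to one venue's matching engine and sees only that venue's feed."""

    def __init__(self, venue, reprice_threshold=0.02):
        self.venue = venue
        self.reprice_threshold = reprice_threshold
        self.resting = []

    def place(self, order):
        self.resting.append(order)

    def on_local_quote(self, bid, ask):
        # Local trigger: if the venue's bid has moved away from a resting buy order
        # by more than the threshold, cancel it immediately rather than waiting for
        # a consolidated snapshot from a central component.
        for order in self.resting:
            if (order.active and order.side == "buy"
                    and bid - order.limit_price > self.reprice_threshold):
                order.active = False
                print(f"{self.venue}: cancel buy @ {order.limit_price:.2f} "
                      f"(local bid now {bid:.2f})")

agent = LocalVenueAgent("VenueA")
agent.place(ChildOrder("buy", 100.00))
agent.on_local_quote(bid=100.01, ask=100.03)   # within threshold, keep resting
agent.on_local_quote(bid=100.05, ask=100.07)   # market moved away, cancel locally
```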
From an asset manager’s perspective and for our own data needs, on the other hand, we would expect that much of our use case for direct feeds would be eliminated if the SEC’s rule is implemented as proposed, and if there is a competitive consolidated tape offering with the processor physically located in the same data centre as the broker/dealers we employ as agents. The proposed inclusion of depth-of-book information is likely to be sufficient to provide us with the pre-trade transparency of market liquidity we require. The inclusion of additional data on auction imbalances, as well as of odd lots, would provide us with additional visibility.
For our post-trade use of consolidated data to conduct internal market structure and trading research, and to evaluate broker/dealer performance, the physical location of the processor is critical. The SEC’s proposal of a competitive processor market, with lowest-latency connectivity to exchanges ensured, provides the opportunity for such a processor to emerge. We would expect a processor located at the same data centre as most institutional broker/dealers to gather significant market share.
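The sketch below, using synthetic numbers of our own, illustrates why the benchmark’s physical location matters for this post-trade use: evaluating the same fills against the NBBO mid published by a processor located elsewhere, versus the NBBO mid observed at the broker/dealer’s own data centre, can produce different measured execution costs for the small fraction of fills that occur while the two views diverge.

```python
# Illustration with synthetic numbers: the same fills measured against two
# benchmark mids -- the NBBO mid published by a processor located elsewhere, and
# the NBBO mid observed at the broker/dealer's own data centre -- can yield
# different average execution costs when some fills occur during divergence windows.

fills = [
    # (fill price, processor NBBO mid at fill time, locally observed NBBO mid)
    (100.020, 100.015, 100.015),
    (100.060, 100.015, 100.055),   # fill during a brief divergence window
    (100.030, 100.025, 100.025),
]

def avg_cost_bps(fills, benchmark_index):
    """Average buy-side slippage, in basis points, versus the chosen benchmark mid."""
    costs = [(fill[0] - fill[benchmark_index]) / fill[benchmark_index] * 1e4
             for fill in fills]
    return sum(costs) / len(costs)

print(f"vs processor NBBO mid: {avg_cost_bps(fills, 1):.2f} bps")
print(f"vs local NBBO mid:     {avg_cost_bps(fills, 2):.2f} bps")
```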
Market structure impact of proposed rule
The SEC’s proposed rule on market data infrastructure is likely to have a significant impact on the broader market structure. We would encourage an analysis and a consideration of this impact. The first-order effects of the rule on investors such as ourselves are likely to be positive – it broadens our use case for standardised, consolidated market data, and reduces our need for direct feeds. The effects on broker/dealers and on higher-turnover market participants are more ambiguous – under the current market structure, it is unlikely that they would be able to substitute consolidated data for their lowest-latency, internally aggregated direct feeds. Meanwhile, the first-order effect on exchanges is likely to be negative, to the extent that some market participants can substitute the consolidated tape for direct feeds, and if the exchanges’ pricing power over direct feeds is limited.
It is likely that these first-order effects will lead to further, second-order effects. These may have an impact on the broader market structure. Current US equity markets, relative to their global peers, are characterised by considerably deeper liquidity, tighter spreads and higher trading volumes. This is partially due to the size of US capital markets and the heterogeneity of its participants’ objective functions. Another driver is the particular business model and regulatory environment in the US, which has generated robust competition across exchanges in attracting order flow, in contrast to market data provisioning services. This has resulted in exchanges being at best revenue neutral in their trading operations (particularly when excluding auction revenue). Capital expenditure and innovation are financed through their other business lines, including data feeds.
The second-order effects of the SEC’s proposed rule may impact this market model, and lead to changes in market characteristics such as liquidity depth, spreads and volumes. We would encourage the SEC and market participants to consider these potential impacts.
Conclusion
We support the SEC’s initiative on reforming the US equity market data infrastructure, and appreciate the thoroughness of the proposed rule. We believe that the rule has the potential to significantly modernise US equity market structure and to provide a more level playing field. We would expect this rule to increase the use case for consolidated data, including for institutional asset managers.
We agree with the details of the SEC’s proposal on the expanded definition of core data – including odd lots, some depth of book, and additional data on auctions. We believe that this is an exhaustive list of data to include, given current market structure and practices. However, it might be prudent to allow for further modification of the definition of core data as market structure evolves.
We also strongly agree with the SEC’s proposal on competing processors, including diversification in physical location. We believe that this proposal by itself increases the use case for consolidated data for many market participants, even in the absence of the proposed expanded definition of core data.
We agree with the SEC’s provisions for exchange data access for competing processors, including the availability of the fastest connectivity. However, we believe that there is complexity in ensuring the reliability of the consolidated feeds produced, given the generally inverse relationship between the speed of a data connection and its reliability. Processors and exchanges will have to develop a protocol for the conditional use of slower, more reliable feeds to safeguard the integrity of the consolidated output. This protocol will have to include details on who initiates the switch between different feeds, the documentation of the feed type used, as well as the pricing of redundant feeds.
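A minimal sketch of what such a conditional fallback could look like is shown below; the feed names, thresholds and switching logic are our own illustrative assumptions rather than a proposal for a specific protocol.

```python
# Illustrative sketch of a conditional feed-failover rule for a competing
# processor: consume a fast but less reliable feed, fall back to a slower and
# more reliable feed when the fast feed drops sequence numbers or goes silent,
# and keep an audit trail of every switch. Thresholds and structure are assumed.

import time

class FeedFailover:
    def __init__(self, max_gap_messages=1, max_silence_seconds=0.5):
        self.max_gap_messages = max_gap_messages
        self.max_silence_seconds = max_silence_seconds
        self.active = "fast"                      # feed currently driving the tape
        self.last_fast_seq = 0
        self.last_fast_time = time.monotonic()
        self.switch_log = []                      # who/when/why, for documentation

    def on_fast_message(self, seq):
        gap = seq - self.last_fast_seq - 1
        self.last_fast_seq = seq
        self.last_fast_time = time.monotonic()
        if self.active == "fast" and gap > self.max_gap_messages:
            self._switch("slow", f"sequence gap of {gap} messages on fast feed")
        elif self.active == "slow" and gap == 0:
            self._switch("fast", "fast feed delivering gap-free again")

    def check_heartbeat(self):
        silent_for = time.monotonic() - self.last_fast_time
        if self.active == "fast" and silent_for > self.max_silence_seconds:
            self._switch("slow", f"fast feed silent for {silent_for:.2f}s")

    def _switch(self, target, reason):
        self.switch_log.append((time.monotonic(), self.active, target, reason))
        self.active = target

processor = FeedFailover()
processor.on_fast_message(seq=1)
processor.on_fast_message(seq=5)     # gap of 3 messages -> switch to the slow feed
print(processor.active, processor.switch_log[-1][3])
```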
We believe that the SEC’s proposed rule on market data infrastructure needs to be evaluated in the broader context of US equity market infrastructure. The rule has the potential to provide for more competition – amongst exchanges, amongst broker/dealers and amongst liquidity providers. This should further improve the quality of US equity markets.
We appreciate having had the opportunity to comment on this important initiative and welcome any further questions or discussion.
Emil R. Framnes
Global Head of Trading, NBIM

Simon Emrich
Market Structure and Trading Research, NBIM
[1] See our Asset Manager Perspective on the ‘Role of Exchanges in Well-Functioning Markets’, http://www.nbim.no/en/publications/asset-manager-perspectives/2015/role-of-exchanges-in-well-functioning-markets/