As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank, examines the issues behind the tape, and what the buy-side wants.
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago. Not just for industry insiders, but for politicians and the outside world to be shown that these can be transparent markets, and that we are not penalised by misguided efforts to force volume onto lit markets, abolish dark pools or volume-enhancing factions such as certain HFT activities. In such a low-volume, low-commission environment I feel the stakes are especially high to get this right from the first day, and not let it drag on into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as ultimately higher costs for the buy-side are implicitly passed on to the broader (investing) public.
There were initially three routes that the European Commission (EC) wanted to take us down. The first route was to employ the same model as they did for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second option was for a prescribed non-profit-seeking entity to manage the CT, and the third option would be a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow where APAs (approved publication arrangements) collect and pass on the data to Consolidated Tape Provider(s) (CTP). But if market forces alone could find a compromise between cost and implementation, we would have an affordable and reliable European Consolidated Tape (ECT) in place already, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far, it hasn’t worked; incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short-term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR after all?).
Industry Initiatives
However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues which contribute more than 50% of total turnover. Their solution establishes a new commercial model for consolidated tape data which lowers the cost and incorporates the best practices recommendations developed by FPL and other industry working groups. The best practices provide details on how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best practices recommendations also bring together wide representation from across the industry, and have been concentrating on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
The COBA Project is spearheaded by two former exchange and MTF people, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge) I worry it will ironically signal the failure of the free-market approach, and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT, but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to come to a mutually beneficial solution BEFORE the regulator steps in and forces a US-style consolidated tape, thereby potentially removing the commercial benefits for exchanges and vendors.
Being a CTP in itself will be a tough business though, and I wonder if there’s such a thing as a commercially viable CTP proposition: not only will they operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT and instead getting direct feeds from the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect more investors might just accept a 15-minute lag and get the data for free.
Emma Quinn, AllianceBernstein’s Head of Asia Pacific Trading discusses accessing liquidity through dark pools, aggregation and asset allocation.
Trading Volumes, Liquidity and Asset Allocation
I think that you’ll see trading volumes rise when you get an asset allocation back into equities, and people have more conviction in the markets. The reason that there’s just no liquidity in the markets is not because people are worried about exchange mechanisms or aspects like that; it’s about the macroeconomic environment and the allocation into equity.
I don’t think that we’re going to see volumes in other asset classes recover faster than allocation into equities as we’ve already seen that allocation change. People are either bullish or bearish, and are set for what they think is going to happen. And so we are in a position that people will just trade around their positions without making any significant move either way until we get some clarity on the macroeconomic environment.
The Rapid Expansion of Dark Pools and Access to Desirable Liquidity
We use dark pools to access liquidity for orders we would not normally place in the central limit order book. I think dark pools aid price discovery. There has to be post-trade transparency, but once that happens you’ve actually got more transparency on a market than you normally would. In this sort of environment, where what used to be 10% of average daily volume is now 30% of average daily volume, you’re not putting out so much into a central limit order book, and you’re obviously leaving more in a dark pool if your order size hasn’t changed. I do think dark pool liquidity aids you, as your expected cost is going to be lower, and thanks to post-trade transparency in dark pools the market sees a block trade that it would not have seen.
With regard to desirable liquidity, I think the onus is on the buyside to actually put in parameters that can minimize risk. Obviously, you don’t want to go into a dark pool blindly. The same thing could be said of going on to the central limit order book. The same thing can happen to you on a central order book as can in a dark pool, if you’re not smart about the way you trade in a fragmented environment you leave yourself open to be gamed.
Best Execution
We have an unbundled commission policy and as such our traders are not limited to paying based on a research vote. We have the discretion to use the broker that will give us the best execution outcome. This discretion is important and enables us to focus purely on the best execution outcomes for our clients.
Impact of Direct and Indirect Costs Imposed on Buy-side Traders by Liquidity Fragmentation
We spend a lot of time on quantitative trading strategies and both pre- and post-trade cost analysis. We have pre-trade expected costs in our trader management system, and we also look at post-trade analysis, both daily and weekly, as it is not enough just to look at one trade in isolation when so many factors can contribute to whether you have got a trade right or wrong.
Some of the cost of fragmentation has already been borne by the buy-side and sell-side, such as having to build smarter systems and employ quantitative trading. Brokers are now bearing additional costs, with some regulators looking to recoup the increased surveillance costs that come with a fragmented market. The brokers may have made savings due to the fact that multiple markets brought a compression of exchange fees, but they could well and truly be paying that out now to regulators.
Schroders’ Head of Asian Trading, Jacqueline Loh, shares her thoughts on trading in Asia, offering comments on which markets are primed for change, how to find value in dark pools and whether unbundling is as useful as people say it is.
Fragmentation arising from multiple sources of liquidity is a necessary step in the evolution of best execution and in the long term, fragmentation will increase the quality of trade executions in Asia. What it means for the buy-side is investment in infrastructure spending to develop new order routers and the like, so we can electronically seek out and have exposure to multiple liquidity sources. For the sell-side, it means acceptance that there will be more competition for the same block of business in the marketplace. It means different things for different buy-side firms as well.
When I think about the investor ID markets in Asia, I am not sure any model is particularly productive because ID markets make it administratively more difficult to trade. IDs can make best execution very difficult to implement, especially if cash and stock checking is the primary consideration. Some of the ID markets, namely Taiwan and Korea, allow trading through omnibus accounts and that seems to be the way it is evolving. The ID markets are slowly going away, but having said that, the most productive example is probably China because the brokers seem to have a handle on exactly how much cash and stock you have in your account, and therefore how much you can sell and buy. You cannot overspend or oversell, and it is relatively easy to take part in IPOs.
Trade allocation used to be a problem with investor IDs; for example, explaining to compliance and regulators why the prices are not exactly the same between accounts. In these cases the use of omnibus accounts really helps. Executing through an omnibus ID means you know exactly what is in an account and do not experience many of the issues associated with overselling or settlement. It is a lot cleaner.
With retail-heavy markets, anonymity is the primary consideration for us. We tend to trade more using electronic means and make use of dark pools in retail-heavy markets. In addition to that, the algos we use will be more price-specific, rather than volume-participation models, which are more price impacting.
Best Execution, in the Dark?
You would think that dark pools would have more success in markets where spreads are currently wide and there is a need to be anonymous, which would imply ASEAN markets. In practice, however, they have had more success in Hong Kong, and that is because there are more users of electronic trading there. Perhaps the users are a little more sophisticated as well, insofar as they are willing to take accountability for their executions, which is, in fact, what defines electronic trading.
In our experience, dark pools make a difference in terms of liquidity; the question, however, is what creates that difference? Is it the electronic trading system feeding through the dark pool that provides the benefit, or is it the dark pool itself? I would say it is the former, but that may depend on each user. I hope the Securities and Exchange Board of India will consider further change, including allowing stock crossings and clarifying the rules regarding P-Notes.
Mizuho Securities’ Spyridon Mentzas discusses the status of the Japanese exchange merger and offers thoughts on how well the two systems will merge and the benefits investors can expect.
Compatibility
The merger of Tokyo Stock Exchange (TSE) and Osaka Securities Exchange (OSE) is not yet finalized, but it appears they will merge in the beginning of 2013, with the details yet to be specified. The first impression is that they have nearly identical trading rules with some minor differences, such as the OSE trading until 3:10, while the TSE closes at 3:00. When the TSE decided to shorten the lunch break in November, the OSE did the same. When one of the exchanges (usually the TSE) changes the rules, the other moves in tandem: for example, changing the tick sizes. If the merger does go ahead, it is likely that they are going to use the TSE’s cash system, arrowhead, and the OSE’s J-GATE for derivatives. They will not need to run the old systems in parallel, which will achieve a reduction in cost because they will not have to maintain two systems.
Further Industry Consolidation
The ECNs in the US enjoyed technological superiority versus the classic exchanges, where NYSE’s latency was significantly slower than Arca’s. This would have been reason enough for the TSE to consider buying a PTS, but with arrowhead’s current latency of less than 2 milliseconds (and another upgrade in the next few months targeting less than a millisecond), simply buying a PTS would not give them a noticeable advantage because the TSE and OSE are on par with the PTSs. The reason why PTSs are increasing their market share is that, unlike in the UK and US, where Reg NMS and MiFID have required trading on the exchange with the best price, in Japan the PTSs draw volume through decimal pricing and smaller tick sizes than the incumbents.
For example, Mizuho Financial Group might trade on the TSE at 105 yen bid, 106 yen offer. That one yen spread is close to 100 basis points, or almost one percent, whereas the PTS trades in 0.1 yen increments. This is a major incentive for investors to buy and sell on the PTSs, with their smaller increments, to reduce market impact and trading costs. From the beginning, the regulators have not been overly concerned with the PTSs deciding to trade in decimal places and have 0.1 yen ticks. It was always up to the PTSs to decide, and the TSE could do the same. If anything, I think the new exchange would rather reduce their tick sizes than merge again.
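To make the tick-size arithmetic concrete, here is a minimal sketch of the spread calculation. The TSE quote is the illustrative 105/106 level above; the PTS quote is an assumed example of the same stock quoted in 0.1 yen increments:

```python
def spread_bps(bid: float, ask: float) -> float:
    """Quoted spread in basis points, measured against the mid price."""
    mid = (bid + ask) / 2.0
    return (ask - bid) / mid * 10_000

# TSE quote constrained to a 1-yen minimum tick: 105 bid / 106 offer
tse = spread_bps(105.0, 106.0)   # roughly 95 bps, "almost one percent"

# Hypothetical PTS quote for the same stock at 0.1-yen ticks
pts = spread_bps(105.4, 105.5)   # roughly 9.5 bps

print(f"TSE spread: {tse:.1f} bps, PTS spread: {pts:.1f} bps")
```

The order-of-magnitude reduction in quoted spread is the incentive the author describes for routing to the PTSs.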
However, not all participants would be happy to see new tick sizes, for example, some of the proprietary houses or small firms that trade with retail, as altering their downstream systems to handle decimal places would be costly.
This will also create a fragmentation of liquidity in tick sizes. The bids and offers on the TSE are often thick, with something like 50 billion shares sitting on the bid side, so with 0.1 yen ticks, the average order size might move to 3 million or 1 million shares. Traders who want to buy a large lot will have to scroll up and down to find out how much they have to go up to absorb the available liquidity. I think for the traditional long-only traders, this might mean an increased scattering of liquidity. There is sufficient liquidity in the market at present; even for stocks trading at a low price – there are market makers trying to make 1% during the day. If smaller tick sizes are introduced, that liquidity will likely be scattered or disappear.
Annie Walsh of CameronTec spoke to FX users to better understand the topical issues and challenges facing the OTC Foreign Exchange market and the central role FIX can play in addressing these challenges.
Undoubtedly, the capital markets in 2011 will be remembered for many history-making moments, including some of the largest currency moves the market can remember. We have witnessed the global foreign exchange market — the most liquid financial market in the world, with an average daily turnover in the vicinity of USD4 trillion — bear the brunt of one political crisis after another, causing widespread volatility and difficult-to-pick currency moves.
Currency friction in Europe and between the US Administration and China will no doubt remain a prominent feature of the global economy for at least the next 1 – 2 years. On top of this remains uncertainty of government, particularly in Europe, and the implications for continuity of fiscal and monetary policy.
Many investment banks, too, in their search for alpha, have been left wondering “where did the black box get it wrong?” following lacklustre P&L performance, almost industry-wide, over recent months.
Without a formal open or close, the FX market presents a true ‘follow the sun’ global market, with inherent levels of opportunity and risk.
Against this uncertain backdrop, the FIX Protocol has great potential to feature centrally in what is undoubtedly the single greatest threat (opportunity, if you prefer) facing the global OTC FX market: structural uncertainty compounded by impending regulatory change, ushered in courtesy of Dodd-Frank and MiFID II and III.
Due to the over-the-counter (OTC) nature of currency markets, with no unified or centrally cleared market for the majority of trades and little cross-border regulation, these are rather a number of interconnected marketplaces where different currencies’ instruments are traded. Inevitably, OTC FX will move, however grudgingly, away from its long-standing (self-serving) model of self-regulation, toward greater levels of transparency, regulatory oversight (either directly or indirectly) and centralised clearing.
A Two Speed FX Market
As currently drafted, spot, outrights and swaps are to be exempt from Dodd-Frank’s requirement to be traded via Swap Execution Facilities (SEFs) and be centrally cleared; FX options, Cross Currency (CCY) swaps and Non-deliverable Forwards (NDFs), however, are not. A perhaps unintended consequence of this two-speed approach is the potential for jurisdictional arbitrage, product/financial re-engineering and further fragmentation of execution venues and liquidity.
In the short term, it also means that the sell-side needs to fundamentally reconsider strategies for design, development and deployment of Single Dealer Platforms (SDPs). Multi asset class SDPs will now necessarily evolve to become simultaneously both an execution venue as a destination and a gateway to a SEF, depending on the instrument traded.
BNP Paribas Dealing Services Asia’s Francis So opens up about their new structure, how they use Transaction Cost Analysis (TCA) and their preferences regarding dark pools and High Frequency Trading (HFT) flow.
The Hong Kong dealing desk has been restructured as an externalised/outsourced dealing desk for the buy-side. As a result we are now independent of the asset management group and belong to BNP Paribas Securities Services. Our current name is BNP Paribas Fin’AMS Asia Ltd but this will soon change to BNP Paribas Dealing Services, better reflecting the services we provide. BNP Paribas Securities Services provides middle and back office outsourcing services for buy- and sell-side, as well as corporate clients. This new dealing service allows us to provide a full suite of front to back office solutions to meet the needs of the clients. The trend has been for the outsourcing of back office activities and I think it is only a natural progression to consider front office activities. Given the market environment, cost reduction is a key element for asset managers/asset owners. Outsourcing the dealing activity can help reduce cost but more importantly allows the asset manager to focus on delivering greater value to their clients. Our Paris office has been very successful in attracting external clients and in Asia we plan to ramp up activity in 2012.
We treat BNP Paribas Investment Partners (the asset management company of the Group) as one of our most sophisticated clients and as such must ensure that the services provided to them are kept to the highest standard. This will be the same for new clients as one of the keys to attracting and maintaining new client relationships is our ability to provide tailor made solutions and services. Clients can range from new start-ups to existing asset managers that already have a dealing desk. We offer flexibility to asset managers such that they can choose the asset class and/or geographical region they want to outsource. For example, some asset managers that already have dealing capabilities in their home market may decide to invest in overseas markets or new asset classes. They need to ask themselves whether it makes sense from a cost perspective to create a new dealing desk where initial volume is expected to remain low.
We have the knowledge, the expertise and the global reach. We have locations in Europe and Asia to cover all asset classes globally. We also serve fund managers located in different geographical regions.
It is important to stress that we are in no way competing against the sell-side. Our clients keep their contractual and daily relationships with brokers. We act as an agency-only trading desk and we do not have any prop flow or take any positions.
We work together with the portfolio manager to determine what benchmarks best suit their needs. They are able to send orders to our global Order Management System (OMS) with a specific benchmark. By doing so, we can measure our execution performance using their specified benchmark, be it Implementation Shortfall (IS), VWAP or a specific measurable benchmark.
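As an illustration of the benchmark measurement described above, here is a hedged sketch of how execution performance might be scored against a VWAP or Implementation Shortfall benchmark. The fill prices, quantities and benchmark levels are invented for the example, not taken from the desk's actual system:

```python
def vwap(fills):
    """Volume-weighted average price of a list of (price, quantity) fills."""
    notional = sum(price * qty for price, qty in fills)
    total_qty = sum(qty for _, qty in fills)
    return notional / total_qty

def slippage_bps(exec_price, benchmark, side):
    """Signed slippage versus a benchmark in bps; positive = underperformed."""
    sign = 1 if side == "buy" else -1
    return sign * (exec_price - benchmark) / benchmark * 10_000

# Hypothetical buy order worked in two fills
fills = [(10.02, 4_000), (10.04, 6_000)]
avg_px = vwap(fills)                              # 10.032

print(slippage_bps(avg_px, 10.01, "buy"))         # vs an interval VWAP benchmark
print(slippage_bps(avg_px, 10.00, "buy"))         # vs arrival price (IS style)
```

The point is that the same execution scores differently against different benchmarks, which is why the benchmark is agreed with the portfolio manager up front.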
AllianceBernstein’s Global Head of Quantitative Trading, Dmitry Rakhlin, discusses the problem of fragmentation and what makes a good aggregator, along with Ned Phillips of Chi-East, Greg Lee of Deutsche Bank, Steve Grob of Fidessa and Instinet’s Glenn Lesko.
Dmitry Rakhlin, AllianceBernstein
How does aggregation improve trading and best execution?
Institutional traders usually demand (remove) liquidity from the markets, which in turn creates market impact. Being able to interact with aggregated liquidity (i.e. all available liquidity) lowers this market impact. Aggregated liquidity also gives a trader the ability to interact with many more liquidity sources, randomizing the way the liquidity is taken from the market. This decreases the amount of information leakage and protects the trade from being exploited by predatory strategies.
Does aggregation spell the end of fragmented markets?
No. The US equity market is highly fragmented, yet all liquidity centers are interconnected, which allows traders to build various aggregator strategies. No doubt, there is cost and complexity associated with this. Fragmentation also introduces so-called latency arbitrage and a potential increase in information leakage (which can be drastically reduced by using appropriate trading strategies).
The positive aspect of fragmentation is that it creates rich market microstructure (traditional exchanges and exchanges with inverted fee structures, block crossing networks, auctions, conditional order types, aggregators of retail liquidity, etc.). These choices give the buy-side the ability to match their investment strategies to the appropriate liquidity sources and ultimately benefit by being able to trade more nimbly and at lower cost.
From your perspective, is aggregation about greater access to liquidity or reducing trading costs?
Both.
Ned Phillips, Chi-East
How does aggregation improve trading and best execution?
A good aggregator brings order to fragmented markets by concentrating order flows, and liquidity, from a large number of matching venues. It is a tool that allows all participants to access multiple venues from one easily accessible point, reducing the technology costs and other difficulties involved in monitoring different trading venues.
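The "one easily accessible point" idea can be sketched minimally as follows: a consolidated view of ask queues across several venues. The venue names and quotes are hypothetical, and a real aggregator would of course handle live feeds, both sides of the book and fees:

```python
def aggregate_asks(books):
    """Merge per-venue ask queues into one consolidated view, best price first.

    `books` maps a venue name to a list of (price, size) asks.
    Returns a list of (price, venue, size) sorted from best ask upward.
    """
    merged = []
    for venue, asks in books.items():
        for price, size in asks:
            merged.append((price, venue, size))
    merged.sort()  # a single, price-ordered picture of fragmented liquidity
    return merged

# Illustrative books on two hypothetical venues
books = {
    "VenueA": [(10.05, 500), (10.07, 300)],
    "VenueB": [(10.04, 200), (10.06, 400)],
}
for price, venue, size in aggregate_asks(books):
    print(f"{size} @ {price:.2f} on {venue}")
```

The participant monitors one consolidated ladder instead of one screen per venue, which is the cost reduction the answer describes.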
Does aggregation spell the end of fragmented markets?
No. Even if one good aggregator attracts a majority of trading flows, it would not represent a throw-back to a single exchange monopoly. Aggregators are there to make the process of using multiple markets easier and more efficient and can only exist as long as participants have a choice of matching venues.
What are the risks inherent in aggregation and how can an aggregator ensure improved execution?
Theoretically there is a risk that an aggregator will be so successful that it monopolises the market but competition and risk management would keep things in check.
An aggregator ensures improved execution by concentrating liquidity, which tightens spreads.
Nomura’s Jeremy Bruce summarises the current state of play in terms of European liquidity venue fragmentation, and focuses specifically on venue ownership and geographical concentration of equity execution venues.
Ownership and Location of European Equity Trading Venues
In the past two years we have seen not only increasing liquidity fragmentation in Europe, but a significant change in the pecking order of exchange and venue size. The diagram below lists all venues with a market share of greater than 1%, as well as referencing other smaller venues. As can be seen, it shows both the rise of venues such as Chi-X Europe and BATS, and the proliferation of lit and dark venues owned by the pre-existing exchanges. Chi-X Europe, in particular, is now comfortably the largest pan-European venue. There are currently two proposed mergers on the table: the first between NYSE Euronext and Deutsche Boerse, and the second between Chi-X Europe and BATS.
The old model of a country having a primary exchange located within its borders (normally in the main financial district), where its companies’ stocks almost exclusively trade is no longer relevant. As corporate ownership of the manifold liquidity venues becomes more complex and blurred, it is perhaps more meaningful to look at the actual location of the exchange. When we say exchange, we are actually referring not to the administrative or corporate headquarters of the exchange firm, but to the location of the IT infrastructure that runs the actual live exchange matching engine. This location is then a physical data centre building, with an additional failover backup site.
Australian Securities and Investments Commission’s (ASIC) Greg Yanco tells FIXGlobal how Australian markets are preparing for the future, including the launch of Chi-X Australia.
What are ASIC’s goals for an Australian consolidated tape?
ASIC has consulted with industry in relation to options to consolidate data from all venues. In Consultation Paper 145: Australian equity market structure: proposals (CP145), two options were put forward: a single provider established by tender process, or multiple providers approved by ASIC. Respondents overwhelmingly preferred the multiple consolidator model. As we have previously stated in the Response to Submissions on CP145 Australian equity market structure: proposals (REP237), this was based on industry expectation that existing data services can produce the most efficient outcome for users.
Submissions also overwhelmingly supported the proposal that market operators should be obligated to provide information to consolidators on a non-discriminatory basis in order to maintain a level playing field. While ASIC expects that more than one consolidator will emerge in Australia, if it becomes apparent that no industry solution is likely to eventuate to consolidate data from all markets, ASIC may revisit the issue and consider introducing a single consolidator via a public tender process.
Are additional clearing agents needed in Australia?
ASIC is aware that the topic of additional clearing agents in Australia is indeed a timely one. It is currently in discussion between industry representative bodies, ASXClear and RBA (as the regulator in the clearing & settlement space) with ASIC as an observer. Issues have arisen as to both quality and quantity of third party clearers, as ASXClear seeks to significantly increase minimum capital adequacy requirements for clearing participants, in particular third party clearers.
It is an area that ASIC will continue to monitor and a discussion that will be followed with great interest, both inside and outside of ASIC. We look forward to continued frank and candid discussions with the financial industry in this space.
How can smart order routing be most effective?
Smart Order Routers (SORs) will assist market participants in meeting their best execution obligations in a multi-market environment. Some participants will use SORs developed by independent service providers, and some will build their own systems in-house.
Trading participants will be able to route orders automatically to different venues depending on specified criteria. In a multi-market environment the routing of orders could be split across venues depending on liquidity. In this way, ASIC expects that clients will receive a better outcome overall, particularly retail clients, who will receive the best price across the markets unless they instruct otherwise (e.g. for an order to be executed with an emphasis on speed, rather than price).
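A toy sketch of the routing logic described above: splitting a marketable buy order across venues by best displayed price. The venue names, quotes and sizes are hypothetical, and a production SOR would also weigh fees, latency and fill probability:

```python
def route_order(qty, asks):
    """Split a marketable buy order across venues, best displayed price first.

    `asks` is a list of (venue, price, size) quotes; returns the child
    orders as (venue, price, quantity) tuples until `qty` is exhausted.
    """
    plan = []
    for venue, price, size in sorted(asks, key=lambda quote: quote[1]):
        if qty <= 0:
            break
        take = min(qty, size)          # sweep only what is displayed
        plan.append((venue, price, take))
        qty -= take
    return plan

quotes = [("Venue1", 25.10, 300), ("Venue2", 25.09, 200), ("Venue3", 25.11, 500)]
print(route_order(600, quotes))
# fills 200 @ 25.09, then 300 @ 25.10, then the final 100 @ 25.11
```

This is the sense in which a retail client "receives the best price across the markets": each slice of the order goes to whichever venue displays the best remaining quote.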
Does ASIC seek to encourage high frequency trading in Australia? If so, under what terms?
ASIC neither encourages nor discourages the practice. High frequency trading is, however, an area that is continuously monitored by ASIC, and we will respond if necessary to ensure that any such activity does not interfere with market integrity or with obligations to maintain fair, orderly and transparent markets. In addition, market operator platforms must have adequate and scalable throughput capacity.
In the coming months, ASIC will release a consultation paper (CP) pertaining to the broader enhanced market structure. This CP will discuss, among other things, the issue of market makers in the cash equity products. We look forward to industry feedback to this CP when released in the next few months.
RCM’s Head of Asia Pacific Trading, Kent Rossiter, unmasks the Asian trading scene, sharing insights into how RCM navigates the unlit landscape, identifying the effects of dark liquidity and highlighting ways brokers can facilitate better buy-side decision making.
FIXGlobal: What are the main benefits of dark liquidity in Asia?
Kent Rossiter, RCM: One of the major challenges in Asia has always been accessing liquidity without other parties in the market taking advantage of your position and your need to complete the order. In cases where liquidity is scarce, knowledge that a relatively large order is being worked can expose investors to various risks. In such situations, it is advantageous to keep knowledge of the deal discreet until the order is filled. In dark pools run by brokers we can get priority on our orders through queue-jumping.
Dark pools support such an approach as they allow large block orders to be worked without showing size. In this way, trading in dark pools allows a trader to access a broker’s own internal order flow without being gamed by the market, which would otherwise risk non-fulfillment or less efficient pricing. As a result, size trading becomes the norm in dark pools and a trader gets to see blocks that may never have been available otherwise. With no information leakage we are not disadvantaged by the fading you see on lit venue quotes. From a personal perspective, the challenges that arise from dealing across a number of venues and the resulting increased use of technology make the role more exciting and satisfying.
FG: How do you limit information leakage in dark pools?
KR: With the exception of broker internalization engines, the trade sizes found in dark pools are often multiples of what they are on the exchange. So having fewer, but larger, prints reduces information leakage, and in many cases we can get done on our size right away. Minimizing the number of times a print hits the tape reduces the chance of this footprint being picked up and working against the balance of your order. That said, broker internalization engines do their part well, keeping any spread savings between the broker’s two clients instead of giving it up to the general market.
FG: If you decide to seek dark liquidity, how do you decide between broker internalizers and block crossing networks?
KR: The types of dark venues used for various trades differ between block crossing networks and broker internalizers. As I mentioned, brokers for the most part are matching up little prints that otherwise would have been time-sliced in the general market, and when using these venues the goal is often to save a few basis points along the way while you work an order. You are not often micro-managing each fill, but through the process we are getting spread capture and price improvement. The stocks traded in these internalization engines tend to be larger, more liquid names; the type of orders often worked by algos.
Block crossing networks on the other hand, while still matching up electronically, are probably more confidential, and take up the function of what brokers still do upstairs - putting blocks together - so size is the real focus here. Both types of dark pools use the primary market for price sourcing since the vast majority of trades get printed at or within the best bid and offer. As the primary markets become too thin, it can cause price formation problems.
While it is not specific to the consideration of dark pools as an extra execution venue, we have to consider potentially increased book-out costs if we use dark pools (except via aggregators, since we would only be using one counterparty), just as we have had to for years when deciding whether to execute a block with a single broker versus multiple counterparties. As dark pools proliferate, there is an increased chance that we may not have part of our order in a given pool at just the right time to take advantage of flow that may be parked there. Dark pool aggregators are aiming to provide the buy-side with solutions to this.