
Monday, October 16, 2017

Black Monday 2.0: The Next Machine-Driven Meltdown

Illustration by Brian Stauffer
Black Monday. Although the event to which those two words refer occurred 30 years ago, they still carry the weight of that day—Oct. 19, 1987—when the Dow Jones Industrial Average shed nearly a quarter of its value in wave after wave of selling.
No one in living memory had seen anything like it, at least not in the U.S., and in the postmortems conducted to understand just how the Dow managed to drop 508 points in one day, experts found a culprit: so-called portfolio insurance, a quantitative tool designed to use futures contracts to protect against market losses. Instead, it created a poisonous feedback loop, as automated selling begat more of the same.
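To make that feedback loop concrete, here is a toy simulation; every parameter in it is hypothetical, chosen only to show how hedge selling and falling prices can chase each other.

```python
# Toy simulation of a portfolio-insurance feedback loop (all parameters are
# hypothetical): a price drop triggers hedge selling, that selling pushes the
# price down further, and the models then react to the new decline.

def simulate_feedback(initial_shock=-0.04, hedge_sensitivity=2.0,
                      price_impact=0.3, rounds=8):
    price = 100.0
    move = initial_shock * price  # the exogenous shock, in price points
    for i in range(rounds):
        price += move
        # Insurers add to their short-futures hedge in proportion to the drop;
        # that extra selling moves the price by `price_impact` per unit sold.
        forced_selling = hedge_sensitivity * abs(move)
        move = -price_impact * forced_selling
        print(f"round {i + 1}: price {price:6.2f}, "
              f"forced selling {forced_selling:5.2f}")

simulate_feedback()
```

With these illustrative numbers, the loop damps out, turning a 4% initial shock into roughly a 10% total decline. If the product of hedge sensitivity and price impact reaches 1, it never damps out at all.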
Since that day, markets have rallied and markets have tumbled, and still we marvel at the unintended consequences of what, in hindsight, was an obviously misguided strategy. Yet in the ensuing years, market participants have come to rely increasingly on computers to run quantitative, rules-based systems known as algorithms to pick stocks, mitigate risk, place trades, bet on volatility, and much more—and those systems bear a resemblance to the ones blamed for Black Monday.
The proliferation of computer-driven investing has created an illusion that risk can be measured and managed. But several anomalous episodes in recent years involving sudden, severe, and seemingly inexplicable price swings suggest that the next market selloff could be exacerbated by the fact that machines are at the controls. “The system is more fragile than people suspect,” says Michael Shaoul, CEO of Marketfield Asset Management.
THE RISE OF COMPUTER-DRIVEN, rules-based trading mirrors what has happened across nearly every facet of society. As computers have grown more powerful, they have been able to do what humans were already doing, only better and faster. That’s why Google has replaced encyclopedias in the search for information, why mobile banking is slowly replacing bank branches, and why—someday—our cars will be able to drive us to work. And it is also why Wall Street has embraced computers to help with everything from structuring portfolios and trading securities to making long-term investment decisions.
In the years since 1987, huge strides have been made in understanding what drives stock performance and how to apply it to portfolio construction. At first, researchers focused on “factors,” such as a stock’s volatility relative to the market—known as beta; whether a stock is large-cap or small—the size factor; and whether it is cheap or expensive—the value factor. More recently, the use of factors has proliferated to include many others, such as quality and momentum. (The latter involves buying the best-performing stocks and shunning the worst performers.)
Quantitative investors understood early on that betting on stocks based on their characteristics—and not the underlying business fundamentals of a particular company—was a good way to outperform the market. So good, in fact, that many fundamental, or “active,” money managers now use quantitative tools to help construct their portfolios and ensure that they don’t place unintended bets. Nomura Instinet quantitative strategist Joseph Mezrich says that 70% of an active manager’s performance can be explained by quantitative factors. “Factors drive a lot of the returns,” Mezrich says. “Over time, this has dawned on people.”
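Figures like Mezrich's 70% come from attribution analyses of roughly this shape; the sketch below uses synthetic data and made-up factor loadings, so only the method, not the numbers, reflects the real exercise.

```python
# A minimal sketch of factor attribution, using synthetic data throughout:
# regress a fund's returns on factor returns; the R-squared is the share of
# performance "explained by quantitative factors."
import numpy as np

rng = np.random.default_rng(0)
n_periods = 250

# Hypothetical daily factor returns: market (beta), size, value, momentum.
factors = rng.normal(0.0, 0.01, size=(n_periods, 4))

# A hypothetical active fund: mostly factor exposure plus stock-picking noise.
true_loadings = np.array([1.0, 0.3, -0.2, 0.4])
fund = factors @ true_loadings + rng.normal(0.0, 0.006, size=n_periods)

# Ordinary least squares: estimated loadings and the variance they explain.
X = np.column_stack([np.ones(n_periods), factors])
coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
residuals = fund - X @ coef
r_squared = 1.0 - residuals.var() / fund.var()

print("estimated factor loadings:", np.round(coef[1:], 2))
print(f"share of performance explained by factors: {r_squared:.0%}")
```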
Has it ever. One result has been the rise of indexing and exchange-traded funds. The ability to buy an index fund based on the Standard & Poor’s 500—effectively a bet that large companies will outperform small ones—made traditional fundamental research and stock-picking unnecessary. Since then, indexes and ETFs have been created to reflect just about any factor imaginable—low volatility and momentum among them. Some funds even combine multiple factors in a quest for better performance.
As a result, an increasing amount of money is being devoted to rules-based investing. Quantitative strategies now account for $933 billion in hedge funds, according to HFR, up from $499 billion in 2007. And there’s some $3 trillion in index ETFs, which are, by definition, rules-based. The upshot: Trillions of dollars are now being invested by computers. “We’ve never seen so many investment decisions driven by quantitative systems,” says Morningstar analyst Tayfun Icten.
That’s quite a change from the 1980s. If you wanted to place a trade 30 years ago, you picked up the phone and called your broker; your broker called the firm’s trader; the trader would ring up a specialist, the person in charge of running trading in a given stock; and the trade would be executed. The process was slow, cumbersome, and inefficient. As computer technology advanced, machines gradually took most of these steps out of the hands of humans. Today, nearly every trade is handled by an algorithm of some sort; it is placed by a computer and executed by computers interacting with one another.
The entity handling trades isn’t the only thing that has changed in the past 30 years. Trading now occurs in penny intervals, not fractions such as eighths and 16ths. While that has made it cheaper for investors to buy and sell a stock, pennies made trading far less lucrative for market makers, who historically profited by playing the “spread” between the highest bid to buy and the lowest offer to sell. Consequently, market makers have been replaced by algorithms programmed to instantaneously recognize changes in liquidity, news flow, and other developments, and respond accordingly. At the same time, the proliferation of exchanges helped to lower trading costs but also created a fragmented market that can make shares hard to find during dislocations.
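The quoting logic that replaced human market makers can be caricatured in a few lines. The parameters below are entirely hypothetical; the point is the shape of the rule: spreads widen as short-term volatility rises, and beyond some threshold the algorithm stops quoting altogether, which is exactly what makes shares hard to find during dislocations.

```python
# A caricature of an automated market maker's quoting rule (parameters are
# entirely hypothetical): the spread widens with short-term volatility, and
# past a threshold the algorithm stops quoting altogether.

def quote(fair_value, volatility, base_spread=0.02, vol_multiplier=200.0,
          max_volatility=0.05):
    """Return a (bid, offer) pair, or None if the algorithm steps away."""
    if volatility > max_volatility:
        return None  # liquidity withdrawn, exactly when it is needed most
    half_spread = (base_spread + vol_multiplier * volatility * base_spread) / 2
    return (round(fair_value - half_spread, 2),
            round(fair_value + half_spread, 2))

for vol in (0.005, 0.02, 0.06):
    print(f"volatility {vol:.1%}: quotes {quote(100.0, vol)}")
```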
Most of the time, none of this matters. If you want to buy a stock, you boot up your computer, log in to your brokerage account, and place an order that gets filled almost immediately. The fee you pay is so low that it would have been unimaginable 30 years ago. The system has worked well for individual investors, and will continue to do so—as long as nothing goes wrong.
BUT MISTAKES HAPPEN. In 1998, the “quants” at Long-Term Capital Management, the hedge fund whose partners included Nobel laureates Myron Scholes and Robert Merton, nearly caused a massive market selloff when its highly leveraged trades, based on quantitative models of expected market behavior, suddenly lost money after Russia unexpectedly defaulted on its debt. The damage was magnified by the borrowing that LTCM had used to supersize its bets. Only a bailout organized by the Federal Reserve prevented the broad market from plummeting.
In August 2007, a selloff occurred in quantitative funds that would become known as the “quant quake.” To this day, no one knows what sparked the selling, but once it began, computer models kicked in, causing further selling. Humans added to the mess as risk managers looking at losses dumped shares. Funds specializing in quantitative investment strategies reportedly suffered massive losses: The Renaissance Institutional Equities fund was thought to have lost nearly 9% early in that month, while Goldman Sachs’ Global Alpha suffered a double-digit decline.
The impact on the market wasn’t huge—the S&P 500 dropped just 3.3% during the first two weeks of August—but the event demonstrated what happens when a trade sours and too many funds are forced by their models to sell at the same time. It was a wake-up call for quants, who have since created more-sophisticated systems to reduce the kind of crowding that led to the selloff.
More recently, problems have been caused by algorithms that are supposed to provide stock for investors to buy, or buy when investors sell, creating liquidity. On May 6, 2010, the S&P 500 dropped 7% in just 30 minutes, as bids and offers for stocks moved far away from where stocks had been trading, in some cases leaving bids down as low as a penny and offers as high as $100,000.
Again, no one knows what caused the sudden decline. Investors had been on edge because of an unfolding European debt crisis, but that alone seemed unlikely to have triggered the flight of automated market makers. The U.S. Commodity Futures Trading Commission blamed the swoon on fake orders placed by a futures trader, while the Securities and Exchange Commission fingered a massive sell order in the futures market allegedly placed by a mutual fund company seeking to protect itself from a potential downturn. That order, it argued, had been handled by a poorly designed algorithm—yet another reminder that an algorithm is only as good as the inputs used by the people designing it.
While the rout was over quickly, and the S&P 500 finished the session down a more modest 3.2%, the episode raised concerns about the potential for computerized trading to exacerbate selloffs.
REGULATORS AND EXCHANGES have made changes since then, but so-called flash crashes continue to happen, though none has approached the severity of the 1987 selloff. On Aug. 24, 2015, for instance, the Dow dropped almost 1,100 points during the first five minutes of trading. The selloff was spurred by a plunge in China’s stock market, which led to a drop in Europe. All of this happened while U.S. markets were closed, which meant that investors turned to the futures and options markets to place their trades.
Chaos prevailed when the stock market opened: Only about half of the stocks in the S&P 500 had started trading by 9:35 a.m.; a quarter of the Russell 3000 index was down 10% or more intraday, and many large ETFs traded far below the value of their underlying assets. Algorithms, sensing something amiss, simply stepped back from the market. Once again, the S&P 500 recovered much of its sudden loss, but savvy market observers detected eerie echoes of an earlier era. In a much-read note at the time, JPMorgan strategist Marko Kolanovic cited the feedback loop of selling and compared it to the Black Monday selloff of 1987.
Flash moves have not been limited to stocks—or even to crashes. On Oct. 15, 2014, the price of the 10-year Treasury note soared, causing yields to tumble 0.35 of a percentage point in mere minutes before quickly reversing. The SEC blamed the increasing role of automated high-frequency algorithms for the sudden move.
The most recent scare occurred on May 18, when the iShares MSCI Brazil Capped ETF (ticker: EWZ) dropped as much as 19% in a single trading session before closing the day down 16%. To put that move in perspective, the Brazil ETF’s worst single-day decline at the height of the financial crisis in 2008 had been 19%. While there was bad news in May—reports that Brazilian President Michel Temer had been ensnared in a corruption scandal—that seemed insufficient cause for such a precipitous decline.
Shaoul, of Marketfield, attributes the Brazil ETF’s plunge to a combination of factors, including the growth of passive investing, which has made it easy to buy and sell an entire country’s market with the press of a button, combined with computer-driven trading. “There was no way of knowing what was a human being pressing a button, or a computer pressing a button,” he says. “But it generates the potential for sudden spikes in volatility that come out of nowhere.”
The Brazil ETF recovered its losses fairly quickly. By the end of August, it was trading above its May 17 close.
U.S. markets haven’t suffered declines like that, but they have experienced numerous “fragility events”—sudden one-day declines—during the current rally, says Chintan Kotecha, an equity derivatives strategist at Bank of America Merrill Lynch. But because stocks have been in a bull market, there has been little follow-through after the initial selloff. As a result, some quantitative strategies reposition for more volatility that never arrives. Kotecha attributes the lack of follow-through, in part, to central bankers’ continued bond-buying, which has provided much-needed support for the markets.
Follow-through was all the market had in 1987, as selling automatically triggered more selling. To some observers, the risks of a similar scenario are growing. One particular area of concern: volatility-targeting strategies, which try to hold a portfolio’s volatility constant, and risk-parity strategies, which attempt to equalize the risk in a portfolio among bonds, stocks, and other assets—and sometimes use leverage to do it. When volatility is low, these portfolios can hold more-risky assets than when volatility is high. But as soon as volatility rises—and stays high—these types of funds will need to start selling stocks and other assets to keep the risk of their portfolios at the same level. If they sell enough, volatility could spike higher, leading to even more selling.
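A back-of-the-envelope version of volatility targeting shows why. The sketch below uses made-up numbers; the mechanism, scaling the risky-asset weight by target volatility over realized volatility, is the standard one.

```python
# A minimal sketch of volatility targeting (all numbers hypothetical): scale
# the equity weight so that expected portfolio volatility stays at the target.
# When realized volatility jumps, the required weight drops, forcing sales.

def equity_weight(target_vol, realized_vol, max_leverage=1.5):
    """Weight in risky assets that keeps portfolio volatility near target."""
    return min(target_vol / realized_vol, max_leverage)

target = 0.10              # a 10% annualized volatility target
portfolio = 1_000_000_000  # a $1 billion fund, purely illustrative

for realized in (0.08, 0.10, 0.20, 0.40):
    w = equity_weight(target, realized)
    print(f"realized vol {realized:.0%}: equity weight {w:.2f} "
          f"(${w * portfolio:,.0f} in stocks)")
```

A doubling of realized volatility halves the equity weight, and at the scale of the strategies involved, that mechanical rebalancing is itself a very large sell order.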
In a market selloff, commodity-trading advisors similarly could exit their long positions quickly and look to short stocks, creating further selling pressure as they head for the exits. “Action leads to more action,” says Richard Bookstaber, chief risk officer at the University of California and author of The End of Theory, a book about financial crises caused by positive feedback loops.
PERHAPS THE BIG QUESTION is who might be left to buy. Warren Buffett once quipped that investors should be fearful when others are greedy and greedy when others are fearful, but the current market structure has turned that maxim on its head. Algorithms provide less liquidity in a downturn than a human market maker, who might be thinking about how to profit from a dislocation.
The rise of momentum and passive strategies has caused some $2 trillion to shift away from active money managers, who could be counted on to look for bargains as stocks sold off, says Kolanovic, the JPMorgan strategist. “We think the main attribute of the next crisis will be severe liquidity disruptions resulting from market developments since the last crisis,” he says.
But most strategists acknowledge that such an occurrence isn’t a high-probability event. Much will depend on the cause of any disruption, as well as seasonal factors—stocks are more thinly traded in summer, for example. Also, computers aren’t the only cause of selling cycles; bear markets, after all, long predate machine-driven trading.
Quantitative investors argue that they have learned from past mistakes and are less likely to be leveraged or crowded into the same trades. Moreover, regulators and exchanges have instituted rules that could help arrest a bout of unchecked selling, with trading halts imposed when the S&P 500 falls 7%, 13%, and 20%.
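Those halts follow a simple tiered rule. The sketch below encodes the thresholds the article cites and the halt durations under current U.S. equity market rules, with intraday-timing nuances (such as the late-session cutoff for Level 1 and 2 halts) omitted.

```python
# The market-wide circuit breakers mentioned above, as a tiered rule. Halt
# durations follow current U.S. equity rules; intraday-timing nuances omitted.

def circuit_breaker(decline_from_prior_close):
    """Map an intraday S&P 500 decline to the mandated trading halt."""
    if decline_from_prior_close >= 0.20:
        return "Level 3: trading halted for the rest of the day"
    if decline_from_prior_close >= 0.13:
        return "Level 2: 15-minute halt"
    if decline_from_prior_close >= 0.07:
        return "Level 1: 15-minute halt"
    return "no halt"

for decline in (0.05, 0.08, 0.15, 0.22):
    print(f"S&P 500 down {decline:.0%}: {circuit_breaker(decline)}")
```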
Maybe these precautions will work to stem a tidal wave of selling. One of these days—possibly soon, given stocks’ lofty valuation and the Fed’s plan to shrink its balance sheet—we’ll find out. 
http://www.barrons.com/articles/black-monday-2-the-next-machine-driven-meltdown-1507956435