AI does not have enough experience to handle the next market crash

Artificial intelligence is increasingly used to make decisions in financial markets. Fund managers hand trading decisions to AI systems that work largely by identifying patterns in financial data. The more data an AI has, the more it learns. And with the financial world producing data at an ever-increasing rate, AI should be getting better. But what happens if the data the AI encounters isn't normal, or represents an anomaly?
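
To make the pattern-finding point concrete, here is a minimal, hypothetical sketch (assuming a simple scikit-learn classifier and simulated returns, not any fund's actual system): a model trained only on calm-market data still produces a confident-looking output when fed a crash-like pattern it has never seen.

```python
# Hypothetical sketch: a trading signal learned purely from patterns in past
# returns. Everything it "knows" about markets comes from whatever regime the
# training window happens to cover.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated daily returns from a calm, low-volatility regime (an assumption
# standing in for the post-2010 data described in the text).
calm_returns = rng.normal(loc=0.0003, scale=0.005, size=2000)

# Features: the previous 5 days of returns; label: was the next day up?
window = 5
X = np.array([calm_returns[i:i + window] for i in range(len(calm_returns) - window)])
y = (calm_returns[window:] > 0).astype(int)

model = LogisticRegression().fit(X, y)

# The model has only ever seen calm data. Fed a crash-like week, it still
# emits a probability as usual, because nothing in its training tells it this
# input is far outside anything it has learned from.
crash_week = np.array([[-0.08, -0.05, -0.10, -0.07, -0.09]])
print("P(next day up) given a crash-like week:", model.predict_proba(crash_week)[0, 1])
```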

Globally, around 10 times more data was generated in 2017 than in 2010. This means the best-quality data is heavily concentrated in the recent past, a period in which markets have run on cheap money supplied by central banks through purchases of safe securities, which is not a "normal" state for the market. This has had a number of effects, from causing a rise in "zombie" firms to creating generational lows in volatility to encouraging unusually large corporate buybacks.

The Financial Stability Board, an international body based in Basel, Switzerland, and set up by the G20 in the aftermath of the last financial crisis, recently studied the potential impacts of AI and machine learning on financial stability. One of the risks it highlighted was the increased use of AI by hedge funds and market makers. Because AI is so effective at optimizing complex systems, its use can further tighten trading parameters that are vital for market stability, such as how much capital a bank holds in relation to its outstanding trading positions.

Given its increasing use in financial markets, AI will play a role in the next market correction, perhaps a critical one, as an era of low volatility, high debt, and cheap money comes to an end. AI will need sufficient data, spanning a long enough period, for its models to adapt to new market conditions without overreacting.

The question is: if and when a shock comes and an entirely unfamiliar situation arises, what will the financial AIs do? As the financial system becomes more interconnected, AI could spread the impact of extreme shocks faster, making the entire system potentially less stable during a shock event. This is particularly true if data sources and AI strategies are shared and a shock then hits a particular data source.

Consider the example of a data shock in the case of self-driving cars. When Google was training its self-driving car on the streets of Mountain View, California, the car rounded a corner and encountered a woman in a wheelchair, waving a broom, chasing a duck. The car hadn't encountered this before, so it stopped and waited. When a Tesla driving on autopilot failed to recognize a truck turning across its path on the freeway, it kept going. In both cases the situation was unfamiliar, but one system had a failsafe and the other simply failed.
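
The stop-and-wait behaviour can be thought of as a gate in front of the model. The sketch below is an illustrative assumption (a basic z-score familiarity check, not how any production trading or driving system actually works): if today's inputs fall far outside anything seen during training, the system holds rather than acting.

```python
# Hypothetical failsafe: only act on the model's output if the input looks
# statistically familiar; otherwise stop and wait, like the car that paused.
import numpy as np

def failsafe_decision(todays_features, training_features, z_threshold=4.0):
    """Return an action only if the input resembles the training data."""
    mean = training_features.mean(axis=0)
    std = training_features.std(axis=0) + 1e-12
    z_scores = np.abs((todays_features - mean) / std)
    if np.any(z_scores > z_threshold):
        # Unfamiliar situation: defer instead of extrapolating.
        return "HOLD - input outside familiar range, defer to a human"
    return "ACT - input resembles training data, model decision can be used"

# Example usage with simulated calm training data and a crash-like observation.
rng = np.random.default_rng(1)
training = rng.normal(0.0003, 0.005, size=(2000, 5))
crash_day = np.array([-0.08, -0.05, -0.10, -0.07, -0.09])
print(failsafe_decision(crash_day, training))
```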

AI simply isn’t good in situations it doesn’t yet recognize.
