For even the most skilled human beings, it is impossible to continuously observe every factor affecting market prices, develop a sound strategy for using that new information, and act on that strategy, all within one second. Yet that is exactly what computers do today when processing stock market transactions and online auctions such as online advertising. Computer transactions now complete in well under a second, leaving humans unable to compete effectively. This puts the human stock market trader at a huge disadvantage: they are always working with stale information. By the time a human trader has seen the information and taken the time needed to process it, the market has already moved, because automated systems have executed thousands of transactions on the old data.
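To put the speed gap in rough numbers, here is a quick back-of-the-envelope sketch (both latencies below are illustrative assumptions, not measurements from any particular exchange):

```python
# Illustrative latencies only -- assumptions, not measured exchange data.
HUMAN_REACTION_S = 0.25     # ~250 ms for a human to perceive and react
MACHINE_CYCLE_S = 0.0005    # ~0.5 ms for an automated system to act on new data

# How many machine-speed actions fit inside one human reaction?
actions_per_reaction = round(HUMAN_REACTION_S / MACHINE_CYCLE_S)
print(actions_per_reaction)  # 500
```

Even with these conservative numbers, the market a human is reacting to has already been acted on hundreds of times before the reaction lands.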
So one might assume that human traders will one day be phased out of this process, and that does indeed seem to be the trend: roughly 70% of all equity trades are already executed by high-speed computers. Technology already seems to be taking over human action in many aspects of our lives, from automated bill pay to using credit cards and cell phones instead of cash. So what is the problem with the stock market and online auctions being computerized? The answer is complex but essentially comes down to two issues: equilibria and immaturity. The transactions executed by these high-speed computers happen so quickly that market equilibria, and therefore overall market stability, are never reached. Data collected by a Chicago firm showed that over 1,800 sub-second price spikes occurred in the last five years due solely to high-speed computer trading. While the overall effect has been fairly mild and the market has recovered each time, analysts predict that the more algorithms and computers are introduced into the system, the more likely a catastrophic event becomes.
Consider the butterfly effect in a stock market or online advertising setting. One computer glitches, runs a poor algorithm, or for whatever reason decides to dump its assets. A few other computers detect the sale, which raises a red flag and tells them to sell their assets as well. The behavior propagates among all of the computers in a matter of seconds, and stock or auction prices plummet. Of course, if one sector plummets, other sectors are likely to be affected too, and we could see a market or auction crash on the order of minutes. The stock market and online sites have taken some measures to prevent this, such as 'circuit breakers' that halt trading in a stock when its price drops too quickly, but the overall effect can still occur. And because stocks are never allowed to reach equilibrium values, the data feeding these algorithms is inherently flawed, which is poor practice in any case. As the centipede game in game theory suggests, humans tend to wait rather than take the instantaneously best choice in a market setting, which in this case actually allows the market to reach equilibrium and can therefore increase their profits. Computers, however, act on thresholds and data rather than feelings and instinct, so they have no regard for anything but the algorithm they were programmed with.
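The cascade described above can be sketched as a toy simulation. All the numbers here (thresholds, per-sale price impact, the circuit-breaker level) are made-up assumptions for illustration, not calibrated to any real market:

```python
import random

random.seed(42)

# Toy model: each automated trader dumps its position once the price
# falls below its personal threshold, pushing the price down further.
N_TRADERS = 100
IMPACT_PER_SALE = 0.004        # each sale knocks 0.4% off the price (assumption)
CIRCUIT_BREAKER_DROP = 0.10    # halt trading on a 10% drop (assumption)

thresholds = [random.uniform(0.90, 0.99) for _ in range(N_TRADERS)]
price = 1.0
sold = [False] * N_TRADERS
halted = False

# One "glitch" trade dumps a large position, nudging the price down 2%.
price -= 0.02

# Propagate: every breached threshold triggers a sale, which lowers the
# price and can breach more thresholds, until a circuit breaker fires.
changed = True
while changed and not halted:
    changed = False
    for i in range(N_TRADERS):
        if not sold[i] and price < thresholds[i]:
            sold[i] = True
            price *= 1.0 - IMPACT_PER_SALE
            changed = True
            if price < 1.0 - CIRCUIT_BREAKER_DROP:
                halted = True  # the halt pauses the crash; it does not undo it
                break

print(f"price={price:.3f} sellers={sum(sold)} halted={halted}")
```

In this toy, the single glitch trade is enough to trip the circuit breaker within one propagation loop: the halt stops the selling, but the price is left pinned roughly 10% down, which is the 'crash on the order of minutes' scenario compressed into a few lines.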
Beyond the lack of market equilibrium, the other problem with computer trading is a lack of understanding of what exactly is going on. The more sophisticated a program is, the more likely it is to contain a bug or produce an unintended effect. With stock-trading computers weighing hundreds of factors, such as price and trends over time, how all of these algorithms interact with one another, or even how your own program will behave under several changes in market conditions, is still largely unknown. The potential to make vast sums of money quickly makes that risk acceptable to many. But without a true understanding of how each of these programs really works, and how they work together, traders leave open the chance that the entire system is unstable and that, given a specific input condition, the entire market could crash.
If we want automated computers to buy and sell stocks and run auctions in the future, we need either to slow down the rate at which they can transact with the market, or to gain a much deeper understanding of how the market behaves under these extremely rapid changes in market conditions.
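One mechanical way to 'slow down the rate', as suggested above, is to rate-limit order submission. Here is a minimal token-bucket throttle; the class name and parameters are hypothetical illustrations, not any real exchange's interface:

```python
import time

class OrderThrottle:
    """Token bucket: at most `rate` orders per second, bursts up to `burst`.

    A hypothetical illustration of throttling an automated trader,
    not a real exchange API.
    """

    def __init__(self, rate: float, burst: int):
        self.rate = rate          # tokens (orders) replenished per second
        self.capacity = burst     # maximum stored tokens
        self.tokens = float(burst)
        self.last = time.monotonic()

    def try_submit(self) -> bool:
        """Return True if an order may be sent now, else False."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# With a burst of 2 and 1 order/second, a machine-speed loop of 10
# attempts gets only the first 2 through; the rest must wait.
throttle = OrderThrottle(rate=1.0, burst=2)
results = [throttle.try_submit() for _ in range(10)]
print(results.count(True))  # 2
```

The design choice here is that the limit is enforced at the point of submission, so even a buggy or runaway strategy cannot fire faster than the configured rate.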
The question of a market reaching an equilibrium is interesting even when only humans are present, and introducing computer algorithms that can act in arbitrarily many different ways does pose a challenge to equilibrium analysis. I'm not aware of any formal (non-empirical) approach to this scenario. Here is a different argument for equilibrium: with computers, the achievable rationality is vastly increased thanks to their efficiency at processing large-scale real-time data. So a "best-response step" is much more likely to be a true best response to the global state, suggesting that convergence to equilibrium should actually be faster! What gives?
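The commenter's faster-convergence intuition can be made concrete with best-response dynamics in a Cournot duopoly, where iterated exact best responses do converge to the Nash equilibrium. The demand and cost numbers below are made-up assumptions for illustration:

```python
# Cournot duopoly: inverse demand P = A - (q1 + q2), marginal cost C.
# (A and C are illustrative assumptions.) Each firm's best response to
# the other's quantity q is (A - C - q) / 2; the Nash equilibrium is
# q* = (A - C) / 3 for both firms.
A, C = 10.0, 2.0

def best_response(q_other: float) -> float:
    return max(0.0, (A - C - q_other) / 2.0)

q1 = q2 = 0.0
for _ in range(50):            # alternating exact best-response steps
    q1 = best_response(q2)
    q2 = best_response(q1)

q_star = (A - C) / 3.0         # = 8/3
print(round(q1, 4), round(q2, 4), round(q_star, 4))  # all ~= 2.6667
```

The open question raised in the post is whether this clean convergence survives when many heterogeneous algorithms, each with its own noisy view of the state, all take best-response steps simultaneously at sub-second speed.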