Notes on
The Man Who Solved the Market
by Gregory Zuckerman
This book tells the story of Jim Simons and Renaissance Technologies, the most successful quantitative hedge fund in history. It’s a story of how a group of mathematicians and scientists, most with no prior finance experience, revolutionized investing by applying rigorous scientific methods, advanced mathematics, and massive datasets to financial markets. They discovered hidden patterns and subtle correlations that traditional Wall Street players completely missed.
Here are a few key themes:
- A Non-Traditional Approach: Simons and his team came from theoretical mathematics and code-breaking, not finance. This outsider perspective allowed them to see markets differently.
- Data is King: Renaissance’s success was built on amassing and meticulously cleaning vast amounts of historical price data, then using it to model and predict market behavior.
- The Scientific Method Applied to Finance: They treated investing like a scientific problem, forming hypotheses, testing them rigorously, and iterating constantly. Explainability wasn’t the priority; statistical significance was.
- The Importance of Culture: Simons fostered a unique, collaborative, and open environment where ideas were freely shared, and peer pressure drove excellence.
- Machine Learning, Early On: Renaissance was an early adopter of machine learning techniques, building models that could learn and adapt on their own.
- Risk Management is Crucial: Despite their predictive power, they emphasized risk management—even the best models are fallible.
- Secrecy and Innovation: The firm’s extreme secrecy protected its proprietary strategies, allowing them to maintain an edge for decades.
The Paradox of Simons’ Success
I found it fascinating that Simons and his team were the unlikely conquerors of Wall Street. He had no formal finance training, no deep interest in business, and a background in theoretical math.
His firm, located far from Wall Street, hired academics who knew nothing about investing and were even skeptical of capitalism.
Yet, they’re the ones who revolutionized the industry.
It’s as if a group of tourists, on their first trip to South America, with a few odd-looking tools and meager provisions, discovered El Dorado and proceeded to plunder the golden city, as hardened explorers looked on in frustration.
Early Days & Code-Breaking
Simons’ early experience as a code-breaker at the IDA (Institute for Defense Analyses) was formative. He learned to manage talented researchers and fostered a culture that valued even “bad” ideas.
“Bad ideas is good, good ideas is terrific, no ideas is terrible.”
This is a great philosophy for knowledge work.
Even then, Simons was dabbling in the stock market. He and his colleagues published a groundbreaking (though classified) paper proposing a trading system based on “macroscopic variables” (like ‘high variance’ and ‘good’) and hidden Markov models.
They didn’t care about the whys of market movements, just the strategies to exploit them. This was pretty revolutionary for the time.
He had earlier attempted to start a trading firm called iStar, but it didn't take off. So he continued researching markets while working as a code-breaker, which in hindsight looks like the start of his quant research.
For the majority of investors, this was an unheard-of approach, but gamblers would have understood it well. Poker players surmise the mood of their opponents by judging their behavior and adjusting their strategies accordingly. Facing off against someone in a miserable mood calls for certain tactics; others are optimal if a competitor seems overjoyed and overconfident. Players don’t need to know why their opponent is glum or exuberant to profit from those moods; they just have to identify the moods themselves.
I liked that analogy to poker. It’s also interesting that they were using hidden Markov models.
The paper wasn’t perfect. It made some naive assumptions. But it was something of a trailblazer, in that they suggested one could detect signals capable of conveying useful information about expected market moves.
Simons had been a star cryptologist, had scaled the heights of mathematics, and had built a world-class math department, all by the age of forty. He was confident he could conquer the world of trading. Investors had spent centuries trying to master markets, rarely finding huge success. Once again, rather than deter Simons, the challenges seemed to spark enthusiasm.
A recap of his early life (0→40).
“He really wanted to do unusual things, things others didn’t think possible,” his friend Joe Rosenshein says.
The Efficient Market Hypothesis and Simons’ Perspective
Simons rejected the Efficient Market Hypothesis (EMH), which claimed that all information was already priced into assets. He believed markets had structure, even if they appeared chaotic.
He and his colleagues were arguing that it wasn't important to understand all the underlying levers of the market's machine, but to find a mathematical system that matched them well enough to generate consistent profits.
Scientists and mathematicians are trained to dig below the surface of the chaotic, natural world to search for unexpected simplicity, structure, and even beauty.
This unique perspective, honed by his mathematical background, was key to his approach.
Markov Chains, Baum-Welch, and Hidden Markov Models
These concepts were central to Simons’ early work and Renaissance’s later success.
- Markov Chains: Sequences where the next event depends only on the current state, not the past.
- Hidden Markov Models: The chain's states are hidden from view; you observe only the outputs they generate and must infer the states and parameters behind them.
- Baum-Welch Algorithm: An iterative method for estimating a hidden Markov model's transition and output probabilities from observed data.
Leonard Baum, co-creator of the Baum-Welch algorithm, was one of Simons’ early collaborators.
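To make the first of these concrete, here is a minimal Python sketch of a two-state Markov chain over hypothetical market regimes. The transition probabilities are invented for illustration, and the hidden-observation layer of a full HMM is omitted.

```python
STATES = ["up", "down"]

# P[i][j] = probability of moving from state i to state j (made-up numbers)
P = [
    [0.7, 0.3],   # from "up"
    [0.4, 0.6],   # from "down"
]

def step_distribution(dist, P):
    """Advance the state distribution one step: next = dist * P."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

# Start certain of being in "up" and iterate:
dist = [1.0, 0.0]
for _ in range(100):
    dist = step_distribution(dist, P)

# The chain forgets its starting point: dist converges to the stationary
# distribution (4/7, 3/7) no matter where it began.
print(dist)
```

The "memoryless" property is the whole point: tomorrow's regime probabilities depend only on today's regime, which makes such models tractable to estimate.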
The Birth of Renaissance Technologies
As Simons branched out into venture capital alongside trading, he renamed Monemetrics to Renaissance Technologies in 1982.
Stochastic Differential Equations
Axcom (the firm Simons co-founded with James Ax) explored using stochastic differential equations, which model dynamic processes with uncertainty.
I’m curious if these equations are still widely used in quant finance.
Members of Axcom’s team viewed investing through a math prism and understood financial markets to be complicated and evolving, with behavior that is difficult to predict, at least over long stretches—just like a stochastic process.
They didn’t believe the market was a truly random walk (or completely unpredictable), but it clearly had elements of randomness.
They would later embrace stochastic differential equations for risk management and options pricing.
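As a toy illustration of what working with a stochastic differential equation looks like, here is a sketch that simulates geometric Brownian motion, a standard model of prices with drift plus randomness, using the Euler-Maruyama scheme. All parameters are invented, not Renaissance's.

```python
import random

def simulate_gbm(s0, mu, sigma, dt, n_steps, rng):
    """Euler-Maruyama simulation of dS = mu*S*dt + sigma*S*dW."""
    path = [s0]
    for _ in range(n_steps):
        dW = rng.gauss(0.0, dt ** 0.5)   # Brownian increment over dt
        s = path[-1]
        path.append(s + mu * s * dt + sigma * s * dW)
    return path

# One simulated year of daily steps for a $100 asset (illustrative numbers):
rng = random.Random(42)
path = simulate_gbm(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n_steps=252, rng=rng)
print(path[-1])
```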
Early Machine Learning and Simons’ Skepticism
Rene Carmona introduced an early form of machine learning, using kernel methods to find patterns in data. Surprisingly, Simons was initially resistant to this “black box” approach.
“I can’t get comfortable with what this is telling me,” Simons told the team one day. “I don’t understand why [the program is saying to buy and not sell].”
“It’s a black box!” he said with frustration.
Carmona agreed with Simons’s assessment, but he persisted.
“Just follow the data, Jim,” he said. “It’s not me, it’s the data.”
This was unexpected. It’s fascinating that the pioneer of quantitative finance was initially uncomfortable with a core concept of the field.
They started introducing ML because they found linear regressions insufficient, so they turned to higher-dimensional kernel regression approaches to model nonlinearities.
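To illustrate the idea, here is a minimal one-dimensional Nadaraya-Watson kernel regression, a toy stand-in for the higher-dimensional kernel methods described above rather than Carmona's actual method:

```python
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u)

def kernel_regression(x_train, y_train, x, bandwidth=0.2):
    """Nadaraya-Watson: predict y at x as a kernel-weighted average
    of nearby training targets."""
    weights = [gaussian_kernel((x - xi) / bandwidth) for xi in x_train]
    return sum(w * yi for w, yi in zip(weights, y_train)) / sum(weights)

# A nonlinear relationship that a straight line would badly miss:
xs = [i / 10 for i in range(-20, 21)]
ys = [x * x for x in xs]

print(kernel_regression(xs, ys, 1.0))   # close to the true value f(1.0) = 1
```

Because the prediction is a local weighted average, no functional form is assumed, which is exactly what makes it feel like a "black box."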
Claude Shannon, John Kelly, and the Kelly Criterion
The book touches on the influence of Claude Shannon (information theory) and John Kelly (the Kelly Criterion) on Elwyn Berlekamp, another key figure at Renaissance.
The Kelly Criterion helps determine the optimal bet size to maximize growth while minimizing risk. This concept of bet sizing would become crucial to Renaissance’s strategy.
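The binary-bet form of the Kelly criterion is simple enough to sketch (the win probability and odds below are illustrative):

```python
def kelly_fraction(p, b):
    """Optimal bankroll fraction for a bet won with probability p
    that pays b-to-1 on a win: f* = p - (1 - p) / b."""
    f = p - (1.0 - p) / b
    return max(f, 0.0)   # never bet when the edge is negative

# A coin that wins 55% of the time at even odds (b = 1):
print(kelly_fraction(0.55, 1.0))   # about 0.10: bet 10% of bankroll
```

In practice many quants bet a fraction of full Kelly, since the formula is sensitive to estimation error in p.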
Trend Following vs. Value Investing
Renaissance initially used trend-following strategies, buying assets that were rising and selling those that were falling. This contrasted with the value investing approach of Warren Buffett and others.
Buying investments as they became more expensive and selling them as they fell in value was at odds with leading academic theory, which recommended buying when prices cheapened and taking money off the table when prices richened.
Medallion’s Launch and Early Struggles
Simons and Ax launched Medallion in 1988, focused solely on trading. It initially struggled, partly due to Ax’s shifting focus.
Slippage
This is the cost incurred when buying or selling affects the price of an asset, reducing potential profits. Renaissance was very aware of slippage from the beginning.
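The book doesn't give a formula, but as a rough illustration, the quant literature often models impact with a stylized "square-root" rule, in which cost grows superlinearly with order size relative to daily volume. The constant k below is purely illustrative.

```python
import math

def slippage_cost(order_shares, daily_volume, volatility, price, k=0.1):
    """Estimated dollar impact cost under a square-root impact model;
    k is an empirical constant and the 0.1 default is illustrative."""
    impact_fraction = k * volatility * math.sqrt(order_shares / daily_volume)
    return impact_fraction * price * order_shares

# Buying 100,000 shares of a $50 stock that trades 2M shares a day,
# with 2% daily volatility:
print(slippage_cost(100_000, 2_000_000, 0.02, 50.0))
```

Note the scaling: doubling the order size roughly multiplies the total cost by 2.8, which is why large funds must split orders carefully.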
The “Magic Formula” and Putting in the Reps
For much of 1990, Simons’s team could do little wrong, as if they had discovered a magic formula after a decade of fumbling around in the lab.
The importance of persistence. It took them a decade to find their edge.
Edward Thorp and Louis Bachelier
Edward Thorp pioneered the use of quantitative strategies in investing. As an academic, he worked with Claude Shannon (information theory pioneer) and was influenced by John Kelly’s proportional betting system. He first applied mathematical analysis to gambling, documented in his book Beat the Dealer.
In 1964, Thorp turned to Wall Street, seeing it as the ultimate casino. After studying technical analysis and Graham & Dodd’s Security Analysis, he was struck by how little sophisticated analysis existed in the field. He focused on stock warrants, developing formulas to identify mispriced securities. Using an HP 9830 computer, he would buy undervalued warrants while shorting overvalued ones.
His hedge fund, Princeton/Newport Partners, attracted notable investors like Paul Newman and achieved consistent 15%+ annual returns. Their computer-driven trading consumed so much power their office was consistently overheated.
Thorp’s work built on Louis Bachelier’s 1900 doctoral thesis on option pricing, which used equations similar to Einstein’s work on Brownian motion. Though overlooked for decades, Bachelier’s work on irregular stock price movements became foundational to modern quantitative finance.
Despite skeptics claiming markets were “too complicated to model,” Thorp’s fund grew to $300M by the late 1980s, far larger than Medallion’s initial $25M. However, Princeton/Newport was forced to close in 1988 due to fallout from the Michael Milken trading scandal, though Thorp was never accused of wrongdoing.
Pairs Trading and Statistical Arbitrage
These are strategies that involve betting on the relative price movements of related assets.
Morgan Stanley’s APT (Automated Proprietary Trading) team was an early adopter of statistical arbitrage.
The Morgan Stanley traders became some of the first to embrace the strategy of statistical arbitrage, or stat arb. This generally means making lots of concurrent trades, most of which aren’t correlated to the overall market but are aimed at taking advantage of statistical anomalies or other market behavior.
When the traders prepared to buy and sell big chunks of shares for clients, acquiring a few million dollars of Coca-Cola, for example, they protected themselves by selling an equal amount of something similar, like Pepsi, in what is commonly referred to as a pairs trade.
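A minimal sketch of a pairs-trade signal along these lines, with invented prices and thresholds:

```python
import statistics

def pairs_signal(prices_a, prices_b, entry_z=2.0):
    """Return a trade direction based on the z-score of the latest
    spread between two related assets."""
    spreads = [a - b for a, b in zip(prices_a, prices_b)]
    mean = statistics.mean(spreads)
    std = statistics.stdev(spreads)
    z = (spreads[-1] - mean) / std
    if z > entry_z:
        return "short A / long B"    # spread unusually wide: bet it narrows
    if z < -entry_z:
        return "long A / short B"    # spread unusually narrow: bet it widens
    return "no trade"

# Asset A suddenly jumps away from its usual spread over asset B:
print(pairs_signal([10] * 20 + [15], [9] * 21))   # "short A / long B"
```

A real implementation would typically use a regression-based hedge ratio rather than a raw price difference, and a rolling estimation window.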
Single Model vs. Multiple Models
Henry Laufer made the crucial decision to use a single trading model for Medallion, rather than multiple models for different assets. This allowed them to leverage their vast data trove and find correlations across different markets.
Five-Minute Bars and Nonrandom Trading Effects
In the early days, Renaissance’s data operation, led by Straus, maintained extensive records of price movements across commodities, bonds, and currencies. Their initial approach was relatively simple: they divided the trading week into ten segments, alternating between overnight sessions for international markets and regular day sessions. This bisected structure let them look for patterns between segments and execute trades at key points - morning, noon, and day’s end.
Simons, however, wondered if they were missing something by using such broad time segments. This led to Laufer experimenting with progressively finer time divisions, first halving the day, then quartering it, before finally settling on five-minute intervals. Recent improvements in computing power made this granular analysis feasible for the first time. The team could now ask precise questions: Did the 188th five-minute period of cocoa futures consistently dip during market anxiety? Did the 50th period of gold trading show consistent strength during inflationary concerns?
This finer resolution revealed previously invisible patterns in the market’s behavior. For instance, they discovered that certain trading patterns on Friday mornings had predictive power for that afternoon’s close. They also found that late-day market rises often continued into the next morning’s open, creating profitable overnight holding opportunities. The team uncovered various other effects, including patterns in volatility and coordinated movements between related assets like gold and silver, or heating oil and crude oil.
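The bucketing idea can be sketched in a few lines: group each bar's return by its position in the day and average across many days. The data below is synthetic.

```python
from collections import defaultdict

def average_by_bucket(days):
    """days: one list of per-bar returns per trading day.
    Returns the average return at each bar position across all days."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for day in days:
        for i, r in enumerate(day):
            sums[i] += r
            counts[i] += 1
    return {i: sums[i] / counts[i] for i in sums}

# Three synthetic days of four bars each; the last bar drifts up every day:
days = [
    [0.001, -0.002, 0.000, 0.004],
    [-0.001, 0.001, -0.001, 0.003],
    [0.000, 0.000, 0.001, 0.005],
]
means = average_by_bucket(days)
print(means)   # bucket 3 stands out with a clearly positive average
```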
While many of these patterns defied obvious explanation, they met Renaissance's strict statistical criteria, showing p-values below 0.01. In other words, if the patterns were mere statistical flukes, results this strong would appear less than 1 percent of the time.
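A back-of-the-envelope version of such a significance check: treat each occurrence of a pattern as a coin flip and ask how likely the observed hit count would be under pure chance (the counts below are illustrative):

```python
from math import comb

def binomial_p_value(k, n, p0=0.5):
    """P(X >= k) for X ~ Binomial(n, p0): the chance of a hit rate at
    least this extreme if the pattern were pure noise."""
    return sum(comb(n, i) * p0 ** i * (1 - p0) ** (n - i)
               for i in range(k, n + 1))

# A pattern that "worked" 70 times out of 100 observations:
p = binomial_p_value(70, 100)
print(p, p < 0.01)   # far below the 0.01 threshold
```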
It was as if the Medallion team had donned glasses for the first time, seeing the market anew.
The Betting Algorithm
Simons pushed for a “betting algorithm” to determine the optimal allocation of capital to different trades, given the limited money Medallion managed. This was another early application of machine learning.
“Our system is a living thing; it’s always modifying,” he said. “We really should be able to grow it.”
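The book doesn't spell out the algorithm, so as a purely hypothetical sketch, here is a Kelly-flavored allocator that weights each signal by its expected edge divided by its variance and normalizes to the available capital. All names and numbers are invented.

```python
def allocate(capital, signals):
    """signals: dict name -> (expected_return, variance).
    Weight each positive-edge signal by mu/var (the continuous Kelly
    fraction for independent bets), normalized to the capital."""
    raw = {name: mu / var for name, (mu, var) in signals.items() if mu > 0}
    total = sum(raw.values())
    if total == 0:
        return {name: 0.0 for name in signals}
    scale = capital / total
    return {name: raw.get(name, 0.0) * scale for name in signals}

alloc = allocate(1_000_000, {
    "mean_reversion": (0.02, 0.01),   # strong, steady signal
    "momentum":       (0.01, 0.02),   # weaker, noisier signal
    "weird_anomaly":  (-0.01, 0.02),  # negative edge: gets nothing
})
print(alloc)
```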
Behavioral Economics
Tversky and Kahneman’s research in the 1970s revealed systematic irrationality in human decision-making, later expanded by Thaler into behavioral economics. Their work identified key biases affecting investors:
- Loss Aversion: Investors feel losses roughly twice as strongly as equivalent gains
- Anchoring: Initial reference points disproportionately influence decisions
- Endowment Effect: Overvaluing assets simply because we own them
This aligned with Renaissance’s experience: markets were most inefficient during periods of high emotion and stress. Not coincidentally, these turbulent periods often produced Medallion’s largest profits.
Kahneman and Thaler would win Nobel Prizes for their work. A consensus would emerge that investors act more irrationally than assumed, repeatedly making similar mistakes. Investors overreact to stress and make emotional decisions. Indeed, it’s likely no coincidence that Medallion found itself making its largest profits during times of extreme turbulence in financial markets, a phenomenon that would continue for decades to come.
Brown and Mercer’s System
Brown and Mercer, recruited from IBM, brought a fresh perspective to Renaissance’s trading system. They approached trading as a mathematical optimization problem, similar to how they had tackled language recognition at IBM.
Their system was revolutionary in several ways:
- Monolithic Design: Combined all trading signals and portfolio requirements into a single model
- Adaptive Learning: System could learn and self-adjust based on real-time results
- Automated Optimization: Ran multiple times per hour, evaluating thousands of potential trades
- Scale: Half a million lines of code (compared to Frey’s tens of thousands)
The system took various inputs:
- Trading costs
- Leverage levels
- Risk parameters
- Other operational constraints
A key innovation was the system’s ability to self-correct when trades weren’t executed. It would automatically search for alternative buy/sell orders to rebalance the portfolio to its target state. This solved a major limitation of Frey’s earlier model.
The system retained Frey’s original prediction model from Morgan Stanley, focusing on mean reversion trades. As one employee summarized:
“We make money from the reactions people have to price moves.”
While Renaissance would add many refinements over the years, this reversion-to-the-mean approach remained their bedrock strategy for over a decade, with other strategies serving as “second order” complements.
Despite its sophistication, the system initially struggled with scale. When Renaissance tried to deploy more than $35 million in equities, the returns diminished significantly – similar to what happened with Frey’s system. Neither Brown nor Mercer could initially diagnose the problem.
This system, implemented in 1995, represented exactly what Simons had envisioned: a fully automated trading platform that could continuously optimize and adapt. While it had initial scaling issues, it would become the foundation for Renaissance’s future success.
David Magerman, a later recruit, found and fixed crucial bugs in Brown and Mercer’s code.
Renaissance’s Culture of Openness
Simons fostered a unique culture of collaboration and transparency. All employees had access to the source code, and peer pressure drove them to excel.
Simons created a culture of unusual openness. Staffers wandered into colleagues’ offices offering suggestions and initiating collaborations. When they ran into frustrations, the scientists tended to share their work and ask for help, rather than move on to new projects, ensuring that promising ideas weren’t “wasted,” as Simons put it. Groups met regularly, discussing intimate details of their progress and fielding probing questions from Simons.
Peer pressure became a crucial motivational tool. Researchers, programmers, and others spent much of their time working on presentations. They burned to impress each other—or, at least, not embarrass themselves in front of colleagues—spurring them to plug away at challenging problems and develop ingenious approaches.
“You want a bigger bonus? Help the fund get higher returns in whatever way you can: discover a predictive source, fix a bug, make the code run faster, get coffee for the woman down the hall with a great idea, whatever … bonuses depend on how well the fund performs, not if your boss liked your tie.”
Urgency
Academics can slog along for years on academic papers; by contrast, Simons pushed for results within weeks, if not days, an urgency that held appeal. The atmosphere was informal and academic, yet intense; one visitor likened it to a “perpetual exam week.”
The Three-Step Process for Discovering Trading Signals
- Identify anomalous patterns in historical pricing data.
- Ensure the anomalies are statistically significant, consistent, and nonrandom.
- See if the pricing behavior can be explained reasonably (though this was less important to them).
Non-Intuitive Signals and Data Overfitting
Renaissance embraced “non-intuitive” signals, those they couldn’t fully understand, as long as they were statistically significant. This gave them an edge over competitors who wouldn’t touch such trades.
“If there were signals that made a lot of sense that were very strong, they would have long-ago been traded out,” Brown explained. “There are signals that you can’t understand, but they’re there, and they can be relatively strong.”
They were careful to avoid data overfitting, the pitfall of finding spurious correlations.
The obvious danger with embracing strategies that don’t make sense: The patterns behind them could result from meaningless coincidences. If one spends enough time sorting data, it’s not hard to identify trades that seem to generate stellar returns but are produced by happenstance. Quants call this flawed approach data overfitting.
For example:
David Leinweber found that US stock returns can be predicted with 99% accuracy by combining data for the annual butter production in Bangladesh, US cheese production, and the population of sheep in Bangladesh and the US.
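This effect is easy to reproduce in miniature: generate enough random "predictor" series and at least one will fit a random target impressively well in-sample.

```python
import random

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(0)
target = [rng.gauss(0, 1) for _ in range(20)]    # pretend "stock returns"

# Scan 10,000 random candidate "predictors" and keep the best one:
best = max(
    correlation([rng.gauss(0, 1) for _ in range(20)], target)
    for _ in range(10_000)
)
print(best)   # an impressive-looking in-sample fit, and entirely meaningless
```

This is why out-of-sample testing and significance thresholds adjusted for the number of hypotheses tried matter so much to quants.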
Often, the Renaissance researchers’ solution was to place such head-scratching signals in their trading system, but to limit the money allocated to them, at least at first, as they worked to develop an understanding of why the anomalies appeared. Over time, they frequently discovered reasonable explanations, giving Medallion a leg up on firms that had dismissed the phenomena. They ultimately settled on a mix of sensible signals, surprising trades with strong statistical results, and a few bizarre signals so reliable they couldn’t be ignored.
The Increasing Use of Machine Learning
Renaissance’s system increasingly relied on machine learning, with the computers learning and adapting on their own.
Just as astronomers set up powerful machines to continuously scan the galaxy for unusual phenomena, Renaissance’s scientists programmed their computers to monitor financial markets, grinding away until they discovered overlooked patterns and anomalies. Once they were determined to be valid, and the firm determined how much money to place in the trades, the signals were placed into the system and left to do their thing, without any interference. By then, Medallion increasingly was relying on strategies that its system taught itself, a form of machine learning. The computers, fed with enough data, were trained to spit out their own answers. A consistent winner, for example, might automatically receive more cash, without anyone approving the shift or even being aware of it.
The Collapse of LTCM
The collapse of Long-Term Capital Management (LTCM) reinforced Renaissance’s emphasis on risk management.
“LTCM’s basic error was believing its models were truth,” Patterson says. “We never believed our models reflected reality—just some aspects of reality.”
New Data Sources
Renaissance began exploring new data sources, including news feeds and analyst predictions.
Some sources, like quarterly corporate earnings reports, didn’t provide much of an advantage.
But data on the earnings predictions of stock analysts and their changing views on companies sometimes helped.
Watching for patterns in how stocks traded following earnings announcements, and tracking corporate cash flows, research-and-development spending, share issuance, and other factors, also proved to be useful activities. The team improved its predictive algorithms by developing a rather simple measure of how many times a company was mentioned in a news feed—no matter if the mentions were positive, negative, or even pure rumors.
Medallion’s Extraordinary Performance
By the early 2000s, Medallion’s performance was legendary, with a Sharpe ratio of around 6.0.
“We’re right 50.75 percent of the time … but we’re 100 percent right 50.75 percent of the time,” Mercer told a friend. “You can make billions that way.”
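Mercer's arithmetic is easy to check: a 50.75 percent hit rate is a microscopic edge on any single trade, but across enough independent even-odds trades the probability of an overall profit approaches certainty. A sketch using the normal approximation to the binomial:

```python
import math

def prob_profitable(p, n):
    """Normal approximation to P(wins > n/2) over n independent
    even-odds trades, each won with probability p."""
    mean = n * p
    std = math.sqrt(n * p * (1 - p))
    z = (n / 2 - mean) / std
    return 0.5 * (1 + math.erf(-z / math.sqrt(2)))   # = P(X > n/2)

# The edge barely matters on 100 trades but is decisive on a million:
for n in (100, 10_000, 1_000_000):
    print(n, round(prob_profitable(0.5075, n), 3))
```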
Multidimensional Anomalies
Renaissance realized that there were far more factors influencing investments than most investors appreciated. They uncovered subtle, “multidimensional anomalies” and mathematical relationships between these factors.
“The inefficiencies are so complex they are, in a sense, hidden in the markets in code,” a staffer says. “RenTec decrypts them. We find them across time, across risk factors, across sectors and industries.”
Outsiders didn’t quite get it, but the real key was the firm’s engineering—how it put all those factors and forces together in an automated trading system. The firm bought a certain number of stocks with positive signals, often a combination of more granular individual signals, and shorted, or bet against, stocks with negative signals, moves determined by thousands of lines of source code.
Keeping Alpha Secret
How Renaissance traded was as important as what they traded. They developed sophisticated techniques to hide their trades and preserve their signals.
“Once we’ve been trading a signal for a year, it looks like something different to people who don’t know our trades,” an insider says.
Alternative Data and the Future of Quant Investing
Today, the fastest-moving firms often hold an edge. In late August 2018, shares of a small cancer-drug company called Geron Corporation soared 25 percent after its partner, Johnson & Johnson, posted a job listing. The opening suggested that a key regulatory decision for a drug the two companies were developing might be imminent, a piece of news that escaped all but those with the technology to instantly and automatically scour for job listings and similar real-time information.
Alternative data is increasingly becoming important in the field. This kind of data includes just about everything imaginable, “including instant information from sensors and satellite images around the world”.
Creative investors test for money-making correlations and patterns by scrutinizing the tones of executives on conference calls, traffic in the parking lots of retail stores, records of auto-insurance applications, and recommendations by social media influencers.
Rather than wait for figures on agricultural production, quants examine sales of farm equipment or satellite images of crop yields. Bills of lading for cargo containers can give a sense of global shifts. Systematic traders can even get cell phone–generated data on which aisles, and even which shelves, consumers are pausing to browse within stores. If you seek a sense of the popularity of a new product, Amazon reviews can be scraped. Algorithms are being developed to analyze the backgrounds of commissioners and others at the Food and Drug Administration to predict the likelihood of a new drug’s approval.
The idea is essentially to take in as much data as possible.
To explore these new possibilities, hedge funds have begun to hire a new type of employee, what they call data analysts or data hunters, who focus on digging up new data sources, much like what Sandor Straus did for Renaissance in the mid-1980s. All the information is crunched to get a better sense of the current state and trajectory of the economy, as well as the prospects of various companies. More adventurous investors may even use it to prepare for a potential crisis if, say, they see a series of unusual pizza deliveries at the Pentagon in the midst of an international incident.
“Instead of the hit-and-miss strategy of trying to find signals using creativity and thought,” a Renaissance computer specialist says, “now you can just throw a class of formulas at a machine-learning engine and test out millions of different possibilities.”
Lessons from Renaissance
- The Scientific Method: Apply rigorous scientific principles to any challenging problem.
- Multidimensional Thinking: Recognize that there are more factors and variables at play than are readily apparent.
- The Limits of Predictability: It’s incredibly difficult to beat the market consistently. Renaissance only profited on slightly more than 50% of its trades.
- Embrace the Unknowable: The tidy narratives investors use to explain price moves are mostly quaint. You can’t be sure what the future holds.
The gains Simons and his colleagues have achieved might suggest there are more inefficiencies in the market than most assume. In truth, there likely are fewer inefficiencies and opportunities for investors than generally presumed. For all the unique data, computer firepower, special talent, and trading and risk-management expertise Renaissance has gathered, the firm only profits on barely more than 50 percent of its trades, a sign of how challenging it is to try to beat the market—and how foolish it is for most investors to try.
Simons and his colleagues generally avoid predicting pure stock moves. It’s not clear any expert or system can reliably predict individual stocks, at least over the long term, or even the direction of financial markets. What Renaissance does is try to anticipate stock moves relative to other stocks, to an index, to a factor model, and to an industry.
During his time helping to run the Medallion fund, Elwyn Berlekamp came to view the narratives that most investors latch on to to explain price moves as quaint, even dangerous, because they breed misplaced confidence that an investment can be adequately understood and its futures divined.
“I don’t deny that earnings reports and other business news surely move markets,” Berlekamp says. “The problem is that so many investors focus so much on these types of news that nearly all of their results cluster very near their average.”