Risk aversion is one of the most basic assumptions of economic behavior, but few studies have addressed the question of where risk preferences come from and why they differ from one individual to the next. Here, we propose an evolutionary explanation for the origin of risk aversion. In the context of a simple binary-choice model, we show that risk aversion emerges by natural selection if reproductive risk is systematic (i.e., correlated across individuals in a given generation). In contrast, risk neutrality emerges if reproductive risk is idiosyncratic (i.e., uncorrelated across individuals in a given generation). More generally, our framework implies that the degree of risk aversion is determined by the stochastic nature of reproductive rates, and we show that different statistical properties lead to different utility functions. The simplicity and generality of our model suggest that these implications are primitive and cut across species, physiology, and genetic origins.
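The systematic-versus-idiosyncratic distinction can be made concrete with a small numerical sketch. The payoffs below are our own invention, not the paper's: action A yields 3 offspring or 0 with equal probability, action B yields exactly 1, and f is the probability of choosing A. With idiosyncratic shocks, independent risks average out across the population, so growth is governed by the arithmetic mean; with a common shock, long-run growth is governed by the expected log of the per-generation reproductive rate, which penalizes variance.

```python
import math

def idiosyncratic_growth(f):
    # Independent shocks wash out across individuals, so per-generation
    # growth is the arithmetic mean of reproductive rates.
    return f * 1.5 + (1 - f) * 1.0

def systematic_growth(f):
    # A common shock hits every A-chooser at once, so long-run growth is
    # the expected LOG of the generation's reproductive factor:
    # 0.5*log(3f + (1-f)) + 0.5*log(0 + (1-f)).
    if f >= 1:
        return float("-inf")  # certain extinction on a bad common shock
    return 0.5 * math.log(1 + 2 * f) + 0.5 * math.log(1 - f)

grid = [i / 1000 for i in range(1001)]
f_idio = max(grid, key=idiosyncratic_growth)  # risk-neutral corner: f = 1
f_syst = max(grid, key=systematic_growth)     # interior optimum: f = 0.25
```

Under idiosyncratic risk the growth-optimal behavior is the risk-neutral corner solution (always take the higher-mean gamble), while under systematic risk the optimum is an interior mix, i.e., behavior that looks risk-averse.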
Despite many compelling applications in economics, sociobiology, and evolutionary psychology, group selection is still one of the most hotly contested ideas in evolutionary biology. Here we propose a simple evolutionary model of behavior and show that what appears to be group selection may, in fact, simply be the consequence of natural selection occurring in stochastic environments with reproductive risks that are correlated across individuals. Those individuals with highly correlated risks will appear to form “groups”, even if their actions are, in fact, totally autonomous, mindless, and, prior to selection, uniformly randomly distributed in the population. This framework implies that a separate theory of group selection is not strictly necessary to explain observed phenomena such as altruism and cooperation. At the same time, it shows that the notion of group selection does capture a unique aspect of evolution—selection with correlated reproductive risk—that may be sufficiently widespread to warrant a separate term for the phenomenon.
As the prevalence of Alzheimer’s disease (AD) grows, so do the costs it imposes on society. Scientific, clinical, and financial interests have focused current drug discovery efforts largely on the single biological pathway that leads to amyloid deposition. This effort has resulted in slow progress and disappointing outcomes. Here, we describe a “portfolio approach” in which multiple distinct drug development projects are undertaken simultaneously. Although a greater upfront investment is required, the probability of at least one success should be higher with “multiple shots on goal,” increasing the efficiency of this undertaking. However, our portfolio simulations show that the risk-adjusted return on investment of parallel discovery is insufficient to attract private-sector funding. Nevertheless, the future cost savings of an effective AD therapy to Medicare and Medicaid far exceed this investment, suggesting that government funding is both essential and financially beneficial.
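The "multiple shots on goal" logic reduces to elementary probability: with n independent projects, each succeeding with probability p, the chance of at least one success is 1 − (1 − p)ⁿ. The numbers below are hypothetical, chosen only to illustrate the effect, not taken from the paper's simulations.

```python
def prob_at_least_one(p, n):
    """Probability that at least one of n independent projects succeeds,
    given a per-project success probability p."""
    return 1 - (1 - p) ** n

# Hypothetical example: a 5% per-project success rate.
single = prob_at_least_one(0.05, 1)    # 0.05
parallel = prob_at_least_one(0.05, 20) # about 0.64
```

Even with a modest per-project success rate, running twenty projects in parallel raises the chance of at least one success from 5% to roughly 64%, which is the portfolio effect the abstract invokes.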
At the fifth annual CFA Institute European Investment Conference on 19 October 2012 in Prague, Robert C. Merton gave a presentation on analyzing and managing macrofinancial risk. This article is based on his talk and on research he carried out with his coauthors.
The combination of rising home prices, declining interest rates, and near-frictionless refinancing opportunities can create unintentional synchronization of homeowner leverage, leading to a “ratchet” effect on leverage when home prices fall. Our simulation of the U.S. housing market yields potential losses of $1.7 trillion from June 2006 to December 2008 with cash-out refinancing vs. only $330 billion in the absence of cash-out refinancing. The refinancing ratchet effect is a new type of systemic risk in the financial system and does not rely on any dysfunctional behaviors.
We explore a new dimension of fund managers’ timing ability by examining whether they can time market liquidity through adjusting their portfolios’ market exposure as aggregate liquidity conditions change. Using a large sample of hedge funds, we find strong evidence of liquidity timing. A bootstrap analysis suggests that the performance of top-ranked liquidity timers cannot be attributed to pure luck. In out-of-sample tests, top liquidity timers outperform bottom timers by 4.0–5.5% annually on a risk-adjusted basis. We also find that it is important to distinguish liquidity timing from liquidity reaction, which primarily relies on public information. Our results are robust to alternative explanations, hedge fund data biases, and the use of alternative timing models, risk factors, and liquidity measures. The findings highlight the importance of understanding and incorporating market liquidity conditions in investment decision making.
This Article proposes a novel and provocative analysis of judicial opinions that are published without indicating individual authorship. Our approach provides an unbiased, quantitative, and computational answer to a problem that has long plagued legal commentators.
United States courts publish a shocking number of judicial opinions without divulging the author. Per curiam opinions, as traditionally and popularly conceived, are a means of quickly deciding uncontroversial cases in which all judges or justices are in agreement. Today, however, unattributed per curiam opinions often dispose of highly controversial issues, frequently over significant disagreement within the court. Obscuring authorship removes the sense of accountability for each decision’s outcome and the reasoning that led to it. Anonymity also makes it more difficult for scholars, historians, practitioners, political commentators, and—in the thirty-nine states with elected judges and justices—the electorate, to glean valuable information about legal decision-makers and the way they make their decisions. The value of determining authorship for unsigned opinions has long been recognized but, until now, the methods of doing so have been cumbersome, imprecise, and altogether unsatisfactory.
Our work uses natural language processing to predict authorship of judicial opinions that are unsigned or whose attribution is disputed. Using a dataset of Supreme Court opinions with known authorship, we identify key words and phrases that can, to a high degree of accuracy, predict authorship. Thus, our method makes accessible an important class of cases heretofore inaccessible. For illustrative purposes, we explain our process as applied to the Obamacare decision, in which the authorship of a joint dissent was subject to significant popular speculation. We conclude with a chart predicting the author of every unsigned per curiam opinion during the Roberts Court.
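The keyword-frequency idea behind such attribution can be sketched with a toy Naive Bayes classifier. Everything here is invented for illustration (the training "opinions" and author names are not real data, and the Article's actual model is more sophisticated): fit per-author word frequencies with Laplace smoothing, then attribute an unsigned text to the author whose model gives it the highest log-likelihood.

```python
import math
from collections import Counter

# Toy training corpus: one invented snippet per hypothetical author.
train = {
    "Justice A": "the statute plainly commands deference to the agency",
    "Justice B": "the constitution forbids this intrusion on liberty",
}

def fit(corpus):
    """Per-author Laplace-smoothed unigram models over the shared vocabulary."""
    vocab = {w for text in corpus.values() for w in text.split()}
    models = {}
    for author, text in corpus.items():
        counts = Counter(text.split())
        total = sum(counts.values())
        models[author] = {w: (counts[w] + 1) / (total + len(vocab))
                          for w in vocab}
    return models

def predict(models, text):
    """Attribute text to the author with the highest log-likelihood."""
    scores = {author: sum(math.log(m[w]) for w in text.split() if w in m)
              for author, m in models.items()}
    return max(scores, key=scores.get)

models = fit(train)
```

An unsigned fragment such as "deference to the agency" scores higher under the first author's word-frequency model, so it is attributed accordingly.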
In this paper we provide a brief survey of algorithmic trading, review the major drivers of its emergence and popularity, and explore some of the challenges and unintended consequences associated with this brave new world. There is no doubt that algorithmic trading has become a permanent and important part of the financial landscape, yielding tremendous cost savings, operating efficiency, and scalability to every financial market it touches. At the same time, the financial system has become much more of a system than ever before, with globally interconnected counterparties and privately-owned and -operated infrastructure that facilitates tremendous integration during normal market conditions, but which spreads dislocation rapidly during periods of financial distress. A more systematic and adaptive approach to regulating this system is needed, one that fosters the technological advances of the industry while protecting those who are not as technologically advanced. We conclude by proposing “Financial Regulation 2.0,” a set of design principles for regulating the financial system of the Digital Age.
To reduce risk, investors seek assets that have high expected return and are unlikely to move in tandem. Correlation measures are generally used to quantify the connections between equities. The 2008 financial crisis, and its aftermath, demonstrated the need for a better way to quantify these connections. We present a machine learning-based method to build a connectedness matrix to address the shortcomings of correlation in capturing events such as large losses. Our method uses an unconstrained optimization to learn this matrix, while ensuring that the resulting matrix is positive semi-definite. We show that this matrix can be used to build portfolios that not only “beat the market,” but also outperform optimal (i.e., minimum variance) portfolios.
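One standard device for keeping a matrix positive semi-definite (PSD) while optimizing without constraints is to parameterize it through a factor: optimize freely over a lower-triangular L and report C = LLᵀ, which is PSD for any real L. This is our illustration of the general device; the paper's exact parameterization may differ. A 2×2, stdlib-only sketch:

```python
def factor_to_psd(l11, l21, l22):
    """Map unconstrained parameters of a 2x2 lower-triangular factor L
    to C = L @ L.T, which is symmetric PSD by construction."""
    c11 = l11 * l11
    c21 = l21 * l11
    c22 = l21 * l21 + l22 * l22
    return [[c11, c21], [c21, c22]]

def is_psd_2x2(c):
    # A symmetric 2x2 matrix is PSD iff both diagonal entries and the
    # determinant are non-negative (small tolerance for rounding).
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    return c[0][0] >= 0 and c[1][1] >= 0 and det >= -1e-12
```

Because PSD-ness holds for every choice of the factor parameters, a gradient-based optimizer can search over L without ever projecting back onto the PSD cone.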
In this paper, we describe a new approach to financing biomedical innovation that we first proposed in Fernandez, Stein, and Lo (2012) and extend in several ways here: using portfolio theory and securitization to reduce the risk of translational medicine. By combining a large number of drug-development projects within a single portfolio, a “megafund,” it becomes possible to reduce the investment risk to such an extent that issuing bonds backed by these projects becomes feasible. Debt financing is a key innovation because the cost of each drug-development project can be several hundred million dollars; hence, a sufficiently diversified portfolio may require tens of billions of dollars of investment capital, and debt markets have much greater capacity than either private or public equity markets. If these bonds are structured to have different priorities, the most senior class or “tranche” may be rated by credit-rating agencies, opening up a much larger pool of institutional investors who can purchase such instruments, e.g., pension funds, sovereign wealth funds, endowments, and foundations.
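Why diversification makes senior debt feasible can be seen with a binomial tail calculation. The parameters below are hypothetical, not the paper's: n independent projects each succeed with probability p, and the senior tranche is sized so that it is repaid whenever at least k projects succeed; it defaults only when fewer than k do.

```python
from math import comb

def senior_default_prob(n, k, p):
    """P(fewer than k successes out of n independent projects),
    i.e. the probability the senior tranche is not repaid."""
    return sum(comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(k))

# Hypothetical 10% per-project success rate, senior tranche needing
# one success: a 10-project fund defaults ~35% of the time, while a
# 150-project megafund's default probability is negligible.
small_fund = senior_default_prob(10, 1, 0.1)
megafund = senior_default_prob(150, 1, 0.1)
```

As the portfolio grows with the success threshold held proportionally modest, the senior tranche's default probability collapses, which is what makes an investment-grade rating, and hence the much deeper debt markets, reachable.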