It makes sense to start the discussion of the Silver Age with a few words about Colorado, where it all began. No, I don’t mean the Comstock Lode of the Louis L’Amour novels my dad used to read, but rather the Colorado that started the modern age of financial technology through the creation of the field of econometrics: statistical analysis applied to the copious and dirty data of finance.
The Great Depression and the Birth of Modern Financial Analysis
It all started, as it often does, with a crisis. The Great Depression had hit the United States, and a young man from a wealthy family retreated to a sanatorium in the Rocky Mountains to see if his malady could be cured by the clean, crisp air. It was in Colorado that young Alfred Cowles’ mind started to race, and he began to examine why his family had lost so much money in the stock market crash, especially when it had had access to the best analysts money could buy. To answer this question, he took the analysts’ paper reports, compiled them into tables, and began to analyze them. In this sense, the whole field of econometrics was born out of a young man analyzing the analysts. It was in examining analyst behavior and forecasts that he and those around him built a new language of analysis, but it would take until after the war for that language to be imbued with enough power that truly new insights began to emerge from the data. While much of the work in developing the computers that could carry out the analysis was done back East, it was nearby in Denver that groups started to provide the data those new machines would need to feed what would become an insatiable appetite for analysis.
The oldest data firm in the United States is Standard & Poor’s (now just S&P Global Inc.), a company originally formed in 1860 by a man with the unfortunate name of Poor to compile information on U.S. canals and railroads. That company eventually merged with a firm that provided standardized information on companies (as opposed to “as reported” information, or information in the form that companies provided it themselves). The merged company, rechristened Standard & Poor’s (S&P), opened an office in Denver, and that office spent over three years developing a product targeted at providing electronic information to investors, meant to facilitate a new kind of computerized research. Appropriately, S&P named the product “Compustat.”
The First Financial Data Source Crafted for Computer Analysis
Compustat was the first product of its kind, and by the 1960s it was delivering information that had been massaged so that mainframe computers could easily process it and perform analysis in a comprehensive and comparative manner. While many users considered the product innovative, most investors in those days chose to continue using the “as reported” data provided in firms’ own annual reports or by S&P in paper booklets, a habit most so-called “fundamental investors” retain to this day. As a result, the computerized product remained a niche offering for years, more of an oddity than a truly profitable commercial product.
That remained the case until a local Denver investor, Harold Silver, grasped the potential value of using the computer to do more analysis over more companies than any group of human beings could manage. Harold understood that what he’d lose in precision he could gain in breadth of application, and so he began building what would become the first systematic research system meant to “mine” the data for alpha, or investment insights. In this very real sense, Harold Silver became the first of what we’d come to call “quantitative investors,” or “quants.” But who was this man, the first among quantitative investors?
Harold Silver and Brigham Young University
Harold Silver was a natural innovator, the epitome of the self-made man, starting with nothing but his wits and applying his knowledge to the steel, mining, sugar, and shipbuilding industries, building several fortunes in the process. He’d actually started out in the Columbia University engineering program, but due to lack of money and a deeply rooted desire to succeed in the business world, he dropped out of school and returned to Salt Lake City, later moving to Denver as his business empire grew. Eventually, after selling his business interests, Harold turned his attention to his investment portfolio. Working with his financial advisors in Denver and computer scientists at the University of Colorado at Boulder, he became the first person to systematically use the S&P data to produce an “index,” or broad set of investment recommendations.
While Harold mostly used his system for his own portfolio (to great effect), he eventually gave the system to Brigham Young University a few years before he died in the early 1980s, along with several million dollars to seed an academic program in what we’d today call quantitative investing. Much of that money was spent on other projects, but an aspect of it still remains. As important a landmark as that is in the investment world, it is in his work with S&P, helping them verify their process and data, where perhaps Harold Silver’s greatest legacy lies.
Mining the Analysts for Alpha
Working with groups such as Merrill Lynch and other brokers, Harold Silver and his programmers in effect mined the insights of fundamental research analysts for alpha, or investment outperformance. His programmers also became the first external group to systematically verify and improve the data for S&P and other providers, helping the data achieve the level of quality investors could rely on for a real investment process. In so doing, Harold prepared the way for others, such as researchers in Berkeley, California, to use the data to even greater effect, and in ways that investors like Harold never would have imagined.
I was first introduced to the history of Harold Silver at 18 years old, when I went to work for the man who translated Harold Silver’s machine code into Fortran IV for BYU. That was my introduction to quantitative investing and to the importance of sell-side analysts to the investment process. In retrospect, looking back over so many years, the code and process were elegant, years ahead of their time, utilizing sophisticated techniques such as sigmoid functions, splines, and careful data-cleansing routines: ideas that had been discussed by econometricians going back to the days of Cowles, but that to this day are rarely used, and even more rarely used correctly.
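To give a flavor of what such routines do, here is a minimal modern sketch of the two ideas mentioned above, written in Python. This is not Silver’s actual code (that lives in the archives described below); the function names, parameters, and the toy earnings series are my own illustrative assumptions. A sigmoid-family transform can softly compress outliers instead of hard-clipping them, and a spline can fill gaps in a time series:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sigmoid_squash(x, center=0.0, scale=1.0, limit=3.0):
    """Softly compress values toward +/- `limit` scaled units around
    `center`, using tanh (a sigmoid-family curve) instead of hard clipping.
    Values near the center pass through almost unchanged; extreme
    outliers are bounded but keep their ordering."""
    z = (x - center) / scale
    return center + scale * limit * np.tanh(z / limit)

def fill_gaps_with_spline(t, y):
    """Fill missing observations (NaNs) by fitting a cubic spline
    through the valid points and evaluating it at the gaps."""
    mask = ~np.isnan(y)
    spline = CubicSpline(t[mask], y[mask])
    filled = y.copy()
    filled[~mask] = spline(t[~mask])
    return filled

# Toy example: a quarterly earnings-per-share series with one
# suspicious print (9.0) and one missing quarter.
t = np.arange(8, dtype=float)
eps = np.array([1.0, 1.1, 1.2, 9.0, np.nan, 1.4, 1.5, 1.6])

eps_filled = fill_gaps_with_spline(t, eps)
robust = sigmoid_squash(eps_filled, center=np.nanmedian(eps), scale=0.5)
```

The design point is the one the old econometricians understood: a smooth, invertible transform preserves rank information that a hard clip destroys, which matters when the “outlier” might be a data error or might be real.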
The University of Utah Harold Silver Papers
If anyone wishes to see the original Silver code, we at the University of Utah hold the archives of this amazing work; one must simply come here in person and request the Harold Silver papers. Be warned if you do: there are dozens of boxes of information documenting this amazing man’s seminal work in finance. If you want a more compact read, I’d suggest getting a copy of Leonard Arrington’s book on Harold Silver.
As a final note, upon review I can clearly say that the Berkeley finance revolution (which we will talk about in the next entry) simply could not have happened had the foundation not been laid so well in Colorado, whether through econometrics, computerized data, or the first computerized system that mined for alpha.