
Modern Quantitative Finance

Risk, Performance, and Reporting

By Tom P. Davis, PhD, CFA  |  May 30, 2017

Recently, Dr. Gal Rafael, a physicist from Caltech, wrote a note about “Modern Physics,” a traditional course all physics majors must take. Dr. Rafael observed that what is taught in a typical modern physics course is actually physics dating from the 1920s, and he called on other physicists to think about what actually constitutes modern physics in the 21st century. Earlier this month, I attended the pre-eminent derivatives conference, Global Derivatives, in Barcelona, and began ruminating on a similar question: what constitutes “Modern Quantitative Finance”?

Now, physics has a much longer history than quant finance, which arguably began in the 1960s with Ed Thorp and did not become a full-fledged field until the 1970s with the derivation of the Black-Scholes-Merton equation and its generalizations. We have really had only a few decades with the discipline, so perhaps it is a bit premature to think about “classic” versus “modern” quant finance. But nevertheless, I will.

The Quantitative Finance Arms Race

I like to describe quant finance as an arms race. Once a team of physicists hired by a bank had demonstrated the edge that superior mathematics could provide, every bank started to hire its own team of physicists, known as quants (originally a pejorative term, apparently, but now a sought-after job title). Banks developed ever more intricate and complex models, opening the way to quantitative insight into the risks on banks’ balance sheets (which Alan Greenspan believed to be good for the economy). Unfortunately, this ended the same way all arms races do: in crisis, in this case the credit crisis and the global financial crisis (GFC).

There were many issues leading up to these crises, of course, but overly complex models, and the inability of senior decision makers to understand them (I’m looking at you, Gaussian copula), definitely seeded, or at least exacerbated, the problem.

In 2008 testimony to Congress, Greenspan remarked, “I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders and their equity in the firms.”

So here we are today, still dealing with the impacts of the GFC, with increased regulation, a push to clearing, and living with a multi-curve environment with negative rates. The GFC heralded the biggest change in quant finance since the introduction of the Black Scholes equation.

Before 2007, the best quants worked in the front office of the largest institutions, extracting any edge they could from very complex mathematical models, with traders as the ultimate arbiters of truth. After the GFC of 2008, many of these quants moved into the middle office, poring over legal contracts once thought mundane. The reason was that credit support annexes, attached to all International Swaps and Derivatives Association (ISDA) deals to describe how collateral is to be managed between the counterparties, contained a great deal of optionality (such as the ability to post any currency as collateral), and any optionality needs to be valued. Another seismic shift saw the biggest institutions shutting down their quantitative desks, sending many of the best minds of our generation to the buy-side.
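A minimal sketch of why that collateral-currency optionality has value: when a credit support annex allows collateral to be posted in several currencies, a rational counterparty posts the cheapest one, so the effective discount rate at each point in time is the maximum of the eligible collateral rates. The currencies, rates, and yearly compounding below are illustrative assumptions, not market data or any particular firm's methodology.

```python
import math

def discount_factor(rate, years):
    """Continuously compounded discount factor at a single flat rate."""
    return math.exp(-rate * years)

def ctd_discount_factor(rates_by_year, years):
    """Cheapest-to-deliver discounting (crude sketch): over each
    one-year period, apply the highest eligible collateral rate,
    since that is the currency a rational poster would deliver."""
    df = 1.0
    for t in range(years):
        best = max(rates_by_year[t].values())  # poster picks the highest rate
        df *= math.exp(-best)
    return df

# Hypothetical flat collateral (OIS) rates per year for two eligible currencies
rates = [{"USD": 0.010, "EUR": 0.015},
         {"USD": 0.020, "EUR": 0.012}]

single = discount_factor(0.010, 2)   # discounting at USD collateral only
ctd = ctd_discount_factor(rates, 2)  # switching to whichever rate is higher
```

Because the maximum rate is at least as large as any single currency's rate, the cheapest-to-deliver discount factor is never greater than the single-currency one; the gap between the two is exactly the value of the switching option that middle-office quants were suddenly asked to measure.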

This shift was recognized by the Global Derivatives conference organizers. The conference was preceded by a buy-side summit, and many of the talks focused on the buy-side (like Riccardo Rebonato's Smart Beta for Fixed Income). Although there were talks on new models (such as A New Dynamic Model for CDOs), most of the conference tracks focused on issues that were traditionally (pre-2008) thought of as outside the realm of quant finance:

  • Software efficiency (automatic differentiation and GPUs)
  • Regulation (What’s next for xVA?)
  • Clearing and Initial Margin (CVA and IMM)
  • The “real world measure” (P versus Q)
  • Machine learning

Each of the above points deserves its own discussion, if not an entire semester devoted to the subject. (I've written more on the P vs. Q measure in a past Insight article.)
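To give a flavor of the first bullet, automatic differentiation lets a pricing library obtain exact sensitivities in a single pass, instead of re-pricing under bumped inputs. The toy below is a forward-mode sketch using dual numbers, applied to a standard Black-Scholes call; it is an illustration of the technique, not any production implementation, and the class and function names are my own.

```python
import math

class Dual:
    """A value paired with its derivative, propagated through arithmetic."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val / o.val,
                    (self.der * o.val - self.val * o.der) / o.val ** 2)

def d_log(x):
    return Dual(math.log(x.val), x.der / x.val)

def d_ncdf(x):
    """Standard normal CDF with chain-rule derivative (CDF' = PDF)."""
    pdf = math.exp(-x.val ** 2 / 2) / math.sqrt(2 * math.pi)
    return Dual(0.5 * (1 + math.erf(x.val / math.sqrt(2))), pdf * x.der)

def black_scholes_call(S, K, r, sigma, T):
    d1 = (d_log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * d_ncdf(d1) - K * math.exp(-r * T) * d_ncdf(d2)

# Seed a unit derivative on spot; one pricing pass yields price and delta.
S = Dual(100.0, 1.0)
price = black_scholes_call(S, 100.0, 0.01, 0.2, 1.0)
# price.val is the option price; price.der is its delta with respect to spot.
```

GPU vendors and quant libraries apply the same idea (usually in reverse, or "adjoint," mode) at vastly larger scale, which is why the topic earned its own conference track.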

The Next Quantitative Leap

The fact that quant finance has changed drastically was not lost on the conference organizers; we saw two talks specifically on this subject: The Future of Quant Finance and What Language Should a Quant Speak? (Strangely, the consensus for the latter was Danish.) In The Future of Quant Finance, John Hull, Professor of Derivatives and Risk Management at the Rotman School of Management at the University of Toronto and one of the founding fathers of quant finance, spoke about the necessity of embracing change: what you will be working on five years from now will certainly be different from what you are working on today. This sage advice cannot be ignored and is one of the core reasons quant finance is such an interesting field. When I started in quant finance in 2006, after a PhD in theoretical physics, I was worried that it would be a boring application consisting entirely of solving known partial differential equations (PDEs). I was happily mistaken.

Another point raised at the event was that universities now have programs devoted to quant finance, and their curricula cover many deep and interesting topics. However, when the brightest graduates of these programs enter the finance industry, they are often not solving interesting problems. Jesper Andreasen, Head of Quantitative Research at Danske Bank, had some advice for these grads: “No one is going to give you a PDE to solve on your first day. You must do the mundane stuff, or better yet have a computer automate the mundane stuff, and go seek out interesting problems to solve.”

This shift away from heavy quantitative models toward computational efficiency has had other ramifications as well. The profile of university graduates being hired is shifting from math and physics toward computer science. At the same time, large data firms such as Google and Facebook are hiring graduates who can solve PDEs to work on very interesting problems in big data.

My response to this is to tell all physics and math grads to learn software engineering skills such as design patterns, algorithms, and collaboration. These are tools that any modern quant needs in their toolkit. 

Gone are the days when a quant writes equations and a software engineer implements them in a production system. That archaic structure results in a low-fidelity implementation (the math could be wrong) and a much longer lag between a model's inception and its appearance in production. The flip side is that a modern quant must know intimately the software engineering issues that arise in production code, such as design patterns and best coding practices. The most important lesson we learned from object-oriented software development is that the abstractions in the object model must reflect the relevant abstractions in the technical domain, which requires quants to be central to the architecture of a quantitative financial library. This is not to say that pure software engineers have no place in a quantitative codebase: the production system should be built so that the most detailed software engineering concerns (memory management, multi-threading, and parallelization) are abstracted away from the quant.
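A minimal sketch of what "the object model reflects the domain" can mean in practice: instruments know their cashflows, curves know how to discount, and the pricer depends only on those two abstractions. All names here are hypothetical, and the flat curve and annual-coupon bond are deliberately simplistic.

```python
from abc import ABC, abstractmethod
import math

class DiscountCurve(ABC):
    @abstractmethod
    def df(self, t: float) -> float:
        """Discount factor to time t (in years)."""

class FlatCurve(DiscountCurve):
    """Continuously compounded flat curve, the simplest possible model."""
    def __init__(self, rate: float):
        self.rate = rate
    def df(self, t: float) -> float:
        return math.exp(-self.rate * t)

class Instrument(ABC):
    @abstractmethod
    def cashflows(self):
        """Yield (time_in_years, amount) pairs."""

class FixedRateBond(Instrument):
    """Annual coupons plus notional repayment at maturity."""
    def __init__(self, notional, coupon, maturity_years):
        self.notional, self.coupon, self.maturity = notional, coupon, maturity_years
    def cashflows(self):
        for t in range(1, self.maturity + 1):
            yield (float(t), self.notional * self.coupon)
        yield (float(self.maturity), self.notional)

def present_value(instrument: Instrument, curve: DiscountCurve) -> float:
    # The pricer knows only the domain abstractions, not their internals.
    return sum(amount * curve.df(t) for t, amount in instrument.cashflows())

pv = present_value(FixedRateBond(100.0, 0.05, 5), FlatCurve(0.03))
```

The point of the design is that a quant can add a new curve model or a new instrument without touching the pricer, while the engineering team can optimize how `present_value` runs (threading, memory, vectorization) without touching the math.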

This year’s Global Derivatives conference showed clear signs of a field in transition: a much greater focus on the buy-side, regulation, and computational techniques, and less emphasis on new models and mathematical techniques. To paraphrase Hull, what is presented at Global Derivatives 2023 will be very different from the presentations at Global Derivatives 2017. Hopefully, the reader will agree that this look at the derivatives field better outlines what defines modern quantitative finance.

Tom P. Davis, CFA

Vice President, Director of Research, Fixed Income & Derivatives

Dr. Tom Davis is Vice President, Director of Research, Fixed Income and Derivatives at FactSet. In this role, he is focused on ensuring FactSet provides the highest quality derivative analytics and on growing coverage across all asset classes. His team also conducts cutting-edge research into the models and methods of quantitative finance, which will ultimately increase the speed and accuracy of FactSet analytics. Prior to FactSet, Dr. Davis spent four years at Numerix as Vice President of Product Management in charge of its flagship product, and four years managing a team of quantitative analysts at FINCAD focused on arbitrage-free modeling of interest rates and foreign exchange rates to price exotic hybrid derivatives. Dr. Davis earned a Doctor of Philosophy in theoretical physics from the University of British Columbia in Vancouver, Canada.


The information contained in this article is not investment advice. FactSet does not endorse or recommend any investments and assumes no liability for any consequence relating directly or indirectly to any action or inaction taken based on the information contained in this article.