Financial firms have always relied on an array of data and technological processes to get the intelligence they need to make decisions. However, with the transition to today’s advanced technology environments occurring almost overnight, many organizations are still relying on the same data architectures, software, and workflows they have used for years.
As we discussed in the last article of this series, technological advancement has created new opportunities for analysis in the form of alternative data, the use of which has become increasingly common over the last five years. But with the growing popularity of data that allows investment professionals to test novel hypotheses and ideas comes a dilemma concerning the sourcing and cleanliness of that information.
FactSet, in partnership with Coleman Parkes, recently surveyed Heads/Directors/VPs of Quantitative Analysis, Senior Quantitative Analysts, Data Scientists, and Chief Data Officers from 50 hedge funds and institutional asset management firms across the U.S., EMEA, and APAC. The results show that quantitative analysts are becoming more concerned with the quality and accuracy of the data they obtain, and less concerned with the cost of getting it.
The Need for Quality Content
In the survey, respondents ranked nine aspects of data management in order of importance. Cost came in seventh out of the nine options overall, with only 6% of respondents naming it most important. Data accuracy (22%) and data quality (20%) were ranked as the first and second most important aspects of data management. Offering a more complete market perspective took third place overall, with 16% of respondents placing it highest.
Even items such as speed of data provision and delivery frequency, which relate more to distribution than to data quality, ranked significantly higher in importance than cost, as did provider reputation, which came in sixth overall.
Changes in What Content Is Valued
In addition to assessing the content quality standards that quantitative analysts require, the survey looked at the types of content data managers say they’re using. The findings reveal substantive differences between the content respondents use most and the core content they value most.
The survey found that the top three types of content currently being used (economics, events and transcripts, and global sanctions) ranked much lower on the list of core content currently valued most, at 17th, 16th, and 6th of 19, respectively. Conversely, the most valued types of content (benchmarks, prices, and fundamentals) ranked 4th, 17th, and 13th, respectively, on the list of content currently being used.
Serving quantitative analysts’ data needs is a field ripe for improvement. Respondents made it clear that they believe they’re spending too much time on operational demands, which detracts from their ability to keep up with the kinds of alternative data they expect will be of greater value in the future. Looking specifically at the types of content available and the processes in place to ensure their quality, two major themes emerge:
The first theme is a significant disconnect between the information analysts value and what they currently use. One possible explanation is that respondents rate certain systems or functions more highly than what they actually get out of them. Another is that the core content people value may not be what they believe should carry the greatest value. Notably, the types of core content expected to be most valued in the future (fundamentals, prices, and people) currently sit in the lower half of the content-in-use rankings.
The second theme concerns data quality: the increasing importance of data management was evident in how respondents weighed quality against cost. These findings raise the possibility that quants face mounting pressure to evaluate higher volumes of data and sources, something to bear in mind when evaluating the need to expand data coverage.