Distribution (or Showcasing the Secret Sauce)

Risk, Performance, and Reporting

By Pat Reilly  |  August 3, 2021

In this five-part series, FactSet’s Pat Reilly, Director of Analytics, will examine the theme of data governance and distribution through the lenses of data sourcing, integration, quality, analysis, and distribution across internal and external clients. Combined, these provide asset managers and asset owners with an overview of the key elements to be considered when constructing an efficient data governance and distribution process.

Our final piece takes on the theme of distribution; the full series can be downloaded here.

How Do We Define "Distribution"?

In asset management, “distribution” tends to be synonymous with sales and fund flows. In that regard, distribution is mission critical. Shift the perspective to a governance lens, however, and distribution becomes far more than the act of asset gathering.

The reality is that distribution is a balancing act between competing parties and priorities. It’s a series of activities that begins with the recognition that data is an asset from which each firm and user should seek to generate a return. But how to do this and where to start?

Interestingly, the natural starting point is at the conclusion. For a given context, who is the audience, what is the relevant data, and how does it get there? Internal and external stakeholders all have a say.

Internal stakeholders might be investment teams, risk managers, compliance teams, sales and marketing, or executive committees. The relevant data will be broader in scope than what is typically distributed externally. Multi-horizon performance figures and attribution, risk decomposition, and exposures and characteristics are the tip of the iceberg. Investment committee reporting will typically encompass quantitative as well as qualitative elements. Risk teams will want to tweak stress tests or model portfolios to account for the macroeconomic environment. Compliance groups will require verified source documents for regulatory reporting as well as books-and-records retention. Sales and marketing will have a standard database template as a matter of course, then follow up with a request for a pitch book for the latest strategy.

External stakeholders include clients and prospects, third parties like consultants or vendors, and regulators. As with internal audiences, the relevant dataset for external audiences will contain standard and bespoke elements. Client- and prospect-facing materials will combine portfolio-level analytics, performance, attribution, and risk measures with single-security detail and commentary that closes the feedback loop and reiterates the firm’s secret sauce. Third parties like consultant databases and data aggregators tend to follow a standard operating procedure around content and timing. Regulatory reporting also follows a standard approach, with the caveat that the environment is ever-evolving and usually combines security-, portfolio-, and composite-level details. Reporting is typically a summary, with verified source data retention policies requiring detailed backup of the summary view.

These stakeholders and data requirements may seem intimidating, making data seem less of an asset and more of an inescapable liability. But all is not lost! Selecting the appropriate delivery mechanism can streamline the sources utilized, improve end-user flexibility, and enhance security and access controls across the firm. Reframing distribution from a purely sales construct to the final step in an effective governance strategy ties sourcing, integration, quality, and analysis together.

In the good old days, a firm would meet these disparate needs with a hybrid approach: a market data terminal, basic static output that would then undergo manual aggregation and transformation, and maybe an early API. Today, that approach is akin to a first-generation iPod. Sure, it works, but the technology has improved so much that one would be remiss not to at least look at what else is out there. As a starting point, let’s break this into interactive and production uses.

Interactive Distribution: Real Time and Tactile

Interactive use cases have grown exponentially as the demand for real-time data and analytics grows. This increase might seem like a governance nightmare, but it doesn’t have to be. Certainly, the terminal model continues to be a dominant force across investment professionals. But the portability of access via web-based utilities and mobile applications has become paramount. Unlocking professionals, especially client-facing ones, from their desktops is now table stakes. It’s not enough to meet the data need; it must now be on-demand, location agnostic, and from one source of truth. This alignment around portability also eases governance concerns since, ostensibly, the content is the same as what would be received on a desktop instance.

Firms are also leveraging business intelligence tools to surface data in unique combinations or visualizations. This might be built leveraging APIs, or it may source entirely from a data warehouse. The natural extension of this is a bespoke portal. To date, this is most common in the retail space. However, if we inventory reporting needs across audiences, surfacing the most common elements along with basic market data to the firm via a portal makes a lot of sense for other financial institutions as well. User error is eliminated, access rights can be centralized, and the user experience can be tightly defined.

Interactive uses will continue to evolve; however, each firm will have its own timeline. There is no single prescription for all.
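To make the API-to-BI-tool path concrete, the sketch below pulls portfolio exposures from a generic REST endpoint into a DataFrame that a BI layer or portal widget could consume. This is a minimal illustration only: the URL, token, query parameters, and response shape are assumptions for the example, not any particular vendor’s API.

```python
import requests
import pandas as pd

# Hypothetical endpoint and token: substitute your analytics provider's
# actual API and credentials. Nothing below names a real vendor API.
API_URL = "https://api.example.com/v1/portfolio-analytics"
API_TOKEN = "YOUR_TOKEN"

def fetch_exposures(portfolio_id: str) -> pd.DataFrame:
    """Pull current exposures for one portfolio and return a tidy frame
    ready for a BI tool, a portal widget, or a warehouse staging table."""
    resp = requests.get(
        API_URL,
        params={"portfolio": portfolio_id, "metric": "exposure"},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()        # fail loudly so bad data never reaches users
    rows = resp.json()["data"]     # assumed response shape: {"data": [...]}
    return pd.DataFrame(rows)

# One source of truth: the same call can feed a dashboard refresh, a
# mobile view, or a scheduled extract, so every audience sees the same figures.
exposures = fetch_exposures("GROWTH_EQ_01")
print(exposures.head())
```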

The ABCs of Distribution Automation

Shifting the discussion from interactive to production uses, some overlap occurs with the concept of a data warehouse. This concept may be used interchangeably with an accounting book of record (ABOR) or investment book of record (IBOR); however, understanding the warehouse as a singular distribution staging area outside of that functionality is the proper starting point. A data warehouse contains pricing, security master and position data, performance and risk analytics, and other proprietary elements that are utilized downstream in a variety of ways. The warehouse will be populated in several ways, usually via scheduled batch reports or flat files, but increasingly via APIs, all of which stem from the same interactive platform provider. Aligning the inputs across use cases shifts data consistency and quality from an “either-or” argument to a “like-for-like” conversation.
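As a rough illustration of that staging-area idea, here is a minimal Python sketch in which SQLite stands in for whatever warehouse technology a firm runs. The file, table, and column names are assumptions for the example; the point is that batch and API feeds land in the same table, tagged with their source.

```python
import sqlite3
from datetime import datetime, timezone

import pandas as pd

# Illustrative only: SQLite stands in for the warehouse, and the file,
# table, and column names below are assumptions for the sketch.
conn = sqlite3.connect("warehouse.db")

def load_positions(frame: pd.DataFrame, load_source: str) -> None:
    """Land position data in one staging table regardless of how it
    arrived, tagging each batch so lineage survives downstream."""
    frame = frame.assign(
        load_source=load_source,
        load_ts=datetime.now(timezone.utc).isoformat(),
    )
    frame.to_sql("stg_positions", conn, if_exists="append", index=False)

# A scheduled flat-file drop and an API pull land in the same table,
# which is what turns "either-or" into "like for like."
batch = pd.read_csv("positions_20210803.csv")   # hypothetical batch file
load_positions(batch, load_source="batch_csv")
```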

The other production use case to explore is publishing, which generally takes one of three approaches. The first, automating the process from data population through verification, commentary, compliance signoff, and delivery using an end-to-end reporting solution, has seen a tremendous amount of growth recently; even better is when the solution incorporates API or portal options and is data source agnostic. The second, tackling reporting needs via a managed service, is a viable option for firms that would prefer to deploy headcount and/or spend outside of the middle office. Finally, manual reporting is still very much alive. Although it carries the operational risks outlined earlier in this series, those can be somewhat alleviated by source alignment and process documentation.
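To show the shape of that first, end-to-end approach, here is a deliberately simplified Python sketch in which each stage gates the next. Every class and function name is hypothetical; real reporting solutions wrap these stages in scheduling, templating, and audit trails.

```python
from dataclasses import dataclass

@dataclass
class Report:
    client: str
    body: str = ""
    verified: bool = False
    signed_off: bool = False

def populate(report: Report, analytics: dict) -> Report:
    # Stage 1: pull figures from the verified source of truth.
    report.body = f"Performance summary for {report.client}: {analytics}"
    return report

def verify(report: Report) -> Report:
    # Stage 2: reconcile the rendered figures against source data.
    report.verified = bool(report.body)
    return report

def sign_off(report: Report) -> Report:
    # Stage 3: compliance cannot approve what has not been verified.
    if not report.verified:
        raise ValueError("cannot sign off an unverified report")
    report.signed_off = True
    return report

def deliver(report: Report) -> None:
    # Stage 4: delivery is gated on sign-off, never on a deadline.
    assert report.signed_off, "delivery blocked pending compliance sign-off"
    print(f"Delivering report to {report.client}")

deliver(sign_off(verify(populate(Report("Client A"), {"1Y return": "8.2%"}))))
```

The design choice worth noting is that each stage refuses to run on an input the prior stage has not cleared, which is the operational-risk point above expressed as code.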

Conclusion

In summary, it is a new age for distribution. Once stakeholders have been identified, delivery mechanisms can easily source verified, locked-down datasets, mitigating the business and process risks of distribution and leaving firms free to get back to the basics of asset management and asset gathering.

Disclaimer: The information contained in this article is not investment advice. FactSet does not endorse or recommend any investments and assumes no liability for any consequence relating directly or indirectly to any action or inaction taken based on the information contained in this article.

Pat Reilly, CFA

Senior Vice President, Senior Director, Americas Analytics

Mr. Pat Reilly is Senior Vice President, Senior Director of FactSet’s Analytics solutions for the Americas. In this role, he focuses on providing content, analytics, and attribution solutions to clients across equities, fixed income, and multi-asset class strategies. Prior to this role, Mr. Reilly headed the Fixed Income Analytics team in EMEA and began his career at FactSet managing the Analytics sales for the Western United States and Canada. Before joining FactSet, he was a Credit Manager at Wells Fargo and an Insurance Services Analyst at Pacific Life. Mr. Reilly earned a degree in Finance from the University of Arizona and an MBA from the University of Southern California and is a CFA charterholder.
