Job Description:
As an integral part of the Quantitative Research and Investing (QRI) group, the Common Risk Platform (CRP) team is responsible for the data and tools that enable advisors across Fidelity to measure and report on risk in their portfolios. This highly skilled team owns a cross-asset, production platform for risk computation, the quality of input data to those computations, and the correctness of the results.
As a Data Analyst on the Risk Platform team, you will help ensure the accuracy of inputs to the Risk Platform and the quality of its outputs. Your work will involve building automated data quality checks and conducting ad hoc analyses to assess the accuracy of risk data. Your team is spread across the globe, with key colleagues in London from whom you can learn and grow. Your efforts will span deep technical work on datasets as well as high-level executive communication and interaction with quant researchers.
The Expertise You Have:
- A Bachelor’s degree, or higher, in computer science, engineering, mathematics, or finance.
- 4+ years of data analysis and data quality work in the risk field, involving datasets including MSCI and Barra, with substantial fluency in risk reporting and metrics.
- Demonstrable track record of leadership in independent, technical initiatives in data quality and risk at a premier financial services firm.
- Highly analytical, with the ability to quickly comprehend large datasets to develop new processes, perform calculations, and identify anomalies.
- Highly proactive and self-motivated, able to meet objectives under minimal supervision.
The Value You Deliver:
- Certify risk datasets for production use, identifying both common and esoteric data quality issues before they flow into the system
- Verify key risk reports consumed by the rest of the firm, investigating exposures that look out of place and providing domain expertise to clients
- Use your experience to identify opportunities for automation and tooling that allow your team to scale and accelerate
- Work with team members to develop and share best practices in data quality and anomaly detection
- Help management understand operational risks and align your team with the overarching goal of zero defects
The Skills You Bring:
- Ability to write complex queries in SQL across large datasets and to debug stored procedures
- Expertise in manipulating data in Python or R, including joining datasets, using statistics packages to categorize data and identify outliers, and automating data processes
- Background in statistics, linear algebra, and calculus as applied to portfolio risk and anomaly detection
- Domain expertise in financial datasets from vendors like Bloomberg and MSCI and tools such as BarraOne and RiskManager
- Understanding of risk models and the computations behind them, with intuition for whether a risk value is sensible for a given portfolio or security
- Detailed knowledge of multiple asset classes, including equities, fixed income, commodities, and alternative investments, and the relevant analytics for each (e.g., financial ratios, duration, OAS, Greeks, implied volatility)
- Experience with data quality rules and frameworks for financial datasets
- Experience with cloud-native platforms (e.g., AWS, Snowflake)
- Experience working in agile environments using tools such as Jira, Confluence, and GitHub