All the research topics covered by the QMI are structured around the three main stages of the creation of a quantitative investment strategy. The first stage is signal generation; particular attention is paid to the use of artificial intelligence in the production of signals. The second stage involves portfolio construction and dynamic risk management; here we are particularly interested in the optimal use of techniques from the world of derivatives. Issues related to the potential crowding caused by the joint use of related strategies by different funds are also addressed. Finally, the last stage covers all the challenges of the real-world implementation of the paper portfolios produced in the previous stage.

1. Signal Generation


Statistical Signal Processing & Machine Learning in Finance

The application of signal processing techniques to the estimation of factor models, the detection of outliers, the filtering of trends and the robust estimation of Kalman models is a historical research field of the IdR QMI. Our industrial partners were originally interested in using mathematical methods to make investment decisions. At that time, statistical signal processing was the most promising approach to exploit the information contained in historical time series. A recent publication, Chevalier and Darolles (2022), may be classified in this category (see Publications). The objective is to allocate money to a portfolio of different trend-following systems. The risk of this strategy is then linked to the probability of simultaneously observing breaks in the trends characterizing different markets. Forthcoming working papers will feed this strand of research, in particular through the use of machine learning techniques to relate observed trends to economic environment variables. This will link this topic to the following one.
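As a minimal illustration of the kind of trend filtering mentioned above, a local-level Kalman filter can extract a smooth latent trend from a noisy series. The model and the noise variances `q` and `r` below are generic assumptions for the sketch, not the specification used in the cited papers:

```python
import numpy as np

def kalman_trend(y, q=0.01, r=1.0):
    """Local-level Kalman filter: latent trend m_t = m_{t-1} + w_t with
    w_t ~ N(0, q), observed as y_t = m_t + v_t with v_t ~ N(0, r).
    q and r are illustrative noise variances, not estimated values."""
    y = np.asarray(y, dtype=float)
    m, p = y[0], 1.0                 # initial state mean and variance
    trend = np.empty_like(y)
    for t, obs in enumerate(y):
        p_pred = p + q               # prediction step: variance grows by q
        k = p_pred / (p_pred + r)    # Kalman gain
        m = m + k * (obs - m)        # update with the innovation
        p = (1.0 - k) * p_pred
        trend[t] = m
    return trend
```

With a small state-noise variance relative to the observation noise, the filtered series is markedly smoother than the raw input.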
Machine Learning (ML) is indeed a promising technique for processing large sets of information. Arthur Stalla-Bourdillon and several co-authors test the usefulness of ML for sovereign risk assessment and pricing in the euro area. They document the predictive accuracy of ML methods relative to traditional econometric methods and assess which economic factors matter most for the market perception of sovereign risk (see Belly et al. working papers).

Serge Darolles, Gaëlle Le Fol, and Beatrice Sagna with another co-author are working on volume prediction (univariate and multivariate) models using machine learning methods. Their first results presented in a working paper show that machine learning techniques outperform ARMA and SETAR specifications both in and out of sample.

Finally, we have organized four hackathons on AI and ML in asset management. The last one was organised in February 2022 with 66 participants and 41 teams (see Hackathon 2022).

These last two topics are of course related to the next one.

Big data, machine learning and Alternative Data in Finance

Serge Darolles and Gaëlle Le Fol, with the support of a former ENSAE Master's student, have studied the use of nowcasters to predict regime changes. The first results, presented in Darolles, Deni, Le Fol (2022), are promising, and further developments will be tested.

Arthur Stalla-Bourdillon and his co-authors published a paper in the Journal of International Money and Finance underlining how relying on a large dataset of sectoral equity prices performs better in macro-forecasting than using aggregate equity prices as a predictor. Embedding these sectoral stock prices into a factor model also outperforms conventional benchmarks such as the term spread (see Publications).

Finally, an “Alternative Data in Finance” special invited session led by Serge Darolles, a member of the QMI, was organised at the Computational Financial Econometrics (CFE) conference in London in December 2022 (see CFE conference 2022). A companion publication in Option Finance (November) by Serge Darolles addressed the same topic.

The development of ESG datasets is a new and promising field for this topic. Research is being conducted on the link between ESG ratings and the risk premia of listed securities. Darolles, He and Le Fol (2022) are interested in the role played by institutional investors in this relationship, analysing the change of sign that institutional investors could cause in it. Darolles, Faverjon and Lambert (2022) are interested in the information contained in ESG scores. They first break down this information into a systematic part, resulting from a direct analysis of the raw information, and a part specific to each rating agency. They measure the predictive power of these two separate parts on stock returns, and then highlight the role of ESG rating agencies in price formation. A session on "Causal inference in sustainable investing" will be organized at the next CFE conference in London, in December 2024.

Risk premia

Serge Darolles is working with his former PhD student Charles Chevalier on the characterization of a multi-asset trend-following risk premium that can be used to explain the cross-sectional dispersion observed in the CTA space. The corresponding risk factor can be used to improve the explanatory power of the linear factor models generally used to analyse hedge fund portfolios. A first publication in the Journal of Asset Management in 2019 reports all the results obtained on momentum strategies. A second paper (“Diversifying Trends”) by the same co-authors is forthcoming in Econometrics and Statistics. The main objective of this research is to extract what the trends observed on different markets have in common.
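A generic example of the kind of per-market trend signal underlying such strategies is a moving-average crossover; the window lengths below are illustrative assumptions, not the construction used by Chevalier and Darolles:

```python
import numpy as np

def sma(x, n):
    """Simple moving average over windows of n observations."""
    c = np.cumsum(np.insert(np.asarray(x, dtype=float), 0, 0.0))
    return (c[n:] - c[:-n]) / n

def trend_signal(prices, fast=20, slow=100):
    """+1 (long) when the fast moving average is above the slow one,
    -1 (short) otherwise; the window lengths are hypothetical."""
    f = sma(prices, fast)
    s = sma(prices, slow)
    f = f[len(f) - len(s):]            # align both series on the same end dates
    return np.where(f > s, 1.0, -1.0)
```

Averaging such signals across markets gives a crude multi-asset trend portfolio; the cited research studies what these per-market trends have in common.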
Julien Royer, Jean-Michel Zakoian and a co-author are extending the theory of the estimation of dynamic conditional betas. In particular, they relax the short-memory assumption frequently imposed on volatility models, which can be restrictive in empirical applications. They investigate the existence of a carbon risk premium in the cross-section of US industry portfolios.
Serge Darolles, Gaëlle Le Fol and Gulten Mero are working on a regime-switching approach to study the existence of risk premia. They apply their methodology to the size premium and show that the size effect is not a statistical fluke. They use a Markov regime-switching model to filter regime-dependent risk-adjusted size spreads, in order to capture the reward for bearing the risk inherent to the size effect. The corresponding paper was published in Finance in 2022 (see Publications) and further developments are work in progress.
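A bare-bones sketch of the filtering step in such regime-switching models is the forward (Hamilton) filter; the two-state Gaussian setup and all parameters below are hypothetical, since the cited paper filters risk-adjusted size spreads with its own specification:

```python
import numpy as np

def hamilton_filter(y, mu, sigma, P, pi0):
    """Forward (Hamilton) filter for a two-state Gaussian regime-switching
    model: returns filtered probabilities Pr(S_t = k | y_1, ..., y_t).
    mu, sigma: per-regime means and volatilities; P: transition matrix
    with P[j, k] = Pr(S_t = k | S_{t-1} = j); pi0: initial probabilities."""
    mu, sigma = np.asarray(mu, dtype=float), np.asarray(sigma, dtype=float)
    p = np.asarray(pi0, dtype=float)
    probs = np.empty((len(y), 2))
    for t, obs in enumerate(y):
        pred = P.T @ p                                          # one-step-ahead regime probabilities
        lik = np.exp(-0.5 * ((obs - mu) / sigma) ** 2) / sigma  # Gaussian densities (up to a constant)
        p = pred * lik
        p = p / p.sum()                                         # Bayes update and normalization
        probs[t] = p
    return probs
```

On a series that clearly switches between two levels, the filtered probabilities concentrate on the correct regime within each segment.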

Finally, Paul Ehling and Costas Xiouras, in their project “Asset Pricing with Endogenous beta”, funded in 2018, study the cross-section of expected returns in a new theoretical framework where betas are determined endogenously. Their theoretical analysis shows that stocks’ betas fluctuate significantly over time and are affected by both the state of the economy and the individual stocks’ states, i.e., their characteristics. This research was presented at the CFE conference in London in December 2021.


2. Risk & Crowding


Hector Chan and a co-author develop, in a paper published in the Journal of Portfolio Management in 2022, a model aimed at studying the relationship between crowding and liquidity shocks. One of the main results is that crowding is associated with a larger exposure of arbitrageurs to broad liquidity shocks. They confirm this link empirically by studying equity long-short strategies. They use short interest data both to identify liquidity shocks impacting sophisticated equity investors and to infer crowdedness for some of the well-known long–short equity factors. When liquidity shocks (such as the 2007 quant crisis or the more recent 2020 COVID-19–induced quant deleverage) occur, crowded strategies indeed tend to underperform.

Risk disaggregation and portfolio allocation

A change in the structure of a fund's client base affects the potential mismatch between the liquidity of its assets and liabilities. An asset/liability approach to liquidity management is therefore critical and requires a client behaviour model. Serge Darolles, Gaëlle Le Fol and Ran Sun are working on investors' behaviour and its consequences for funding liquidity risk (see working papers).
Marius Zoican is working with a co-author on a new project looking at institutional investor attention. They build a model in which analysts, competing for scarce investor attention to maximize volume for brokerage houses, end up clustering in a small subset of stocks. They find that this clustering explains 21.39% of the cross-sectional variation in analyst coverage. This research has been presented at the Asian Finance Association in China in July (see Activity report 2021, page 19).
Hugues Langlois, in the project “Forecasting Portfolio Weights”, funded in 2018, proposes a new methodology to compute dynamic mean-variance optimal portfolios. The originality of his approach is to directly forecast portfolio weights. This research was presented at the CFE conference in London in December 2021 and in a webinar launched in 2021.
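For context, the textbook alternative that such an approach improves upon is the plug-in mean-variance rule, where forecasts of the mean vector and covariance matrix are inverted into weights; the risk-aversion value below is a hypothetical parameter:

```python
import numpy as np

def mv_weights(mu, cov, risk_aversion=5.0):
    """Textbook one-period mean-variance weights, w = Sigma^{-1} mu / gamma.
    This is the plug-in benchmark that direct weight forecasting seeks to
    improve upon; gamma (risk_aversion) is an illustrative value."""
    mu = np.asarray(mu, dtype=float)
    cov = np.asarray(cov, dtype=float)
    return np.linalg.solve(cov, mu) / risk_aversion
```

The plug-in rule is notoriously sensitive to estimation error in `mu` and `cov`, which is one motivation for forecasting the weights directly.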

Contagion and funds flows

Serge Darolles, Gaëlle Le Fol and her former PhD student Beatrice Sagna work with another co-author on multivariate volume prediction methods applied to the circulation of liquidity within a portfolio. This research has been presented at several international conferences.

Fabrice Riva and a co-author investigate the impact of Exchange-Traded Funds (ETFs) on their constituent securities. They find that, after an ETF switches from synthetic to physical replication, constituent stocks experience greater commonality in both returns and liquidity. The effect on return commonality appears stronger for the least liquid stocks included in the ETF. They also present evidence that ETF arbitrage is the transmission mechanism of these comovements. Moreover, they show that the comovements do not appear excessive. This research has been accepted for presentation several times by well-established finance conferences (FMA 2022, Eurofidai-ESSEC 2022, 14th Annual Hedge Fund Research Conference).

Darolles and Roussellet (2023) study hedge fund liquidity management in the presence of liquidity risks on the asset and liability sides. They formulate a two-period model where a single fund always has access to a liquid asset and can invest in an illiquid asset that pays off only at the end of period two. Funding liquidity risk takes the form of a random outflow originating from clients in period one. They solve the fund's allocation problem and find its optimal allocation between liquid and illiquid assets. The liquidation probability and the portfolio composition are revealing about market liquidity and funding liquidity, respectively. Gates, as a device that limits the outflows experienced by the fund, help it reduce its liquidation risk and harvest liquidity premia.

Estimation risk for portfolios

In a paper published in the Journal of Econometrics in 2022, Jean-Michel Zakoïan and a co-author test for the existence of moments in the framework of GARCH processes. This is of particular interest as the existence of moments can be crucial for risk management, for instance when risk is measured through the expected shortfall (see Publications).
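For intuition on why moment existence is a non-trivial condition: for a GARCH(1,1) with Gaussian innovations, the unconditional moment of order 2m of the returns exists if and only if E[(α z² + β)^m] < 1. The Monte Carlo check below is a textbook illustration of that condition, not the statistical test developed in the cited paper:

```python
import numpy as np

def garch11_moment_exists(alpha, beta, order=4, n=200_000, seed=0):
    """Monte Carlo check of the classic moment condition for a GARCH(1,1)
    with Gaussian innovations z: the moment of order 2m of the returns
    exists iff E[(alpha * z**2 + beta)**m] < 1, with m = order / 2."""
    z = np.random.default_rng(seed).standard_normal(n)
    m = order // 2
    return bool(np.mean((alpha * z**2 + beta) ** m) < 1.0)
```

For instance, with α = 0.1 and β = 0.8 the fourth moment exists (the condition value is β² + 2αβ + 3α² = 0.83 < 1), while α = 0.3 and β = 0.65 violate it even though α + β < 1, so the process is covariance-stationary but has infinite kurtosis.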
Ophélie Couperier is also working on risk measures and on backtests of risk measures in two working papers, with different co-authors. Her research has been presented at several international conferences (See 2022 report).

Juan Imbet is currently working with three co-authors on robust option-implied measures of conditional volatility, skewness and kurtosis based on quantiles and expectiles inferred from weekly options on the S&P 500. They find that the option-implied robust indicators exhibit short-, medium- and long-term predictive ability for the U.S. equity risk premium, market volatility, skewness and kurtosis, both in- and out-of-sample, and outperform equivalent indicators inferred from historical returns. The paper was also presented at QuantMinds International 2022, a practitioners' conference.

While standard errors measure the uncertainty in estimates of population parameters, variation in the evidence-generating process (EGP) across researchers adds another layer of uncertainty: non-standard errors (NSEs). Lambert and Zoican, with many co-authors, study NSEs by letting 164 teams test the same hypotheses on the same data. They show that NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. They further find that participants underestimate this type of uncertainty. This research is forthcoming in the Journal of Finance.

Systemic risk and stress exercises

Several research projects have been conducted by Christian Gourieroux to detect the systemic risks present in a portfolio, define ratings for systemic risk, or construct scenario generators to measure the impact of systemic shocks. Gagliardini, Gourieroux, Rubin (2019) develop a systematic factor model for a joint analysis of the ranking of portfolio managers, based on a high-dimensional analysis of 900 stock returns.

Boloorforosh, Christoffersen, Fournier, Gourieroux (2019) consider the market beta exposures of stocks and allow for stochastic market betas with possible comovements. Such nonlinear dynamic factor models are usually difficult to estimate by maximum likelihood due to the high dimensionality. Gagliardini, Gourieroux (2019) introduce a moment method based on the Laplace transform to obtain consistent approximations in this big-data framework. This method is particularly useful when large panels of assets have to be considered, as in Brownlees, Darolles, Le Fol, Sagna (2022).

Several papers can be grouped around the expected shortfall and systemic risk themes. Couperier and Leymarie, for example, developed a new methodology to backtest expected shortfall via multi-quantile regression. El Azri proposes a framework to assess tail-risk connectedness across financial markets using a two-step procedure: expected shortfalls are first estimated using a quantile regression approach, and a directional left-tail risk spillover measure à la Diebold and Yilmaz is then developed to quantify spillovers between markets.
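As background for these themes, expected shortfall is the mean loss beyond a tail quantile (VaR). A plain historical estimator looks as follows; this is a simplified baseline, not the quantile-regression estimators of the papers above:

```python
import numpy as np

def var_es(returns, level=0.05):
    """Historical VaR and expected shortfall at a given tail level:
    VaR is the empirical left-tail quantile of returns, ES the mean
    return beyond it. A textbook baseline estimator."""
    r = np.asarray(returns, dtype=float)
    var = np.quantile(r, level)    # e.g. the 5% quantile of returns
    es = r[r <= var].mean()        # average return in the left tail
    return var, es
```

On standard normal returns, the 5% VaR converges to about -1.645 and the 5% ES to about -2.063, illustrating that ES is always at least as severe as VaR.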

Alternative Risk Premia

Given the sharp increase in the number of alternative risk premia discovered by academics and practitioners, several issues need to be addressed: factor construction methodologies, the consequences for portfolio diversification, and the persistence of the alternative risk premia.

Regarding the first two issues, Marie Lambert and some co-authors are working on construction rules for risk factors and the design of smart beta strategies. A proper methodology to stratify the stock universe into style buckets is key for the design of persistent risk factors, asset allocation and performance attribution.
The same team also works on the design of alternative risk premia capturing non-linear payoffs. The working paper on the gamma trading of hedge funds has also been presented at several conferences and seminars (see Fays et al. working papers).

Regarding the persistence of the alternative risk premia, Serge Darolles and Marie Lambert, with two other co-authors, are working on the economic cycle of alternative risk premia and the change in the business model from active to passive management for those investment strategies. Serge Darolles presented the paper at the AMF Scientific Committee in November 2021 (see Activity report 2021, page 19). Serge Darolles and Fabrice Riva created a course on this topic for M1-level students. This course presents the theoretical foundations and proposes numerous practical applications, with the idea of using the Python language to develop arbitrage strategies based on market anomalies.
On the same topic of alpha persistence, Serge Darolles, Gaëlle Le Fol and Gulten Mero used a regime switching approach to study the existence of risk premia in their paper published in 2022 in Finance (see publications).

Finally, Luc Dumontier is starting a PhD thesis on “The 5 W’s of Alpha Generation” under the direction of Gaëlle Le Fol. He gave his first presentation at the last CFE conference in Berlin, in December 2023.


3. Implementation challenges


Recent studies have documented that market impact decays slowly through time. Hector Chan, in a recently published paper in the Journal of Portfolio Management, studies the effect of such slow decay on trading strategies' capacity. To do so, he proposes a numerical methodology to estimate capacity. He shows that incorporating the slow decay of market impact leads to trading strategy capacity estimates that are significantly lower than those reported in previous capacity studies.
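The mechanics of slowly decaying impact can be sketched with a transient-impact ("propagator") model, where each trade's price effect fades as a power law of elapsed time. The kernel shape and the parameters `g0` and `decay` below are illustrative assumptions, not the estimates or the methodology of the cited paper:

```python
import numpy as np

def price_impact_path(trades, g0=0.1, decay=0.5):
    """Transient-impact ('propagator') sketch: a trade at time s moves the
    price at time t by g0 * trades[s] * (t - s + 1) ** (-decay), so past
    trades keep pushing the price, with slow power-law decay.
    g0 and decay are illustrative values."""
    n = len(trades)
    impact = np.zeros(n)
    for t in range(n):
        for s in range(t + 1):
            impact[t] += g0 * trades[s] * (t - s + 1) ** (-decay)
    return impact
```

Under persistent one-directional trading, a slower decay (smaller exponent) accumulates into a larger total price displacement, which is the channel through which slow decay lowers strategy capacity.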

Listed market liquidity

Looking at serial correlations, Serge Darolles, Gaëlle Le Fol and Ran Sun are working on hedge fund liquidity and managers' skills.
Marius Zoican and another researcher are also working on ETF liquidity. They find that identical ETFs can exploit different investor clienteles to charge different management fees for holding identical portfolios. Highly liquid ETFs can extract 0.47 bps in higher fees than their competitors for each 1 bp of narrower bid-ask spread. This research has been presented at numerous international conferences in 2020 and 2021. It was a Best Paper semifinalist (Microstructure) at the Financial Management Association 2021 meeting and is now at the second round of Revise and Resubmit at the Journal of Finance.
In their paper, Tamara Nefedova and her co-authors use transaction-level data from Ancerno to investigate implicit cost dynamics and estimate the transaction costs associated with trading asset-pricing anomalies. They find that these costs are considerably lower than documented by previous studies. On the same topic, Charles Chevalier and Serge Darolles used proprietary trading data to analyze the transaction costs associated with the implementation of systematic trend-following strategies. They show in “Futures Market Liquidity and the Trading Cost of Trend Following Strategies” that the decrease in the volatility of commodity markets implies an increase in the leverage needed to meet the funds' investment objectives. An increase in the total amount of transaction costs paid by the funds follows, with, as a consequence, a decrease in fund returns.
The paper by Hector Chan extends previous work on market impact by considering more flexible forms for the function linking order size and price impact. He shows that these more realistic specifications may lead to very different results and change the estimated capacity of the usual arbitrage strategies.

Algo and/or High frequency trading

The optimisation of VWAP (Volume Weighted Average Price) replication algorithms, the link between the speed of order placement and the arrival of information, liquidity trade-offs, and maximum trading capacity are areas of research in which the QMI is regularly investing.
Serge Darolles, Gaëlle Le Fol and Beatrice Sagna, with another co-author, are working on basket VWAP strategies. They first developed the volume forecasting methodology (univariate and multivariate) in earlier papers and now use this approach to filter, from realized volumes, the connections between stocks belonging to the same market.
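The basic building block of VWAP replication is a static schedule that splits the parent order across intraday bins in proportion to a (forecast) volume profile; the sketch below is the textbook baseline, not the team's algorithms, and the profile values are made up:

```python
import numpy as np

def vwap_schedule(parent_qty, volume_profile):
    """Static VWAP schedule: split a parent order across intraday bins
    in proportion to a historical (or forecast) volume profile."""
    w = np.asarray(volume_profile, dtype=float)
    w = w / w.sum()                        # normalize the profile to weights
    sched = np.floor(parent_qty * w).astype(int)
    sched[-1] += parent_qty - sched.sum()  # push the rounding residual to the last bin
    return sched
```

Better volume forecasts translate directly into schedules that track the day's realized VWAP more closely, which is why the volume forecasting work above matters for execution.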
The current research of Ophélie Couperier with Jean-Michel Zakoian and another co-author aims at introducing functional covariates that capture the influence of intraday price variations on volatility.

Marius Zoican and his co-authors find in their paper “Speed and learning in high-frequency auctions” that on discrete-time markets, faster trading enhances arbitrageur competition. In contrast to continuous-time markets, lower latencies can improve liquidity on batch auction markets. Marius Zoican, with another co-author, also proposes in another paper an experimental trading platform where participants face speed bumps and invest in low-latency trading technology. They find that asymmetric speed bumps reduce investment in low-latency technology by 20%, and a one standard deviation larger speed bump further reduces low-latency investment by 8.33%. These two papers are published in the Journal of Financial Markets (see Published papers).