Paper: Identifying Contagion
Working paper: The Booms and Busts of Beta Arbitrage
Working paper: High-Frequency Trading around Large Institutional Orders, forthcoming in Journal of Finance
Abstract: The focus of this study is to develop theories that can underpin information mining on the web to produce reliable information, and to assess, using entropy-based techniques, the impact of existing methods on the behaviour of market prices.
Information theory will provide the framework for the analysis. The main metrics will be built on Shannon entropy and cross entropy. The data for the study will be drawn from Thomson Reuters market data provided by The Securities Industry Research Centre of the Asia Pacific (SIRCA).
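To make the proposed metrics concrete, the following is a minimal sketch of Shannon entropy and cross entropy computed over a discretized symbol stream (e.g. up/down/flat price moves). The discretization scheme and the add-one smoothing are illustrative assumptions, not the project's actual construction:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of the empirical distribution of a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def cross_entropy(p_symbols, q_symbols, alphabet):
    """Cross entropy H(p, q) in bits between two symbol streams.
    Add-one smoothing (an assumption here) keeps q strictly positive."""
    n_p, n_q = len(p_symbols), len(q_symbols)
    p, q = Counter(p_symbols), Counter(q_symbols)
    k = len(alphabet)
    h = 0.0
    for s in alphabet:
        p_s = p[s] / n_p
        q_s = (q[s] + 1) / (n_q + k)  # smoothed estimate of q(s)
        if p_s > 0:
            h -= p_s * math.log2(q_s)
    return h

# Example: price moves discretized as up ("u"), down ("d"), flat ("f")
moves = ["u", "d", "u", "u", "f", "d", "u", "f"]
print(round(shannon_entropy(moves), 3))
```

A low entropy would indicate a highly predictable stream; cross entropy against a reference period would measure how much the distribution of moves has shifted.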
Abstract: The object of this project is to study default dependence and contagion amongst non-agency securitized mortgages in the US over the period 1998-2011. We will use a Cox proportional hazard model in a competing risk framework for default (and prepayment) and a copula model for the dependence amongst individual hazards. Dependence between defaults can occur because of geographical proximity, common economic conditions (of either a local or economy-wide nature), the business cycle, interest rates, etc. We want to quantify the amount of this default dependence and investigate the reasons why such dependence occurs.
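To illustrate the copula component of such a model, here is a minimal sketch that simulates dependent default times via a one-factor Gaussian copula. The constant exponential hazard is a placeholder assumption; the project would instead use hazards fitted by the Cox model:

```python
import math
import random

def gaussian_copula_default_times(n, rho, hazard, seed=0):
    """Simulate n default times whose dependence comes from a one-factor
    Gaussian copula with correlation rho; marginals are exponential with
    rate `hazard` (an illustrative stand-in for fitted Cox hazards)."""
    rng = random.Random(seed)
    z = rng.gauss(0, 1)  # common factor (e.g. economy-wide conditions)
    times = []
    for _ in range(n):
        e = rng.gauss(0, 1)  # idiosyncratic factor
        x = math.sqrt(rho) * z + math.sqrt(1 - rho) * e
        u = 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
        u = min(max(u, 1e-12), 1 - 1e-12)  # guard against log(0)
        times.append(-math.log(1 - u) / hazard)  # inverse exponential CDF
    return times
```

A higher `rho` clusters defaults in time, which is exactly the kind of dependence (common economic conditions) the project aims to quantify.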
Abstract: Many institutional traders split large orders into smaller orders sent over some time period. This schedule may be optimized to reduce price impact. I have developed performance metrics to assess how effective funds are at (i) executing these smaller orders, (ii) deciding when to wait for orders to be filled (i.e. market timing), and (iii) scheduling the smaller orders. The performance metrics have sound theoretical backing and let us separate trading-related performance from noise. I propose to use data on orders and trades for a selection of investment funds to characterize these skills. For the initial work, I will study: (i) the relative magnitudes of these skills, (ii) how these skills vary across funds, (iii) what fraction of firms seem to possess superior trading-related skills, (iv) how firms’ skills change over time due to learning, and (v) the savings in transactions costs which accrue to investors. For possible further work, I suspect this data would help answer further questions including: (vi) how firms’ trading-related performance changes with macroeconomic factors, (vii) whether changes in trading-related skills result in fund inflows, (viii) the value of these inflows to the funds.
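As a concrete example of the kind of execution-cost measurement involved, the following computes implementation shortfall for a split parent order. This is a standard transaction-cost benchmark, not the paper's own decomposition into execution, timing and scheduling skill; the function and argument names are hypothetical:

```python
def implementation_shortfall(arrival_price, fills, side):
    """Implementation shortfall, in price units per share, of a parent order
    split into child fills.

    arrival_price: mid price when the parent order was submitted
    fills: list of (price, quantity) for the executed child orders
    side: +1 for a buy order, -1 for a sell order
    """
    qty = sum(q for _, q in fills)
    avg_fill = sum(p * q for p, q in fills) / qty
    # Positive values mean the trader paid more (buy) or received less (sell)
    # than the arrival price, i.e. a transaction cost.
    return side * (avg_fill - arrival_price)

# Example: a buy order filled in two child orders after the price drifted up
print(implementation_shortfall(100.0, [(101.0, 50), (102.0, 50)], +1))
```

Separating how much of this shortfall is noise versus skill (in execution, timing, or scheduling) is precisely the decomposition the proposed performance metrics address.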
Abstract: ODERIM, "Outlier Detection/Estimation and Mitigation for RIsk Management and control, based on advanced SSP methods, with a focus on extreme situations". The long-lasting crisis since 2008 has been corrupting financial data with an increasing number of extreme events (i.e. outliers). These outliers need to be detected, processed and, if possible, anticipated in order to maintain acceptable performance while limiting specific risks, whether for long-term management styles or for high-frequency trading. The objective of the project is to improve and optimize statistical filtering techniques (such as Lq-regularized Kalman filters, MCMC algorithms and particle filtering) to detect, estimate and mitigate the outliers that occur in financial data, so as to prevent idiosyncratic (exogenous) extreme events from contaminating the systematic exposures.
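To illustrate the filtering-with-outlier-gating idea in its simplest form, here is a scalar random-walk Kalman filter that flags and skips observations whose standardized innovation exceeds a threshold. This innovation gate is a textbook device used as a stand-in for the Lq-regularized and particle-filter methods named above; all parameter values are illustrative:

```python
import math

def kalman_with_outlier_gate(ys, q=1e-3, r=1e-2, gate=3.0):
    """Scalar random-walk Kalman filter with an innovation gate.

    ys:   observed series
    q:    process noise variance, r: measurement noise variance (assumed known)
    gate: an observation is flagged as an outlier when its standardized
          innovation exceeds `gate` standard deviations, and the update
          is skipped so the state estimate is not contaminated.
    Returns (filtered states, indices flagged as outliers).
    """
    x, p = ys[0], 1.0
    states, outliers = [], []
    for t, y in enumerate(ys):
        p = p + q              # predict step (random-walk state)
        s = p + r              # innovation variance
        nu = y - x             # innovation
        if abs(nu) / math.sqrt(s) > gate:
            outliers.append(t)  # measurement deemed an outlier: no update
        else:
            k = p / s           # Kalman gain
            x = x + k * nu
            p = (1 - k) * p
        states.append(x)
    return states, outliers
```

On a quiet series with one large spike, the spike is flagged and the filtered state stays near the underlying level instead of being pulled toward the outlier.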