The Quantitative Marketing and Structural Econometrics Workshop was recently co-hosted by Northwestern University and the Olin Business School at Washington University in St. Louis, MO. Here's a quick round-up of 18+ hours of empirical grounding, for the benefit of researchers and particularly grad students:
The three-day workshop offered a select group of PhD students from Economics, Marketing, and related fields exposure to cutting-edge quantitative research methods in causal reduced-form research, structural econometrics, and machine learning. These three key themes were spread over 12 sessions of 1.5 hours each, led by accomplished researchers including organizers Drs. Brett Gordon and Raphael Thomadsen, Dr. Tat Chan, Dr. Peter Rossi, Dr. Avi Goldfarb, Dr. Stephen Ryan, and Dr. Paul Ellickson. This post offers a summary of the key topics discussed in these three areas and the benefits each set of sessions offered.
I. Causal Inference and Identification
Summary: Sessions focused on Causal Inference and Identification included:
(1) Causal Effects, Experiments, and Identification, and
(2) Finding Exogenous Variation in Observational Data.
The first session focused on conducting quasi-experimental research using methods from economics. It offered an overview of econometric techniques such as difference-in-differences, regression discontinuity, and instrumental-variable approaches to estimating causal effects. The session's unique focus was on what makes a valid and well-written quasi-experimental paper. A highlight was an overview of the latest developments in these areas, including the use of machine learning to estimate heterogeneous treatment effects and to uncover the mechanisms underlying causal effects. The second session focused on finding exogenous variation in observational data, including the use of instruments and fixed effects, as well as the often-overlooked dangers of both approaches.
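To make the difference-in-differences idea concrete, here is a minimal sketch on simulated panel data. Everything here (the data-generating process, the effect size, the group sizes) is invented for illustration and is not from the workshop; it simply shows how differencing out group and time effects recovers the treatment effect.

```python
# Minimal difference-in-differences sketch on simulated data.
# All numbers below are illustrative, not from the workshop.
import random

random.seed(0)

TRUE_EFFECT = 2.0  # treatment effect we hope to recover

# Simulate outcomes for treated/control groups before/after treatment.
# Groups differ in level, and both share a common time trend -- exactly
# the confounding that DiD is designed to difference out.
def simulate(treated, post, n=500):
    group_effect = 3.0 if treated else 0.0
    time_effect = 1.5 if post else 0.0
    effect = TRUE_EFFECT if (treated and post) else 0.0
    return [group_effect + time_effect + effect + random.gauss(0, 1)
            for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

y = {(g, t): simulate(g, t) for g in (True, False) for t in (True, False)}

# DiD estimate:
# (treated after - treated before) - (control after - control before)
did = ((mean(y[(True, True)]) - mean(y[(True, False)]))
       - (mean(y[(False, True)]) - mean(y[(False, False)])))

print(round(did, 2))  # should be close to TRUE_EFFECT
```

In practice one would run this as a regression with an interaction term (to get standard errors and add controls), but the double difference of cell means above is numerically the same estimator in the simple 2x2 case.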
II. Structural Econometrics
Summary: Sessions focused on Structural Econometrics included:
(1) Aggregate Demand Models I & II, and
(2) Single-Agent Dynamics I & II, and
(3) Empirical Games.
These talks centered on estimating demand and supply to recover the structural primitives of theoretically grounded models. The discussions ranged from classic papers, models, and assumptions to current cutting-edge developments and methods.
For those who are serious about empirical IO research, these sessions would generate enough homework to go digging, do the groundwork, and learn to code classic BLP models as well as dynamic models based on value- and policy-function iteration.
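As a starting point for that homework, here is a toy value-function-iteration sketch for a single-agent dynamic problem, in the spirit of the machine/engine-replacement setups covered in such courses. The payoffs, discount factor, and state space below are made up for illustration, not taken from any session.

```python
# Value-function iteration for a toy single-agent dynamic model:
# each period the agent keeps an aging machine (paying maintenance)
# or replaces it (paying a fixed cost, resetting its age).
# All parameters are invented for illustration.

BETA = 0.95          # discount factor
MAX_AGE = 10         # oldest machine age we track
REPLACE_COST = 5.0   # one-time cost of replacing the machine

def maintenance_cost(age):
    return 0.5 * age  # maintenance gets costlier as the machine ages

def value_iteration(tol=1e-8):
    V = [0.0] * (MAX_AGE + 1)  # V[a] = value of starting a period at age a
    while True:
        new_V = []
        for age in range(MAX_AGE + 1):
            # keep: pay maintenance now, machine is one period older next time
            keep = -maintenance_cost(age) + BETA * V[min(age + 1, MAX_AGE)]
            # replace: pay the fixed cost, run a new machine, age 1 next time
            replace = -REPLACE_COST - maintenance_cost(0) + BETA * V[1]
            new_V.append(max(keep, replace))
        if max(abs(a - b) for a, b in zip(new_V, V)) < tol:
            return new_V
        V = new_V

V = value_iteration()

# Read the optimal policy off the converged value function.
policy = ["replace" if (-REPLACE_COST - maintenance_cost(0) + BETA * V[1])
          > (-maintenance_cost(a) + BETA * V[min(a + 1, MAX_AGE)])
          else "keep" for a in range(MAX_AGE + 1)]
print(policy)
```

The solution has the expected cutoff structure: keep young machines, replace old ones. Real applications (e.g., Rust-style dynamic discrete choice) add stochastic states and estimate the primitives from observed choices, but the inner fixed-point loop looks much like this.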
One of my research ideas, on backward compatibility of video games, is based on structural methods. I have also recently taken an empirical IO course in Economics with Dr. Fernando Luco at Texas A&M University. Hence, this topic was a useful refresher for me, particularly in reviewing and gaining in-depth knowledge of those methods.
III. Machine Learning
Summary: Sessions focused on Machine Learning included:
(1) Machine Learning to Estimate Demand, and
(2) What Can Machine Learning Teach Us?
These sessions offered a broad philosophical overview of how machine-learning methods differ from econometric and traditional economics-based methods. More importantly, they suggested linkages and points of integration between the various methods. Finally, they pointed the way forward for current research in marketing and economics to benefit from machine-learning methods, and outlined specific ways grad students can grow in these areas.
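One concrete flavor of that difference is regularization: where classical OLS fits every covariate it is given, ML-style estimators shrink noisy coefficients toward zero. The sketch below contrasts the two on simulated demand data; the data-generating process and all parameters are invented for illustration and do not come from the workshop slides.

```python
# Toy sketch: ridge regression (an ML-style regularized fit) vs plain
# OLS on simulated demand data with many irrelevant features.
# Everything here is illustrative.
import random

random.seed(1)

n, k = 60, 20
true_beta = [-1.5] + [0.0] * (k - 1)  # only price (column 0) matters
X = [[random.gauss(0, 1) for _ in range(k)] for _ in range(n)]
y = [sum(b * x for b, x in zip(true_beta, row)) + random.gauss(0, 0.5)
     for row in X]

def fit(lam, steps=5000, lr=0.01):
    """Gradient descent on the ridge objective ||y - Xb||^2/n + lam*||b||^2."""
    beta = [0.0] * k
    for _ in range(steps):
        resid = [sum(b * x for b, x in zip(beta, row)) - yi
                 for row, yi in zip(X, y)]
        grad = [2 * sum(r * row[j] for r, row in zip(resid, X)) / n
                + 2 * lam * beta[j] for j in range(k)]
        beta = [b - lr * g for b, g in zip(beta, grad)]
    return beta

ols = fit(lam=0.0)    # unregularized least squares
ridge = fit(lam=1.0)  # shrinks noisy coefficients toward zero

# The penalty shrinks the 19 irrelevant coefficients; the real price
# coefficient survives (smaller in magnitude, same sign).
print(round(sum(abs(b) for b in ols[1:]), 3),
      round(sum(abs(b) for b in ridge[1:]), 3))
```

The trade-off shown here (some bias on the true coefficient in exchange for much less noise on the irrelevant ones) is the basic tension the sessions explored when asking how ML prediction tools can be combined with econometric identification.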
Benefits: Two key benefits from these sessions were:
(1) Review of fundamental machine learning concepts, and
(2) A sense of the field's openness to machine-learning methods and specific ways to contribute.
All in all, it was an intense three-day refresher for some concepts, a fresh foundation for others, and a great first visit to the spectacular WashU campus!