Research
Publications
- “Beliefs, Information Sharing, and Mental Health Care Use Among University Students”
(with Ida Grigoryeva, Bruno Calderon, Roberto González and Alejandro Guardiola Ramires)
Forthcoming at Journal of Development Economics
Abstract: This paper investigates the role of beliefs and stigma in shaping students’ use of professional mental health services at a large private university in Mexico, where supply-side barriers are minimal and services are readily accessible. In an online experiment with 680 students, we estimate a large treatment gap: nearly 50% of students in distress do not receive professional mental health support, despite high awareness and perceived effectiveness of the available services. We document stigmatized beliefs and misconceptions that correlate with the treatment gap. For example, three-quarters of students incorrectly believe that those in distress perform worse academically, and many underestimate how common therapy use is among their peers. To correct these inaccurate beliefs, we implement an information intervention and find that it increases students’ willingness to share on-campus mental health resources with peers and to recommend these resources when advising a friend in distress. However, the intervention also lowers students’ willingness to pay for external services, suggesting a substitution effect away from private therapy toward free on-campus resources.
Presented at: Advances with Field Experiments Conference (UChicago, 2025), NHH Field Experiments Conference (Bergen, 2024)*, Behavioral & Experimental Economics Student Conference (Caltech, 2023)
Working Papers
- “Mental Models, Social Learning and Statistical Discrimination: A Laboratory Study”
Abstract: Behavioral biases in decision-making are widespread and often persist even in the presence of feedback that is diagnostic of optimal behavior. Using a laboratory experiment, I examine whether exposure to others’ decisions can correct initial misconceptions and facilitate learning in a worker hiring task where deviations from the theoretical benchmark arise from neglecting an informative signal. I show that exposure to optimal behavior substantially improves decision quality, while exposure to suboptimal choices has a weak negative effect. The analysis reveals substantial heterogeneity in learning beyond exposure: roughly one-third of subjects mechanically replicate observed choices and revert to suboptimal behavior once exposure ends, another third maintain improved performance only when the task environment remains unchanged, and the remaining third exhibit deeper learning that transfers to a related task with altered primitives. These findings highlight the dual role of social learning: while it can enhance decision-making, it can also generate imitation that fails to generalize beyond the observed context.
Presented at: ESA North American Meeting (Columbus, 2024), Behavioral & Experimental Economics Student Conference (Caltech, 2024), Los Angeles Experiments Conference (poster; Caltech, 2024)
- “Mass Reproducibility and Replicability: A New Hope”
(with Abel Brodeur, Derek Mikola, Nikolai Cook, et al.)
Revise and Resubmit at Nature
Abstract: This study advances our understanding of research reliability by reproducing and replicating claims from 110 papers in leading economics and political science journals. The analysis involves computational reproducibility checks and robustness assessments, and it reveals several patterns. First, we uncover a high rate of fully computationally reproducible results (over 85%). Second, excluding minor issues such as missing packages or broken file paths, we uncover coding errors in about 25% of studies, with some studies containing multiple errors. Third, we test the robustness of the results to 5,511 re-analyses and find a robustness reproducibility of about 70%. Robustness reproducibility rates are higher for re-analyses that introduce new data and lower for re-analyses that change the sample or the definition of the dependent variable. Fourth, 52% of re-analysis effect-size estimates are smaller than the original published estimates, and the average statistical significance of a re-analysis is 77% of the original. Lastly, we rely on six teams of researchers working independently to answer eight additional research questions on the determinants of robustness reproducibility. Most teams find a negative relationship between replicators’ experience and reproducibility, and no relationship between reproducibility and the provision of intermediate or even raw data combined with the necessary cleaning code.
Work in Progress
- “Learning to Ignore Irrelevant Contingencies: An Experiment”
(with Jeongbin Kim and Emanuel Vespa)
Abstract: We study how individuals learn to make optimal decisions when doing so requires conditioning on payoff-relevant contingencies, as prescribed by theory. Using a laboratory experiment, we contrast learning across two environments: one in which irrelevant contingencies are excluded by design, and another in which they occur with positive probability but never affect outcomes. At baseline, four out of five subjects make a suboptimal choice. With experience and feedback, 71% of participants in the zero-probability environment switch to the optimal choice, compared to 49% in the setting where such contingencies remain present, indicating that engagement with payoff-irrelevant situations hinders learning. Participants in the zero-probability treatment are also more likely to report correct beliefs about the task environment and, conditional on holding accurate beliefs, are 15 percentage points more likely to make an optimal choice. These findings suggest that, in environments that require reasoning through contingencies, not engaging with outcome-irrelevant events can significantly ease learning and promote optimal behavior.
Presented at: ESA North American Meeting (Tucson, 2025), Behavioral & Experimental Economics Student Conference (UC Santa Barbara, 2025)
* – presentation by co-author