-
Can a Website Bring Unemployment Down? Experimental Evidence from France
Aïcha Ben Dhia, Bruno Crépon, Esther Mbih, Louise Paul-Delvaux, Bertille Picard, and Vincent Pons
Keywords: Labor economics, Online platforms
NBER Working Paper 29914, April 2022. VoxEU column.
Abstract: We evaluate the impact of an online platform that gives job seekers tips to improve their search and recommends new occupations and locations to target, based on their personal data and labor market data. Our experiment used an encouragement design and was conducted in collaboration with the French public employment agency. It includes 212,277 individuals. We find modest effects on search methods: users of the platform adopt some of its tips and are more likely to use resources provided by public employment services. However, following individual trajectories for 18 months after the intervention, we do not observe any impact on time spent looking for a job, search scope (occupational or geographical), or self-reported well-being. Most importantly, we find no effect on any employment outcome, whether in the short or medium run. We conclude that the enthusiasm around the potential for job-search assistance platforms to help reduce unemployment should be toned down.
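To illustrate how an encouragement design of this kind is typically analyzed (a minimal sketch with simulated data, not the paper's code or estimates; all variable names and parameter values are invented), random encouragement serves as an instrument for platform use, and the Wald ratio of the intention-to-treat effect to the take-up difference recovers the local average treatment effect among compliers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Encouragement design: Z is random encouragement (e.g. an invitation),
# D is actual platform use, Y is an employment outcome. Z shifts D but,
# by assumption, affects Y only through D, so the Wald/IV ratio
# identifies the LATE.
z = rng.integers(0, 2, n)                      # random encouragement
compliers = rng.random(n) < 0.3                # 30% take up when encouraged
d = (z == 1) & compliers                       # platform use
true_late = 0.0                                # illustrative: no effect
y = 0.2 + true_late * d + rng.normal(0, 1, n)  # outcome

itt = y[z == 1].mean() - y[z == 0].mean()      # intention-to-treat effect
first_stage = d[z == 1].mean() - d[z == 0].mean()
late = itt / first_stage                       # Wald estimator
print(f"ITT: {itt:.3f}, first stage: {first_stage:.3f}, LATE: {late:.3f}")
```

With encouragement shifting take-up by about 30 percentage points and a true effect of zero, the estimated LATE is close to zero, mirroring the null result described in the abstract.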
-
Does Personalized Allocation Make Our Experimental Designs More Fair?
Bertille Picard
Keywords: Fairness, Contextual bandits, Randomized controlled trials, Machine learning
Abstract: Algorithms can optimize treatment allocation within an experimental design. They can progressively identify the most beneficial treatment for the subjects and thus maximize the experiment’s overall impact. However, these designs raise concerns for experimentalists and policymakers because they involve transferring decision-making to an algorithm. Are adaptive experiments inherently fairer, and thus a preferred choice over traditional randomized controlled trials? In this paper, I propose a comprehensive examination of fairness by considering multiple criteria that can influence researchers’ preference for one design over the other: the possibility of increasing the benefits of the experiment for the experimental subjects, the transparency of the decision rule, the absence of discrimination in treatment allocation, and the protection of individuals’ data. By summarizing and analyzing these distinct criteria through a utility model, I discuss the relative fairness of adaptive experiments and standard randomized controlled trials. Specifically, I show that these different designs align with extreme versions of the fairness utility model, reflecting the pursuit of distinct fairness objectives within experimental settings. I highlight intermediate solutions that can be pursued to reconcile and balance different fairness objectives in experimental designs.
-
Decomposing Inequalities with Machine Learning: Should we Change the Reference Outcome?
Emmanuel Flachaire, Bertille Picard
Keywords: Inequality, Oaxaca decomposition, Machine learning
Abstract: The Kitagawa-Oaxaca-Blinder decomposition is widely used in empirical studies to analyze changes in a distributional statistic between two groups or two periods. It separates an observed difference into an explained part, due to individuals’ observed characteristics, and an unexplained part. In this paper, we examine the assumptions required to identify the different components of the aggregate decomposition, focusing on two key hypotheses: the common support condition and the correct specification of a parametric linear model. This leads us to propose a decomposition method in a fully non-parametric framework.
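The basic mechanics of the aggregate decomposition can be sketched on simulated data (a toy two-fold Kitagawa-Oaxaca-Blinder example, not the paper's method or data; the groups, coefficients, and the choice of group B as the reference are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Two groups with different characteristics (x). The decomposition
# splits the mean outcome gap into an "explained" part (differences in
# x) and an "unexplained" part (differences in outcome structure).
x_a = rng.normal(1.0, 1.0, n)
x_b = rng.normal(0.5, 1.0, n)
y_a = 1.0 + 0.8 * x_a + rng.normal(0, 0.5, n)
y_b = 0.7 + 0.8 * x_b + rng.normal(0, 0.5, n)

def ols(x, y):
    """Least-squares fit of y on [1, x]; returns (intercept, slope)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

beta_b = ols(x_b, y_b)

gap = y_a.mean() - y_b.mean()
# Using group B's coefficients as the reference outcome structure:
explained = beta_b[1] * (x_a.mean() - x_b.mean())
unexplained = gap - explained
print(f"gap={gap:.3f} explained={explained:.3f} unexplained={unexplained:.3f}")
```

Here the explained part reflects the 0.5 difference in mean characteristics scaled by the reference slope, and the unexplained part absorbs the intercept difference; swapping the reference group (the question raised in the title) would generally change this split.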
-
An Adaptive Experiment to Boost Online Skill Signaling and Visibility
Morgane Hoffmann, Bertille Picard, Charly Marie, and Guillaume Bied
Keywords: Labor economics, Online platforms, Adaptive experiments
Abstract: Digital matching platforms promise to reduce frictions in the labor market by providing low-cost information on available positions and candidates. As such, they may form a welcome addition to the toolbox available to Public Employment Services to bridge labor supply and demand. However, their adoption comes with challenges: vulnerable populations, for instance, may face difficulties in using digital tools effectively. In this study, we evaluate the impact of an email communication campaign designed to encourage the use of an online matching platform maintained by the French Public Employment Service, Pôle emploi. We designed several email templates combining information, support, or motivational content to encourage jobseekers to engage with their profiles on the platform. To learn which emails are most effective, we implement an adaptive experiment (contextual bandit) that uses past jobseekers’ take-up responses and characteristics to determine future email allocation, gradually reducing the share of less promising templates. Additionally, we build an optimal personalized allocation strategy based on the collected data and test its effectiveness. Emails had a positive impact on the usage of the platform, as measured by a wide range of outcomes. However, attempts at learning a personalized emailing strategy did not significantly improve on a random allocation of email templates.
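The adaptive-allocation idea can be sketched in its simplest, non-contextual form (the paper uses a contextual bandit conditioning on jobseeker characteristics; the sketch below is a plain Thompson-sampling toy with invented take-up rates, not the study's algorithm or data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical take-up rates for three email templates, unknown to the
# experimenter. Thompson sampling maintains a Beta posterior per
# template and gradually shifts allocation toward better templates.
true_rates = np.array([0.05, 0.10, 0.15])
successes = np.ones_like(true_rates)   # Beta(1, 1) uniform priors
failures = np.ones_like(true_rates)
counts = np.zeros(len(true_rates), dtype=int)

for _ in range(20_000):
    # Draw one plausible take-up rate per template, send the argmax.
    draws = rng.beta(successes, failures)
    arm = int(np.argmax(draws))
    counts[arm] += 1
    if rng.random() < true_rates[arm]:   # jobseeker engages
        successes[arm] += 1
    else:
        failures[arm] += 1

print("sends per template:", counts)
```

Over the run, the allocation concentrates on the best-performing template, which is the exploration-exploitation behavior the abstract describes; the contextual version additionally lets the preferred template vary with jobseeker covariates.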