
Experimental Design in Two-Sided Platforms: An Analysis of Bias

Author

Listed:
  • Ramesh Johari
  • Hannah Li
  • Inessa Liskovich
  • Gabriel Weintraub

Abstract

We develop an analytical framework to study experimental design in two-sided marketplaces. Many of these experiments exhibit interference, where an intervention applied to one market participant influences the behavior of another participant. This interference leads to biased estimates of the treatment effect of the intervention. We develop a stochastic market model and associated mean field limit to capture dynamics in such experiments, and use our model to investigate how the performance of different designs and estimators is affected by marketplace interference effects. Platforms typically use two common experimental designs: demand-side ("customer") randomization (CR) and supply-side ("listing") randomization (LR), along with their associated estimators. We show that good experimental design depends on market balance: in highly demand-constrained markets, CR is unbiased, while LR is biased; conversely, in highly supply-constrained markets, LR is unbiased, while CR is biased. We also introduce and study a novel experimental design based on two-sided randomization (TSR) where both customers and listings are randomized to treatment and control. We show that appropriate choices of TSR designs can be unbiased in both extremes of market balance, while yielding relatively low bias in intermediate regimes of market balance.
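
The following is a minimal simulation sketch intended only to make the bias mechanism described above concrete. It is not the authors' stochastic market or mean field model: the choice mechanics, the parameter values (LIFT, OUTSIDE, K, the market sizes), and the helper names (run_market, diff_in_means) are illustrative assumptions. The sketch compares the naive difference-in-means estimators under customer randomization (CR) and listing randomization (LR) against the global treatment effect (GTE), computed by pairing an all-treated market with an all-control market, in one demand-constrained and one supply-constrained setting; the two-sided randomization (TSR) designs and their estimators are not reproduced here.

# Stylized two-sided marketplace simulation (illustrative assumptions, not the
# paper's model). Customers compete for listings because a booked listing
# becomes unavailable, and listings compete for customers within each
# customer's consideration set; both channels create interference.
import random

LIFT = 2.0      # assumed: treatment doubles the attractiveness of an interaction
OUTSIDE = 20.0  # assumed: weight of the "book nothing" outside option
K = 5           # assumed: number of listings each customer considers


def run_market(n_listings, n_customers, design, treat_frac=0.5, seed=0):
    """Simulate one market period.

    design: 'CR' customer randomization, 'LR' listing randomization,
            'GT' global treatment, 'GC' global control.
    Each customer samples up to K available listings and books at most one of
    them via a logit-style choice; booked listings leave the market.
    """
    rng = random.Random(seed)
    listing_treated = [design == 'GT' or (design == 'LR' and rng.random() < treat_frac)
                       for _ in range(n_listings)]
    available = set(range(n_listings))
    listing_booked = [False] * n_listings
    cust_treated, cust_booked = [], []

    for _ in range(n_customers):
        treated_cust = design == 'GT' or (design == 'CR' and rng.random() < treat_frac)
        cust_treated.append(treated_cust)
        consider = rng.sample(tuple(available), min(K, len(available)))
        weights = []
        for j in consider:
            if design == 'CR':
                inter = treated_cust          # treatment rides on the customer
            elif design == 'LR':
                inter = listing_treated[j]    # treatment rides on the listing
            else:
                inter = (design == 'GT')      # global treatment / global control
            weights.append(LIFT if inter else 1.0)
        # Pick one considered listing or the outside option, proportionally to weight.
        r = rng.random() * (OUTSIDE + sum(weights))
        chosen, acc = None, 0.0
        for j, w in zip(consider, weights):
            acc += w
            if r < acc:
                chosen = j
                break
        cust_booked.append(chosen is not None)
        if chosen is not None:
            listing_booked[chosen] = True
            available.discard(chosen)
    return cust_treated, cust_booked, listing_treated, listing_booked


def mean(xs):
    return sum(xs) / len(xs) if xs else 0.0


def diff_in_means(treated, outcome):
    # Naive estimator: treated-minus-control difference in booking rates.
    return (mean([y for t, y in zip(treated, outcome) if t]) -
            mean([y for t, y in zip(treated, outcome) if not t]))


if __name__ == '__main__':
    reps = 500
    for label, n_l, n_c in [('demand-constrained (listings plentiful)', 300, 60),
                            ('supply-constrained (listings scarce)', 60, 300)]:
        cr = lr = gte_cust = gte_list = 0.0
        for s in range(reps):
            ct, cb, _, _ = run_market(n_l, n_c, 'CR', seed=s)
            _, _, lt, lb = run_market(n_l, n_c, 'LR', seed=s)
            _, cb_t, _, lb_t = run_market(n_l, n_c, 'GT', seed=s)  # all treated
            _, cb_c, _, lb_c = run_market(n_l, n_c, 'GC', seed=s)  # all control
            cr += diff_in_means(ct, cb) / reps
            lr += diff_in_means(lt, lb) / reps
            gte_cust += (mean(cb_t) - mean(cb_c)) / reps
            gte_list += (mean(lb_t) - mean(lb_c)) / reps
        print(label)
        print(f'  per-customer GTE {gte_cust:+.3f} | naive CR estimate {cr:+.3f}')
        print(f'  per-listing  GTE {gte_list:+.3f} | naive LR estimate {lr:+.3f}')

Under these assumptions the output directionally matches the abstract: when listings are plentiful, the naive CR estimate tracks the per-customer GTE while the naive LR estimate overstates the per-listing GTE (treated listings draw bookings away from control listings in the same consideration set); when listings are scarce, the naive CR estimate overstates the per-customer GTE (treated customers book listings that control customers would otherwise have booked).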

Suggested Citation

  • Ramesh Johari & Hannah Li & Inessa Liskovich & Gabriel Weintraub, 2020. "Experimental Design in Two-Sided Platforms: An Analysis of Bias," Papers 2002.05670, arXiv.org, revised Sep 2021.
  • Handle: RePEc:arx:papers:2002.05670

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2002.05670
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Charles F. Manski, 2013. "Identification of treatment response with social interactions," Econometrics Journal, Royal Economic Society, vol. 16(1), pages 1-23, February.
    2. Susan Athey & Dean Eckles & Guido W. Imbens, 2018. "Exact p-Values for Network Interference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(521), pages 230-240, January.
    3. G W Basse & A Feller & P Toulis, 2019. "Randomization tests of causal effects under interference," Biometrika, Biometrika Trust, vol. 106(2), pages 487-494.
    4. Imbens, Guido W. & Rubin, Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Hannah Li & Geng Zhao & Ramesh Johari & Gabriel Y. Weintraub, 2021. "Interference, Bias, and Variance in Two-Sided Marketplace Experimentation: Guidance for Platforms," Papers 2104.12222, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Julius Owusu, 2023. "Randomization Inference of Heterogeneous Treatment Effects under Network Interference," Papers 2308.00202, arXiv.org, revised Jan 2024.
    2. Stefan Wager & Kuang Xu, 2021. "Experimenting in Equilibrium," Management Science, INFORMS, vol. 67(11), pages 6694-6715, November.
    3. Hannah Li & Geng Zhao & Ramesh Johari & Gabriel Y. Weintraub, 2021. "Interference, Bias, and Variance in Two-Sided Marketplace Experimentation: Guidance for Platforms," Papers 2104.12222, arXiv.org.
    4. Tadao Hoshino & Takahide Yanagi, 2021. "Causal Inference with Noncompliance and Unknown Interference," Papers 2108.07455, arXiv.org, revised Oct 2023.
    5. Stefan Wager & Kuang Xu, 2019. "Experimenting in Equilibrium," Papers 1903.02124, arXiv.org, revised Jun 2020.
    6. Ramesh Johari & Hannah Li & Inessa Liskovich & Gabriel Y. Weintraub, 2022. "Experimental Design in Two-Sided Platforms: An Analysis of Bias," Management Science, INFORMS, vol. 68(10), pages 7069-7089, October.
    7. Iavor Bojinov & David Simchi-Levi & Jinglong Zhao, 2023. "Design and Analysis of Switchback Experiments," Management Science, INFORMS, vol. 69(7), pages 3759-3777, July.
    8. Chabé-Ferret, Sylvain & Reynaud, Arnaud & Tène, Eva, 2021. "Water Quality, Policy Diffusion Effects and Farmers’ Behavior," TSE Working Papers 21-1229, Toulouse School of Economics (TSE).
    9. Ariel Boyarsky & Hongseok Namkoong & Jean Pouget-Abadie, 2023. "Modeling Interference Using Experiment Roll-out," Papers 2305.10728, arXiv.org, revised Aug 2023.
    10. Tadao Hoshino & Takahide Yanagi, 2023. "Randomization Test for the Specification of Interference Structure," Papers 2301.05580, arXiv.org, revised Dec 2023.
    11. Vivek F. Farias & Andrew A. Li & Tianyi Peng & Andrew Zheng, 2022. "Markovian Interference in Experiments," Papers 2206.02371, arXiv.org, revised Jun 2022.
    12. Kitagawa, Toru & Wang, Guanyi, 2023. "Who should get vaccinated? Individualized allocation of vaccines over SIR network," Journal of Econometrics, Elsevier, vol. 232(1), pages 109-131.
    13. Davide Viviano, 2020. "Experimental Design under Network Interference," Papers 2003.08421, arXiv.org, revised Jul 2022.
    14. Davide Viviano, 2019. "Policy Targeting under Network Interference," Papers 1906.10258, arXiv.org, revised Apr 2024.
    15. Michael Pollmann, 2020. "Causal Inference for Spatial Treatments," Papers 2011.00373, arXiv.org, revised Jan 2023.
    16. Supriya Tiwari & Pallavi Basu, 2024. "Quasi-randomization tests for network interference," Papers 2403.16673, arXiv.org.
    17. Yann Bramoullé & Habiba Djebbari & Bernard Fortin, 2020. "Peer Effects in Networks: A Survey," Annual Review of Economics, Annual Reviews, vol. 12(1), pages 603-629, August.
    18. Hao, Shiming, 2021. "True structure change, spurious treatment effect? A novel approach to disentangle treatment effects from structure changes," MPRA Paper 108679, University Library of Munich, Germany.
    19. Susan Athey & Dean Eckles & Guido W. Imbens, 2018. "Exact p-Values for Network Interference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(521), pages 230-240, January.
    20. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2002.05670. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.