
Interference, Bias, and Variance in Two-Sided Marketplace Experimentation: Guidance for Platforms

Authors

  • Hannah Li
  • Geng Zhao
  • Ramesh Johari
  • Gabriel Y. Weintraub

Abstract

Two-sided marketplace platforms often run experiments to test the effect of an intervention before launching it platform-wide. A typical approach is to randomize individuals into a treatment group, which receives the intervention, and a control group, which does not. The platform then compares performance in the two groups to estimate the effect the intervention would have if launched to everyone. We focus on two common experiment types, in which the platform randomizes individuals either on the supply side or on the demand side. The resulting estimates of the treatment effect are typically biased: because individuals in the market compete with one another, individuals in the treatment group affect those in the control group and vice versa, creating interference. We develop a simple, tractable market model to study bias and variance in these experiments with interference. We focus on two choices available to the platform: (1) Which side of the platform should it randomize on (supply or demand)? (2) What proportion of individuals should be allocated to treatment? We find that both choices affect the bias and variance of the resulting estimators, but in different ways. The bias-optimal choice of experiment type depends on the relative amounts of supply and demand in the market, and we discuss how a platform can use market data to select the experiment type. Importantly, we find that in many circumstances, choosing the bias-optimal experiment type has little effect on variance. On the other hand, the choice of treatment proportion can induce a bias-variance tradeoff, where the bias-minimizing proportion increases variance. We discuss how a platform can navigate this tradeoff and best choose the treatment proportion, using a combination of modeling and contextual knowledge about the market, the risk of the intervention, and reasonable effect sizes of the intervention.
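
As a concrete illustration of the interference mechanism described in the abstract, the sketch below simulates a deliberately simplified two-sided market in Python. It is not the paper's market model: the sequential customer arrivals, unit-capacity listings, booking probabilities, and all parameter values are assumptions chosen only to make the mechanism visible. It compares the naive difference-in-means estimate from a customer-randomized (demand-side) experiment against the true global treatment effect, across several treatment proportions.

```python
# Toy illustration (not the paper's model): customers arrive in random order
# and try to book one of a fixed pool of identical, unit-capacity listings;
# treatment raises a customer's booking probability. Scarce inventory means
# treated customers crowd out control customers, biasing the naive estimate.
import numpy as np

rng = np.random.default_rng(0)


def run_market(n_listings, treated_mask, p_book=0.5, lift=0.3):
    """Simulate one market run; return per-customer bookings (0/1)."""
    n_customers = len(treated_mask)
    available = n_listings                      # identical listings, capacity 1 each
    bookings = np.zeros(n_customers)
    for i in rng.permutation(n_customers):      # random arrival order
        if available == 0:                      # inventory exhausted -> interference
            break
        p = p_book * (1 + lift) if treated_mask[i] else p_book
        if rng.random() < min(p, 1.0):
            bookings[i] = 1
            available -= 1
    return bookings


def naive_demand_side_estimate(n_listings, n_customers, treat_frac):
    """Customer-randomized experiment: scaled difference in means."""
    treated = rng.random(n_customers) < treat_frac
    y = run_market(n_listings, treated)
    # Naive projection of the change in total bookings under full treatment:
    return n_customers * (y[treated].mean() - y[~treated].mean())


def global_treatment_effect(n_listings, n_customers):
    """Ground truth: total bookings with everyone treated vs. no one treated."""
    all_t = run_market(n_listings, np.ones(n_customers, dtype=bool)).sum()
    all_c = run_market(n_listings, np.zeros(n_customers, dtype=bool)).sum()
    return all_t - all_c


# Supply-constrained market: more customers than listings. Values are arbitrary.
n_listings, n_customers, reps = 100, 300, 1000
gte = np.mean([global_treatment_effect(n_listings, n_customers) for _ in range(reps)])
for a in (0.1, 0.5, 0.9):                       # candidate treatment proportions
    est = np.mean([naive_demand_side_estimate(n_listings, n_customers, a)
                   for _ in range(reps)])
    print(f"treat_frac={a:.1f}  naive estimate={est:7.1f}  true GTE={gte:7.1f}")
```

Because supply is the scarce side in this toy market, treated customers take inventory that control customers would otherwise have booked, so the naive demand-side estimate overstates the global treatment effect. The paper's two design questions, which side to randomize on and what treatment proportion to use, concern managing exactly this kind of distortion together with the variance of the resulting estimator.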

Suggested Citation

  • Hannah Li & Geng Zhao & Ramesh Johari & Gabriel Y. Weintraub, 2021. "Interference, Bias, and Variance in Two-Sided Marketplace Experimentation: Guidance for Platforms," Papers 2104.12222, arXiv.org.
  • Handle: RePEc:arx:papers:2104.12222

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2104.12222
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Charles F. Manski, 2013. "Identification of treatment response with social interactions," Econometrics Journal, Royal Economic Society, vol. 16(1), pages 1-23, February.
    2. Susan Athey & Dean Eckles & Guido W. Imbens, 2018. "Exact p-Values for Network Interference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(521), pages 230-240, January.
    3. Kenneth Burdett & Shouyong Shi & Randall Wright, 2001. "Pricing and Matching with Frictions," Journal of Political Economy, University of Chicago Press, vol. 109(5), pages 1060-1085, October.
    4. Ramesh Johari & Hannah Li & Inessa Liskovich & Gabriel Weintraub, 2020. "Experimental Design in Two-Sided Platforms: An Analysis of Bias," Papers 2002.05670, arXiv.org, revised Sep 2021.
    5. David Holtz & Sinan Aral, 2020. "Limiting Bias from Test-Control Interference in Online Marketplace Experiments," Papers 2004.12162, arXiv.org.
    6. G W Basse & A Feller & P Toulis, 2019. "Randomization tests of causal effects under interference," Biometrika, Biometrika Trust, vol. 106(2), pages 487-494.
    7. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881.
    8. David Holtz & Ruben Lobel & Inessa Liskovich & Sinan Aral, 2020. "Reducing Interference Bias in Online Marketplace Pricing Experiments," Papers 2004.12489, arXiv.org.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Iavor Bojinov & David Simchi-Levi & Jinglong Zhao, 2023. "Design and Analysis of Switchback Experiments," Management Science, INFORMS, vol. 69(7), pages 3759-3777, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ramesh Johari & Hannah Li & Inessa Liskovich & Gabriel Y. Weintraub, 2022. "Experimental Design in Two-Sided Platforms: An Analysis of Bias," Management Science, INFORMS, vol. 68(10), pages 7069-7089, October.
    2. Julius Owusu, 2023. "Randomization Inference of Heterogeneous Treatment Effects under Network Interference," Papers 2308.00202, arXiv.org, revised Jan 2024.
    3. Iavor Bojinov & David Simchi-Levi & Jinglong Zhao, 2023. "Design and Analysis of Switchback Experiments," Management Science, INFORMS, vol. 69(7), pages 3759-3777, July.
    4. Stefan Wager & Kuang Xu, 2021. "Experimenting in Equilibrium," Management Science, INFORMS, vol. 67(11), pages 6694-6715, November.
    5. Tadao Hoshino & Takahide Yanagi, 2021. "Causal Inference with Noncompliance and Unknown Interference," Papers 2108.07455, arXiv.org, revised Oct 2023.
    6. Stefan Wager & Kuang Xu, 2019. "Experimenting in Equilibrium," Papers 1903.02124, arXiv.org, revised Jun 2020.
    7. Ramesh Johari & Hannah Li & Inessa Liskovich & Gabriel Weintraub, 2020. "Experimental Design in Two-Sided Platforms: An Analysis of Bias," Papers 2002.05670, arXiv.org, revised Sep 2021.
    8. Chabé-Ferret, Sylvain & Reynaud, Arnaud & Tène, Eva, 2021. "Water Quality, Policy Diffusion Effects and Farmers’ Behavior," TSE Working Papers 21-1229, Toulouse School of Economics (TSE).
    9. Ariel Boyarsky & Hongseok Namkoong & Jean Pouget-Abadie, 2023. "Modeling Interference Using Experiment Roll-out," Papers 2305.10728, arXiv.org, revised Aug 2023.
    10. Tadao Hoshino & Takahide Yanagi, 2023. "Randomization Test for the Specification of Interference Structure," Papers 2301.05580, arXiv.org, revised Dec 2023.
    11. Vivek F. Farias & Andrew A. Li & Tianyi Peng & Andrew Zheng, 2022. "Markovian Interference in Experiments," Papers 2206.02371, arXiv.org, revised Jun 2022.
    12. Kitagawa, Toru & Wang, Guanyi, 2023. "Who should get vaccinated? Individualized allocation of vaccines over SIR network," Journal of Econometrics, Elsevier, vol. 232(1), pages 109-131.
    13. Davide Viviano, 2020. "Experimental Design under Network Interference," Papers 2003.08421, arXiv.org, revised Jul 2022.
    14. Davide Viviano, 2019. "Policy Targeting under Network Interference," Papers 1906.10258, arXiv.org, revised Apr 2024.
    15. Michael Pollmann, 2020. "Causal Inference for Spatial Treatments," Papers 2011.00373, arXiv.org, revised Jan 2023.
    16. Yann Bramoullé & Habiba Djebbari & Bernard Fortin, 2020. "Peer Effects in Networks: A Survey," Annual Review of Economics, Annual Reviews, vol. 12(1), pages 603-629, August.
    17. Hao, Shiming, 2021. "True structure change, spurious treatment effect? A novel approach to disentangle treatment effects from structure changes," MPRA Paper 108679, University Library of Munich, Germany.
    18. Susan Athey & Dean Eckles & Guido W. Imbens, 2018. "Exact p-Values for Network Interference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(521), pages 230-240, January.
    19. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    20. Xiaokang Luo & Tirthankar Dasgupta & Minge Xie & Regina Y. Liu, 2021. "Leveraging the Fisher randomization test using confidence distributions: Inference, combination and fusion learning," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 777-797, September.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2104.12222. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.