Printed from https://ideas.repec.org/a/spr/infosf/v22y2020i6d10.1007_s10796-019-09938-6.html

Combining Spatial Optimization and Multi-Agent Temporal Difference Learning for Task Assignment in Uncertain Crowdsourcing

Author

Listed:
  • Yong Sun

    (Nanjing University of Aeronautics and Astronautics
    Chuzhou University)

  • Wenan Tan

    (Nanjing University of Aeronautics and Astronautics)

Abstract

In recent years, spatial crowdsourcing has emerged as an important new framework in which each spatial task requires a set of suitable crowd-workers in the vicinity of the target location. Previous studies have focused on spatial task assignment in static crowdsourcing environments. Such algorithms may reach only locally optimal assignments because they neglect the uncertainty inherent in real-world crowdsourcing environments, where workers may join or leave at run time. Moreover, spatial task assignment becomes more complicated when large numbers of crowd-workers exist in the crowdsourcing environment. The large-scale nature of task assignment poses a significant challenge to uncertain spatial crowdsourcing. In this paper, we propose a novel algorithm combining spatial optimization and multi-agent temporal difference learning (SMATDL). The combination of grid-based optimization and multi-agent learning achieves higher adaptability and maintains greater efficiency than traditional learning algorithms on large-scale crowdsourcing problems. The SMATDL algorithm decomposes the uncertain crowdsourcing problem into numerous sub-problems by means of a grid-based optimization approach. To adapt to changes in the large-scale environment, each agent uses temporal difference learning to optimize its own spatial region in online crowdsourcing. As a result, the multiple agents in SMATDL collaboratively learn to accomplish the global assignment problem efficiently. Through extensive experiments, we illustrate the effectiveness and efficiency of the proposed algorithms on the experimental data sets.
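The abstract's core idea, decomposing the region into grid cells and letting one agent per cell learn from its own task stream via temporal difference updates, can be sketched as follows. This is an illustrative reconstruction from the abstract alone, not the paper's implementation: the `CellAgent` class, the cell-indexing scheme, the state (number of available workers in a cell), and the reward signal are all assumptions chosen to make the sketch runnable.

```python
import random

class CellAgent:
    """One agent per grid cell; learns cell-level state values via TD(0)."""
    def __init__(self, alpha=0.1, gamma=0.9):
        self.alpha = alpha      # learning rate
        self.gamma = gamma      # discount factor
        self.value = {}         # state -> estimated value

    def v(self, state):
        return self.value.get(state, 0.0)

    def td_update(self, state, reward, next_state):
        # TD(0): V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s))
        target = reward + self.gamma * self.v(next_state)
        self.value[state] = self.v(state) + self.alpha * (target - self.v(state))

def cell_of(x, y, cell_size=10.0):
    """Grid-based decomposition: map a location to its grid-cell index."""
    return (int(x // cell_size), int(y // cell_size))

# Each cell's agent learns independently from the tasks arriving in its cell;
# worker availability is randomized to mimic workers joining/leaving at run time.
random.seed(0)
agents = {}
for _ in range(1000):
    x, y = random.uniform(0, 100), random.uniform(0, 100)
    agent = agents.setdefault(cell_of(x, y), CellAgent())
    n_workers = random.randint(0, 5)              # workers currently in the cell
    state, next_state = n_workers, max(0, n_workers - 1)
    reward = 1.0 if n_workers > 0 else -1.0       # task served vs. dropped
    agent.td_update(state, reward, next_state)
```

Because each agent touches only its own cell's statistics, the per-step cost is independent of the global number of workers, which is one plausible reading of how the grid decomposition keeps large-scale assignment tractable.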

Suggested Citation

  • Yong Sun & Wenan Tan, 2020. "Combining Spatial Optimization and Multi-Agent Temporal Difference Learning for Task Assignment in Uncertain Crowdsourcing," Information Systems Frontiers, Springer, vol. 22(6), pages 1447-1465, December.
  • Handle: RePEc:spr:infosf:v:22:y:2020:i:6:d:10.1007_s10796-019-09938-6
    DOI: 10.1007/s10796-019-09938-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10796-019-09938-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10796-019-09938-6?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yuxiang Zhao & Qinghua Zhu, 2014. "Evaluation on crowdsourcing research: Current status and future direction," Information Systems Frontiers, Springer, vol. 16(3), pages 417-434, July.
    2. Yong Sun & Wenan Tan & Lingxia Li & Weiming Shen & Zhuming Bi & Xiaoming Hu, 2016. "A new method to identify collaborative partners in social service provider networks," Information Systems Frontiers, Springer, vol. 18(3), pages 565-578, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yong Sun & Wenan Tan, 0. "Combining Spatial Optimization and Multi-Agent Temporal Difference Learning for Task Assignment in Uncertain Crowdsourcing," Information Systems Frontiers, Springer, vol. 0, pages 1-19.
    2. Livio Cricelli & Michele Grimaldi & Silvia Vermicelli, 2022. "Crowdsourcing and open innovation: a systematic literature review, an integrated framework and a research agenda," Review of Managerial Science, Springer, vol. 16(5), pages 1269-1310, July.
    3. Silvia Blasi & Silvia Rita Sedita, 2018. "Leveraging the power of creative crowds for innovative brands: the eYeka crowdsourcing initiatives," "Marco Fanno" Working Papers 0228, Dipartimento di Scienze Economiche "Marco Fanno".
    4. Regina Lenart-Gansiniec, 2018. "Methodological Challenges of Research on Crowdsourcing," Journal of Entrepreneurship, Management and Innovation, Fundacja Upowszechniająca Wiedzę i Naukę "Cognitione", vol. 4(4), pages 107-126.
    5. Hong Jiang & Shuyu Sun & Hongtao Xu & Shukuan Zhao & Yong Chen, 2020. "Enterprises' network structure and their technology standardization capability in Industry 4.0," Systems Research and Behavioral Science, Wiley Blackwell, vol. 37(4), pages 749-765, July.
    6. Till Blesik & Markus Bick & Tyge-F. Kummer, 2022. "A Conceptualisation of Crowd Knowledge," Information Systems Frontiers, Springer, vol. 24(5), pages 1647-1665, October.
    7. Xuanwei Zhao & Enjun Xia, 2016. "Research On The Operation Mechanism Of Network Crowdsourcing System And Constitutions Of Crowdsourcing Capability," International Journal of Innovation Management (ijim), World Scientific Publishing Co. Pte. Ltd., vol. 20(07), pages 1-18, October.
    8. Henner Gimpel & Vanessa Graf-Seyfried & Robert Laubacher & Oliver Meindl, 2023. "Towards Artificial Intelligence Augmenting Facilitation: AI Affordances in Macro-Task Crowdsourcing," Group Decision and Negotiation, Springer, vol. 32(1), pages 75-124, February.
    9. Marta Poblet & Esteban García-Cuesta & Pompeu Casanovas, 2018. "Crowdsourcing roles, methods and tools for data-intensive disaster management," Information Systems Frontiers, Springer, vol. 20(6), pages 1363-1379, December.
    10. Gaganmeet Kaur Awal & K. K. Bharadwaj, 2019. "Leveraging collective intelligence for behavioral prediction in signed social networks through evolutionary approach," Information Systems Frontiers, Springer, vol. 21(2), pages 417-439, April.
    11. Lee, Jung & Seo, DongBack, 2016. "Crowdsourcing not all sourced by the crowd: An observation on the behavior of Wikipedia participants," Technovation, Elsevier, vol. 55, pages 14-21.
    12. Bal, Anjali S. & Weidner, Kelly & Hanna, Richard & Mills, Adam J., 2017. "Crowdsourcing and brand control," Business Horizons, Elsevier, vol. 60(2), pages 219-228.
    13. Roman Lukyanenko & Andrea Wiggins & Holly K. Rosser, 0. "Citizen Science: An Information Quality Research Frontier," Information Systems Frontiers, Springer, vol. 0, pages 1-23.
    14. Yin, Xicheng & Wang, Hongwei & Wang, Wei & Zhu, Kevin, 2020. "Task recommendation in crowdsourcing systems: A bibliometric analysis," Technology in Society, Elsevier, vol. 63(C).
    15. Anna Adamik & Michał Nowicki & Andrius Puksas, 2022. "Energy Oriented Concepts and Other SMART WORLD Trends as Game Changers of Co-Production—Reality or Future?," Energies, MDPI, vol. 15(11), pages 1-38, June.
    16. Carbajo, Ruth & Cabeza, Luisa F., 2018. "Renewable energy research and technologies through responsible research and innovation looking glass: Reflexions, theoretical approaches and contemporary discourses," Applied Energy, Elsevier, vol. 211(C), pages 792-808.
    17. Dan Li & Longying Hu, 2017. "Exploring the effects of reward and competition intensity on participation in crowdsourcing contests," Electronic Markets, Springer;IIM University of St. Gallen, vol. 27(3), pages 199-210, August.
    18. Marta Poblet & Esteban García-Cuesta & Pompeu Casanovas, 0. "Crowdsourcing roles, methods and tools for data-intensive disaster management," Information Systems Frontiers, Springer, vol. 0, pages 1-17.
    19. Regina Lenart-Gansiniec, 2017. "Factors Influencing Decisions about Crowdsourcing in the Public Sector: A Literature Review," Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, Mendel University Press, vol. 65(6), pages 1997-2005.
    20. Naudé, Wim & Bray, Amy & Lee, Celina, 2021. "Crowdsourcing Artificial Intelligence in Africa: Findings from a Machine Learning Contest," IZA Discussion Papers 14545, Institute of Labor Economics (IZA).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:infosf:v:22:y:2020:i:6:d:10.1007_s10796-019-09938-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.