
Evaluating Retrieval Effectiveness by Sustainable Rank List

Author

Listed:
  • Tenvir Ali

    (Department of Information and Communication Engineering, Yeungnam University, Gyeongsan, Gyeongbuk 38541, Korea)

  • Zeeshan Jhandir

    (Department of Information and Communication Engineering, Yeungnam University, Gyeongsan, Gyeongbuk 38541, Korea)

  • Ingyu Lee

    (Sorrell College of Business, Troy University, Troy, AL 36082, USA)

  • Byung-Won On

    (Department of Software Convergence Engineering, Kunsan National University, Gunsan-si, Jeollabuk-do 54150, Korea)

  • Gyu Sang Choi

    (Department of Information and Communication Engineering, Yeungnam University, Gyeongsan, Gyeongbuk 38541, Korea)

Abstract

The Internet of Things (IoT) and Big Data are among the most popular emerging fields of computer science today. IoT devices generate enormous amounts of data daily and at widely different scales; hence, search engines must support rapid ingestion and processing followed by fast, accurate retrieval. Researchers and students in computer science query search engines on these topics to find a wealth of IoT-related information. In this study, we evaluate the relative performance of two search engines: Bing and Yandex. This work proposes an automatic scheme that populates a sustainable optimal rank list of search results with higher precision for IoT-related topics. The proposed scheme rewrites the seed query with the help of attribute terms extracted from the page corpus. Additionally, we boost and dampen web pages based on newness and geo-sensitivity during the re-ranking process. To evaluate the proposed scheme, we use evaluation metrics based on discounted cumulative gain (DCG), normalized DCG (nDCG), and mean average precision (MAP@n). The experimental results show that the proposed scheme achieves MAP@5 = 0.60, DCG@5 = 4.43, and nDCG@5 = 0.95 for general queries; DCG@5 = 4.14 and nDCG@5 = 0.93 for time-stamp-based queries; and DCG@5 = 4.15 and nDCG@5 = 0.96 for geographical location-based queries. These outcomes validate the usefulness of the suggested scheme in helping users access IoT-related information.
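Note: the abstract reports DCG@5, nDCG@5, and MAP@5 scores but does not reproduce the underlying formulas. The following minimal Python sketch shows the standard textbook definitions of these metrics; the relevance judgments in the example are hypothetical, and the exact DCG discount variant and relevance grading scale used in the paper may differ.

    import math
    from typing import List, Sequence

    def dcg_at_k(rels: Sequence[float], k: int) -> float:
        """DCG@k with the common log2(i+1) discount; rels are graded relevance scores in rank order."""
        return sum(rel / math.log2(i + 2) for i, rel in enumerate(rels[:k]))

    def ndcg_at_k(rels: Sequence[float], k: int) -> float:
        """nDCG@k: DCG@k normalized by the DCG of the ideal (descending) ordering of the same judgments."""
        ideal = dcg_at_k(sorted(rels, reverse=True), k)
        return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0

    def average_precision_at_k(rels: Sequence[int], k: int) -> float:
        """AP@k over binary relevance labels (1 = relevant, 0 = not), in rank order."""
        hits, score = 0, 0.0
        for i, rel in enumerate(rels[:k]):
            if rel:
                hits += 1
                score += hits / (i + 1)      # precision at this rank
        # One common convention: normalize by relevant documents retrieved in the top k.
        return score / hits if hits else 0.0

    def map_at_k(all_rels: List[Sequence[int]], k: int) -> float:
        """MAP@k: mean of AP@k across all evaluated queries."""
        return sum(average_precision_at_k(r, k) for r in all_rels) / len(all_rels)

    # Hypothetical example: graded judgments (0-2) for the top five results of one query
    graded = [2, 2, 1, 0, 2]
    print(f"DCG@5  = {dcg_at_k(graded, 5):.2f}")
    print(f"nDCG@5 = {ndcg_at_k(graded, 5):.2f}")
    # Binary judgments for three hypothetical queries
    print(f"MAP@5  = {map_at_k([[1, 1, 0, 1, 0], [1, 0, 1, 1, 1], [0, 1, 1, 0, 1]], 5):.2f}")

Under these definitions, an nDCG@5 close to 1 (as in the 0.93-0.96 values reported above) means the top five results are ordered almost as an ideal ranking of the judged pages would order them.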

Suggested Citation

  • Tenvir Ali & Zeeshan Jhandir & Ingyu Lee & Byung-Won On & Gyu Sang Choi, 2017. "Evaluating Retrieval Effectiveness by Sustainable Rank List," Sustainability, MDPI, vol. 9(7), pages 1-20, July.
  • Handle: RePEc:gam:jsusta:v:9:y:2017:i:7:p:1203-:d:104062

    Download full text from publisher

    File URL: https://www.mdpi.com/2071-1050/9/7/1203/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2071-1050/9/7/1203/
    Download Restriction: no

    References listed on IDEAS

    1. Tefko Saracevic, 2007. "Relevance: A review of the literature and a framework for thinking on the notion in information science. Part III: Behavior and effects of relevance," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(13), pages 2126-2144, November.
    2. Stephen P. Harter, 1996. "Variations in relevance assessments and the measurement of retrieval effectiveness," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 47(1), pages 37-49, January.
    3. Dirk Lewandowski, 2015. "Evaluating the retrieval effectiveness of web search engines using a representative query sample," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(9), pages 1763-1775, September.
    4. Arif Mehmood & Gyu Sang Choi & Otto F. Feigenblatt & Han Woo Park, 2016. "Proving ground for social network analysis in the emerging research area “Internet of Things” (IoT)," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(1), pages 185-201, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Roland Grad & Pierre Pluye & Vera Granikov & Janique Johnson‐Lafleur & Michael Shulha & Soumya Bindiganavile Sridhar & Jonathan L. Moscovici & Gillian Bartlett & Alain C. Vandal & Bernard Marlow & Lor, 2011. "Physicians' assessment of the value of clinical information: Operationalization of a theoretical model," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(10), pages 1884-1891, October.
    2. Takano, Yasutomo & Kajikawa, Yuya, 2019. "Extracting commercialization opportunities of the Internet of Things: Measuring text similarity between papers and patents," Technological Forecasting and Social Change, Elsevier, vol. 138(C), pages 45-68.
    3. Basso, Fernanda Gisele & Pereira, Cristiano Gonçalves & Porto, Geciane Silveira, 2021. "Cooperation and technological areas in the state universities of São Paulo: An analysis from the perspective of the triple helix model," Technology in Society, Elsevier, vol. 65(C).
    4. Matheus Becker Costa & Leonardo Moraes Aguiar Lima Santos & Jones Luís Schaefer & Ismael Cristofer Baierle & Elpidio Oscar Benitez Nara, 2019. "Industry 4.0 technologies basic network identification," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 977-994, November.
    5. Frans van der Sluis & Egon L. van den Broek, 2023. "Feedback beyond accuracy: Using eye‐tracking to detect comprehensibility and interest during reading," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 3-16, January.
    6. Amosa Babalola & Olayemi Olalekan & Onyeka Ndidi & Nwaekpe Christian, 2021. "Usage of Internet Search Engines among Polytechnic Students," International Journal of Research and Scientific Innovation, International Journal of Research and Scientific Innovation (IJRSI), vol. 8(9), pages 76-80, September.
    7. Wayne de Fremery & Michael K. Buckland, 2022. "Context, relevance, and labor," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(9), pages 1268-1278, September.
    8. Lu, Yang & Papagiannidis, Savvas & Alamanos, Eleftherios, 2018. "Internet of Things: A systematic review of the business literature from the user and organisational perspectives," Technological Forecasting and Social Change, Elsevier, vol. 136(C), pages 285-297.
    9. Julianne Sansa-Otim & Mary Nsabagwa & Andrew Mwesigwa & Becky Faith & Mojisola Owoseni & Olayinka Osuolale & Daudi Mboma & Ben Khemis & Peter Albino & Samuel Owusu Ansah & Maureen Abla Ahiataku & Vict, 2022. "An Assessment of the Effectiveness of Weather Information Dissemination among Farmers and Policy Makers," Sustainability, MDPI, vol. 14(7), pages 1-20, March.
    10. Maram Hasanain & Tamer Elsayed, 2022. "Studying effectiveness of Web search for fact checking," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(5), pages 738-751, May.
    11. Zhang, Lei & Kopak, Rick & Freund, Luanne & Rasmussen, Edie, 2011. "Making functional units functional: The role of rhetorical structure in use of scholarly journal articles," International Journal of Information Management, Elsevier, vol. 31(1), pages 21-29.
    12. Gineke Wiggers & Suzan Verberne & Wouter van Loon & Gerrit‐Jan Zwenne, 2023. "Bibliometric‐enhanced legal information retrieval: Combining usage and citations as flavors of impact relevance," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(8), pages 1010-1025, August.
    13. Howard D. White, 2015. "Co-cited author retrieval and relevance theory: examples from the humanities," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2275-2299, March.
    14. Moghadasi, Shiva Imani & Ravana, Sri Devi & Raman, Sudharshan N., 2013. "Low-cost evaluation techniques for information retrieval systems: A review," Journal of Informetrics, Elsevier, vol. 7(2), pages 301-312.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jsusta:v:9:y:2017:i:7:p:1203-:d:104062. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.