
Short text classification with machine learning in the social sciences: The case of climate change on Twitter

Author

Listed:
  • Karina Shyrokykh
  • Max Girnyk
  • Lisa Dellmuth

Abstract

To analyse large numbers of texts, social science researchers increasingly confront the challenge of text classification. When manual labeling is not possible and researchers have to find automated ways to classify texts, computer science provides a useful toolbox of machine-learning methods whose performance remains understudied in the social sciences. In this article, we compare the performance of the most widely used text classifiers by applying them to a typical research scenario in social science research: a relatively small labeled dataset with infrequent occurrence of the categories of interest, which is part of a large unlabeled dataset. As an example case, we look at Twitter communication about climate change, a topic of increasing scholarly interest in interdisciplinary social science research. Using a novel dataset of 5,750 tweets from various international organizations about the highly ambiguous concept of climate change, we evaluate how well these methods automatically classify tweets according to whether or not they are about climate change. In this context, we highlight two main findings. First, supervised machine-learning methods perform better than state-of-the-art lexicons, in particular as class balance increases. Second, traditional machine-learning methods, such as logistic regression and random forest, perform similarly to sophisticated deep-learning methods, whilst requiring much less training time and computational resources. The results have important implications for the analysis of short texts in social science research.
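
As a rough illustration of the comparison described in the abstract, the sketch below trains two of the traditional supervised classifiers mentioned there (logistic regression and random forest) on TF-IDF features from labeled tweets and reports standard classification metrics. It is a minimal sketch using scikit-learn, not the authors' actual pipeline; the file name labeled_tweets.csv, the column names, and the hyperparameters are illustrative assumptions.

    # Minimal sketch (assumptions noted above): binary classification of tweets as
    # climate-related (1) or not (0) with two traditional supervised classifiers.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline

    # Hypothetical labeled data: one tweet per row, columns "text" and "label".
    df = pd.read_csv("labeled_tweets.csv")
    X_train, X_test, y_train, y_test = train_test_split(
        df["text"], df["label"], test_size=0.2, stratify=df["label"], random_state=42
    )

    models = {
        "logistic_regression": LogisticRegression(max_iter=1000, class_weight="balanced"),
        "random_forest": RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=42),
    }

    for name, clf in models.items():
        # Word and bigram TF-IDF features feed each classifier via a single pipeline.
        pipe = Pipeline([
            ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),
            ("clf", clf),
        ])
        pipe.fit(X_train, y_train)
        print(name)
        print(classification_report(y_test, pipe.predict(X_test)))

Given the class imbalance highlighted in the abstract, class_weight="balanced" is one common (assumed) way to handle infrequent positive examples; the precision, recall, and F1 scores printed by classification_report are the metrics typically compared in such evaluations.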

Suggested Citation

  • Karina Shyrokykh & Max Girnyk & Lisa Dellmuth, 2023. "Short text classification with machine learning in the social sciences: The case of climate change on Twitter," PLOS ONE, Public Library of Science, vol. 18(9), pages 1-26, September.
  • Handle: RePEc:plo:pone00:0290762
    DOI: 10.1371/journal.pone.0290762

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0290762
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0290762&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. Beakcheol Jang & Inhwan Kim & Jong Wook Kim, 2019. "Word2vec convolutional neural networks for classification of news articles and tweets," PLOS ONE, Public Library of Science, vol. 14(8), pages 1-20, August.
    2. Grimmer, Justin & Stewart, Brandon M., 2013. "Text as Data: The Promise and Pitfalls of Automatic Content Analysis Methods for Political Texts," Political Analysis, Cambridge University Press, vol. 21(3), pages 267-297, July.
    3. Denny, Matthew J. & Spirling, Arthur, 2018. "Text Preprocessing For Unsupervised Learning: Why It Matters, When It Misleads, And What To Do About It," Political Analysis, Cambridge University Press, vol. 26(2), pages 168-189, April.
    4. Sebők, Miklós & Kacsuk, Zoltán, 2021. "The Multiclass Classification of Newspaper Articles with Machine Learning: The Hybrid Binary Snowball Approach," Political Analysis, Cambridge University Press, vol. 29(2), pages 236-249, April.
    5. D'Orazio, Vito & Landis, Steven T. & Palmer, Glenn & Schrodt, Philip, 2014. "Separating the Wheat from the Chaff: Applications of Automated Document Classification Using Support Vector Machines," Political Analysis, Cambridge University Press, vol. 22(2), pages 224-242, April.
    6. Greene, Kevin T. & Park, Baekkwan & Colaresi, Michael, 2019. "Machine Learning Human Rights and Wrongs: How the Successes and Failures of Supervised Learning Algorithms Can Inform the Debate About Information Effects," Political Analysis, Cambridge University Press, vol. 27(2), pages 223-230, April.
    7. Lisa Maria Dellmuth & Maria-Therese Gustafsson, 2021. "Global adaptation governance: how intergovernmental organizations mainstream climate change adaptation," Climate Policy, Taylor & Francis Journals, vol. 21(7), pages 868-883, August.
    8. Constantine Boussalis & Travis G. Coan & Mirya R. Holman, 2018. "Climate change communication from cities in the USA," Climatic Change, Springer, vol. 149(2), pages 173-187, July.
    9. Åsa Persson, 2019. "Global adaptation governance: An emerging but contested domain," Wiley Interdisciplinary Reviews: Climate Change, John Wiley & Sons, vol. 10(6), November.
    10. Martin Popel & Marketa Tomkova & Jakub Tomek & Łukasz Kaiser & Jakob Uszkoreit & Ondřej Bojar & Zdeněk Žabokrtský, 2020. "Transforming machine translation: a deep learning system reaches news translation quality comparable to human professionals," Nature Communications, Nature, vol. 11(1), pages 1-15, December.
    11. Miller, Blake & Linder, Fridolin & Mebane, Walter R., 2020. "Active Learning Approaches for Labeling Text: Review and Assessment of the Performance of Active Learning Approaches," Political Analysis, Cambridge University Press, vol. 28(4), pages 532-551, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Miklos Sebők & Zoltán Kacsuk & Ákos Máté, 2022. "The (real) need for a human touch: testing a human–machine hybrid topic classification workflow on a New York Times corpus," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(5), pages 3621-3643, October.
    2. Paweł Matuszewski, 2023. "How to prepare data for the automatic classification of politically related beliefs expressed on Twitter? The consequences of researchers’ decisions on the number of coders, the algorithm learning pro," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(1), pages 301-321, February.
    3. Sandra Wankmüller, 2023. "A comparison of approaches for imbalanced classification problems in the context of retrieving relevant documents for an analysis," Journal of Computational Social Science, Springer, vol. 6(1), pages 91-163, April.
    4. Mohamed M. Mostafa, 2023. "A one-hundred-year structural topic modeling analysis of the knowledge structure of international management research," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(4), pages 3905-3935, August.
    5. Latifi, Albina & Naboka-Krell, Viktoriia & Tillmann, Peter & Winker, Peter, 2024. "Fiscal policy in the Bundestag: Textual analysis and macroeconomic effects," European Economic Review, Elsevier, vol. 168(C).
    6. Purwoko Haryadi Santoso & Edi Istiyono & Haryanto & Wahyu Hidayatulloh, 2022. "Thematic Analysis of Indonesian Physics Education Research Literature Using Machine Learning," Data, MDPI, vol. 7(11), pages 1-41, October.
    7. Camilla Salvatore & Silvia Biffignandi & Annamaria Bianchi, 2022. "Corporate Social Responsibility Activities Through Twitter: From Topic Model Analysis to Indexes Measuring Communication Characteristics," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 164(3), pages 1217-1248, December.
    8. Jason Anastasopoulos & George J. Borjas & Gavin G. Cook & Michael Lachanski, 2018. "Job Vacancies, the Beveridge Curve, and Supply Shocks: The Frequency and Content of Help-Wanted Ads in Pre- and Post-Mariel Miami," NBER Working Papers 24580, National Bureau of Economic Research, Inc.
    9. W. Benedikt Schmal, 2024. "Academic Knowledge: Does it Reflect the Combinatorial Growth of Technology?," Papers 2409.20282, arXiv.org.
    10. Seraphine F. Maerz & Carsten Q. Schneider, 2020. "Comparing public communication in democracies and autocracies: automated text analyses of speeches by heads of government," Quality & Quantity: International Journal of Methodology, Springer, vol. 54(2), pages 517-545, April.
    11. Iasmin Goes, 2023. "Examining the effect of IMF conditionality on natural resource policy," Economics and Politics, Wiley Blackwell, vol. 35(1), pages 227-285, March.
    12. Erkan Gunes & Christoffer Koch Florczak, 2025. "Replacing or enhancing the human coder? Multiclass classification of policy documents with large language models," Journal of Computational Social Science, Springer, vol. 8(2), pages 1-20, May.
    13. Mircea Popa, 2025. "Modelling policy action using natural language processing: evidence for a long-run increase in policy activism in the UK," Journal of Computational Social Science, Springer, vol. 8(2), pages 1-51, May.
    14. Yeomans, Michael, 2021. "A concrete example of construct construction in natural language," Organizational Behavior and Human Decision Processes, Elsevier, vol. 162(C), pages 81-94.
    15. Karell, Daniel & Freedman, Michael Raphael, 2019. "Rhetorics of Radicalism," SocArXiv yfzsh, Center for Open Science.
    16. Bastiaan Bruinsma & Moa Johansson, 2024. "Finding the structure of parliamentary motions in the Swedish Riksdag 1971–2015," Quality & Quantity: International Journal of Methodology, Springer, vol. 58(4), pages 3275-3301, August.
    17. Sara Kahn-Nisser, 2019. "When the targets are members and donors: Analyzing inter-governmental organizations’ human rights shaming," The Review of International Organizations, Springer, vol. 14(3), pages 431-451, September.
    18. Justyna Klejdysz & Robin L. Lumsdaine, 2023. "Shifts in ECB Communication: A Textual Analysis of the Press Conference," International Journal of Central Banking, International Journal of Central Banking, vol. 19(2), pages 473-542, June.
    19. Javier De la Hoz-M & Mª José Fernández-Gómez & Susana Mendes, 2021. "LDAShiny: An R Package for Exploratory Review of Scientific Literature Based on a Bayesian Probabilistic Model and Machine Learning Tools," Mathematics, MDPI, vol. 9(14), pages 1-21, July.
    20. Andres Algaba & David Ardia & Keven Bluteau & Samuel Borms & Kris Boudt, 2020. "Econometrics Meets Sentiment: An Overview Of Methodology And Applications," Journal of Economic Surveys, Wiley Blackwell, vol. 34(3), pages 512-547, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0290762. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone. General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.