Is ChatGPT a friend or foe in the war on misinformation?: A South African perspective


Burgert Senekal
University of the Free State
Susan Brokensha
University of the Free State

How to Cite

Is ChatGPT a friend or foe in the war on misinformation?: A South African perspective. (2023). Communicare: Journal for Communication Studies in Africa, 42(2), 3-16. https://doi.org/10.36615/jcsa.v42i2.2437
  • Articles
  • Submitted: April 6, 2023
  • Published: December 9, 2023

Abstract

The release of ChatGPT at the end of 2022 was met with both fear and optimism. One particularly important emerging avenue of research concerns ChatGPT's ability to provide accurate and unbiased information on a variety of topics. Given the interest that Google and Microsoft have shown in similar technologies, Large Language Models such as ChatGPT are likely to become new gateways to information, and if this is the case, the kind of information this technology provides needs to be investigated. The current study examines the usefulness of ChatGPT as a source of information in a South African context by first investigating ChatGPT's responses to ten South African conspiracy theories in terms of truthfulness, and then employing bias classification and sentiment analysis to evaluate whether ChatGPT exhibits bias when presenting eight South African political topics. We found that, overall, ChatGPT did not spread conspiracy theories. However, the tool generated falsehoods around one conspiracy theory and generally presented a left-leaning bias, albeit not an extreme one. Sentiment analysis showed that ChatGPT's responses were mostly neutral and, when more emotive, were more often positive than negative. The implications of the findings for academics and students are discussed, as are a number of recommendations for future research.
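As an illustration of the sentiment-analysis step summarised above, the sketch below labels a set of placeholder ChatGPT responses as negative, neutral, or positive. It uses NLTK's general-purpose VADER lexicon purely as a stand-in for the lexicon-based tools cited in the reference list (e.g. Umigon); the topics, response texts, and the ±0.05 neutrality threshold are illustrative assumptions, not the authors' actual data or parameters.

```python
# Illustrative sketch only: labels the sentiment of model responses with NLTK's
# VADER lexicon, standing in for the lexicon-based tools cited in the study.
# Topics, response texts, and the 0.05 threshold are placeholders.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # fetch the lexicon on first use

analyzer = SentimentIntensityAnalyzer()

# Hypothetical ChatGPT responses keyed by political topic (not real study data).
responses = {
    "land reform": "Land reform remains a complex and contested policy area...",
    "state capture": "State capture has had serious consequences for governance...",
}

for topic, text in responses.items():
    compound = analyzer.polarity_scores(text)["compound"]  # ranges from -1 to +1
    if compound >= 0.05:
        label = "positive"
    elif compound <= -0.05:
        label = "negative"
    else:
        label = "neutral"
    print(f"{topic}: {label} ({compound:+.2f})")
```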

References

  1. @Kgabane. (2021). Twitter post. Twitter, 16 June. Available from: https://twitter.com/Kgabane/status/1405073464344694784 Accessed 4 April 2023.
  2. @Yolandacuba. (2021). Twitter post. Twitter, 16 June. Available from: https://twitter.com/Yolandacuba/status/1405057431894437891 Accessed 4 April 2023.
  3. Akinola, A.O. (2020). Farm attacks or ‘white genocide’? Interrogating the unresolved land question in South Africa. African Journal on Conflict Resolution, 20(2):65–91.
  4. Anonymous. (2005). Department of error. The Lancet, 366(9485):548. doi: 10.1016/S0140-6736(05)67096-1
  5. Asak, M.O. & Molale, T.B. (2020). Deconstructing de-legitimisation of mainstream media as sources of authentic news in the post-truth era. Communicatio, 46(4):50-74. doi: 10.1080/02500167.2020.1723664
  6. Bang, Y., Cahyawijaya, S., Lee, N., Dai, W., Su, D., Wilie, B., Lovenia, H., Ji, Z., Yu, T., Chung, W., Do, Q. V., Xu, Y. & Fung, P. (2023). A multitask, multilingual, multimodal evaluation of ChatGPT on reasoning, hallucination, and interactivity. arXiv. doi: 10.48550/arxiv.2302.04023
  7. Borji, A. (2023). A categorical archive of ChatGPT failures. arXiv. doi: 10.48550/arxiv.2302.03494
  8. Brokensha, S.I., Kotzé, E. & Senekal, B.A. (2023). AI in and for Africa: A humanistic perspective. (Artificial Intelligence and Robotics Series). Boca Raton, FL: Chapman & Hall/CRC.
  9. Brown, T. B., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A., Agarwal, S., Herbert-Voss, A., Krueger, G., Henighan, T., Child, R., Ramesh, A., Ziegler, D. M., Wu, J., Winter, C., Hesse, C. & Amodei, D. (2020). Language models are few-shot learners. arXiv. doi: 10.48550/arxiv.2005.14165
  10. Chutel, L. (2018). Why googling squatter camps in South Africa returns pictures of white people. Quartz, 15 June. Available from: https://qz.com/africa/1306782/why-googling-squatter-camps-in-south-africa-returns-pictures-of-white-people/ Accessed 4 April 2023.
  11. Clack, W. & Minnaar, A. (2018). Rural crime in South Africa: an exploratory review of ‘farm attacks’ and stocktheft as the primary crimes in rural areas. Acta Criminologica: African Journal of Criminology & Victimology, 31(1):103-135.
  12. Cooper, M.W. (1991). Behold a pale horse. Flagstaff: Light Technology Publishing.
  13. Davis, R. (2020). QAnon originated in South Africa – now that the global cult is back here we should all be afraid. Daily Maverick, 26 September. Available from: https://www.dailymaverick.co.za/article/2020-09-26-qanon-originated-in-south-africa-now-that-the-global-cult-is-back-here-we-should-all-be-afraid/ Accessed: 4 April 2023.
  14. De Angelis, L., Baglivo, F., Arzilli, G., Privitera, G.P., Ferragina, P., Tozzi, A.E. & Rizzo, C. (2023). ChatGPT and the rise of large language models: The new AI-driven infodemic threat in public health. Frontiers in Public Health, 11:1-8. doi: 10.3389/fpubh.2023.1166120
  15. Douglas, K.M., Uscinski, J.E., Sutton, R.M., Cichocka, A., Nefes, T., Ang, C.S. & Deravi, F. (2019). Understanding conspiracy theories. Political psychology, 40(S1):3-35. doi: 10.1111/pops.12568
  16. Duberry, J. (2022). Artificial intelligence and democracy: Risks and promises of AI-mediated citizen–government relations. USA: Edward Elgar Publishing. doi: 10.4337/9781788977319.00011
  17. Falkof, N. (2021). Worrier state: Risk, anxiety and moral panic in South Africa. Johannesburg: Wits University Press. doi: 10.7765/9781526164032.00008
  18. Fassin, D. (2022). Conspiracy theories as ambiguous critique of crisis. In Crisis under critique: How people assess, transform, and respond to critical situations, pp. 425-440. Edited by Fassin, D. & Honneth, A. New York: Columbia University Press.
  19. Fassin, D. & Schneider, H. (2003). The politics of AIDS in South Africa: Beyond the controversies. British Medical Journal (BMJ, Clinical Research Ed.), 326(7387):495–497. doi: 10.1136/bmj.326.7387.495
  20. Fizek, S. & Dippel, A. (2020). Gamification of terror – power games as liminal spaces. In Games and ethics, pp. 77-94. Edited by Groen, M., Kiel, N., Tillmann, A. & Weßel, A. Wiesbaden: Springer Fachmedien Wiesbaden (Digitale Kultur und Kommunikation). doi: 10.1007/978-3-658-28175-5_6
  21. Fox, M. (2005). Hamilton Naki, 78, self-taught surgeon, dies. New York Times, 11 June. Available from: https://www.nytimes.com/2005/06/11/obituaries/hamilton-naki-78-selftaught-surgeon-dies.html. Accessed 16 March 2023.
  22. Fuchs, K. (2023). Exploring the opportunities and challenges of NLP models in higher education: Is Chat GPT a blessing or a curse? Frontiers in Education, 8:1166682.
  23. Gao, C.A., Howard, F.M., Markov, N.S., Dyer, E.C., Ramesh, S., Luo, Y. & Pearson, A.T. (2022). Comparing scientific abstracts generated by ChatGPT to original abstracts using an artificial intelligence output detector, plagiarism detector, and blinded human reviewers. bioRxiv. doi: 10.1101/2022.12.23.521610
  24. Ge, J. & Lai, J.C. (2023). Artificial intelligence-based text generators in hepatology: ChatGPT is just the beginning. Hepatology Communications, 7(4):e0097. doi: 10.1097/HC9.0000000000000097
  25. Ghosh, S. & Caliskan, A. (2023). ChatGPT perpetuates gender bias in machine translation and ignores non-gendered pronouns: Findings across Bengali and five other low-resource languages. arXiv preprint arXiv: 2305.10510.
  26. Gilson, A., Safranek, C., Huang, T., Socrates, V., Chi, L., Taylor, R.A. & Chartash, D. (2022). How does ChatGPT perform on the medical licensing exams? The implications of large language models for medical education and knowledge assessment. medRxiv. doi: 10.1101/2022.12.23.22283901
  27. Gondwe, G. (2023). ChatGPT and the Global South: How are journalists in sub-Saharan Africa engaging with generative AI? Online Media and Global Communication, 2(2):228-249. doi: 10.1515/omgc-2023-0023.
  28. Guo, B., Zhang, X., Wang, Z., Jiang, M., Nie, J., Ding, Y., Yue, J. & Wu, Y. (2023). How close is ChatGPT to human experts? Comparison corpus, evaluation, and detection. arXiv. doi: 10.48550/arxiv.2301.07597
  29. Hanley, H.W.A., Kumar, D. & Durumeric, Z. (2023). A golden age: Conspiracy theories’ relationship with misinformation outlets, news media, and the wider internet. arXiv. doi: 10.48550/arxiv.2301.10880
  30. Hartmann, J., Schwenzow, J. & Witte, M. (2023). The political ideology of conversational AI: Converging evidence on ChatGPT’s pro-environmental, left-libertarian orientation. arXiv. doi: 10.48550/arxiv.2301.01768
  31. Hasnain, M. (2023). ChatGPT applications and challenges in controlling monkey pox in Pakistan. Annals of Biomedical Engineering: 1-3. doi: 10.1007/s10439-023-03231-z
  32. Hornschuh, V. (2007). A victimological investigation of farm attacks with specific reference to farmers’ perceptions of their susceptibility, the consequences of attacks for farmers and the coping strategies applied by them after victimisation. Master’s thesis. Pretoria: University of South Africa. Available from: https://repository.up.ac.za/bitstream/handle/2263/26745/dissertation.pdf?sequence=1
  33. Hughes, A. (2023). ChatGPT: Everything you need to know about OpenAI’s GPT-3 tool. BBC Science Focus, 30 June. Available from: https://www.sciencefocus.com/future-technology/gpt-3. Accessed 13 March 2023.
  34. Jansen van Vuuren, A-M. & Leenen, L. (2020). Proving it is the data that is biased, not the algorithm through a recent South African online case study. Journal of Information Warfare, 19(3):118-129.
  35. Kapp, C. (2005). Hamilton Naki. The Lancet, 366(9479):22. doi: 10.1016/S0140-6736(05)66811-0
  36. Kocoń, J., Cichecki, I., Kaszyca, O., Kochanek, M., Szydło, D., Baran, J., Bielaniewicz, J., Gruza, M., Janz, A., Kanclerz, K., Kocoń, A., Koptyra, B., Mieleszczenko-Kowszewicz, W., Miłkowski, P., Oleksy, M., Piasecki, M., Radliński, Ł., Wojtasik, K., Woźniak, S. & Kazienko, P. (2023). ChatGPT: Jack of all trades, master of none. arXiv. doi: 10.48550/arxiv.2302.10724
  37. Kung, T.H., Cheatham, M., Medinilla, A., ChatGPT, Sillos, C., De Leon, L., Elepano, C., Madriaga, M., Aggabao, R., Diaz-Candido, G., Maningo, J. & Tseng, V. (2022). Performance of ChatGPT on USMLE: Potential for AI-assisted medical education using large language models. medRxiv. doi: 10.1101/2022.12.19.22283643
  38. Lee, C., Kim, J. & Lim, J.S. (2023). How does fact-check labeling impact the evaluations of inadvertently placed brand ads? The Social Science Journal: 1-17. doi: 10.1080/03623319.2023.2216965
  39. Leitenberg, M. (2020). False allegations of biological-weapons use from Putin’s Russia. The Nonproliferation Review, 27(4-6):425-442. doi: 10.1080/10736700.2021.1964755
  40. Levallois, C. (2013). Umigon: Sentiment analysis for tweets based on terms lists and heuristics. In Proceedings of the 7th International Workshop on Semantic Evaluation (SemEval 2013), 9-14 June 2013, Georgia, USA.
  41. Lin, S., Hilton, J. & Evans, O. (2021). TruthfulQA: measuring how models mimic human falsehoods. arXiv. doi: 10.48550/arxiv.2109.07958.
  42. Mahl, D., Schäfer, M.S. & Zeng, J. (2022). Conspiracy theories in online environments: An interdisciplinary literature review and agenda for future research. New Media & Society: 146144482210757. doi: 10.1177/14614448221075759
  43. Mandela, Z. (2019). Twitter post. Twitter, 13 June. Available from https://twitter.com/zindzimandela/status/1139209835596210176. Accessed: 4 April 2023.
  44. Mare, A. (2014). New media technologies and internal newsroom creativity in Mozambique: The case of @Verdade. Digital Journalism, 2(1):12-28. doi: 10.1080/21670811.2013.850196
  45. Marx, C. (2020). Trennung und Angst: Hendrik Verwoerd und die Gedankenwelt der Apartheid. Berlin: De Gruyter Oldenbourg. doi: 10.1515/9783110680508
  46. McGee, R.W. (2023). Is ChatGPT biased against conservatives? An empirical study. Social Science Research Network (SSRN), 15 February. Available from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4359405
  47. Mhlanga, D. (2023). Open AI in education, the responsible and ethical use of ChatGPT towards lifelong learning, Social Science Research Network (SSRN), 11 February. Available from https://ssrn.com/abstract=4354422
  48. Mihailidis, P. & Viotty, S. (2017). Spreadable spectacle in digital culture: Civic expression, fake news, and the role of media literacies in “post-fact” society. American Behavioral Scientist, 61(4):441-454. doi: 10.1177/0002764217701
  49. Mistry, D. & Dhlamini, J. (2001). Perpetrators of farm attacks: An offender profile. Institute for Human Rights and Criminal Justice Studies, Technikon SA.
  50. Myeka, Z. (2022). Did the 'real' Nelson Mandela really die in 1985? Embracing the global media and information literacy week. Nelson Mandela Foundation, 2 November. Available from https://www.nelsonmandela.org/news/entry/did-the-real-nelson-mandela-really-die-in-1985 Accessed 4 April 2023.
  51. Nates, T. (2010). ‘But, apartheid was also genocide … What about our suffering?’ Teaching the Holocaust in South Africa – opportunities and challenges. Intercultural Education, 21(sup1):S17–S26. doi: 10.1080/14675981003732183
  52. Nattrass, N. (2013). Understanding the origins and prevalence of AIDS conspiracy beliefs in the United States and South Africa. Sociology of Health & Illness, 35(1):113–129. doi: 10.1111/j.1467-9566.2012.01480.x
  53. Nattrass, N. (2023). Promoting conspiracy theory: From AIDS to COVID-19. Global Public Health, 18(1):2172199. doi: 10.1080/17441692.2023.2172199
  54. Ouyang, L., Wu, J., Jiang, X., Almeida, D., Wainwright, C.L., Mishkin, P., Zhang, C., Agarwal, S., Slama, K., Ray, A., Schulman, J., Hilton, J., Kelton, F., Miller, L., Simens, M., Askell, A., Welinder, P., Christiano, P., Leike, J. & Lowe, R. (2022). Training language models to follow instructions with human feedback. arXiv. doi: 10.48550/arxiv.2203.02155
  55. Qadir, J. (2023). Engineering education in the era of ChatGPT: Promise and pitfalls of generative AI for education. Proceedings of the 2023 IEEE Global Engineering Education Conference (EDUCON):1-9, held 1-4 May at the American University of Kuwait, Salmiya, Kuwait.
  56. Quinn, H. (2023). Are ChatGPT and Bard ready for search engine integration? Technical.ly, 6 March. Available from https://technical.ly/software-development/are-chatgpt-bard-ready-for-search-engine-integration-bing/ Accessed 8 March 2023.
  57. Radford, A., Narasimhan, K., Salimans, T. & Sutskever, I. (2018). Improving language understanding by generative pre-training. Available from https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf Accessed 13 March 2023.
  58. Ribeiro, F.N., Araújo, M., Gonçalves, P., André Gonçalves, M. & Benevenuto, F. (2016). SentiBench – a benchmark comparison of state-of-the-practice sentiment analysis methods. EPJ Data Science, 5(1):23. doi: 10.1140/epjds/s13688-016-0085-1
  59. Richmond, C. (2005). Hamilton Naki. British Medical Journal (BMJ), 331(7515):519.7. doi: 10.1136/bmj.331.7515.519-f
  60. Rozado, D. (2023). The political biases of ChatGPT. Social Sciences, 12(3):148. doi: 10.3390/socsci12030148
  61. Rutinowski, J., Franke, S., Endendyk, J., Dormuth, I. & Pauly, M. (2023). The self-perception and political biases of ChatGPT. arXiv preprint arXiv:2304.07333
  62. Sallam, M., Salim, N.A., Al-Tammemi, A.B., Barakat, M., Fayyad, D., Hallit, S., Harapan, H., Hallit, R. & Mahafzah, A. (2023). ChatGPT output regarding compulsory vaccination and COVID-19 vaccine conspiracy: A descriptive study at the outset of a paradigm shift in online search for information. Cureus, 15(2):e35029. doi: 10.7759/cureus.35029
  63. Senekal, B.A. (2020). The blue-eyed devil rapists: An exploration of the discourse on Twitter around land thieves in a South African context. Ensovoort, 41(7).
  64. Shoki, W. (2020). Political struggle is the answer – not conspiracy theories. Jacobin, 20 January. Available from https://jacobin.com/2020/01/conspiracy-theories-nelson-mandela-anc-south-africa?fbclid=IwAR1uP8sWBLVvk1n28wlhTzD1JuYDHvKwrirtimalNSEU57Djlsx7vZw8Tb8 Accessed 4 April 2023.
  65. Sohail, S.S., Madsen, D.Ø., Farhat, F. & Alam, M.A. (2023). ChatGPT and vaccines: Can AI chatbots boost awareness and uptake? Annals of Biomedical Engineering, 1-5. doi: 10.1007/s10439-023-03305-y
  66. Suguri Motoki, F.Y., Pinho Neto, V. & Rodrigues, V. (2023). More human than human: Measuring ChatGPT political bias. Social Science Research Network (SSRN), 18 July. Available from https://ssrn.com/abstract=4372349
  67. Thompson, J. & Davis, S. (2021). What drives support for QAnon? Evidence from a survey experiment. Open Science Framework (OSF) preprint. doi: 10.31219/osf.io/23qaj
  68. Van Onselen, G. (2018). The great age of deceit. The 2018 FNF/IRR Liberty Lecture.
  69. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, L. & Polosukhin, I. (2017). Attention is all you need. arXiv. doi: 10.48550/arxiv.1706.03762
  70. Wang, W. (2019). Calculating political bias and fighting partisanship with AI. The Bipartisan Press. Available from https://www.thebipartisanpress.com/politics/calculating-political-bias-and-fighting-partisanship-with-ai/ Accessed 16 March 2023.
  71. Wolf, L. (2012). David Beresford Pratt: Die mens agter die sluipmoordpoging. LitNet Akademies: ‘n Joernaal vir die Geesteswetenskappe, Natuurwetenskappe, Regte en Godsdienswetenskappe, 9(3):743–804.
  72. Zadrozny, B. & Collins, B. (2018). How three conspiracy theorists took Q and sparked Qanon. NBC News, 14 August. Available from https://www.nbcnews.com/tech/tech-news/how-three-conspiracy-theorists-took-q-sparked-qanon-n900531 Accessed 4 April 2023.
  73. Zhou, C., Li, Q., Li, C., Yu, J., Liu, Y., Wang, G., Zhang, K., Ji, C., Yan, Q., He, L., Peng, H., Li, J., Wu, J., Liu, Z., Xie, P., Xiong, C., Pei, J., Yu, P.S. & Sun, L. (2023). A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT. arXiv. doi: 10.48550/arxiv.2302.09419