The most widely used instruments for data gathering are the survey (29.4%) and content analysis (21.7%). Most surveys are conducted through online panels, drawing on companies that specialize in market research to ensure high participation. The platform researchers use most is Amazon Mechanical Turk, or MTurk (Furman and Tunç 2019; Edgerly and Vraga 2020), but others also appear on the list, such as Qualtrics (Garrett and Poulsen 2019), GfK, YouGov, and Nielsen IBOPE. Only Weeks and Garrett (2014) relied on a telephone survey, which they used to analyze rumors during the 2008 presidential campaign in the United States.
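Recruitment on MTurk is typically automated through the platform's requester API. Purely as an illustration, here is a minimal Python sketch, using boto3, of how a survey task might be posted to the MTurk sandbox; the survey URL, reward, and respondent counts are hypothetical placeholders, not details from the studies cited above.

```python
# Minimal sketch: posting a survey task ("HIT") to the MTurk sandbox with boto3.
# The survey URL, reward, and counts below are hypothetical placeholders.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    # Sandbox endpoint; omit endpoint_url to post to the live marketplace.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# ExternalQuestion XML pointing workers at an externally hosted questionnaire.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>"""

hit = mturk.create_hit(
    Title="Short survey on news consumption",
    Description="Answer a 5-minute questionnaire about how you get news.",
    Reward="0.50",                      # USD per assignment
    MaxAssignments=100,                 # number of respondents sought
    LifetimeInSeconds=7 * 24 * 3600,    # how long the task stays listed
    AssignmentDurationInSeconds=1800,   # time allowed per respondent
    Question=question_xml,
)
print("HIT id:", hit["HIT"]["HITId"])
```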
Technological advances have facilitated tools for gathering data on a massive scale, which speeds up content analysis. Several authors use Media Cloud, an open-source platform for media analysis. Monitoring news coverage is complemented by identifying keywords, creating word clouds, counting words, and even geocoding all the stories and displaying the results on a map. To retrieve historical archives published on the web, researchers turn to Archive.org, although some studies use the database of the GDELT Project, which extracts content from Google News (Guo and Vargo 2020).
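The keyword identification and word counts behind these word clouds reduce to a term-frequency tally. A minimal sketch, assuming the story texts have already been collected (the sample strings and stopword list below are invented for illustration):

```python
# Minimal sketch of the keyword-counting step behind word clouds and word counts.
# `stories` stands in for text already retrieved from a platform such as
# Media Cloud; the sample strings and stopword list are illustrative only.
import re
from collections import Counter

stories = [
    "False claims about the election spread rapidly on social media.",
    "Fact-checkers traced the rumor to a network of coordinated accounts.",
]

STOPWORDS = {"the", "a", "of", "on", "to", "about"}

def keyword_counts(texts):
    """Lowercase, tokenize, drop stopwords, and tally term frequencies."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z']+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

# The most frequent terms are the ones a word cloud would scale up visually.
print(keyword_counts(stories).most_common(5))
```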
Resources used for social media monitoring include Brandwatch and Netlytic, a cloud-based analyzer that uses public APIs to collect posts from Twitter and YouTube. Botometer makes it possible to detect messages that may have been produced by bots. For analyzing social networks, researchers also use UCINET, Gephi, and NodeXL.
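As an illustration, a minimal sketch of scoring an account with Botometer through its Python client, following the usage pattern documented for the botometer package; all credentials and the screen name are placeholders, and access to the underlying Twitter and RapidAPI services has changed since these studies were conducted.

```python
# Minimal sketch of scoring a Twitter account with the botometer package.
# All keys and the screen name below are placeholders.
import botometer

rapidapi_key = "YOUR_RAPIDAPI_KEY"
twitter_app_auth = {
    "consumer_key": "YOUR_CONSUMER_KEY",
    "consumer_secret": "YOUR_CONSUMER_SECRET",
}

bom = botometer.Botometer(
    wait_on_ratelimit=True,
    rapidapi_key=rapidapi_key,
    **twitter_app_auth,
)

# Returns per-category bot scores for the account; higher means more bot-like.
result = bom.check_account("@example_account")
print(result["display_scores"])
```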
As for software that enables data to be stored, transcribed, and coded, the sample mentions well-known packages such as ATLAS.ti, NVivo, and MAXQDA. R is also employed for computational text analysis. In addition, the majority of the statistical analyses use SPSS.
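A typical computational text analysis step of this kind builds a document-term matrix and weights terms by TF-IDF. The studies in the sample report doing this in R; the sketch below uses Python with scikit-learn as an analogous illustration, not the authors' actual toolchain, and the documents are invented.

```python
# Illustrative computational text analysis analogous to what the surveyed
# studies do in R: build a document-term matrix and weight terms by TF-IDF.
# scikit-learn is an assumption here, not the toolchain used in the sample.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "Disinformation spreads faster than corrections.",
    "Corrections reduce belief in false claims only partially.",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(docs)   # rows: documents, columns: terms

# The highest-weighted terms per document hint at its distinctive vocabulary.
terms = vectorizer.get_feature_names_out()
for i, row in enumerate(matrix.toarray()):
    top = sorted(zip(terms, row), key=lambda pair: -pair[1])[:3]
    print(f"doc {i}:", [t for t, w in top])
```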
New Opportunities for Research
The two basic approaches found are descriptive studies that characterize the problem and experimental studies that test possible solutions. In contrast to the profusion of pragmatic studies that identify and quantify the effects and consequences of disinformation and postulate models, there is a scarcity of work analyzing the legislative frameworks and communication policies that have emerged, such as the initiatives promoted by the European Union (Lopez-García et al. 2019; Helberger 2020; Iosifidis and Andrews 2020). Other, less common approaches relate socio-demographic variables to levels of media literacy and their influence on the consumption of disinformation (Vraga and Tully 2019), or examine how different population groups perceive the work of verification platforms (Robertson et al. 2020). Approaches from visual communication, prospective studies, and work on newsroom self-criticism and internal media debates about the harm caused by disseminating false content are nonexistent, despite the fact that disinformation has been the subject of editorial pieces (Tandoc et al. 2019). These academic challenges could shape the future of research on disinformation, a problem expected to remain long-lasting and worrying in the face of advances in artificial intelligence and the emergence of deepfakes.
Acknowledgements
This