Computational journalism is a form of journalistic activity that applies a variety of computational methods to the gathering, organization, understanding, communication, and dissemination of news. [1] At the same time, the main criteria of journalistic work are preserved: accuracy and verifiability. This type of journalism draws on technical areas of computer science: artificial intelligence, content analysis (natural language processing, computer vision, speech processing), visualization, personalization and recommender systems, as well as various aspects of social computing.
The main emphasis is on developing new tools that 1) make sense of the growing variety of news sources, including mobile computing, data collection, and increasingly cheap networked sensors; 2) create heterogeneous content (mashups) and apply software to journalistic tasks; 3) apply computational approaches to verifying the quality of information; and 4) use data mining for personalization and aggregation.
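To make point 4 concrete, here is a minimal sketch of topic-based news aggregation, assuming scikit-learn is available; the headlines and cluster count are invented for illustration:

```python
# A minimal sketch of news aggregation: cluster similar headlines by topic.
# Assumes scikit-learn; the headlines and cluster count are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

headlines = [
    "City council approves new transit budget",
    "Transit fares to rise after budget vote",
    "Local team wins championship final",
    "Championship victory parade planned downtown",
]

# Represent each headline as a TF-IDF vector, then group similar ones.
vectors = TfidfVectorizer(stop_words="english").fit_transform(headlines)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, headline in sorted(zip(labels, headlines)):
    print(label, headline)
```

The same pattern scales from grouping duplicate wire stories to building topic pages that aggregate coverage for readers.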
History of the term
The term "Computational Journalism" was first used in 2006 (according to some sources, in 2007) at the Georgia Institute of Technology. A lecture course on this area of ββjournalism was delivered by renowned professor Irfan Aziz Essa. In 2008, the Institute organized a scientific conference on computational journalism [1] , where several hundred studies by Atlanta journalists were published. In July 2009, the Center for Advanced Studies in Behavioral Sciences (CASBS) at Stanford University established a workshop in this area with the aim of studying and further disseminating it. Scientific conferences on computational journalism were also held in 2013, 2014 and 2015. [2]
In the simplest case, computational journalism means applying computing to journalism. This implies not just the use of computer technology in journalism, which has of course long relied on information and communication technologies (ICT) in the modern era [2], but active engagement with methods for large-scale data processing using software, in order to find new ways of accessing, organizing and presenting information. Hamilton and Turner (2009) define computational journalism as the combination of algorithms, data, and knowledge from the social sciences to supplement the accountability function of journalism. In a sense, computational journalism builds on two well-established approaches: computer-assisted reporting (CAR) and the use of social science tools in journalism. Like these models, computational journalism seeks to let journalists explore ever-increasing volumes of structured and unstructured information as they search for stories.
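In the spirit of computer-assisted reporting, here is a minimal sketch of how a reporter might screen a public dataset for story leads; the file name and column names are hypothetical, and the three-sigma rule is just one possible screening heuristic:

```python
# A minimal CAR-style sketch: flag unusually large values in a public
# dataset as potential story leads. "spending.csv" and its "agency" and
# "amount" columns are hypothetical.
import pandas as pd

records = pd.read_csv("spending.csv")  # e.g. one row per expense claim

# Flag claims more than three standard deviations above the mean.
mean, std = records["amount"].mean(), records["amount"].std()
leads = records[records["amount"] > mean + 3 * std]

print(leads.sort_values("amount", ascending=False)[["agency", "amount"]])
```

A flagged row is not a story in itself; it is a lead that still requires conventional reporting to verify and explain.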
Factors Driving Computational Journalism
While the concept of computational journalism is not new, its potential and importance are growing. [3] Three main driving factors can be identified. First, the amount of publicly available data has expanded significantly, in particular information from government sources, whether released through official channels or through unofficial ones such as WikiLeaks. Initiatives such as the US Open Government Initiative (www.data.gov), Britain's data.gov.uk site, and Australia's Government 2.0 proposals point to a growing trend of making information held by public authorities more readily available. Second, the combination of publicly accessible Web 2.0 data with the lower cost, greater ease of use and increased power of data mining software encourages experimentation with public data. Third, there is a boom in countless forms of online participation and engagement across an abundance of Web 2.0 and social media sites.
These are clearly not three separate developments. Rather, they are interconnected elements of a broad shift in the media ecology, from top-down mass communication to more participatory and interactive social media, which also affects politics and citizen engagement. Computational journalism can help newspapers in particular to adapt successfully to this changing environment by creating new ways to ensure the quality, accuracy and originality of news in a world where news circulates at ever-greater speed, costs must be cut amid shrinking profits, and outlets seek to attract online audiences as participants in the news process rather than mere readers and consumers. [4]
The Benefits of Computational Journalism
Computational journalism can increase user interest and enable deeper interaction with the media through new forms of communicating and disseminating information, including online communities and social networks, making news accessible and interactive for readers.
Computational journalism also offers greater opportunities for cooperation in the creative process between professional journalists, citizen journalists, and their readers. Examples include crowdsourcing and collaborative news reporting across platforms. Crowdsourcing, in which groups of people work together over the Internet on a single news story or part of one, means that many people can each spend a few minutes on low-level research that would take a single person days to complete (a minimal sketch of this task-splitting follows below). The Guardian's investigation into Parliament's expenses (http://www.guardian.co.uk/politics/mps-expenses) is an example of crowdsourcing used in an investigative research project.
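As promised above, a minimal sketch of the task-splitting behind such crowdsourcing: a large document set is divided into small batches that each volunteer can review in minutes; the document names and batch size are invented for illustration:

```python
# A minimal sketch of crowdsourced review: split a large document set into
# small batches so each volunteer spends minutes rather than days.
# The document names and batch size are illustrative.

def make_batches(doc_ids, batch_size=5):
    """Yield consecutive batches of document IDs, one batch per volunteer."""
    for start in range(0, len(doc_ids), batch_size):
        yield doc_ids[start:start + batch_size]

expense_docs = [f"claim-{n:04d}.pdf" for n in range(1, 24)]

for volunteer, batch in enumerate(make_batches(expense_docs), start=1):
    print(f"Volunteer {volunteer}: review {batch}")
```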
Another striking example of crowdsourcing and collaborative reporting is WikiLeaks. Founded in 2006, the WikiLeaks website (www.wikileaks.org) is an international organization that publishes anonymously submitted documents and leaked information, making them known to the public. Materials posted by WikiLeaks range from Guantanamo procedure documents, the contents of Sarah Palin's email account, lists of banned or illegal web addresses for several countries (including sites to be blocked under the Australian government's proposed Internet censorship laws), and email correspondence between climatologists leaked from the Climatic Research Unit of the University of East Anglia, to a video of an incident in which Iraqi civilians were killed by American troops and, perhaps most controversially, more than 75 thousand previously unpublished documents on the war in Afghanistan (Associated Press 2010) and more than 300 thousand documents related to the Iraq war. The later leaks led the White House National Security Council to state that they were "irresponsible" and that "the United States strongly condemns the disclosure of classified information by individuals and organizations that could put the lives of Americans and our partners at risk and threaten our national security" (USCC 2010).
Notes
- ↑ Computational journalism: analysis of information flows (in Russian). UNIC Institute (March 31, 2011). Retrieved February 7, 2019.
- ↑ Will blockchain save journalism (in Russian). TASS. Retrieved February 7, 2019.
- ↑ The next big thing in journalism might be algorithm reporters. Poynter (March 15, 2018). Retrieved February 7, 2019.
- ↑ Alexander Amzin. 14 journalistic trends: investigations, algorithms and voice interfaces (in Russian). We and Jo (April 27, 2018). Retrieved February 7, 2019.
Links
- https://compjournalism.wordpress.com/
- https://compjournalism.wordpress.com/blog/
- http://jonathanstray.com/a-computational-journalism-reading-list
- http://www.nickdiakopoulos.com/?s=computational
- https://github.com/comp-journalism/UMD-J479V-J779V-Spring2016
- http://www.compjournalism.com/?p=147
- http://cacm.acm.org/magazines/2011/10/131400-computational-journalism/fulltext