The Technical Challenge of Hate Speech, Incitement and Extremism in Social Media

When: 
Thursday, August 18, 2016 - 7:00pm
Room: 
E51-325
Lecturer(s): 
Andre Oboler

The primary challenge is working out how to identify incitement and hate speech given: (a) the volume of content creation in social media; (b) the use of videos, images, coded language, local references, etc.; (c) the changing nature of the expression over time; and (d) limitations that prevent governments from demanding access to non-public data.

Further, without knowing what the public is reporting to the social media platforms, how can a government judge whether social media platforms are responding adequately? This has come up in cases like the murder of Lee Rigby (the Telegraph reports: "Facebook 'could have prevented Lee Rigby murder'"; Sky News: "Facebook Failed To Flag Up Rigby Killer's Message"). It has also been a hot topic in the US Congress, e.g. ABC News reports, "Officials: Facebook, Twitter Not Reporting ISIS Messages". The latest is from Israel, where Internal Security Minister Gilad Erdan said Facebook has blood on its hands for not preventing recent killings. He is quoted by Al-Monitor as saying, "[The Facebook posts] should have been monitored in time, and [the murder] should have been averted. Facebook has all the tools to do this. It is the only entity that, at this stage, can monitor such a tremendous quantity of materials. It does it all the time for marketing purposes. The time has come for Facebook to do the same thing to save lives."

The approach my organisation uses relies on crowdsourcing, artificial intelligence and cloud computing. It enables content to be evaluated by people, and then quality-controls the crowd's responses through AI. It allows empirical results to be gathered, such as those reflected in this report we produced for the Israeli Government on antisemitism in social media: http://mfa.gov.il/MFA/ForeignPolicy/AntiSemitism/Pages/Measuring-the-Hat...
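
As a rough illustration of how crowd reports might be combined with an AI quality-control step, the minimal Python sketch below aggregates multiple human judgements per reported item and uses a model score to decide whether each item is confirmed, dismissed, or escalated for expert review. The names (Report, classifier_score, the thresholds) are illustrative assumptions for this sketch, not a description of the actual system.

# Hypothetical sketch of crowd-report aggregation with an AI quality-control check.
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class Report:
    item_id: str        # identifier of the reported post, video or image
    reporter_id: str    # member of the crowd who filed the report
    is_hate: bool       # that reporter's judgement

def classifier_score(item_id: str) -> float:
    """Placeholder for an AI model's estimated probability that the item is hate speech."""
    return 0.5  # stub value; a real system would call a trained model here

def aggregate(reports: list[Report], crowd_threshold: float = 0.75) -> dict[str, str]:
    """Combine crowd votes per item, then use the model score as a second opinion."""
    votes: dict[str, list[bool]] = defaultdict(list)
    for r in reports:
        votes[r.item_id].append(r.is_hate)

    decisions: dict[str, str] = {}
    for item_id, labels in votes.items():
        crowd_rate = mean(labels)               # fraction of reporters saying "hate"
        model_rate = classifier_score(item_id)  # AI quality-control signal
        if crowd_rate >= crowd_threshold and model_rate >= 0.5:
            decisions[item_id] = "confirmed"      # crowd and model agree
        elif crowd_rate >= crowd_threshold or model_rate >= 0.5:
            decisions[item_id] = "expert review"  # disagreement: escalate to a human expert
        else:
            decisions[item_id] = "dismissed"
    return decisions

if __name__ == "__main__":
    sample = [
        Report("post-1", "u1", True),
        Report("post-1", "u2", True),
        Report("post-1", "u3", False),
    ]
    print(aggregate(sample))

In this sample the crowd falls short of the agreement threshold while the stubbed model score sits at the decision boundary, so the item is escalated for expert review rather than being confirmed or dismissed automatically.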

Dr Andre Oboler is CEO of the Online Hate Prevention Institute, an Australian charity combating racism, bigotry and extremism in social media. He also serves as an expert on the Australian Government's Delegation to the International Holocaust Remembrance Alliance, as co-chair of the Working Group on Antisemitism on the Internet and in the Media for the Global Forum to Combat Antisemitism, and as a Vice Chair of the IEEE Computer Society's Member and Geographic Activities Board. Dr Oboler holds a PhD in Computer Science from Lancaster University (UK) and a Juris Doctor from Monash University (Australia), and completed a Postdoctoral Fellowship in Political Science at Bar-Ilan University (Israel). His research interests include empirical software engineering, process improvement, hate speech in social media and the social implications of technology. Web: Online Hate Prevention Institute www.ohpi.org.au; personal website www.oboler.com.