The Times has just completed a detailed investigation into how advertising and pay-per-click revenue fund extremist and other content on the Internet. In the last few days, it has also highlighted how child pornography and other material, including the promotion of extremist content, has inadvertently been amplified by the algorithms that social media companies have set up.
Around eight weeks ago, we complained to a social media company about how its algorithms were promoting far-right and Islamist material to members of Muslim communities in the United Kingdom. In particular, Islamist material was being pushed to individuals who had shown an interest in Islam, Muslims and related topics. The algorithms were inadvertently strengthening groups that promote a worldview of political Islam at the expense of the rights of other communities, with narratives that include a ‘war being waged on Islam’ and that cast Muslims as the ‘victims’ of the West. Some of these groups have also equivocated on what constitutes terrorism and on whether terrorist attacks have taken place in our country.
We would urge social media companies to ensure that their algorithms do not support groups that oppose the universal values of human rights, pluralism, free speech, democracy and the right to question and dissent within communities. These are fundamental values, and the last thing we want is for society to be shaped by a set of algorithms that know nothing about the context and nature of such groups.
We will continue to make the case on these issues and to engage constructively with social media companies on them.