
Statements From The Department Of Home Affairs And Facebook

Regarding tackling extremism

FULL RESPONSE FROM DEPARTMENT OF HOME AFFAIRS

A spokesperson for the Department of Home Affairs provided the following statement:

· Since 2013-14, the Government has allocated over $53 million to countering violent extremism (CVE) programs, including more than $13 million for intervention programs.

· This includes:

o $13.4 million to support the national roll-out of intervention programs, to divert and disengage individuals from violence.

o $21.7 million to combat terrorist propaganda, including removal of terrorist content online and supporting positive messaging against violent extremism.

o $4 million to support the NSW Government to roll out a helpline for families and frontline workers seeking help for young people at risk of radicalisation.

o $1 million to support the Office of the Children’s e-Safety Commissioner to expand digital resilience programs for young people and families.

o $2 million annually to CVE initiatives through the Australia-New Zealand Counter Terrorism Committee.

o $0.9 million to support initiatives to engage a wide range of influential young people to understand and counter online hate.

· There is a consistent future funding stream for CVE within the Home Affairs portfolio of up to $10 million a year.

· The Government is also funding a $71 million package of social cohesion initiatives to create a stronger, more cohesive Australia that will:

o encourage and support new arrivals to become part of Australia’s economic and social development

o build interfaith and intercultural understanding

o encourage diversity in the public debate and promote resilience against harmful and divisive messages.

· Living Safe Together is an ongoing Australian Government initiative to protect our communities against all forms of violent extremism. Further information can be found on the Living Safe Together website.

· The Government’s countering violent extremism (CVE) strategy addresses the drivers of all forms of violent extremism, including far right extremism.

· There are three key elements of the Government’s CVE strategy:

o to build the resilience of communities to counter violent extremism and recover from extremist events

o to support the diversion of individuals at risk of being drawn to violent extremism

o to rehabilitate and reintegrate violent extremists when possible

· Intervention programs are a key activity in the CVE space. Interventions can include support from a range of community organisations and providers, such as local religious leaders, psychologists and employment services, as well as social and cultural activities.

· These programs run across the country to refer, assess and support people at risk of violent extremism. State and Territory partners manage these programs and make assessments about the kind of intervention needed for each individual.

· We have previously provided funding to community groups via a specific, one-off funding program designed to help organisations prepare to deliver services to address radicalisation. The Government’s CVE program now leverages this investment and provides funding through States and Territories to build on our strong partnership with the community sector.

· Our intervention programs take an ideology-neutral approach and cater for all drivers of radicalisation to violence. The nature and mix of services provided is unique to each individual and depends on their assessed risk and needs.

· Following the March 2019 terrorist attacks in Christchurch, the Australian Government acted quickly to legislate tough new laws to prevent the misuse of online platforms and established the Taskforce to Combat Violent Terrorist and Extreme Material Online.

· Australia also led the G20 leaders’ statement that secured a global commitment to ensure that online platforms do not allow their services to be used to facilitate terrorism and violent extremism conducive to terrorism.

· The Australian Government also collaborates with digital industry to build resilience in the Australian community to online extremism through forums including DIGI Engage 2019.

FULL RESPONSE FROM FACEBOOK AUSTRALIA/NZ

A Facebook company spokesperson provided the following statement:

Terrorist material, hate groups, and hate speech are not allowed on Facebook. We proactively find and remove more than 98.5% of terrorist propaganda on our services before anyone reports it to us, and we have built a counterterrorism team of more than 350 experts in law enforcement, counterterrorism intelligence and radicalisation studies to stay ahead of violent extremism.

We recognise that combating hate requires ongoing commitment, which is why we met with world leaders in Paris to sign the New Zealand Government’s Christchurch Call to Action and co-developed a nine-point industry plan outlining the concrete steps we’re taking to address the abuse of technology to spread terrorist content.

As part of our commitment, we’ve placed restrictions on who can use Facebook Live, updated how we define terrorism and extremism on our services, and strengthened our policies to ban white nationalism and white separatism. We’ve banned more than 200 white supremacist organisations from our platform based on these updates and use a combination of AI and human expertise to remove content praising or supporting these organisations.

Last year, we also launched a de-radicalisation program in Australia, in partnership with EXIT Australia, where we connect people who search for terms associated with white supremacy on Facebook to resources from EXIT focused on helping people leave behind hate groups.

All of these actions are helping keep people safe – but there’s no single solution to prevent terrorism and extremism in society. We are committed to continuing this work and being transparent about our efforts.

Additional information on EXIT Australia partnership

EXIT Australia is a self-funded not-for-profit founded in 2015 that has since been helping communities and other not-for-profit organisations with prevention methodologies.

This is deeply important and specialised work, and we take these partnerships very seriously as we know they have the potential to protect people in the community. The program started with EXIT USA and has extended to EXIT Australia. Global EXIT programs share the goal of assisting people who have recognised an escalation in their own violence and are now seeking a safe exit. EXIT Australia also works with other organisations to support those wanting to leave high-demand groups or groups that use coercive control against their members. EXIT Australia’s team includes people with lived experience (‘formers’), as well as psychologists, social workers, academics and connected not-for-profits. Specifically, the EXIT team works with each person to assess the safest options and find the best support, whether that’s via intervention, mental health support, counselling for family and friends through trusted partners, readjusting within the community, or legal assistance.

Additional materials from Facebook

Combating Hate and Extremism

Can Facebook do more to stop the sharing of violent extremist content online?

An update on content removals following the Christchurch attack

“From March 15 to September 30 (the end of the third quarter of this year), we’ve removed about 4.5 million pieces of content related to the Christchurch terrorist attack, over 97% of which was identified proactively before anyone reported it. Most of this content was identified by our media-matching systems when they were uploaded and we continue to remove known copies of this content from the platform.”

Additional materials from GIFCT (Facebook is the current Chair of GIFCT)

  • In June 2017, Facebook, YouTube, Microsoft and Twitter came together to form the Global Internet Forum to Counter Terrorism (GIFCT), appointing Facebook as GIFCT Chair. This year, Dropbox and Pinterest also joined as members of GIFCT.
  • The objective of the GIFCT has always been to substantially disrupt terrorists’ ability to promote terrorism, disseminate violent extremist propaganda, and exploit or glorify real-world acts of violence on our services. GIFCT does this by joining forces with counterterrorism experts in government, civil society and the wider industry around the world.
  • GIFCT’s work is focused on four interrelated strategies:
    • Joint Tech Innovation
    • Knowledge-Sharing
    • Conducting and Funding Research
    • Content Incident Protocol
  • Results achieved so far through advances in machine learning and industry collaboration include:
    • More than 200,000 unique hashes now in GIFCT’s joint database: When terrorists misuse the internet, they often upload the same piece of content to multiple platforms to maximise their reach. To disrupt this behaviour, GIFCT jointly developed a shared industry database of more than 200,000 “hashes”, or digital fingerprints, of known terrorist images and video propaganda, which allows us to safely share these fingerprints with partner companies. This enables companies to more quickly identify and take action against potential terrorist content on our respective platforms (a simplified sketch of this kind of hash lookup appears after this list).
    • Knowledge sharing with technology, government and non-government organisations and academic experts: In partnership with Tech Against Terrorism, GIFCT held 11 workshops in nine countries on four continents. GIFCT has met with 120 different tech and innovation platforms, and has provided funding to secure Jihadology.net to make sure that researchers studying terrorism can still access primary research material while ensuring that terrorists and people vulnerable to recruitment cannot.
    • Expanded industry sharing of links to extremist content: As GIFCT members take steps to deliver on the four collaborative actions set forth in the Christchurch Call to Action, GIFCT is also expanding the shared industry database to extend beyond photos and videos to include URLs that lead to known terrorist and violent extremist content online.
    • First GIFCT Transparency Report: GIFCT released its first GIFCT Transparency Report in July, which goes into detail about the Forum’s primary work streams, providing greater insight into how the Hash Sharing Consortium has defined terrorist content, and the volume and types of content included in the database.
    • A toolkit to Counter Violent Extremism (CVE): GIFCT launched a cross-platform counter-violent extremist toolkit that GIFCT jointly developed with the Institute for Strategic Dialogue to assist civil society organisations in developing online campaigns to challenge extremist ideologies, while prioritising their safety. This CVE tool has been included in resources shared with the Islamic Council of Women New Zealand’s community to assist with empowering the community to stand up to online hate.
    • Enabling and empowering companies to respond to crises like Christchurch: Building on the commitments the industry made as part of the Christchurch Call to Action, GIFCT has introduced joint content incident protocols for responding to emerging or active events, like the horrific terrorist attack in Christchurch, so that relevant information can be quickly and efficiently shared, processed and acted upon by all member companies. The Content Incident Protocol uses a triage system aimed at minimising the online spread of terrorist or violent extremist content resulting from a real-world attack on defenceless civilians or innocents. The GIFCT commits to working urgently and collaboratively across industry, governments and NGOs on protocols for responding to emerging or active events.
    • Conducting and funding research: GIFCT supports the Global Research Network on Terrorism and Technology (GRNTT), aimed at developing research and providing policy recommendations around the prevention of terrorist exploitation of technology. The research conducted by this network seeks to better understand radicalisation, recruitment and the myriad ways terrorist entities use the digital space around the world.
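
To illustrate the hash-sharing mechanism described in the first item of the list above, the sketch below shows the general idea in Python: a platform computes a digital fingerprint of an uploaded file and checks it against a shared list of fingerprints of known terrorist content. This is a deliberately simplified illustration, not any company’s actual system: it uses an exact SHA-256 hash, whereas production systems rely on perceptual hashing so that re-encoded or lightly edited copies still match, and every name in it (KNOWN_TERRORIST_CONTENT_HASHES, fingerprint, check_upload) is hypothetical.

import hashlib

# Hypothetical, simplified stand-in for the shared industry hash database.
# In a real deployment this would be populated with fingerprints contributed
# by member companies; here it starts empty.
KNOWN_TERRORIST_CONTENT_HASHES = set()

def fingerprint(file_bytes):
    """Compute a digital fingerprint of an uploaded file (exact hash only)."""
    return hashlib.sha256(file_bytes).hexdigest()

def check_upload(file_bytes):
    """Return True if the upload matches known terrorist content and should
    be blocked and queued for human review."""
    return fingerprint(file_bytes) in KNOWN_TERRORIST_CONTENT_HASHES

if __name__ == "__main__":
    # Example: a platform checks an incoming upload against the shared database.
    upload = b"bytes of an uploaded image or video"
    if check_upload(upload):
        print("Match found in shared hash database: block and review.")
    else:
        print("No match: content proceeds through normal moderation.")

In practice, a match would trigger removal plus human review, and fingerprints contributed by one member company become available to the others through the shared database, which is what lets a platform act quickly on content it has never previously seen.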