With a federal election looming, concerns have been raised that the foreign influence exerted on democratic processes during the 2016 US presidential election and Brexit campaigns could occur in Australia.
Experts Which-50 spoke with suggested attempts by foreign governments to sow discord are likely here, as political and civil debate moves increasingly online. But another threat has emerged – the digital “race to the bottom” is also being driven by domestic actors, including political parties, which are problematically exempt from privacy laws.
In both scenarios Australian citizens appear ill-equipped to identify misinformation or deception on digital platforms on their own, not helped by the opaque operations of the platforms and the limited remit of authorities.
Research from the University of Canberra suggests Australians are not adept at spotting misinformation online.
The Digital News Report: Australia 2018 revealed 68 per cent of Australian news consumers have low or very low news literacy and the problem is worse for individuals who rely heavily on social media.
Poor news literacy is strongly correlated with an inability to detect misinformation on digital platforms, according to the study’s authors. The research also revealed how poorly understood Facebook’s algorithm model of ranking content is – a point of concern for the authors.
“Given the enormous public debate about Facebook and the role algorithms play in the selection of news, it is surprising that only 27 per cent were aware of this. That means 73 per cent of those Australians surveyed do not understand how story selection works on Facebook,” the authors write.
“Given that 52 per cent said they had accessed news via social media in the last week, the low level of awareness about how it functions is concerning.”
While there is little evidence of past interference here – at least from the same Russian forces who have been active in Europe and the US – this year’s election will likely bring an increase in attempted foreign influence through digital channels, says Dr Michael Jensen, a senior research fellow at the University of Canberra’s Institute for Governance and Policy Analysis.
Foreign actors including both Russia and China may take the opportunity to create disharmony by amplifying social fissures in Australia, rather than seeking to influence the outcome directly, Jensen told Which-50. That is a strategy that authorities can find difficult to contain.
“Part of sowing discord is not an end in itself. It is a means to weaken the capacity of Australian society to unify together and accomplish great things.
“When governments have had to do something big they often have to rally the society behind them. In wartime it’s the rally around the flag effect, in peace time it’s creating a uniform vision for what the country could be and trying to get everybody on board to support what’s happening.”
“This [discord] also makes it harder to combat a foreign influence operation as the threat is more complex and harder to confront. Also, by inserting themselves into multiple networks across a political system, a foreign entity can activate these networks at different times in an effort to influence the flow of events in the target country.”
Similar concerns existed following Australia’s last federal election in 2016 — the same year Russia attempted to muddy the US presidential campaign through the manipulation of digital platforms. The concerns contributed to a Parliamentary Inquiry into the conduct of the 2016 federal election and matters related thereto, one which attracted over 200 submissions from political parties, tech companies and advocacy groups.
It delivered its findings late last year in a report which concluded there was little evidence of social media manipulation by foreign actors in the 2016 Australian election.
However, the potential consequences mean the threat should be continually monitored, according to the report, which recommended the establishment of a dedicated taskforce that would also provide transparent post-election findings.
The committee responsible for the inquiry did not disband following its conclusion; instead it remained to monitor interference in Australian elections, presumably with a specific eye to the upcoming federal election.
“Whether it is the malicious spread of misinformation or ‘fake news’, or the manipulation of political advertising, cyber interference has the capacity to undermine confidence in our democracy,” said Committee Chair, Senator James McGrath.
The threat is clearly a priority for Australian authorities. Yet the sophistication of the cyber attacks, the closely guarded operations of digital platforms and the addition of domestic threats mean it could prove a considerable challenge for authorities, already ill-equipped to monitor and act on incidents occurring in digital channels, according to experts we spoke with.
For example, the Australian Electoral Commission, the country’s election watchdog, has particular difficulty when the source of any misinformation or interference can’t be identified or originates overseas.
How interference is prevented
Which-50 understands that the AEC is working with major digital platforms in an effort to safeguard Australian elections from interference. The AEC was contacted for a comment on measures taken for the upcoming federal election but did not respond.
Facebook confirmed it is continuing to work with the AEC but did not answer specific questions on how it is combating the threat in Australia in the lead-up to this year’s federal election.
Already, Facebook says it has taken several steps to improve the transparency of political advertising and suggested more announcements will be made once a date for the federal election is confirmed.
Google did not respond directly to Which-50’s questions but issued a statement which included a commitment to “protect election information online”.
The two platforms have long argued they are not publishers and have limited responsibility for the content they host, although both have invested significantly in efforts to curb foreign interference in elections.
But critics argue more transparency is needed from the platforms where much of the political posturing will occur. The focus on foreign impact may also be obscuring a domestic threat.
What is clear is the increasing role platforms play in elections, according to Dr Andrew Hughes, an expert in government marketing and campaigns at the Australian National University.
Hughes told Which-50 traditional media channels are losing their effectiveness because they are seen as less credible now by consumers and political advertisements don’t have the same “cut through” they did five years ago.
Digital platforms are also not subject to election advertising restrictions like traditional channels, which are prohibited from broadcasting political advertisements in the three days preceding polling day. Free from this restriction, digital platforms become incredibly important in the final campaign days, Hughes said.
“What we saw in 2016 was the parties unleashed a social media advertising blitz for the last three days of the campaign. You can save up your spend and then target the undecided voters.”
The appeal of the platforms extends to others too, Hughes says, including those with iniquitous intentions. Unfortunately they are hard to identify and trace, as the transparency measures platforms have in place extend largely to official political organisations.
“It’s becoming harder and harder to identify that maybe there is someone else behind the individual. So maybe that account is fake or maybe it’s from a troll farm,” says Hughes.
“An individual out there can very quickly set up a whole series of accounts and post to those accounts instantaneously.”
The difficulty in identifying fake or deliberately misleading accounts is compounded by the often sluggish response from platforms.
“It makes their effectiveness quite remarkable because they can actually be there for a long time in a campaign,” Hughes said. “It can be hard to verify where those accounts are coming from as we’ve seen with America.”
In the US it took up to two and a half years to track down the origin of some accounts which had spread misinformation.
“That’s the time frame we are talking about,” Hughes said.
“The election will be run and won long before people sit down and analyse or find out who those troll accounts are.”
The danger in Australia is not only nefarious foreign players, says Tim Singleton Norton, chair of Digital Rights Watch, but rather the domestic threat of interference or the muddying of civil debate.
The perpetrators in this regard can include politicians, parties and non-affiliated third-party campaigners, whose “race to the bottom” of digital can undermine democratic processes.
The lack of transparency and safeguards common with broadcast channels during elections is encouraging a more toxic debate, according to Singleton Norton.
Dark ads and privacy exemptions
“Every election you have these black and white grainy photos of Bill Shorten with union thugs behind him, that kind of thing. [But] in a digital world the big problem is dark ads and targeted advertising,” Singleton Norton told Which-50.
Dark ads are targeted advertisements which cannot be easily monitored. Stricter transparency rules only apply to political ads when they are run by a registered political party, leaving some room for non-affiliated groups to run opaque targeted advertising, according to Singleton Norton.
“With dark ads we don’t know who is getting them and we don’t know how to report them and we don’t know how to reach people to tell them that they have a right to report them if they receive them.”
He concedes it can be difficult to draw the line on what constitutes political advertising and an affiliated group, but maintains the improved transparency measures of other jurisdictions, like the US, should be applied here as well.
Singleton Norton and Digital Rights Watch, an advocacy group promoting digital rights, are also pressing for more scrutiny of politicians’ exemption from the Privacy Act, particularly at election times.
Third-party campaign groups are bound by Australia’s Privacy Act, preventing them from concealing the origin of data mining and from selling data lists without consent. The law provides checks and balances for digital campaigns.
“Political parties are entirely exempt from that,” Singleton Norton says. “They can do whatever they want with data sets. They can buy them, sell them, match them. They have access to the electoral roll, they can match that against corporate information to build up more accurate profiles.”
The practice has a long history but was previously a manual task. Digital platforms not only expedite the process they also scale it up considerably.
“In a digital age it is incredibly powerful. Because when you combine it with targeted advertising you get your AEC data, you cross reference it with party data and with other bought lists, you upload that to Facebook, you match it against users, look for lookalike audiences and then you target your message directly to those people and that’s where that ethical issue comes in.”
In the US the strategy was allegedly used by Russians and Republicans to identify and suppress black voters. The Privacy Act exemption means it is almost impossible to monitor similar strategies here, although compulsory voting protects against voter suppression and mobilisation strategies, Singleton Norton said.
“But the principle is the same and the checks and balances that stop political parties from doing that are not there either. There’s no privacy cop watching the political parties to make sure they are doing the right thing.”
Australia’s compulsory voting laws make unethical campaigning and deliberate interference less attractive, according to the experts Which-50 spoke with.
Research suggests the biggest impact of digital, negative messaging and misinformation is on voter mobilisation – whether people will be mobilised to go and vote, according to Dr Glenn Kefford, a senior lecturer in politics at Macquarie University.
“In Australia, we have little need for this [mobilisation strategy] as compulsory voting means that the vast majority of eligible voters are incentivised to turn up and vote. The question, then, is whether misinformation or negative messaging via digital platforms can have a persuasive effect and convince voters to change their vote.”
Kefford told Which-50 more research is required in this regard as voting is a complex process and voters “generally don’t change their vote easily and they are unlikely to do so based on single or even multiple pieces of new information, whether that is via digital platforms or elsewhere”.
“If anything, the evidence suggests that misinformation confirms the biases of voters who are already committed partisans and contributes to greater polarisation and enmity between the supporters of various parties.”
But Kefford is adamant that platforms need to do more.
“They absolutely have not done enough. A healthy democracy requires an effective media to ensure that governments and political elites are held to account.
“Considering how some digital platforms have transformed – for better or worse – the media industry, citizens and nation-states should demand more from organisations like Facebook and Google as they now play a critical role in how and where voters access news and information that they may take account of when deciding who to vote for.”
Mitigating the threat
But what level of control platforms should exert is problematic, according to Digital Rights Watch’s Singleton Norton. He does not want platforms taking a greater role in the moderation of content but would like to see greater transparency from the tech giants.
“We don’t want for-profit platform providers deciding on the content that goes on there.”
A coordinated effort from both platforms and authorities is required to address misinformation and unethical behaviour, Singleton Norton says.
While the platforms have indeed been working with authorities on the issue, governments should be demanding more, he said.
“The only reason we haven’t seen massive action is because we haven’t seen a government willing to step up and say ‘no, we’re calling the shots, we’re putting these laws in place’ and actually pushing them.”
“It shouldn’t be the responsibility of Facebook to do the right thing. It should be the responsibility of the Australian government to put in place laws and mandate that they have to.”
An effective approach would include more information sharing between platforms and authorities, according to Jensen. In the past this has been a challenge because platforms remained “skeptical” of intelligence agencies.
“I think there needs to be a cooperation between them with respect to information sharing. The platforms have access to a whole lot of data and metrics on, for example, where the platforms are being accessed and what people are doing when people access those platforms which would not otherwise be made public [or shared] with intelligence agencies.”
This approach, combined with some bipartisanship and greater public awareness about misinformation and the source of online content, is the best way to tackle the problem, according to Jensen.
The federal election is widely expected to be held in May and, with Australians spending on average around 90 minutes a day on social media, that leaves plenty of opportunity for all players.