Saturday 17 March 2018

Public Hearings on Fake News: 14 - 16 March 2018

Select Committee on Deliberate Online Falsehoods - Public Hearings


Eight-day hearing on how Singapore should battle online disinformation begins
By Nur Asyiqin Mohamad Salleh, The Straits Times, 14 Mar 2018

The fight against deliberate online falsehoods in Singapore is under way at Parliament House, with academics, legal experts and religious groups slated to speak on Wednesday (March 14) at the first public hearing on the issue.

At the opening of the hearing, chairman of the committee looking into the issue, Deputy Speaker Charles Chong, said: "Deliberate online falsehoods are a serious global problem, which many countries, including Singapore, have to grapple with. It is a complex problem, affecting us in many different ways."

The full-day session will first see academics Carol Soon and Shawn Goh of the Institute of Policy Studies, who have studied the impact of new media, as well as their colleague Mathew Matthews - known for his research on societal cohesion in Singapore - give their views on the problem of online fabrications.

Singapore Management University law school dean Goh Yihan, lawyer and former Nominated MP Shrinivas Rai, and cyber-conflict expert Michael Raska, an assistant professor at the S. Rajaratnam School of International Studies, are also slated to speak.

Representatives for the Roman Catholic Archdiocese, National Council of Churches of Singapore and the Singapore Buddhist Federation are scheduled to weigh in with their communities' views as well.

These speakers will appear before the 10-member Select Committee, which was set up in January to look into how Singapore can tackle deliberate online falsehoods.



The high-level parliamentary committee will speak to a diverse range of individuals and organisations from Singapore and abroad to help it decide on its recommendations, which will be submitted to Parliament.

This will take place over eight days this week and the next two weeks. As the hearings go on, the committee will decide whether all the dates are needed.

It received a record 164 written submissions, surpassing the previous high of 99 submissions to the 1988 Select Committee on the Parliamentary Elections (Amendment) Bill, reflecting the interest and anxiety sparked by the scourge of disinformation. These included perspectives from religious groups, traditional and alternative media, technology companies and academics.

A total of 79 individuals and organisations have been invited to speak - outstripping past Select Committee hearings. Said Mr Chong: "We may revise the witness list as the hearings progress."

He said the committee's decisions on the process so far have been unanimous and reached by consensus. "This reflects our common intention to engage widely on our terms of reference."

The committee said in a statement on Tuesday: "This is an indication of the importance of these issues at stake, and the (committee's) commitment to consult widely."

The hearings will cover themes such as the impact of deliberate online falsehoods on different parties here, how technology has worsened the problem, and the merits of different options, such as legislation.

Members of the public can sit in on these hearings, which are held at the Parliament House's new Public Hearing Room.

Fake news has plagued countries around the world, and hearings on Thursday and Friday will include perspectives from foreign bodies such as the European Values Think-Tank and the Ukraine Crisis Media Centre, some of them speaking via video-conference.



Mr Chong has said that the committee hopes to gather a broad range of views, while his fellow committee member, Home Affairs and Law Minister K. Shanmugam, hopes the process will allow members of the public to better understand the issues that Singapore and the rest of the world face when it comes to online falsehoods.

The other Select Committee members are: Social and Family Development Minister Desmond Lee; Senior Minister of State for Communications and Information as well as Education Janil Puthucheary; People's Action Party MPs Seah Kian Peng, Rahayu Mahzam, Sun Xueling and Edwin Tong; Workers' Party MP Pritam Singh and Nominated MP Chia Yong Yong.





Day 1: 14 Mar 2018





Disinformation campaign could be the first step in attacking Singapore: K. Shanmugam
This applies especially to foreign actors unable to use military means against Singapore, he says
By Lester Hio, The Straits Times, 15 Mar 2018

A campaign of disinformation, where falsehoods are spread subtly and gradually, could be the first step in an attack on Singapore.

This would apply especially to foreign actors who want to attack Singapore but cannot do so using military means, Law and Home Affairs Minister K. Shanmugam said yesterday, the first day of hearings by a parliamentary Select Committee on deliberate online falsehoods.

He was discussing with Dr Michael Raska of the S. Rajaratnam School of International Studies how information warfare and cyber conflicts have played out around the world, and their implications for Singapore.



While Singapore might have the edge in conventional warfare, said Dr Raska, the use of disinformation campaigns aimed at political, racial and religious fracture points that may exist in society can offset an attacker's military inferiority.

"These attacks can happen (from) far away where a conventional response is not viable," said Dr Raska, one of 10 individuals invited to elaborate on written submissions they had sent to the committee.

A campaign of disinformation also conflicts with Singapore's concept of deterrence, which has revolved around conventional warfare, he added.

Mr Shanmugam then pointed out that the "logical conclusion" for foreign actors to weaken Singapore is to "first engage in information warfare". Agreeing, Dr Raska said that the fracture points do not even have to stem from within Singapore to create tensions. "Singapore can be drawn into crisis outside, externally, where Singapore's reputation... any sort of issue that would threaten its international standing, could be challenged."

Senior Minister of State for Communications and Information Janil Puthucheary cited Dr Raska's earlier research, in which he emphasised the growing importance of social media in military campaigns as a means of identifying existing fault lines and creating new ones.

Dr Raska said: "(Social media) can create new fault lines, amplifying tensions or issues that have not been previously thought of as important, and these can be rapidly disseminated or diffused and suddenly become very, very important."

In his written submission, Dr Raska said cyber warfare has increasingly been recognised as a fundamental strategy for warfare by countries. He cited the example of China's People's Liberation Army, which has moved to focus on "information dominance" since 2015, with the aim of fighting and winning localised information wars.

In comparison, Singapore's security paradigm has remained largely unchanged and would need to adapt further to these developments, he said.

He added that the centre of gravity in future conflicts will be the perceptions of the population and the decision-making cycles of the leadership.

Information warfare might already have begun on Singapore's shores: at an earlier session with researchers from the Institute of Policy Studies, Mr Shanmugam alluded to cyber attacks on ministries here, such as the Foreign Affairs Ministry.

"To the extent that one can know about these things, I think one can assume that these are attacks that originate from outside, perhaps from a foreign state actor, given the nature of their targets," said Mr Shanmugam.

The last parliamentary committee hearings were held 14 years ago - on laws on building maintenance and management.

The public hearings on deliberate online falsehoods continue today.









Tackle fake news with laws and other steps, panel told
Measures suggested include fact-checking services and media literacy programmes
By Ng Jun Sen, Political Correspondent, The Straits Times, 15 Mar 2018

Tackling fake news will call for a host of measures, including fact-checking organisations and media literacy programmes, in addition to laws against disinformation, experts told a parliamentary committee looking into the issue.

A total of 10 speakers - the first batch of 79 individuals and organisations to speak over the next three weeks - with backgrounds in research, law, defence studies and religion were quizzed on their written submissions yesterday.

Among them, Singapore Management University law dean Goh Yihan noted gaps in Singapore's existing legislative framework in dealing with the rapid spread of fake news.



Associate Professor Goh, in looking at how existing laws such as the Sedition Act and Telecommunications Act could apply to cases of online falsehoods, concluded that they are currently limited in speed, scope and adaptability. "I have looked at the existing legislation and submitted they are not sufficient to deal with the problem," he said.

Prof Goh said any legislation targeting such falsehoods would have to punish and deter perpetrators, prevent the spread of falsehoods - by way of removal or restricting access - and provide remedy through clarification or apology.

At the same time, legislation cannot be the only solution, he added. "We must balance legislation with education as well as reaching out to different communities," he said.

It was a call echoed by many of the other speakers yesterday, as they offered ideas on tackling a scourge that Law and Home Affairs Minister K. Shanmugam said could cause Singapore great harm. Citing a submission to be made by academic Shashi Jayakumar, he added that this is especially so given that Singapore is polyglot, multiracial and data-rich - traits that make it a tempting target for organised disinformation campaigns.

In their submission, Institute of Policy Studies senior research fellow Carol Soon and research assistant Shawn Goh proposed that Singapore tap and reinforce its current legislation. Mr Shanmugam noted this, saying it would be a matter for the Government to decide on.

They also suggested an independent body that advises on the types of online falsehoods to act against, but Mr Shanmugam wondered if it could act quickly enough against the viral nature of such fabrications.

Fact-checking services were a popular suggestion. S. Rajaratnam School of International Studies defence and strategic studies specialist Michael Raska suggested an independent centre that inspects fake news sources, similar to what has been done in the Czech Republic. It can monitor fake news sites and track their funding and ties to disinformation networks, he said.

Dr Soon cited the BBC's fact-checking arm, Reality Check. Students can also play a role in fact-checking if it is built into the school curriculum, she said.

Other measures include mandatory training for people who have shared falsehoods.

The hearing also tackled the issue of whether potential legislation could stifle freedom of speech.

Committee member Pritam Singh, a Workers' Party MP, asked what such laws might look like, noting that some are concerned about the curtailment of speech as "the line between falsehood and opinion is not drawn clearly".



Dr Soon said legislators must balance national security and public order with free speech in the "interests of enabling people to speak up and have meaningful discussions pertaining to governance".

Prof Goh said laws that seek to take down content via executive action must have recourse for the person who made the statement to appeal for it to be restored - because "just as much as the falsehood might cause serious consequences, it might also be that the statement-maker has a reason to put it out, or that it is not actually false".

He added: "The freedom of speech is of course important... but it is by no means absolute. And indeed I think the freedom of speech would be compromised if we allowed falsehoods to be perpetrated."





Truth comes a distant second to alcohol and sex
Personal biases an obstacle to the truth
By Nur Asyiqin Mohamad Salleh, The Straits Times, 15 Mar 2018

Alcohol, sex - or the truth?

For many people, fact-checking and truth come a distant second, according to a literature review on misinformation published by Institute of Policy Studies (IPS) senior research fellow Carol Soon last year.

Law and Home Affairs Minister K. Shanmugam said yesterday, with a slight chuckle: "But I don't see how truth can compete with alcohol, or opiate, or love, or sex, or chocolate. Very difficult, right?"


Dr Soon, who was speaking in her personal capacity, responded: "The devil is in the details. I think we need to see how we market truth."



The exchange underscores a theme that emerged yesterday, as Mr Shanmugam and media academics grappled with the challenges of battling disinformation. Over two hours, he and Dr Soon explored the personal biases that make online fabrications so insidious. These biases present a serious obstacle to the truth, they agreed, noting, among other things, the tendency for people to seek out and favour information that confirms pre-existing beliefs.

The 2017 IPS report had contained the quote: "Individuals get a rush of dopamine... when they find confirming data, similar to when they eat chocolate, have sex or fall in love. Thus, people tend to focus on information that supports their confirmation bias."

Dr Soon yesterday noted that attempts to correct falsehoods - which at times leave a greater impression - must thus be done quickly and designed in an appealing way. "(These corrections) have to be as sexy as the falsehoods, unfortunately."

Mr Shanmugam and Dr Soon also went through the dangers of falsehoods that threaten public order and national security "if they are able to effectively exploit existing cleavages, say, among communities", as she put it. He said: "There are people who think we are a post-racial society. The point is not to forget that we are not quite there."

When Dr Soon spoke of research showing that people more susceptible to biases tend to be those on extreme ends of the political spectrum, he responded: "My concern is not people with strong political beliefs, it is more group identity based on racial and religious lines."






5Cs to fight online falsehoods
It's vital to understand the categories of harmful information online, and the different intents of those creating them.
By Carol Soon and Shawn Goh, Published The Straits Times, 21 Mar 2018

During its first week of hearings, the Select Committee on Deliberate Online Falsehoods heard from academics, experts and community leaders on the potential impact of the problem and possible approaches to combat it.

They highlighted how Singapore's multicultural make-up and high Internet connectivity make it an easy target of disinformation campaigns that sow enmity among different groups and destabilise the country.

Deliberate online falsehoods bear the hallmarks of what has been termed a "wicked problem", which refers to a complex problem, often intertwined with other issues, that has no easy solution.

It is a "wicked problem" due to considerable ambiguity, such as how falsehoods should be defined and who should define them. There is also a lack of data on the precise magnitude and impact of exposure to deliberate online falsehoods.

The problem, as evident in countries like the United States, Ukraine and Indonesia, is often connected to broader societal issues such as highly polarised politics and historical conflicts among communities.

Such a "wicked problem" warrants a whole-of-society approach, and experts seem to agree that a suite of measures comprising self-regulation by Internet intermediaries, increasing critical literacy, and fact-checking is required.

The role of legislation, its potential limitations and pitfalls have been a focal point in the ongoing deliberations. If the Government decides to adopt a legislative tool to tackle the problem at hand, what principles and considerations should guide the use of legislation?

First, it is imperative to balance the need to protect national security and public order with safeguarding the ability of citizens and media to discuss and comment on pertinent issues, including those relating to governance and policies.

Second, legislation should not have the unintended effect of cultivating among members of the public an over-reliance on the authorities to discern truth from fiction.

To achieve this, there are two considerations when leveraging legislation.

FOCUSED APPROACH MATTERS

The first is to be as precise as possible in deciding which types of deliberate online falsehoods to act against. This will avoid legislative overreach as well as channel limited resources to fighting the real fight. Based on our research, we categorise deliberate online falsehoods into two types - "low breach" and "high breach" falsehoods.

"Low breach" deliberate online falsehoods create anxiety among the public and cause inconveniences to people. Some examples in the local context include the photograph of a "collapsed rooftop" of Punggol Waterway Terraces in an article published on the All Singapore Stuff website, and the alleged selling of plastic rice and issuing of fines by the National Environment Agency at hawker centres.

Fortunately, in many "low breach" cases, the stakeholders involved are often able to quickly establish the facts and debunk the falsehood; corrective action is promptly taken. For instance, residents in Punggol took to Facebook to debunk the falsehood of the "collapsed rooftop", and the website editors deleted the article and issued an apology.

However, "high breach" falsehoods pose a more severe threat: coordinated and covert efforts targeted at disrupting democratic processes in a country. Deliberate online falsehoods deployed as part of a disinformation campaign have wreaked havoc on domestic politics and allegedly influenced referendum and election outcomes in other countries. Recently, the US Justice Department charged 13 Russians and three Russian firms with using stolen identities to pose as Americans, and with creating Facebook groups to distribute divisive content to subvert the 2016 US presidential election.

"High breach" deliberate online falsehoods also disrupt social and national stability by exploiting the pain points of a society, as seen in France and Indonesia. Thus, any use of legislation should focus on targeting such "high breach" deliberate online falsehoods.



FRAMEWORK TO DECIDE WHICH FALSEHOODS TO TARGET

Second, the evolving and amoebic nature of cyberspace makes it difficult and impractical to come up with a precise legal definition for deliberate online falsehoods that will stand the test of time. Thus, we propose a "5Cs" framework to help determine what online falsehoods warrant regulatory intervention and whom to act against.

The first "C" is Content: An important question to ask is if the content is verifiably false. Falsehoods should be distinguished from opinion.

The second "C" is Context: The content of an online falsehood should be considered within a country's political, economic and social milieu. Despite rapid changes in the online space, Singapore's approach to regulating speech in general has always focused on protecting racial and religious harmony among its population and maintaining public order and security. Moving forward, one possible approach is to focus on deliberate online falsehoods that pose a threat to these pillars that Singapore has always upheld.

The third "C" is Communicator's Identity. Our research found that there are different types of perpetrators. They include members of the public, corporations, domestic political agents and foreign state actors. Some actors could also be part of a larger network (for example, accounts linked to a foreign Internet troll factory).

The fourth "C" is Communicator's Intent. Looking at intent allows perpetrators to be dealt with accordingly: ordinary individuals might be treated differently from networked players and foreign state actors who act on a larger insidious agenda to disrupt social stability and national security.

Different categories of potentially harmful online information serve different intents. For example, non-profit organisation First Draft identifies three types: misinformation such as parody and satire is false or inaccurate information produced or shared without the intent to deceive or harm; mal-information is information that is genuine but produced with a clear intention to cause harm; and disinformation is the deliberate creation and sharing of information known to be false with the intent to deceive or incite hostility. The earlier example of Russian manipulation of online discussion illustrates this.

Legislative intervention should focus on disinformation.

The final "C" is Consequence. The extent and magnitude of a falsehood in terms of its frequency and volume should be considered. The likelihood of harm to Singapore's social fabric and national security, including its imminence, should also be taken into account.

Deliberate online falsehoods pose severe challenges for societies, countries and global politics. The above proposed approaches will not eliminate the problem, but they help provide the necessary clarity and focus for countermeasures.

Carol Soon, senior research fellow, and Shawn Goh, research assistant, are from the Institute of Policy Studies, National University of Singapore.

This article is written based on the authors' submission in their personal capacity to the Select Committee on Deliberate Online Falsehoods.





Benefits to religious studies in schools, but risks too
By Nur Asyiqin Mohamad Salleh, The Straits Times, 15 Mar 2018

Public education is crucial in quashing misunderstandings arising from disinformation, religious groups said during a public hearing on online falsehoods yesterday, setting off a discussion about whether religious studies have a place in schools.

But both the religious leaders and the members of the committee running the hearing agreed that while there are benefits to teaching religion, there is also a risk of it being seen as proselytising.

Dr Roland Chia of the Trinity Theological College, Dr Kweh Soon Han of the Singapore Buddhist Federation and Roman Catholic Archdiocese communications director Andre Ahchak spoke about how religious studies could be a bulwark against false or twisted information about certain faiths online.

Dr Chia said of this misinformation: "There is no basis for comparison, so... (youth will) take it as a true representation of that particular religion."



But they accepted concerns raised by Select Committee members, Senior Minister of State for Communications and Information Janil Puthucheary and Law and Home Affairs Minister K. Shanmugam, that parents may not be on board with the plan. Mr Shanmugam told Mr Ahchak: "I understand you when you say learning about the different religions made you a better person, because it leads to greater tolerance, greater understanding, but we actually get a lot of pushback now from parents when this idea is broached."

Religious knowledge was made a compulsory subject for Secondary 3 and 4 students in 1984. It was phased out in 1990, as it was seen to have emphasised differences among the religions and encouraged students to proselytise.

Dr Puthucheary also noted that, among other things, some parents are concerned about exposing their children to faiths other than their own.

Reverend Dr Ngoei Foong Nghian of the National Council of Churches of Singapore acknowledged that there would be many potholes to watch out for, but added that inviting religious groups to talk to students can help to foster greater understanding and respect.





Not simply a metaphorical fight against fake news, but a real battle
First day of hearings makes clear fake news is no longer a simple issue
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 15 Mar 2018

The battle against fake news took a literal turn yesterday on the first day of hearings by the Parliamentary Select Committee set up to look into the problem.

While many see the fight as a metaphorical one, Assistant Professor Michael Raska of the S. Rajaratnam School of International Studies urged the committee to see it for what it is: A real war.

The expert in military innovations, information conflicts and cyber warfare said that information wars are already part of the established cyber warfare strategy of countries, and are often waged on a permanent basis.

Singapore's policymakers need to see it in this context to properly counter the threat, he said.

He was among six people and groups who had their written submissions and suggestions scrutinised by the Select Committee on Deliberate Online Falsehoods - Causes, Consequences and Countermeasures yesterday.

The Public Hearing Room in Parliament House became a temporary "war room" of sorts as the committee comprising ministers and MPs, and those invited to explain and elaborate on their written submissions, discussed and examined the different strategies that could be used to defend Singapore against this new kind of threat.

Dr Raska's oral evidence got technical in places, but served to drive home just what could be at stake.

Information wars have allowed foreign governments and organised groups to divide, disrupt and conquer using just words, the Internet and social media - and without having to fire one bullet.

Deliberate online falsehoods, he warned, can "create similar political effects as through the use of force".

To make things worse, he said, he is not sure that Singapore is prepared for such warfare.

Launching attacks on social media falls far short of armed conflict, which means sophisticated weapons count for little. This challenges Singapore's concept of deterrence based on using military might to scare off possible attackers, he explained.

He described how countries like China, North Korea, Russia and the United States have doctrines on information wars, but added that "it doesn't have to be a particularly great power. It can also be any state in the region that relies on this method and tools".

His evidence also helped bring into sharp focus Singapore's vulnerability as a multiracial and multi-religious society.

Of particular concern to several of the others who spoke at the hearing was how disinformation, distortion, rumours and untruths can be used to exploit fault lines between races and religions - thus tearing society apart.

Representatives from the Catholic Church, the National Council of Churches of Singapore and the Buddhist Federation said they have had to step up efforts to counter deliberate online fabrications about their religions.

Roman Catholic Archdiocese communications director Andre Ahchak said: "It is a continuous battle, day in and day out."

Institute of Policy Studies (IPS) senior research fellow Mathew Matthews said the harmonious state of affairs that exists now between the different races and religions here should not be taken for granted. While existing fault lines have been well managed through a combination of legislation and social policies, and most Singaporeans endorse multicultural living, stereotypes and prejudices are still held by a sizeable proportion of the population, he noted.

He warned that a daily dose of online falsehoods - for example fabricated reports on particular ethnic or immigrant groups and their loyalty to Singapore, their potential to commit crimes or lack of contribution to society - could wear down the good relations through a "slow-drip effect" and leave certain groups susceptible to manipulation by hostile powers.



Citing a survey question by IPS and OnePeople.sg, which asked people how they would feel towards various communities in the aftermath of a terror attack, he said that nearly two out of five non-Muslim respondents said they would view Muslims with suspicion if the attack was planned by a foreign Muslim organisation.

This can alienate Muslims here and make them "easy bait" for "groups from elsewhere who want to further prey and make them feel that the rest of society is against them", said Dr Matthews.

A related issue was brought up by Home Affairs and Law Minister K. Shanmugam, who noted that research has found that online falsehoods appeal viscerally to people, especially when they have to do with their group identity.

He said: "The purpose of deliberate online falsehoods is you start out with people in the middle ground, you appeal cleverly to their ethnic identities, you appeal to their racial identities, then you target them politically and try and move them along the political spectrum to harden their attitudes."

The committee was only into the first of its eight-day public hearings, and the search is on for solutions. But the hearings clearly highlighted another dimension to the issue. When deliberate falsehoods in the virtual world can undermine government processes and tear societies apart, sparking conflict in the physical world, fake news is no longer a simple issue of deciding what is real and what is not.





Day 2: 15 Mar 2018





Foreign experts share views on fake news
The Straits Times, 16 Mar 2018

The dangers of foreign disinformation campaigns took centre stage on Day 2 of public hearings in Parliament as international experts shared their views and experiences in dealing with, and countering the impact of, such falsehoods.

For nearly seven hours, the Select Committee on deliberate online falsehoods heard evidence from eight media researchers and practitioners from countries including Ukraine, Latvia, France and the Czech Republic.

All are countries affected by fake news campaigns in which Russia featured heavily.

Law and Home Affairs Minister K. Shanmugam made clear that the committee was not taking any specific view on Russia.

The hearings, which continue today, are taking place over eight days until March 29.





Singapore a tempting target for disinformation tactics: Expert
Such campaigns can be waged by nations near and far against multiracial Republic, he warns
By Nur Asyiqin Mohamad Salleh, The Straits Times, 16 Mar 2018

Singapore's close neighbours may already be seeing disinformation tactics being deployed internally - and these could well be turned on Singapore if relations were to fray, a security expert warned yesterday.

Information warfare has become an accepted part of military doctrine, noted Dr Shashi Jayakumar of the S. Rajaratnam School of International Studies (RSIS) on the second day of public hearings by the Select Committee studying ways to thwart deliberate online falsehoods.

Some state actors have a full suite of tools, including "kinetic", or conventional, warfare, cyber attacks and propaganda to influence minorities, he said.

And while Singapore's relations with its close neighbours are excellent now, it is not a stretch to think that a breakdown in diplomatic ties may see them use disinformation to sow discord in Singapore.

"It would be a mistake to assume the means and plotting against us would be merely kinetic," he said, adding: "Without meaning to cast allegations or smears, the means and tools are actually there, because you have these hired guns... which have a presence (in these countries)."

One such "hired gun" is Cambridge Analytica, a data company accused of helping Russia spread disinformation during the 2016 United States presidential election.

Dr Shashi said that it now has a presence in polls-bound Malaysia, where it is thought to be hired "by people involved in the coming election".

Meanwhile, the organised spreading of fake news and smears has become commonplace in Indonesia, where outfits such as Saracen have targeted political figures like President Joko Widodo and former Jakarta governor Basuki Tjahaja Purnama with inflammatory rumours that tap into pressure points such as race and religion.



The warning that disinformation campaigns aimed at Singapore can be waged by countries near and far came at the start of yesterday's hearing, which was dominated by speakers from foreign organisations who detailed the propaganda efforts of far-away Russia.

It would be a mistake to assume that such efforts are not already under way in Singapore, said Dr Shashi, adding: "You deploy them long in advance, before you actually need to use them."

In fact, countries such as Singapore - which are polyglot, multiracial, data-rich and aiming to become a Smart Nation - present tempting targets for those looking to undermine societies, he said.

Dr Shashi and Law and Home Affairs Minister K. Shanmugam, a member of the committee, said that Singapore could be a "sandbox for subversion", with social media and disinformation used to tap into deeply ingrained historical and cultural issues, and turn groups against the Government or each other.



Mr Shanmugam said: "We are a sandbox because we have the in-built potential for being divided along racial lines, along religious lines, along some nationalistic lines."

Disinformation, he noted, can cause people to doubt what is real and fake, driving them to seek out groups that reinforce their beliefs.

Dr Shashi said that just as efforts to deradicalise victims of radical ideology take place both online and offline, there must be some human interaction involved to complement online counter-measures.

Open, face-to-face dialogue where a diversity of opinions can be aired could be crucial.

Dr Shashi cited the 2013 Our Singapore Conversation series to engage Singaporeans on a variety of issues as having brought together people with diverse opinions.

"I wonder whether we need more real-world interventions," he said.









Panel discusses role of ST, Zaobao in fake news fight
By Seow Bei Yi, The Straits Times, 16 Mar 2018

Newspapers such as The Straits Times and Chinese daily Lianhe Zaobao can play a part in debunking disinformation, said Dr Shashi Jayakumar of the S. Rajaratnam School of International Studies. He noted that in some countries, major networks or websites like Facebook have sometimes engaged national newspapers to help with fact checking.

The fight against falsehoods, he added, does not necessarily have to begin with the Government, but could also start with "any agency which has tremendous credibility and trust in the eyes of the public".

In Singapore's case, "ways should be found to support The Straits Times and Lianhe Zaobao, in a nuanced and calibrated fashion, such that they can once again be seen as the pre-eminent news sources, bar none, in the eyes of the Singapore public", Dr Shashi said in his written submission.



Law and Home Affairs Minister K. Shanmugam noted that while ST's print circulation may have been slipping, "the online figures appear to be going up".

Annual reports by Singapore Press Holdings, which publishes both papers, show that from 2015 to last year, ST's print circulation fell from 304,300 to 263,200. But digital circulation rose from 74,100 to 120,400.

The combined numbers in that same period show a marginal increase as well, said Mr Shanmugam, presenting slides showing they rose from 378,400 to 383,600.

"The point we both probably can agree on is that you need good newspapers which people can trust. But again, that is not going to be the silver bullet. It has got to be one of a series of solutions," he said later.

Dr Shashi said: "Everyone has a part to play. The Straits Times has a part to play. It is a superb newspaper and has widespread credibility, and if you include digital, the readership is helpful. But in an anti-fake news coalition, one would argue the (newspaper of) record has an important role to play."

Reacting to the comments, ST editor Warren Fernandez said: "Our overall readership continues to grow as more people are reading us online, or on their mobile phones, on social media, as well as in print. So, while print numbers have seen a gradual decline, in line with global trends, we have many more points of contact with our readers throughout the day than just the print product in the mornings.

"So, our main challenge is not so much readership; it is more about keeping revenues healthy and on a sustainable level, as producing reliable and quality content is a costly business. To do so, we have to serve our readers well, and strive to continue to earn their trust as a credible source of news and views on the major events that matter to them."










'Expert geopolitical analyst' was really a car insurance salesman
By Ng Jun Sen, Political Correspondent, The Straits Times, 16 Mar 2018

Touted by Russia's state broadcaster RT as a geopolitical analyst, a New Yorker was a frequent guest and oft-cited expert in Russian media.

For years, Mr Eric Draitser spoke authoritatively on Russian TV against the United States' "imperialist foreign policy", and accused the Ukrainian government of being its puppet.

But in 2015, Mr Ruslan Deynychenko, co-founder of fact-checking organisation StopFake.org, dug into Mr Draitser's background and proved the so-called expert to be a fake. He was a car insurance salesman, with no academic background or scientific research in geopolitics, Mr Deynychenko said.

"It is disgusting," said the former Ukrainian journalist.

The case is one of thousands that Mr Deynychenko, 46, and the StopFake project have encountered since it was established after Ukraine's 2014 revolution, he told The Straits Times in an interview yesterday after speaking before the Select Committee on deliberate online falsehoods.



StopFake was set up at a time when the term "fake news" was synonymous with Russian propaganda, he said.

He was the first of seven foreign experts invited to speak before the Select Committee and the only one to turn up in person, having flown in from Ukraine the previous day. The others gave their comments and views via video conferencing.

StopFake is staffed by about 30 journalists, academics, IT specialists and translators. It receives funding mainly from the British Embassy in Ukraine, he said.

He believes Russia used sensational but false stories - such as claims that Ukrainians were raping and killing Russians - to gain support for pro-Russian separatists in Ukraine.

Eventually, it led to Crimea being annexed and the war in eastern Ukraine, in which 10,000 people have lost their lives, he added.

Evidence gathered by StopFake led people to stop seeing Russian media organisations as free media, he said. The Ukrainian government banned Russian television networks despite other nations arguing that this stifled free speech.

He warned the Select Committee that any country that ignores disinformation risks falling prey to such campaigns without even realising it.

Mr Deynychenko said: "It may happen with any country that one day, you wake up and look out the window and see people with machine guns because somebody on TV persuaded them they should hate each other.

"Our experience demonstrated that disinformation is a powerful weapon, and it could be pointed at any country at any time very, very quickly."










Debate over how to determine deliberate online falsehood
By Lester Hio, The Straits Times, 16 Mar 2018

A British academic and Law and Home Affairs Minister K. Shanmugam had a spirited debate yesterday over whether falsehoods can be defined easily.

Mr Ben Nimmo, a senior fellow at American think-tank Atlantic Council, said there are so many shades of grey that a law to tackle the problem of deliberate online falsehoods would likely have a preamble as big as the Oxford English Dictionary.

"How do you define the problem? What if it is 5 per cent true? Is it still a false story? Is it deliberate?" asked Mr Nimmo, who analyses disinformation and fake news at the think-tank's Digital Forensic Research Lab. "There are so many grey areas here, in terms of the spectrum, from a story that is 100 per cent made up, to 50 per cent made up, to not made up at all."

But Mr Shanmugam disagreed.

"I beg to differ with you in this sense that there are items which are completely manufactured and totally untrue, which are legally very easily identifiable," he said.

Mr Nimmo, who spoke via video conferencing, said legislation should be used as a last resort.



Workers' Party MP Pritam Singh, a committee member, had first raised the issue when he asked Mr Nimmo if he was proposing legislation as a way to get social media platforms to remove false content.

He was referring to Mr Nimmo's written submission to the committee, which said governments should engage with platforms like Facebook and Twitter to shut down disinformation networks. Mr Nimmo said: "No, I am not proposing that. I am very wary of any legislative proposal, anywhere in the world, which will allow politicians to order social platforms to change the content on their platforms. Because the precedent for countries hostile to democracy would be very, very alarming."

Instead, he suggested a direct hotline for the authorities to reach social media firms and get them to take action when there is a spread of disinformation. Such a solution had worked for him personally. When there was false news of his death on Twitter last August, he went directly to Twitter to get thousands of automated accounts shut down.

But this may not work for all, argued Mr Shanmugam. "In the United States, I suppose you can talk it out. But what about other countries, when they talk to these large international platforms, what if they say no to you? To a country like Singapore? What do we do?"

He added that legislation may not be the only solution, and there might be different solutions for differentiated outcomes.

Mr Nimmo said if social media platforms did not agree to cooperate, "that is when you can start thinking about legislation", adding that it is within Singapore's purview to introduce legislation.

But he said that international cooperation usually takes time and that direct cooperation might lead to quicker responses.









Trust between government, people key in fighting fake news problem
Willing audience of disaffected citizens the most difficult problem in tackling issue
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 16 Mar 2018

Three speakers at yesterday's Select Committee hearing identified what they saw as a key problem in the fight against deliberate online falsehoods: Whether in the United States, France or Latvia, it is a willing audience of disaffected and disenchanted citizens that provides fertile ground for their spread.

Associate Professor Kevin Limonier of the French Institute of Geopolitics at the University of Paris 8 said Russian propaganda in France and Europe was just a symptom of a wider problem: "There is an audience for this kind of content... So, it is basically most of all an internal problem. Russians are just exploiting the weaknesses of our democracies."



Sharing his view, Dr Janis Berzins of the Centre for Security and Strategic Studies of the National Defence Academy of Latvia said that information wars would work only "if there is ground for them".

He put the responsibility for this development squarely on the shoulders of what he said were tone-deaf politicians who rode in stretch limousines, and their failed policies. They have lost touch with the ground, bred mistrust between the public and leaders, and have made themselves easy targets for conspiracy theories and lies.

The situation was not much different across the Atlantic Ocean.



Mr Ben Nimmo of the Digital Forensic Research Lab of the Washington-based Atlantic Council said a "willing audience" was the "most difficult problem" in the fight against false news.

This audience is not just made up of people who are emotionally invested in believing that the fake news is actually true, but also those who knowingly share a false story because they think it serves some higher purpose, he said.

In the US presidential election campaign in 2016, for example, people shared stories they knew to be false about Mrs Hillary Clinton simply because they did not want her to be elected.

And so, as Dr Shashi Jayakumar of the S. Rajaratnam School of International Studies pointed out, any effort to counter online falsehoods must start with the people and involve some "human agency".

He suggested bringing people together through efforts like Our Singapore Conversation so those with different views do not end up in their own echo chambers.

He was the only Singaporean speaker on Day 2 of the hearings. The others were non-governmental organisation (NGO) representatives, academics and experts from Europe and the US. Most were quizzed by the Select Committee via video conferencing.

Whether implicitly or explicitly, the speakers acknowledged the tricky role of governments in the fight against online falsehoods.

Government action to define, investigate and take down information can be seen as curbs on free speech, warned Mr Jakub Janda, head of the Kremlin Watch Programme in Prague.

A similar sentiment was expressed by Mr Nimmo, when the idea of legislation was raised by committee member Pritam Singh.

The British academic said he was "very wary of any legislative proposal... because the precedent for countries which are hostile to democracy would be very, very alarming".

He said a better solution was for governments to work with social networks to fight false news.

But Law and Home Affairs Minister K. Shanmugam countered that social networks which may be open to working with countries like the US may not extend the same cooperation to smaller countries.

A balance had to be struck between a person's right to propagate falsehoods and society's right to protect itself, he said, adding that this was not about democracy or free speech.

Mr Nimmo said governments could think of using legislation if social media companies refused to cooperate: "Within the territory of Singapore, you are the legislators, you can do that."

Acknowledging the problem faced by governments, several speakers said newspapers and non-governmental organisations would be in a better position to deal with the problem.

Dr Shashi said that while governments have an important role, his advice on debunking fake news was to "start not so much with government, but any agency which has tremendous credibility and trust in the eyes of the public".

He said newspapers could be part of a fact-checking coalition, and said The Straits Times and Lianhe Zaobao should be supported "in a nuanced and calibrated fashion such that they can once again be seen as the pre-eminent news sources, bar none, in the eyes of the Singapore public".

Civil society groups and NGOs can also help, and in many European countries they have played a central role in exposing perpetrators of disinformation campaigns and in raising public awareness. The committee heard from representatives of two such NGOs: StopFake.org and Ukraine Crisis Media Centre.

If one thing was clear from yesterday's hearing, it is that trust between government and people is central in fighting the problem.





DAY 3: 16 Mar 2018






Spread of falsehoods in mother tongues a worry

NUS expert warns that such messages can amplify shared identity and are very relatable
By Seow Bei Yi, The Straits Times, 17 Mar 2018

Falsehoods could be spread in mother tongue languages and dialects, and the potential impact of such messages should not be ignored, a National University of Singapore (NUS) communications and new media expert has warned.

Appearing before the Select Committee on deliberate online falsehoods yesterday, Assistant Professor Elmie Nekmat noted that many discussions about disinformation have been conducted in English. But some segments of society, such as the elderly, may be more comfortable in their mother tongues and less aware of the dangers of falsehoods, he said.

Also, messages in mother tongue languages could go viral on closed-group platforms such as WhatsApp chat groups, he added.


These messages tend to be "more closely relevant" to the communities they are shared in, and the use of a certain language amplifies the shared identity of the group as well, he added.


This is why there should be more efforts to examine the impact of disinformation in different languages, said Dr Elmie, one of six speakers yesterday and the first to speak in Malay in the first three days of public hearings by the committee.



Dr Elmie spoke mainly in English, but replied in Malay when Select Committee member Rahayu Mahzam, an MP for Jurong GRC, quizzed him about falsehoods in the Malay-Muslim community here.


He said: "The (Malay) language is almost always closely linked to religion, so people are more comfortable with it (the language) and because of that, the role of language is also very influential in falsehood dissemination."


He said the impact of Malay language messages originating from neighbouring countries such as Malaysia and Indonesia might be felt in Singapore as well.


Dr Elmie cited the case of an untrue viral allegation originating from Malaysia last year, about how shoe company Bata stocked footwear with the word "Allah" on the soles.


While the story started from a school in Malaysia, it eventually found its way here.


The NUS assistant professor in communications and new media said the potential spread of falsehoods via the Chinese language and dialects could be a concern too. "They are very relatable to a certain community, so when this language-based information is passed... you see more responses from the particular group that speaks that language."


In his written submission, he called for "more relevant forms of regulations, educational initiatives as well as greater research into the impact of deliberate falsehoods in a multiracial society".












Tapping into race, religion to spark reactions
By Nur Asyiqin Mohamad Salleh, The Straits Times, 17 Mar 2018

The ubiquitous halal logo was at the centre of a controversy in the Muslim community last month, when a photo showing an image of the logo next to a poster advertising pork belly rice spread online.

The photo appeared in a Facebook post last month which warned people in a mix of English and Malay "to be careful when you go to this place... There are several halal logos, but it also sells pork".

National University of Singapore Assistant Professor Elmie Nekmat, 36, recalled how the post caught the eye of some of his family members, who dismissed it as "fake news".

They were right.

The halal logo and the poster belonged to two adjacent stalls, which had put them up on different sides of a pillar. The halal logo was from a halal-certified yong tau foo stall, Green Delights, while the poster was from a non-halal noodle stall.

Some tried to correct the misinformation online, but the damage was done - Green Delights saw a dip in business.

Dr Elmie, whose research areas include public opinion formation, said this shows how a falsehood that taps into issues such as race and religion can spark a strong, knee-jerk reaction from the group.

"If (it is) news that is very relatable in terms of how it is being framed when it comes to language and religion, we will take more notice of it," he told The Straits Times yesterday.

Like-minded groups are congregating on platforms like Facebook, he noted, urging group administrators to fact-check posts or set guidelines to verify information before posting.





No proof fake news can alter a person's political views: Expert
Two overseas experts give different takes on the power of disinformation
By Yuen Sin, The Straits Times, 17 Mar 2018

There is no concrete proof that online falsehoods can change people's political views, though these can be used to shape political agendas by creating false impressions, said a political data expert from Germany.

"And there is even evidence against it," Mr Morteza Shahrezaye of the Bavarian School of Public Policy, Technical University of Munich, told the parliamentary Select Committee on fake news yesterday.

It is not that hard to identify fake automated online accounts, and a person can spot one easily if he comes across, say, a Twitter account with 10,000 tweets in one day, he said. He and his colleague, Associate Professor Simon Hegelich, argued in a joint submission that fears of orchestrated attempts to transform political opinion on social networks are exaggerated.

"It is very unlikely that anyone is changing his or her mind on important political issues just because of some suspicious accounts in social media," they wrote.



Mr Shahrezaye also cited a study by Harvard University's Berkman Klein Centre for Internet and Society on the 2016 United States presidential election. It found that just two of the top 100 most shared election stories on Twitter and Facebook were fake, and that these did not have a significant impact in swaying opinions.

But the researcher acknowledged that political agendas can be influenced by online fabrications. It is easy to create the impression that an opinion is very popular online, he said. "Journalists, politicians or normal citizens might fall for these wrong trends and comment on them, thereby making them even more popular," he said.

Mr Shahrezaye called for social media firms to be more transparent by explaining the goal of their algorithms, which determine which posts become more popular.

Law and Home Affairs Minister K. Shanmugam, a committee member, said the aim of these is to maximise profitability. Calling this a classic case where commercial and public interests may clash, he said: "And we will then have to decide what is in the public interest and see how their commercial interests and the public interest can be coincided, right? That is the task of every government." 

Mr Shahrezaye concurred.









Indonesian group gears up to battle fake news during polls
By Seow Bei Yi, The Straits Times, 17 Mar 2018

Regions with greater racial and religious tensions would likely see the spread of more falsehoods during elections, the founder of an Indonesian anti-hoax community said yesterday.

"In every election, (particularly) in elections where there are racial and religious tensions, (the propensity for) misinformation is much higher," Mr Septiaji Eko Nugroho, founder of Mafindo, told the Select Committee on deliberate online falsehoods on the third day of its public hearings.

He was replying to Select Committee member Sun Xueling, an MP for Pasir Ris-Punggol GRC, who asked if past elections whose outcomes were affected by fabricated news are an indication of what could happen in upcoming polls in Indonesia.

She cited former Jakarta governor Basuki Tjahaja Purnama, nicknamed Ahok, who was accused of blasphemy after an edited video appeared to show him insulting the Quran in 2016. He subsequently failed in his bid to be re-elected as Jakarta governor.



Noting in his written submission that Mafindo saw a spike in disinformation at every major election, Mr Septiaji said he expects to see a similar trend as Indonesia gears up for a series of polls culminating in the presidential election next year.

Mafindo, short for Masyarakat Anti Fitnah Indonesia, is launching what it calls hoax crisis centres in three provinces next month. Such centres will bring together stakeholders including the police, election supervisory board, netizens, academics and community leaders, he said.



Mr Septiaji said another possible measure is a customised search engine, built on the Google platform, that lists only legitimate sites such as those of registered media organisations.

"This is the approach we are going to advocate to the people - using a clean search engine," he added.

While Mafindo already has 300 volunteers in 15 cities, he said there was a need to involve more people on the ground, and to work closely with journalists, the authorities and other parties.










Of personal choice and Government's role
Protect people from falsehoods or let them face the risks - experts weigh in on question
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 17 Mar 2018

Is it the Government's duty to save or protect people from themselves? Or should people make their own choices - damn the consequences?

This age-old question bubbled under the surface of discussions at the Select Committee's hearings yesterday as speakers discussed how Singapore should move to counter the problem of online falsehoods.

But at its heart, the question is really about autonomy.

Commentators here and abroad have criticised Singapore for being too paternalistic, often citing the chewing gum ban and the sin taxes meant to deter smoking and drinking.

They will no doubt also debate whether the Government has a duty or role to venture into the area of tackling deliberate online falsehoods.

Removing false news the moment it appears requires that it first be determined to be false by some expert or authority.

But there are those who believe they should have the right to access the information first, and then decide if they want to believe it.

On the other hand, others will argue that people may not be equipped or knowledgeable enough to make that call. And even if they are, what is acceptable to them might not be appropriate for the common good.

The proposals that various speakers outlined yesterday revealed their philosophical positions on this issue.

Mr Morteza Shahrezaye of the Technical University of Munich, for instance, argued for a balance between private freedoms and public interest.

He believes that government should step in to hold social media platforms responsible for content they allow to be spread - but only after it has been found to be false. If the algorithms that a platform used resulted in such an article being distributed more widely, then the platform should be fined for each click made on that post, he said.

Mr Shahrezaye's idea of allowing potentially false information to remain in the public domain until it is shown to be false may appear at odds with the thinking of the security experts who spoke in the past few days. But he argued that his was a balanced approach between allowing people to post and read what they want, and protecting society.

"To judge everything from the perspective of the higher common good would mean the end of personal freedom. But unlimited personal freedom would destroy society," he said in a written submission to the committee which was read out yesterday.

Others, like National University of Singapore communications and new media assistant professor Elmie Nekmat, said it is necessary to take steps, such as through legislation, to prevent false news from spreading widely.

Yet this would not, in the end, be enough to address a more pertinent issue: the ability of an individual to determine, on his own, the veracity of a piece of information he comes across, and to tell what is real and what is not.

Dr Elmie favoured solutions that would include education, and the means to quickly verify the accuracy of information one comes across, whether on websites or in small and closed groups such as on WhatsApp chat groups one belongs to. This is especially crucial for the elderly.

Such a solution could in fact come from technology companies.

Ms Myla Pilao of Trend Micro, a cyber-security and defence company, said technology already exists to spot false information put out by networks of compromised computers.



In her presentation, the director of the company's Forward-Looking Threat Research team said that social media platforms and Internet service providers should use such technology to red-flag falsehoods to their users.

But she added: "At the end of the day the responsibility rests with the user whether to push the button or otherwise."

Indeed, any laws to stem or remove online falsehoods are unlikely to totally eradicate them.

In the end, it will still be the individual, sitting at his terminal or with his tablet or phone, who has to make the decision to click on a link, read the news and choose whether to believe it. And how do you legislate for that?





Indications of information warfare against Singapore, say two experts
By Lester Hio, The Straits Times, 17 Mar 2018

Some indicators have emerged in recent months that an information warfare campaign against Singapore is under way, two experts said in separate closed-door sessions of a Select Committee hearing yesterday.

It involves an unnamed country that is trying to influence opinion through news articles and social media to legitimise its actions on the world stage.

Singapore, they added, is not yet fully prepared to handle such disinformation campaigns, which involve cyber attacks and undermining trust in institutions such as the police.

The two experts who spoke to the Select Committee on deliberate online falsehoods are Dr Gulizar Haciyakupoglu and Dr Damien Cheong, both research fellows at the S. Rajaratnam School of International Studies (RSIS).

What they said was summed up in a media statement, which sketched how information warfare is waged against some countries.



Dr Haciyakupoglu, who is from RSIS' Centre of Excellence for National Security, said countries that mount disinformation campaigns do not distinguish between wartime and peacetime, and between what is military and what is not. "Nothing is off the table."

Behind closed doors, she identified a country that has made efforts to infiltrate another society by using civilians in the latter. The Straits Times understands that the society referred to is not Singapore.

She said the efforts were made in three ways: Manipulating the media by using media professionals from mainstream media and content creators in social media; using a state agency to spread influence through businessmen, students, academics and other groups; and by carrying out cyber attacks with the help of that society's civilians.

These cyber attacks included malware attacks and invasions that caused a website to crash.

She noted that cyber attacks in the recent past had targeted sensitive ministries in Singapore.

Dr Cheong, from the National Security Studies Programme, said the goal of a state-sponsored disinformation campaign is to destabilise government and society. Singaporeans, knowingly or unknowingly, could be involved in such campaigns or share untruths without malicious intent.

He said Malaysia and Indonesia have cyber armies that could be deployed against Singapore directly or as proxies for other nations.

He called for public and private measures to prepare for a response. Substantial changes should also be made to laws to counter this threat.









Are Singaporeans vulnerable to fake news? 5 key themes from the public hearings on deliberate online falsehoods

Over three days last week, a group of men and women huddled in the new public hearing room in Singapore's Parliament House to discuss the issue of disinformation. A total of 24 speakers - from countries as far away as Ukraine, from nearer neighbours such as Indonesia, and from Singapore itself - shared their experiences and research. They also shared suggestions on how the Republic can deal with the "threat of our times". Here are five key themes that emerged during the hearings, as well as through separate interviews with Insight.
By Ng Jun Sen, Political Correspondent and Seow Bei Yi, The Sunday Times, 18 Mar 2018


1. SINGAPOREANS VULNERABLE TO FAKE NEWS?

Fake news is as attractive to consume as alcohol, sex and chocolates, with the truth coming a distant second, said Dr Carol Soon of the Institute of Policy Studies (IPS).

"Why do falsehoods gain traction? It is because they tend to be sensational and emotional," the senior research fellow told Insight.

People are psychologically wired to seek out untrue information that reaffirms their beliefs, Dr Soon told the Select Committee on fake news.

As with many who presented their findings, she said Singapore's "pain points" are likely to be issues about race, language and religion. Falsehoods can be deployed for insidious purposes, such as wreaking havoc between different communities, said Dr Soon.

Similarly, National University of Singapore's Assistant Professor Elmie Nekmat spoke about how falsehoods that are spread in Malay or Mandarin are a concern as language is closely linked to religion.

Meanwhile, many speakers showed that falsehoods affect all parts of the political spectrum.

Dr Kevin Limonier of the French Institute of Geopolitics at the University of Paris 8 discussed a map of how people of varying ideologies and languages shared a similar type of false content on social media.

While Dr Soon's research showed that those on the extremes of the political spectrum were more vulnerable, Law and Home Affairs Minister K. Shanmugam said: "My concern is not so much with people with strong political beliefs. It's more group identity based on racial and religious lines, and that's where I would be focusing on when we come to what we need to do."

IPS researcher Mathew Matthews said that even among people whose beliefs were not strongly held, fake news can "amplify" these over time and with repeated exposure - the "slow-drip effect".

But one academic said there is no evidence that fake news can fundamentally change views. Mr Morteza Shahrezaye of the Bavarian School of Public Policy said fears of orchestrated attempts to transform political opinion on social media are exaggerated.

Whether Singaporeans are susceptible to disinformation or not, media literacy efforts should be part of measures to combat fake news, said many speakers.



2. WARFARE WITH NO BULLETS BEING FIRED

When defence specialist Michael Raska was in a taxi here in 2016, the driver said something unexpected. Recalls Dr Raska of the S. Rajaratnam School of International Studies (RSIS): "He openly said that while he is Chinese Singaporean, essentially, our hearts and minds are connected with China."

This was during the Terrex incident, when nine Singapore Armed Forces Terrex infantry carrier vehicles were seized by Hong Kong Customs officials. They were returned last year. "I don't know if his conclusion came from fake news, but it could be an example that fake news can change your identity and shape it to a particular direction," Dr Raska told Insight.

The use of disinformation to promote political aims is not new, and could be used by countries to intrude on another nation's sovereignty and win conflicts without a single bullet, said Dr Raska.

Head of RSIS' Centre of Excellence for National Security Shashi Jayakumar described the scourge of falsehoods as "the threat of our times" that could be more dangerous than terrorism. Singapore could be a "sandbox for subversion" due to its smart nation push, he said. "Any state actor (seeking) to influence Singapore can use the means that are already here, the infrastructure of our smart nation, our social media penetration, our broadband usage and so on."

RSIS' Dr Gulizar Haciyakupoglu testified to the committee in a closed session that there were signs such efforts have been deployed against Singapore in recent months, with a country putting its narrative through news articles and social media to influence minds and legitimise its actions.

In another private hearing, Dr Damien Cheong also shared how a state-sponsored campaign can destabilise the government and society of a target country, describing how Singaporeans would be unknowingly involved in spreading disinformation.

Such campaigns have taken a toll on countries that contend with alleged Russian interference, such as France, Latvia and the Czech Republic, speakers pointed out.

Ukraine, for example, was unprepared for Russian disinformation in support of pro-Russian separatists, contributing to Crimea's annexation, said Kyiv Mohyla School of Journalism executive director Ruslan Deynychenko. He told Insight: "You have to be aware that it might happen. I and my friends, no one believed it was possible that we could ever have military conflict with Russia. We had nothing to fight about, and suddenly war happened."







3. FREE SPEECH V PROTECTING SOCIETY

Does government action to curb disinformation impinge on freedom of speech? Or does it, in fact, protect it? This was vigorously discussed in the first week of hearings by the Select Committee on Deliberate Online Falsehoods.

Director of the European Values think-tank Jakub Janda noted that government action could "clash with various concerns over freedom of speech", making it tough for governments in many countries to implement such action. While he believes civil society should play a primary role in curbing falsehoods, he added that the authorities should also have the mandate to conduct investigations and alert the public to disinformation efforts - particularly in major incidents targeting a country's internal security or the integrity of elections. Such measures would sit alongside any legal frameworks a country chooses to adopt, he said.

However, Dr Ben Nimmo, a senior fellow at the Atlantic Council, an American think-tank, said: "I am very wary of any legislative proposal, anywhere in the world, which would allow politicians to order the social platforms to change the content of their platform, because the precedent for countries which are hostile to democracy would be very, very alarming."

He was responding to Select Committee member Pritam Singh, who asked if he was proposing using laws to deal with fake social media accounts when he suggested governments should work with platforms to shut down such accounts.

Researcher Morteza Shahrezaye of the Technical University of Munich said systems like Germany's, where social media platforms are required to take down illegal content after being notified, could be vulnerable to manipulation as political opponents may systematically flag posts they do not agree with.

Discussing the varying responses, Home Affairs and Law Minister K. Shanmugam, also a member of the committee, said it is a "question of which philosophy you prefer". There is a balance to be struck between somebody's right to propagate falsehood and a society's right to make sure that there is peace and harmony, he said.

The right to express views needs to be protected, including from deliberate online falsehoods, he said. He added that dealing with fabrications, in fact, safeguards and enhances free speech. Spreading disinformation, such as via bots, to mislead others is "the very antithesis of free speech", he said.









4. NEW LEGISLATION THE WAY FORWARD?

Whether a new law is the way to go to combat disinformation drew mixed reactions from speakers.

Some, like Singapore Management University law dean Goh Yihan, pointed to gaps in existing laws and called for new legislation allowing the Government to quickly remove or prevent access to online falsehoods.

He told Insight: "The purpose of legislation is always twofold - one, you can use it to get someone to do something like an apology; second, it sends a message as to what is the right kind of conduct online."

Germany sent this message with its recent Network Enforcement Act, which holds tech companies to account, said Dr Shashi Jayakumar of the S. Rajaratnam School of International Studies.

"It is a reminder to the big companies, Facebook in particular, that their community standards are not the same as law. The key thing in the German saga was to emphasise the networks have to comply with national law. That itself is very important," he said.

But some speakers noted the difficulties of enforcing a law against fake news.

After all, fake news is difficult to define since it can be mixed with truths, said Dr Ben Nimmo of the Digital Forensic Research Lab of the Washington-based Atlantic Council.

Disinformation could also consist entirely of one-sided material, which would be a breach of journalism standards. But that would not be considered false, Dr Nimmo told the committee on Thursday.

"There are so many grey areas here. Just the preamble to your legislation is going to be the size of the Oxford English Dictionary," he said.

Law and Home Affairs Minister K. Shanmugam disagreed, arguing that "there are items which are completely manufactured and totally untrue, which are legally very easily identifiable". He added that any new laws need not be the sole solution and there could be others with "differentiated outcomes".

This need for a nuanced view on legislation was shared by most speakers, including Dr Goh, who told Insight that legislation may not always be the best method.

Legislation must be balanced with judicial oversight and be complementary to other non-legislative means, he said, adding: "Sometimes, you can't just take down things that have gone viral. What might be more effective is to spread the truth as well. You can use legislation to compel one to do that, but often times, it will be regular people who fight against falsehoods by spreading the truth out of their own volition."



5. DEBUNKING FROM THE GROUND UP

At the peak of the Ukraine crisis in 2014, people from all walks of life asked what they could do to guard their country from foreign disinformation.

Following a Facebook post that suggested a get-together to discuss solutions, a website was created.

This was the origin of Stopfake.org, a project aimed at verifying information and refuting propaganda in Ukraine, its co-founder Ruslan Deynychenko told Insight. He was among the foreign experts at last week's hearings. Although Stopfake.org started out with journalists and IT specialists, among others, who wanted to do something for their country at a time of the annexation of Crimea and war in eastern Ukraine, it grew to become an "information hub" analysing Kremlin propaganda, says its website.

"One of the achievements of debunking the story is that regular news organisations started to be more responsible... They try to fact check before publishing," said Mr Deynychenko.

Ground-up efforts like this can help combat falsehoods, noted experts during last week's hearings.

Besides potential legislation, such efforts, along with the need to strengthen media literacy, were among counter-measures suggested.

Highlighting that the best fact-checking platforms in some countries are run by citizens, journalists or a coalition of both, academic Shashi Jayakumar said Singapore could consider establishing a body that uses grassroots participation to counter disinformation. "In many instances, it is the citizenry and journalists who are better placed to act, and to act quickly," he said.

Fact-checking efforts with schools' and students' help could also have the effect of enhancing media literacy, said Dr Carol Soon of the Institute of Policy Studies. She added that non-government entities like the mainstream media can play a role in fact-checking as well. However, this should complement, and not substitute, government-led efforts.

But some, like Assistant Professor Elmie Nekmat of the National University of Singapore, argued that too much emphasis on educational measures to counter falsehoods could "downplay the importance of legal measures".

"Regulations and education-based efforts are both necessary to establish short-term protection towards building long-term resilience to safeguard society from online falsehoods," he said.



