Saturday 24 March 2018

Public Hearings on Fake News: 22 - 23 March 2018

Day 4: 22 Mar 2018

Social media execs face panel on fake news
Facebook admits it should have told users earlier about breach of policy
The Straits Times, 23 Mar 2018

Facebook executives were grilled yesterday by members of the parliamentary Select Committee on deliberate online falsehoods.

Law and Home Affairs Minister K. Shanmugam took the tech giant to task for not telling its users soon after it found out their data was breached by political consultancy firm Cambridge Analytica.

He said Facebook fell short of its own pledges on transparency, which calls into question whether it can be relied on to fight false news on its own.

Facebook Asia-Pacific vice-president of public policy Simon Milner admitted it did wrong in withholding the information, but said steps have been taken to fix the problem, and it is serious about cooperating with governments to fight disinformation.

The company, along with Twitter and Google, had suggested new legislation was not necessary in Singapore. But the committee said other experts at earlier hearings had pointed out gaps in existing laws.

Facebook chief executive Mark Zuckerberg has acknowledged the company should have done better in handling the breach. Outraged US and British lawmakers have opened investigations into the scandal that came to light in the past week.

Cambridge Analytica scandal: How Facebook data helped Trump win over voters
The Straits Times, 22 Mar 2018

WASHINGTON • It was one of hundreds of cute questionnaires that were shared widely on Facebook and other social media platforms, like "Which Pokemon are you?" and "What are your most used words?".

This one, an app called "thisismydigitallife", was a personality quiz, asking questions about how outgoing a person is, how vengeful one can be, whether one finishes projects, worries a lot or is talkative.

About 320,000 people took the quiz, designed by a man named Aleksandr Kogan, who was contracted to do it by Cambridge Analytica, founded by United States Republican supporters including Mr Steve Bannon, who would become the strategist for Mr Donald Trump.

As Dr Kogan's app was circulated via Facebook, it reaped far more than just the information on those who took the test.

At the time, in 2015, such apps could scrape up all the personal details of not only the quiz-taker, but also all their Facebook friends.

That ultimately became a hoard of data on some 50 million Facebook users - their personal information, their likes, their places, their pictures and their networks.

Marketers use such information to pitch cars, clothes and vacations with targeted ads. It was used in earlier elections by candidates to identify potential supporters.

But for Dr Kogan and Cambridge Analytica, it was a much bigger goldmine. They used it for psychological profiling of US voters, creating a powerful database that reportedly helped carry Mr Trump to victory in the 2016 presidential election.

The British-based political consultancy firm says its work with data and research allowed Mr Trump to win with a narrow margin of "40,000 votes" in three states, providing victory in the Electoral College system despite losing the popular vote by over three million votes, according to Slate, an online magazine.

The data let the Trump campaign know more than perhaps anyone has ever known about Facebook users, creating targeted ads and messaging that could play on their individual biases, fears and loves - effectively creating a bond between them and the candidate.

The data collected via Dr Kogan's app generated an incredible 4,000 or more data points on each US voter, according to Mr Alexander Nix, Cambridge Analytica's chief executive before he was suspended on Tuesday. The output was put to work in what Mr Nix called "behavioural micro-targeting" and "psychographic messaging".

Simply put, the campaign could put out messages, news and images via Facebook and other social media platforms that were finely targeted to press the right buttons on an individual that would push him into Mr Trump's voter base.

For Mr Trump, it worked.


Doubts whether social media firms can help to fight fake news: K. Shanmugam
Data breach scandal shows Facebook has fallen short on transparency: Minister
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 23 Mar 2018

The conduct of Facebook in the data breach involving Cambridge Analytica gives the Government reason to question whether the social network can be trusted to cooperate in the fight against online falsehoods, said Home Affairs and Law Minister K. Shanmugam on Thursday (March 22).

On the fourth day of the Select Committee's hearings, representatives from Facebook as well as Twitter and Google were asked about their statements and actions, as the committee looked to their track record to determine if they will be reliable partners in countering fake news. In their submissions to the committee, the firms said there are enough laws in place to tackle the problem in Singapore, without new legislation.

Facebook vice-president of public policy for Asia-Pacific Simon Milner found himself in the hot seat, with the hearings here happening just days after revelations that political consultancy firm Cambridge Analytica had exploited the private information of 50 million Facebook users.

Dwelling for some time on this matter, Mr Shanmugam said it was his view that Facebook had fallen short of its professed standards of transparency in handling user data.

The data breach took place in 2014 and Facebook knew about it in 2015, but it was not until a few days ago that it admitted to it.

Mr Milner, who has given evidence to Parliaments in three other countries, had told British MPs last month that Cambridge Analytica did not have Facebook data.

Pointing to this, Mr Shanmugam said Mr Milner should have come clean about the matter. The fact that he did not made it reasonable to conclude that Facebook had deliberately sought to mislead the British Parliament, as that committee's chair Damian Collins had suggested, the minister said.

Rejecting this characterisation vigorously, Mr Milner said he had answered truthfully based on the information he had at that point.

But he conceded that others could leave with the same impression as Mr Collins, and admitted he should have given a fuller answer in hindsight.

At one point on Thursday, Mr Milner questioned the relevance of various news articles and statements by Facebook executives he was being asked to comment on and looked to committee chairman Charles Chong to step in.

He added that the overall impression created was of a firm that does not care about the problem and is doing nothing to address it, when in fact Facebook has made huge investments in the area and will double the number of people working on security to 20,000 by the year end.

"Please don't just read articles like that," he said, urging the committee to consider a more complete range of information, including the social network's paper on information operations, and comments by its chief Mark Zuckerberg.

To this, Mr Shanmugam said Facebook's conduct elsewhere and expert opinions of the firm are highly relevant in figuring out if the firm would be voluntarily helpful or if the Government would have to intervene, such as through legislation.

It was only fair for Facebook to be given the opportunity to respond, if this information could help with the Government's decision later.

He said it was his view that the company had not behaved responsibly thus far, adding that "we will have to wait and see what you do".

Mr Milner disagreed but said it was fair to hold Facebook to account: "This has been a tough Q&A, I respect that you are asking questions that need to be answered and we as a company need to be accountable to you and your colleagues and other policymakers, most importantly, the community of 2.2 billion, including some 4.1 million people here in Singapore, about how we protect their data, how we keep it secure and when things go wrong, how we tell them and you about it."

Another reason to dwell on what Facebook has done elsewhere is that online falsehoods could affect national security, and the meddling in elections in the US could well happen here, said Mr Shanmugam.

"We know our position as Singapore in the world. We are not the United States of America. If a very senior legislator in the US feels that you are not being cooperative, then how do we expect that you will cooperate with us? But these are issues that we are entitled to explore," he said.

Singapore wants technology companies to succeed and considers them partners in the fight against online falsehoods, but this does not mean the Government has to accept their claims that what they can do, or have done, is enough to fix the problem, he added.


The Straits Times, 23 Mar 2018

Here are edited extracts from the lengthy exchange between Law and Home Affairs Minister K. Shanmugam and Facebook vice-president of public policy for Asia-Pacific Simon Milner.

Mr Milner: "This committee is looking into the issue of deliberate online falsehoods here in Singapore. Myself and my colleague and other people on this panel have come here prepared to answer questions about and to help the committee understand it.

"I don't think it is fair to ask me detailed questions about evidence given by my colleague to a different Parliament in a different country about activities associated with that country... I am really trying to understand why we aren't talking about the issues in Singapore, about the deliberate online falsehoods here, about what our companies are doing about this... I really respectfully suggest... if you want to get to something, get to it, and let's have other people answer some questions."

Mr Shanmugam: "The questions before the UK Parliament were very relevant in exploring the degree to which you can be trusted, Facebook can be trusted to answer questions when asked, Facebook can be trusted to be a reliable partner, that a government of Singapore can depend on Facebook to tell us the truth, the whole truth and nothing but the truth in proceedings where the witnesses are sworn, or whether you will do everything you can to give lawyers' answers or lawyered answers.

"As I told you earlier, one looks at the sequence of conduct from 2015 to 2018, and the very first time you accept the responsibility for Cambridge Analytica publicly, when did that happen and why did that not happen earlier?

"And to what extent can we take seriously all these protestations that you can be completely trusted to apply your internal guidelines? It is very relevant.

"And if you thought that you could turn up here today, not answer questions on Cambridge Analytica and explain your answers today with your answers less than five weeks ago to a different Parliament - we are all sovereign Parliaments, but we look at your conduct all around the world and we have to understand.

"Second, why are we looking at these answers? We are looking at our national security, the consequences we have.

"By looking at your answers elsewhere, it is clear and you have confirmed you will not decide whether something is true or false, you will not take down something simply because it is false.

"You will take it down if there is a legal obligation on you and your argument, up to very recently, through the written representations, through the public statements, through all public positions that you have taken, in essence is that you will prefer to be regulated yourself with your internal guidelines - that is my sense of it, if I am wrong, I am wrong - and that you do not want to be regulated."


Mr Milner responding to Mr Shanmugam, who had asked for his comments on statements made in various articles on Facebook:

"I agree, if you were just to read all those articles, I would be really worried about Facebook if I just read those articles.

"But please don't just read articles like that, also read our evidence, read our information operations paper, read the whole of (Facebook general counsel Colin Stretch's) evidence to that committee, read the commitments by our CEO Mark Zuckerberg... I can't imagine any reasonable person who could read all of that and conclude that we can't work with or trust this company."

Facebook probing if data breach affected Singapore users
By Seow Bei Yi, The Straits Times, 23 Mar 2018

Facebook is investigating if any of its Singapore users' personal information was inappropriately obtained and shared with British political consultancy Cambridge Analytica.

Affected users will be informed, the tech giant's Asia-Pacific vice-president of public policy, Mr Simon Milner, said yesterday at a Select Committee hearing on deliberate online falsehoods.

He also said Facebook is looking into whether there are other data breaches involving app developers.

His remarks come in the wake of recent media reports about Cambridge Analytica, which is at the centre of a scandal in which it is accused of exploiting the data of more than 50 million Facebook users for commercial and political use.

Cambridge University researcher Aleksandr Kogan had used an app to extract the users' information. By allegedly accessing user profiles, the firm could infer the political preferences of United States voters and target personalised messages at them to benefit Republican candidate Donald Trump.

Home Affairs and Law Minister K. Shanmugam repeatedly questioned Mr Milner about Cambridge Analytica yesterday, in a lengthy and, at times, heated exchange.

Mr Milner conceded at one point that Facebook "got it wrong" and should have informed users about the data breach, noting that it had a "moral obligation" to do so.

He echoed a post yesterday by Facebook chief executive Mark Zuckerberg, which admitted there was "a breach of trust between Facebook and the people who share their data with us and expect us to protect it".

Asked by Mr Shanmugam yesterday if Facebook could be expected to have done more in ensuring the data Cambridge Analytica took inappropriately had been deleted, Mr Milner said: "Yes, given the actions we are now taking."

Mr Shanmugam also questioned if Mr Milner had been "careful and economical" with the truth when he told British MPs last month at a Select Committee inquiry into fake news that Cambridge Analytica did not have Facebook data.

To this, Mr Milner stressed repeatedly that his answers were accurate based on what he knew at the time.

The consultancy had given Facebook a sworn affidavit saying it had no Facebook data, Mr Milner said. He later conceded that in hindsight, he should have "provided a fuller answer to the committee and made them more aware of what we understood to be true".

Mr Shanmugam also asked why Facebook did not verify the certification from Dr Kogan that he and Cambridge Analytica had deleted the data it obtained.

Mr Milner replied: "That is one of the lessons for us, in terms of why we are now going to audit all other apps and not just take their affirmation... that they have deleted data or not passed it on."

Political ads: Facebook may curb foreign currency use
By Ng Jun Sen, Political Correspondent, The Straits Times, 23 Mar 2018

Tech giant Facebook is prepared to consider banning foreign currencies from being used to pay for Singapore political advertising on its platform, its vice-president of public policy for Asia-Pacific, Mr Simon Milner, said yesterday.

"We do not as yet have that policy," he added, when responding to Law and Home Affairs Minister K. Shanmugam at the Select Committee hearing on deliberate online falsehoods. He agreed with Mr Shanmugam that Facebook should take actions to ensure only people based in the country can buy ads meant for the country, and said he expects the currency measure to be part of "the range of things that we do to preserve the integrity" of elections.

The policy, however, will apply only in countries that forbid foreign influence in domestic politics, such as Singapore, he said.

Facebook's inability to trace sources of funding for political ads has come under fire globally, following revelations that tech giants - including Twitter and Google - could have been exploited by Russia to sway voters. Facebook, in particular, had accepted political ads in the 2016 US presidential election that were paid for in Russian roubles.

Asked about it at a US Senate judiciary hearing last year, Facebook's vice-president and general counsel Colin Stretch did not commit to banning political ads bought in foreign currencies, saying that currencies did not necessarily indicate the source country of an ad.

Mr Milner said: "It is really difficult to define what is a political ad. Most of the ads, in the case of the US, may not have been classified as political ads under the jurisdiction there because they did not endorse a candidate."

Responding, Mr Shanmugam said: "So, if we define what is a political ad for you, then you will do it?"

Yes, said Mr Milner. He later said the type of political ad would have to be narrowly defined, as a move against foreign currency payments may affect non-state organisations.

"Remember, we have international organisations, human rights organisations, that will often seek to advertise on our platform to highlight human rights abuses," he said, adding that he was referring to other countries.

Highlighting Facebook's recent efforts, he said his company is piloting a "transparency exercise" in Canada, and plans to use it in the US mid-term elections later this year.

It requires Facebook page owners and administrators who want to put up ads to verify their details in a note sent by snail mail to Facebook.

"It will be available globally. By shining a spotlight, we know that is often the best antidote to malfeasance and bad behaviour, and that is what we will be doing," he added.

Representatives from Google and Twitter set out steps they have taken to improve their policies and add new tools to combat deliberate online untruths. Google's Asia-Pacific news lab lead Irene Jay Liu said its "fact-check tag" shows certain news stories have been checked by news publishers and fact-checking entities.

Google is studying how to discern between low- and high-quality content, using algorithms to tell facts from falsehoods, she told Minister for Social and Family Development Desmond Lee.

Twitter's director of public policy and philanthropy in Asia-Pacific Kathleen Reen said it has made "an enormous amount of progress" since the US presidential election.

She highlighted Twitter chief executive Jack Dorsey's call for proposals this month to look into bad behaviour on the platform, with the firm sharing its data and funding researchers to join the effort.

Tech giants argue against more laws to tackle fake news
However, they concede there could be gaps in Singapore's laws for quick action to be taken
By Yuen Sin and Seow Bei Yi, The Straits Times, 23 Mar 2018

Tech giants Facebook, Google and Twitter yesterday argued against the need for additional legislation to tackle the threat of online untruths, saying they are already taking steps to address the issue.

The companies told the parliamentary Select Committee on deliberate online falsehoods that they have been investing heavily in technology and schemes.

This includes developing algorithms that can flag less trustworthy content and prioritise authoritative sources, as well as partnerships with non-profit organisations that help them identify and take down offensive material.

"Prescriptive legislation will not adequately address the issue effectively due to the highly subjective, nuanced and difficult task of discerning whether information is 'true' or 'false'," Mr Jeff Paine, managing director of the Asia Internet Coalition (AIC), wrote in his submission to the committee, adding later that multiple stakeholders have to be engaged instead of rushing to legislate. The AIC, an industry association of technology companies, counts LinkedIn and Apple among its members.

Ms Kathleen Reen, Twitter's director of public policy for Asia-Pacific, said in her written submission that "no single company, governmental or non-governmental actor should be the arbiter of truth".

However, Mr Paine conceded during yesterday's hearing that there could be gaps in Singapore's existing laws for quick action to be taken against online falsehoods, when quizzed further by Select Committee members Law and Home Affairs Minister K. Shanmugam and Social and Family Development Minister Desmond Lee.

Speaking to a panel of representatives from Facebook, Twitter, Google and AIC, Mr Lee questioned the ability of technology companies to self-regulate.

He cited how YouTube had not completely removed a 2016 video by banned British white supremacist group National Action after more than eight months, even though British Home Affairs Select Committee chairman Yvette Cooper had flagged it multiple times over the past year.

"Their experience is something that we look at with concern, being a much smaller jurisdiction... even in clear-cut cases, there has been inaction," Mr Lee said.

Mr Shanmugam noted that there can be a difference between what countries and social media platforms may tolerate.

He referred to a post on Twitter with the hashtag #DeportAllMuslims, which was accompanied by a graphic cartoon of a topless mother, surrounded by toddlers of varying ethnicities. The picture was titled "The New Europeans". The tweet had not been taken down even after being flagged, despite its offensive nature, he said.

"This was not a breach of Twitter's hateful conduct policy. If this is not a breach... I find it difficult to understand what else can be."

He told the tech industry representatives: "The various beautiful statements you made... (have) to be tested against reality... For us in Singapore, this is way beyond what we would tolerate."

Facebook's Asia-Pacific vice-president of public policy Simon Milner pointed to difficulties in coming up with policies to tackle deliberate online falsehoods.

He highlighted that due process will be needed for a policy against online untruths, which is unlike "making a judgment on hate speech, or terrorism, or child sexual abuse - all the other areas of policy that we deal with".

"It is not that we are trying to abdicate our responsibilities, it is the particular notion of the kind of due process you require in order to be fair to people... that I think is more problematic for us than other policy areas," said Mr Milner.

He said that this is why using machine learning or proxies to nip the problem in the bud - a system that is still being tested - is what the platform considers to be the right approach.

Telcos highlight limitations to tackling online falsehoods
By Yuen Sin, The Straits Times, 23 Mar 2018

As Internet service providers, telcos Singtel and StarHub said they do not have the tools to monitor and selectively block the content of third-party materials sent through their networks.

For this reason, they are unable to adequately tackle deliberate online falsehoods on their own, both telcos told the parliamentary Select Committee on fake news yesterday.

Measures they have taken, like blocking a website, are also "blunt instruments" that cannot specifically target the root of the problem.

These limitations were highlighted in their oral and written representations to the committee.

But they agreed that a multi-pronged approach should be taken in addressing the issue, including public education and legislation.

Explaining its limitations, StarHub's head of regulatory affairs, Mr Tim Goodchild, said that while it can, say, block customers' access to a single domain like Twitter or Facebook, it will not be able to restrict access to individual tweets or posts.

It is also possible for users to circumvent such measures, he added.

The situation is further complicated by the speed at which such posts can travel and become viral, he said when replying to Senior Minister of State for Communications and Information Janil Puthucheary, a committee member.

Mr Sean Slattery, Singtel's vice-president in charge of regulatory and interconnect matters, agreed.

He also said the role of public telecommunications licensees, like his company, is not to play "judge and jury" over content, especially in the light of heightened privacy and security laws and the push to encrypt content.

In fact, the Telecommunications Act prohibits such a licensee from familiarising itself with the content of a message.

But both telcos agreed fake news is a pressing issue. Singtel itself has been a victim of commercial scams out to obtain personal information or money. "There is very little recourse for us," said Mr Slattery.

Public education is just one element of the solution, both representatives told Dr Janil.

Unlike tech giants such as Facebook and Twitter, both telcos said extra legislation may be needed to effectively fight deliberate online falsehoods.

"They are platforms that reach billions of people. If they want to be socially responsible, they do have an obligation to contain the spread of deliberate online falsehoods," said Mr Yuen Kuan Moon, Singtel's chief executive of Singapore consumer business.

15-year-old is panel's youngest witness to date
By Ng Jun Sen, Political Correspondent, The Straits Times, 23 Mar 2018

When 15-year-old Zubin Jain wrote to a high-level parliamentary committee and outlined his views about the fake news scourge, he did not tell anyone.

His parents were "utterly shocked" when he told them last month that he was invited to appear before the Select Committee on deliberate online falsehoods - the panel's youngest witness to date.

Said his mother Asiya Bakht, 44: "We were scared. I asked if it was okay to testify. Will he be in trouble? Has he lied? Because in the e-mail (to the committee), he said he might also have spread online falsehoods."

Her worries were misplaced: Committee members praised his bravery in stepping forward to give evidence.

The Grade 10 student at United World College of South East Asia confessed that he did not know what to expect. "I had my friend read out my submission this morning, and there were so many spelling errors that I thought they called me here to correct my spelling. That was my secret fear," he told The Straits Times yesterday.

Zubin, who blogs about economics and politics, said he decided to write to the committee after arguments with friends and family whose views and beliefs were often shaped by false information they had come across. Such beliefs could be hard to correct.

He once argued with an aunt who believed in homeopathy, an alternative treatment, and questioned the sources of her information: "But she said homeopathy worked for her many times. You can't really argue with that even with scientific evidence."

To this, Madam Bakht shook her head, saying: "That is him - too much fact-checking."

He told the committee that any legislation against fake news should not target individuals who do not have a clear malicious intent. "As someone who writes and blogs online, I did not want my own habits and my freedom to speak get taken away," he said.

Robust exchanges as minister grills senior Facebook exec
By Elgin Toh, Deputy Political EditorThe Straits Times, 23 Mar 2018

Technology companies felt the full force of Mr K. Shanmugam's experience as a litigation lawyer yesterday, as the Select Committee on deliberate online falsehoods held its fourth day of hearings.

The Law and Home Affairs Minister dominated proceedings after the lunch break, with a lively, robust exchange with Facebook's vice-president of public policy for Asia-Pacific Simon Milner.

Grilling Mr Milner on the Cambridge Analytica scandal that Facebook apologised for in the early hours of yesterday morning, Mr Shanmugam painted a picture of the social media giant being less than candid in its disclosures to its own users, and to a British parliamentary select committee, as it covered up a major breach of its data affecting 50 million users.

Perhaps fortuitously for the committee, the scandal unfolded right in the middle of its hearings, presenting itself as Exhibit A in the deliberations. Facebook CEO Mark Zuckerberg apologised in the United States hours before Mr Milner appeared in Parliament.

A data analytics and public relations firm, Cambridge Analytica illicitly harvested the data of 50 million Facebook user profiles, and later used it to influence voters on behalf of the successful Donald Trump presidential campaign.

When Facebook found out, it got the company to agree to delete the data. But it failed to inform users of the breach - an oversight it has now said sorry for. Worse, Cambridge Analytica did not delete the data, and Facebook did not verify that it had done so, but simply took the company's word.

Digging up the transcript of Mr Milner's testimony to British MPs last month, Mr Shanmugam cited statements by him that Cambridge Analytica did not have Facebook data, and put it to Mr Milner that he misled British MPs. There is "no excuse for not telling the whole truth and being full, frank and honest", said Mr Shanmugam.

Mr Milner said his answers were correct and based on what he knew at the time. But he conceded that he could have given a more complete accounting of the saga: "Do I wish I had given a fuller answer? Yes."

This was "one of the worst weeks" in his six years at Facebook, he said, adding that there was "a determination from the very top" to learn what went wrong and to make sure it does not happen again.

Protracted exchanges between the two men were at times heated, such as when Mr Shanmugam reprimanded Mr Milner for being "keen to answer questions I don't ask". At other times, they drew laughter. Mr Milner praised his interrogator for his sharpness of mind as a lawyer. Mr Shanmugam praised Mr Milner, too, saying: "You do better than most lawyers."

The two went on for three hours, as other witnesses from Twitter, Google and the Asia Internet Coalition sat in what looked to be an uncomfortable silence.

But Mr Milner pushed back vigorously too. At one point, he appealed to committee chairman Charles Chong to ask if it was the best use of the committee's time for Mr Shanmugam to go over, in fine detail, statements that Facebook executives made to panels in other countries and to ask for Mr Milner's views.

At parts of the exchange, the two also apologised to each other for the rise in temperature. Striking a conciliatory tone near the end, Mr Shanmugam said: "We look at you as partners." Mr Milner replied: "I appreciate hearing that."

What was Mr Shanmugam's goal? It appeared it was to show that while companies like Facebook controlled platforms on which the bulk of citizen discourse takes place, they did not show a level of responsibility equal to the power and influence they had.

They hold sway in modern democracies, and the content they carry can change election outcomes, divide communities, or become a channel for foreign state-sponsored disinformation campaigns. Yet, they are reluctant to take a firm stand on fake news.

Facebook did not want to be an arbiter of what was true, said Mr Milner. In practice, that meant even a news item as ridiculous as the one from the 2016 US election campaign claiming presidential hopeful Hillary Clinton ran a paedophile ring went unchallenged.

Mr Milner explained that there have been instances when rumours turned out to be true. He added that Facebook will take down posts if it receives a court order.

Other committee members also cited egregious posts that tech companies were slow to act on, including a neo-Nazi video that YouTube did not block for months, even after a panel of British MPs contacted YouTube.

Two points were clear from the exchanges. First is the impression that it may not be in the financial interest of social media firms to act. That could explain why Facebook did not inform users of the data breach, and why it was not keen to take down falsehoods - since falsehoods tend to drive traffic, and traffic draws advertising revenue.

Second, as the social media firms are US-based, they have a different instinct on what is acceptable speech or action. The US enshrines free speech - to the point that someone can even burn the Quran.

Against this backdrop, and if social media companies continue to be lethargic in their actions, Singapore may have little choice but to safeguard its own interests and introduce laws to compel these companies to abide by the social norms here and remove deliberate falsehoods.

Day 5: 23 Mar 2018

Legislation against online falsehoods should not be so broad that it endangers the work of journalists, says SPH
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 24 Mar 2018

Any legislation to counter deliberate online falsehoods should not be so broad and sweeping that it deters the legitimate sharing of information and hampers the work of journalists, editor-in-chief Warren Fernandez said yesterday.

"This is a concern for... the public, not just a concern of journalists," Mr Fernandez told a parliamentary hearing when presenting the views of Singapore Press Holdings (SPH), where he heads the media company's English/Malay/ Tamil Media Group.

Any proposed new laws should differentiate between deliberate and inadvertent spread of falsehoods, and take into account the impact of the content, such as whether it affects social cohesion or national security, he added.

The parliamentary Select Committee tasked to look into the issue of deliberate online falsehoods heard from editors of SPH and broadcaster Channel NewsAsia as well as other media players.

The media companies supported introducing laws to counter the problem, with SPH proposing legislation for distributors of online content as a possible solution.

Despite this position, it is natural for editors to have concerns about any new legislation, especially without knowing what form it may take, said Mr Fernandez, who is also editor of The Straits Times.

"There are already quite a number of existing laws which give the Government powers, as you have used previously. So, any further legislation is something that naturally causes concern in our minds because we have no sight of what shape or form that legislation might entail," he said.

A law that is too broad could scare people from offering information, and this could impinge on the work of journalists, who have an important role to play in battling disinformation, he said.

In addition, he noted that the news-gathering process is rarely straightforward and often involves verifying information that comes in dribs and drabs. Journalists and editors have to piece together and interpret that information, making judgment calls that could end up being labelled deliberate falsehoods even when there was no intention to mislead at all.

Instead, Mr Fernandez said, any new law should focus on online news that is deliberately untrue.

Noting that existing laws are mostly limited to content creators and providers, he urged the Government to "focus your energies on ensuring a level playing field between us and distributors of content who should be held to account and should be held responsible for the information that they put out and spread".

During the hearing, the editors were also asked about their views on freedom of the press and censorship. Committee member and Nominated MP Chia Yong Yong asked what would constitute constraints on that freedom.

Mr Fernandez said journalists hold freedom of the press dearly, and are legitimately concerned about any attempt to constrain it.

But they also understand that freedom is not absolute and has to take into account Singapore's context, as there are other freedoms that need to be safeguarded, he said. Singaporeans would expect media outlets to help maintain societal harmony on issues of race and religion, for instance.

Citing cartoons of Prophet Muhammad published in some countries, Mr Fernandez said media outlets in Singapore would not publish them.

Making such editorial judgments, however, is about exercising responsibility, he said. Editors take inputs from many sources, including newsmakers and readers, to help inform their judgments.

"Before we publish anything, we would want to make sure it is not libellous, or unfair, or biased. And that process of making that judgment, I don't think it is censorship. It is exercising responsibility," he said.

Independent fact-checking council can combat deliberate online falsehoods: SPH and CNA
By Yuen Sin, The Straits Times, 24 Mar 2018

The formation of an independent fact-checking body was proposed yesterday by media company Singapore Press Holdings (SPH) and broadcaster Channel NewsAsia (CNA) as a way to combat the problem of deliberate online falsehoods.

It should include media companies, industry practitioners and other interested parties, said SPH, adding that it is open to working with other media organisations to form the alliance.

CNA said the mandate of such a body "must include identifying a DOF (deliberate online falsehood) and recommending appropriate remedial actions".

It added that for a piece of information to be identified as deliberate untruth, it must demonstrate intent to achieve special goals, like compromising national security.

The two companies gave the suggestions in their written submissions to the Select Committee on deliberate online falsehoods.

At yesterday's hearing, Senior Minister of State for Communications and Information Janil Puthucheary, a committee member, suggested to the companies' representatives that the Government can step in to address egregious falsehoods on such issues as national security or public health when the proposed fact-checking body cannot move fast enough.

Mr Warren Fernandez, editor-in-chief of SPH's English/Malay/Tamil Media Group and editor of The Straits Times, agreed.

"But that does not preclude having a fact-checking body for other ends of that continuum," he said.

Mediacorp editor-in-chief Walter Fernandez added that it is premature to dismiss the ability of this proposed body to move quickly against falsehoods in critical situations.

Mr Goh Sin Teck, editor of SPH's Lianhe Zaobao and Lianhe Wanbao Chinese newspapers, said: "In times of emergency, it might be better to take down (an alleged falsehood), but there should be a process for a paper to go back to the independent body, to check if it is indeed fake news."

Mr Warren Fernandez noted the Edelman Trust Barometer had shown that the conflation of mainstream media with social media has led to a decline in trust in the media in many countries. He noted that there had been a "constant drip feed" of "potshots and attacks" against the mainstream media by some online sites and bloggers. These appeared aimed at undermining trust in the mainstream media, in an attempt to divert traffic and advertising to their sites.

For example, on occasions when SPH publications take time to verify and cross-check sources for an online news article, these sites may point to delays in putting out information as signs SPH is deliberately withholding information. In fact, there is no intention to do so at all, he said, adding: "Some of this may come from different standards that we hold when it comes to what is considered credible and reliable news."

He added that in his own experience, there has been "absolutely no transparency" when it comes to getting answers from social media companies on how their algorithms on ranking content are arrived at, or changed at their will.

Similarly, these companies might not act quickly enough to take down deliberately false content unless the authorities had the power to compel them to do so, just as they can with the local media.

"If (the authorities) have no powers to compel them to... it is not going to happen," he said.

Mr Walter Fernandez said it is "completely wrong" to think they cannot be regulated because they are "global, complex and powerful".

In fact, they can nip the problem of deliberate falsehoods in the bud quite effectively if they put their minds to it, he added.

Call for clear definition of deliberate untruths
By Ng Jun Sen, Political Correspondent, The Straits Times, 24 Mar 2018

There needs to be a clear understanding of the definitions of deliberate untruths when new laws are drafted against the scourge, representatives from the Singapore Press Club (SPC) and the Singapore Corporate Counsel Association (SCCA) told a parliamentary committee yesterday.

"There will be a lot of grey areas," said SPC president Patrick Daniel. "Those words that you use - deliberate online falsehoods - need clearer definitions. I would submit that in the current codes of practice and legislation, it is not so clear."

Mr Daniel, citing a case that took place in 2012, said a radio station was fined under the Broadcasting Act, which has a clause that says relaying false news is a breach of its code of practice.

"The mistake was that instead of putting up today's news, yesterday's tape was played erroneously. That was deemed false under the clause, when you can in fact argue that it is dated news," he said.

Six representatives from SPC, SCCA and law firm Allen & Gledhill shared their views on the issue, recommending that the committee consider different treatment for different types of untruths, and a way to arbitrate disagreements over the definition of fake news.

Senior Minister of State for Communications and Information as well as Education Janil Puthucheary sought clarifications from the panellists on their submission, which comprises the views of participants at a recent forum on deliberate online untruths.

SCCA's president emeritus Angeline Lee said: "From our perspective, it is not a question of whether there is sufficient legislation or not, it is the interpretation of what deliberate online falsehoods are... If there are different interpretations, then certainly, it is cause for a lot of discourse."

To this, Dr Janil said other witnesses had given evidence on how to define the term: by its intent; that it is on an online platform; that it is demonstrably false; and that it has a significant impact, such as the ability to affect national security.

"So if we use that framework, you would support further intervention to make sure such falsehoods are stopped?" asked Dr Janil.

Yes, as long as it is nuanced, Ms Lee said, agreeing with Dr Janil that such nuance would not be needed for matters concerning race, religion or the country's sovereignty.

Allen & Gledhill partner Stanley Lai, who helped put together the report, recommended a focus on prosecuting sources of fake news, as "take-down notices" alone may be insufficient. "The strongest measure is to make sure there are equally prohibitive consequences for the creation of the article in the first place," said Dr Lai.

Deliberate fake news doesn't warrant protection: Law Professor Thio Li-ann
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 24 Mar 2018

Deliberate online falsehoods harm society and undermine democracy, and belong to a category of speech that does not warrant protection, constitutional law expert Thio Li-ann said yesterday.

In fact, the spread of disinformation impedes public debate and destroys the very reason for free speech itself, she added. Professor Thio was giving her views at the Select Committee hearings on online untruths, particularly how their regulation can affect free speech.

She said it is misleading to approach the regulation of deliberate online falsehoods simply as a limitation on free speech. Describing it as a complex issue, she said speech is a means to an end, and the reason to safeguard it is to protect the free and open political debate that is at the core of democratic society.

When considering regulation of speech, it is thus crucial to decide what kind of speech deserves constitutional protection, she added. False news, which is spread to, say, mislead people, manipulate election results and turn groups against one another, clearly does not belong in this category, she said.

Citing the concept of the marketplace of ideas, she said the belief was that truth would emerge from having a wide range of views. But a town hall type of democracy is very different from an online democracy, she added.

In the marketplace of ideas, people are assumed to have equal access to information and equal opportunity to disseminate information. But social media platforms allow people to customise what they want to see and hear and use algorithms to promote certain content and downplay others, she said.

"Concerns about fake news basically have at its heart a distrust in the public's power of judgment or fears it will be duped or gullible," she said in her written submission.

"However, considering the consequences that deliberate online falsehoods could have on the conduct of national elections or the economy… and how the Internet has altered our communications universe, there is a legitimate need to regulate this to mitigate the effects of such false speech."

US-based NGO Human Rights Watch fails to show up for hearing
Human Rights Watch also taken to task for using false examples in report on Singapore
By Jose Hong, The Straits Times, 24 Mar 2018

Human Rights Watch (HRW) was yesterday accused of using falsehoods to advocate for political change in Singapore, and labelled a "radical left-wing" organisation.

The United States-based non-governmental organisation (NGO) came under fire during a Select Committee hearing on deliberate online falsehoods, with the People's Action Party Policy Forum and Israel-based academic Gerald M. Steinberg calling it untrustworthy.

Representing the PAP policy platform, Sembawang GRC MP Vikram Nair criticised the NGO's report "Kill The Chicken To Scare The Monkeys" - Suppression Of Free Expression And Assembly In Singapore, published last December, saying it was a type of deliberate falsehood.

Professor Steinberg, who gave evidence via video conference, said HRW hires people who are "ideologically committed to a certain position" and selectively looks for evidence that proves the case that they are trying to make.

Select Committee chair Charles Chong noted during the hearing that HRW was invited to give oral evidence. It was initially willing to come, but later said its staff member could not make it after being told it would be asked about its report.

Mr Nair said that HRW's 133-page report "attempted to legitimise examples that were clearly false".

He cited several examples, including the case of blogger Roy Ngerng, who was sued by Prime Minister Lee Hsien Loong for defamation after he wrote a blog post about the Government's handling of the Central Provident Fund.

Mr Nair said in those cases, the individuals had deliberately made false allegations. "HRW seems to suggest these are acceptable and that Singapore's laws and political system should allow such falsehoods to be freely made," he wrote.

The rights group set out about 60 recommendations in the report, including calling for the repeal or amendment of restrictions on rights to peaceful assembly and speech here. The report drew on interviews with 34 activists, lawyers, journalists, academics and opposition politicians, and looked at laws such as the Public Order Act and Sedition Act.

Mr Nair also said the rights group seemed to suggest that falsehoods are a legitimate part of public discourse. "But we are of the view that false news, in fact, undermines public discourse," he said.

He added: "This is an example of an entity that is outside Singapore, whose motivations are unclear, but which is generally putting forward information to change laws in Singapore."

Prof Steinberg, who is president of the organisation NGO Monitor, said that while HRW claims its investigations are rigorous and objective, "you see very clearly that they focus on a few issues, and they do that because those issues conform to their… radical left-wing ideology".

Mr Chong said the Parliament Secretariat had offered to fund the costs of HRW's representative flying in, or arrange for video conferencing at any time between March 15 and 29. But the group has said it is unavailable to do so.

He added that the invitation to HRW still stands, should it decide it is willing to give oral evidence.

In a statement yesterday, the Law Ministry noted that HRW has chosen not to appear before the committee to defend the report, which it said contains "serious inaccuracies, misimpressions, untrue statements".

The ministry said: "HRW's stance is disappointing, but not surprising. HRW has a pattern of issuing biased and untruthful statements about Singapore. It knows that its report will not withstand any scrutiny, and has therefore chosen not to come to Singapore to publicly defend its views.

"HRW, by its conduct, has shown that it cannot be taken seriously as a commentator or interlocutor on issues relating to Singapore."

Law Ministry replies to Human Rights Watch statement
By Yuen Sin, The Straits Times, 28 Mar 2018

The lack of enthusiasm by Human Rights Watch (HRW) to defend its report is "obvious", the Ministry of Law (MinLaw) said yesterday, in its response to the group's statement that it was "ironic and absurd" for the ministry and members of the People's Action Party to suggest this.

HRW was offered various arrangements to give oral evidence to the parliamentary committee, and these included appearing on a day a hearing had not been scheduled, and holding a video conference at any time during a 14-day period.

But each time, HRW said it was unable to do so, MinLaw said.

The US-based non-governmental organisation issued a statement earlier yesterday, saying it had offered to send its staff member to give oral evidence to the Select Committee on deliberate online falsehoods.

But, it added, the committee did not confirm a date that could work for the staff member until other commitments had been made.

MinLaw, in giving the timeline of subsequent events yesterday, said the March 23 slot was confirmed on March 13, three days after HRW said its representative would be able to appear on that date.

HRW was also informed by Singapore that its representative should be prepared to address questions on its report "Kill The Chicken To Scare The Monkeys" - Suppression Of Free Expression And Assembly In Singapore.

Last Friday, MinLaw said the report, published last December, contains "serious inaccuracies, misimpressions, untrue statements".

On March 14, HRW told the committee its representative was no longer able to attend the hearings on March 23. In response, the Parliament Secretariat said it can appear on any of the eight days scheduled for the public hearings.

The following day, the offer was extended to any of the 14 days between March 15 and 29, and the option of video conferencing was also offered on any of these days.

On March 19, HRW informed Singapore it will not take part in the hearings, either in person or via video conferencing.

MinLaw said yesterday: "Their latest statement leaves out any explanation for why they are unable to attend through video conferencing, from an overseas location - at any time over a period of 14 days from 15 to 29 March."

HRW said yesterday it had sent a letter to four Cabinet ministers, including Prime Minister Lee Hsien Loong, last Oct 30, asking for their response to the findings of its report. It has yet to get a response.

It also said it has offered to meet government officials in Singapore or elsewhere at a mutually convenient date to discuss the report.

Yesterday, the Select Committee said HRW's statement suggests it is now prepared to give evidence in the hearings. The committee also said it can hear from HRW on any date in May, or after May, after Parliament reopens. Parliament will be prorogued next month. It asked HRW to respond by noon tomorrow.

Do online untruths have a place in a democracy?: Gillian Koh
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 24 Mar 2018

Online falsehoods have a place in democratic discourse, said public policy researcher Gillian Koh, as she argued for people to be engaged so that they can come round to seeing that such content is false.

Her assertion yesterday drew a swift challenge from Law and Home Affairs Minister K. Shanmugam, who said such untruths serve no purpose in a democracy. Allowing them to circulate would lead to people being misled, he added.

The exchange began with a discussion about Dr Koh's proposal to counter foreign interference and disinformation during election periods. Asked about non-election periods, she said existing laws to curb foreign influence in politics were sufficient to deal with the problem.

Besides, she said, engaging people over online falsehoods allows "right-thinking" people to debunk them, and subsequently educate others about the dangers.

She added: "You declare it is false, but people still need to be persuaded... all I am saying is that we allow for a process of deliberation."

But Mr Shanmugam said her suggestion that people can be convinced is contrary to scientific research cited by others, which shows that people are hardwired to believe falsehoods, and countering false news with truth could reinforce the untruths.

Mr Shanmugam asked Dr Koh, who is deputy director for research at the Institute of Policy Studies, "what purpose is served" by letting online untruths be purveyed.

He cited the claims about Indonesian President Joko Widodo being Christian, former US president Barack Obama being Muslim and former US presidential candidate Hillary Clinton running a paedophile ring out of a pizza parlour.

Dr Koh agreed there is no purpose when falsehoods are deliberate and clear. But she said it is not always clear-cut that something is false, and it could take time to verify the facts.

Mr Shanmugam countered that the Select Committee is looking only at deliberate online falsehoods. "Let me assure you we are dealing with deliberate online falsehoods. Let's stick to that," he said.

New laws may stifle creativity: Mothership
The Straits Times, 24 Mar 2018

New laws against deliberate online falsehoods may risk stifling the flow of ideas and creativity if they are too "blunt and hard-handed".

With a greater appetite for diverse viewpoints among Singaporeans today, such laws may also erode Singaporeans' trust in local media and prompt them to turn to foreign outlets instead, local news site Mothership told the Select Committee on deliberate online falsehoods yesterday.

Responding, Senior Minister of State for Communications and Information Janil Puthucheary assured the site's managing editor Martino Tan that any new legislation will be carefully written and specifically scoped so that it leads to "an increase of trust in discourse and discussion".

Mr Tan and Mothership's managing director Lien We King had noted that social media companies have been "mopping up the majority of the world's advertising revenues".

They agreed with Dr Janil that steps have to be taken to fix the "flawed marketplace of ideas" brought about by the spread of social media platforms. Doing so will allow more credible and accurate information to enrich space for discourse, said Dr Janil.


Worries over falsehoods among different groups
The Straits Times, 24 Mar 2018

Concerns about falsehoods circulating among different groups of people came up during the hearing yesterday.

While fake news has existed for a long time, senior citizens may have started accessing social media only recently and are "newcomers to the game", said Lianhe Zaobao and Lianhe Wanbao editor Goh Sin Teck.

Mr Goh stressed the importance of media literacy for this group, and said the type of falsehoods he sees range from health-related and lifestyle myths to more sensitive falsehoods about China.

He noted that such falsehoods spread as this group of people "think since someone shared this with me online, it must be true, so I will spread it to my friends so they can be enlightened too".

Mr Goh and Berita Harian editor Mohamed Sa'at Abdul Rahman said traditional non-English media play an important part in containing the spread of sensitive fake news.

Mr Mohamed Sa'at said his newspaper takes extra care when it comes to sensitive topics.

"From time to time, there may be racial or religious issues. For this, we take great care," he said, adding that such topics are emotive and can be exploited to influence opinion.

Problems with the marketplace of ideas in age of social media
By Elgin Toh, Deputy Political Editor, The Straits Times, 24 Mar 2018

In a marketplace of ideas, everyone can speak his mind without constraint, and through fair contest, the truth emerges.

This theory, commonly used to defend free speech, was examined yesterday, as the Select Committee on deliberate online falsehoods held its fifth day of hearings.

National University of Singapore law professor Thio Li-ann was the witness most hesitant about the marketplace thesis yesterday, noting that it was formulated by American judges in the early 20th century.

"I don't think all the ideas are wrong but I doubt that they are adequate," she told the committee.

While most witnesses and committee members who spoke about the marketplace of ideas did not have principled disagreements with it, they noted a slew of difficulties the marketplace faces - especially so in the new landscape with social media.

The first problem is people.

Prof Thio cited four groups who impede the marketplace: the Impetuous, who are undiscerning about the information they receive and do not respond to sound argument; the Ideologues, who want only to advance their own point of view and seek to shut down other views; the Iagos (after the villain in Shakespeare's Othello), who peddle lies out of mischief; and the Irrational, who react emotionally, not intellectually, to ideas.

Law and Home Affairs Minister K. Shanmugam called Prof Thio's submission "one of the clearest" received by the committee.

With the technological means afforded by social media, those with nefarious intentions could put out falsehoods and appeal to people's feelings, to the detriment of the marketplace, he said.

"Is there any rational purpose of free speech or the marketplace of ideas that is served by having such falsehoods?" he asked.

During the testimony of Mr Gaurav Keerthi, founder of debating websites, Senior Minister of State for Communications and Information Janil Puthucheary also cast doubt on the thesis.

He said the social media space is a "poor proxy" for the marketplace because information is not spread in a transparent way but is manipulated by algorithms; and besides, the success of an idea is determined by whether a post goes viral, not how true it is.

Agreeing that the quality of conversations on social media is often found wanting, Mr Keerthi said this was why he chose to set up his own websites, to build a more ideal platform for the proper discussion of issues.

Before the advent of social media, mainstream media outlets like newspapers and television broadcasters were able to perform a mediating role in the marketplace to a much more satisfactory degree. But these traditional outlets are under threat, as the committee heard yesterday from representatives of Singapore Press Holdings and Mediacorp, the two main media companies here.

Mainstream media outlets have strict standards when it comes to verifying facts before they publish news stories. This requires considerable resources, noted Straits Times editor Warren Fernandez. Competition from social media has disrupted the traditional business model, while social media platforms, which do not hold themselves to the same standards, allow falsehoods to spread.

"If we cannot sustain our news operations, we cannot play the role that we've been playing all these decades... and as a result of it, I think society will be more poorly served," said Mr Fernandez.

A further challenge to the marketplace of ideas is the way social media blurs the line between what is private and public.

Mr Shanmugam made this argument during his discussion with Prof Thio. He noted that in the past, people would express their private thoughts to their friends - including rumours that may not be true - while the public sphere was reserved for serious discussion.

"What the Internet has done, is to effectively merge the two. So every time I have an opinion in private - it can be crazy, it can be completely false, I could be malicious - but now I have the means to bring it into the public sphere very quickly and gather like-minded people to shout the same thing," he said.

On the whole, speakers brought out two arguments clearly.

One, it is questionable if a true marketplace of ideas can emerge on social media.

Two, even if it could emerge, getting rid of deliberate online falsehoods would still make sense, as it would assist the marketplace in finding the truth, since falsehoods serve only to obscure the truth.

If there can be general acceptance of these conclusions, then the action that should follow is to decide how these falsehoods can be removed in a way that is timely and is seen by people as fair and reasonable.

The need to be fair and reasonable requires due process.

After a post is ordered to be taken down, the infringing party should have the right to appeal to an independent body - say, the courts.

That day in court can, in a sense, serve as a way to participate in the "contest of ideas". And if a judge can be persuaded that a post is not false, the person can then re-join the marketplace and engage in the real contest taking place there.

* Support for laws against fake news: REACH survey
Those in poll point to WhatsApp, Facebook as fake news sources
About half of those in REACH survey say they have come across untruths in the two apps
By Yuen Sin, The Straits Times, 27 Mar 2018

Facebook and WhatsApp were identified as the two main sources of bogus news here by people who responded to a government survey on attitudes towards fake news.

About half of the Singapore residents polled who have come across fake news said they had seen it on the two popular applications.

These findings by government feedback unit REACH come as the spread of deliberate online falsehoods is being scrutinised by a parliamentary Select Committee here.

The committee of ministers and MPs has conducted five days of public hearings, where they heard from experts and technology companies on the role played by social networks and messaging platforms in the problem. It will hear from non-governmental organisations and civil and human rights groups today.

REACH polled 2,504 Singapore residents aged 15 and above via phone interviews held between May 8 and May 19 last year, and between Feb 12 and Feb 20 this year. The respondents were randomly selected, and the sample was weighted by gender and age to ensure that it was representative of the national population.

Of those polled, 77 per cent said they had come across inaccurate news online at least occasionally.

A smaller number, though, about three in 10, said they had seen fake news about Singapore in the past year. The majority of them, about 70 per cent, also reported that they were not always able to tell whether a piece of news was false at the time they read it. The survey did not address whether they went on to try to verify the information.

Despite this, about half of all respondents to the survey said they were confident of their ability to discern real from fake. But their estimation of others' ability was lower, with 32 per cent of respondents saying they believed other people would be able to recognise fake news.

Commenting on this, Assistant Professor Edson Tandoc of Nanyang Technological University's Wee Kim Wee School of Communication and Information said: "Self-confidence does not always correspond with actual ability. In fact, it becomes risky when the individual's confidence is high but actual ability is low."

The survey also found that most people are concerned about the spread of fake news online and supported strengthening laws and regulations to curb the problem.

Eight in 10 respondents were particularly concerned about the deliberate spread of fake news by individuals or companies doing so for profit. When asked if they supported stronger laws against fake news, the majority of those polled, also eight in 10, said they agreed or strongly agreed.

And nine in 10 felt there should be more effective laws to require the removal or correction of the reports, and also to prosecute those who publish fake news deliberately "if their actions have serious consequences".

On this, Prof Tandoc said the support for laws and regulations was consistent with the high level of public concern about fake news.

He added that this was also reflected in focus group discussions his school has conducted - during which people had expressed general support for government action against fake news - though they had "different ideas of how this should be carried out", such as whether it should be done by existing laws or new laws.

REACH chairman and Minister of State Sam Tan, noting that fake news can have "real-life consequences", said in a statement: "Singaporeans are aware of the danger and understand that more needs to be done to tackle the issue, both in terms of regulation and education."

Cannot assume everyone can tell true from fake news

I read with some trepidation the discussions and comments by academics and the like suggesting that fake news be allowed for the sake of discussion and the opportunity to debunk it.

The implied reason given was that society at large is mature enough to review, investigate and conclude the falseness of such content, even to the extent of communicating to others the outcome of the investigation.

Sadly, their circle of friends and colleagues is of like intellect, with good access to information.

Aren't we being myopic to think that all sections of society are as rational, analytical and armed with a plethora of information to differentiate falsehoods from truth?

Bernard Yeo Boon Yeow
ST Forum, 28 Mar 2018
