Best News Website or Mobile Service
WAN-IFRA Digital Media Awards Worldwide 2022
Malicious foreign actors playing 'long game' using credible-looking websites and gen AI: Analysts

Singapore has blocked 10 websites set up by foreign actors that authorities say can be used to mount hostile information campaigns.

A person using a mobile phone and computer. (Photo: iStock/Milan_Jovic)

SINGAPORE: Nefarious foreign actors are playing the long game in planning hostile information campaigns, gaining the trust of unsuspecting netizens with websites that seem legitimate and regularly populating them with content, experts said on Tuesday (Oct 22).

Creating fresh content is also easier and more widespread these days, thanks to generative artificial intelligence (AI) tools that let such entities easily repurpose news from credible platforms, they explained.

When the time comes to use the websites to attack targets, they would be less likely to trigger suspicion or raise red flags, the experts said.

The Singapore government on Tuesday announced it has blocked 10 inauthentic websites that may be used as part of hostile information campaigns.

LONG GAME TO BUILD CREDIBILITY

Mr Benjamin Ang, who heads the Centre of Excellence for National Security at the S. Rajaratnam School of International Studies (RSIS), noted that this recent case involves the “specific tactic of setting up news sites that appear to be local, with local sounding names, that feature local news to make themselves look credible”.

However, they are actually owned by foreign entities that can use, or have used, them for influence campaigns, he said.

“We can think of them as digital ammunition which can be loaded and launched against us when the time is right, so it is prudent to defuse them before they are fired at us,” he said.

Singapore Management University’s Associate Professor of Law Eugene Tan explained that such sites repurpose content from credible news outlets by feeding them into bots to create fresh stories.

Noting that the whole effort revolves around creating a network of such websites, Assoc Prof Tan said: “The ordinary reader will never be able to discern that they are part of the same network.”

“The threat is now so sophisticated. They are in it for the long game. Some of the websites have been around for years, and they're trying to get around some of these telltale signs,” he noted.

For example, a website involved in a hostile information campaign that was created just a day before would raise red flags among intelligence agencies, in contrast to one that has been around for a few years.

“Foreign actors don't need instant gratification. They know that they just have to wear down the target,” said Assoc Prof Tan.

Dr Shashi Jayakumar, executive director of security consultancy SJK Geostrategic Advisory, said: "These websites may not on the surface seem to have actively ramped up operations in a manner that might immediately subvert social resilience or cohesion in Singapore.

"However, the websites in question could be pre-emptive nodes that could be triggered as and when necessary."

He added that those behind information operations will always want an avenue and an option to influence Singaporeans, “particularly given rising geopolitical contestation”.

PUTTING UP A FRONT

Assoc Prof Tan said that given the resources and effort such a long-term strategy requires, those behind it are “likely to be state actors or state-affiliated actors”, who often use intermediaries such as genuine public relations firms.

Mr Ang noted that the networks in this case appear to be owned by public relations companies, which is consistent with a pattern observed globally and in the Southeast Asian region.

In its Adversarial Threat Report in November last year, Meta highlighted the rise of coordinated inauthentic behaviour online, which involves coordinated efforts to manipulate public debate for a strategic goal.

“When we investigate and remove these operations, we focus on behaviour rather than content – no matter who’s behind them, what they post or whether they’re foreign or domestic,” it said.

According to a report by South Korea’s National Cyber Security Center Joint Analysis Team, such networks typically use a newswire service, which helps distribute press releases to multiple media outlets.

These newswire services are often operated by public relations firms that run their own platforms, directly distributing the clients’ materials to affiliated outlets after editing them, while monitoring and providing the results to the customer.

ROLE OF REGULATION

The government on Tuesday said it will review the Foreign Interference (Countermeasures) Act (FICA), to see how it could be used to take pre-emptive action against websites.

“Regulation is a necessary part of the solution, but it's not sufficient,” said Assoc Prof Tan, adding that digital literacy among the population is still most crucial.

“The government can put the laws in place, but if people believe the falsehoods, then there's nothing the laws can do about that.”

Mr Ang said that regulation will always be needed to give platforms and internet service providers “a legal basis” to assist and act in such circumstances.

“But we all have a part to play in being careful about what we read and share, especially if the content is emotive, regardless of how many websites seem to be displaying it,” he said.

Source: CNA/fk
