
How to deal with the risk of online falsehoods

Deliberately false online content is harder to spot than you think. Here are seven tactics to counter the threat of disinformation and control the potential damage your business may incur.

Online falsehoods can have a negative impact not only on individuals but on businesses as well. The spread of false information can carry financial and reputational risks.

Unlike misinformation, which is incorrect and misleading but may not be intentional, digital disinformation campaigns targeting companies are always intentional and harmful. And they are rampant.

In one example, an article published on a financial website contained several falsehoods about Farmland Partners Inc., a publicly traded US company, and caused a 39% drop in its stock price in one day, the company said. In a 2021 settlement with the company, the author acknowledged that the article was part of a “short and distort” scheme, a stock manipulation that enables a scammer to profit when a company’s stock falls.

In another instance, one company’s share price tumbled 28% before trading was temporarily halted and another company’s share price dropped 16% after a trader in Scotland tweeted false information to benefit from the price swings, according to the US Securities and Exchange Commission.

Share price manipulation is just one of many risks businesses face when they are hit by a disinformation effort. Investors and consumers may rely on online sources for their news, and they may have difficulty separating fact from fiction. Seven in ten respondents in a survey the UK Office of Communications (Ofcom) conducted this year were confident they could spot incorrect information, but only two out of ten were able to identify the signs of a real social media post without making mistakes.

Financial gain and attempts to harm competitors are among the motivations behind disinformation campaigns. Falsehoods may also be spread by activists or disgruntled employees who have a grievance or disagreement with a company’s investments or public political position, said Joseph Nocera, partner and leader of the PwC Cyber & Privacy Innovation Institute.

Disinformation campaigns that target a business’s reputation can be particularly costly.

“You can spend generations building up a business, constructing a unique selling proposition, building trust, creating a reputation for buying local or being reliable, supporting the community, or being inclusive,” said Ian Thomson, director of Lloyds Banking Group Centre for Responsible Business at the University of Birmingham in the UK. But while company reputations are “massively valuable, they can be so fragile”, Thomson said. “Suddenly, one tweet can undo much or all of it. In some ways, reality doesn’t have to matter.”

Tactics to counter online disinformation

When an entity is identifying, assessing, or mitigating reputational, geopolitical, or financial risks that disinformation poses, “the finance team’s expertise can be valuable in scenario planning with the risk, technology, and cybersecurity executives”, Nocera said.

Here are seven tactics for finance professionals to counter the threat of deliberately manipulated content:

Understand what you’re up against

Digital disinformation campaigns can be launched by unscrupulous governments, organised crime, or opportunistic scammers, according to a PwC report on cybersecurity and fraud. They are inexpensive, spread falsehoods easily, and can cause considerable harm, ranging from significant financial losses to reputational damage.

The disinformation may consist of fake news stories (which may be spread in forms that include text, audio, and video) or postings from fake social media accounts, as well as falsified marketing on social media platforms and search engines, according to the PwC report. Another threat is deepfakes, which are digitally altered videos or audio files that seem realistic and are designed to spread false impressions or information.

State actors are a very active part of the disinformation effort. They may include countries that want to punish or embarrass a company with which they have a disagreement, or to retaliate for actions or comments they consider hostile to the nation-state. These efforts can be very sophisticated and may even be outsourced to experienced third parties. According to the PwC report, commercial disinformation-as-a-service (DaaS) purveyors are hired to spread falsehoods on websites, news outlets, and social media. (See the sidebar, “The rise of computational propaganda”.)

Develop a robust and comprehensive response plan

Thomson recommended creating an early warning system to spot issues before or as soon as they arise. “There should be an ongoing effort to monitor social media and other outlets so that the organisation can learn about problems as soon as possible,” he said. This responsibility would normally fall to the communications team, either a designated person or a full crisis response team if the organisation is big enough. However, Thomson noted that there is a case for the finance team to be involved, particularly those who undertake or supervise reporting, investor relations, or stakeholder engagement, as their involvement can help shape any discussions and the content of future reports.
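At its simplest, such monitoring is keyword matching against incoming posts or news items. The sketch below is a minimal Python illustration of that idea; fetch_recent_posts and the watch lists are hypothetical placeholders, and a real programme would pull data from a platform’s official API or a commercial media-listening tool.

    # Minimal early-warning sketch: flag posts that mention the company together
    # with risk-related terms. fetch_recent_posts is a hypothetical placeholder
    # for whatever feed, API, or listening tool the organisation actually uses.
    from dataclasses import dataclass
    from typing import Iterable, List

    COMPANY_TERMS = {"examplecorp", "example corp"}               # names to watch (assumed)
    RISK_TERMS = {"fraud", "scam", "lawsuit", "boycott", "fake"}  # illustrative trigger words

    @dataclass
    class Post:
        source: str   # e.g. "social", "news-site"
        url: str
        text: str

    def fetch_recent_posts() -> Iterable[Post]:
        """Hypothetical placeholder; in practice, pull from a platform API or feed."""
        return []

    def flag_posts(posts: Iterable[Post]) -> List[Post]:
        flagged = []
        for post in posts:
            text = post.text.lower()
            mentions_company = any(term in text for term in COMPANY_TERMS)
            mentions_risk = any(term in text for term in RISK_TERMS)
            if mentions_company and mentions_risk:
                flagged.append(post)
        return flagged

    if __name__ == "__main__":
        for post in flag_posts(fetch_recent_posts()):
            print(f"[{post.source}] possible issue: {post.url}")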

Proactive efforts will serve the organisation well when a crisis occurs. “You don’t want to be making it up on the fly,” Nocera said. Instead, businesses should brainstorm potential scenarios and plan the best responses to each one, perhaps even practising them in a tabletop exercise. “This gives you a chance to work out the kinks and practise communications and decision-making,” he said.

For each scenario, the aim is to weigh the direct and indirect financial costs and any potential impact on share price.
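As a rough illustration of that kind of quantification, the sketch below estimates the market value at risk under a hypothetical share-price-drop scenario; every figure is an assumed placeholder, not drawn from any company mentioned in this article.

    # Rough, illustrative scenario arithmetic: all inputs are assumed placeholders,
    # not figures from any company mentioned in this article.
    shares_outstanding = 50_000_000   # assumed
    share_price = 12.40               # assumed current price, in the reporting currency
    assumed_drop = 0.25               # scenario: a 25% fall triggered by disinformation

    market_cap = shares_outstanding * share_price
    value_at_risk = market_cap * assumed_drop

    # Indirect costs the scenario plan might also estimate (all assumed)
    crisis_comms_cost = 150_000
    legal_and_forensics = 250_000

    total_estimated_impact = value_at_risk + crisis_comms_cost + legal_and_forensics
    print(f"Market value at risk: {value_at_risk:,.0f}")
    print(f"Total estimated scenario impact: {total_estimated_impact:,.0f}")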

Rely on professionals when making official statements

Most entities require employees to go through company spokespeople rather than make official statements without advice from the communications team. Dealing with disinformation is a skill, and those without experience in this area can easily misspeak, Thomson said. “Public communications need to be skilfully and professionally handled,” he said.

Meanwhile, finance leaders can better educate themselves on threats by getting to know others in the company who are monitoring those dangers, Nocera said, such as the teams that monitor traditional and social media. This will help finance team members get up to speed quickly when an incident does occur.

In the past, many believed that ignoring falsehoods might make them fade away more quickly. Today, the rapid spread and impact of damaging disinformation call for a response. Without one, “the company can become a toxic brand”, Thomson said.

Also, a lawsuit may sound like the best remedy, but it can take years to resolve. “You don’t want to win a case after the company fails,” he said.

Get the facts first

Thomson advised gathering all the details about the company’s actual actions before making any statement about even the most outlandish disinformation.

Consider a company accused of corruption in awarding a contract, for example. It may learn after some digging that while nothing illegal occurred, some corners may have been cut or there may have been compliance lapses. Armed with the facts, the company can then decide how to frame its response, Thomson said. This is another area where the finance team’s knowledge can help: pinpointing what to investigate and determining the extent and potential impact of any issues.

“It’s the cover-up that kills you,” Thomson added.

Help stop the damage

If a disinformation campaign affects the company’s stock price or causes other financial damage, the company should make official disclosures and corrections as quickly as possible to limit any negative impact, Thomson said. Working with the advice of internal media professionals, finance can play an important role.

The finance executive may be called on to help prepare the CEO to speak publicly about the impact or to participate in a press conference, offering details to debunk the falsehoods, Nocera noted. “The finance team has credibility, and they’re known for following professional ethics,” he said. “They are considered the fact checkers,” so their statements on financial issues should carry weight.

Seek out the source

Thomson recommended identifying the source of the disinformation as part of the response effort. Find out who spread it, what organisations or groups they may be associated with, who funds those groups, where they post, and what else they have published or said in the past. Also look for any potential corroboration of their accusations and the sources involved.

“Someone might attribute something to a Harvard study, but then you find out there is no such study,” Thomson said. In other cases, the material being cited may be seriously out of date. A fuller picture of the source and their motivations can help the company justify and support its own case.

When investigating these sources, domain registration records, which may include contact information and other details about the site owner, are one potential way to spot a bad actor, according to a study conducted by professors from University College London and two US universities. Limited registration details, such as the use of a holding company, corporate entity, or LLC rather than a person’s name, can point to a potential fake news site, the study found.
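One way to pull those registration details programmatically is RDAP, a structured successor to the WHOIS protocol. The sketch below is an illustration of that general idea, not part of the study’s method: it queries the public rdap.org redirector and prints the registration dates and entity roles many registries expose. Which fields are actually populated varies by registry and by the registrant’s privacy settings, so treat the output as a starting point rather than proof.

    # Look up domain registration details via RDAP (a structured successor to WHOIS),
    # using the public rdap.org redirector. Field availability varies by registry
    # and by the registrant's privacy settings.
    import json
    import urllib.request

    def rdap_lookup(domain: str) -> dict:
        url = f"https://rdap.org/domain/{domain}"
        with urllib.request.urlopen(url, timeout=10) as response:
            return json.load(response)

    def summarise(record: dict) -> None:
        # Registration, update, and expiry dates
        for event in record.get("events", []):
            print(f"{event.get('eventAction')}: {event.get('eventDate')}")
        # Entities such as the registrar and the (often redacted) registrant
        for entity in record.get("entities", []):
            roles = ", ".join(entity.get("roles", []))
            print(f"entity roles: {roles}")

    if __name__ == "__main__":
        summarise(rdap_lookup("example.com"))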

Evaluate the actual damage

While it may seem that a negative report has spread everywhere, Thomson noted that social media can be an echo chamber. “You may think everyone knows, but the disinformation may have only been retweeted among a group that thinks all business is evil, for example,” he said. Instead of engaging with a group whose minds may be tough to change, it may be a better idea to use social media or other channels to reinforce positive messages about the company with a wider audience.


The rise of computational propaganda

A 2020 Oxford Internet Institute report found evidence in 48 of the 81 countries studied that state actors were working with private companies or strategic communication firms that provide social media manipulation campaigns called “computational propaganda”. These campaigns use algorithms, automation, and big data to shape opinion, and the report said manipulation continues to be “a pervasive part of public life across all regime types”.


Anita Dennis is a freelance financial writer based in the US. To comment on this article or to suggest an idea for another article, contact Oliver Rowe at Oliver.Rowe@aicpa-cima.com.


LEARNING RESOURCES

COURSE

Risk Scenarios

Learn key risk scenarios to help plan for various threats to organisational reputation and governance.

Find this course in the AICPA Store and the CGMA Store.

COURSE

Managing Business Risks

This course covers identifying and understanding risks to reduce a business’s exposure to threats that could damage assets and reputation.

Find this course in the AICPA Store and the CGMA Store.

COURSE

Managing Risk Analytics

This course teaches valuable analytical tools for managing risks and making informed decisions.

Find this course in the AICPA Store and the CGMA Store.