Billionaire sues Facebook over AI scams that use his face
A mining billionaire has sued Facebook, seeking to stop AI-generated scams from using his face. The lawsuit poses a significant challenge to the legal protections tech companies enjoy, claiming that Facebook's AI-powered ad systems contribute to the proliferation of scams.
Around the clock, every day, a team of cybersecurity experts combs Facebook for images of their employer, an Australian billionaire determined to bring the social-media giant to account.
Like American financiers such as Bill Ackman, Andrew Forrest has voiced frustration that Facebook parent Meta Platforms isn't doing enough to combat scam advertisements that exploit his image to promote fraudulent investment schemes.
Unlike Meta's other critics, however, the mining executive has committed substantial funds to his legal campaign. His U.S. federal lawsuit against Meta claims the company's AI-powered ad systems help create and amplify false ads.
The case is a rare one with the potential to challenge the broad legal protections granted to technology companies that host user-generated content. A federal judge recently denied Meta's motion to dismiss the lawsuit; the company has appealed.
Those protections, granted by the 1996 law commonly known as Section 230, have withstood repeated court challenges in recent years, and legal experts say this case is among the first in which AI has figured in litigation over the law.
Forrest says he is determined to see the fight through, whatever the cost, and believes Meta should take greater responsibility for policing its platform.
“It is truly shocking that a board of directors would prioritize their own corporate profits over the well-being of innocent, vulnerable individuals who have lost their life savings,” Forrest said.
Meta didn't respond to requests for comment.
Fraudulent celebrity ads have also proliferated on platforms such as X and Snapchat, with cybersecurity and blockchain researchers noting a rise in crypto scams. On Meta's apps, the scams lure people through advertisements into chat groups on WhatsApp or other channels, where they are steered toward dubious investment opportunities. The schemes have cost small investors significant sums.
Eric Goldman, a law professor at the Santa Clara University School of Law, said he was surprised by the judge's ruling in the Forrest case, noting that a 2009 ruling had found Google not liable for ads promoting scam ringtones.
A crucial question in Forrest's case is how courts will treat AI-generated content: whether it originates with the user who supplied the inputs (in this case, a scammer) or with Meta's model. In the 2009 case, the judge ruled that the plaintiff had to prove Google's role in the “creation or development” of the words used in the ads.
Meta and other platforms have been adept at fending off challenges to Section 230, but they face mounting pressure from several directions to be held accountable for certain user content. This year, investment firms filed a shareholder lawsuit alleging that Meta failed to adequately protect users from human trafficking and child sexual exploitation.
From 2018 to 2019, three plaintiffs identified as Jane Does sued Facebook in Texas courts, claiming that as minors they had been lured into sex trafficking by people they met on Instagram and Facebook. In 2021, the Supreme Court of Texas dismissed their negligence and product-liability claims as barred by Section 230, and the U.S. Supreme Court declined to hear the case in 2022.
Forrest, whose net worth is approximately $14 billion, has already run up legal fees exceeding $5 million. Australia's second-wealthiest individual, he built his fortune leading an iron-ore mining company and is widely regarded in the country as a determined, tenacious executive unafraid of lengthy legal disputes.
That company, Fortescue Metals Group, has been locked in a long-running dispute with Australia's indigenous Yindjibarndi people, who claim it has not adequately compensated them for the use of their land; Fortescue denies the claim.
Forrest first raised the issue of fake profiles with Facebook in 2014, after his private security team noticed the problem.
Executives told him to create an official account so the fakes could be compared against it. Despite those efforts, fake profiles continued to plague the platform.
Dissatisfied with Meta's response, Forrest decided to put more resources into the fight.
In 2019, he set up a mission-control-style room in Perth, Australia, where a team of cybersecurity professionals searches Meta platforms for fake profiles and advertisements that use his image, promptly reporting any findings to the company.
The cyber monitoring center operates around the clock, each staff member working from a pair of monitors. At the front of the room, four large screens, each 8 feet by 4 feet, display a continuous scan of Meta sites and chat rooms. Running the team has already cost him a significant sum.
The deceptive ads exploit Forrest's name and likeness to lure users who interact with them into exclusive chat rooms. “Make sure to review this information before the financial industry professionals take their next action,” reads one of the Facebook ads cited in the lawsuit.
The billionaire has also worked with Australian authorities, lending members of his private security team to help the Australian Federal Police track the international crime syndicates believed to be behind the ads.
According to Forrest, Meta's AI ad tools have made it easy for scammers to create and spread deceptive ads at scale. In October, Facebook introduced tools that let certain advertisers generate multiple versions of ad text from an original copy.
In May, the company announced additional generative-AI ad features that let advertisers generate image variations from prompts.
According to his lawyer, Simon Clarke, false advertisements noticeably spike whenever Forrest's name is in the news.
It has been an uphill fight since the investment scams began in 2019. “They have only grown and become more noticeable with the rise of artificial intelligence,” Forrest said.
Julie Young
Julie Young is a Senior Market Reporter and Analyst. She has been covering stock markets for many years.