Lagos, Nigeria – January 14, 2026 – In a significant ruling that strengthens accountability for global tech giants operating in Nigeria, the Lagos State High Court sitting at Tafawa Balewa Square (TBS) has ordered Meta Platforms Inc., the parent company of Facebook, to pay $25,000 in damages to prominent human rights lawyer and Senior Advocate of Nigeria (SAN), Mr. Femi Falana, for the invasion of his privacy.
The judgment, delivered on Tuesday, January 13, 2026, by Justice Olalekan Oresanya, followed a $5 million lawsuit filed by Falana in early 2025. The case centered on a misleading video published on Facebook under the page “AfriCare Health Centre,” which falsely portrayed Falana as suffering from prostatitis, a prostate condition he has never had, and suggested he had been battling it for over 16 years, complete with fabricated claims of pain, fatigue, and urinary issues. The video used his name, image, video footage, and a voice purporting to be his to promote what appeared to be a health product or service, spreading widespread misinformation about his health status.
Falana, represented by his counsel Mr. Olumide Babalola, argued that the publication constituted a direct violation of his constitutional right to privacy under Section 37 of the 1999 Constitution of the Federal Republic of Nigeria (as amended). He further contended that the content was false, offensive, and disturbing, painting him in a false light, tarnishing his hard-earned reputation, and causing him significant mental and emotional distress. The suit was brought pursuant to the Nigeria Data Protection Act (NDPA) 2023, specifically Sections 24(1)(a) and (e) and Section 34(1)(d), alongside the Fundamental Rights Enforcement Procedure Rules, 2009.
Delivering the judgment, Justice Oresanya held that Meta, as a global technology company that hosts pages and content for commercial benefit through monetization, owes a duty of care to individuals affected by information disseminated on its platform. The court explicitly rejected Meta's defense that it functions merely as a neutral "hosting" or "intermediary" platform, emphasizing that where a company monetizes content, controls distribution algorithms, and can reasonably foresee harm from misinformation, particularly misinformation involving sensitive personal data such as health information, it cannot escape liability.
The judge ruled that Meta acts as a joint data controller with page owners, given its role in determining the means and purposes of content processing, monetizing pages, and influencing reach through algorithmic systems. Consequently, Meta was held vicariously liable for the offensive video. The court found that Meta breached Section 24 of the NDPA by processing Falana’s personal data in a manner that was inaccurate, harmful, lacked a lawful basis, and was unfair. The dissemination of false health information was deemed unlawful processing per se.
Justice Oresanya stressed that platforms bear a heightened duty to ensure accuracy and integrity when the risk of inaccuracy is foreseeable, especially concerning sensitive personal data like medical information. As a resource-rich global entity, Meta was expected to implement effective content-review mechanisms, rapid takedown processes, and proportionate safeguards to mitigate harm from misinformation. Its failure to do so amounted to regulatory non-compliance under Nigerian law.
Importantly, the court affirmed that Falana’s status as a public figure does not deprive him of his right to privacy. “The fact that the claimant is a public figure does not rob him of his right to privacy,” the judgment stated. It clarified that the publication of false medical details intrudes into private life, regardless of public standing, and that health data enjoys heightened protection—even for prominent individuals.
Reacting to the decision, Olumide Babalola described it as a major development under the NDPA that weakens the traditional “mere platform” defense often invoked by Big Tech companies. “This reinforces a standard of platform accountability under Nigerian law, aligning with emerging global jurisprudence,” he said. Babalola added that the ruling settles a key misconception in Nigerian legal practice by affirming that health data remains protected, even for public figures, and marks progress in holding digital platforms responsible for foreseeable harms arising from unverified or misleading content.
The case originated in early 2025, when the deceptive video surfaced on Facebook, prompting Falana to demand its immediate removal and substantial compensation for reputational damage and emotional distress. Although he sought $5 million in general damages, the court awarded $25,000, reflecting a compensatory approach while establishing a precedent for future claims against tech firms in similar circumstances.
This ruling comes amid growing global scrutiny of social media platforms' roles in spreading misinformation, deepfakes, and unauthorized use of personal likenesses. In Nigeria, it aligns with the broader enforcement of the NDPA 2023, which aims to protect personal data and impose obligations on data controllers and processors. Legal experts view the decision as a step toward greater accountability for foreign tech companies operating in the country, potentially influencing how platforms handle content moderation, verification, and user complaints in Africa.
Falana, a renowned advocate for human rights, democracy, and anti-corruption efforts, has built a distinguished career challenging injustices. The false portrayal of his health not only risked ridicule but also exploited vulnerable individuals seeking medical solutions, highlighting the real-world dangers of health-related misinformation on social media.
Meta was represented in the suit by Mr. Tayo Oyetibo (SAN). The company has not yet issued a public response to the judgment as of the time of this report.
This landmark case underscores the evolving intersection of privacy rights, data protection, and digital responsibility in Nigeria, sending a clear signal that platforms profiting from user-generated content must prioritize safeguards against harm.

