July Report on Misinformation
31 July 2025, Rangpur

The evaluation of misinformation patterns for July 2025 reveals a substantial presence of misleading narratives, with 296 documented cases. The misinformation landscape in July was overwhelmingly dominated by political narratives, with 220 instances recorded in this category, far exceeding the 76 cases of all other categories combined. This dominance reflects a highly politicized information environment in which political discourse is a prime target for manipulation. The primary targets were political figures, with 66 cases, pointing to concentrated efforts to shape perceptions of political actors. Social media was identified as the primary source of misinformation, responsible for 289 of the 296 cases, highlighting the critical role of digital platforms in the rapid dissemination of false information. These findings demonstrate the challenges of curbing misinformation, particularly within politically sensitive contexts, and emphasize the need for enhanced monitoring and intervention strategies to mitigate its impact.
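For readers who want the headline figures as proportions, the brief sketch below recomputes them from the counts cited above; it is an illustrative calculation only, using the July totals reported here, and is not part of the CGS methodology.

```python
# Illustrative recalculation of the July 2025 headline shares.
# All counts are taken from the figures reported in this summary.
TOTAL_CASES = 296

headline_counts = {
    "political misinformation": 220,      # dominant category
    "targeting political figures": 66,    # most-targeted group
    "originating on social media": 289,   # dominant source
}

for label, count in headline_counts.items():
    share = 100 * count / TOTAL_CASES
    print(f"{label}: {count}/{TOTAL_CASES} cases ({share:.1f}%)")

# Political cases versus every other category combined:
print("all non-political categories combined:", TOTAL_CASES - 220)  # 76 cases
```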

This alarming trend is being monitored by the Centre for Governance Studies (CGS). Since November, CGS has been regularly publishing monthly statistical analyses on misinformation in Bangladesh. The organization tracks and identifies false information from eight prominent, active fact-checking websites in the country: Fact Watch, Rumor Scanner, Ajker Patrika, AFP Fact Check (Bangla), Dismislab, NewsChecker, Fact Crescendo, and Boom BD. CGS conducts daily monitoring of these platforms to compile and verify cases of misinformation. All tracked and verified misinformation data are available to the public through the dedicated platform: https://factcheckinghub.com/.

In July, misinformation was most prevalent in the political domain, with a total of 220 recorded instances, constituting the overwhelming majority of cases. After a training aircraft crashed in the Milestone School and College area, a photocard featuring a statement allegedly made by Shafiqul Alam, Press Secretary to the Chief Adviser of the interim government, began circulating on Facebook. The photocard claimed that the Awami League and international forces were involved in the crash. However, the photocard was fake: there is no record of Shafiqul Alam making such a statement, a review of his verified Facebook account yielded no such post, and the photocard was dated July 19, two days before the crash occurred.

Other categories, such as online hoaxes (28 cases) and entertainment-related misinformation (21 cases), while significantly smaller in scale, demonstrate the persistence of non-political falsehoods that exploit public curiosity and popular culture. Religious misinformation accounted for 13 cases, indicating the ongoing use of faith-based narratives as tools for influence or division. In one case, an X (Twitter) account circulated news that a Hindu family had been forced to convert to Islam under pressure; fact-checking revealed that the story was three years old and that the family had converted voluntarily. Economic (6 cases), diplomatic (5 cases), and environmental (3 cases) misinformation remained comparatively marginal, suggesting a lower frequency of such narratives.

The targeting patterns further underscore the political focus of misinformation campaigns. Political figures (66 cases) and political parties (65 cases) together formed the largest proportion of targets, indicating sustained attempts to shape political perceptions and undermine reputations. Law enforcement and defense institutions were also notable targets (41 cases), a trend that may aim to erode public trust in state security structures. Take, for example, a viral video that falsely claimed Bangladesh’s new Army Chief had made controversial remarks about the former Prime Minister. On review, the video proved to be AI-generated, with fake media logos and inconsistencies in the uniform; moreover, no new Army Chief has been appointed recently, as the current Chief began a three-year term in June 2024. The video was entirely fabricated. Unspecified targets (61 cases) reveal the presence of broad, diffuse misinformation narratives that circulate without directly naming specific individuals or institutions. Celebrities (23 cases) were also targeted, with entertainment-linked misinformation leveraging high-profile public figures for wider reach. Religion itself (12 cases), public institutions (13 cases), and the interim government (10 cases) were likewise subject to significant targeting, while religious figures (2 cases) and private organizations (3 cases) were relatively minor focuses. To illustrate the targeting of a public institution, false social media posts claimed that a terrorist attack had hit the US Embassy in Belarus and damaged the Bangladesh and Norway embassies; in reality, the images were from Russian missile strikes in Kyiv, Ukraine.

The source distribution highlights the overwhelming role of social media in the dissemination of misinformation, accounting for 289 of the 296 cases compared to just 7 originating from online news portals. This disparity underscores the pivotal role of decentralized, user-generated platforms in spreading and amplifying false narratives. Social media’s scale, speed, and participatory nature make it both a powerful medium for misinformation and a difficult space for regulatory or corrective measures, complicating timely detection and intervention. This dominance is consistent with global patterns in digital misinformation dynamics.

Overall, the July data depicts an information disorder heavily skewed towards political, religious, and entertainment content, amplified primarily through social media, and strategically targeting political and religious figures, political parties, law enforcement and defense institutions, and influential public figures. This concentration suggests that misinformation is functioning less as a random byproduct of the digital ecosystem and more as a deliberate instrument within the broader contest for political influence and public opinion.

From a strategic perspective, these findings highlight several key implications. First, interventions must prioritize social media monitoring and regulation, incorporating advanced fact-checking mechanisms and digital literacy campaigns tailored to political content. Second, given the concentrated targeting of political and security institutions, safeguarding information integrity around these actors is critical to maintaining social stability and democratic processes. Third, addressing the substantial proportion of unspecified-target misinformation requires broader public awareness initiatives that promote critical consumption of information regardless of the purported subject. Fourth, given the appearance of AI-generated content such as the fabricated Army Chief video, it is essential to develop a comprehensive understanding of Artificial Intelligence (AI) and its implications. Integrating temporal data would further refine understanding and policy responsiveness. Overall, the July misinformation landscape reveals a politically charged, socially mediated challenge demanding multifaceted, data-driven strategies for effective mitigation.