OSINT Pulse December 2025 | A year that rewired the information space

Advanced tools like Veo and Nano Banana significantly raised the quality of synthetic video and imagery.

By Dheeshma
Published on: 31 Dec 2025, 12:11 PM IST

Hyderabad: This year began with a clear signal that the information ecosystem was about to change, and not necessarily for the better.

Meta announced it would stop funding third-party fact-checking programmes, starting in the United States. Almost immediately, fact-checking began to be framed as censorship in political and online discourse. Around the same time, long-used verification features on Facebook, including advanced photo and video search filters, quietly disappeared.

Much of the conversation then shifted toward platform-led alternatives like Community Notes.

While Community Notes was presented as decentralised and participatory, its limits became clearer as the year progressed. Notes were slow to surface, inconsistent and often ineffective during fast-moving or coordinated misinformation campaigns.

What was positioned as a replacement rarely functioned like one.

Generative AI moved from the margins to the mainstream

As platform support for verification weakened, synthetic content surged.

Over the past 12 months, AI-generated material has become easier to produce and harder to spot. Advanced tools like Veo and Nano Banana significantly raised the quality of synthetic video and imagery.

At the same time, AI platforms and products such as Perplexity, Gemini and ChatGPT introduced free plans for India, widening access on an unprecedented scale.

The result was not just more AI content, but more people using it casually. Synthetic media stopped being novel and became part of everyday online behaviour.

A noticeable shift in OSINT skill levels

Another quiet change this year was the baseline skill level within the OSINT community.

This did not start in 2025, but it became more visible over the past year. Techniques that once belonged in introductory OSINT workshops are now closer to basic digital literacy, particularly among younger audiences. Reverse searches, platform mechanics and basic metadata checks are often already familiar.
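
To give a concrete sense of how routine these checks have become, here is a minimal Python sketch of a basic metadata check, assuming the Pillow library is installed; the file name is purely illustrative. Original camera photos often carry EXIF tags such as capture time and device model, while AI-generated or re-saved images typically carry little or none.

```python
# Minimal sketch of a basic metadata check, assuming Pillow is installed
# (pip install Pillow). The file name is illustrative.
from PIL import Image, ExifTags

def print_exif(path: str) -> None:
    """Print whatever EXIF tags an image file carries, if any."""
    img = Image.open(path)
    exif = img.getexif()
    if not exif:
        print("No EXIF metadata found (common for AI-generated or re-saved images).")
        return
    for tag_id, value in exif.items():
        tag = ExifTags.TAGS.get(tag_id, tag_id)  # map numeric tag IDs to readable names
        print(f"{tag}: {value}")

print_exif("sample_photo.jpg")  # hypothetical file
```

The same check can be done without any code through browser-based viewers, which is part of why it now sits closer to basic digital literacy than to a specialist skill.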

This shift has implications for how training, documentation and collaborative workflows will be designed.

Synthetic attacks on Indian public figures

One of the more concerning trends this year was the targeted use of AI-generated audio and video against Indian politicians and uniformed service officials.

Several attacks originated from Pakistan-linked handles after Operation Sindoor and intensified as officials increased their public communication. Speeches, interviews and press interactions provided enough material to replicate voices and mannerisms with convincing accuracy.

A detailed year-ender examining how AI was used for identity theft of public figures in India in 2025 is available here.

Children, generative tools and permanent digital footprints

Another trend that accelerated quietly this year was the use of children’s photos and videos in generative AI tools.

Parents and creators increasingly use images of their kids to generate stories, avatars or stylised videos. Often framed as harmless or creative, this raises unresolved questions around consent, permanence and misuse. Once a child’s likeness enters generative systems, control over where it appears and how it is reused is effectively lost.

Tool Update: An OSINT-focused LLM in the making

Journalist and technologist Tom Vaillant shared early work on building a public LLM tool for open-source researchers, including the Bellingcat community.

The proposed model is designed to integrate the Bellingcat toolkit, an OSINT knowledge base and documentation for listed resources, with an emphasis on open access and reducing research friction. The projected completion date is April 2026.

Early testing shows the tool responding to prompts such as:

- How do I investigate a suspicious domain?

- What tools can I use to verify an image’s authenticity?

- How can I trace the origin of a social media account?

- What’s the best way to archive web content for investigation?

It is still early, but the direction reflects a broader move toward embedding verification support into tools rather than relying entirely on individual expertise.

An early testing prototype can be accessed here.

Newsroom verification at scale: AP Verify

Another notable development came from The Associated Press, which launched AP Verify, a newsroom verification tool developed in collaboration with Rutgers University.

Shared by Aimee Rinehart, who works on generative AI solutions for news at AP, the tool brings multiple verification functions into a single dashboard. These include reverse image searches, geolocation, digital footprint tracking, transcription, analysis and team-based storage.

The project involved multiple teams across the organisation, including editorial, product, legal, technology, finance, partnerships and revenue. Verification is built into newsroom workflows instead of being handled case by case.

Those interested can request a demo.

Finding eyewitness content on X using InVID

AFP’s Hong Kong–based journalist Sophia Xu shared a practical walkthrough on using the InVID WeVerify plugin to run advanced searches on X.

Using a fire at Sydney airport in November 2024 as an example, the tutorial shows how refining searches by time, keywords and context can surface original eyewitness videos posted in the immediate aftermath.

Here is a link to the tutorial.
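
For readers who want to approximate the same approach outside the plugin, here is a rough Python sketch of how a time- and keyword-bounded query could be assembled with X's public search operators. The keywords, dates and URL pattern below are illustrative assumptions, and operator behaviour on X changes often, so treat this as a sketch rather than a recipe.

```python
# Illustrative sketch: assembling an X advanced-search query similar to the one
# built in the InVID WeVerify walkthrough. Operators reflect X's search syntax
# at the time of writing and may change.
from urllib.parse import quote

def build_x_search_url(keywords, since, until, videos_only=True, exclude_retweets=True):
    """Assemble an X search URL bounded by date and keyword filters."""
    parts = [" ".join(keywords), f"since:{since}", f"until:{until}"]
    if videos_only:
        parts.append("filter:native_video")   # only clips uploaded directly to X
    if exclude_retweets:
        parts.append("-filter:retweets")      # drop amplification, keep original posts
    query = " ".join(parts)
    # f=live asks for the "Latest" tab, where fresh eyewitness posts tend to surface
    return f"https://x.com/search?q={quote(query)}&f=live"

# Hypothetical example: clips posted in a two-day window around an incident
print(build_x_search_url(
    keywords=["sydney", "airport", "fire"],
    since="2024-11-21",   # illustrative dates, not the verified incident window
    until="2024-11-23",
))
```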

Looking back, this year was less about the arrival of new tools and more about structural change.

Platform support for verification weakened. Synthetic content became routine. Investigative skills quietly levelled up. And more of the responsibility for sense-making shifted onto journalists, researchers and communities.

The work continues, but the terrain has clearly changed.

See you next month!

Dheeshma Puzhakkal

(This is part of a monthly report written by Dheeshma Puzhakkal, Editor of NewsMeter’s Fact Check team, on emerging developments in OSINT and AI, with a focus on what matters most to Indian readers and OSINT professionals. For comments, insights and leads, please email dheeshma.p@newsmeter.in. We do not have a financial relationship with any of the companies or tools mentioned as part of this series.)
