Digital Pulse

Neural Defend and Zee News Launch Deepfake Verification System for News Media

By Digital Pulse
October 23, 2025
in DeFi


Deepfake detection firm Neural Defend and India’s Zee News have teamed up to launch the country’s first AI-powered deepfake verification system for news media.

The partnership will allow Zee News consumers to upload suspicious videos, audio clips, or images and have Neural Defend’s technology determine within seconds whether or not the material has been artificially manipulated.

Founded in 2024 and headquartered in San Francisco, California, Neural Defend made its Finovate debut at FinovateEurope 2025 in London. Piyush Verma is CEO.

Deepfake detection specialist Neural Defend has teamed up with Mumbai-based Zee News to launch India’s first AI-powered deepfake verification system for news media. The new solution gives individuals direct access to advanced verification technology, enabling them to authenticate videos, images, and audio files in real time.

“Our goal was to ensure deepfake detection is fast, accurate, and simple for every citizen,” said ZMCL Chief Technology Officer Vijayant Kumar. “By integrating Neural Defend’s advanced AI with Zee News’ platforms, we have created a solution that can detect even the most sophisticated manipulations within seconds. This is not only an innovation for today, but a future-proof safeguard for tomorrow’s information ecosystem.”

The partnership will allow individuals to upload suspicious videos, images, or audio clips and have Neural Defend’s technology analyze the files and confirm their authenticity, or identify them as artificially manipulated, within seconds. At a time when the average viewer struggles to distinguish increasingly sophisticated manipulated content, including video, from genuine content, the collaboration between Neural Defend and Zee News gives media consumers new tools to help them “separate fact from fiction in an age where misinformation spreads fast,” said ZMCL Marketing Head Anindya Khare.

“While Gen Z and younger viewers are particularly vulnerable to being misled by fake videos and audio, this initiative ensures a safe and credible space for everyone,” Khare added. “For advertisers and partners, it creates the most reliable environment to engage with audiences, where advanced technology and authenticity come together. This is the future of brand-safe and responsible media.”

Mumbai-based Zee News is one of the leading Hindi news channels in India, with more than 52 million viewers. The company is owned by Indian media conglomerate Essel Group and is the flagship channel of Zee Media Corporation. Zee News is publicly traded on the Bombay Stock Exchange (BSE) and the National Stock Exchange (NSE), and has a market capitalization of $75 million.

Founded in 2024 and headquartered in San Francisco, California, Neural Defend made its Finovate debut at FinovateEurope 2025 in London. At the conference, the company demonstrated its agentic AI-powered deepfake detection solution, which can be integrated into any video, audio, or image verification platform to provide real-time identity verification to eKYC services, verification companies, banks, payments service providers, fintechs, and more. Neural Defend’s technology leverages proprietary, multi-layered AI to spot even subtle alterations and manipulations with precision. The solution also boosts security for video and audio calls by instantly detecting and mitigating deepfakes in real time.

Photo by Mika Baumeister on Unsplash


Copyright © 2024 Digital Pulse.
Digital Pulse is not responsible for the content of external sites.
