Think you can trust a voice call? Think again, and read this now!
- By Divya Adhikari
- 06 Sep, 2025

In India, scammers are using AI to trick victims like never before. A shocking incident in Hyderabad highlights how dangerous these scams have become. In March, a 72-year-old homemaker received a WhatsApp message that appeared to come from her sister-in-law living in New Jersey. The message urgently requested money.
When she called to confirm, the voice on the line sounded familiar and said “Yes.” Trusting the voice, she transferred ₹1.97 lakh through Google Pay. Only later did she realise she had been scammed.
Experts warn that AI-powered fraud, including voice cloning, deepfakes, and OTP manipulation, is rising rapidly in India. Even familiar voices and faces can no longer be trusted. Criminals exploit the technology to make their deception seem authentic, making detection extremely difficult.
People are advised to double-check requests for money, confirm them through multiple channels, and never rely solely on a voice or video. Banks and authorities are also strengthening AI detection and fraud-prevention measures.
This incident is a warning that AI scams are no longer science fiction—they are real, widespread, and financially devastating. Awareness is the first line of defence.