🎣Scams & Fraud

AI Voice Clone Detector

Worried a phone call might be using a cloned voice? Describe the call and get help identifying signs of AI voice cloning scams.

🔒 No data stored • Instant results • 🆓 100% free

Frequently Asked Questions

How realistic are AI voice clones?
Very realistic. Modern AI can clone a voice from just a few seconds of audio taken from social media, voicemails, or videos. The clone can sound nearly identical to the real person, which makes these scams especially convincing and dangerous.

What are the most common voice cloning scams?
The 'grandparent scam' is the most common: scammers clone a grandchild's voice and call pretending to be in jail or hurt in an accident, needing bail or emergency money. CEO fraud targets businesses with cloned executive voices authorizing wire transfers. Fake kidnapping calls use a cloned voice to demand ransom.

How can I verify that a caller is really who they claim to be?
Hang up and call back on a number you know is real. Ask a personal question only the real person would know (not something findable on social media). Request a video call. Agree on a family code word for emergencies in advance. And trust your instincts if something feels wrong.

Where do scammers get samples of someone's voice?
From social media videos, YouTube content, TikToks, voicemail greetings, podcast appearances, or any other public audio. Even a few seconds is enough for modern AI, which is why limiting public audio of yourself and your family can reduce risk.
