"The promise of realtime deepfakes for fraudsters is that they can use the tech to engage with a victim in the moment. Rather than some scripted video which may or may not be tailored to the victim, realtime deepfakes allow a scammer to talk directly to their mark and improvise on video calls or livestreams. They can appear just as human as the person they are impersonating, potentially fooling not only people but also the automated systems that require someone to prove their identity to open an account with a financial institution, for example.
"For months, 404 Media has monitored the spread of deepfake technology throughout fraud-focused Telegram channels. For much of that time, the results were not impressive. Some involved using AI to animate a photo in an attempt to bypass cryptocurrency exchanges’ identity verification processes, and the videos were stilted and unnatural. Others looked more realistic, but it was unclear whether the advertisements were scams; fraudsters on Telegram asked for hundreds of dollars for access to tools that allegedly bypassed know-your-customer (KYC) verification checks. Some fraudsters also advertised access to tools that let a phone user replace their camera’s input with a file from their phone’s gallery, meaning they could upload the deepfake video to services that ask for a selfie. 404 Media has also seen Instagram accounts where a real person consistently deepfakes themselves to appear as a different gender in order to catfish people."
https://www.404media.co/the-age-of-realtime-deepfake-fraud-is-here/