Intel has developed a deepfake detection tool called FakeCatcher, which it claims is 96% effective at flagging altered videos. More importantly, Intel says FakeCatcher can operate in real time, with results reported within milliseconds.
Deepfake videos are produced using AI algorithms that can digitally stitch one person’s face onto another’s.
Source: Hot Hardware – Intel Combats Growing Deepfake Threat With A Real-Time Detector That Looks At Blood Flow
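The headline's mention of blood flow refers to remote photoplethysmography (rPPG): real skin shows a faint, periodic color change with each heartbeat, which synthesized faces tend not to reproduce. The sketch below is not Intel's FakeCatcher, just a minimal illustration of that general idea under assumptions; the fixed face region, the `has_pulse_like_peak` helper, and the thresholds are illustrative placeholders.

```python
# Minimal rPPG-style sketch (illustrative only, not Intel's FakeCatcher).
# Assumes OpenCV and NumPy; the fixed ROI stands in for a real face detector.
import cv2
import numpy as np

def green_channel_signal(video_path, roi=(100, 100, 200, 200)):
    """Average the green channel over a fixed face region in each frame.

    Genuine blood flow produces a faint periodic variation (the pulse)
    in this signal; deepfaked faces generally do not carry it consistently.
    """
    cap = cv2.VideoCapture(video_path)
    x, y, w, h = roi
    samples = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        patch = frame[y:y + h, x:x + w]
        samples.append(patch[:, :, 1].mean())  # green channel mean (BGR order)
    cap.release()
    return np.array(samples)

def has_pulse_like_peak(signal, fps=30.0):
    """Check for a dominant frequency in the typical heart-rate band (0.7-4 Hz)."""
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    # Illustrative threshold: a real pulse shows up as a clear peak in this band.
    return spectrum[band].max() > 2.0 * spectrum[band].mean()
```

A real system would track the face per frame, sample many skin patches, and feed the resulting signal maps to a trained classifier rather than a single spectral threshold.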