Intel announces FakeCatcher, a service that detects deepfake videos with 96% accuracy

Deepfake clips have become so prevalent and well-crafted that Intel believes a specialized detector, FakeCatcher, is needed to combat them.

The announcement comes after EU authorities earlier this year called on tech companies to fight deepfake videos and disinformation more aggressively, hinting at a package of laws that would include stiff penalties. Called the Digital Services Act, it requires companies running online platforms to hand over information relevant to fighting disinformation and to intervene for the timely removal of abusive content. This includes developing an effective mechanism for detecting videos and images altered by deepfake technology.

In other words, the new FakeCatcher is essentially a free version of a service that Intel developed primarily with high-value customers in the tech sector in mind.

Intel claims the product has a 96% accuracy rate and works by analyzing blood flow in facial blood vessels. Although such details are normally invisible to the average viewer, the software can detect these subtle differences in videos uploaded for analysis. The verdict is delivered in virtually real time, with FakeCatcher users finding out on the spot whether or not the clip submitted for analysis is a fake.

Ilke Demir, a senior researcher at Intel Labs, designed FakeCatcher in collaboration with Umur Ciftci of the State University of New York at Binghamton. The service uses Intel hardware and software, runs on a dedicated server, and features a web interface for uploading videos for analysis. In other words, no local software installation or dedicated PC is required for the analysis part.

FakeCatcher analyzes 32 locations on the face for inconsistencies in how blood flow is distributed. For example, if a face is “transplanted” and artificially modeled onto another person’s face, deep-learning algorithms created by Intel can detect anomalies that expose the fake.
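The idea of reading blood flow from video is known in the research literature as remote photoplethysmography (rPPG): subtle, periodic color changes in the skin track the pulse, and in a genuine video those changes are consistent across facial regions. Intel has not published FakeCatcher's implementation, so the following is only a toy sketch of the general principle, not Intel's algorithm: it extracts a green-channel intensity signal from a number of face patches (the patch coordinates and the 32-patch count are illustrative assumptions) and scores how coherent those signals are across patches. A composited or synthesized face would tend to score lower on such a consistency measure.

```python
import numpy as np

def patch_signals(frames, boxes):
    """Mean green-channel intensity per patch over time (toy rPPG proxy).

    frames: (T, H, W, 3) uint8 video array.
    boxes:  list of (y, x, size) square patches, assumed to lie on the face.
    Returns an array of shape (n_patches, T).
    """
    signals = []
    for y, x, s in boxes:
        # Green channel carries the strongest pulse signal in rPPG work.
        region = frames[:, y:y + s, x:x + s, 1].astype(float)
        signals.append(region.mean(axis=(1, 2)))
    return np.stack(signals)

def coherence_score(signals):
    """Mean pairwise correlation of centered patch signals.

    High when all patches share one underlying pulse waveform,
    near zero when the patches vary independently.
    """
    centered = signals - signals.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(centered, axis=1, keepdims=True)
    norms[norms == 0] = 1.0          # guard against flat signals
    unit = centered / norms
    corr = unit @ unit.T             # pairwise correlation matrix
    n = len(signals)
    return corr[~np.eye(n, dtype=bool)].mean()
```

In this toy setup, 32 patch signals dominated by a shared ~1 Hz "pulse" yield a coherence score near 1, while 32 independent noise signals score near 0; a real detector would of course feed far richer spatial and temporal features into a trained model rather than threshold a single correlation statistic.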

A cat-and-mouse game will likely ensue between software developers assisting in the creation of deepfake clips and those providing detection solutions, and it will only be a matter of time before the details analyzed by FakeCatcher algorithms are included in the processing of even harder-to-detect deepfake clips.
