Deepfakes wreak havoc in news


An AFP journalist views a video on January 25, 2019, manipulated with artificial intelligence to potentially deceive viewers, or "deepfake," at his desk in Washington, D.C. (Alexandra Robinson/AFP/Getty Images/TNS)

Seeing was no longer believing. Technology had officially tricked humanity's very own eyes through a face-mirroring technique known as the "deepfake."

Deepfakes were manipulated videos produced by artificial intelligence (AI) that yielded fabricated images that seemed real, according to CNBC.

Deepfakes were not necessarily a bad thing. Filmmakers used deepfakes when an actor had passed away or was simply unavailable.

In the movie Furious 7, Paul Walker, who played Brian O'Conner, passed away before filming a large portion of the movie. The filmmakers decided to use a convincing deepfake to save the production, according to Vanity Fair.

However, it was not the deepfakes themselves that were considered dangerous; it was the way people took advantage of them.

Recently, a deepfake clip surfaced on the internet. It depicted President Volodymyr Zelenskyy telling Ukrainian forces to "stand down" and surrender to Russia, according to NPR. This was blatant disinformation and was quickly taken down, but not before hundreds of thousands of people saw it.

Deepfakes like these were considered a threat to national security, serious enough that the Pentagon stepped in to fight against them.

According to CNN Business, the Pentagon's research arm, the Defense Advanced Research Projects Agency (DARPA), took on the problem. Through military funding, DARPA developed robust algorithms to spot and flag deepfakes.

And it did not help that, with advanced AI, almost anyone could create a convincing deepfake.

The technology behind deepfakes was data-hungry; even a short deepfake video required thousands of real pictures. Yet with advances in AI, anyone could make one, according to Cyber News.

While the spread of deepfakes was hard to control, media viewers were the ones who needed to be on the lookout more than ever, especially in an age when everything humans saw might not be real.

According to CNN Business, audio and video had long functioned as a bedrock of truth. If deepfakes made people lose trust in their own eyes and ears, it could take the war of misinformation to a whole new level.