NEWS FROM THE DESK OF THE PUBLISHER

Deep Fakes and our Election

Years ago CV Weekly did a story on “deep fakes,” which (according to Wikipedia) are images, videos or audio that are edited or generated using artificial intelligence. The idea that a public figure can be impersonated on a broad scale – that is, on television or online – is a scary proposition.

Deep fakes aren’t limited to politics. We often warn our readers that the elderly are targeted by “bad actors” who mimic the voice of a loved one to try to get the victim to do something (typically to send money).

This political season, too, has been plagued by deep fakes. For example, according to cyberscoop.com, Taylor Swift said shortly after the debate ended that she would support Democratic Vice President Kamala Harris for president. She chided the campaign of former President Donald Trump for using artificial intelligence and deep fake images to falsely claim that she was supporting his campaign.

“Recently I was made aware that AI of ‘me’ falsely endorsing Donald Trump’s presidential run was posted to his site,” Swift wrote in an Instagram post officially endorsing Harris. “It really conjured up my fears around AI, and the dangers of spreading misinformation.”

Spreading misinformation – or outright lies – is more common than we realize and has been going on for decades. For example, how many TV commercials have we seen that promoted a product, from cars to cereal, with all kinds of claims? Advertisers weren’t chastised, much less pulled from the media.

But it seems to me that things are much worse now. I mean, isn’t there a difference between hawking soda and creating the image of a person who spews lies and misinformation?

As social media sites and email became commonplace, lawmakers in at least 17 states enacted laws that specifically address online impersonation done with an intent to intimidate, bully, threaten or harass a person through social media, email or other electronic or online communications. These states are California, Connecticut, Florida, Hawaii, Illinois, Louisiana, Massachusetts, Mississippi, New Jersey, New York, North Carolina, Oklahoma, Rhode Island, Texas, Utah, Washington and Wyoming.

However, there are ways to identify a deep fake. Keep in mind that because creating a full-body deep fake takes a lot of work, oftentimes just the face is substituted.

According to telefonica.com, most deep fakes are limited to face substitutions. So one way to detect a forgery is to look for incongruities between the proportions of the body and the face, or between facial expressions and body movements or postures. Also, reproducing a person’s tongue, teeth and mouth is difficult, making it easier to identify a deep fake when the person speaks.

In the end, it’s important to compare what you hear to what that person is known for. And in this season of politics, take the time to look at a candidate’s voting record – no deep fake can cover that up.

Robin Goldsworthy is the publisher of the Crescenta Valley Weekly. She can be reached at robin@cvweekly.com or (818) 248-2740.