Although the technology has been employed before during Pakistan’s notoriously oppressive election season, this instance garnered international notice. Imran Khan, Pakistan’s former prime minister, spent the entire electoral campaign in jail and was disqualified from running. With the 2024 U.S. presidential campaign already underway, the Pakistani deepfake drew particular attention because it came from Khan himself: a way of campaigning from jail even though he was prohibited from doing so.

On Saturday, Mr. Khan’s A.I. voice declared victory as official tallies showed candidates affiliated with his party, Pakistan Tehreek-e-Insaf, or P.T.I., winning the most seats, an unexpected outcome that threw the nation’s political structure into disarray.

The video resembles the one Khan released from jail on December 19, 2023, with a few updates to the speech, including the declaration of victory. The first part of the video is spoken in Urdu, followed by an English version with English subtitles. It is worth a listen.

Khan says, “I had full confidence that you would all come out to vote. You fulfilled my faith in you, and your massive turnout has stunned everybody.” The speech rejects the victory claim of Nawaz Sharif, whom Khan calls a “rival,” and urges supporters to defend his win.

The entire video is filled with historical images and footage of Mr. Khan and, notably, includes a disclaimer about its artificial-intelligence origins.

The New York Times points out that this type of AI usage is not unprecedented.

Prior to the 2022 election, South Korea’s People Power Party, then in opposition, developed an artificial intelligence (A.I.) avatar of its presidential candidate, Yoon Suk Yeol, that conversed with voters digitally, using slang and jokes to appeal to a younger audience. He won.

Politicians in the U.S., Canada, and New Zealand have employed artificial intelligence (A.I.) to produce dystopian imagery that supports their positions or highlights the technology’s potentially hazardous aspects, as demonstrated in a video featuring Jordan Peele and a deepfake of Barack Obama.

For the 2020 state election in Delhi, India, Manoj Tiwari, a candidate for the ruling Bharatiya Janata Party, produced an A.I. deepfake of himself speaking Haryanvi to appeal to voters in that demographic. It was not labeled as A.I. as clearly as the Khan video was.

What about the fake robocall featuring President Joe Biden?

We just had the fake robocall featuring President Joe Biden. The caller states, “Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again,” in what appears to be an impersonation or digital manipulation of the president’s voice. To counteract the kind of misinformation that A.I. and deepfakes can produce during elections, legislators from both major parties have drafted laws in at least 14 states.

As the U.S. elections draw closer and the campaign trail heats up, more deepfakes will appear, just like the Imran Khan video. And experts note that these deepfakes may or may not be made by the candidates themselves.

Featured Image Credit: Ron Lach; Pexels

Deanna Ritchie

Managing Editor at ReadWrite

Deanna is an editor at ReadWrite. Previously, she worked as Editor in Chief for Startup Grind, Editor in Chief for Calendar, and an editor at Entrepreneur Media, and she has more than 20 years of experience in content management and content development.