DOJ Cites Deepfake Risk to Block Release of Biden’s Interview with Special Counsel

The Biden administration is refusing to release audio of President Biden’s interview with Special Counsel Robert Hur, arguing that it could be manipulated through deepfake technology.
The Justice Department (DOJ) outlined its concerns in a Friday court filing, acknowledging that enough audio of both Biden and Hur is already public to create AI deepfakes of either man. It argued, however, that releasing the authentic recording would make false versions harder to disprove.
The DOJ also cited the passage of time and advances in audio and artificial-intelligence technology, which it said could amplify the potential for malicious manipulation of the audio.
The administration is facing pressure from conservative legal groups and House Republicans to release the audio. The DOJ has previously released a transcript of the interview, which revealed several embarrassing moments for the president.

The Music Industry’s Battle Against AI-Generated Fakery

As generative AI technology advances, the music industry faces a new challenge: the creation and proliferation of deepfake music that mimics the voices and styles of artists without their consent. This has raised concerns about copyright infringement, identity theft, and the potential devaluation of artists’ unique contributions.

Industry leaders are advocating for legislation to protect artists from unauthorized deepfake creations. Tennessee's Ensuring Likeness Voice and Image Security (ELVIS) Act, the first law of its kind in the US, prohibits the use of AI to clone performers' voices and likenesses without their consent. Federal legislation is also being considered.

However, the use of AI in music also has potential benefits, such as assisting artists with songwriting and production. Industry stakeholders are exploring ways to regulate AI usage while promoting innovation and protecting artists’ interests.

Fake Audio of Philippine President Fuels Tensions in South China Sea

A manipulated audio clip falsely attributed to Philippine President Ferdinand Marcos Jr. has heightened tensions between the Philippines and China over their longstanding territorial disputes in the South China Sea. The fake audio, accompanied by images of Chinese vessels, was posted to a popular YouTube channel and purported to show Marcos Jr. directing the Philippine military to confront a particular foreign country. The Presidential Communications Office quickly confirmed the audio to be entirely fabricated, a product of deepfake technology, and emphasized that no such directive had been issued, condemning the spread of misinformation and disinformation online.

Experts believe the fake audio is unlikely to have been created by Beijing, as it does not align with Chinese interests in the region. The Philippines and the US recently commenced their annual Balikatan joint military exercises, further fueling concerns about escalating tensions.

The viral deepfake has raised concerns about cybersecurity preparedness in the Philippines, prompting calls for increased education on detecting online fraud. The government plans to combat the proliferation of deepfakes and other malicious AI-generated content through its Media and Information Literacy Campaign.

FIR Filed against Individual for Uploading AI-Generated Political Video featuring Ranveer Singh

Maharashtra Police's cyber cell has registered an FIR against an individual for uploading an AI-generated video that showed actor Ranveer Singh endorsing a political party. The video, created by manipulating footage from Singh's recent interview in Varanasi, sparked concerns about the proliferation of deepfakes. Singh's father, Jagjit Singh Bhavnani, filed a complaint alleging that the video was false and that the actor has no affiliation with any political party. The FIR underscores the severity of the offense and the potential impact of deepfake technology on public opinion.

Deepfake Video Probe Underway, Mumbai DCP Urges Non-Forwarding

Mumbai DCP Datta Nalawade has confirmed that an investigation into deepfake videos featuring Aamir Khan and Ranveer Singh is ongoing. He urged people not to forward such videos, which are generated by computer algorithms trained on multiple images of the same person. Five complaints have been filed so far, and investigations are underway in all of them. Nalawade emphasized the importance of reporting such videos and staying alert to them.

Ranveer Singh Denies Deepfake Video, FIR Filed Against User

A First Information Report (FIR) has been filed against an unidentified individual for allegedly uploading a deepfake video featuring actor Ranveer Singh. The video, which has since been removed, appeared to show Singh endorsing a political party. However, Singh’s father has stated that the actor never made such a statement and has no affiliation with any political organization.

Deepfake Mischief: Cyber Police File FIR against X User for Manipulating Ranveer Singh Video

The Maharashtra Police's cyber cell has registered an FIR against an unidentified X user, @sujataindia1st, for uploading a manipulated video of actor Ranveer Singh. The altered video, which showed Singh apparently endorsing a political party, sparked outrage and concern. Singh's father, Jagjit Singh Bhavnani, filed a complaint in response, leading to the registration of the FIR.

The malicious video distorted Singh's actual statement, which praised Prime Minister Narendra Modi's efforts to preserve Indian culture and heritage. The falsified version portrayed Singh criticizing the Prime Minister and promoting a political party with which he has explicitly denied any affiliation. The FIR includes charges under relevant sections of the Indian Penal Code and the Information Technology Act, underscoring the seriousness of the offense. The incident highlights the growing threat of deepfake technology, which can be used to spread misinformation and defame individuals. Authorities are investigating the case, and further updates are expected.
