When an average person can create a realistic fake video of the president saying anything they want, Farid said, "we have entered a new world where it is going to be difficult to know how to believe what we see." The reverse is a concern, too. People may dismiss as fake genuine footage, say of a real atrocity, to score political points.

Realizing the implications of the technology, the U.S. Defense Advanced Research Projects Agency is already two years into a four-year program to develop technologies that can detect fake images and videos. Right now, it takes extensive analysis to identify phony videos. It's unclear if new ways to authenticate images or detect fakes will keep pace with deepfake technology.

Deepfakes are so named because they utilize deep learning, a form of artificial intelligence. They are made by feeding a computer an algorithm, or set of instructions, along with lots of images and audio of a certain person. The computer program learns how to mimic the person's facial expressions, mannerisms, voice and inflections. If you have enough video and audio of someone, you can combine a fake video of the person with fake audio and get them to say anything you want.

So far, deepfakes have mostly been used to smear celebrities or as gags, but it's easy to foresee a nation state using them for nefarious activities against the U.S., said Sen. Marco Rubio, R-Fla., one of several members of the Senate intelligence committee who are expressing concern about deepfakes.

A foreign intelligence agency could use the technology to produce a fake video of an American politician using a racial epithet or taking a bribe, Rubio says. Or a fake video of a U.S. soldier massacring civilians overseas, or one of a U.S. official supposedly admitting a secret plan to carry out a conspiracy, or a U.S. leader - or an official from North Korea or Iran - warning the United States of an impending disaster.

"It's a weapon that could be used - timed appropriately and placed appropriately - in the same way fake news is used, except in a video form, which could create real chaos and instability on the eve of an election or a major decision of any sort," Rubio told The Associated Press.

Deepfake technology still has a few hitches. For instance, people's blinking in fake videos may appear unnatural.

"Within a year or two, it's going to be really hard for a person to distinguish between a real video and a fake video," said Andrew Grotto, an international security fellow at the Center for International Security and Cooperation at Stanford University in California.

"This technology, I think, will be irresistible for nation states to use in disinformation campaigns to manipulate public opinion, deceive populations and undermine confidence in our institutions," Grotto said. He called for government leaders and politicians to clearly say it has no place in civilized political debate.

Crude videos have been used for malicious political purposes for years, so there's no reason to believe the higher-tech ones, which are more realistic, won't become tools in future disinformation campaigns. The U.S. Embassy in Moscow complained to the Russian Foreign Ministry about a fake sex video it said was made to damage the reputation of a U.S. diplomat.
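The unnatural-blinking hitch mentioned above is one cue detection researchers have keyed on, since early face-swap models were trained mostly on open-eyed photos. As a rough illustration only - not any specific detector, and assuming per-frame eye landmark points have already been extracted by an external face-tracking library (not shown) - the sketch below computes the common eye-aspect-ratio (EAR) heuristic, which drops sharply when an eye closes, and counts blinks in a video; a clip of normal length with implausibly few blinks could be flagged for closer analysis. All function names and thresholds here are illustrative assumptions.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR from six eye landmark points ordered around the eye contour:
    the ratio of the eye's vertical openings to its horizontal width.
    High (~0.3+) when the eye is open, near zero when it is closed."""
    eye = np.asarray(eye, dtype=float)
    v1 = np.linalg.norm(eye[1] - eye[5])  # first vertical distance
    v2 = np.linalg.norm(eye[2] - eye[4])  # second vertical distance
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal distance
    return (v1 + v2) / (2.0 * h)

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks in a per-frame EAR series: a blink is a run of at
    least `min_frames` consecutive frames with EAR below `threshold`.
    Thresholds are illustrative, not tuned values."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # blink still in progress at end of clip
        blinks += 1
    return blinks

# Toy usage: a synthetic EAR series with two brief eye closures.
series = [0.3] * 10 + [0.05] * 3 + [0.3] * 10 + [0.05] * 3 + [0.3] * 5
print(count_blinks(series))  # 2
```

In a real pipeline the landmark points would come from a face-landmark detector run on each frame, and the observed blink rate would be compared against typical human rates. Note that later deepfake generators largely fixed the blinking artifact, so this heuristic is a historical illustration rather than a reliable detector.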