Falsified Satellite Images in Deepfake Geography Seen as Security Threat

By John P. Desmond, AI Trends Editor

Deepfake is a portmanteau of “deep learning” and “fake”, and refers to synthetic media in which a person in an existing image or video is replaced with someone else’s likeness. Deepfakes use techniques from machine learning and AI to manipulate visual and audio content with a high potential to deceive.

Deepfakes applied to geography have the potential to falsify satellite image data, which could pose a national security threat. Scientists at the University of Washington (UW) are studying this, in the hopes of finding ways to detect fake satellite images and warn of their dangers.

Bo Zhao, Assistant Professor of Geography, University of Washington

“This isn’t just Photoshopping things. It’s making data look uncannily realistic,” stated Bo Zhao, assistant professor of geography at the UW and lead author of the study, in a news release from the University of Washington. The study was published on April 21 in the journal Cartography and Geographic Information Science. “The techniques are already there. We’re just trying to expose the possibility of using the same techniques, and of the need to develop a coping strategy for it,” Zhao stated.

Fake locations and other inaccuracies have been part of mapmaking since ancient times, due to the nature of translating real-life locations to map form. But some inaccuracies in maps are created by the mapmakers to prevent copyright infringement.


National Geospatial Intelligence Agency Director Sounds Alarm

Now with the prevalence of geographic information systems, Google Earth and other satellite imaging systems, the spoofing involves great sophistication and carries more risks. The director of the federal agency in charge of geospatial intelligence, the National Geospatial Intelligence Agency (NGA), sounded the alarm at an industry conference in 2019.

“We’re currently faced with a security environment that is more complex, interconnected, and volatile than we’ve experienced in recent memory—one which will require us to do things differently if we’re to navigate ourselves through it successfully,” stated NGA Director Vice Adm. Robert Sharp, according to an account from SpaceNews.

To study how satellite images can be faked, Zhao and his team at UW used an AI framework that has been used to manipulate other types of digital files. When applied to the field of mapping, the algorithm essentially learns the characteristics of satellite images from an urban area, then generates a deepfake image by transferring the learned characteristics onto a different base map. The researchers employed a generative adversarial network machine learning framework to achieve this.
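The paper’s training code is not reproduced here, but the description above maps onto a conditional, image-to-image GAN. The sketch below is a minimal pix2pix-style example in PyTorch; the Generator and Discriminator layer sizes, the L1 loss weight, and the train_step helper are illustrative assumptions, not the study’s actual model.

```python
# Minimal sketch of a conditional GAN for map-to-satellite translation.
# Architecture details are illustrative assumptions, not the authors' model.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a 3-channel base-map tile to a 3-channel synthetic satellite tile."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, base_map):
        return self.net(base_map)

class Discriminator(nn.Module):
    """Judges whether a (base map, satellite tile) pair looks real."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),  # PatchGAN-style score map
        )

    def forward(self, base_map, satellite):
        return self.net(torch.cat([base_map, satellite], dim=1))

def train_step(gen, disc, g_opt, d_opt, base_map, real_satellite):
    bce = nn.BCEWithLogitsLoss()

    # Discriminator: real pairs -> 1, generated pairs -> 0
    fake = gen(base_map).detach()
    d_real = disc(base_map, real_satellite)
    d_fake = disc(base_map, fake)
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: fool the discriminator while staying close to the real image
    fake = gen(base_map)
    g_adv = disc(base_map, fake)
    g_loss = bce(g_adv, torch.ones_like(g_adv)) + 100 * nn.functional.l1_loss(fake, real_satellite)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
    d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4)
    base_map = torch.rand(4, 3, 64, 64) * 2 - 1        # stand-in for map tiles
    real_satellite = torch.rand(4, 3, 64, 64) * 2 - 1  # stand-in for imagery
    print(train_step(gen, disc, g_opt, d_opt, base_map, real_satellite))
```

In this setup the generator learns to paint satellite-like texture onto a base-map tile, while the discriminator learns to tell real (map, image) pairs from generated ones.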

The researchers combined maps and satellite images from three cities—Tacoma, Seattle and Beijing—to compare features and create new images of one city, drawn from the characteristics of the other two. The untrained eye may have difficulty detecting the differences between real and fake, the researchers noted. The researchers studied image characteristics such as color histograms, frequency and spatial domains, texture, and contrast to try to identify the fakes.
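As a rough illustration of the kinds of cues mentioned, the sketch below compares a trusted image and a suspect image on two simple statistics: per-channel color histograms and the share of high-frequency energy in the Fourier spectrum. The function names and the fixed low-frequency band are assumptions for demonstration, not the study’s actual measurements.

```python
# Illustrative comparison of color-histogram and frequency-domain statistics.
import numpy as np

def color_histogram(img, bins=32):
    """Per-channel histogram of an HxWx3 uint8 image, normalized to sum to 1."""
    hists = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def high_frequency_energy(img):
    """Fraction of spectral energy outside a central low-frequency band of the
    grayscale image; GAN artifacts often shift this statistic."""
    gray = img.mean(axis=2)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = gray.shape
    cy, cx = h // 2, w // 2
    low = spectrum[cy - h // 8: cy + h // 8, cx - w // 8: cx + w // 8].sum()
    return 1.0 - low / spectrum.sum()

def compare(real_img, suspect_img):
    """Distance between the two images' histogram and frequency signatures."""
    hist_dist = np.abs(color_histogram(real_img) - color_histogram(suspect_img)).sum()
    freq_dist = abs(high_frequency_energy(real_img) - high_frequency_energy(suspect_img))
    return {"histogram_L1": hist_dist, "high_freq_gap": freq_dist}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    real = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)
    fake = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)
    print(compare(real, fake))
```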


Simulated satellite imagery can serve a legitimate purpose when used to represent how an area is affected by climate change over time, for example. If there are no images for a certain period, filling in the gaps with simulated imagery can provide perspective. The simulations need to be labeled as such.

The researchers hope to learn how to detect fake images, to help geographers develop data literacy tools, similar to fact-checking services. As technology continues to evolve, this study aims to encourage a more holistic understanding of geographic data and information, so that we can demystify the question of the absolute reliability of satellite images or other geospatial data, Zhao stated. “We also want to develop more future-oriented thinking in order to take countermeasures such as fact-checking when necessary,” he said.

In an interview with The Verge, Zhao stated the aim of his study “is to demystify the function of absolute reliability of satellite images and to raise public awareness of the potential influence of deep fake geography.” He stated that although deepfakes are widely discussed in other fields, his paper is likely the first to touch upon the topic in geography.

“While many GIS [geographic information system] practitioners have been celebrating the technical merits of deep learning and other types of AI for geographical problem-solving, few have publicly recognized or criticized the potential threats of deep fake to the field of geography or beyond,” stated the authors.

US Army Researchers Also Working on Deepfake Detection

C.-C. Jay Kuo, Professor of Electrical and Computer Engineering, University of Southern California

US Army researchers are also working on a deepfake detection method. Researchers at the US Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory, in collaboration with Professor C.-C. Jay Kuo’s research group at the University of Southern California, are examining the threat that deepfakes pose to our society and national security, according to a release from the US Army Research Laboratory (ARL).

Their work is featured in the paper titled “DefakeHop: A light-weight high-performance deepfake detector,” which will be presented at the IEEE International Conference on Multimedia and Expo 2021 in July.

ARL researchers Dr. Suya You and Dr. Shuowen (Sean) Hu noted that most state-of-the-art deepfake video detection and media forensics methods are based upon deep learning, which has inherent weaknesses in robustness, scalability, and portability.

“Due to the progression of generative neural networks, AI-driven deepfakes have advanced so rapidly that there is a scarcity of reliable techniques to detect and defend against them,” You stated. “We have an urgent need for an alternative paradigm that can understand the mechanism behind the startling performance of deepfakes, and to develop effective defense solutions with solid theoretical support.”

Relying on their experience with machine learning, signal analysis, and computer vision, the researchers developed a new theory and mathematical framework they call Successive Subspace Learning, or SSL, an innovative neural network architecture. SSL is the key innovation of DefakeHop, the researchers stated.

“SSL is an entirely new mathematical framework for neural network architecture developed from signal transform theory,” Kuo stated. “It is radically different from the traditional approach. It is very suitable for high-dimensional data that have short-, mid- and long-range covariance structures. SSL is a complete data-driven unsupervised framework, offering a brand-new tool for image processing and understanding tasks such as face biometrics.”
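As a loose, unofficial illustration of that idea, the sketch below learns a data-driven subspace from image patches without labels (plain PCA standing in for the Saab-style transforms used in SSL work) and feeds the resulting features to a lightweight classifier. It only gestures at the flavor of the approach and is not the DefakeHop implementation.

```python
# Simplified stand-in for one successive-subspace "hop": unsupervised,
# PCA-style features learned from patches, plus a small supervised classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

def extract_patches(images, patch=8, stride=8):
    """Split NxHxW grayscale images into flattened patch vectors."""
    n, h, w = images.shape
    rows = []
    for img in images:
        for y in range(0, h - patch + 1, stride):
            for x in range(0, w - patch + 1, stride):
                rows.append(img[y:y + patch, x:x + patch].ravel())
    return np.asarray(rows)

def fit_hop(patches, n_components=16):
    """One 'hop': learn a data-driven subspace (PCA) without labels."""
    return PCA(n_components=n_components).fit(patches)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in data: 64 "real" and 64 "fake" 32x32 grayscale crops.
    real = rng.normal(0.0, 1.0, size=(64, 32, 32))
    fake = rng.normal(0.5, 1.2, size=(64, 32, 32))
    images = np.concatenate([real, fake])
    labels = np.array([0] * 64 + [1] * 64)

    patches = extract_patches(images)
    hop = fit_hop(patches)
    feats = hop.transform(patches).reshape(len(images), -1)  # per-image features

    clf = LogisticRegression(max_iter=1000).fit(feats, labels)
    print("training accuracy:", clf.score(feats, labels))
```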

Read the source articles and information in a news release from the University of Washington, in the journal Cartography and Geographic Information Science, an account from SpaceNews, a release from the US Army Research Laboratory, and in the paper titled “DefakeHop: A light-weight high-performance deepfake detector.”
