HfG Karlsruhe prints 3D models of illegal ammunition to train neural networks to detect illegal weapons in the Syrian war
KIM, the research group on Artificial Intelligence at the Karlsruhe University of Arts and Design, announces its collaboration with the artist and coder Adam Harvey on the project VFRAME. VFRAME stands for “Visual Forensics and Metadata Extraction” and is an open-source Artificial Intelligence system for detecting illegal ammunition in war zones. Specifically, KIM has supported the printing of a 3D model of the illegal cluster munition AO-2.5RT (made in the Soviet Union and deployed today in the Syrian war), used to train neural networks to recognize this weapon.
Human rights researchers often rely on videos shared online to document war crimes, atrocities, and human rights violations. The challenge is to deal with the hundreds of thousands of video images uploaded daily to networks such as Twitter and YouTube. The work is time-consuming, and the materials often show graphic content that can cause secondary trauma.
The VFRAME software automatically analyzes videos and detects which weaponry appears in them. The idea is to repurpose Artificial Intelligence usually employed in the commercial and military sectors and adapt it to the needs of human rights researchers and investigative journalists, such as the Syrian Archive, an organization dedicated to documenting war crimes and human rights violations that works with video files uploaded to social media. VFRAME's visual vocabulary makes it possible to detect evidence of all classes of illegal weaponry catalogued by the Syrian Archive.
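The detection workflow described above can be sketched as follows. This is a minimal, hypothetical illustration, not VFRAME's actual pipeline: the function names (`detect_munitions`, `scan_video`), the frame-sampling strategy, and the confidence threshold are all assumptions introduced here for clarity.

```python
def sample_frame_indices(total_frames: int, fps: float,
                         every_seconds: float = 1.0) -> list[int]:
    """Pick roughly one frame per `every_seconds` of footage,
    so the detector need not run on every single frame."""
    step = max(1, int(round(fps * every_seconds)))
    return list(range(0, total_frames, step))


def detect_munitions(frame) -> list[dict]:
    """Placeholder for a neural-network detector trained on images of
    munitions such as the AO-2.5RT (hypothetical; a real system would
    run model inference here and return bounding boxes with scores)."""
    return []


def scan_video(frames: list, fps: float,
               threshold: float = 0.8) -> list[tuple[int, dict]]:
    """Return (frame_index, detection) pairs whose confidence score
    meets the threshold, for a human researcher to review."""
    hits = []
    for i in sample_frame_indices(len(frames), fps):
        for det in detect_munitions(frames[i]):
            if det["score"] >= threshold:
                hits.append((i, det))
    return hits
```

The key design point for human rights work is that the model only flags candidate frames; a researcher still verifies each hit, which reduces both the volume of footage to watch and exposure to graphic content.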
The VFRAME project demonstrates civil uses of AI for peace and justice — for once, not oriented toward technologies of destruction and blind automation. Since the Karlsruhe University of Arts and Design is located in a former ammunition factory, forging, in the same place, 3D models of ammunition not for killing but for saving human lives carries an important symbolic value.
VFRAME is currently funded by a grant from PrototypeFund.de and the Bundesministerium für Bildung und Forschung (Federal Ministry of Education and Research).