If you’re sick of being fooled by AI-generated images and deepfakes on the web, you might be interested in what the Content Authenticity Initiative has been up to of late.
Stay on this page to learn all about the Content Authenticity Initiative, including who is involved and how the group is tackling fake news where image and video content is concerned.
What is the Content Authenticity Initiative?
The Content Authenticity Initiative (CAI) is a community of tech companies, NGOs, academics, journalists and activists formed in 2019 to address misinformation and content authenticity at scale.
First announced by Adobe, the CAI has grown to include more than 300 members. As of 2023, that list includes The Associated Press, Arm, BBC, Canon, Getty Images, Leica, Microsoft, Nikon, The New York Times, Nvidia, Qualcomm, Reuters, Shutterstock, Universal Music Group, The Wall Street Journal and The Washington Post.
The CAI’s goal is to create a secure, open-source solution that adds a layer of verifiable trust to photos and videos, making it possible to determine whether they’re real or not just by checking the Content Credentials metadata.
This solution requires cryptographic asset hashing to create signatures that prove the metadata of an image or video hasn’t been altered. Creators can also use Content Credentials metadata to preserve attribution for a piece of content or remain anonymous if they prefer.
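To make the general idea more concrete, here is a minimal Python sketch of the hash-then-sign principle, assuming the third-party cryptography package and an Ed25519 key pair. It is only an illustration of the tamper-evidence concept the CAI relies on, not the actual Content Credentials (C2PA) implementation, and the helper functions are hypothetical: the asset bytes are hashed, the hash is signed, and any later change to the bytes or to the recorded digest causes verification to fail.

```python
# Minimal sketch of hash-then-sign verification (illustrative only; not the
# real Content Credentials / C2PA implementation).
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def sign_asset(image_bytes: bytes,
               private_key: ed25519.Ed25519PrivateKey) -> tuple[bytes, bytes]:
    """Hash the asset bytes and sign the digest (hypothetical helper)."""
    digest = hashlib.sha256(image_bytes).digest()
    return digest, private_key.sign(digest)


def verify_asset(image_bytes: bytes, digest: bytes, signature: bytes,
                 public_key: ed25519.Ed25519PublicKey) -> bool:
    """Re-hash the asset and check both the digest and its signature."""
    if hashlib.sha256(image_bytes).digest() != digest:
        return False  # the asset no longer matches the recorded hash
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False  # the recorded digest itself was tampered with


if __name__ == "__main__":
    key = ed25519.Ed25519PrivateKey.generate()
    original = b"...image bytes..."
    digest, sig = sign_asset(original, key)

    print(verify_asset(original, digest, sig, key.public_key()))        # True
    print(verify_asset(b"edited bytes", digest, sig, key.public_key())) # False
```

In the real Content Credentials workflow the signed information travels with the file as a metadata manifest, but the tamper-evidence principle is the same.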
The idea is that any edits, including alterations made in Adobe Photoshop, will be recorded in the metadata and preserved when the image is shared by news outlets and social networks across the web. Anyone can then upload the content to the CAI’s Verify site to learn more about the image and how it has changed over time.
Adobe already uses Content Credentials to indicate that an image has been created using its Firefly generative AI models. Meanwhile, the Leica M11-P launched in October 2023 as the first camera with Content Credentials built in.
“We are focused on cross-industry participation, with an open, extensible approach for providing media transparency that allows for better evaluation of content provenance,” explains the CAI on its website.
“This group collaborates with a wide set of representatives from software, publishing, and social media companies, human rights organizations, photojournalism, and academic researchers to develop content attribution standards and tools.”
The CAI is actually open for anyone to join free of charge. By joining, you’ll be invited to build prototypes, attend quarterly events and get access to the CAI community forums.