(en) Matthijs Douze, Giorgos Tolias, Ed Pizzi, Zoë Papakipos, Lowik Chanussot, Filip Radenovic, Tomas Jenicek, Maxim Maximov, Laura Leal-Taixé, Ismail Elezi, Ondřej Chum and Cristian Canton Ferrer, "The 2021 Image Similarity Dataset and Challenge" (arXiv:2106.09672): "Image fingerprints such as Microsoft's PhotoDNA are used throughout industry to identify images that depict child exploitation and abuse."
Reinhard Eher, Leam A. Craig, Michael H. Miner and Friedemann Pfäfflin, International Perspectives on the Assessment and Treatment of Sexual Offenders: Theory, Practice and Research, John Wiley & Sons (ISBN 978-1119996200, read online), p. 514
Marcia Lattanzi-Licht and Kenneth Doka, Living with Grief: Coping with Public Tragedy, Routledge (ISBN 1135941513, read online), p. 317
(en-US) Kashmir Hill, "A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.", The New York Times (ISSN 0362-4331, read online, accessed )
Emma Roth, "Google AI flagged parents' accounts for potential abuse over nude photos of their sick kids", The Verge (accessed ): "Google has used hash matching with Microsoft's PhotoDNA for scanning uploaded images to detect matches with known CSAM. [...] In 2018, Google announced the launch of its Content Safety API AI toolkit that can 'proactively identify never-before-seen CSAM imagery so it can be reviewed and, if confirmed as CSAM, removed and reported as quickly as possible.' It uses the tool for its own services and, along with a video-targeting CSAI Match hash matching solution developed by YouTube engineers, offers it for use by others as well."
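The Roth quote above describes the core mechanism: each uploaded image is reduced to a fingerprint and compared against a database of hashes of known material. The Python sketch below is a rough illustration of that matching step only. PhotoDNA's actual algorithm is proprietary, so a simple average hash (aHash) stands in for it here, and KNOWN_HASHES, matches_known, and the sample hash value are all hypothetical, not any vendor's API.

# Minimal sketch of hash-based image matching, assuming Pillow is installed.
# aHash is a stand-in: PhotoDNA's real fingerprint function is proprietary.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale grid and build a 64-bit hash:
    each bit is 1 if the pixel is brighter than the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known material (illustrative value).
KNOWN_HASHES = {0x8F3C42A19B7011ED}

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag an upload if its hash lies within a small Hamming distance
    of any entry in the known-content hash database."""
    h = average_hash(path)
    return any(hamming(h, k) <= max_distance for k in KNOWN_HASHES)

Perceptual hashes are compared by Hamming distance rather than exact equality so that re-encoded, resized, or slightly altered copies of a known image still match; this tolerance is also what distinguishes this approach from cryptographic hashing.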