Shumailov, Ilia; Shumaylov, Zakhar; Zhao, Yiren; Gal, Yarin; Papernot, Nicolas; Anderson, Ross (2023-05-31). "The Curse of Recursion: Training on Generated Data Makes Models Forget". arXiv:2305.17493 [cs.LG].
Alemohammad, Sina; Casco-Rodriguez, Josue; Luzi, Lorenzo; Humayun, Ahmed Imtiaz; Babaei, Hossein; LeJeune, Daniel; Siahkoohi, Ali; Baraniuk, Richard G. (2023-07-04). "Self-Consuming Generative Models Go MAD". arXiv:2307.01850 [cs.LG].
Gerstgrasser, Matthias; Schaeffer, Rylan; Dey, Apratim; Rafailov, Rafael; Sleight, Henry; Hughes, John; Korbak, Tomasz; Agrawal, Rajashree; Pai, Dhruv; Gromov, Andrey; Roberts, Daniel A.; Yang, Diyi; Donoho, David L.; Koyejo, Sanmi (2024-04-01). "Is Model Collapse Inevitable? Breaking the Curse of Recursion by Accumulating Real and Synthetic Data". arXiv:2404.01413 [cs.LG].
Borji, Ali (2024-10-16). "A Note on Shumailov et al. (2024): "AI Models Collapse When Trained on Recursively Generated Data"". arXiv:2410.12954 [cs.LG].
Dohmatob, Elvis; Feng, Yunzhen; Kempe, Julia (2024-02-12). "Model Collapse Demystified: The Case of Regression". arXiv:2402.07712 [cs.LG].
Dohmatob, Elvis; Feng, Yunzhen; Yang, Pu; Charton, François; Kempe, Julia (2024-02-10). "A Tale of Tails: Model Collapse as a Change of Scaling Laws". arXiv:2402.07043 [cs.LG].
Seddik, Mohamed El Amine; Chen, Suei-Wen; Hayou, Soufiane; Youssef, Pierre; Debbah, Merouane (2024-04-07). "How Bad is Training on Synthetic Data? A Statistical Analysis of Language Model Collapse". arXiv:2404.05090 [cs.LG].
Guo, Yanzhu; Shang, Guokan; Vazirgiannis, Michalis; Clavel, Chloé (2024-04-16). "The Curious Decline of Linguistic Diversity: Training Language Models on Synthetic Text". arXiv:2311.09807 [cs.CL].
Kirchenbauer, John; Geiping, Jonas; Wen, Yuxin; Katz, Jonathan; Miers, Ian; Goldstein, Tom (2023-07-03). "A Watermark for Large Language Models". Proceedings of the 40th International Conference on Machine Learning. PMLR: 17061–17084.