Generative AI Goes 'MAD' When Trained on AI-Created Data Over Five Times
New research suggests that the much-desired "spinning wheel" of generative model training, in which a model is trained on its own outputs, is for now beyond our capabilities. Autoencoders, Gaussian mixture models, and large language models have all been shown to be affected.
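To make the idea of a self-consuming training loop concrete, here is a minimal sketch (not the paper's experimental setup) using one of the affected model families, a Gaussian mixture model: each generation is fit only on samples drawn from the previous generation's model, so every fit's estimation error becomes the next generation's "ground truth." The component count, sample sizes, and number of generations below are illustrative assumptions.

```python
# Minimal sketch of a self-consuming (autophagous) training loop with a GMM.
# Assumptions: 1-D toy data, 2 components, 5 generations, no fresh real data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Generation 0: "real" data from two well-separated clusters.
real_data = np.vstack([
    rng.normal(loc=-3.0, scale=1.0, size=(500, 1)),
    rng.normal(loc=+3.0, scale=1.0, size=(500, 1)),
])

data = real_data
for generation in range(1, 6):  # repeat the loop five times
    gmm = GaussianMixture(n_components=2, random_state=0).fit(data)
    # The next generation is trained only on this model's own outputs.
    data, _ = gmm.sample(1000)
    print(f"generation {generation}: sample std = {data.std():.3f}")
# With no real data re-entering the loop, the fitted distribution drifts
# from the original, since each generation inherits and compounds the
# previous fit's errors.
```

Tracking a statistic such as the sample spread across generations is one simple way to observe the drift; the research measures degradation in quality and diversity of outputs more formally.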