By Carole Parkinson

In many aspects of human endeavor, the belief that successive generations inherently improve holds true. Take sports, for example, where each Olympics sees records shattered by athletes who are better selected, trained, and conditioned than their predecessors. This continuous improvement is driven by a powerful incentive for each generation to surpass the achievements of the previous one.
However, this upward trajectory doesn’t always apply. In the realm of artificial intelligence (AI), there’s a phenomenon known as model collapse. Without a continuing supply of diverse human-generated training data, AI systems degrade when trained largely on AI-generated content. The first generation of AI benefits from drawing on decades of human ingenuity, but as subsequent generations train on information their predecessors created, the pool of new ideas dwindles, resulting in a homogenised output.
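The feedback loop described above can be illustrated with a toy simulation (my own sketch, not from the article): each "generation" fits a simple Gaussian model to the previous generation's output and then generates its new training data by sampling from that fit. Because each fit is made from a finite sample, estimation error compounds, and the spread of the data typically shrinks over generations — a crude analogue of homogenised output.

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible

def next_generation(samples, n):
    """Fit a Gaussian to the previous generation's data,
    then sample n new points from that fitted model."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return [random.gauss(mu, sigma) for _ in range(n)]

n = 20
# Generation 0: the "human-generated" data.
data = [random.gauss(0.0, 1.0) for _ in range(n)]
stds = [statistics.stdev(data)]

# Each later generation trains only on the previous one's output.
for _ in range(300):
    data = next_generation(data, n)
    stds.append(statistics.stdev(data))

print(f"initial spread: {stds[0]:.3f}, final spread: {stds[-1]:.3f}")
```

In a typical run the final spread is a small fraction of the initial one: diversity decays even though no generation deliberately discards anything. Real model collapse in large language models is far more complex, but the mechanism — training on your own outputs amplifies estimation bias — is the same in spirit.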