Over 30 AI models have been trained at 10^25 FLOP or greater
Summary
Over 30 AI models have now been trained using at least 10^25 floating point operations (FLOP), marking a substantial increase in the scale of AI training. This rapid growth in training compute reflects accelerating progress in AI capabilities, but it also raises concerns about the concentration of computational resources and the environmental impact of large-scale training runs.
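To give a sense of the arithmetic behind the 10^25 FLOP threshold, here is a minimal sketch using the widely cited approximation C ≈ 6 · N · D for dense transformer training, where N is the parameter count and D is the number of training tokens. The model size and token count below are hypothetical, chosen purely for illustration; real compute estimates depend on architecture and training details.

```python
# Sketch: checking a training run against the 10^25 FLOP threshold
# using the common approximation C ~= 6 * N * D
# (N = parameters, D = training tokens).

THRESHOLD_FLOP = 1e25

def estimate_training_flop(num_params: float, num_tokens: float) -> float:
    """Approximate total training compute via C ~= 6 * N * D."""
    return 6 * num_params * num_tokens

# Hypothetical example: a 500B-parameter model trained on 10T tokens.
flop = estimate_training_flop(num_params=5e11, num_tokens=1e13)
print(f"Estimated training compute: {flop:.2e} FLOP")             # 3.00e+25
print(f"Exceeds 10^25 FLOP threshold: {flop >= THRESHOLD_FLOP}")  # True
```

Under this rule of thumb, a hypothetical 500-billion-parameter model trained on 10 trillion tokens would use roughly 3 × 10^25 FLOP, comfortably above the threshold.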