Over 30 AI models have been trained at 10^25 FLOP or greater

Hacker News - AI
Jul 16, 2025 15:19
Jimmc414
Tags: hackernews, ai, discussion

Summary

Over 30 AI models have now been trained using at least 10^25 floating-point operations (FLOP), a marked increase in the scale of frontier AI training. This rapid growth in training compute reflects accelerating progress in AI capabilities, but it also raises concerns about the concentration of computational resources among a few labs and the environmental cost of large-scale training runs.
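To get a feel for what crossing the 10^25 FLOP threshold implies, a common rule of thumb (not stated in the article) estimates dense-transformer training compute as roughly 6 x parameters x training tokens. The model sizes and token counts below are illustrative assumptions, not figures from the Epoch AI dataset:

```python
def estimated_training_flop(params: float, tokens: float) -> float:
    """Rough training-compute estimate for a dense transformer.

    Uses the common ~6 * N * D heuristic (forward + backward pass),
    where N is parameter count and D is training tokens.
    """
    return 6.0 * params * tokens

THRESHOLD = 1e25  # the 10^25 FLOP cutoff tracked by Epoch AI

# Hypothetical example configurations (assumed, not from the article):
for name, params, tokens in [
    ("70B model, 15T tokens", 70e9, 15e12),
    ("400B model, 15T tokens", 400e9, 15e12),
]:
    flop = estimated_training_flop(params, tokens)
    print(f"{name}: ~{flop:.2e} FLOP, over 1e25: {flop > THRESHOLD}")
```

By this heuristic, a 70B-parameter model trained on 15 trillion tokens lands around 6.3e24 FLOP, just under the threshold, while a 400B-parameter model on the same data comfortably exceeds it.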

Article URL: https://epoch.ai/data-insights/models-over-1e25-flop
Comments URL: https://news.ycombinator.com/item?id=44583247
Points: 1 | Comments: 0