
Training compute

Leïla Sayssa
24 July 2025·2 minutes read time

According to the European Commission Guidelines on GPAI providers, the training compute, that is, the "amount of compute used to train a model, is typically proportional to the number obtained by multiplying the number of its parameters with the number of its training examples".

In particular, if the training compute of an AI model is greater than 10^23 FLOP, the model is presumed to be a general-purpose AI model (GPAI), as it is considered capable of performing a wide range of distinct tasks.
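As a rough illustration of how such an estimate and the threshold check might be computed: the sketch below assumes the widely used approximation FLOP ≈ 6 × parameters × training tokens. Note that the Guidelines themselves only state that compute is "typically proportional" to parameters × training examples; the factor 6, the function names, and the example model sizes are illustrative assumptions, not part of the Guidelines.

```python
# Threshold above which a model is presumed to be a GPAI model,
# per the Commission Guidelines.
GPAI_THRESHOLD_FLOP = 1e23

def estimate_training_flop(num_parameters: float, num_training_tokens: float) -> float:
    """Rough training-compute estimate using the common heuristic
    FLOP ~ 6 * parameters * training tokens (assumption, not from the Guidelines)."""
    return 6 * num_parameters * num_training_tokens

def presumed_gpai(training_flop: float) -> bool:
    """True if the estimate exceeds the 10^23 FLOP presumption threshold."""
    return training_flop > GPAI_THRESHOLD_FLOP

# Hypothetical example: an 8-billion-parameter model trained on 15 trillion tokens.
flop = estimate_training_flop(8e9, 15e12)
print(f"Estimated training compute: {flop:.1e} FLOP")  # 7.2e+23 FLOP
print(f"Presumed GPAI: {presumed_gpai(flop)}")         # True
```

Under these assumptions, the hypothetical model lands well above the 10^23 FLOP threshold and would therefore be presumed to be a GPAI model.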

For more details on the training compute of GPAI models, refer to the annex of the above-mentioned Guidelines.
