FLOPS stands for Floating Point Operations Per Second and measures the number of floating point operations a computer can perform in one second. Floating point itself is defined by IBM as “the method used to code real numbers within the finite precision limit of a computer”.
Using floating point representation, extremely large (and extremely small) numbers can be stored in a computer’s registers.
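As a small illustration of this range, the following Python sketch queries the limits of the standard 64-bit IEEE 754 double used for Python floats (the exact constants come from the platform's `sys.float_info`):

```python
import sys

# Largest finite value representable in a 64-bit IEEE 754 double
print(sys.float_info.max)      # roughly 1.8e+308

# Smallest positive normalized value
print(sys.float_info.min)      # roughly 2.2e-308

# Values beyond the representable range overflow to infinity
print(sys.float_info.max * 2)  # inf
```

This illustrates the trade-off: the range is enormous, but precision is finite, which is exactly the “finite precision limit” the IBM definition refers to.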

In modern IT, particularly in the fields of supercomputing and GPU computing, FLOPS, usually expressed in teraFLOPS (1 TFLOPS = 10^12 FLOPS), is the reference unit for describing computing performance.
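To make the unit concrete, here is a minimal, hedged Python sketch that estimates achieved FLOPS by timing a tight multiply-add loop; because it runs in the interpreter, the result is orders of magnitude below the hardware's true peak, but the counting logic (operations divided by elapsed time) is the same one used for real benchmarks. The function name and loop size are illustrative choices, not a standard benchmark:

```python
import time

def estimate_flops(n_iters: int = 1_000_000) -> float:
    """Crude FLOPS estimate: time a loop of multiply-add operations.

    Each iteration performs one multiply and one add (2 FLOPs).
    Pure-Python overhead dominates, so this is far below peak hardware FLOPS.
    """
    x = 1.0000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n_iters):
        acc = acc * x + 1.0   # 1 multiply + 1 add = 2 floating point ops
    elapsed = time.perf_counter() - start
    return (2 * n_iters) / elapsed

flops = estimate_flops()
print(f"~{flops / 1e6:.1f} MFLOPS (interpreter-bound estimate)")
```

Dividing the same operation count by 10^12 instead of 10^6 would express the result in teraFLOPS, the scale at which modern GPUs and supercomputers are rated.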