The Gill Test

A measure of the arithmetic speed of an automatic computer. It is defined as the time in milliseconds required for a computer to perform:

A + B = C (Store C); C + D + E = F (Store F); G x H = K (Store K)

when all access is to the fastest internal storage used in the computer in question. For a one-address computer with only one constant-access storage device (such as magnetic cores), a gill is equal to ten times the average operation time, since the sequence above takes ten one-address instructions: three loads, three adds, one multiply, and three stores.
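The ten-instruction breakdown can be sketched as a toy one-address machine. This is an illustrative reconstruction, not a historical implementation: the mnemonics (LOAD, ADD, MUL, STORE), the memory layout, and the uniform per-instruction timing model are assumptions made for the sketch.

```python
def gill_test(mem, op_time_ms):
    """Run the Gill sequence on a toy one-address machine.

    mem is a dict mapping addresses A, B, D, E, G, H to values;
    op_time_ms is the assumed average operation time in milliseconds.
    Returns (mem, elapsed_ms). Mnemonics and timing are illustrative.
    """
    program = [
        ("LOAD", "A"), ("ADD", "B"), ("STORE", "C"),                # A + B = C
        ("LOAD", "C"), ("ADD", "D"), ("ADD", "E"), ("STORE", "F"),  # C + D + E = F
        ("LOAD", "G"), ("MUL", "H"), ("STORE", "K"),                # G x H = K
    ]
    acc = 0  # single accumulator of a one-address machine
    for op, addr in program:
        if op == "LOAD":
            acc = mem[addr]
        elif op == "ADD":
            acc += mem[addr]
        elif op == "MUL":
            acc *= mem[addr]
        elif op == "STORE":
            mem[addr] = acc
    # Ten instructions, each making one storage access:
    # one gill = 10 x the average operation time.
    return mem, len(program) * op_time_ms
```

For example, with A=1, B=2, D=3, E=4, G=5, H=6 the run stores C=3, F=10, K=30, and the elapsed time is ten times the assumed operation time.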

(Named after Stanley Gill of Cambridge University.)

In this age of Whetstones and Dhrystones and 500 MHz chips this measure in milliseconds is not very significant, but back in 1960 it was a valid speed test.