How Good Is Your Science KQ?
[i.e. your Knowledge Quotient]
Modern Measurements Are 'a bit' More Precise.
How far is a mile? Well, the 'mile' in Roman times was 1,000
double paces, a distance that we would now describe as about 5,000 feet.
However, in the modern world, a mile is longer than that; it is, of course,
5,280 feet or 1,760 yards or 1.609344 kilometers. The lengths of a yard and a
meter, like many current units, have been standardized so that they are reliable
and accurate measures of distance.
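As a quick sanity check, these definitions can be verified with a few lines of Python (the constant names are my own; the yard is defined as exactly 0.9144 meters):

```python
# Exact definitions of the international mile and yard.
FEET_PER_MILE = 5280
YARDS_PER_MILE = 1760
METERS_PER_YARD = 0.9144   # the yard is defined as exactly 0.9144 m

# 1760 yards x 0.9144 m/yard = 1609.344 m, i.e. 1.609344 km.
km_per_mile = YARDS_PER_MILE * METERS_PER_YARD / 1000
print(round(km_per_mile, 6))       # 1.609344
print(YARDS_PER_MILE * 3)          # 5280 feet, three feet per yard
```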
Despite a gradual introduction of better defined, standardized
systems of measurement, there remained in use throughout the world a confusing
variety of units. And so, about fifty years ago, an international congress
devised the metric-based SI system of units of measurement (Système
International d'Unités). Most countries in the world have accepted the new
system and are using it, particularly in science. The unfortunate failure of
the US to adhere to this standard led to the 1999 loss of the Mars Climate
Orbiter, which resulted from ground software supplying thruster data in
pound-force units instead of newtons.
The modern digital computer era has introduced some new units
of measurement with which we are just becoming familiar. Digital computers
employ binary arithmetic (ones and zeros) and electronic units with names like
'bit', 'byte', 'kilobyte', and 'megabyte'. The millions of people using
computers today see these new words and often have no idea what they represent.
However, they accept with gratitude the 'gigabytes' of hard disk storage space
on their newer machines.
The term 'bit' represents one unit (zero or one) of binary
information. A 'byte' on modern computers is a
unit of 8 bits, which is the current standard used to represent one character.
Units of 8 bits per byte have not always been the norm. Some early machines,
following Teletype practice, used fewer bits per character, just enough to
represent two octal digits or any one of 64 alphanumeric characters.
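The 8-bits-per-character convention is easy to see in Python, which can print the byte value of a character as its binary digits (a minimal illustration, not tied to any particular machine):

```python
# One ASCII character occupies one 8-bit byte.
ch = 'A'
byte_value = ord(ch)                  # the character's numeric code, 65
bits = format(byte_value, '08b')      # the same value as 8 binary digits
print(bits)                           # 01000001
```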
Stumper questions:
a) How many bits are there in a gigabyte?
b) What is the name given to a half byte? (If you don't know,
try guessing!)
c) How few bits are needed to represent two octal digits?
'Bit' is a contraction of 'binary digit', coined in 1946 by John Tukey.
'Byte' was coined by Werner Buchholz in 1956.
