The concept of percentages is deeply embedded in our daily lives, but it didn't start with the modern "%" symbol. Its origins trace back thousands of years to Ancient Rome.
Ancient Roman Origins
Long before decimal notation was in common use, Roman computations were often made in fractions that were multiples of 1/100. For example, Emperor Augustus levied a tax of 1/100 on goods sold at auction, known as centesima rerum venalium. Calculation with these fractions was equivalent to computing percentages.
As money denominations grew in the Middle Ages, computations with a denominator of 100 became increasingly standard. By the late 15th century, it was common for arithmetic texts to include such computations to calculate profit and loss, interest rates, and the Rule of Three.
Evolution of the Symbol
The term "percent" is derived from the Latin per centum, meaning "by the hundred". The symbol for percent (%) evolved from a symbol abbreviating the Italian per cento.
- Around 1425, a symbol resembling "pc" with a small loop was in use.
- By about 1650, this had morphed into a horizontal line between two circles, which looked confusingly like the division symbol (÷).
- The modern "%" symbol with the diagonal slash became the standard in the 20th century.
Modern Usage
Today, percentages are used in almost every field:
- Finance: Interest rates, discounts, tax brackets.
- Science: Concentration of solutions, relative humidity, error margins.
- Technology: Battery life, download progress, uptime guarantees.
Understanding this history helps us appreciate that percentages are simply a standardized way of comparing fractions—a tool that has served commerce and communication for over two millennia.
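The closing idea—that a percentage is just a fraction normalized to a denominator of 100—can be sketched in a few lines of Python. The function name `to_percent` is illustrative, not from any standard library:

```python
# A percentage is a fraction rescaled so its denominator is 100.
# Augustus's 1/100 auction tax is the simplest case: exactly 1%.
from fractions import Fraction

def to_percent(numerator: int, denominator: int) -> float:
    """Convert a fraction to its percentage representation."""
    return float(Fraction(numerator, denominator) * 100)

print(to_percent(1, 100))  # the centesima tax -> 1.0 (i.e. 1%)
print(to_percent(3, 4))    # 3/4 -> 75.0 (i.e. 75%)
```

Using exact `Fraction` arithmetic before the final conversion avoids floating-point rounding in the intermediate step, mirroring how Roman reckoners worked directly with the fractions themselves.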