Octal (base-8) uses eight digits (0-7) to represent numbers, whereas decimal (base-10) uses ten digits (0-9). Each octal digit represents three binary bits, making octal a compact way to represent binary values. For example, octal 17 equals decimal 15 because (1 × 8^1) + (7 × 8^0) = 8 + 7 = 15. Converting decimal to octal involves repeatedly dividing by 8 and collecting the remainders: 64 ÷ 8 = 8 remainder 0, then 8 ÷ 8 = 1 remainder 0, then 1 ÷ 8 = 0 remainder 1; reading the remainders from last to first gives octal 100.
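As a minimal sketch, both directions of the conversion can be written in a few lines of Python. The function names here are illustrative, not from any particular library; Python's built-in `int(s, 8)` and `oct(n)` do the same work.

```python
def decimal_to_octal(n: int) -> str:
    """Convert a non-negative decimal integer to an octal string
    by repeatedly dividing by 8 and collecting remainders."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        n, remainder = divmod(n, 8)
        digits.append(str(remainder))
    # Remainders come out least-significant first, so reverse them.
    return "".join(reversed(digits))

def octal_to_decimal(octal_str: str) -> int:
    """Convert an octal string to a decimal integer by summing
    digit * 8^position, e.g. '17' -> 1*8 + 7 = 15."""
    value = 0
    for digit in octal_str:
        value = value * 8 + int(digit)
    return value

print(decimal_to_octal(64))    # '100'
print(octal_to_decimal("17"))  # 15
print(int("17", 8))            # 15  -- built-in equivalent
print(oct(64))                 # '0o100'
```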
Octal notation was historically important in computing because early computers used word sizes divisible by 3 (12-bit, 24-bit, 36-bit). For those machines, octal was shorter than raw binary and far easier to convert to and from binary than decimal. Modern systems primarily use hexadecimal for this purpose, but octal persists in specific contexts like Unix file permissions. Understanding octal helps interpret legacy code, configure Unix systems, and grasp the relationship between different number bases in computing.
Unix file permissions use octal notation to represent read (4), write (2), and execute (1) permissions for owner, group, and others. For example, chmod 755 sets permissions to rwxr-xr-x (owner: 7 = 4+2+1 = rwx, group: 5 = 4+1 = r-x, others: 5 = r-x). Each octal digit (0-7) maps to three permission bits (rwx), making octal the natural representation for three-bit permission groups. Converting between octal and decimal is essential for understanding and configuring Unix file permissions programmatically or manually.
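A short Python sketch of this mapping follows; `octal_to_symbolic` is a hypothetical helper for illustration, decoding each octal digit into its rwx bits by masking. On a real system the same mode value would be passed to `os.chmod`.

```python
def octal_to_symbolic(mode: int) -> str:
    """Render a 3-digit octal permission mode (e.g. 0o755) as rwxr-xr-x."""
    symbols = ""
    for shift in (6, 3, 0):            # owner, group, others
        bits = (mode >> shift) & 0b111  # extract one 3-bit group
        symbols += "r" if bits & 4 else "-"
        symbols += "w" if bits & 2 else "-"
        symbols += "x" if bits & 1 else "-"
    return symbols

print(octal_to_symbolic(0o755))  # rwxr-xr-x
print(octal_to_symbolic(0o644))  # rw-r--r--

# Applying the mode to a file (assumes 'example.txt' exists):
# import os
# os.chmod("example.txt", 0o755)
```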