Hexadecimal notation is a system of representing numbers using a base (radix) of 16, whereas decimal notation uses a base of 10.
- Hexadecimal notation is often used in computing and digital systems because it is a convenient way to represent large numbers in a compact and easy-to-read format.
- The decimal system of notation uses ten digits, 0 to 9, and adds columns to the left to denote units of 10s, 100s, etc.
- A hexadecimal system of notation uses sixteen digits. The digits 0-9 represent values from zero to nine, and A-F (or a-f) represent values from ten to fifteen. Further columns are added to the left to represent units of 16, 256, etc.
- The rightmost digit represents units (16^0)
- The second-rightmost digit represents 16s (16^1)
- The third-rightmost digit represents 256s (16^2), and so on.
- A hexadecimal triplet is a six-digit, three-byte hexadecimal number used in programming and software applications (such as graphic design, web development, and photography) to represent colours. The three bytes represent the red, green, and blue components of a colour, in that order.
- Each byte represents a number in the range 00 to FF in hexadecimal notation, which is equivalent to 0 to 255 in decimal notation.
- A hash symbol (#) is written before the triplet to indicate hexadecimal notation.
- Specific colours can be represented in hexadecimal notation, such as red (#FF0000), yellow (#FFFF00), green (#00FF00), cyan (#00FFFF), blue (#0000FF), and magenta (#FF00FF).
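The place-value rules above can be sketched in Python. This is a minimal illustration (the function name is my own, not from the source); each digit is worth 16 times the digit to its right:

```python
# Convert a hexadecimal string to decimal by accumulating digit values,
# mirroring the place-value columns (16^0, 16^1, 16^2, ...) described above.
def hex_to_decimal(hex_string: str) -> int:
    digits = "0123456789ABCDEF"
    value = 0
    for ch in hex_string.upper():
        value = value * 16 + digits.index(ch)
    return value

print(hex_to_decimal("1F"))   # 1*16 + 15  = 31
print(hex_to_decimal("FF"))   # 15*16 + 15 = 255
print(hex_to_decimal("100"))  # 1*256      = 256
```

Python's built-in `int("FF", 16)` does the same conversion directly; the loop is spelled out only to show the place-value arithmetic.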
- Hexadecimal triplets can represent 256 x 256 x 256 = 16,777,216 different colours.
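A hexadecimal triplet can be split into its three byte values with a short Python sketch (the function name is my own):

```python
# Split a "#RRGGBB" hexadecimal triplet into its red, green, and blue
# components, each a byte in the range 00-FF (0-255 in decimal).
def parse_triplet(triplet: str) -> tuple[int, int, int]:
    s = triplet.lstrip("#")
    return tuple(int(s[i:i + 2], 16) for i in range(0, 6, 2))

print(parse_triplet("#FF0000"))  # red  -> (255, 0, 0)
print(parse_triplet("#00FFFF"))  # cyan -> (0, 255, 255)
```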
- The first sixteen hexadecimal values (decimal 0 to 15) are 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E and F.
- The next sixteen values (decimal 16 to 31) are 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 1A, 1B, 1C, 1D, 1E and 1F.
- The sequence then continues in the same way until the two digits reach FF (decimal 255).
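The full two-digit sequence can be generated in Python, which makes the 00-0F, 10-1F, ... FF pattern easy to see:

```python
# Print all 256 two-digit hexadecimal values, sixteen per row:
# the first row runs 00-0F, the second 10-1F, and the last F0-FF.
rows = []
for n in range(0, 256, 16):
    rows.append(" ".join(f"{value:02X}" for value in range(n, n + 16)))
    print(rows[-1])
```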