The trick with this code is that the unary part of the code tells us how many bits
to expect in the binary part. We end up with a code that uses no more bits than the
unary code for any number, and for numbers larger than 2, it uses fewer bits. The
savings for large numbers are substantial. We can, for example, now encode 1023
in 19 bits, instead of the 1024 bits required by the unary code.
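To make the construction concrete, here is a minimal sketch of a γ encoder in Python. It assumes the unary code writes n ones followed by a terminating zero (so n takes n + 1 bits); the function name elias_gamma and the bit-string representation are illustrative choices, not part of the original description.

```python
def elias_gamma(k: int) -> str:
    """Encode a positive integer k with the Elias-γ code.

    A sketch only: it assumes the unary code for n is written as
    n ones followed by a terminating zero (n + 1 bits in total).
    """
    if k < 1:
        raise ValueError("Elias-γ encodes positive integers only")
    kd = k.bit_length() - 1                 # floor(log2 k)
    kr = k - (1 << kd)                      # k with its highest bit removed
    unary = "1" * kd + "0"                  # floor(log2 k) + 1 bits
    binary = format(kr, "b").zfill(kd) if kd > 0 else ""  # floor(log2 k) bits
    return unary + binary

# 1023 encodes in 19 bits, versus 1024 bits in unary.
code = elias_gamma(1023)
print(code, len(code))   # -> 1111111110111111111 19
```

Reading a γ code back is the mirror image: count ones until the terminating zero to learn how many binary bits follow, then read exactly that many bits.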
For any number k, the Elias-γ code requires ⌊log₂ k⌋ + 1 bits for k_d in unary
code and ⌊log₂ k⌋ bits for k_r in binary. Therefore, 2⌊log₂ k⌋ + 1 bits are
required in all.
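As a quick check of this count, the formula can be evaluated directly; the sketch below (with the hypothetical helper gamma_length) uses the fact that ⌊log₂ k⌋ is one less than the number of bits in k's binary representation.

```python
def gamma_length(k: int) -> int:
    """Bits needed by the Elias-γ code for k: 2*floor(log2 k) + 1."""
    floor_log2 = k.bit_length() - 1   # floor(log2 k) for k >= 1
    return 2 * floor_log2 + 1

# For example, gamma_length(1023) == 19, as noted above.
print([(k, gamma_length(k)) for k in (1, 2, 9, 1023)])
```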