[Math] How many bits of memory per character

computer science

Suppose I create an array of 10 random numbers in the range [0, 2^30]. How can I calculate the number of bits of memory it will consume?

Let's assume that each of the numbers has 10 digits. That totals 100 digits. Would it be 800 bits (8 bits per character)?

Best Answer

If they are randomly distributed, each one needs 30 bits, so storing them in binary takes 300 bits. If you convert them to decimal, each one needs 10 digits (maybe 11), and storing those digits as 8-bit ASCII characters takes 800 (or 880) bits. The big inefficiency is taking a decimal digit, of which there are only 10 possibilities, and using 8 bits, which can distinguish 256 values, to store it.
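
To make the comparison concrete, here is a minimal Python sketch (the worst-case sample values and variable names are my own choice) that counts the bits each scheme needs for ten numbers:

```python
# Ten worst-case values just below 2^30 (assumed for illustration).
nums = [2**30 - 1] * 10

# Binary storage: each value needs at most 30 bits.
binary_bits = sum(n.bit_length() for n in nums)

# Decimal text storage: one 8-bit ASCII byte per decimal digit.
ascii_bits = sum(8 * len(str(n)) for n in nums)

print(binary_bits)  # 300
print(ascii_bits)   # 800
```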
