
Tomi97_origin t1_iy8v9zb wrote

Because 2^3 = 8, 2^4 = 16, 2^5 = 32,...

Computers work with only two values, 1 and 0, called bits.

You can imagine the chip's storage as an apartment building. Each door in the building must have a number (an address).

2^x tells us the maximum number of doors, where x is the number of bits needed to write a door's address.

If you want 20 addresses, you'll find that 2^4 (16) is not enough, so you need 2^5 (32), but at that point you might as well use all 32 addresses.
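A quick sketch of that math (hypothetical Python, just to illustrate the powers-of-two point; the function name is made up):

```python
import math

def address_bits_needed(doors):
    # Smallest x such that 2**x >= doors
    return math.ceil(math.log2(doors))

for doors in (16, 20, 32):
    bits = address_bits_needed(doors)
    print(f"{doors} doors -> {bits} address bits -> room for {2**bits} doors")
# 16 doors -> 4 address bits -> room for 16 doors
# 20 doors -> 5 address bits -> room for 32 doors
# 32 doors -> 5 address bits -> room for 32 doors
```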


pseudopad t1_iy8x7vq wrote

Not really a complete explanation. It's not impossible to make a 20 GB USB drive; you just need to combine a 16 GB chip with a 4 GB chip in the same USB drive.

However, this adds complexity to the circuit board. It's easier to buy 100k 32 GB chips and make 100k USB drives from that than it is to buy 50k 16 GB chips and 50k 4 GB chips and put two chips on the same board to get 20 GB. You'd also get better discounts from the chip manufacturers by buying 100k of a single chip than half as many of two different chips.

The chip cost is also just a portion of the total cost of the drive. You still need the same number of USB plugs, and with two chips the plastic casing needs to be bigger, which also costs more money. Then it needs to be packaged in some way to make it to stores, and because a two-chip unit is bigger overall, it'll weigh more and take up more space, which makes shipping more expensive.

So even if you're saving 25% of the chip cost by buying one 16 GB and one 4 GB chip, that's perhaps just 50 cents less on a unit that is gonna sell for, let's say, $20. Now the customer is faced with a choice of 32 GB for $20, or 20 GB for $19.50.
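A rough back-of-the-envelope sketch of that comparison (all prices below are assumptions, purely to show how small the chip saving is relative to the retail price):

```python
# Hypothetical unit economics, illustrative numbers only
chip_cost_32gb = 2.00            # assumed cost of one 32 GB chip
chip_cost_16gb_plus_4gb = 1.50   # assumed cost of a 16 GB + 4 GB pair (~25% less)
retail_price = 20.00             # assumed retail price of the finished drive

saving = chip_cost_32gb - chip_cost_16gb_plus_4gb
print(f"Chip saving per unit: ${saving:.2f}")        # $0.50
print(f"32 GB drive price:    ${retail_price:.2f}")  # $20.00
print(f"20 GB drive price:    ${retail_price - saving:.2f}")  # $19.50
```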
