A second of music consists of 44100 individual samples. Each sample consists of two amplitude values, one for each channel, and each of those values is represented with 16 bits. Therefore, one second of music requires 44100x2x16 = 1411200 bits, and four minutes of music (240 seconds) requires 240x1411200 = 338688000 bits. This equals 42336000 bytes (as 1 byte = 8 bits), which is approximately 40.375 MB (as 1 MB = 2^20 bytes).
Based on the above, a 700 MB CD holds 700/40.375 ≈ 17.3 such songs.
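This calculation can be checked with a short script. A minimal Python sketch, assuming a 700 MB CD (the capacity consistent with the 17.3 figure above):

```python
# Sketch of the CD audio calculation above.
# Assumption: a 700 MB CD, which matches the ~17.3 songs figure.

SAMPLE_RATE = 44_100       # samples per second
CHANNELS = 2               # stereo: two amplitude values per sample
BITS_PER_SAMPLE = 16
SONG_SECONDS = 240         # four minutes

bits_per_second = SAMPLE_RATE * CHANNELS * BITS_PER_SAMPLE   # 1,411,200
song_bits = bits_per_second * SONG_SECONDS                   # 338,688,000
song_bytes = song_bits // 8                                  # 42,336,000
song_mb = song_bytes / 2**20                                 # ~40.375 MB

CD_CAPACITY_MB = 700                                         # assumed capacity
print(f"One song: {song_mb:.3f} MB")                         # 40.375 MB
print(f"Songs per CD: {CD_CAPACITY_MB / song_mb:.1f}")       # ~17.3
```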
Since there are 11 distinct characters and 2^3 = 8 < 11 ≤ 16 = 2^4, at least 4 bits per character are needed for a fixed-length code, and thus 4x39 = 156 bits overall.
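As a quick check of the fixed-length bound, a Python sketch:

```python
import math

# 2^3 = 8 < 11 <= 16 = 2^4, so 4 bits per character are required.
bits_per_char = math.ceil(math.log2(11))   # 4
print(bits_per_char * 39)                  # 156 bits for the whole message
```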
When merging two groups while building the tree, there is no specified rule for which of the two becomes the left branch, so there is no unique correct code. Here is one possible solution. There is, however, only one possible variation in the code lengths: D could be one bit shorter while S and T are each one bit longer than in the solution shown.
| Letter | Code |
|---|---|
| P | 10 |
| E | 11 |
| blank | 010 |
| C | 0001 |
| I | 0010 |
| K | 0011 |
| R | 0110 |
| D | 00000 |
| L | 00001 |
| S | 01110 |
| T | 01111 |
Figure: the Huffman tree used to generate this code.
The encoded message therefore requires 9x2 + 8x2 + 5x3 + 3x4 + 3x4 + 3x4 + 3x4 + 2x5 + 1x5 + 1x5 + 1x5 = 122 bits.
The compression ratio relative to the fixed-length code is 122/156 ≈ 0.78.
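The 122-bit total and the 0.78 ratio can be reproduced with a small Huffman sketch. The frequencies below are inferred from the frequency-times-length terms in the sum above; they are an assumption, not given explicitly here. Because of the tie-breaking freedom noted earlier, the generated codewords may differ from the table, but the total length is still 122 bits:

```python
import heapq
import itertools

# Assumed frequencies, inferred from the per-letter terms in the 122-bit sum
# (frequency x code length); not stated explicitly in this section.
freqs = {'P': 9, 'E': 8, ' ': 5, 'C': 3, 'I': 3, 'K': 3, 'R': 3,
         'D': 2, 'L': 1, 'S': 1, 'T': 1}

# Each heap entry is (total frequency, tie-breaker, {char: partial code}).
tie = itertools.count()
heap = [(f, next(tie), {ch: ''}) for ch, f in freqs.items()]
heapq.heapify(heap)

# Repeatedly merge the two lowest-frequency groups, prepending one bit
# to every code in each merged group.
while len(heap) > 1:
    f1, _, left = heapq.heappop(heap)
    f2, _, right = heapq.heappop(heap)
    merged = {ch: '0' + c for ch, c in left.items()}
    merged.update({ch: '1' + c for ch, c in right.items()})
    heapq.heappush(heap, (f1 + f2, next(tie), merged))

codes = heap[0][2]
huffman_bits = sum(freqs[ch] * len(c) for ch, c in codes.items())
fixed_bits = 4 * sum(freqs.values())        # 4 bits/char x 39 chars = 156

print(huffman_bits)                         # 122
print(huffman_bits / fixed_bits)            # ~0.78
```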
If the resolution of a picture is 1024x768 pixels, then there are 786432 pixels. With true color, each pixel requires 24 bits, so a single picture contains a total of 18874368 bits of information. Since 1 byte = 8 bits, 1 KB = 1024 bytes, and 1 MB = 1024 KB, a single picture takes 2359296 bytes = 2304 KB = 2.25 MB.
This means we can fit approximately 512/2.25 ≈ 227.56 pictures on a 512 MB flash drive.
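As a check, the same calculation in Python, using the binary units (1 KB = 1024 bytes, 1 MB = 1024 KB) from the text:

```python
# Sketch of the picture-size calculation above.
WIDTH, HEIGHT = 1024, 768
BITS_PER_PIXEL = 24                         # true color

pixels = WIDTH * HEIGHT                     # 786,432
picture_bits = pixels * BITS_PER_PIXEL      # 18,874,368
picture_bytes = picture_bits // 8           # 2,359,296
picture_kb = picture_bytes / 1024           # 2,304 KB
picture_mb = picture_kb / 1024              # 2.25 MB

FLASH_MB = 512
print(f"One picture: {picture_mb} MB")                        # 2.25 MB
print(f"Pictures per drive: {FLASH_MB / picture_mb:.2f}")     # ~227.56
```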