Short int, Hexadecimal and Binary Bases in C

I'd like your help with analyzing the output of the following code:

```c
#include <stdio.h>

int f(unsigned short x){
    int count;
    for (count = 0; x != 0; x >>= 1){
        if (x & 1)
            count++;
    }
    return count;
}

int main(void){
    unsigned short x = 0x00ef;

    while (x){
        printf("%d", f(f(x)));
        x <<= 4;
    }

    printf("\n");
    return 0;
}
```

I treat x as 239, or 11101111 in binary.

Since 11101111 != 0, we enter the while loop and compute f(f(11101111)). The inner f counts 7 set bits: x goes 11101111 to 01110111 to 00111011 to 00011101 to 00001110 and so on, and (x & 1) != 0 in 7 of those iterations. Then 7 (00000111) is passed to the outer f, whose count is 3, so 3 is printed. The original number then becomes 11110000, and since f(11110000) is 4, and 4 is 00000100, the outer f's count is 1, so I expected 1 to be printed next. But the output is 3331.

Can someone please point out my mistakes?

You seem to be expecting the original number to be limited to 8 bits, which is rarely the size of an unsigned short; on most platforms it is 16 bits.

After the first shift, x holds 0x00ef << 4, i.e. 0x0ef0, which still has the same number of set bits (7), so f(f(x)) is again f(7) = 3. The same happens at 0xef00. Only at 0xf000 have the top three set bits been shifted out of the 16-bit value: f(0xf000) = 4 and f(4) = 1, which gives the final 1. The next shift makes x zero, the loop ends, and the output is 3331.