Can someone explain why Brian Kernighan's algorithm takes O(log N) to count the set bits (1s) in an integer? A simple implementation of this algorithm is below (in Java):
int count_set_bits(int n) {
    int count = 0;
    while (n != 0) {
        n &= (n - 1); // clears the rightmost set bit
        count++;
    }
    return count;
}
I understand how it works: it clears the rightmost set bit one at a time until the number becomes 0. What I don't see is how that gives O(log N).
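To make the complexity concrete, here is a compilable sketch of the snippet above with a hypothetical driver (the class name, test values, and the comparison against `Integer.bitCount` are my additions, not from the question). The key observation it illustrates: the loop runs once per set bit, and an integer N has at most floor(log2 N) + 1 bits in total, so the iteration count is bounded by O(log N).

```java
public class SetBits {
    // Kernighan's algorithm: each pass of the loop clears exactly one
    // set bit, so the number of iterations equals the popcount of n.
    static int count_set_bits(int n) {
        int count = 0;
        while (n != 0) {
            n &= (n - 1); // clears the rightmost set bit
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // 13 = 0b1101 has 3 set bits, so the loop runs 3 times,
        // even though writing 13 in binary takes 4 bits (floor(log2 13) + 1).
        System.out.println(count_set_bits(13));         // 3
        // Worst case for an 8-bit value: all 8 bits set.
        System.out.println(count_set_bits(0b11111111)); // 8
        // Sanity check against the JDK's built-in popcount.
        System.out.println(Integer.bitCount(13));       // 3
    }
}
```

In the worst case every bit of N is set, and N has about log2 N bits, hence the O(log N) bound; when only a few bits are set the loop terminates even sooner.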