Problem statement

Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1’s in the binary representation of i.
Link to problem

Example 1

Input: n = 2
Output: [0,1,1]
Explanation:
0 --> 0
1 --> 1
2 --> 10

Example 2

Input: n = 5
Output: [0,1,1,2,1,2]
Explanation:
0 --> 0
1 --> 1
2 --> 10
3 --> 11
4 --> 100
5 --> 101

Solutions

Approach 1: Normal $O(n \log n)$ Approach

We could reuse what we did in Blind 11: count the 1-bits of each number from 0 to n independently, clearing the lowest set bit with num & (num - 1) until the number reaches zero.

Implementation in Rust

fn count_ones(n: i32) -> i32 {
    let mut count = 0;
    let mut num = n;

    // num & (num - 1) clears the lowest set bit, so the loop
    // runs once per 1-bit in the binary representation.
    while num > 0 {
        num &= num - 1;
        count += 1;
    }

    count
}

fn count_bits(n: i32) -> Vec<i32> {
    let mut results = Vec::with_capacity((n + 1) as usize);

    // Count the 1-bits of every i in 0..=n independently.
    for i in 0..=n {
        results.push(count_ones(i));
    }

    results
}

Time Complexity: $O(n \log n)$
Runtime on LeetCode: $9$ms
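
As a quick standalone check (a sketch, not part of the submission), the helper can be exercised on values from Example 2:

```rust
// Sketch: Brian Kernighan's trick for counting 1-bits.
// num & (num - 1) clears the lowest set bit each iteration.
fn count_ones(mut num: i32) -> i32 {
    let mut count = 0;
    while num > 0 {
        num &= num - 1;
        count += 1;
    }
    count
}

fn main() {
    // 5 is 101 in binary: two iterations clear bits 0 and 2.
    assert_eq!(count_ones(5), 2);
    // 4 is 100: a single iteration clears the only set bit.
    assert_eq!(count_ones(4), 1);
}
```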

Approach 2: Bitwise Odd/Even Pattern

There’s a pattern that can be observed when we write out the number of ones and consider whether each number is odd or even: count(i) = count(i >> 1) + (i & 1), since shifting right drops the lowest bit and i & 1 tells us whether that dropped bit was set.

Such as this:

count(0) = 0000 = 0
count(1) = 0001 = 1 -> count(1/2) + 1 = count(0) + 1
count(2) = 0010 = 1 -> count(2/2) + 0 = count(1) + 0
count(3) = 0011 = 2 -> count(3/2) + 1 = count(2) + 1
...

Implementation in Rust

fn count_bits(n: i32) -> Vec<i32> {
    let n = (n + 1) as usize;
    let mut results = vec![0; n];
    for i in 1..n {
        // i >> 1 drops the lowest bit; i & 1 adds it back if it was set.
        results[i] = results[i >> 1] + (i & 1) as i32;
    }
    results
}

Time Complexity: $O(n)$
Runtime on LeetCode: $5$ms
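
The recurrence can be verified against both examples with a small standalone sketch:

```rust
// Sketch: the O(n) dynamic-programming recurrence
// count(i) = count(i >> 1) + (i & 1), checked on the two examples.
fn count_bits(n: i32) -> Vec<i32> {
    let n = (n + 1) as usize;
    let mut results = vec![0; n];
    for i in 1..n {
        results[i] = results[i >> 1] + (i & 1) as i32;
    }
    results
}

fn main() {
    // Example 1: n = 2
    assert_eq!(count_bits(2), vec![0, 1, 1]);
    // Example 2: n = 5
    assert_eq!(count_bits(5), vec![0, 1, 1, 2, 1, 2]);
}
```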