Start with a positive number n and repeatedly apply these simple rules:

- If n = 1, stop.
- If n is even, divide n by 2.
- If n is odd, multiply n by 3 and add 1.

In 1937, Lothar Collatz asked whether this procedure always stops for every positive starting value of n. If Gerhard Opfer is correct, we can finally say that it indeed always stops.
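As a concrete illustration, here is a minimal sketch of the procedure in C; the iteration printing, the `unsigned long long` type, and the starting value 27 are my own choices, not part of the original statement:

```c
#include <stdio.h>

/* One run of the Collatz procedure: print each value of n until it reaches 1. */
static void collatz(unsigned long long n)
{
    while (n != 1) {            /* step 1: stop when n = 1 */
        printf("%llu ", n);
        if (n % 2 == 0)
            n = n / 2;          /* step 2: n is even, halve it */
        else
            n = 3 * n + 1;      /* step 3: n is odd, triple it and add 1 */
    }
    printf("1\n");
}

int main(void)
{
    collatz(27);                /* 27 is a small start with a long trajectory */
    return 0;
}
```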
I once looked at this myself for fun. Collatz sequences are interesting when you look at the bit representation of the numbers: Step 2 becomes n >> 1, a right shift by one bit. Step 3 becomes (n << 1) + n + 1: left-shift n by one bit, then add n and add 1.
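A small sketch of those two steps written with shifts (the function names are mine); note that the parentheses matter in C, since + binds tighter than <<:

```c
/* Shift-based versions of steps 2 and 3. Without the parentheses,
 * n << 1 + n + 1 would parse as n << (1 + n + 1). */
static unsigned long long step2(unsigned long long n) { return n >> 1; }
static unsigned long long step3(unsigned long long n) { return (n << 1) + n + 1; }
```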
For instance, let's take an arbitrary odd number and see what happens in step 3, the only interesting step in the procedure.
Step 3:

```
  110011      (n)
 1100110      (n << 1)
       1      (+ 1)
--------+
10011010      (3n + 1)
```
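To check the example, here is a quick sketch; the binary-printing helper is my own, just for illustration:

```c
#include <stdio.h>

/* Print the significant bits of n, most significant bit first. */
static void print_bits(unsigned long long n)
{
    int started = 0;
    for (int i = 63; i >= 0; i--) {
        int bit = (int)((n >> i) & 1);
        if (bit)
            started = 1;
        if (started)
            putchar('0' + bit);
    }
    if (!started)
        putchar('0');
    putchar('\n');
}

int main(void)
{
    unsigned long long n = 51;          /* 110011 in binary */
    print_bits(n);                      /* 110011   */
    print_bits(n << 1);                 /* 1100110  */
    print_bits((n << 1) + n + 1);       /* 10011010 = 154 = 3*51 + 1 */
    return 0;
}
```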
To me, the example screams that the Collatz conjecture is about bit cancellation, i.e., that there should be an inductive argument on sequences of bits. The question would be: what measure on bit sequences decreases with each Collatz iteration?
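I don't know what that measure would be. As a quick illustration of why the obvious candidate fails, here is a sketch (the helper name is mine) that prints the bit length along a trajectory; it clearly does not decrease monotonically, which is exactly what makes the question hard:

```c
#include <stdio.h>

/* Number of significant bits in n. */
static int bit_length(unsigned long long n)
{
    int len = 0;
    while (n != 0) {
        len++;
        n >>= 1;
    }
    return len;
}

int main(void)
{
    unsigned long long n = 27;
    while (n != 1) {
        printf("n = %llu, bits = %d\n", n, bit_length(n));
        n = (n % 2 == 0) ? n / 2 : 3 * n + 1;
    }
    return 0;
}
```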
(It's interesting to modify the definition a bit by dropping step 1, the stop-at-1 rule, and instead terminating as soon as the bit sequence consists of a single one followed by zeroes, i.e., as soon as n is a power of two, since from that point only right shifts remain.)
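Under that modified stopping rule, the termination test is just a power-of-two check; a minimal sketch in C (the function name is my own):

```c
/* A positive n written in binary is "a one followed by zeroes" exactly when
 * it is a power of two, i.e. when it has a single set bit. */
static int is_power_of_two(unsigned long long n)
{
    return n != 0 && (n & (n - 1)) == 0;
}
```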