This was quite a straightforward, fun problem for me.
My first thought was that if we precompute all the values of d(n) for 0 < n < u + 1, then M(n, k) can easily be calculated from M(n - 1, k), so we don't have to find the maximum of 100,000 values every time; instead we just compare the incoming and outgoing values. This is commonly known as the sliding window maximum problem.
I found a pretty neat trick using dictionaries.
Store each unique d(n) for 0 < n < 100,001 in a dictionary, which I call window, with d(n) as the key and its count in the window as the value.
Then max(window) only scans the few distinct keys instead of all 100,000 entries.
The incoming value is d(100,001), so increment window[d(100,001)].
Now if d(100,001) > max(window), then d(100,001) is the new maximum of the window.
The outgoing value is d(1), so decrement window[d(1)].
Now if d(1) was the current maximum, there are 2 cases:
If window[d(1)] > 0, that is, the value still occurs in the window, it remains the maximum.
If window[d(1)] = 0, that is, the value is no longer in the window, I need to scan the whole window again to find the maximum.
Continue this process to cheaply maintain the maximum of the current window.
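The bookkeeping above can be sketched as follows (a minimal version; `window_maxima` is a hypothetical helper name, not the post's actual code):

```python
from collections import defaultdict

def window_maxima(values, k):
    """Sliding-window maximum using a count dictionary.

    window maps each value currently in the window to how many times
    it occurs, so max(window) only scans the distinct values.
    """
    window = defaultdict(int)
    for v in values[:k]:
        window[v] += 1
    current_max = max(window)
    maxima = [current_max]
    for i in range(k, len(values)):
        incoming, outgoing = values[i], values[i - k]
        window[incoming] += 1
        window[outgoing] -= 1
        if window[outgoing] == 0:
            del window[outgoing]       # keep only values still in the window
        if incoming > current_max:
            current_max = incoming     # cheap case: a new maximum enters
        elif outgoing == current_max and current_max not in window:
            current_max = max(window)  # expensive case: the maximum left
        maxima.append(current_max)
    return maxima
```

The expensive rescan only happens when the maximum actually falls out of the window, which is rare in practice, so the amortized cost per step is small.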
Now the problem boils down to being able to quickly calculate d(n), 0 < n < u + 1.
I used a trick which precomputes a smallest (or greatest) prime factor sieve. See an implementation of a smallest prime factor sieve here: https://www.geeksforgeeks.org/prime-factorization-using-sieve-olog-n-multiple-queries/
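A sketch of that idea (assuming a standard smallest-prime-factor sieve, and the divisor-count formula d(p1^a1 · ... · pm^am) = (a1 + 1) · ... · (am + 1)):

```python
def smallest_prime_factor_sieve(limit):
    """spf[n] = smallest prime factor of n, for 2 <= n <= limit."""
    spf = list(range(limit + 1))
    for p in range(2, int(limit ** 0.5) + 1):
        if spf[p] == p:  # p is prime
            for multiple in range(p * p, limit + 1, p):
                if spf[multiple] == multiple:
                    spf[multiple] = p
    return spf

def divisor_count(n, spf):
    """d(n), read off n's prime factorisation via the sieve."""
    count = 1
    while n > 1:
        p, exponent = spf[n], 0
        while n % p == 0:   # strip out the full power of p
            n //= p
            exponent += 1
        count *= exponent + 1
    return count
```

With the sieve in hand, each d(n) costs only O(log n), since every division removes at least one prime factor.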
I was able to solve the problem in ~120 seconds. After some optimizations and some clever tricks from the thread I got it down to ~85 seconds, which I'm very happy with.
Enter 2 integers separated by a space (u, k).
The code will output S(u, k).
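Putting the pieces together, a minimal end-to-end sketch (assuming S(u, k) is the sum of M(n, k) over every complete window, as in the problem statement; it uses a simple O(u log u) divisor-count sieve for self-containment rather than the faster smallest-prime-factor trick):

```python
def solve(u, k):
    # d[n] = number of divisors of n, via a simple divisor-count sieve
    d = [0] * (u + 1)
    for i in range(1, u + 1):
        for j in range(i, u + 1, i):
            d[j] += 1
    # Sliding window maximum over d[1..u] using a count dictionary.
    window = {}
    for n in range(1, k + 1):
        window[d[n]] = window.get(d[n], 0) + 1
    current_max = max(window)
    total = current_max
    for n in range(k + 1, u + 1):
        incoming, outgoing = d[n], d[n - k]
        window[incoming] = window.get(incoming, 0) + 1
        window[outgoing] -= 1
        if window[outgoing] == 0:
            del window[outgoing]
        if incoming > current_max:
            current_max = incoming
        elif outgoing == current_max and current_max not in window:
            current_max = max(window)
        total += current_max
    return total

# To match the I/O described above:
#   u, k = map(int, input().split())
#   print(solve(u, k))
```

For the full-size inputs you would swap the divisor sieve for the smallest-prime-factor approach, but the window logic stays the same.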