The editorial is up: http://apps.topcoder.com/wiki/display/tc/SRM+592 and I have some things to say.

### The delay

Lately I have been trying really, really hard to get editorials ready within 48 hours after their match. This time I had to break my successful streak, though :(. You see, when the match ended, I wrote my recap, then I caught a tweet from rng_58:

1000 は Slow Fourier Transform です ("The 1000 is a Slow Fourier Transform")

— rng_58 (@rng_58) September 27, 2013

Yep, even though it is in Japanese, it was pretty clear what was soon going to happen to me as the editorial writer. I have never gotten the chance to learn FFT properly. I have tried to figure the theory out, but it is a bit confusing — for perspective, it took me around a month to actually understand max flow.

Yesterday, I was already past the 48-hour deadline, and I really didn't want an editorial to take a week of my life once again; plus I really hate the delays. When an editorial is published too long after a match, most people are no longer interested in the match...

Lately I've been learning that it is sometimes better to admit limitations and delegate. It's not the end of the world if I don't write all the explanations myself, and knowing my limitations is good. If I had spent a week working on this editorial, I would have had to delay the next editorial too, out of sheer tiredness. So I asked **Rustyoldman** to write this explanation and split the payment. Yay!

### What is up with division 2 problem sets lately?

I am not sure what's going on, but it seems to me that division 2 problem sets are getting harder. This match's div2 easy requires a proof, and the div2 1000 was incredibly complicated, imho. It is an "easy" dynamic programming problem in the sense that you can easily see it reduces to dynamic programming, but the time and memory optimizations took me a serious while to get working correctly. This is the second div2 hard in a row that feels out of place and, to me, closer to a div1 medium.

### The nice evilness in div1 medium

I mentioned in the recap that I tried to come up with a way to split this problem into sub-problems during the match. Well, that was not the whole story. The *first thing* I did when trying to solve this problem during the match was to reduce it to generating a single permutation instead of a pair of permutations (this is the trick for solving the division 2 version quickly). Throughout the match I was trying to solve this modified version of the problem, and it seemed very hard to find a polynomial-time algorithm.

Then I read **ffao**'s explanation in the forums. It suddenly felt too easy. As long as you **don't** make the reduction to a single permutation, it is actually easier to come up with a polynomial-time algorithm. This is quite a special problem and something to keep in mind. It appears that changing the problem to generating a single permutation instead of two would simplify it, but that's not always true. There is a reason why reducing a problem A to a problem B is said to mean that A is at most as hard as B, and not vice versa: it is entirely possible that the original problem A is easier than problem B.

### The match

I have no complaints about division 1; I actually loved it. The easy problem was fairly easy, but not in any way uninteresting. Medium was cool, as I mentioned. The hard problem seems to have been enjoyed by the likes of Petr.

I would have done division 2 much differently, though. I think division 1 easy would have made a good division 2 medium; like I mentioned before, it is a good thing to share problems between the divisions. And next_permutation problems are not as interesting as the ad hoc div1 easy; plus, apparently, they put Python coders at a big disadvantage. Div2 easy may have been slightly harder than usual. But above all, I would have given div2 hard smaller constraints. Much smaller, actually.