So, I decided on a new strategy: not opening the 250-point (Easy) problem, which is surely going to be interesting. Today's match seemed intentionally designed to make that strategy fail: the Medium and the Easy were both solved by plenty of coders, and I lost rating even after solving the Medium.

## Delay

I started with a delay. I was using the time between 11:00 AM and 11:10 AM to try to improve my solution for the Marathon Round 3 problem. Near 11:10, my most recent implementation wasn't working as intended, so I kept insisting even after 11:10 AM; I didn't want to start the SRM with that bug still on my mind. I think I started the match about seven minutes late.

## 950 points

I opened the hard problem, couldn't do much, gave up, and moved to the 500. I guess it's going to take me a while to solve it for the editorial.

## 500 points - The one with trees

Lately, TopCoder is taking ages to publish the results and problem statements.

In short, you have a tree of at most 50 vertices. You have to pick the minimum number of simple paths between its vertices. Some edges can be picked any number of times, some edges must be picked an even number of times, and some edges must be picked an odd number of times.

So, let us try to decide the number of times each edge will be used and then pick the minimum cost to do that...

The root of the tree has some children nodes, each connected with an edge. Let us decide to use its edge #0 in a total of **i** simple paths. Assume that this pick is valid (even or odd as needed). Interesting things happen afterwards:

- Assume that we then solved for child #0. We need to decide what to do with edges 1, 2, etc. of the root. It is exactly a new version of the same problem; the only difference is that we have used **i** edges in "the past": **i** edges are [available].
- Child #0 of the root now has a parent edge that is used **i** times. We can consider this scenario a new version of the problem; the only difference is that the root has **i** [available] edges.
- We call these edges "available" because, if we decide to use a later edge in a path, we can use the root to connect the two edges so they take part in the same simple path (and save a cost unit).

So the problem is split in two sub-versions of the same problem.

Let us solve the general version: f(**x**, **paths**, **e**) returns the minimum cost to solve the sub-tree rooted at **x**. We are only allowed to use edges indexed greater than or equal to **e**. We know how many times the "past" edges (including the parent edge) were used, and we have **paths** "available" edges.

Let us pick the number of times **i** we will use edge **e** in a simple path (it has to have a valid parity). This could increase the cost: if **i** <= **paths**, it doesn't, because we can just reuse the previous paths without increasing the cost; if **i** > **paths**, then we have created (**i** - **paths**) new paths. The number of available paths for the next edge also changes: it becomes abs(**i** - **paths**).

- We need to solve the sub-problem for the child connected by edge **e**: f(**child**, **i**, 0) (because its parent edge is used **i** times).
- We need to solve the sub-problem of the remaining edges of the root: f(**x**, **new value of paths**, **e** + 1).

The sum of the cost and the two new calls to f() is the result. We can implement this with memoization.

```cpp
#include <algorithm>
#include <cstdlib>
#include <cstring>
#include <string>
#include <vector>
using namespace std;

const int MAX_VERTICES = 50;
const int MAX_EDGES = 49;
const int MAX_STEPS = MAX_EDGES; // we can just pick each important edge once

struct TurnOnLamps
{
    int degree[MAX_VERTICES];
    int g[MAX_VERTICES][MAX_EDGES];
    bool can[MAX_VERTICES][MAX_EDGES][2];
    int mem[MAX_VERTICES][MAX_STEPS + 1][MAX_EDGES + 1];

    int rec(int x, int paths, int e)
    {
        int & res = mem[x][paths][e];
        if (res == -1) {
            if (e == degree[x]) {
                // Base case. All the sub-trees were decided, well done!
                // We can just decide not to use the available paths.
                res = 0;
            } else {
                res = MAX_STEPS * 2; // arbitrarily large value
                // How many paths use edge e?
                for (int i = 0; i <= MAX_STEPS; i++) {
                    // Use this edge in i paths:
                    if (can[x][e][i % 2]) { // valid parity for the number of paths
                        int npaths = abs(paths - i);
                        int cost = max(i - paths, 0);
                        int p = rec(g[x][e], i, 0);
                        int q = rec(x, npaths, e + 1);
                        res = std::min(res, cost + p + q);
                    }
                }
            }
        }
        return res;
    }

    int minimize(vector<int> roads, string initState, string isImportant)
    {
        memset(mem, -1, sizeof(mem));
        int n = roads.size() + 1;
        for (int i = 0; i < n; i++) {
            degree[i] = 0;
        }
        for (int i = 0; i < n - 1; i++) {
            // Edge between roads[i] and i+1:
            int u = roads[i], v = i + 1;
            int x = degree[u]++;
            g[u][x] = v;
            can[u][x][0] = can[u][x][1] = true;
            if (isImportant[i] == '1') {
                // An important edge's parity is forced.
                can[u][x][initState[i] - '0'] = false;
            }
        }
        return rec(0, 0, 0);
    }
};
```

During the match, it took me a while to think of a dynamic programming approach. Then I started to code it, and it was already a bit late; I think only 15 minutes were left before the end of the match.

When I finished coding and debugging that approach, it turned out that it passed the example tests but was too slow for larger test cases. I was unnecessarily nesting a second dynamic programming inside the main one. I noticed that I could merge the two ideas into a single dynamic programming, but there were very few minutes left, at most 8 if I remember correctly.

I rushed and did the best I could to implement the fix quickly. I was able to compile and submit at the last second; I think I literally did it at the last second.

## The easy problem

I opened this only after the match. It turns out it was just a simple shortest paths problem, and the edges are not even weighted! If you use Floyd-Warshall, this problem is ridiculously easy.

```cpp
int minTimes(vector<int> range, int startCity, int endCity)
{
    int n = range.size();
    int dist[n][n];
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            // Cities form a ring; city i reaches city j directly if the
            // circular distance between them is at most range[i].
            if (std::min(abs(j - i), std::min(j + n - i, i + n - j)) <= range[i]) {
                dist[i][j] = 1;
            } else {
                dist[i][j] = 1000000000;
            }
        }
    }
    // Floyd-Warshall:
    for (int k = 0; k < n; k++) {
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                dist[i][j] = std::min(dist[i][j], dist[i][k] + dist[k][j]);
            }
        }
    }
    return dist[startCity][endCity];
}
```

## Conclusion

I really think the easy problem was too easy this time. The other problems were fine. Although it seems that there was a greedy solution for the 500 points, I think it is still an interesting problem.

The strategy is doing fine. If I had used my old strategy, I would have switched to the easy problem before the last 10 minutes, bored myself trying to implement Floyd-Warshall as fast as possible, and then missed the chance to prove that I could solve the medium problem. It was a lot more exciting and fun to solve the medium this time.
