We're now 40% of the way done. Almost halfway! You can make it! You can do it! I believe in you! Even if you fell off the sleigh, as they say,...
I didn't really have fun today. My final solution in Ruby took way too long to get to and doesn't feel like an accomplishment. Rather, it's a Dynamic Programming trick you just have to know or see, not something you arrive at logically or find interesting.
The grunt work is O(n), the sort is O(n log n), so sorting the input becomes the "limiting" factor. YMMV.

Hey, thank you very much! Could you give me a hint on what this "trick" is called?
Certainly. It's dynamic programming!
In this case, the problem can be broken down and simplified. We're not looking for the list of combinations. We are looking for the total number of options to build a valid connection between start (0) and end (max + 3).
For each adapter the following is true:
The number of paths from the start to this adapter equals the sum of the numbers of paths to each of the adapters that can connect to it (those 1, 2, or 3 jolts below it).
That means that this problem can be reduced. You can walk the list from start to end or end to start and only answer the question: paths to / paths from this adapter.
Storing the intermediate outcome is a typical dynamic programming trick.
Stored at index 1 (the joltage of the first adapter) is the value 1: the number of ways to get to this index.
Stored at index 4 (joltage) is the number of paths to this index: ways to joltage 4 - 3 plus ways to joltage 4 - 2 plus ways to joltage 4 - 1.
Stored at index 5 (joltage) is the number of paths to this index: ways to joltage 5 - 3 plus ways to joltage 5 - 2 plus ways to joltage 5 - 1.
As you can see, this is where it becomes interesting. By storing the ways to get to a joltage in the array, when we need its value we don't have to calculate it, or worse, recalculate it.
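The walkthrough above can be sketched in Python. This is a minimal sketch, not the commenter's actual Ruby solution; `adapters` is assumed to be the puzzle's list of joltages, with the outlet (0) and device (max + 3) added inside:

```python
def count_arrangements(adapters):
    """Count the ways to chain adapters from the outlet (0 jolts)
    to the device (max + 3 jolts) using dynamic programming."""
    joltages = [0] + sorted(adapters)
    joltages.append(joltages[-1] + 3)
    # ways[j] = number of paths from the outlet to joltage j
    ways = {0: 1}
    for j in joltages[1:]:
        # paths to j = paths to j-3 + paths to j-2 + paths to j-1
        ways[j] = ways.get(j - 1, 0) + ways.get(j - 2, 0) + ways.get(j - 3, 0)
    return ways[joltages[-1]]


# The small example from the puzzle text has 8 distinct arrangements:
print(count_arrangements([16, 10, 15, 5, 1, 11, 7, 19, 6, 12, 4]))  # → 8
```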
This is what is key in Dynamic Programming. You store intermediate results so that you don't need to do the work again. A classic example is solving Fibonacci.
You may want to solve it recursively, but the problem with doing that without a memoisation step is that you'll be calculating the same values over and over. Instead, because each fib(n) equals fib(n-1) + fib(n-2), storing each input/output pair reduces the recursive complexity from O(2^n) to O(n). Each term is only computed once, instead of many, many times.
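As a minimal illustration of that memoisation step, here is the classic Fibonacci example in Python, using the built-in `functools.lru_cache` instead of a hand-rolled lookup table:

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def fib(n):
    # Each fib(n) is computed once; every later call hits the cache,
    # turning the O(2^n) recursion tree into O(n) work.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


print(fib(10))  # → 55
print(fib(50))  # instantaneous with the cache; naive recursion would take ages
```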
Oh wow, thank you very much 🚀 Really didn't expect such a lengthy, comprehensible answer. So, thank you very much again! Just thought it's a special technique one could look up and practice but I guess I just have to look into dynamic programming again 👍🏻
Yeah. Once you do a series of solutions using Dynamic Programming, you'll get a feel for it!
Thanks for the clear explanation Derk-Jan Karrenbeld!
Here is my implementation in Python:
Another day where I solved Part 1 and had to go back and rewrite it when Part 2 came around!
My insight was that you could count the lengths of the runs of 1-joltage deltas, and use that to derive the number of possible combinations. Unfortunately, I wasn't able to figure out the general formula for sequenceLength->permutations, and had to hardcode the table 😣
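For what it's worth, that sequenceLength->permutations table doesn't have to be hardcoded: for runs of consecutive 1-jolt deltas (assuming the input never contains a 2-jolt delta), the counts follow the tribonacci recurrence. A sketch with a hypothetical `run_combinations` helper:

```python
def run_combinations(n):
    """Ways to traverse a run of n consecutive 1-jolt deltas.
    Each count is the sum of the previous three (tribonacci),
    because from any adapter you can skip ahead 1, 2, or 3 jolts."""
    ways = [1, 1, 2]  # runs of length 0, 1, 2
    while len(ways) <= n:
        ways.append(ways[-1] + ways[-2] + ways[-3])
    return ways[n]


# Reproduces the usual hardcoded table:
print([run_combinations(n) for n in range(1, 5)])  # → [1, 2, 4, 7]
```

The answer to Part 2 is then the product of `run_combinations(len)` over all maximal runs of 1-deltas.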
Part 1
Part 2
Part 1 minified JavaScript solution:
Here it is in normal beauty:
Posting Part 2 soon!
Bit of a mad mix of imperative and functional programming for me today. I hate calculating permutations.
My Elixir solution:
Did a bit of tricky business after looking over the test cases and my input. Linear time with no recursion.
Because the numbers in my input were either 3 or 1 apart, never 2, and there were never more than 4 1's in a row before a 3, I pre-calculated the # of possibilities each of the possible patterns generated and then multiplied together to find all possible combinations.
Then I rolled through the list, multiplying together the values for each "cell" of ones. The only thing I had an issue with is that I didn't think about the fact that the result was going to be a `long long int`, so I chased my tail printing out regular ints for a while even though my algorithm was right. Also, TIL that C has a built-in sorting function called `qsort`!

I also use Ruby (if I have time I try to replicate it in Python for learning) and found Part 1 OK.
But Part 2 stumped me. I got the general gist of the problem, but couldn't get started on a solution.
Here's the solution in Python for both Part 1 and Part 2:
C# solution part 1.
Execution time: 0ms, 235 ticks, the quickest so far :)
Took me a while to figure out. I was initially going down some path-finding algorithm route, but after looking at it for a while it started to look like dynamic programming/combinatorics.
Took me a while to get the actual equation down, but I finally did it.
`input` is with `0` and `max + 3` already added.

Part 2 JavaScript solution 🎨:
COBOL, only Part 1 again; gotta catch up during the weekend.
Sorry it's so late, but I'm pretty proud of figuring this out :D Obviously I'm continuing my javascript odyssey. It's pretty long, so I'll just link to the gist.
Too late, but for the sake of explanation: the DFS algorithms and the Dynamic Programming techniques are useful to know, but the problem of counting (not enumerating) the different paths in a directed graph can be solved simply in O(n^3) time with these loops (note that k, the intermediate vertex, must be the outermost loop, Floyd-Warshall style):

for (k = 0 to N)
  for (i = 0 to N)
    for (j = 0 to N)
      path[i][j] += path[i][k] * path[k][j]
where path[][] is the adjacency matrix mapping any vertex to every other: "0" if there is no edge between them and "1" if they are connected. The result comes back in a few seconds. Thanks to this post:
stackoverflow.com/questions/164213...
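For reference, here is a runnable Python version of those loops. The graph is assumed to be a DAG, as in this puzzle, so path counts are finite; after processing intermediate vertex k, `path[i][j]` counts all paths whose intermediate vertices are drawn from {0..k}:

```python
def count_paths(adj, src, dst):
    """Count distinct paths in a DAG, given its adjacency matrix
    (adj[i][j] == 1 iff there is an edge i -> j)."""
    n = len(adj)
    path = [row[:] for row in adj]  # start from direct edges
    for k in range(n):              # allow k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                # new paths i -> j that pass through k
                path[i][j] += path[i][k] * path[k][j]
    return path[src][dst]


# Diamond graph: 0 -> 1 -> 3 and 0 -> 2 -> 3, so two paths from 0 to 3.
adj = [
    [0, 1, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
]
print(count_paths(adj, 0, 3))  # → 2
```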
My JavaScript walkthrough (today was tough but recursion with memoization did the trick!):
Part 2 stumped me, and this really helped!
Thanks for the memoization tip! Without caching intermediate results my algorithm was simply stuck. Once I started caching, the computation was almost instantaneous!