diff --git a/dynamic_programming/README.md b/dynamic_programming/README.md
index 9c7eeb9..fe874d0 100644
--- a/dynamic_programming/README.md
+++ b/dynamic_programming/README.md
@@ -19,12 +19,15 @@
 * recursion is an approach to solving problems using a function that calls itself as a subroutine. every time the function calls itself, it reduces the problem into subproblems. the recursion calls continue until they reach a point where the subproblem can be solved without further recursion.
 * a recursive function should have the following properties so it does not result in an infinite loop:
   * one or more base cases (a terminating scenario that does not use recursion to produce an answer)
-  * a set of rules, also known as recurrence relation, that reduces all other cases towards the base case.
+  * a set of rules, also known as a recurrence relation, that reduces all other cases towards the base case.
   * there can be multiple places where the function may call itself.
 * any recursion problem can be solved iteratively and vice-versa.
+* the **master theorem** is a technique for the asymptotic analysis (time complexity) of recursion algorithms that follow the divide-and-conquer pattern.
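+  * it applies to recurrences of the form `T(N) = a*T(N/b) + f(N)`; for example, merge sort satisfies `T(N) = 2*T(N/2) + O(N)`, which the theorem resolves to `O(N log N)`.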
+---
+
 #### visualizing the stack
@@ -50,7 +53,7 @@ def reverse(s):
 * memoization is an optimization technique that avoids recursion's duplicate calculations.
 * it's primarily used to speed up code by storing the intermediate results in a cache so that they can be reused later.
 * for example, a hash table can be used as a cache and should be passed along each subroutine call.
-* classical examples are fibonacci and the "climbing stairs" problem:
+* classic examples are fibonacci and the "climbing stairs" problem:
 
 ```python
 cache = {1: 1, 0: 1}
 
 def climbing_stairs(n) -> int:
@@ -74,8 +77,8 @@ def climbing_stairs(n) -> int:
 * the time complexity of a recursion algorithm is the product of the number of recursion invocations and the time complexity of each calculation, `R*O(s)`.
 * you can also look at the "execution tree", a tree used to denote the execution flow of a recursive function. each node represents an invocation of the recursive function, so the total number of nodes in the tree corresponds to the number of recursion calls during the execution.
 * the execution tree of a recursive function forms an n-ary tree, with n as the number of times recursion appears in the recurrence relation (for example, fibonacci would form a binary tree).
-  * in a full binary tree with n levels, the total number of nodes is 2**n -1, so O(2**n) would be the time complexity of the function.
+  * in a full binary tree with `N` levels, the total number of nodes is `2**N - 1`, so `O(2**N)` would be the time complexity of the function.
+  * with memoization, fibonacci becomes `O(1)*N = O(N)` (see the sketch below).
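+* a minimal sketch of memoized fibonacci (this snippet is illustrative, not the repo's elided implementation):
+
+```python
+cache = {0: 0, 1: 1}
+
+def fibonacci(n: int) -> int:
+    # each distinct n is calculated once and then answered from the
+    # cache, so the cost drops from O(2**N) to O(1)*N = O(N)
+    if n not in cache:
+        cache[n] = fibonacci(n - 1) + fibonacci(n - 2)
+    return cache[n]
+```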
@@ -86,11 +89,13 @@ def climbing_stairs(n) -> int:
 * there are mainly two parts of the space consumption that one should consider in a recursive algorithm:
-  * recursion related space: refers to the memory cost that is incurred directly by the recursion, i.e. the stack to keep track of recursive function calls. in order to complete a recursive call, the system allocates some space in the stack to hold: the returning address of the function call, the parameters that are passed to the function call, the local variables within the funcion call. once the function call is done, the space is freed. for recursive algorithms, the function calls chain up successively until it reaches a base case, so the space used for each call is accumulated. overconsumption can cause stack overflow.
+  * recursion related space: refers to the memory cost that is incurred directly by the recursion, i.e. the stack that keeps track of recursive function calls. in order to complete a recursive call, the system allocates space in the stack to hold: the return address of the function call, the parameters passed to the function call, and the local variables within the function call. once the function call is done, the space is freed. for recursive algorithms, the function calls chain up until they reach a base case, so the space used for each call accumulates (see the sketch below). overconsumption can cause a stack overflow.
   * non-recursive related space: refers to the memory that is not directly related to recursion, which includes the space (normally in the heap) that is allocated for global variables.
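+* a minimal sketch of the difference (function names here are illustrative):
+
+```python
+def sum_recursive(ls):
+    # every pending call keeps a stack frame (return address, parameters,
+    # local variables) alive, so the recursion-related space is O(N)
+    if not ls:
+        return 0
+    return ls[0] + sum_recursive(ls[1:])
+
+def sum_iterative(ls):
+    # a single frame is reused, so the recursion-related space is O(1)
+    total = 0
+    for x in ls:
+        total += x
+    return total
+```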
+---
+
 #### tail recursion
@@ -125,4 +130,29 @@ def sum_tail_recursion(ls):
+---
+
+### backtracking
+
+* backtracking is a general algorithm for finding solutions to some computational problems: it incrementally builds candidates to the solution and abandons a candidate ("backtracks") as soon as it determines that the candidate cannot lead to a valid solution.
+* you can imagine the procedure as a tree traversal that prunes every subtree that cannot lead to a valid solution. a general template (the helpers are placeholders to be filled in per problem):
+
+```python
+def backtrack(candidate):
+    # a complete solution was found: report it and stop expanding
+    if find_solution(candidate):
+        output(candidate)
+        return
+
+    # try every possible next move from this partial candidate
+    for next_candidate in list_of_candidates:
+        if is_valid(next_candidate):
+            place(next_candidate)        # make the move
+            backtrack(next_candidate)    # explore further
+            remove(next_candidate)       # undo the move ("backtrack")
+```
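+* a concrete instance of the template (a sketch with names chosen for illustration), generating all permutations of a list:
+
+```python
+def permutations(nums):
+    results, candidate = [], []
+
+    def backtrack():
+        if len(candidate) == len(nums):   # find_solution
+            results.append(candidate[:])  # output a copy of the candidate
+            return
+        for n in nums:                    # list_of_candidates
+            if n not in candidate:        # is_valid
+                candidate.append(n)       # place
+                backtrack()
+                candidate.pop()           # remove
+
+    backtrack()
+    return results
+
+print(len(permutations([1, 2, 3])))  # 6
+```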