## dynamic programming
<br>
* dynamic programming is the process of taking a recursive algorithm and caching its overlapping subproblems (repeated calls with the same arguments).
* the runtime is given by the number of calls; with caching, each overlapping subproblem is computed only once.
* **top-down**: how can we divide the problem into sub-problems?
* top-down dynamic programming is called **memoization**.
* **bottom-up**: solve for a simple case first, then build up the answer for more elements (see the sketch below).
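* a minimal bottom-up sketch, using fibonacci as the example (names and structure are illustrative, not part of the notes above):
```python
def fib_bottom_up(n: int) -> int:
    # solve the simplest cases first, then build up to n iteratively
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```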
<br>
---
### recursion
<br>
* recursion is an approach to solving problems using a function that calls itself as a subroutine. every time the function calls itself, it reduces the problem into smaller subproblems. the recursive calls continue until they reach a point where the subproblem can be solved without further recursion.
* a recursive function should have the following properties so it does not result in an infinite loop:
* one or more base cases (a terminating scenario that does not use recursion to produce an answer)
* a set of rules, also known as the recurrence relation, that reduces all other cases towards the base case.
* there can be multiple places where the function may call itself.
* any recursive problem can be solved iteratively, and vice-versa.
<br>
#### visualizing the stack
<br>
* to visualize how the stack operates during recursion calls, check the example below where we reverse a string:
```python
def reverse(s):
    # base case: an empty string is already reversed
    if len(s) == 0:
        return s
    else:
        # reverse the rest of the string, then append the first character
        return reverse(s[1:]) + s[0]
```
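* for instance, `reverse("abc")` unwinds like this (a hand-written trace of the calls, not program output):
```python
reverse("abc")
# = reverse("bc") + "a"
# = (reverse("c") + "b") + "a"
# = ((reverse("") + "c") + "b") + "a"
# = (("" + "c") + "b") + "a"
# = "cba"
```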
<br>
---
### memoization
<br>
* memoization is an optimization technique that avoids recursion's duplicate calculations.
* it's primarily used to speed up code by storing intermediate results in a cache so that they can be reused later.
* for example, a hash table can be used as the cache, either passed along each subroutine call or kept in an enclosing scope (as in the example below).
* classical examples are fibonacci and the "climbing stairs" problem:
```python
# base cases: one way to climb 0 or 1 steps
cache = {1: 1, 0: 1}

def climbing_stairs(n) -> int:
    # compute and store the answer only if it is not cached yet
    if n not in cache:
        cache[n] = climbing_stairs(n - 1) + climbing_stairs(n - 2)
    return cache[n]
```
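* in python, the same caching can also be done with the standard library's `functools.lru_cache` decorator instead of a hand-rolled dictionary (a minimal sketch of the same problem; the name `climbing_stairs_lru` is just for illustration):
```python
from functools import lru_cache

@lru_cache(maxsize=None)   # cache the result for every distinct argument
def climbing_stairs_lru(n: int) -> int:
    if n < 2:
        return 1           # base cases: one way to climb 0 or 1 steps
    return climbing_stairs_lru(n - 1) + climbing_stairs_lru(n - 2)
```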
<br>
----
### time complexity
<br>
* the time complexity of a recursive algorithm is the product of the number of recursive invocations and the time complexity of a single calculation, `R * O(s)`.
* you can also look at the "execution tree", a tree used to denote the execution flow of a recursive function. each node represents an invocation of the recursive function, so the total number of nodes in the tree corresponds to the number of recursive calls during the execution.
* the execution tree of a recursive function forms an n-ary tree, with n being the number of times recursion appears in the recurrence relation (for example, fibonacci's is a binary tree).
* in a full binary tree with n levels, the total number of nodes is `2**n - 1`, so `O(2**n)` is the time complexity of the function.
* with memoization, fibonacci becomes `O(1) * n = O(n)`, since each subproblem is computed only once (the counting sketch below makes this concrete).
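* a quick way to see the difference is to count invocations with and without a cache (a small sketch; the counters and names are only for illustration):
```python
calls_naive = 0

def fib_naive(n):
    global calls_naive
    calls_naive += 1
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

calls_memo = 0
fib_cache = {}

def fib_memo(n):
    global calls_memo
    calls_memo += 1
    if n not in fib_cache:
        fib_cache[n] = n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)
    return fib_cache[n]

fib_naive(20)
fib_memo(20)
print(calls_naive)   # 21891 calls: exponential, one node per call in the execution tree
print(calls_memo)    # 39 calls: linear, each subproblem is computed only once
```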
<br>
----
### space complexity
<br>
* there are mainly two parts of the space consumption that one should see in a recursive algorithm:
* recursion-related space: the memory cost incurred directly by the recursion, i.e. the stack used to keep track of the recursive function calls. to complete a recursive call, the system allocates space on the stack to hold: the return address of the function call, the parameters passed to it, and the local variables within the function call. once the call is done, that space is freed. for recursive algorithms, the calls chain up until a base case is reached, so the space used by the pending calls accumulates; overconsumption can cause a stack overflow (see the sketch after this list).
* non-recursion-related space: the memory that is not directly related to recursion, which includes the space (normally on the heap) allocated for global variables.
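* python makes the recursion-related cost easy to see: each pending call keeps a frame on the stack, and going past the interpreter's recursion limit raises a `RecursionError` (a small sketch):
```python
import sys

def count_down(n):
    # each pending call holds a stack frame until the base case is reached
    if n == 0:
        return 0
    return count_down(n - 1)

print(sys.getrecursionlimit())   # typically 1000 by default
count_down(500)                  # fine: well under the limit
# count_down(10**6)              # would raise RecursionError (maximum recursion depth exceeded)
```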
<br>
#### tail recursion
<br>
* tail recursion is a recursion where the recursive call is the final instruction in the function, and there should be only one recursive call in the function.
* tail recursion is exempt from the space overhead discussed above, as it skips an entire chain of recursive call returns and goes straight back to the original caller (it does not need a stack frame for every recursive call: instead of allocating new space on the stack, the system can simply reuse the space allocated for the previous call).
* some languages' compilers recognize the tail recursion pattern and optimize its execution (e.g., C and C++, but not Java, Rust, or Python, although it is possible to implement it ad-hoc).
<br>
```python
def sum_non_tail_recursion(ls):
    if len(ls) == 0:
        return 0
    # not a tail recursion because it does some computation after the recursive call returned
    return ls[0] + sum_non_tail_recursion(ls[1:])


def sum_tail_recursion(ls):
    def helper(ls, acc):
        if len(ls) == 0:
            return acc
        # this is a tail recursion because the final instruction is a recursive call
        return helper(ls[1:], ls[0] + acc)
    return helper(ls, 0)
```
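* since python does not optimize tail calls, the tail-recursive version is usually rewritten as the equivalent loop, which reuses a single frame the way tail-call optimization would (a minimal sketch):
```python
def sum_iterative(ls):
    # acc plays the same role as the accumulator in sum_tail_recursion,
    # but the loop reuses one stack frame instead of chaining calls
    acc = 0
    for x in ls:
        acc += x
    return acc
```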
<br>