<br>

* a tree is a widely used abstract data type that represents a hierarchical structure with a set of connected nodes.

* each node in the tree can be connected to many children, but must be connected to exactly one parent (except for the root node).

* a tree is an undirected, connected, and acyclic graph: it has no cycles or loops.

<br>

<br>
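
* the snippets on this page assume a simple node class along these lines (a minimal sketch; the repo's actual class may differ):

<br>

```python
class Node:
    def __init__(self, val=0, left=None, right=None):
        self.val = val      # value stored in the node
        self.left = left    # left child (or None)
        self.right = right  # right child (or None)
```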

* **binary trees** are trees in which each node has up to 2 children.

* access, search, insert, and remove are all `O(log(N))` on balanced trees. the space complexity of traversing a balanced tree is `O(h)`, where `h` is the height of the tree (for very skewed trees this becomes `O(N)`).

* the **width** is the number of nodes in a level.

* the **degree** is the number of children of a node.

* a **complete tree** is a tree in which every level is fully filled (except perhaps the last).

* a **perfect tree** is both full and complete (it must have exactly `2**k - 1` nodes, where `k` is the number of levels).
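
* a quick way to check the `2**k - 1` property (a sketch; `count_nodes` and `count_levels` are hypothetical helpers, not from the repo):

<br>

```python
def count_nodes(root) -> int:
    if root is None:
        return 0
    return 1 + count_nodes(root.left) + count_nodes(root.right)


def count_levels(root) -> int:
    if root is None:
        return 0
    return 1 + max(count_levels(root.left), count_levels(root.right))


def is_perfect(root) -> bool:
    # a perfect tree with k levels has exactly 2**k - 1 nodes
    return count_nodes(root) == 2 ** count_levels(root) - 1
```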

<br>

---

### full trees

<br>

* a **full binary tree** has each node with either zero or two children (no node has only one child).

<br>

```python
def is_full(node) -> bool:
    if node is None:
        return True
    # a leaf is fine; otherwise both children must exist and be full
    if node.left is None and node.right is None:
        return True
    return bool(node.left and node.right) and is_full(node.left) and is_full(node.right)
```

<br>

---

### is leaf?

<br>

* a node is called a **leaf** if it has no children.

<br>

```python
def is_leaf(node):
    return bool(not node.right and not node.left)
```

<br>

---

### balanced trees

<br>

* a **balanced tree** is a binary tree in which the left and right subtrees of every node differ in height by no more than 1.

<br>

```python
def height(root):
    if root is None:
        return -1
    return 1 + max(height(root.left), height(root.right))


def is_balanced(root):
    if root is None:
        return True
    return abs(height(root.left) - height(root.right)) < 2 and \
           is_balanced(root.left) and is_balanced(root.right)
```

<br>

----

### depth of a binary tree

<br>

* the **height** of a node is the number of edges on the **longest path** between that node and a leaf.

* the **height of a tree** is the height of its root node, or the depth of its deepest node.
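
* for illustration, a minimal recursive `max_depth` sketch (the repo's own implementation for this section is not shown in this excerpt):

<br>

```python
def max_depth(root) -> int:
    # number of nodes on the longest root-to-leaf path (0 for an empty tree)
    if root is None:
        return 0
    return 1 + max(max_depth(root.left), max_depth(root.right))
```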

<br>

- depth-first search (DFS) can also be used to find the path from the root node to a target node, when you want to visit every node and/or search the deepest paths first.

- recursive solutions are easier to implement; however, if the recursion depth is too high, a stack overflow might occur. in that case, you might use BFS instead or implement DFS with an explicit stack (i.e., use a while loop and a stack structure to simulate the system call stack; a minimal sketch follows this list).

- overall, we only trace back and try another path after we reach the deepest node. as a result, the first path found in DFS is not always the shortest path:
    - we first push the root node to the stack, then we try the first neighbor and push its node to the stack, etc.
    - when we reach the deepest node, we need to trace back.
    - when we trace back, we pop the deepest node from the stack, which is actually the last node pushed to the stack.
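
- a minimal sketch of DFS with an explicit stack (preorder flavor, assuming the `Node` class above):

<br>

```python
def dfs_iterative(root) -> list:
    # visit nodes depth-first; tracing back is just popping the last pushed node
    result, stack = [], [root] if root else []
    while stack:
        node = stack.pop()
        result.append(node.val)
        # push the right child first so the left child is explored first
        if node.right:
            stack.append(node.right)
        if node.left:
            stack.append(node.left)
    return result
```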

<br>

- `left -> node -> right`

- in a bst, in-order traversal visits the values in ascending (sorted) order (therefore, it's the most frequently used method).

- converting a sorted array to a bst with inorder has no unique solution (on the other hand, both preorder and postorder are unique identifiers of a bst).

<br>

```python
def inorder(root):
    # (the original body is elided in this diff view; a standard recursive in-order)
    if root is None:
        return []
    return inorder(root.left) + [root.val] + inorder(root.right)


def inorder_iterative(root) -> list:
    # (the top of this function is elided in this diff view; a standard stack-based in-order)
    result, stack = [], []
    node = root
    while stack or node:
        while node:
            stack.append(node)
            node = node.left
        node = stack.pop()
        result.append(node.val)
        node = node.right

    return result
```

<br>

* we can also build an iterator:

<br>

```python
class BST_Iterator:

    def __init__(self, root):
        self.stack = []
        self.left_inorder(root)

    def left_inorder(self, root):
        # push the node and all of its left descendants
        while root:
            self.stack.append(root)
            root = root.left

    def next(self) -> int:
        top_node = self.stack.pop()
        if top_node.right:
            self.left_inorder(top_node.right)
        return top_node.val

    def has_next(self) -> bool:
        return len(self.stack) > 0
```

<br>

<br>

- `node -> left -> right`

- top-down (parameters are passed down to children), so deserialize with a queue (a sketch follows this list).
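
- a minimal sketch of preorder serialization and queue-based deserialization (assuming the `Node` class above; not the repo's own code):

<br>

```python
from collections import deque


def serialize(root) -> list:
    # preorder with None markers for missing children
    if root is None:
        return [None]
    return [root.val] + serialize(root.left) + serialize(root.right)


def deserialize(values) -> "Node":
    queue = deque(values)

    def build():
        val = queue.popleft()
        if val is None:
            return None
        node = Node(val)
        node.left = build()
        node.right = build()
        return node

    return build()
```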

<br>

<br>

- `left -> right -> node`

- bottom-up solution.

- the deletion process is always post-order: when you delete a node, you delete its left child and its right child before you delete the node itself.

- post-order can be used in mathematical expressions, as it's easier to write a program to parse a post-order expression: using a stack, each time you meet an operator, you pop 2 elements from the stack, calculate the result, and push the result back onto the stack (a sketch follows this list).
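
- a minimal sketch of this stack-based evaluation of a post-order (postfix) expression (an illustration, not the repo's code):

<br>

```python
def eval_postfix(tokens) -> float:
    # e.g., eval_postfix(["2", "3", "+", "4", "*"]) == 20.0
    stack = []
    for token in tokens:
        if token in ("+", "-", "*", "/"):
            right, left = stack.pop(), stack.pop()
            if token == "+":
                stack.append(left + right)
            elif token == "-":
                stack.append(left - right)
            elif token == "*":
                stack.append(left * right)
            else:
                stack.append(left / right)
        else:
            stack.append(float(token))
    return stack.pop()
```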

<br>

---

### build tree from inorder with preorder or postorder

<br>

* building with preorder:

<br>

```python
from typing import Optional


def build_tree(preorder, inorder) -> Optional[Node]:
    # (the surrounding lines are elided in this diff view; reconstructed here to
    # mirror the postorder version below)

    def helper(left, right, index_map):

        if left > right:
            return None

        root = Node(preorder.pop(0))  # this order changes from postorder
        index_here = index_map[root.val]

        root.left = helper(left, index_here - 1, index_map)  # this order changes from postorder
        root.right = helper(index_here + 1, right, index_map)

        return root

    index_map = {value: i for i, value in enumerate(inorder)}

    return helper(0, len(inorder) - 1, index_map)
```

<br>

* build with postorder:

<br>

```python
from typing import Optional


def build_tree(inorder, postorder) -> Optional[Node]:

    def fill_tree(left, right, index_map):

        if left > right:
            return None

        root = Node(postorder.pop())  # this order changes from preorder
        index_here = index_map[root.val]

        root.right = fill_tree(index_here + 1, right, index_map)  # this order changes from preorder
        root.left = fill_tree(left, index_here - 1, index_map)

        return root

    index_map = {value: i for i, value in enumerate(inorder)}

    return fill_tree(0, len(inorder) - 1, index_map)
```

<br>

---

### return number of unival subtrees

<br>

* a unival subtree means all nodes of the subtree have the same value.

<br>

```python
def count_unival(root) -> int:

    count = 0

    def dfs(node):
        nonlocal count

        if node is None:
            return True

        if dfs(node.left) and dfs(node.right):
            if (node.left and node.left.val != node.val) or \
               (node.right and node.right.val != node.val):
                return False
            count += 1
            return True

        return False

    dfs(root)
    return count
```

<br>

---

### successors and predecessors

<br>

```python
def successor(root):
    # the leftmost node of the right subtree
    root = root.right
    while root.left:
        root = root.left

    return root


def predecessor(root):
    # (mostly elided in this diff view; the mirror image of successor)
    root = root.left
    while root.right:
        root = root.right

    return root
```

<br>

----

### binary search trees

<br>

* a **binary search tree** is a binary tree where all nodes on the left are smaller than the root, which is smaller than all nodes on the right.

* if a bst is **balanced**, it guarantees `O(log(N))` for insert and search (as we keep the tree's height at `h = log(N)`).

* common types of balanced trees are **red-black** and **avl**.

<br>

---

### insert a node

<br>

* the main strategy is to find a proper leaf position for the target and then insert the node as a leaf (therefore, insertion begins as a search).

* the time complexity is `O(h)`, where `h` is the tree height. that results in `O(log(N))` in the average case and `O(N)` in the worst case.

<br>

```python
def bst_insert_iterative(root, val):

    node = root
    while node:

        if val > node.val:
            if not node.right:
                node.right = Node(val)
                break
            else:
                node = node.right

        else:
            if not node.left:
                node.left = Node(val)
                break
            else:
                node = node.left

    return root


def bst_insert_recursive(root, val):

    if root is None:
        return Node(val)

    if val > root.val:
        root.right = bst_insert_recursive(root.right, val)

    else:
        root.left = bst_insert_recursive(root.left, val)

    return root
```

<br>

---

### delete a node

<br>

* deletion is a more complicated operation, and there are several strategies.

* one of them is to replace the target node with a proper child:
    - if the target node has no child (it's a leaf): simply remove the node.
    - if the target node has one child, use that child to replace the node.
    - if the target node has two children, replace the node with its in-order successor or predecessor node and delete that node.

* similar to the recursive solution of the search operation, the time complexity is `O(h)` in the worst case.

* according to the depth of recursion, the space complexity is also `O(h)` in the worst case. we can also represent the complexity using the total number of nodes `N`.

* the time complexity and space complexity will be `O(log(N))` in the best case but `O(N)` in the worst case.

<br>

```python
def successor(root):
    root = root.right
    while root.left:
        root = root.left
    return root.val


def predecessor(root):
    root = root.left
    while root.right:
        root = root.right
    return root.val


def delete_node(root, key):

    if root is None:
        return root

    if key > root.val:
        root.right = delete_node(root.right, key)

    elif key < root.val:
        root.left = delete_node(root.left, key)

    else:
        if not (root.left or root.right):
            root = None

        elif root.right:
            root.val = successor(root)
            root.right = delete_node(root.right, root.val)

        else:
            root.val = predecessor(root)
            root.left = delete_node(root.left, root.val)

    return root
```

<br>

---

### search for a value

<br>

* for the recursive solution, in the worst case the depth of the recursion is equal to the height of the tree, so the time complexity is `O(h)`. the space complexity is also `O(h)`.

* for the iterative solution, the time complexity is equal to the number of loop iterations, which is also `O(h)`, while the space complexity is `O(1)`.

<br>

```python
def search_bst_recursive(root, val):

    if root is None or root.val == val:
        return root

    if val > root.val:
        return search_bst_recursive(root.right, val)

    else:
        return search_bst_recursive(root.left, val)


def search_bst_iterative(root, val):

    while root:

        if root.val == val:
            break

        if root.val < val:
            root = root.right

        else:
            root = root.left

    return root
```

<br>

---

### find the inorder successor of a node in a bst

<br>

```python
def find_successor(node1, node2):
    # walk down from node1, keeping the smallest value greater than node2.val
    successor = None

    while node1:

        if node1.val <= node2.val:
            node1 = node1.right
        else:
            successor = node1
            node1 = node1.left

    return successor
```

<br>

---

### convert sorted array to bst

<br>

* note that there is no unique solution.

<br>

```python
def convert_sorted_array_to_bst(nums):

    def helper(left, right):

        if left > right:
            return None

        p = (left + right) // 2

        root = Node(nums[p])
        root.left = helper(left, p - 1)
        root.right = helper(p + 1, right)

        return root

    return helper(0, len(nums) - 1)
```

<br>

---

### lowest common ancestor for a bst

<br>

```python
def lowest_common_ancestor(root, p, q):

    node, result = root, root

    while node:

        result = node

        if node.val > p.val and node.val > q.val:
            node = node.left

        elif node.val < p.val and node.val < q.val:
            node = node.right

        else:
            break

    return result
```

<br>

---

### checking if a bst is valid

<br>

```python
from collections import deque


def is_valid_bst_iterative(root):

    queue = deque([(root, float("-inf"), float("inf"))])

    while queue:

        node, min_val, max_val = queue.popleft()

        if node:
            if min_val >= node.val or node.val >= max_val:
                return False
            queue.append((node.left, min_val, node.val))
            queue.append((node.right, node.val, max_val))

    return True


def is_valid_bst_recursive(root, min_val=float("-inf"), max_val=float("inf")):

    if root is None:
        return True

    return (min_val < root.val < max_val) and \
           is_valid_bst_recursive(root.left, min_val, root.val) and \
           is_valid_bst_recursive(root.right, root.val, max_val)


def is_valid_bst_inorder(root):

    def inorder(node):
        if node is None:
            return True
        inorder(node.left)
        stack.append(node.val)
        inorder(node.right)

    stack = []
    inorder(root)

    for i in range(1, len(stack)):
        if stack[i] <= stack[i - 1]:
            return False

    return True
```

<br>