
Quick Tips

Concepts that are easy for me to forget

Solving problems

Think of data structures

Example:

Strings

Storing alphanumeric characters

You can store counts of all lowercase English letters in an array of length 26 (one slot per letter) → constant space & sometimes constant time. Alphanumeric characters need a bigger fixed-size array (e.g. 36 or 62 slots), but that is still constant space.
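
A minimal sketch of the idea, assuming lowercase English letters only (the helper name count_letters is just for illustration):

def count_letters(s):
    # Fixed-size array of length 26 -> O(1) extra space.
    counts = [0] * 26
    for ch in s:
        if ch.islower():
            counts[ord(ch) - ord('a')] += 1
    return counts

# Example use: two strings are anagrams if their letter counts match.
print(count_letters("listen") == count_letters("silent"))  # True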

Sorting an array of strings (M * N log N)

Let n be the number of elements in the str array and m the average/maximum number of characters of the strings in the str array. Then calling Arrays.sort(str) on this array has a performance characteristic of O(m * n log n): the sort performs O(n log n) comparisons, and each string comparison can take up to O(m) time.

Hashtables

collections.OrderedDict

collections.OrderedDict() can be used as a stack with the help of .popitem() and as a queue with .popitem(last=False)
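
A small sketch of both uses (the keys and values here are arbitrary examples):

from collections import OrderedDict

d = OrderedDict()
d["a"] = 1
d["b"] = 2
d["c"] = 3

# As a stack: .popitem() removes the most recently inserted item (LIFO).
print(d.popitem())            # ('c', 3)

# As a queue: .popitem(last=False) removes the oldest item (FIFO).
print(d.popitem(last=False))  # ('a', 1)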

Binary trees

Definition of terms

A balanced binary tree is a binary tree in which the height of the left and right subtree of any node differ by not more than 1.

A complete binary tree is a binary tree in which every level, except possibly the last, is completely filled, and all nodes are as far left as possible.

A full binary tree is a binary tree in which every node other than the leaves has two children.

A perfect binary tree is a full binary tree in which all leaves are at the same depth, and in which every parent has two children. (every level of the tree is completely full)
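
As an example of how one of these definitions translates into code, here is a minimal (not optimally efficient) sketch that checks the "balanced" property; the Node class and helper names are just for illustration:

class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def height(node):
    # Height of an empty subtree is -1, of a leaf is 0.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def is_balanced(node):
    # Balanced: left/right heights differ by at most 1 at every node.
    if node is None:
        return True
    if abs(height(node.left) - height(node.right)) > 1:
        return False
    return is_balanced(node.left) and is_balanced(node.right)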

Binary Search Tree

Key lookup, insertion, and deletion take time proportional to the height of the tree, which can in the worst case be O(n) if insertions and deletions are naively implemented, e.g. if the tree degenerates into a linked list.

The average case refers to the time the operation takes on a balanced tree (O(log n)); the worst case refers to the time it takes on a non-balanced, degenerate tree (O(n)).
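
A minimal BST sketch showing why these operations walk a single path from the root down (the BSTNode class and function names are just for illustration):

class BSTNode:
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None

def insert(root, val):
    # Walk down from the root; cost is proportional to the tree height.
    if root is None:
        return BSTNode(val)
    if val < root.val:
        root.left = insert(root.left, val)
    else:
        root.right = insert(root.right, val)
    return root

def search(root, val):
    node = root
    while node is not None and node.val != val:
        node = node.left if val < node.val else node.right
    return node is not None

# Naively inserting sorted input produces a linked-list-shaped tree (worst case).
root = None
for v in [1, 2, 3, 4, 5]:
    root = insert(root, v)
print(search(root, 4))  # True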

Graphs

Vertex/Node & Edge/Arc

Topological ordering

Vertices in a directed acyclic graph that have no incoming edges are referred to as sources. Vertices that have no outgoing edges are referred to as sinks.

Indegree → count of incoming edges of each vertex/node or how many parents it has (used to determine sources)
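
A minimal sketch of topological ordering using indegrees (Kahn's algorithm), assuming the graph is given as an adjacency list mapping each vertex to its outgoing neighbours:

from collections import deque

def topological_order(graph):
    # graph: {vertex: [outgoing neighbours]}
    indegree = {v: 0 for v in graph}
    for v in graph:
        for nxt in graph[v]:
            indegree[nxt] += 1

    # Sources are the vertices with no incoming edges.
    queue = deque(v for v in graph if indegree[v] == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for nxt in graph[v]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    return order  # shorter than len(graph) if the graph has a cycle

print(topological_order({"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}))
# ['a', 'b', 'c', 'd']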

Representing Graphs in Code

Edge List: a plain list of all edges; well suited to iterating over or listing all edges, but checking whether a particular edge exists requires scanning the list (O(E))

Adjacency List: this representation gives constant-time access to a vertex's list of neighbours (iterating over them costs O(degree)), which is useful in many query and pathfinding scenarios.

Adjacency Matrix: O(1) edge lookups. Easy and fast to tell if a pair (i,j) is an edge: simply check if A[i][j] is 1 or 0
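
A small sketch of the three representations for the same directed graph with vertices 0..2 and edges (0,1), (0,2), (1,2):

# Edge list: just the edges.
edge_list = [(0, 1), (0, 2), (1, 2)]

# Adjacency list: vertex -> neighbours.
adjacency_list = {0: [1, 2], 1: [2], 2: []}

# Adjacency matrix: matrix[i][j] == 1 iff (i, j) is an edge; O(V^2) space.
adjacency_matrix = [
    [0, 1, 1],
    [0, 0, 1],
    [0, 0, 0],
]
print(adjacency_matrix[0][2] == 1)  # O(1) edge lookup -> True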

DFS & BFS

The time complexity of BFS is O(V + E) when an adjacency list is used and O(V^2) when an adjacency matrix is used, where V stands for vertices and E stands for edges.

Use a stack for DFS (imagine a vertical flow) and a queue for BFS (a horizontal, level-by-level flow).

Remember to unvisit nodes (remove them from the visited set) when backtracking during DFS on a graph, e.g. when enumerating all paths.
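
A sketch of both traversals on an adjacency list, showing the queue-vs-stack difference (the recursive DFS uses the call stack):

from collections import deque

def bfs(graph, start):
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()       # queue -> FIFO -> level by level
        order.append(node)
        for nxt in graph[node]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return order

def dfs(graph, node, visited=None, order=None):
    if visited is None:
        visited, order = set(), []
    visited.add(node)
    order.append(node)               # go deep before going wide
    for nxt in graph[node]:
        if nxt not in visited:
            dfs(graph, nxt, visited, order)
    return order

g = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(bfs(g, 0))  # [0, 1, 2, 3]
print(dfs(g, 0))  # [0, 1, 3, 2]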

Dijkstra's Algorithm Time & Space complexity

O((v + e) * log(v)) time | O(v) space - where v is the number of vertices and e is the number of edges in the input graph
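
A minimal Dijkstra sketch using a binary heap (Python's heapq), assuming the graph is a weighted adjacency list of the form {vertex: [(neighbour, weight), ...]}:

import heapq

def dijkstra(graph, source):
    # Shortest known distance to every vertex, initially infinity except the source.
    distances = {v: float("inf") for v in graph}
    distances[source] = 0
    heap = [(0, source)]                    # (distance, vertex)
    while heap:
        dist, node = heapq.heappop(heap)
        if dist > distances[node]:          # stale heap entry, skip it
            continue
        for neighbour, weight in graph[node]:
            new_dist = dist + weight
            if new_dist < distances[neighbour]:
                distances[neighbour] = new_dist
                heapq.heappush(heap, (new_dist, neighbour))
    return distances

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}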


Searching

Quick select

""" Time complexity
Best: O(n) time | O(1) space - where n is the length of the input array 
Average: O(n) time | O(1) space
Worst: O(n^2) time | O(1) space
"""

Sorting

Remember these exist:

  • Topological sort
  • Cyclic sort

Bubble sort

# Best: O(n) time | O(1) space -> if already sorted
# Average: O(n^2) time | O(1) space
# Worst: O(n^2) time | O(1) space
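
A short sketch showing where the O(n) best case comes from: if a full pass makes no swaps, the array is already sorted and we can stop early.

def bubble_sort(array):
    n = len(array)
    for end in range(n - 1, 0, -1):
        swapped = False
        for i in range(end):
            if array[i] > array[i + 1]:
                array[i], array[i + 1] = array[i + 1], array[i]
                swapped = True
        if not swapped:      # no swaps -> already sorted -> O(n) best case
            break
    return array

print(bubble_sort([3, 1, 2]))  # [1, 2, 3]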

Insertion sort

  1. Divide the array into a sorted subarray (initially just the first element) and an unsorted subarray (the rest).
  2. Take the next element from the unsorted subarray and insert it into its correct position in the sorted subarray by repeatedly swapping it leftwards, similar to a Bubble sort pass (sketched below, after the complexity notes).
# Best: O(n) time | O(1) space -> if already sorted
# Average: O(n^2) time | O(1) space
# Worst: O(n^2) time | O(1) space
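
A minimal sketch of those two steps:

def insertion_sort(array):
    for i in range(1, len(array)):
        j = i
        # Swap the element leftwards until it sits in its sorted position.
        while j > 0 and array[j] < array[j - 1]:
            array[j], array[j - 1] = array[j - 1], array[j]
            j -= 1
    return array

print(insertion_sort([5, 2, 4, 1]))  # [1, 2, 4, 5]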

Selection sort

Divide the input array into two subarrays in place. The first subarray should be sorted at all times and should start with a length of 0, while the second subarray should be unsorted. Select the smallest (or largest) element in the unsorted subarray and insert it into the sorted subarray with a swap.

# Best: O(n^2) time | O(1) space
# Average: O(n^2) time | O(1) space
# Worst: O(n^2) time | O(1) space
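
A minimal sketch of that description:

def selection_sort(array):
    for start in range(len(array) - 1):
        # Find the smallest element in the unsorted subarray array[start:].
        smallest = start
        for i in range(start + 1, len(array)):
            if array[i] < array[smallest]:
                smallest = i
        # Swap it onto the end of the sorted subarray.
        array[start], array[smallest] = array[smallest], array[start]
    return array

print(selection_sort([4, 1, 3, 2]))  # [1, 2, 3, 4]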

Merge Sort

Recursively split the array into two halves, sort each half, then merge the sorted halves. The time complexity of merge sort is O(n log(n)); the standard implementation uses O(n) extra space for merging.
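
A minimal sketch (not in place; builds new lists for the merge):

def merge_sort(array):
    if len(array) <= 1:
        return array
    mid = len(array) // 2
    left = merge_sort(array[:mid])
    right = merge_sort(array[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 3, 8, 1]))  # [1, 3, 5, 8]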

Quicksort

  1. We first select an element that we will call the pivot from the array.
  2. Move all elements that are smaller than the pivot to the left of the pivot; move all elements that are larger than the pivot to the right of the pivot. This is called the partition operation.
  3. Recursively apply the above 2 steps separately to each of the sub-arrays of elements with smaller and bigger values than the last pivot.
# Average: O(nlog(n)) time | O(log(n)) space
# Worst: O(n^2) time | O(log(n)) space
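
A minimal sketch of those three steps, using the last element of the current range as the pivot (Lomuto-style partition):

def quicksort(array, left=0, right=None):
    if right is None:
        right = len(array) - 1
    if left >= right:
        return array
    # Partition: move elements smaller than the pivot to its left.
    pivot = array[right]
    store = left
    for i in range(left, right):
        if array[i] < pivot:
            array[i], array[store] = array[store], array[i]
            store += 1
    array[store], array[right] = array[right], array[store]
    # Recursively sort the two sub-arrays around the pivot.
    quicksort(array, left, store - 1)
    quicksort(array, store + 1, right)
    return array

print(quicksort([6, 2, 9, 1, 5]))  # [1, 2, 5, 6, 9]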

Heap Sort

Heap sort is an improved version of selection sort: it uses a heap data structure rather than a linear-time scan to find the largest remaining item. With the heap, finding the next largest element takes O(log(n)) time instead of the O(n) linear scan in simple selection sort. This allows heap sort to run in O(n log(n)) time, which is also its worst-case complexity.
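
A short sketch using Python's heapq as the heap: each pop of the next smallest element costs O(log n), giving O(n log n) overall (a textbook heap sort instead builds a max-heap in place, but the idea is the same).

import heapq

def heap_sort(array):
    heap = list(array)
    heapq.heapify(heap)                 # O(n) heap construction
    # Repeatedly extract the minimum in O(log n) -> O(n log n) total.
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([4, 10, 3, 5, 1]))  # [1, 3, 4, 5, 10]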

Bit manipulation

Shifting

Left shift: 0010 << 1 → 0100. A single left shift multiplies a binary number by 2.

Arithmetic right shift: 1011 >> 1 → 1101 (the sign bit is copied into the vacated position).

Logical right shift (not in Python): shifts a 0 in from the left; interpreted as an unsigned number, it divides the value by 2, throwing out any remainder.
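
Small Python examples (Python's >> behaves like an arithmetic shift; a logical shift can be emulated by masking to a fixed width first):

x = 0b0010
print(bin(x << 1))    # 0b100 -> left shift multiplies by 2
print(6 >> 1)         # 3     -> right shift divides by 2, dropping the remainder
print(-8 >> 1)        # -4    -> Python's >> is arithmetic (keeps the sign)
# Emulating a logical right shift of a 4-bit value:
print(bin((0b1011 & 0b1111) >> 1))  # 0b101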



Find the original version of this page (with additional content) on Notion here.


