
Stay Sharp: Data Structure and Algorithm Cheatsheet

Keeping up with interview prep is hard over time. If we’re being honest, the interviews most of our industry runs are nothing like our day jobs. But to land most of the high-paying jobs in the field, you still have to get through a few rounds of programming interviews, generally consisting of Data Structure and Algorithm questions.

To combat that programming-question atrophy, After CompSci has a new series: Stay Sharp. Stay Sharp will give you interview questions and prep material so that you won’t have to spend three months rebuilding your interviewing skills before the next interview cycle.

Top 10 Data Structures

Here’s a cheat sheet of the Top 10 Data Structures for a software engineering interview, including typical time complexities (Big O notation) for common operations like insertion, deletion, search, and access, as well as their space complexity.

| Data Structure | Average Time (Access / Search / Insert / Delete) | Worst-Case Time (Access / Search / Insert / Delete) | Space Complexity |
| --- | --- | --- | --- |
| Array | O(1) / O(n) / O(n) / O(n) | O(1) / O(n) / O(n) / O(n) | O(n) |
| Linked List | O(n) / O(n) / O(1) / O(1) | O(n) / O(n) / O(1) / O(1) | O(n) |
| Stack | O(n) / O(n) / O(1) / O(1) | O(n) / O(n) / O(1) / O(1) | O(n) |
| Queue | O(n) / O(n) / O(1) / O(1) | O(n) / O(n) / O(1) / O(1) | O(n) |
| Hash Table | O(1) / O(1) / O(1) / O(1) | O(n) / O(n) / O(n) / O(n) | O(n) |
| Binary Search Tree | O(log n) / O(log n) / O(log n) / O(log n) | O(n) / O(n) / O(n) / O(n) | O(n) |
| Heap | O(n) / O(n) / O(log n) / O(log n) | O(n) / O(n) / O(log n) / O(log n) | O(n) |
| Graph (Adjacency List) | O(1) / O(V + E) / O(1) / O(E) | O(1) / O(V + E) / O(1) / O(E) | O(V + E) |
| Graph (Adjacency Matrix) | O(1) / O(V²) / O(1) / O(V) | O(1) / O(V²) / O(1) / O(V) | O(V²) |
| Trie | O(m) / O(m) / O(m) / O(m) | O(m) / O(m) / O(m) / O(m) | O(m * n) |

Key Notes:

  • n: Number of elements in the data structure
  • V: Number of vertices (for Graph)
  • E: Number of edges (for Graph)
  • m: Length of the input string (for Trie)
  • Arrays are great for direct access but not for insertion or deletion, while linked lists provide fast insertions/deletions at the expense of access time.
  • Stacks and queues offer efficient LIFO/FIFO operations but can be slow for searching.
  • Hash tables provide very fast average operations but degrade in the worst case due to collisions (see the Python sketch after these notes).
  • Heaps are optimal for priority-queue operations, where peeking the min/max is O(1) and insert/delete are O(log n), but not for direct access or search (also shown in the sketch below).
  • Trees and tries support hierarchical data, with trees enabling efficient sorted operations and tries efficient prefix operations (a minimal trie sketch also follows).
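
To make a few of these notes concrete, here is a minimal Python sketch (an illustration, not part of the cheat sheet itself) showing how standard-library types map onto a hash table, a queue/stack, and a heap; the variable names are made up for demonstration.

```python
from collections import deque
import heapq

# Hash table: dict gives O(1) average insert, lookup, and delete.
ages = {}
ages["ada"] = 36          # insert
print("ada" in ages)      # search -> True
del ages["ada"]           # delete

# Queue: deque gives O(1) appends and pops at both ends (FIFO here).
queue = deque()
queue.append("first")     # enqueue
queue.append("second")
print(queue.popleft())    # dequeue -> "first"

# Stack: a plain list gives O(1) push/pop at the end (LIFO).
stack = [1, 2, 3]
stack.append(4)           # push
print(stack.pop())        # pop -> 4

# Heap: heapq keeps the smallest element at index 0.
# Insert and delete-min are O(log n); peeking the min is O(1).
tasks = []
heapq.heappush(tasks, (2, "write tests"))
heapq.heappush(tasks, (1, "fix the bug"))
print(tasks[0])              # peek min -> (1, 'fix the bug')
print(heapq.heappop(tasks))  # delete-min -> (1, 'fix the bug')
```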
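And here is a bare-bones trie sketch, assuming string keys, to show why insert and search cost O(m) in the key length m. It is a teaching example rather than a production implementation.

```python
class TrieNode:
    def __init__(self):
        self.children = {}    # character -> TrieNode
        self.is_word = False


class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        """Create/walk one node per character: O(m) for a key of length m."""
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def search(self, word):
        """Follow one edge per character: O(m)."""
        node = self.root
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_word


trie = Trie()
trie.insert("heap")
print(trie.search("heap"))   # True
print(trie.search("hea"))    # False: "hea" is only a prefix, not an inserted word
```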

10 Common Algorithms

Here’s a cheat sheet of 10 algorithms commonly encountered in software engineering interviews, including their time and space complexities (Big O notation).

| Algorithm Name | Time Complexity (Big O) | Space Complexity (Big O) |
| --- | --- | --- |
| Binary Search | O(log n) | O(1) |
| Merge Sort | O(n log n) | O(n) |
| Quick Sort | O(n log n) average, O(n²) worst | O(log n) average, O(n) worst |
| Breadth-First Search (BFS) | O(V + E) | O(V) |
| Depth-First Search (DFS) | O(V + E) | O(V) |
| Dijkstra’s Algorithm | O((V + E) log V) | O(V + E) |
| Dynamic Programming (e.g., Knapsack, Fibonacci) | O(n) to O(n²), depending on the problem | O(n) to O(n²) |
| Kruskal’s Algorithm | O(E log E) | O(E) |
| Prim’s Algorithm | O((V + E) log V) | O(V) |
| Topological Sort | O(V + E) | O(V) |

Key Notes:

  • n: Number of elements in the input (array size, etc.)
  • V: Number of vertices (for Graph)
  • E: Number of edges (for Graph)
  • Binary Search is efficient for searching sorted arrays (sketch below).
  • Merge Sort is stable and guarantees O(n log n) performance, but requires additional space (sketch below).
  • Quick Sort is often faster in practice but can hit its O(n²) worst case if the pivot selection is poor.
  • BFS and DFS are the key graph traversal algorithms, useful for searching, shortest paths in unweighted graphs, and connected components (BFS sketch below).
  • Dijkstra’s Algorithm is the go-to for finding shortest paths in a graph with non-negative edge weights (sketch below).
  • Dynamic Programming is a key technique for optimization problems like Knapsack and Fibonacci: think top-down recursion with memoization or bottom-up tabulation (memoized Fibonacci sketch below).
  • Kruskal’s and Prim’s algorithms find minimum spanning trees in weighted graphs (Kruskal sketch below).
  • Topological Sort is useful for dependency resolution in directed acyclic graphs (DAGs) (Kahn’s algorithm sketch below).
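
For the binary search note above, here is a minimal iterative sketch over a sorted Python list; returning -1 for a missing target is just one common convention.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent: O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1     # target can only be in the right half
        else:
            hi = mid - 1     # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 8, 13], 8))   # 3
print(binary_search([1, 3, 5, 8, 13], 4))   # -1
```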
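For the merge sort note, a short top-down sketch; returning a new list is one of several reasonable implementations and shows where the O(n) extra space comes from.

```python
def merge_sort(items):
    """Stable O(n log n) sort; the merge step uses O(n) extra space."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:     # <= keeps equal elements in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))   # [1, 2, 3, 5, 8, 9]
```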
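For the BFS/DFS note, a queue-based BFS sketch over an adjacency-list dict; the example graph is made up for illustration, and the returned hop counts are the shortest paths in an unweighted graph.

```python
from collections import deque

def bfs(graph, start):
    """Visit every vertex reachable from start in O(V + E)."""
    distance = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in distance:        # first visit = shortest hop count
                distance[neighbor] = distance[node] + 1
                queue.append(neighbor)
    return distance

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs(graph, "a"))   # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```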
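For the Dijkstra note, a heap-based sketch assuming the graph is an adjacency list of (neighbor, weight) pairs with non-negative weights.

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source with a binary heap: O((V + E) log V)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry, already improved
        for neighbor, weight in graph.get(node, []):
            new_dist = d + weight
            if new_dist < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_dist
                heapq.heappush(heap, (new_dist, neighbor))
    return dist

graph = {
    "a": [("b", 4), ("c", 1)],
    "c": [("b", 2)],
    "b": [("d", 5)],
    "d": [],
}
print(dijkstra(graph, "a"))   # a: 0, b: 3, c: 1, d: 8
```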
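For the dynamic programming note, the classic memoized Fibonacci: caching each subproblem turns exponential recursion into O(n) time.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Top-down DP: each subproblem is computed once, so O(n) time."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))   # 12586269025
```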
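For the minimum spanning tree note, a Kruskal sketch with a small union-find; vertices are assumed to be numbered 0..V-1, and edges are (weight, u, v) tuples.

```python
def kruskal(num_vertices, edges):
    """MST via Kruskal: sort edges (O(E log E)) and keep each edge that
    connects two different components (tracked with union-find)."""
    parent = list(range(num_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving keeps trees shallow
            x = parent[x]
        return x

    mst, total = [], 0
    for weight, u, v in sorted(edges):
        root_u, root_v = find(u), find(v)
        if root_u != root_v:
            parent[root_u] = root_v         # union the two components
            mst.append((u, v, weight))
            total += weight
    return mst, total

edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 2, 3)]   # (weight, u, v)
print(kruskal(4, edges))   # ([(0, 1, 1), (1, 2, 2), (2, 3, 5)], 8)
```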
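For the topological sort note, a sketch of Kahn’s algorithm over an adjacency-list dict whose edges point from a prerequisite to the things that depend on it.

```python
from collections import deque

def topological_sort(graph):
    """Kahn's algorithm: repeatedly output vertices with no remaining
    incoming edges. O(V + E) time; raises if the graph has a cycle."""
    in_degree = {node: 0 for node in graph}
    for node in graph:
        for neighbor in graph[node]:
            in_degree[neighbor] = in_degree.get(neighbor, 0) + 1

    queue = deque(node for node, deg in in_degree.items() if deg == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            in_degree[neighbor] -= 1
            if in_degree[neighbor] == 0:
                queue.append(neighbor)

    if len(order) != len(in_degree):
        raise ValueError("graph has a cycle; no topological order exists")
    return order

# Edges point from a prerequisite to its dependents.
deps = {"stdlib": ["lib"], "lib": ["app"], "app": []}
print(topological_sort(deps))   # ['stdlib', 'lib', 'app']
```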
