-
- Best practice questions for coding interviews
- Grind 75 - the next evolution of Blind 75, bigger and better
- How to prepare for coding interviews
- Coding interview best practices - Straight-to-the-point Do's and Don'ts
- Algorithm cheatsheets and tips categorized by topic
- Step-by-step Software Engineer resume guide to prepare a FAANG-ready resume
- Behavioral questions asked by the top tech companies
- Front end interview preparation
- Tech Interview Guide
- Common Python Data Structures & Algorithms
-
- Videos:
- Implement a vector (mutable array with automatic resizing):
- Practice coding using arrays and pointers, and pointer math to jump to an index instead of using indexing.
- new raw data array with allocated memory
- can allocate int array under the hood, just not use its features
- start with 16, or if starting number is greater, use power of 2 - 16, 32, 64, 128
- size() - number of items
- capacity() - number of items it can hold
- is_empty()
- at(index) - returns item at given index, blows up if index out of bounds
- push(item)
- insert(index, item) - inserts item at index, shifts that index's value and trailing elements to the right
- prepend(item) - can use insert above at index 0
- pop() - remove from end, return value
- delete(index) - delete item at index, shifting all trailing elements left
- remove(item) - looks for value and removes index holding it (even if in multiple places)
- find(item) - looks for value and returns first index with that value, -1 if not found
- resize(new_capacity) // private function
- when you reach capacity, resize to double the size
- when popping an item, if size is 1/4 of capacity, resize to half
- Time
- O(1) to add/remove at end (amortized for allocations for more space), index, or update
- O(n) to insert/remove elsewhere
- Space
- contiguous in memory, so proximity helps performance
- space needed = (array capacity, which is >= n) * size of item, but even if 2n, still O(n)
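A minimal Python sketch of the vector described above, assuming a plain Python list as the raw backing store (the class and method names mirror the list above; this is one possible implementation, not the only one):

```python
class Vector:
    """Mutable array with automatic resizing (sketch)."""

    def __init__(self):
        self._capacity = 16                   # start with 16 slots
        self._size = 0
        self._data = [None] * self._capacity  # "raw" backing storage

    def size(self):
        return self._size

    def capacity(self):
        return self._capacity

    def is_empty(self):
        return self._size == 0

    def at(self, index):
        if not 0 <= index < self._size:
            raise IndexError("index out of bounds")
        return self._data[index]

    def push(self, item):
        if self._size == self._capacity:
            self._resize(self._capacity * 2)  # double when full
        self._data[self._size] = item
        self._size += 1

    def insert(self, index, item):
        if not 0 <= index <= self._size:
            raise IndexError("index out of bounds")
        if self._size == self._capacity:
            self._resize(self._capacity * 2)
        # shift trailing elements right to make room
        for i in range(self._size, index, -1):
            self._data[i] = self._data[i - 1]
        self._data[index] = item
        self._size += 1

    def prepend(self, item):
        self.insert(0, item)

    def pop(self):
        if self.is_empty():
            raise IndexError("pop from empty vector")
        value = self._data[self._size - 1]
        self._size -= 1
        if self._size <= self._capacity // 4:       # shrink when 1/4 full
            self._resize(max(16, self._capacity // 2))
        return value

    def delete(self, index):
        if not 0 <= index < self._size:
            raise IndexError("index out of bounds")
        # shift trailing elements left over the removed slot
        for i in range(index, self._size - 1):
            self._data[i] = self._data[i + 1]
        self._size -= 1

    def remove(self, item):
        # remove every occurrence of item
        i = 0
        while i < self._size:
            if self._data[i] == item:
                self.delete(i)
            else:
                i += 1

    def find(self, item):
        for i in range(self._size):
            if self._data[i] == item:
                return i
        return -1

    def _resize(self, new_capacity):
        new_data = [None] * new_capacity
        for i in range(self._size):
            new_data[i] = self._data[i]
        self._data = new_data
        self._capacity = new_capacity
```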
-
- Videos:
- C Code (video) - not the whole video, just portions about Node struct and memory allocation.
- Linked List vs Arrays:
- why you should avoid linked lists (video)
- Gotcha: you need pointer-to-pointer knowledge (for when you pass a pointer to a function that may change the address the pointer points to). This page is just to get a grasp on pointer to pointer. I don't recommend this list-traversal style; readability and maintainability suffer due to cleverness.
- implement (I did with tail pointer & without):
- size() - returns number of data elements in list
- empty() - bool returns true if empty
- value_at(index) - returns the value of the nth item (starting at 0 for first)
- push_front(value) - adds an item to the front of the list
- pop_front() - remove front item and return its value
- push_back(value) - adds an item at the end
- pop_back() - removes end item and returns its value
- front() - get value of front item
- back() - get value of end item
- insert(index, value) - insert value at index, so current item at that index is pointed to by new item at index
- erase(index) - removes node at given index
- value_n_from_end(n) - returns the value of the node at nth position from the end of the list
- reverse() - reverses the list
- remove_value(value) - removes the first item in the list with this value
- Doubly-linked List
- Description (video)
- No need to implement
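A Python sketch of a subset of the singly linked list operations listed under "implement" above, using head and tail pointers (insert, erase, and remove_value are omitted for brevity; names are illustrative):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next


class LinkedList:
    """Singly linked list with head and tail pointers (sketch)."""

    def __init__(self):
        self.head = None
        self.tail = None
        self._size = 0

    def size(self):
        return self._size

    def empty(self):
        return self._size == 0

    def value_at(self, index):
        node = self.head
        for _ in range(index):
            node = node.next
        return node.value

    def push_front(self, value):
        self.head = Node(value, self.head)
        if self.tail is None:
            self.tail = self.head
        self._size += 1

    def pop_front(self):
        value = self.head.value
        self.head = self.head.next
        if self.head is None:
            self.tail = None
        self._size -= 1
        return value

    def push_back(self, value):
        node = Node(value)
        if self.tail is None:
            self.head = self.tail = node
        else:
            self.tail.next = node
            self.tail = node
        self._size += 1

    def pop_back(self):
        # O(n): must walk to the node just before the tail
        if self.head is self.tail:
            return self.pop_front()
        node = self.head
        while node.next is not self.tail:
            node = node.next
        value = self.tail.value
        node.next = None
        self.tail = node
        self._size -= 1
        return value

    def front(self):
        return self.head.value

    def back(self):
        return self.tail.value

    def value_n_from_end(self, n):
        # n = 0 returns the last value (one reasonable convention)
        return self.value_at(self._size - 1 - n)

    def reverse(self):
        prev, node = None, self.head
        self.tail = self.head
        while node:
            node.next, prev, node = prev, node, node.next
        self.head = prev
```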
-
- Stacks (video)
- Using Stacks Last-In First-Out (video)
- Will not implement. Implementing with array is trivial.
-
- Using Queues First-In First-Out(video)
- Queue (video)
- Circular buffer/FIFO
- Priority Queues (video)
- Implement using linked-list, with tail pointer:
- enqueue(value) - adds value at position at tail
- dequeue() - returns value and removes least recently added element (front)
- empty()
- Implement using fixed-sized array:
- enqueue(value) - adds item at end of available storage
- dequeue() - returns value and removes least recently added element
- empty()
- full()
- Cost:
- a bad implementation using a linked list where you enqueue at the head and dequeue at the tail would be O(n), because you'd need the next-to-last element, causing a full traversal on each dequeue
- enqueue: O(1) (linked list and array; amortized when the array needs to grow)
- dequeue: O(1) (linked list and array)
- empty: O(1) (linked list and array)
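A Python sketch of both queue variants above: a linked list with a tail pointer, and a fixed-size circular buffer (names are illustrative):

```python
class QueueNode:
    def __init__(self, value):
        self.value = value
        self.next = None


class LinkedQueue:
    """Queue as a singly linked list with a tail pointer: O(1) enqueue and dequeue."""

    def __init__(self):
        self.head = None   # dequeue from the head (front)
        self.tail = None   # enqueue at the tail (back)

    def empty(self):
        return self.head is None

    def enqueue(self, value):
        node = QueueNode(value)
        if self.tail is None:
            self.head = self.tail = node
        else:
            self.tail.next = node
            self.tail = node

    def dequeue(self):
        if self.head is None:
            raise IndexError("dequeue from empty queue")
        value = self.head.value
        self.head = self.head.next
        if self.head is None:
            self.tail = None
        return value


class ArrayQueue:
    """Queue as a fixed-size circular buffer: read and write indices wrap around."""

    def __init__(self, capacity):
        self._data = [None] * capacity
        self._capacity = capacity
        self._read = 0
        self._write = 0
        self._count = 0

    def empty(self):
        return self._count == 0

    def full(self):
        return self._count == self._capacity

    def enqueue(self, value):
        if self.full():
            raise IndexError("enqueue to full queue")
        self._data[self._write] = value
        self._write = (self._write + 1) % self._capacity
        self._count += 1

    def dequeue(self):
        if self.empty():
            raise IndexError("dequeue from empty queue")
        value = self._data[self._read]
        self._read = (self._read + 1) % self._capacity
        self._count -= 1
        return value
```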
-
-
Videos:
-
Online Courses:
-
implement with array using linear probing
- hash(k, m) - m is size of hash table
- add(key, value) - if key already exists, update value
- exists(key)
- get(key)
- remove(key)
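A Python sketch of the hash table above using open addressing with linear probing; the fixed table size and the tombstone (`_DELETED`) marker are implementation choices for this sketch, not requirements from the list:

```python
class HashTable:
    """Open-addressing hash table with linear probing (sketch; no resizing)."""

    _EMPTY = object()      # slot never used
    _DELETED = object()    # slot freed by remove(); probing must continue past it

    def __init__(self, m=16):
        self.m = m
        self._keys = [self._EMPTY] * m
        self._values = [None] * m

    def hash(self, k, m):
        # any deterministic hash works; Python's built-in hash() is used here
        return hash(k) % m

    def _probe(self, key):
        # yield slot indices starting at hash(key), wrapping around the table
        start = self.hash(key, self.m)
        for i in range(self.m):
            yield (start + i) % self.m

    def add(self, key, value):
        first_free = None
        for i in self._probe(key):
            slot = self._keys[i]
            if slot is self._EMPTY:
                # key not present; reuse the earliest tombstone seen, if any
                target = first_free if first_free is not None else i
                self._keys[target] = key
                self._values[target] = value
                return
            if slot is self._DELETED:
                if first_free is None:
                    first_free = i
            elif slot == key:
                self._values[i] = value   # key exists: update value
                return
        if first_free is not None:
            self._keys[first_free] = key
            self._values[first_free] = value
            return
        raise RuntimeError("hash table is full")

    def _find_slot(self, key):
        for i in self._probe(key):
            slot = self._keys[i]
            if slot is self._EMPTY:
                return None
            if slot is not self._DELETED and slot == key:
                return i
        return None

    def exists(self, key):
        return self._find_slot(key) is not None

    def get(self, key):
        i = self._find_slot(key)
        if i is None:
            raise KeyError(key)
        return self._values[i]

    def remove(self, key):
        i = self._find_slot(key)
        if i is None:
            raise KeyError(key)
        self._keys[i] = self._DELETED
        self._values[i] = None
```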
-
-
- Binary Search (video)
- Binary Search (video)
- detail
- Implement:
- binary search (on sorted array of integers)
- binary search using recursion
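A Python sketch of both forms of binary search on a sorted array of integers:

```python
def binary_search(sorted_nums, target):
    """Iterative binary search. Returns an index, or -1 if not found."""
    lo, hi = 0, len(sorted_nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_nums[mid] == target:
            return mid
        if sorted_nums[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


def binary_search_recursive(sorted_nums, target, lo=0, hi=None):
    """The same search expressed with recursion."""
    if hi is None:
        hi = len(sorted_nums) - 1
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if sorted_nums[mid] == target:
        return mid
    if sorted_nums[mid] < target:
        return binary_search_recursive(sorted_nums, target, mid + 1, hi)
    return binary_search_recursive(sorted_nums, target, lo, mid - 1)


assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search_recursive([1, 3, 5, 7, 9], 4) == -1
```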
-
- Bits cheat sheet - you should know many of the powers of 2 from 2^1 to 2^16, as well as 2^32
- Get a really good understanding of manipulating bits with: &, |, ^, ~, >>, <<
- 2s and 1s complement
- count set bits
- round to next power of 2:
- swap values:
- absolute value:
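A few Python sketches of the bit tricks listed above; the fixed-width assumptions (e.g. 32 bits) are noted in the comments, since Python integers are arbitrary precision:

```python
def count_set_bits(x):
    """Kernighan's trick: clear the lowest set bit until none remain."""
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count


def next_power_of_two(x):
    """Round a positive integer up to the next power of 2 (bit-smearing; valid for x up to 2^32)."""
    x -= 1
    x |= x >> 1
    x |= x >> 2
    x |= x >> 4
    x |= x >> 8
    x |= x >> 16
    return x + 1


def swap(a, b):
    """XOR swap. In Python you'd just write a, b = b, a; this shows the bit trick."""
    a ^= b
    b ^= a
    a ^= b
    return a, b


def absolute_value(x):
    """Branchless absolute value via the sign mask (for x in 32-bit range; x >> 31 is -1 when negative, 0 otherwise)."""
    mask = x >> 31
    return (x + mask) ^ mask


assert count_set_bits(0b10110) == 3
assert next_power_of_two(17) == 32
assert swap(3, 9) == (9, 3)
assert absolute_value(-42) == 42
```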
-
- Series: Core Trees (video)
- Series: Trees (video)
- basic tree construction
- traversal
- manipulation algorithms
- BFS (breadth-first search)
- MIT (video)
- level order (BFS, using queue). Time complexity: O(n). Space complexity: best O(1), worst O(n/2) = O(n)
- DFS (depth-first search)
- MIT (video)
- notes: Time complexity: O(n). Space complexity: best O(log n) (the average height of the tree), worst O(n)
- inorder (DFS: left, self, right)
- postorder (DFS: left, right, self)
- preorder (DFS: self, left, right)
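A Python sketch of the traversals above: level order with a queue, and the three DFS orders:

```python
from collections import deque


class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right


def level_order(root):
    """BFS: visit nodes level by level using a queue. O(n) time, O(n) space worst case."""
    order = []
    queue = deque([root] if root else [])
    while queue:
        node = queue.popleft()
        order.append(node.value)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return order


def inorder(node, order=None):
    """DFS: left, self, right (yields sorted order on a BST)."""
    if order is None:
        order = []
    if node:
        inorder(node.left, order)
        order.append(node.value)
        inorder(node.right, order)
    return order


def preorder(node, order=None):
    """DFS: self, left, right."""
    if order is None:
        order = []
    if node:
        order.append(node.value)
        preorder(node.left, order)
        preorder(node.right, order)
    return order


def postorder(node, order=None):
    """DFS: left, right, self."""
    if order is None:
        order = []
    if node:
        postorder(node.left, order)
        postorder(node.right, order)
        order.append(node.value)
    return order


# small example tree:  2
#                     / \
#                    1   3
root = TreeNode(2, TreeNode(1), TreeNode(3))
assert level_order(root) == [2, 1, 3]
assert inorder(root) == [1, 2, 3]
assert preorder(root) == [2, 1, 3]
assert postorder(root) == [1, 3, 2]
```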
-
- Binary Search Tree Review (video)
- Series (video)
- starts with symbol table and goes through BST applications
- Introduction (video)
- MIT (video)
- C/C++:
- Binary search tree - Implementation in C/C++ (video)
- BST implementation - memory allocation in stack and heap (video)
- Find min and max element in a binary search tree (video)
- Find height of a binary tree (video)
- Binary tree traversal - breadth-first and depth-first strategies (video)
- Binary tree: Level Order Traversal (video)
- Binary tree traversal: Preorder, Inorder, Postorder (video)
- Check if a binary tree is binary search tree or not (video)
- Delete a node from Binary Search Tree (video)
- Inorder Successor in a binary search tree (video)
- Implement:
- insert // insert value into tree
- get_node_count // get count of values stored
- print_values // prints the values in the tree, from min to max
- delete_tree
- is_in_tree // returns true if given value exists in the tree
- get_height // returns the height in nodes (single node's height is 1)
- get_min // returns the minimum value stored in the tree
- get_max // returns the maximum value stored in the tree
- is_binary_search_tree
- delete_value
- get_successor // returns next-highest value in tree after given value, -1 if none
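A Python sketch of a subset of the BST operations above (recursive where that keeps it short; `print_values`, `delete_tree`, and `delete_value` are omitted for brevity):

```python
class BSTNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None


def insert(node, value):
    """Insert a value and return the (possibly new) subtree root."""
    if node is None:
        return BSTNode(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node


def get_node_count(node):
    return 0 if node is None else 1 + get_node_count(node.left) + get_node_count(node.right)


def is_in_tree(node, value):
    if node is None:
        return False
    if value == node.value:
        return True
    return is_in_tree(node.left, value) if value < node.value else is_in_tree(node.right, value)


def get_height(node):
    """Height in nodes: a single node has height 1."""
    if node is None:
        return 0
    return 1 + max(get_height(node.left), get_height(node.right))


def get_min(node):
    while node.left:
        node = node.left
    return node.value


def get_max(node):
    while node.right:
        node = node.right
    return node.value


def is_binary_search_tree(node, lo=float("-inf"), hi=float("inf")):
    """Every node must fall inside the (lo, hi) range implied by its ancestors."""
    if node is None:
        return True
    if not lo <= node.value <= hi:
        return False
    return (is_binary_search_tree(node.left, lo, node.value)
            and is_binary_search_tree(node.right, node.value, hi))


def get_successor(root, value):
    """Next-highest value in the tree after the given value, or -1 if none."""
    successor = -1
    node = root
    while node:
        if node.value > value:
            successor = node.value   # candidate; look for something smaller but still > value
            node = node.left
        else:
            node = node.right
    return successor


root = None
for v in [20, 8, 22, 4, 12]:
    root = insert(root, v)
assert get_min(root) == 4 and get_max(root) == 22
assert get_successor(root, 8) == 12
assert is_binary_search_tree(root)
```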
-
- visualized as a tree, but is usually linear in storage (array, linked list)
- Heap
- Introduction (video)
- Naive Implementations (video)
- Binary Trees (video)
- Tree Height Remark (video)
- Basic Operations (video)
- Complete Binary Trees (video)
- Pseudocode (video)
- Heap Sort - jumps to start (video)
- Heap Sort (video)
- Building a heap (video)
- MIT: Heaps and Heap Sort (video)
- CS 61B Lecture 24: Priority Queues (video)
- Linear Time BuildHeap (max-heap)
- Implement a max-heap:
- insert
- sift_up - needed for insert
- get_max - returns the max item, without removing it
- get_size() - return number of elements stored
- is_empty() - returns true if heap contains no elements
- extract_max - returns the max item, removing it
- sift_down - needed for extract_max
- remove(i) - removes item at index i
- heapify - create a heap from an array of elements, needed for heap_sort
- heap_sort() - take an unsorted array and turn it into a sorted array in-place using a max heap
- note: using a min heap instead would save operations, but double the space needed (cannot do in-place).
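A Python sketch of the max-heap above. Note this version of heap_sort sorts into a new list rather than in place, and `remove(i)` uses a float-to-top trick that assumes numeric keys:

```python
class MaxHeap:
    """Array-backed max-heap (sketch). Parent of i is (i - 1) // 2; children are 2i + 1 and 2i + 2."""

    def __init__(self):
        self._data = []

    def get_size(self):
        return len(self._data)

    def is_empty(self):
        return not self._data

    def get_max(self):
        return self._data[0]

    def insert(self, value):
        self._data.append(value)
        self._sift_up(len(self._data) - 1)

    def extract_max(self):
        top = self._data[0]
        last = self._data.pop()
        if self._data:
            self._data[0] = last
            self._sift_down(0)
        return top

    def remove(self, i):
        """Remove the item at index i by floating it to the top, then extracting it."""
        self._data[i] = float("inf")
        self._sift_up(i)
        self.extract_max()

    def _sift_up(self, i):
        while i > 0:
            parent = (i - 1) // 2
            if self._data[i] <= self._data[parent]:
                break
            self._data[i], self._data[parent] = self._data[parent], self._data[i]
            i = parent

    def _sift_down(self, i):
        n = len(self._data)
        while True:
            largest, left, right = i, 2 * i + 1, 2 * i + 2
            if left < n and self._data[left] > self._data[largest]:
                largest = left
            if right < n and self._data[right] > self._data[largest]:
                largest = right
            if largest == i:
                return
            self._data[i], self._data[largest] = self._data[largest], self._data[i]
            i = largest


def heapify(values):
    """Build a heap from an unsorted list in O(n) by sifting down from the last parent."""
    heap = MaxHeap()
    heap._data = list(values)
    for i in range(len(heap._data) // 2 - 1, -1, -1):
        heap._sift_down(i)
    return heap


def heap_sort(values):
    """Sort ascending by repeatedly extracting the max."""
    heap = heapify(values)
    out = [heap.extract_max() for _ in range(heap.get_size())]
    return out[::-1]


assert heap_sort([5, 1, 9, 3]) == [1, 3, 5, 9]
```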
-
Notes:
- Implement sorts & know best case/worst case, average complexity of each:
- no bubble sort - it's terrible - O(n^2), except when n <= 16
- stability in sorting algorithms ("Is Quicksort stable?")
- Which algorithms can be used on linked lists? Which on arrays? Which on both?
- I wouldn't recommend sorting a linked list, but merge sort is doable.
- Merge Sort For Linked List
-
For heapsort, see Heap data structure above. Heap sort is great, but not stable.
-
UC Berkeley:
-
Merge sort code:
-
Quick sort code:
-
Implement:
- Mergesort: O(n log n) average and worst case
- Quicksort O(n log n) average case
- Selection sort and insertion sort are both O(n^2) average and worst case
- For heapsort, see Heap data structure above.
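Python sketches of merge sort and quicksort matching the complexities listed above (quicksort here uses a Lomuto partition with the last element as pivot; other pivot choices work too):

```python
def merge_sort(items):
    """O(n log n) worst case, stable; uses O(n) extra space for the merge."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


def quick_sort(items, lo=0, hi=None):
    """O(n log n) average, O(n^2) worst case; in-place, not stable."""
    if hi is None:
        hi = len(items) - 1
    if lo >= hi:
        return items
    # Lomuto partition: everything < pivot moves left of the store index
    pivot, store = items[hi], lo
    for i in range(lo, hi):
        if items[i] < pivot:
            items[i], items[store] = items[store], items[i]
            store += 1
    items[store], items[hi] = items[hi], items[store]
    quick_sort(items, lo, store - 1)
    quick_sort(items, store + 1, hi)
    return items


assert merge_sort([5, 2, 9, 1]) == [1, 2, 5, 9]
assert quick_sort([5, 2, 9, 1]) == [1, 2, 5, 9]
```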
-
Not required, but I recommended them:
As a summary, here is a visual representation of 15 sorting algorithms. If you need more detail on this subject, see the "Sorting" section in Additional Detail on Some Subjects.
Graphs can be used to represent many problems in computer science, so this section is long, like trees and sorting were.
-
Notes:
- There are 4 basic ways to represent a graph in memory:
- objects and pointers
- adjacency matrix
- adjacency list
- adjacency map
- Familiarize yourself with each representation and its pros & cons
- BFS and DFS - know their computational complexity, their tradeoffs, and how to implement them in real code
- When asked a question, look for a graph-based solution first, then move on if none.
-
Skiena Lectures - great intro:
- CSE373 2012 - Lecture 11 - Graph Data Structures (video)
- CSE373 2012 - Lecture 12 - Breadth-First Search (video)
- CSE373 2012 - Lecture 13 - Graph Algorithms (video)
- CSE373 2012 - Lecture 14 - Graph Algorithms (con't) (video)
- CSE373 2012 - Lecture 15 - Graph Algorithms (con't 2) (video)
- CSE373 2012 - Lecture 16 - Graph Algorithms (con't 3) (video)
-
Graphs (review and more):
- 6.006 Single-Source Shortest Paths Problem (video)
- 6.006 Dijkstra (video)
- 6.006 Bellman-Ford (video)
- 6.006 Speeding Up Dijkstra (video)
- Aduni: Graph Algorithms I - Topological Sorting, Minimum Spanning Trees, Prim's Algorithm - Lecture 6 (video)
- Aduni: Graph Algorithms II - DFS, BFS, Kruskal's Algorithm, Union Find Data Structure - Lecture 7 (video)
- Aduni: Graph Algorithms III: Shortest Path - Lecture 8 (video)
- Aduni: Graph Alg. IV: Intro to geometric algorithms - Lecture 9 (video)
-
CS 61B 2014 (starting at 58:09) (video)
- CS 61B 2014: Weighted graphs (video)
- Greedy Algorithms: Minimum Spanning Tree (video)
- Strongly Connected Components Kosaraju's Algorithm Graph Algorithm (video)
-
Full Coursera Course:
-
I'll implement:
- DFS with adjacency list (recursive)
- DFS with adjacency list (iterative with stack)
- DFS with adjacency matrix (recursive)
- DFS with adjacency matrix (iterative with stack)
- BFS with adjacency list
- BFS with adjacency matrix
- single-source shortest path (Dijkstra)
- minimum spanning tree
- DFS-based algorithms (see Aduni videos above):
- check for cycle (needed for topological sort, since we'll check for cycle before starting)
- topological sort
- count connected components in a graph
- list strongly connected components
- check for bipartite graph
You'll get more graph practice in Skiena's book (see Books section below) and the interview books
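A Python sketch of a few of the items above, using an adjacency-list representation (a plain dict here); the sample graph and names are illustrative:

```python
from collections import deque

# adjacency list: node -> list of neighbors (directed, acyclic in this example)
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
}


def dfs_recursive(adj, start, visited=None, order=None):
    """Depth-first search over an adjacency list, recursive form."""
    if visited is None:
        visited, order = set(), []
    visited.add(start)
    order.append(start)
    for neighbor in adj[start]:
        if neighbor not in visited:
            dfs_recursive(adj, neighbor, visited, order)
    return order


def dfs_iterative(adj, start):
    """Same traversal using an explicit stack."""
    visited, order, stack = set(), [], [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        # push neighbors in reverse so they pop in the listed order
        stack.extend(reversed(adj[node]))
    return order


def bfs(adj, start):
    """Breadth-first search: visit nodes in order of distance from start."""
    visited, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in adj[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order


def topological_sort(adj):
    """DFS-based topological sort; raises if a cycle is found (the cycle check comes first, as noted above)."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in adj}
    order = []

    def visit(node):
        color[node] = GRAY
        for neighbor in adj[node]:
            if color[neighbor] == GRAY:
                raise ValueError("graph has a cycle; no topological order exists")
            if color[neighbor] == WHITE:
                visit(neighbor)
        color[node] = BLACK
        order.append(node)

    for node in adj:
        if color[node] == WHITE:
            visit(node)
    return order[::-1]


assert bfs(graph, "a") == ["a", "b", "c", "d"]
assert dfs_recursive(graph, "a") == dfs_iterative(graph, "a") == ["a", "b", "d", "c"]
assert topological_sort(graph)[0] == "a"
```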
-
- Stanford lectures on recursion & backtracking:
- when it is appropriate to use it
- how tail recursion compares to ordinary recursion, and why it can be optimized
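A small Python sketch contrasting plain and tail recursion with factorial. Note that CPython does not perform tail-call optimization, so the benefit shows up in languages or compilers that do, or once the tail call is rewritten as a loop:

```python
def factorial(n):
    """Plain recursion: the multiply happens after the recursive call returns,
    so every call must stay on the stack."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)


def factorial_tail(n, acc=1):
    """Tail recursion: the recursive call is the last thing done, carrying the
    running product in an accumulator. A language with tail-call optimization
    can reuse the stack frame (CPython does not, so this still grows the stack)."""
    if n <= 1:
        return acc
    return factorial_tail(n - 1, acc * n)


def factorial_iterative(n):
    """The tail-recursive form translates mechanically into a loop."""
    acc = 1
    while n > 1:
        acc *= n
        n -= 1
    return acc


assert factorial(5) == factorial_tail(5) == factorial_iterative(5) == 120
```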
-
- This subject can be pretty difficult, as each DP-soluble problem must be defined as a recurrence relation, and coming up with it can be tricky.
- I suggest looking at many examples of DP problems until you have a solid understanding of the pattern involved.
- Videos:
- the Skiena videos can be hard to follow since he sometimes uses the whiteboard, which is too small to see
- Skiena: CSE373 2012 - Lecture 19 - Introduction to Dynamic Programming (video)
- Skiena: CSE373 2012 - Lecture 20 - Edit Distance (video)
- Skiena: CSE373 2012 - Lecture 21 - Dynamic Programming Examples (video)
- Skiena: CSE373 2012 - Lecture 22 - Applications of Dynamic Programming (video)
- Simonson: Dynamic Programming 0 (starts at 59:18) (video)
- Simonson: Dynamic Programming I - Lecture 11 (video)
- Simonson: Dynamic programming II - Lecture 12 (video)
- List of individual DP problems (each is short): Dynamic Programming (video)
- Yale Lecture notes:
- Coursera:
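To make "define the recurrence, then fill a table" concrete, here is a small edit-distance sketch (the classic recurrence behind the edit-distance lecture listed above; variable names are illustrative):

```python
def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions to turn a into b.

    Recurrence: dp[i][j] = edit distance between a[:i] and b[:j]
      dp[i][0] = i, dp[0][j] = j
      dp[i][j] = dp[i-1][j-1]                       if a[i-1] == b[j-1]
               = 1 + min(dp[i-1][j],    # delete from a
                         dp[i][j-1],    # insert into a
                         dp[i-1][j-1])  # substitute
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]
            else:
                dp[i][j] = 1 + min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1])
    return dp[m][n]


assert edit_distance("kitten", "sitting") == 3
```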
-
- Optional: UML 2.0 Series (video)
- Object-Oriented Software Engineering: Software Dev Using UML and Java (21 videos):
- Can skip this if you have a great grasp of OO and OO design practices.
- OOSE: Software Dev Using UML and Java
- SOLID OOP Principles:
- Bob Martin SOLID Principles of Object Oriented and Agile Design (video)
- SOLID Principles (video)
- S - Single Responsibility Principle | Each object should have a single responsibility
- O - Open/Closed Principle | Objects should be open for extension but closed for modification
- L - Liskov Substitution Principle | Base and derived classes should follow an 'IS A' relationship
- I - Interface Segregation Principle | Clients should not be forced to implement interfaces they don't use
- D - Dependency Inversion Principle | Depend on abstractions, not concretions, to reduce coupling between composed objects
-
- Quick UML review (video)
- Learn these patterns:
- strategy
- singleton
- adapter
- prototype
- decorator
- visitor
- factory, abstract factory
- facade
- observer
- proxy
- delegate
- command
- state
- memento
- iterator
- composite
- flyweight
- Chapter 6 (Part 1) - Patterns (video)
- Chapter 6 (Part 2) - Abstraction-Occurrence, General Hierarchy, Player-Role, Singleton, Observer, Delegation (video)
- Chapter 6 (Part 3) - Adapter, Facade, Immutable, Read-Only Interface, Proxy (video)
- Series of videos (27 videos)
- Head First Design Patterns
- I know the canonical book is "Design Patterns: Elements of Reusable Object-Oriented Software", but Head First is great for beginners to OO.
- Handy reference: 101 Design Patterns & Tips for Developers
- Design patterns for humans
-
- Math Skills: How to find Factorial, Permutation and Combination (Choose) (video)
- Make School: Probability (video)
- Make School: More Probability and Markov Chains (video)
- Khan Academy:
- Course layout:
- Just the videos - 41 (each is simple and short):
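A quick Python sketch of the three counts mentioned above; only math.factorial is used (newer Python versions also provide math.perm and math.comb):

```python
from math import factorial


def permutations(n, k):
    """Number of ordered arrangements of k items chosen from n: n! / (n - k)!"""
    return factorial(n) // factorial(n - k)


def combinations(n, k):
    """Number of unordered selections, "n choose k": n! / (k! * (n - k)!)"""
    return factorial(n) // (factorial(k) * factorial(n - k))


assert factorial(5) == 120
assert permutations(5, 2) == 20    # 5 * 4
assert combinations(5, 2) == 10    # order doesn't matter, so divide by 2!
```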
-
- Know about the most famous classes of NP-complete problems, such as traveling salesman and the knapsack problem, and be able to recognize them when an interviewer asks you them in disguise.
- Know what NP-complete means.
- Computational Complexity (video)
- Simonson:
- Skiena:
- Complexity: P, NP, NP-completeness, Reductions (video)
- Complexity: Approximation Algorithms (video)
- Complexity: Fixed-Parameter Algorithms (video)
- Peter Norvig discusses near-optimal solutions to traveling salesman problem:
- Pages 1048 - 1140 in CLRS if you have it.
-
- Computer Science 162 - Operating Systems (25 videos):
- for processes and threads see videos 1-11
- Operating Systems and System Programming (video)
- What Is The Difference Between A Process And A Thread?
- Covers:
- Processes, Threads, Concurrency issues
- difference between processes and threads
- processes
- threads
- locks
- mutexes
- semaphores
- monitors
- how they work
- deadlock
- livelock
- CPU activity, interrupts, context switching
- Modern concurrency constructs with multicore processors
- Paging, segmentation and virtual memory (video)
- Interrupts (video)
- Scheduling (video)
- Process resource needs (memory: code, static storage, stack, heap, and also file descriptors, i/o)
- Thread resource needs (shares the above (minus stack) with other threads in the same process, but each has its own program counter, stack pointer, registers, and stack)
- Forking is really copy-on-write: pages are shared read-only until the new process writes to memory, and then the written pages are copied.
- Context switching
- How context switching is initiated by the operating system and underlying hardware
- threads in C++ (series - 10 videos)
- concurrency in Python (videos):
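A tiny Python sketch of one concurrency primitive from the list above: a lock (mutex) guarding shared state across threads (the worker function and counts are illustrative):

```python
import threading

counter = 0
lock = threading.Lock()          # a mutex: only one thread may hold it at a time


def worker(iterations):
    global counter
    for _ in range(iterations):
        # a read-modify-write on shared state is not atomic, so guard it with the lock
        with lock:
            counter += 1


threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()                     # wait for all threads to finish

print(counter)                   # 400000 every time; without the lock the result can come up short
```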
-
Free Classes and Datasets
-
Sources: