Data Structures — Detailed Notes & 50 MCQs

Overview

Data structures are organized ways of storing and managing data efficiently. Choosing the right data structure directly affects the time and space complexity of the algorithms that operate on it.

Arrays

A contiguous block of memory storing a fixed number of same-size elements. O(1) access by index; O(n) insertion/deletion because elements must be shifted, except at the end. Use for static collections and when index-based access is essential.

// Example: fixed-size array of ints with O(1) indexed access
int a[5] = {0};   // five ints, zero-initialized
a[0] = 10;        // constant-time write by index

Linked Lists

Nodes containing data and a pointer to the next node (singly linked) or to both the next and previous nodes (doubly linked). Dynamic size, O(1) insertion/deletion given a pointer to the position, O(n) random access. Important for implementing stacks, queues, and adjacency lists.

struct Node { int data; struct Node* next; };
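
Building on the struct above, a minimal sketch of constant-time insertion after a known node; the function name insert_after and the use of malloc from <stdlib.h> are illustrative choices, not part of the original notes.

#include <stdlib.h>

// Link a new node holding `value` right after `node`: O(1), only two pointers change.
struct Node* insert_after(struct Node* node, int value) {
    struct Node* fresh = malloc(sizeof *fresh);
    fresh->data = value;
    fresh->next = node->next;
    node->next = fresh;
    return fresh;
}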

Stacks & Queues

Stack: LIFO — push/pop. Useful for function calls, DFS, undo. Queue: FIFO — enqueue/dequeue. Useful for BFS, scheduling. Variants: circular queue, deque, priority queue (heap-backed).
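
As a minimal sketch of the stack interface, here is a fixed-capacity, array-backed stack of ints; the names Stack, push, and pop and the capacity are illustrative assumptions. Initialize with struct Stack s = { .top = -1 }; before use.

#include <stdbool.h>

#define STACK_CAP 100

struct Stack { int items[STACK_CAP]; int top; };   // top == -1 means empty

bool push(struct Stack* s, int x) {                // LIFO: add on top
    if (s->top + 1 >= STACK_CAP) return false;     // overflow
    s->items[++s->top] = x;
    return true;
}

bool pop(struct Stack* s, int* out) {              // LIFO: remove from top
    if (s->top < 0) return false;                  // underflow
    *out = s->items[s->top--];
    return true;
}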

Trees

Hierarchical structures. A binary tree gives each node up to two children. A binary search tree (BST) keeps left subtree < node < right subtree, giving O(log n) average search/insert/delete. Balanced trees (AVL, Red-Black) guarantee O(log n) worst-case.

// Inorder traversal of BST prints sorted order
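
A minimal sketch of that traversal, assuming a simple node struct; the TreeNode name and printf output are illustrative, not from the notes.

#include <stdio.h>

struct TreeNode { int key; struct TreeNode *left, *right; };

// Visit left subtree, then the node, then right subtree.
// On a BST this prints the keys in ascending order.
void inorder(const struct TreeNode* root) {
    if (root == NULL) return;
    inorder(root->left);
    printf("%d ", root->key);
    inorder(root->right);
}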

Heaps

Complete binary tree represented as array. Max-heap: parent >= children. Min-heap: parent <= children. Useful for priority queues and heap-sort (O(n log n)).
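
A sketch of the array layout and the core restore step for a max-heap; the function name sift_down is an illustrative choice. For a node at index i, its children live at 2*i + 1 and 2*i + 2.

// Restore the max-heap property downward from index i (e.g., after replacing the root).
void sift_down(int heap[], int n, int i) {
    while (1) {
        int largest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && heap[l] > heap[largest]) largest = l;
        if (r < n && heap[r] > heap[largest]) largest = r;
        if (largest == i) break;                       // both children are smaller
        int tmp = heap[i]; heap[i] = heap[largest]; heap[largest] = tmp;
        i = largest;                                   // continue one level down
    }
}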

Graphs

Vertices connected by edges. Representation: adjacency list (sparse graphs), adjacency matrix (dense graphs). Traversals: BFS (shortest path in unweighted graphs), DFS. Algorithms: Dijkstra (shortest paths with non-negative weights), Kruskal/Prim (MST).
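
A compact sketch of an adjacency list plus BFS, assuming at most MAXV vertices; the names (AdjNode, add_edge, bfs) and the fixed bound are illustrative, not part of the notes.

#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

#define MAXV 100

struct AdjNode { int to; struct AdjNode* next; };
struct AdjNode* adj[MAXV];                 // adj[u] = head of u's neighbor list

void add_edge(int u, int v) {              // directed edge u -> v, O(1) prepend
    struct AdjNode* e = malloc(sizeof *e);
    e->to = v; e->next = adj[u]; adj[u] = e;
}

// BFS from src: visits vertices level by level, O(V + E) on an adjacency list.
void bfs(int src) {
    bool seen[MAXV] = { false };
    int queue[MAXV], head = 0, tail = 0;
    seen[src] = true; queue[tail++] = src;
    while (head < tail) {
        int u = queue[head++];
        printf("%d ", u);
        for (struct AdjNode* e = adj[u]; e; e = e->next)
            if (!seen[e->to]) { seen[e->to] = true; queue[tail++] = e->to; }
    }
}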

Hashing & Hash Tables

Maps keys to bucket indices via a hash function. Handle collisions via chaining or open addressing (linear probing, quadratic probing, double hashing). Average O(1) lookup/insert/delete; worst-case O(n) with a poor hash function or high load factor.
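
A small sketch of a chained hash table for string keys; the bucket count, hash function, and names (Entry, put, get) are illustrative assumptions, not a reference implementation.

#include <stdlib.h>
#include <string.h>

#define NBUCKETS 101                       // prime bucket count, chosen arbitrarily

struct Entry { char* key; int value; struct Entry* next; };
struct Entry* buckets[NBUCKETS];           // chaining: one linked list per bucket

unsigned long hash(const char* s) {        // simple multiplicative string hash
    unsigned long h = 5381;
    while (*s) h = h * 33 + (unsigned char)*s++;
    return h % NBUCKETS;
}

void put(const char* key, int value) {
    unsigned long b = hash(key);
    for (struct Entry* e = buckets[b]; e; e = e->next)
        if (strcmp(e->key, key) == 0) { e->value = value; return; }   // update in place
    struct Entry* e = malloc(sizeof *e);                               // else prepend
    e->key = malloc(strlen(key) + 1); strcpy(e->key, key);
    e->value = value;
    e->next = buckets[b]; buckets[b] = e;
}

int* get(const char* key) {                // NULL if the key is absent
    for (struct Entry* e = buckets[hash(key)]; e; e = e->next)
        if (strcmp(e->key, key) == 0) return &e->value;
    return NULL;
}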

Sorting Algorithms

Common sorts: Bubble O(n^2), Insertion O(n^2), Merge O(n log n), Quick O(n log n) average (O(n^2) worst), Heap O(n log n). Stable sorts (e.g., merge sort, insertion sort) preserve the relative order of equal elements.
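
As one concrete example, a sketch of insertion sort, which is both stable and in-place; the strict comparison is what preserves the order of equal elements.

// Insertion sort: O(n^2) worst case, O(n) on nearly sorted input, stable, in-place.
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i], j = i - 1;
        while (j >= 0 && a[j] > key) {     // strict > keeps equal elements in order
            a[j + 1] = a[j];               // shift larger elements right
            j--;
        }
        a[j + 1] = key;                    // drop key into its slot
    }
}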

Complexity & Big-O

Time complexity describes how runtime grows with input size; space complexity measures memory usage. Big-O gives an upper bound, Big-Theta a tight bound, Big-Omega a lower bound. Amortized analysis is useful for dynamic arrays and hash tables.
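
A sketch of the dynamic-array case, assuming a simple Vec struct and realloc-based doubling (error handling omitted); occasional O(n) resizes average out to O(1) amortized per push.

#include <stdlib.h>

struct Vec { int* data; int size; int cap; };      // start with struct Vec v = {0};

void vec_push(struct Vec* v, int x) {
    if (v->size == v->cap) {                       // full: double the capacity
        v->cap = v->cap ? v->cap * 2 : 1;
        v->data = realloc(v->data, v->cap * sizeof(int));   // copies existing elements
    }
    v->data[v->size++] = x;                        // ordinary pushes stay O(1)
}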

Advanced Structures

Tries (prefix trees) for string operations, Segment Trees and Fenwick Trees for range queries, Disjoint Set Union (DSU) for union-find problems, B-Trees for databases, Suffix Arrays and Suffix Trees for string matching.
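
As one example from this list, a sketch of DSU with path compression and union by size; the array bound and names are illustrative assumptions.

#define MAXN 1000

int parent[MAXN], setsize[MAXN];

void dsu_init(int n) {                  // every element starts in its own set
    for (int i = 0; i < n; i++) { parent[i] = i; setsize[i] = 1; }
}

int find(int x) {                       // path compression flattens the tree
    if (parent[x] != x) parent[x] = find(parent[x]);
    return parent[x];
}

void unite(int a, int b) {              // union by size: attach smaller under larger
    a = find(a); b = find(b);
    if (a == b) return;
    if (setsize[a] < setsize[b]) { int t = a; a = b; b = t; }
    parent[b] = a;
    setsize[a] += setsize[b];
}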

Memory & Cache Considerations

Cache-friendly structures (arrays) often outperform pointer-heavy ones (linked lists) due to locality. Consider space overhead for pointers and metadata.
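
To make the locality point concrete, a sketch contrasting the two traversals; it reuses struct Node from the linked-list section, and the function names are illustrative. The array walk touches consecutive memory, while the list walk chases pointers that may land anywhere.

long sum_array(const int* a, int n) {
    long s = 0;
    for (int i = 0; i < n; i++) s += a[i];        // sequential, prefetch-friendly
    return s;
}

long sum_list(const struct Node* head) {
    long s = 0;
    for (const struct Node* p = head; p; p = p->next) s += p->data;   // pointer chasing
    return s;
}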

50 Practice MCQs — Answers highlighted

1. Which data structure provides O(1) average time for lookup, insert and delete?

  • A) Array
  • B) Hash Table ✅
  • C) Binary Search Tree
  • D) Linked List

2. Stack follows which order?

  • A) LIFO (Last In First Out) ✅
  • B) FIFO
  • C) Random
  • D) None

3. Queue is best described as:

  • A) FIFO ✅
  • B) LIFO
  • C) Priority only
  • D) Random access

4. Which operation is O(1) in singly linked list (given pointer to node)?

  • A) Insert after node ✅
  • B) Delete previous node
  • C) Random access
  • D) Reverse list

5. Average time complexity for search in a balanced BST is:

  • A) O(n)
  • B) O(log n) ✅
  • C) O(1)
  • D) O(n log n)

6. A max-heap ensures:

  • A) Parent >= children ✅
  • B) Parent <= children
  • C) Sorted array
  • D) Linked order

7. Which traversal finds shortest path in unweighted graph?

  • A) BFS ✅
  • B) DFS
  • C) Dijkstra
  • D) Kruskal

8. DFS uses which data structure for iterative implementation?

  • A) Stack ✅
  • B) Queue
  • C) Heap
  • D) Hash Table

9. Adjacency list representation is preferred when graph is:

  • A) Sparse ✅
  • B) Dense
  • C) Complete
  • D) Tree

10. Which collision resolution uses linked lists at each bucket?

  • A) Chaining ✅
  • B) Linear probing
  • C) Quadratic probing
  • D) Double hashing

11. Trie is mainly used for:

  • A) Efficient string prefix searches ✅
  • B) Sorting numbers
  • C) Graph traversal
  • D) Memory allocation

12. DSU (Disjoint Set Union) supports which operations efficiently?

  • A) Union and Find ✅
  • B) Insert and Delete
  • C) Search and Sort
  • D) Merge and Split

13. Segment tree is primarily used for:

  • A) Range queries and updates ✅
  • B) Shortest path
  • C) String matching
  • D) Memory pooling

14. Fenwick tree (BIT) is used for:

  • A) Prefix sum queries in O(log n) ✅
  • B) Sorting arrays
  • C) Graph connectivity
  • D) Hashing

15. Time complexity of BFS on adjacency list is:

  • A) O(V + E) ✅
  • B) O(V^2)
  • C) O(E log V)
  • D) O(V log V)

16. Average time complexity of quicksort is:

  • A) O(n log n) ✅
  • B) O(n^2)
  • C) O(n)
  • D) O(log n)

17. Which sort is stable and O(n log n)?

  • A) Merge Sort ✅
  • B) Heap Sort
  • C) Quick Sort
  • D) Selection Sort

18. Which sort is in-place?

  • A) Quick Sort (in typical implementations) ✅
  • B) Merge Sort
  • C) Counting Sort
  • D) Bucket Sort

19. Topological sort applies to:

  • A) Directed Acyclic Graphs (DAG) ✅
  • B) Undirected graphs
  • C) Cyclic graphs
  • D) Trees only

20. Kruskal's algorithm finds:

  • A) Minimum Spanning Tree ✅
  • B) Shortest path
  • C) Maximum flow
  • D) Topological order

21. Bloom filter is used for:

  • A) Probabilistic membership testing (may have false positives) ✅
  • B) Exact dictionary lookup
  • C) Sorting numbers
  • D) Compression

22. Amortized analysis is useful for:

  • A) Dynamic array resizing ✅
  • B) Binary search
  • C) Quick sort worst-case
  • D) Hash collision

23. Reversing a singly linked list can be done in:

  • A) O(n) time and O(1) space ✅
  • B) O(n) time and O(n) space
  • C) O(n^2)
  • D) O(log n)

24. Which structure has better cache locality?

  • A) Array ✅
  • B) Linked List
  • C) Tree
  • D) Graph

25. Which DS can help check palindrome efficiently?

  • A) Stack ✅
  • B) Queue
  • C) Hash Table
  • D) Tree

26. Binary search requires:

  • A) Sorted array ✅
  • B) Linked list
  • C) Unsorted array
  • D) Hash table

27. Load factor in hash table is defined as:

  • A) n / m (number of elements / number of buckets) ✅
  • B) m / n
  • C) n * m
  • D) n - m

28. Red-Black trees ensure:

  • A) O(log n) height (balanced) ✅
  • B) O(n) height
  • C) Always perfect balance
  • D) No deletions

29. Which traversal uses more memory in worst case for wide graphs?

  • A) BFS (stores whole level in queue) ✅
  • B) DFS
  • C) Both equal
  • D) None

30. Traveling Salesman Problem (TSP) is:

  • A) NP-hard optimization problem ✅
  • B) P problem
  • C) Easy with greedy
  • D) Linear

31. Suffix arrays are used for:

  • A) Fast substring search and pattern matching ✅
  • B) Hashing only
  • C) Graph algorithms
  • D) Sorting integers

32. Bellman-Ford algorithm can handle:

  • A) Negative weight edges ✅
  • B) Only positive weights
  • C) Only unweighted graphs
  • D) Only DAGs

33. Which data structure is best for implementing LRU cache?

  • A) Hash map + Doubly linked list ✅
  • B) Stack only
  • C) Queue only
  • D) Binary tree

34. Bloom filters can have:

  • A) False positives but no false negatives ✅
  • B) False negatives but no false positives
  • C) Neither
  • D) Both

35. Good hash functions aim for:

  • A) Uniform distribution of keys across buckets ✅
  • B) Clustering keys
  • C) Deterministic collisions
  • D) Minimal computation only

36. B-Trees are commonly used in databases because:

  • A) They are disk-friendly and keep nodes wide (reduce disk reads) ✅
  • B) Use less memory than arrays
  • C) Simpler than BSTs
  • D) None

37. Persistent data structures preserve:

  • A) Previous versions after updates ✅
  • B) Memory only
  • C) Only last state
  • D) Speed only

38. Doubling strategy for dynamic array gives amortized insert time:

  • A) O(1) ✅
  • B) O(log n)
  • C) O(n)
  • D) O(n log n)

39. Cuckoo hashing attempts to:

  • A) Resolve collisions by moving existing keys to alternate locations ✅
  • B) Use chaining exclusively
  • C) Use a single hash function
  • D) Store duplicates

40. Linked lists are preferred over arrays when:

  • A) Frequent insertions/deletions in middle ✅
  • B) Random indexed access
  • C) Tight memory locality
  • D) Sorting frequently

41. Comparator function for sorting should be:

  • A) Transitive and consistent ✅
  • B) Random
  • C) Non-deterministic
  • D) State-changing

42. Edge list representation stores:

  • A) List of edges (pairs of vertices) ✅
  • B) Adjacency matrix
  • C) Adjacency list
  • D) Degree list

43. Big-O focuses on:

  • A) Upper bound for growth rate ✅
  • B) Exact running time
  • C) Lower bound
  • D) Constant factors only

44. Stability in sorting preserves:

  • A) Relative order of equal elements ✅
  • B) Sort order only
  • C) Memory layout
  • D) None

45. Floyd's cycle-finding algorithm detects cycle in linked list using:

  • A) Two pointers moving at different speeds ✅
  • B) Hash map only
  • C) Stack
  • D) Sorting

46. Articulation points in graphs are found using:

  • A) DFS with discovery/low times (Tarjan) ✅
  • B) BFS only
  • C) Dijkstra
  • D) Kruskal

47. For very sparse graphs, preferred storage is:

  • A) Adjacency list ✅
  • B) Adjacency matrix
  • C) Edge matrix
  • D) Complete list

48. Priority queue can be implemented with:

  • A) Binary heap ✅
  • B) Stack
  • C) Queue
  • D) Linked list only

49. Strongly connected components in directed graphs can be found by:

  • A) Kosaraju or Tarjan algorithms ✅
  • B) Dijkstra
  • C) Prim
  • D) BFS only

50. Which data structure supports fast prefix queries for strings?

  • A) Trie ✅
  • B) Hash Table
  • C) BST
  • D) Heap