Data Structures & Algorithms CS - 2510 Spring 2025
Instructor’s Introduction • Syed Zaid Irshad, Senior Lecturer • Enrolled in PhD Computer Science at MAJU • 9 semesters taught • 681 students • 11 programs • 1148 grades awarded • 15% Grade A/A- • 11% Grade F/W
Course Details • Teams Code: 8blnr94 • Marks Breakup • Midterm: 20 Marks • Final Term: 50 Marks • Assignments + Quizzes + Online Certificate: 30 Marks • Course Outline is shared on Teams
Rules for this subject • No retakes or resubmissions after the deadline • No corrections of attendance • If you are late, sit quietly or leave • No cell phone usage except mine • If you already know the topic, keep it to yourself unless I say something wrong • Your behavior in class will determine how difficult the papers will be • Do not sit in class without pen and paper
Topics for Midterm • Intro to Data Structures • Algorithm Analysis (Big O, Big Theta, Big Omega) • Search Algorithms (Linear Search, Binary Search) • Sorting Algorithms (Bubble Sort, Insertion Sort, Selection Sort, Merge Sort, Quick Sort, Counting Sort) • String Algorithms (Knuth-Morris-Pratt (KMP), Naive Pattern Searching (NPS)) • Linked Lists (Single, Doubly, Circular, Search, Sort) • Stacks (implementation using Array/LinkedList)
Introduction to Data Structures • Definition: A data structure is a way of organizing and storing data to perform operations efficiently. • Why are they important? o Helps manage large amounts of data o Improves efficiency in searching, sorting, and modifying data o Forms the foundation of programming and algorithms
Type of Data Structures • Linear Data Structures: • Array (Fixed-size collection of elements) • Linked List (Dynamic memory allocation) • Stack (LIFO - Last In, First Out) • Queue (FIFO - First In, First Out) • Non-Linear Data Structures: • Tree (Hierarchical structure) • Graph (Network-like connections) • Hash Table (Key-Value mapping for fast access)
Array
Types of Array • Static Array: A collection of elements stored in contiguous memory locations. • int arr[5] = {1, 2, 3, 4, 5}; • Dynamic Array: An array that can grow or shrink in size automatically during runtime. • int* arr = new int[5]; • delete[] arr;
Dynamic Array vs. Static Array
Operation | Static Array | Dynamic Array
Memory Location | Stack | Heap
Resizable? | No | Yes
Efficiency | Fast | Slower
Flexibility | No | Yes
Access arr[i] | O(1) | O(1)
Insertion at End | Not possible | O(1)
Insertion at Middle | O(n) | O(n)
Delete at End | Not possible | O(1)
Delete at Middle | O(n) | O(n)
Advantages & Disadvantages of Dynamic Array • Advantages: Flexible size Efficient memory allocation Easier than linked lists (cache-friendly) • Disadvantages: Resizing can be costly (O(n) for copying elements) Uses extra memory for resizing overhead
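The slides mention that resizing costs O(n); a minimal C++ sketch of the usual doubling strategy, assuming a non-zero starting capacity (the grow function and its parameter names are illustrative, not from the slides):

// Doubles the capacity of a heap-allocated array, copying the old contents.
#include <cstddef>

int* grow(int* oldArr, std::size_t size, std::size_t& capacity) {
    capacity *= 2;                          // new capacity: twice the old one
    int* newArr = new int[capacity];        // allocate a larger block on the heap
    for (std::size_t i = 0; i < size; ++i)  // O(n) copy: this is the costly part
        newArr[i] = oldArr[i];
    delete[] oldArr;                        // release the old block
    return newArr;
}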
Linked List • Definition: A collection of nodes, where each node contains data and a reference to the next node. • Types: Singly Linked List Doubly Linked List Circular Linked List
Stack • Definition: A data structure that follows Last In, First Out (LIFO) order. • Operations: Push (Insert), Pop (Remove), Peek (Top Element) • Use Cases: Undo functionality, Backtracking algorithms
Queue • Definition: A data structure that follows First In, First Out (FIFO) order. • Operations: Enqueue (Add), Dequeue (Remove), Peek (Front Element) • Use Cases: Task scheduling, Print queue
Trees • Definition: A non-linear data structure with hierarchical relationships. • Types: Binary Tree Binary Search Tree (BST) AVL Tree B+ Tree • Use Cases: File systems, Database indexing
Graph • Definition: A collection of nodes (vertices) connected by edges. • Types: Directed Graph Undirected Graph • Use Cases: Social networks, Google Maps
Algorithm Analysis • Algorithm analysis is the process of evaluating the efficiency of an algorithm in terms of time complexity and space complexity. • It helps us determine the performance of an algorithm before implementation.
Why Analyze Algorithms? • Helps choose the most efficient algorithm for a problem. • Predicts performance for large inputs (scalability). • Avoids inefficient solutions that could lead to slow execution.
Time Complexity
Complexity | Notation | Example
Constant Time | O(1) | Accessing an element in an array, arr[i]
Logarithmic Time | O(log n) | Binary Search
Linear Time | O(n) | Iterating through an array
Linearithmic Time | O(n log n) | Merge Sort, Quick Sort (average case)
Quadratic Time | O(n²) | Nested loops (e.g., Bubble Sort)
Cubic Time | O(n³) | Triple nested loops
Exponential Time | O(2ⁿ) | Recursion in Fibonacci
Factorial Time | O(n!) | Brute-force permutation generation
Space Complexity
Complexity | Example
O(1) | Using a few variables (constant space)
O(n) | Storing an array of n elements
O(n²) | Using a 2D matrix
O(n log n) | Merge sort variants that allocate a new temporary array at every recursion level
Best, Worst, and Average Cases
Case | Meaning
Best Case | Minimum time needed (ideal scenario)
Worst Case | Maximum time needed (guaranteed upper bound)
Average Case | Expected time for random inputs
Amortized Analysis • Analyzes the average cost of an operation over a sequence of operations rather than in isolation. For example, appending to a dynamic array is O(1) amortized even though an occasional resize costs O(n).
Asymptotic Notations • Asymptotic notations describe the growth rate of an algorithm's running time or space usage as the input size n increases. They help compare algorithms based on their efficiency. • They allow us to ignore constant factors and focus on growth trends. • Help us determine whether an algorithm is scalable for large inputs. • Provide upper, lower, and tight bounds on algorithm performance.
Types of Asymptotic Notations
Notation | Meaning | Definition | Use Case
Big-O (O) | Upper Bound (Worst Case) | T(n) ≤ c · f(n) for large n | Ensures performance will not exceed this limit
Big-Omega (Ω) | Lower Bound (Best Case) | T(n) ≥ c · f(n) for large n | Guarantees the algorithm will take at least this much time
Big-Theta (Θ) | Tight Bound (Average/Exact Case) | c₁ · f(n) ≤ T(n) ≤ c₂ · f(n) for large n | Defines the exact growth rate
Little-o (o) | Strict Upper Bound | T(n) < c · f(n), but not tight | Shows an algorithm grows strictly slower than f(n)
Little-omega (ω) | Strict Lower Bound | T(n) > c · f(n), but not tight | Shows an algorithm grows strictly faster than f(n)
Searching Algorithm • Searching algorithms are essential tools in computer science used to locate specific items within a collection of data. • These algorithms are designed to efficiently navigate through data structures to find the desired information, making them fundamental in various applications such as databases, web search engines, and more.
Linear vs. Binary Search
Linear Search | Binary Search
Traverses each element | Repeatedly divides the data in half
Works on any sequence of data | Works only on sorted data
Best Case: O(1); Worst/Average Case: O(n) | Best Case: O(1); Worst/Average Case: O(log n)
Auxiliary Space: O(1) | Auxiliary Space: O(1)
Suited to small datasets | Suited to large datasets
Visualizing Linear Search: starting from an array filled with random values, the target is compared with each element in turn, left to right, until it is found or the end of the array is reached (step-by-step slides omitted).
Code Snippet
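The code on this slide was not preserved in the export; below is a minimal C++ linear search consistent with the visualization (the function name linearSearch is an assumption):

// Return the index of key in arr[0..n-1], or -1 if it is not found.
int linearSearch(const int arr[], int n, int key) {
    for (int i = 0; i < n; ++i)
        if (arr[i] == key)
            return i;   // best case: key at index 0 -> O(1)
    return -1;          // worst case: key absent -> O(n)
}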
Sorted Array
Visualizing Binary Search: on the sorted array, the target is compared with the middle element and the search range is halved each step until the target is found or the range becomes empty (step-by-step slides omitted).
Code Snippet
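The code on this slide was not preserved; a minimal iterative C++ binary search consistent with the visualization (the function name binarySearch is an assumption):

// Iterative binary search on a sorted array; returns the index of key or -1.
int binarySearch(const int arr[], int n, int key) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;      // avoids overflow of (low + high)
        if (arr[mid] == key) return mid;
        if (arr[mid] < key)  low = mid + 1;    // discard the left half
        else                 high = mid - 1;   // discard the right half
    }
    return -1;                                 // O(log n) comparisons in the worst case
}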
Trapping Rainwater Problem
Reshaping the problem into an array: between the left and right boundaries, for each bar we find the maximum height to its left (Left Max) and to its right (Right Max); the water trapped above that bar is the minimum of those two maxima minus the bar's own height. Working through the example bar by bar gives contributions of 2-1=1, 4-3=1, 4-1=3, and 4-0=4 (step-by-step slides omitted).
Total Trapped Rainwater • 1 + 1 + 3 + 4 = 9 • This is known as the Naïve Approach • Time Complexity: O(n²) • Space Complexity: O(1)
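A C++ sketch of the naïve approach described above (the function name trapNaive is illustrative): for every bar, the trapped water is the smaller of its left and right maxima minus its own height.

// Naive O(n^2) approach: scan left and right of every bar to find its bounding walls.
int trapNaive(const int height[], int n) {
    int total = 0;
    for (int i = 1; i < n - 1; ++i) {              // boundary bars trap nothing
        int leftMax = height[i], rightMax = height[i];
        for (int l = 0; l < i; ++l)                // max height to the left  -> O(n)
            if (height[l] > leftMax) leftMax = height[l];
        for (int r = i + 1; r < n; ++r)            // max height to the right -> O(n)
            if (height[r] > rightMax) rightMax = height[r];
        total += (leftMax < rightMax ? leftMax : rightMax) - height[i];
    }
    return total;                                  // O(n^2) time, O(1) extra space
}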
Sorting Algorithms • Sorting refers to rearranging a given array or list of elements according to a comparison operator on the elements. • The comparison operator is used to decide the new order of elements in the respective data structure.
Sorting Basics • In-place Sorting: An in-place sorting algorithm uses constant extra space and produces the output by modifying the given array only. Examples: Selection Sort, Bubble Sort, Insertion Sort, and Heap Sort. • Internal Sorting: All of the data is placed in main (internal) memory; the input cannot exceed the allocated memory size. • External Sorting: The data to be sorted does not all need to be in memory at the same time; this is used for massive amounts of data. For example, Merge Sort can be used for external sorting because the whole array does not have to be present in memory at once. • Stable Sorting: A sort is stable when equal items appear in the same relative order in the sorted output as in the original array. Examples: Merge Sort, Insertion Sort, Bubble Sort. • Hybrid Sorting: A sorting algorithm is called hybrid if it combines more than one standard sorting algorithm, taking advantage of each. For example, IntroSort uses Insertion Sort and Quick Sort.
Sorting Techniques • Various sorting algorithms are used in data structures. They can be broadly classified into two types: o Comparison-based: elements are compared with one another to decide the order. o Non-comparison-based: elements are not compared directly.
Random Array
Bubble Sort • Bubble Sort is the simplest sorting algorithm; it works by repeatedly swapping adjacent elements that are in the wrong order. • Complexity Analysis of Bubble Sort: o Time Complexity: O(n²) o Auxiliary Space: O(1) • Advantages of Bubble Sort: o Easy to understand and implement. o Does not require any additional memory space. o It is a stable sorting algorithm: elements with the same key value maintain their relative order in the sorted output. • Disadvantages of Bubble Sort: o Its O(n²) time complexity makes it very slow for large datasets. o It has almost no real-world applications; it is mostly used in academia to teach different ways of sorting.
Visualizing Bubble Sort, 1st Pass: adjacent elements are picked and compared; when they are out of order they are swapped through a temporary slot (copy Bot 1 into Bot 3, copy Bot 2 into Bot 1, copy Bot 3 into Bot 2). The 2nd through 10th passes repeat the same procedure on the shrinking unsorted portion of the array (step-by-step slides omitted).
Code Snippet
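The original code slide is missing; a standard C++ Bubble Sort consistent with the visualization (the early-exit swapped flag is an optional optimization and may not have been on the original slide):

// Bubble Sort: repeatedly swap adjacent out-of-order elements.
void bubbleSort(int arr[], int n) {
    for (int pass = 0; pass < n - 1; ++pass) {
        bool swapped = false;
        for (int i = 0; i < n - 1 - pass; ++i) {   // last 'pass' items are already in place
            if (arr[i] > arr[i + 1]) {
                int temp = arr[i];                 // the temporary slot ("Bot 3") from the slides
                arr[i] = arr[i + 1];
                arr[i + 1] = temp;
                swapped = true;
            }
        }
        if (!swapped) break;                       // no swaps: already sorted -> best case O(n)
    }
}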
Random Array
Insertion Sort • Insertion sort is a simple sorting algorithm that works by iteratively inserting each element of an unsorted list into its correct position in the sorted portion of the list. • Complexity Analysis of Insertion Sort: o Time Complexity of Insertion Sort ▪ Best case: O(n), if the list is already sorted, where n is the number of elements in the list. ▪ Average case: O(n²), if the list is randomly ordered. ▪ Worst case: O(n²), if the list is in reverse order. o Space Complexity of Insertion Sort ▪ Auxiliary Space: O(1); insertion sort requires only constant additional space, making it a space-efficient sorting algorithm. • Advantages of Insertion Sort: o Simple and easy to implement. o Stable sorting algorithm. o Efficient for small lists and nearly sorted lists. o Space-efficient, as it is an in-place algorithm. o Adaptive: the number of shifts is directly proportional to the number of inversions. For example, no shifting happens for an already sorted array, so it takes only O(n) time. • Disadvantages of Insertion Sort: o Inefficient for large lists. o Not as efficient as other sorting algorithms (e.g., Merge Sort, Quick Sort) in most cases.
Visualizing Insertion Sort: in each pass the next element is picked and compared with the sorted prefix; larger elements are shifted one position to the right (update Bot 1 with Bot 2) until the start of the array is reached or an element with a smaller value is found, and the picked element is then dropped into place (update Bot 1 with Bot 3). The 1st through 14th passes repeat this for successive elements (step-by-step slides omitted).
Code Snippet
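The original code slide is missing; a standard C++ Insertion Sort consistent with the visualization (the function and variable names are assumptions):

// Insertion Sort: grow a sorted prefix by inserting each element into its place.
void insertionSort(int arr[], int n) {
    for (int i = 1; i < n; ++i) {
        int key = arr[i];                   // element being inserted this pass
        int j = i - 1;
        while (j >= 0 && arr[j] > key) {    // shift larger elements one step right
            arr[j + 1] = arr[j];
            --j;
        }
        arr[j + 1] = key;                   // drop the picked element into position
    }
}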
Random Array
Selection Sort • Selection Sort is a comparison-based sorting algorithm. It sorts an array by repeatedly selecting the smallest (or largest) element from the unsorted portion and swapping it with the first unsorted element. • Complexity Analysis of Selection Sort o Time Complexity: O(n²), as there are two nested loops: o One loop to select an element of the array one by one = O(n) o Another loop to compare that element with every other array element = O(n) o Therefore, overall complexity = O(n) * O(n) = O(n*n) = O(n²) o Auxiliary Space: O(1), as the only extra memory used is for temporary variables. • Advantages of Selection Sort o Easy to understand and implement, making it ideal for teaching basic sorting concepts. o Requires only a constant O(1) amount of extra memory. o It requires fewer swaps (memory writes) than many other standard algorithms; only Cycle Sort beats it in terms of memory writes. It can therefore be a simple choice when memory writes are costly. • Disadvantages of Selection Sort o Its O(n²) time complexity makes it slower than algorithms like Quick Sort or Merge Sort. o It does not maintain the relative order of equal elements, which means it is not stable.
Visualizing Selection Sort, 1st Pass: the first unsorted position is picked (index Bot 1) and a second index (Bot 2) scans the rest of the array, updating whenever a smaller value is seen and staying unchanged otherwise; at the end of the scan the values at index Bot 1 and index Bot 2 are swapped. The 2nd through 14th passes repeat this on the remaining unsorted portion (step-by-step slides omitted).
Code Snippet
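The original code slide is missing; a standard C++ Selection Sort matching the visualization (names are assumptions):

// Selection Sort: move the minimum of the unsorted part to the front on each pass.
void selectionSort(int arr[], int n) {
    for (int i = 0; i < n - 1; ++i) {
        int minIndex = i;                        // index of the smallest value seen so far
        for (int j = i + 1; j < n; ++j)
            if (arr[j] < arr[minIndex]) minIndex = j;
        if (minIndex != i) {                     // at most one swap per pass
            int temp = arr[i];
            arr[i] = arr[minIndex];
            arr[minIndex] = temp;
        }
    }
}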
Random Array
Merge Sort • Merge sort is a sorting algorithm that follows the divide-and-conquer approach. It works by recursively dividing the input array into smaller subarrays and sorting those subarrays then merging them back together to obtain the sorted array. • Complexity Analysis of Merge Sort o Time Complexity: ▪ Best Case: O(n log n), When the array is already sorted or nearly sorted. ▪ Average Case: O(n log n), When the array is randomly ordered. ▪ Worst Case: O(n log n), When the array is sorted in reverse order. o Auxiliary Space: O(n), Additional space is required for the temporary array used during merging.
Visualizing Merge Sort: the array is recursively split into halves until single elements remain, and the halves are then merged back together in sorted order (step-by-step slides omitted).
Code Snippet
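The original code slides are missing; a standard C++ Merge Sort sketch (using std::vector for the temporary buffer is a design choice, not necessarily what the slides showed):

// Merge Sort: split, sort each half recursively, then merge the sorted halves.
#include <vector>

void merge(int arr[], int left, int mid, int right) {
    std::vector<int> tmp;                        // O(n) auxiliary space for the merged run
    int i = left, j = mid + 1;
    while (i <= mid && j <= right)
        tmp.push_back(arr[i] <= arr[j] ? arr[i++] : arr[j++]);  // <= keeps the sort stable
    while (i <= mid)   tmp.push_back(arr[i++]);
    while (j <= right) tmp.push_back(arr[j++]);
    for (int k = 0; k < (int)tmp.size(); ++k)
        arr[left + k] = tmp[k];                  // copy the merged run back
}

void mergeSort(int arr[], int left, int right) {
    if (left >= right) return;                   // 0 or 1 element: already sorted
    int mid = left + (right - left) / 2;
    mergeSort(arr, left, mid);
    mergeSort(arr, mid + 1, right);
    merge(arr, left, mid, right);
}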
Random Array
Quick Sort • Quick Sort is a sorting algorithm based on Divide and Conquer that picks an element as a pivot and partitions the given array around the picked pivot, placing the pivot in its correct position in the sorted array. • Complexity Analysis of Quick Sort o Time Complexity: ▪ Best Case: Ω(n log n), occurs when the pivot element divides the array into two equal halves. ▪ Average Case: Θ(n log n), on average the pivot divides the array into two parts, but not necessarily equal. ▪ Worst Case: O(n²), occurs when the smallest or largest element is always chosen as the pivot (e.g., already sorted arrays). o Auxiliary Space: O(n) in the worst case, due to the recursive call stack.
Visualizing Quick Sort: a pivot is chosen, the array is partitioned so that smaller elements end up to its left and larger elements to its right, and the two partitions are then sorted recursively (step-by-step slides omitted).
Code Snippet
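The original code slides are missing; a standard C++ Quick Sort using Lomuto partitioning with the last element as pivot (the partition scheme is an assumption; the slides may have used a different one):

// Quick Sort with Lomuto partitioning (last element as pivot).
int partition(int arr[], int low, int high) {
    int pivot = arr[high];
    int i = low - 1;                                   // boundary of the "< pivot" region
    for (int j = low; j < high; ++j) {
        if (arr[j] < pivot) {
            ++i;
            int t = arr[i]; arr[i] = arr[j]; arr[j] = t;
        }
    }
    int t = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = t;  // place the pivot
    return i + 1;                                      // pivot's final index
}

void quickSort(int arr[], int low, int high) {
    if (low < high) {
        int p = partition(arr, low, high);
        quickSort(arr, low, p - 1);                    // sort elements left of the pivot
        quickSort(arr, p + 1, high);                   // sort elements right of the pivot
    }
}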
Random Array
Counting Sort • Counting Sort is a non-comparison-based sorting algorithm. It is particularly efficient when the range of input values is small compared to the number of elements to be sorted. The basic idea behind Counting Sort is to count the frequency of each distinct element in the input array and use that information to place the elements in their correct sorted positions. • Complexity Analysis of Counting Sort o Time Complexity: O(N+M), where N and M are the sizes of inputArray[] and countArray[] respectively. ▪ Worst case: O(N+M). ▪ Average case: O(N+M). ▪ Best case: O(N+M). o Auxiliary Space: O(N+M), where N and M are the space taken by outputArray[] and countArray[] respectively.
Visualizing Counting Sort: find the maximum value, create a count array of that size, store the count of each value at its index, convert the counts to prefix sums, and then place each input element into its sorted position in the output array (step-by-step slides omitted).
Code Snippet
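The original code slide is missing; a stable C++ Counting Sort sketch following the steps listed above (find max, build count array, store counts, prefix sums, place elements); it assumes non-negative integer input:

// Counting Sort for non-negative integers (stable version using prefix sums).
#include <vector>

std::vector<int> countingSort(const std::vector<int>& input) {
    if (input.empty()) return {};
    int maxValue = input[0];
    for (int x : input)                        // step 1: find the maximum value
        if (x > maxValue) maxValue = x;

    std::vector<int> count(maxValue + 1, 0);   // step 2: count array of size max+1
    for (int x : input) ++count[x];            // step 3: store value counts at their index

    for (int v = 1; v <= maxValue; ++v)        // step 4: prefix sums give final positions
        count[v] += count[v - 1];

    std::vector<int> output(input.size());
    for (int i = (int)input.size() - 1; i >= 0; --i)   // step 5: back-to-front keeps it stable
        output[--count[input[i]]] = input[i];
    return output;
}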
Summary
Algorithm | Best Case | Worst Case | Average Case | Instructions | Comparisons | Swaps | Copies
Bubble | O(n) | O(n²) | O(n²) | 451 | 99 | 53 | 0
Insertion | O(n) | O(n²) | O(n²) | 297 | 53 | 0 | 81
Selection | O(n²) | O(n²) | O(n²) | 331 | 105 | 14 | 0
Merge | O(n log₂ n) | O(n log₂ n) | O(n log₂ n) | 666 | 43 | 0 | 118
Quick | O(n log₂ n) | O(n²) | O(n log₂ n) | 330 | 45 | 39 | 0
Counting | O(n + m) | O(n + m) | O(n + m) | 328 | 15 | 0 | 15
String Algorithm • String matching algorithms have greatly influenced computer science and play an essential role in various real-world problems. • They help perform time-efficient tasks in multiple domains. • These algorithms are useful for searching for a string within another string. o Naïve Algorithm o Knuth-Morris-Pratt (KMP) Algorithm
Naïve Algorithm • Given a text string of length n and a pattern of length m, the task is to print all occurrences of the pattern in the text. • Best Case: O(n) o When the pattern is found at the very beginning of the text (or very early on). o Only a roughly constant number of comparisons are made per text position, on the order of O(n) comparisons in total, where n is the length of the text. • Worst Case: O(n²) o When the pattern doesn't appear in the text at all, or appears only at the very end. o The algorithm performs O((n-m+1)*m) comparisons, where n is the length of the text and m is the length of the pattern. o In the worst case, for each position in the text, the algorithm may need to compare the entire pattern against the text.
Visualizing the Naïve Algorithm: the pattern is aligned at every position of the text in turn, and characters are compared left to right until a mismatch occurs or the whole pattern matches (step-by-step slides omitted).
Code Snippet
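The original code slide is missing; a minimal C++ naïve pattern search matching the description above (the function name naiveSearch is an assumption):

// Naive pattern search: try every alignment of the pattern in the text.
#include <iostream>
#include <string>

void naiveSearch(const std::string& text, const std::string& pattern) {
    int n = (int)text.size(), m = (int)pattern.size();
    for (int i = 0; i <= n - m; ++i) {              // every possible starting index
        int j = 0;
        while (j < m && text[i + j] == pattern[j]) ++j;
        if (j == m)                                 // the whole pattern matched here
            std::cout << "Pattern found at index " << i << "\n";
    }
}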
KMP Algorithm • The KMP matching algorithm uses the degenerating property of the pattern (the same sub-patterns appearing more than once in the pattern) and improves the worst-case complexity to O(n+m). • The basic idea behind KMP's algorithm is: o Whenever we detect a mismatch (after some matches), we already know some of the characters in the text of the next window. o We take advantage of this information to avoid re-matching the characters that we know will match anyway.
Visualizing the KMP Algorithm: a prefix table is built for the pattern, and on a mismatch the pattern is shifted using that table so that characters already known to match are never compared again (step-by-step slides omitted).
Code Snippet
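The original code slides are missing; a standard C++ KMP implementation where the buildLPS helper computes the longest-proper-prefix-which-is-also-suffix table (function names are assumptions; a non-empty pattern is assumed):

// KMP search: the LPS table tells how far the pattern can shift after a mismatch
// without rechecking characters that are already known to match.
#include <iostream>
#include <string>
#include <vector>

std::vector<int> buildLPS(const std::string& pat) {
    std::vector<int> lps(pat.size(), 0);
    int len = 0;                                    // length of the current prefix-suffix
    for (std::size_t i = 1; i < pat.size(); ) {
        if (pat[i] == pat[len]) {
            lps[i++] = ++len;
        } else if (len != 0) {
            len = lps[len - 1];                     // fall back without advancing i
        } else {
            lps[i++] = 0;
        }
    }
    return lps;
}

void kmpSearch(const std::string& text, const std::string& pat) {
    std::vector<int> lps = buildLPS(pat);
    int i = 0, j = 0;                               // indexes into text and pattern
    while (i < (int)text.size()) {
        if (text[i] == pat[j]) {
            ++i; ++j;
            if (j == (int)pat.size()) {
                std::cout << "Pattern found at index " << i - j << "\n";
                j = lps[j - 1];                     // keep searching for further matches
            }
        } else if (j != 0) {
            j = lps[j - 1];                         // reuse previously matched characters
        } else {
            ++i;
        }
    }
}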
Linked List • A linked list is a linear data structure where elements (nodes) are connected using pointers. • Unlike arrays, linked lists provide dynamic memory allocation, meaning they can grow and shrink without memory reallocation.
Visualizing a Linked List: each node holds a data value and a pointer to the next node; the last node points to null (figures omitted).
Code Snippet
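The original code slides are missing; a minimal C++ singly linked list node and a small list built by hand (the Node struct and values are illustrative):

// A singly linked list node and a three-node list: 3 -> 5 -> 7.
struct Node {
    int data;
    Node* next;
    Node(int value) : data(value), next(nullptr) {}
};

int main() {
    Node* head = new Node(3);        // list: 3
    head->next = new Node(5);        // list: 3 -> 5
    head->next->next = new Node(7);  // list: 3 -> 5 -> 7
    return 0;
}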
Visualizing Linked List Traversal: starting from the head, each node is visited in turn by following the next pointers until null is reached (figures omitted).
Code Snippet
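A traversal sketch matching the visualization above; the one-line Node struct is repeated so the snippet stands on its own:

#include <iostream>
struct Node { int data; Node* next; };   // same node shape as in the earlier sketch

// Visit every node exactly once, from head to null -> O(n).
void traverse(Node* head) {
    for (Node* current = head; current != nullptr; current = current->next)
        std::cout << current->data << " ";
    std::cout << "\n";
}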
Visualizing Linked List Search: the list is traversed from the head, comparing each node's data with the key until a match is found or the end of the list is reached (figures omitted).
Code Snippet
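A search sketch consistent with the visualization above (the function name search and returning a zero-based position are assumptions):

struct Node { int data; Node* next; };

// Return the zero-based position of key in the list, or -1 if it is absent.
int search(Node* head, int key) {
    int position = 0;
    for (Node* current = head; current != nullptr; current = current->next) {
        if (current->data == key) return position;
        ++position;
    }
    return -1;   // reached the end without finding key -> O(n) in the worst case
}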
Visualizing Linked List Insertion: the original slides walk through three insertion scenarios step by step, relinking the affected next pointers so that the new node takes its place in the chain (figures and the intermediate code snippets omitted).
Code Snippet
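The original insertion code was not preserved; a minimal C++ sketch covering two common cases, insertion at the head and at the end (which cases the slides showed is an assumption; function names are illustrative):

struct Node {
    int data;
    Node* next;
    Node(int value) : data(value), next(nullptr) {}
};

// Insert at the head: the new node simply points at the old head -> O(1).
Node* insertAtHead(Node* head, int value) {
    Node* node = new Node(value);
    node->next = head;
    return node;                              // the new node becomes the head
}

// Insert at the end: walk to the last node and link the new node after it -> O(n).
Node* insertAtEnd(Node* head, int value) {
    Node* node = new Node(value);
    if (head == nullptr) return node;         // empty list: the new node is the head
    Node* current = head;
    while (current->next != nullptr)
        current = current->next;
    current->next = node;
    return head;
}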
Visualizing Linked List Deletion: the original slides walk through three deletion scenarios step by step, bypassing the removed node by redirecting its predecessor's next pointer before freeing it (figures and the intermediate code snippets omitted).
Code Snippet
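The original deletion code was not preserved; a minimal C++ sketch that deletes the first node matching a key (the function name deleteByValue is an assumption; the slides may have covered other deletion variants as well):

struct Node { int data; Node* next; };

// Delete the first node whose data equals key; returns the (possibly new) head.
Node* deleteByValue(Node* head, int key) {
    if (head == nullptr) return nullptr;          // empty list: nothing to delete
    if (head->data == key) {                      // deleting the head node
        Node* rest = head->next;
        delete head;
        return rest;
    }
    Node* prev = head;
    while (prev->next != nullptr && prev->next->data != key)
        prev = prev->next;                        // find the node before the target
    if (prev->next != nullptr) {                  // target found
        Node* target = prev->next;
        prev->next = target->next;                // bypass the target node
        delete target;
    }
    return head;
}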
Visualizing Linked List Modify
Visualizing Linked List Reverse: the list is walked once while each node's next pointer is turned around to point at the previous node, so the old tail becomes the new head (step-by-step slides omitted).
Code Snippet
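A reversal sketch matching the visualization above (the function name reverse is an assumption):

struct Node { int data; Node* next; };

// Reverse the list in place by turning every next pointer around -> O(n) time, O(1) space.
Node* reverse(Node* head) {
    Node* prev = nullptr;
    Node* current = head;
    while (current != nullptr) {
        Node* next = current->next;   // remember the rest of the list
        current->next = prev;         // point this node backwards
        prev = current;               // advance prev and current
        current = next;
    }
    return prev;                      // prev is the new head (the old tail)
}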
Stack • A stack is a linear data structure. • Follows the LIFO principle: Last In, First Out. • Elements are added and removed from the top only. • Think of a stack of plates: the last one on is the first one off. • Operations: o push(x): Add element x to the top. o pop(): Remove the top element. o peek()/top(): View the top element without removing it. o isEmpty(): Check if the stack is empty.
Stack • Applications: o Expression evaluation (e.g., postfix) o Syntax parsing (e.g., balanced parentheses) o Backtracking algorithms (e.g., maze, puzzles) o Function calls and recursion • Time Complexity o push() -> O(1) o pop() -> O(1) o peek() -> O(1)
Visualizing Stack (isFull())
Visualizing Stack (push(x))
Code Snippet
Visualizing Stack (isEmpty())
Visualizing Stack (pop())
Code Snippet
Visualizing Stack (peek())
Code Snippet
Visualizing Stack (isEmpty())
Code Snippet
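The preceding slides appear to trace an array-backed stack (isFull, push, pop, peek, isEmpty); since the original code was not preserved, here is a minimal C++ sketch under that assumption (the capacity of 100 and the name ArrayStack are illustrative):

#include <iostream>

// Fixed-capacity stack backed by an array.
struct ArrayStack {
    int data[100];
    int top = -1;                                      // index of the top element; -1 means empty

    bool isEmpty() const { return top == -1; }
    bool isFull()  const { return top == 99; }

    void push(int x) {
        if (isFull()) { std::cout << "Stack overflow\n"; return; }
        data[++top] = x;                               // O(1)
    }
    int pop() {
        if (isEmpty()) { std::cout << "Stack underflow\n"; return -1; }
        return data[top--];                            // O(1)
    }
    int peek() const {
        if (isEmpty()) { std::cout << "Stack is empty\n"; return -1; }
        return data[top];                              // O(1)
    }
};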
Visualizing Stack (push(x))
Code Snippet
Visualizing Stack (pop())
Code Snippet
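This last group of slides appears to trace a linked-list-backed stack; a minimal C++ sketch under that assumption (struct and function names are illustrative):

#include <iostream>

// Stack backed by a singly linked list: push and pop work at the head -> O(1) each.
struct StackNode {
    int data;
    StackNode* next;
    StackNode(int value, StackNode* nxt) : data(value), next(nxt) {}
};

struct LinkedStack {
    StackNode* top = nullptr;

    bool isEmpty() const { return top == nullptr; }

    void push(int x) { top = new StackNode(x, top); }  // the new node becomes the top

    int pop() {
        if (isEmpty()) { std::cout << "Stack underflow\n"; return -1; }
        StackNode* node = top;
        int value = node->data;
        top = node->next;                              // unlink the old top
        delete node;
        return value;
    }
    int peek() const { return isEmpty() ? -1 : top->data; }
};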
