TL;DR: Use a "slow" pointer to track the position of the last unique element and a "fast" pointer to scan the array, overwriting duplicates in-place.
The LeetCode 26 problem, Remove Duplicates from Sorted Array, asks us to process a sorted integer array and modify it directly so that every element appears only once. Because the array is sorted, duplicates are grouped together. We must place the unique elements at the beginning of the array and return the count of these unique elements. This is a classic interview question testing mastery of array manipulation and space efficiency.
A naive approach to solving this problem in-place involves simulating the deletion process manually. When we iterate through the array and find a duplicate (where nums[i] == nums[i-1]), we can shift all subsequent elements one position to the left to overwrite the duplicate.
Pseudo-code:

```text
i = 1
current_length = nums.length
while i < current_length:
    if nums[i] == nums[i-1]:
        # Shift all elements from i to end one step left
        for j from i to current_length - 2:
            nums[j] = nums[j+1]
        current_length = current_length - 1
    else:
        i = i + 1
return current_length
```
Time Complexity: O(n²)
Space Complexity: O(1)
Why this fails: While this solution respects the O(1) space constraint, it fails on time efficiency. For every duplicate found, we perform a shift operation that takes O(n) time. In the worst-case scenario (an array consisting largely of duplicates), this results in O(n²) quadratic time complexity. For an input of size n, the algorithm performs roughly n² operations, which risks a Time Limit Exceeded (TLE) error or simply performs poorly compared to the optimal O(n) linear solution.
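For concreteness, the pseudo-code above translates into Python roughly as follows. This is a sketch of the naive shifting approach, shown only to illustrate why it is slow, not as the recommended solution (the function name is ours):

```python
def remove_duplicates_naive(nums):
    """O(n^2) approach: delete each duplicate by shifting every
    later element one slot to the left."""
    i = 1
    current_length = len(nums)
    while i < current_length:
        if nums[i] == nums[i - 1]:
            # Shift elements after position i one step left,
            # overwriting the duplicate at nums[i]
            for j in range(i, current_length - 1):
                nums[j] = nums[j + 1]
            current_length -= 1
        else:
            i += 1
    return current_length
```

Each duplicate triggers an O(n) shift, which is exactly where the quadratic cost comes from.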
The key intuition for LeetCode 26 Solution lies in the fact that the array is sorted. This guarantees that all duplicate values are adjacent to each other. We do not need a hash set to track seen elements; we only need to compare the current element with the previous one.
The optimal strategy decouples the "reading" of data from the "writing" of data.
- Fast pointer (fast): Scans the array from left to right.
- Slow pointer (slow): Indicates the position where the next unique element should be placed.

Invariant:
At any point in the execution, the subarray nums[0...slow-1] contains exactly the unique elements found so far, in sorted order. The fast pointer searches for a value strictly greater than the last unique value found. When fast finds a new unique number, we copy it to the slow position and advance slow.
Visual Description:
Imagine the array is divided into two regions. The left region (indices 0 to slow-1) is the "processed/unique" zone. The right region is the "unknown/unprocessed" zone. The fast pointer expands the known territory. Since fast is always equal to or ahead of slow, we can safely overwrite nums[slow] because its original value has already been processed or is a duplicate we no longer need.

Consider the input: nums = [0, 0, 1, 1, 1, 2, 2]
Initialization: slow = 1, fast = 1.

- Step 1 (fast = 1): Compare nums[1] (0) with nums[0] (0). They are equal. Duplicate found. Do nothing. fast becomes 2.
- Step 2 (fast = 2): Compare nums[2] (1) with nums[1] (0). They are different. New unique found. Copy nums[2] to nums[slow] (which is nums[1]). Array becomes [0, 1, 1, 1, 1, 2, 2]. Advance slow to 2. fast becomes 3.
- Step 3 (fast = 3): Compare nums[3] (1) with nums[2] (1). Equal. Ignore. fast becomes 4.
- Step 4 (fast = 4): Compare nums[4] (1) with nums[3] (1). Equal. Ignore. fast becomes 5.
- Step 5 (fast = 5): Compare nums[5] (2) with nums[4] (1). Different. Copy nums[5] to nums[slow] (which is nums[2]). Array becomes [0, 1, 2, 1, 1, 2, 2]. Advance slow to 3. fast becomes 6.
- Step 6 (fast = 6): Compare nums[6] (2) with nums[5] (2). Equal. Ignore. fast becomes 7.
- Termination: fast is out of bounds. Return slow (which is 3). The first 3 elements are [0, 1, 2].

The algorithm maintains the invariant that nums[0...slow-1] contains the unique elements encountered so far in sorted order.
- Initialization: slow starts at 1. The single element nums[0] is trivially unique, so the invariant holds before the loop.
- Maintenance: As fast moves, we check if nums[fast] != nums[fast - 1]. Since the array is sorted, a change in value implies a strictly greater value that has not yet been written to the unique prefix. By copying nums[fast] to nums[slow], we extend the unique prefix.
- Termination: fast visits every element, so all unique transitions are captured. slow ends up equal to the count of unique elements.

```cpp
class Solution {
public:
    int removeDuplicates(vector<int>& nums) {
        // Edge case: constraints say length >= 1, so no need to check for empty.
        int slow = 1; // The write pointer
        // fast is the read pointer
        for (int fast = 1; fast < nums.size(); fast++) {
            // Compare current element with the previous one
            if (nums[fast] != nums[fast - 1]) {
                // Found a unique element, write it to the slow position
                nums[slow] = nums[fast];
                slow++;
            }
        }
        return slow; // slow represents the number of unique elements
    }
};
```
```java
class Solution {
    public int removeDuplicates(int[] nums) {
        // Constraints guarantee nums.length >= 1
        int slow = 1; // Pointer for the position of the next unique element
        for (int fast = 1; fast < nums.length; fast++) {
            // If current element is different from the previous one, it is unique
            if (nums[fast] != nums[fast - 1]) {
                nums[slow] = nums[fast];
                slow++;
            }
        }
        return slow;
    }
}
```
```python
class Solution:
    def removeDuplicates(self, nums: List[int]) -> int:
        # Constraints guarantee len(nums) >= 1
        slow = 1  # Write pointer
        # Fast pointer iterates through the list
        for fast in range(1, len(nums)):
            # Check if current element is different from the previous one
            if nums[fast] != nums[fast - 1]:
                nums[slow] = nums[fast]
                slow += 1
        return slow
```
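As a quick sanity check, the Python solution can be exercised on the walkthrough input. The snippet repeats the class so it is self-contained:

```python
from typing import List

class Solution:
    def removeDuplicates(self, nums: List[int]) -> int:
        slow = 1  # Write pointer
        for fast in range(1, len(nums)):
            if nums[fast] != nums[fast - 1]:
                nums[slow] = nums[fast]
                slow += 1
        return slow

# Same input as the step-by-step walkthrough
nums = [0, 0, 1, 1, 1, 2, 2]
k = Solution().removeDuplicates(nums)
print(k, nums[:k])  # 3 [0, 1, 2]
```

Note that only the first k positions of nums are meaningful after the call; the judge ignores everything beyond them.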
Time Complexity: O(n)
We traverse the array exactly once with the fast pointer. Inside the loop, we perform constant-time operations (comparison and assignment). Here n is the number of elements in nums.
Space Complexity: O(1)
We modify the array in-place. We only use two integer variables (slow and fast) for state tracking, regardless of the input size.
The Two Pointers - In-place Array Modification pattern is a fundamental technique for array manipulation. Understanding the "read/write" pointer dynamic enables you to solve several related problems efficiently:
For example, Remove Duplicates from Sorted Array II (LeetCode 80) allows each value to appear at most twice; the same read/write pointer pattern applies, comparing nums[fast] against nums[slow-2] instead of the previous element.

Q: Why do I get a "Wrong Answer" even if my output array looks correct?
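To make the generalization concrete, here is a sketch of that at-most-two-copies variant (function name is ours, the comparison against nums[slow - 2] is the key change):

```python
def remove_duplicates_ii(nums):
    """Sketch of the LeetCode 80 variant: keep at most two copies
    of each value, modifying nums in-place."""
    slow = 0
    for fast in range(len(nums)):
        # Accept nums[fast] if fewer than two elements are written so far,
        # or if it differs from the element two slots back in the prefix.
        if slow < 2 or nums[fast] != nums[slow - 2]:
            nums[slow] = nums[fast]
            slow += 1
    return slow
```

Because the array is sorted, nums[fast] == nums[slow - 2] can only mean a third (or later) copy of the same value, which is exactly what gets skipped.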
Mistake: Returning the modified list or array instead of the integer k.
Explanation: The problem explicitly asks you to return the count of unique elements. The judge uses this return value to slice your array and verify correctness.
Q: Why does my solution fail with "Index Out of Bounds"?
Mistake: Accessing nums[i+1] inside the loop without stopping at length - 1.
Correction: When comparing adjacent elements, it is safer to iterate i from 1 to length - 1 and compare nums[i] with nums[i-1]. If you compare nums[i] with nums[i+1], the loop condition must be i < length - 1.

Q: Can I use a Set to solve this?
Mistake: Using a Hash Set to filter duplicates.
Consequence: While this produces the correct logical result, it violates the space complexity constraint. The problem requires O(1) extra space (in-place modification), whereas a Set uses O(n) space.
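For illustration, a Set-based version looks deceptively simple. This sketch (function name is ours) returns the right answer but allocates O(n) auxiliary space, which is what the constraint forbids:

```python
def remove_duplicates_with_set(nums):
    """Logically correct, but set(nums) allocates O(n) extra space,
    so this is not an in-place O(1)-space solution."""
    unique = sorted(set(nums))   # O(n) auxiliary storage
    nums[:len(unique)] = unique  # copy unique values to the front
    return len(unique)
```

The two-pointer solution achieves the same result with only two integer variables.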
Q: Why is my output shorter than expected?
Mistake: Starting the iteration or the slow pointer at 0 and overwriting the first element incorrectly.
Explanation: The first element is always unique. The logic should start comparing from the second element (index 1). If you start at 0, you might miss counting the first number or access nums[-1].