You compare this to the target and update your “best so far” if this sum is closer than any previous one.
Then you adjust pointers based on whether you need a larger or smaller sum.
If current_sum is less than the target, you need a larger sum, so you move the left pointer to the right.
If current_sum is greater than the target, you need a smaller sum, so you move the right pointer to the left.
If current_sum equals the target, you are done, because you cannot get closer than exact.
This pointer movement is what makes the algorithm efficient. Each scan moves inward, so for a fixed i, the left and right pointers together move at most n steps.
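The steps above can be sketched as a short Python function (a minimal sketch; the name `three_sum_closest` and variable names are my own, chosen to mirror the prose):

```python
def three_sum_closest(nums, target):
    # Sort once so the two-pointer scan is valid.
    nums.sort()
    best = nums[0] + nums[1] + nums[2]
    for i in range(len(nums) - 2):
        left, right = i + 1, len(nums) - 1
        while left < right:
            current_sum = nums[i] + nums[left] + nums[right]
            # Update "best so far" if this sum is closer to the target.
            if abs(current_sum - target) < abs(best - target):
                best = current_sum
            if current_sum < target:
                left += 1        # need a larger sum
            elif current_sum > target:
                right -= 1       # need a smaller sum
            else:
                return current_sum  # exact match: cannot do better
    return best
```

For example, `three_sum_closest([-1, 2, 1, -4], 1)` returns `2`, since `-1 + 2 + 1 = 2` is the achievable sum closest to the target `1`.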
Why this guarantees you find the closest sum
The sorted order ensures monotonic behavior.
Moving the left pointer rightward increases the sum (or leaves it unchanged when values repeat).
Moving the right pointer leftward decreases the sum (or leaves it unchanged when values repeat).
So when the current sum is too small, the only move that can improve it is increasing the left side. When it’s too large, the only move that can improve it is decreasing the right side. This eliminates huge parts of the search space without missing the optimal answer.
You still evaluate enough candidate sums to guarantee correctness, but you do it in a structured way.
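The monotonic behavior is easy to verify on a small sorted array (an illustrative snippet; the array and names are my own):

```python
nums = [1, 3, 5, 8, 10]  # already sorted
left, right = 0, len(nums) - 1

s = nums[left] + nums[right]            # 1 + 10 = 11
larger = nums[left + 1] + nums[right]   # move left pointer right: 3 + 10 = 13
smaller = nums[left] + nums[right - 1]  # move right pointer left: 1 + 8 = 9

# Sorted order guarantees each move changes the sum in one known direction.
assert larger > s > smaller
```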
Complexity in plain terms
Sorting takes O(n log n).
The main loop fixes one index and runs a linear two-pointer scan, which gives O(n²) overall.
That is a major improvement over O(n³), and it’s the expected solution in interviews.