Problem Statement:

UVa 10125 - Sumsets

Summary:

Given a set \(S\) of distinct integers, find the largest \(d\) such that \(a + b + c = d\), where \(a, b, c, d\) are all distinct elements of \(S\).

A simple brute-force approach gives \(O(N^4)\) complexity, which is too slow since \(|S|\) can be as large as 1000 in this problem. The problem therefore begs for a more efficient approach, and indeed an \(O(N^2\lg{N})\) solution exists; we need to utilize binary search to achieve this running time.
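For reference, the naive approach is just four nested loops. This is my own sketch (the function name `bruteForce` and the `NONE` sentinel are not from the problem statement), trying every unordered triple \(\{a,b,c\}\) against every candidate \(d\):

```cpp
#include <algorithm>
#include <vector>

// Sentinel returned when no valid quadruple exists.
const long long NONE = -1LL << 60;

// O(N^4) brute force: for every candidate d, try every unordered
// triple {a, b, c} of other elements and test a + b + c = d.
long long bruteForce(const std::vector<long long>& s) {
    int n = s.size();
    long long best = NONE;
    for (int d = 0; d < n; ++d)
        for (int a = 0; a < n; ++a)
            for (int b = a + 1; b < n; ++b)
                for (int c = b + 1; c < n; ++c) {
                    if (a == d || b == d || c == d) continue;  // all four distinct
                    if (s[a] + s[b] + s[c] == s[d])
                        best = std::max(best, s[d]);
                }
    return best;
}
```

With \(N = 1000\) this is on the order of \(10^{12}\) operations, hopeless within any contest time limit, which motivates the restatement below.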

Firstly, notice that the problem can be restated as finding \(a, b, c, d\) that satisfy \(a + b = d - c\). This simple restatement has a huge implication: we can generate all possible values of \(a + b\) in \(O(N^2)\) time, and similarly for \(d - c\), so in \(O(N^2)\) total we can store them in two arrays or vectors, say \(P\) and \(Q\) respectively. Next, we sort \(P\) and \(Q\). Why sort? So that we can do a binary search! From here on the task is simple: for each element of \(P\), binary search \(Q\) for an element with the same value. If one exists, check whether the corresponding \(a, b, c, d\) are all distinct, and if so update our answer for \(d\).
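The steps above can be sketched as follows. This is my own illustration (the names `largestD`, `Entry`, and `NO_SOLUTION` are assumptions, not from any reference solution); each stored value carries the indices that produced it so the distinctness check is possible. Note that only \(Q\) actually needs to be sorted for the lookup:

```cpp
#include <algorithm>
#include <vector>

// Sentinel returned when no valid quadruple exists.
const long long NO_SOLUTION = -1LL << 60;

struct Entry { long long val; int i, j; };  // a value plus the indices that made it

long long largestD(const std::vector<long long>& s) {
    int n = s.size();
    std::vector<Entry> P, Q;  // P: all a + b (i < j); Q: all d - c (ordered pairs)
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) {
            if (i == j) continue;
            if (i < j) P.push_back({s[i] + s[j], i, j});
            Q.push_back({s[i] - s[j], i, j});  // d = s[i], c = s[j]
        }
    auto byVal = [](const Entry& x, const Entry& y) { return x.val < y.val; };
    std::sort(Q.begin(), Q.end(), byVal);  // sort so we can binary search

    long long best = NO_SOLUTION;
    for (const Entry& p : P) {
        // binary search Q for the first entry equal to p.val
        auto it = std::lower_bound(Q.begin(), Q.end(), Entry{p.val, 0, 0}, byVal);
        // scan the run of equal values for a match with four distinct indices;
        // since the input elements are distinct, these runs are short in practice
        for (; it != Q.end() && it->val == p.val; ++it)
            if (it->i != p.i && it->i != p.j && it->j != p.i && it->j != p.j)
                best = std::max(best, s[it->i]);  // it->i indexes d
    }
    return best;
}
```

On the judge's sample input \(\{2, 3, 5, 7, 12\}\) this finds \(d = 12\) (from \(2 + 3 + 7\)), and on \(\{2, 16, 64, 256, \ldots, 65536\}\) it correctly reports no solution.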

Each binary search takes \(O(\lg{N^2}) = O(\lg{N})\) time and happens \(N^2\) times, so we have a running time of \(O(N^2\lg{N})\) overall. Beautiful, isn't it?