Point covering problem
Posted by Sean on Stack Overflow, 2010-05-12
I recently had this problem on a test: given a set m of points (all on the x-axis) and a set n of line segments with endpoints [l, r] (also on the x-axis), find a minimum subset of n such that every point in m is covered by at least one segment. Prove that your solution always finds a minimum subset.
The algorithm I wrote was something to the effect of the following (segments are stored as two-element arrays, with the left endpoint in position 0 and the right endpoint in position 1):
def coverPoints(m, n):
    # m: list of point x-coordinates; n: list of [left, right] segments
    # assumes every point can be covered by at least one segment
    chosenLines = []
    while m:
        minX = min(m)  # leftmost point not yet covered
        # among segments starting at or before minX, take the one
        # that reaches furthest to the right
        bestLine = None
        for line in n:
            if line[0] <= minX and (bestLine is None or line[1] > bestLine[1]):
                bestLine = line
        chosenLines.append(bestLine)
        # drop every point the chosen segment covers
        m = [p for p in m if p > bestLine[1]]
    return chosenLines
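For concreteness, here's a small made-up input (not from the original test question) and what the version above returns on it:

points = [1, 3, 5]
segments = [[0, 2], [2, 6], [4, 7]]  # made-up example data
print(coverPoints(points, segments))  # [[0, 2], [2, 6]] -- two segments, which is optimal here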
I'm just not sure whether this always finds a minimum solution. It's a simple greedy algorithm, so my gut tells me it won't, but a friend who is much better at this than I am says that for this problem a greedy algorithm like this one always finds a minimal solution. To prove that mine always finds the minimal solution, I did a very hand-wavy proof by contradiction in which I made an assumption that probably isn't true at all; I forget exactly what I did.
If this doesn't give a minimal solution, is there a way to find one in less than something like O(n!) time?
Thanks