We study the problem of computing a longest increasing subsequence in a sequence S of n distinct elements in the presence of persistent comparison errors. In this model, introduced by Braverman and Mossel (Noisy sorting without resampling, SODA 2008, pages 268–276), every comparison between two elements can return the wrong result with some fixed (small) probability p, and comparisons cannot be repeated. Computing the longest increasing subsequence exactly is impossible in this model; the objective is therefore to identify a subsequence that (i) is indeed increasing and (ii) has a length that approximates the length of the longest increasing subsequence. We present asymptotically tight upper and lower bounds on both the approximation factor and the running time. In particular, we present an algorithm that computes an $O(\log n)$-approximation in $O(n \log n)$ time, with high probability. This approximation relies on the fact that we can approximately sort (Geissmann et al., Optimal Sorting with Persistent Comparison Errors, arXiv:1804.07575, 2018) n elements in $O(n \log n)$ time such that the maximum dislocation of an element is $O(\log n)$. For the lower bounds, we prove that (i) there is a set of sequences such that, on a sequence picked randomly from this set, every algorithm must return an $\Omega(\log n)$-approximation with high probability, and (ii) any $\log n$-approximation algorithm for longest increasing subsequence requires $\Omega(n \log n)$ comparisons, even in the absence of errors.
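The comparison model and the role of bounded dislocation can be illustrated with a small sketch (Python is used here for concreteness; the helper names, the naive noisy insertion sort, and the greedy rank-gap selection are illustrative assumptions, not the algorithm from the paper). The sketch uses one elementary fact implied by the abstract: if every element's position in an approximately sorted order differs from its true rank by at most d, then two elements whose approximate ranks differ by more than 2d must be in the correct relative order.

import math
import random

def make_persistent_comparator(p, seed=0):
    # Noisy oracle for "x < y?": each unordered pair is corrupted once and
    # for all with probability p, so repeating a query cannot help.
    rng = random.Random(seed)
    corrupted = {}
    def less(x, y):
        key = (min(x, y), max(x, y))
        if key not in corrupted:
            corrupted[key] = rng.random() < p
        truth = x < y
        return (not truth) if corrupted[key] else truth
    return less

def noisy_insertion_sort(seq, less):
    # Stand-in for an approximate sorting routine; the paper relies on the
    # O(n log n)-time algorithm of Geissmann et al. with maximum dislocation
    # O(log n), which this simple sort does not guarantee.
    order = []
    for x in seq:
        i = 0
        while i < len(order) and less(order[i], x):
            i += 1
        order.insert(i, x)
    return order

def increasing_subsequence(seq, approx_order, d):
    # Keep elements of seq (in input order) whose approximate ranks grow by
    # more than 2*d. If the maximum dislocation is at most d, such elements
    # are truly increasing: rank_b - d > rank_a + d implies b > a.
    rank = {x: i for i, x in enumerate(approx_order)}
    out, last = [], None
    for x in seq:
        if last is None or rank[x] > last + 2 * d:
            out.append(x)
            last = rank[x]
    return out

# Toy usage: a random permutation, error probability p = 0.05, and an
# assumed dislocation bound d on the order of log n.
n, p = 256, 0.05
seq = random.Random(1).sample(range(n), n)
less = make_persistent_comparator(p)
approx = noisy_insertion_sort(seq, less)
candidate = increasing_subsequence(seq, approx, d=2 * int(math.log2(n)))
print(len(candidate), candidate == sorted(candidate))

The greedy rank-gap selection above guarantees a truly increasing output only when the dislocation bound d actually holds, and it does not by itself achieve the paper's $O(\log n)$-approximation guarantee; it is meant only to show why bounded dislocation lets an algorithm certify increasing order despite never repeating a noisy comparison.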