Emerging applications in machine learning have motivated the problem of maximizing a monotone non-submodular function subject to a cardinality constraint. Meanwhile, parallelism is prevalent in large-scale optimization for big-data scenarios, and adaptive complexity is an important measure of parallelism since it counts the number of sequential rounds required when many independent function evaluations can be performed in parallel within each round. For a monotone non-submodular function and a cardinality constraint, this paper devises an adaptive algorithm that maximizes the function value under the constraint, employing the generic submodularity ratio $$\gamma$$ to connect the monotone set function with submodularity. The algorithm achieves an approximation ratio of $$1-e^{-\gamma^2}-\varepsilon$$ and consumes $$O(\log(n/\eta)/\varepsilon^2)$$ adaptive rounds and $$O(n\log\log(k)/\varepsilon^3)$$ oracle queries in expectation. Furthermore, when $$\gamma=1$$, the algorithm achieves an approximation guarantee of $$1-1/e-\varepsilon$$, matching the state-of-the-art result for the submodular version of the problem.
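To illustrate the idea of low adaptive complexity in this setting, the following is a minimal, hypothetical Python sketch of a threshold-style greedy in which each round issues its marginal-gain queries in parallel; the oracle `f`, the threshold schedule, and the batching rule are illustrative assumptions and this is not the paper's exact algorithm or its guarantee.

```python
from concurrent.futures import ThreadPoolExecutor


def adaptive_threshold_greedy(f, ground_set, k, eps=0.1):
    """Illustrative sketch: maximize a monotone set function f under |S| <= k.

    f          : value oracle, f(frozenset) -> float (assumed monotone)
    ground_set : iterable of candidate elements
    k          : cardinality constraint
    eps        : parameter controlling how fast the threshold decreases

    Within each round, all marginal-gain evaluations are independent oracle
    queries, so they can run in parallel; adaptive complexity counts only the
    number of such sequential rounds, not the total number of queries.
    """
    S = frozenset()
    candidates = set(ground_set)

    # One parallel round of singleton queries sets the initial threshold.
    with ThreadPoolExecutor() as pool:
        singleton_vals = list(pool.map(lambda e: f(frozenset([e])), candidates))
    tau = max(singleton_vals, default=0.0)

    while tau > 1e-12 and len(S) < k and candidates:
        # One adaptive round: evaluate every remaining marginal gain in parallel.
        with ThreadPoolExecutor() as pool:
            gains = dict(zip(candidates,
                             pool.map(lambda e: f(S | {e}) - f(S), candidates)))

        good = [e for e, g in gains.items() if g >= tau]
        if not good:
            tau *= (1 - eps)          # lower the threshold and try another round
            continue

        # Add a batch of high-gain elements while respecting the budget k.
        for e in good[: k - len(S)]:
            S = S | {e}
        candidates -= set(good)

    return S
```

As a usage example, one could pass a coverage-style oracle such as `f = lambda S: len(set().union(*S)) if S else 0` over a family of sets; the point of the sketch is only that the sequential dependence sits in the outer `while` loop, while the bulk of the oracle queries inside each round are embarrassingly parallel.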