Timothy M. Chan's Publications: Geometric optimization techniques and low-dimensional linear programming

Faster algorithms for largest empty rectangles and boxes

We revisit a classical problem in computational geometry: finding the largest-volume axis-aligned empty box (inside a given bounding box) amidst n given points in d dimensions. Previously, the best algorithms known have running time O(n log^2 n) for d = 2 (by Aggarwal and Suri [SoCG'87]) and near n^d for d >= 3. We describe faster algorithms with running time...

To obtain the higher-dimensional result, we adapt and extend previous techniques for Klee's measure problem to optimize certain objective functions over the complement of a union of orthants.
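For orientation (this is not from the paper), the planar case has a short brute-force baseline: a maximal empty rectangle can have each side pushed outward until it touches an input point or the bounding box, so candidate sides can be drawn from the point coordinates. A minimal Python sketch, with the function name and the O(n^5)-time approach my own illustration:

```python
import itertools

def largest_empty_rectangle(points, bbox):
    """Brute-force largest-area axis-aligned empty rectangle.

    points: list of (x, y); bbox: (xmin, ymin, xmax, ymax).
    Candidate sides are point coordinates plus the bounding box edges;
    a rectangle qualifies if no input point lies strictly inside it.
    O(n^5) time -- an illustration only, far from the paper's bounds.
    """
    xmin, ymin, xmax, ymax = bbox
    xs = sorted({xmin, xmax} | {x for x, _ in points})
    ys = sorted({ymin, ymax} | {y for _, y in points})
    best = 0.0
    for x1, x2 in itertools.combinations(xs, 2):
        for y1, y2 in itertools.combinations(ys, 2):
            # empty means: no point strictly inside the open rectangle
            if all(not (x1 < x < x2 and y1 < y < y2) for x, y in points):
                best = max(best, (x2 - x1) * (y2 - y1))
    return best
```

With one point at the center of the unit square, the best empty rectangle is one of the two halves, area 0.5.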

Smallest k-enclosing rectangle revisited

(with Sariel Har-Peled)

Given a set of n points in the plane, and a parameter k, we consider the problem of computing the minimum (perimeter or area) axis-aligned rectangle enclosing k points. We present the first near-quadratic-time algorithm for this problem, improving over the previous near-O(n^{5/2})-time algorithm by Kaplan et al. [ESA 2017]. We provide an almost matching conditional lower bound, under the assumption that (min,+)-convolution cannot be solved in truly subquadratic time. Furthermore, we present a new reduction (for either perimeter or area) that can make the time bound sensitive to k, giving near-O(nk) time. We also present a near-linear-time (1+eps)-approximation algorithm for the minimum area of the optimal rectangle containing k points. In addition, we study related problems including the 3-sided, arbitrarily oriented, weighted, and subset sum versions of the problem.
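A slow but simple baseline (mine, not the paper's) helps fix the problem statement: an optimal rectangle has its left and right sides at point x-coordinates, so one can enumerate vertical slabs and slide a window of k points by y-coordinate inside each slab:

```python
def min_area_k_rect(points, k):
    """Brute-force minimum-area axis-aligned rectangle enclosing k points.

    For each pair of x-coordinates as left/right sides, sort the points in
    the slab by y and slide a window of k consecutive points; the window's
    y-extent is the rectangle height.  Assumes 1 <= k <= len(points).
    O(n^3 log n) -- the paper gives near-O(n^2) and near-O(nk) algorithms.
    """
    xs = sorted({x for x, _ in points})
    best = float('inf')
    for xl in xs:
        for xr in xs:
            if xr < xl:
                continue
            ys = sorted(y for x, y in points if xl <= x <= xr)
            for i in range(len(ys) - k + 1):
                best = min(best, (xr - xl) * (ys[i + k - 1] - ys[i]))
    return best
```

For example, among (0,0), (3,0), (0,4), (10,10) with k = 3, the best rectangle spans the first three points and has area 3 * 4 = 12.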

Improved deterministic algorithms for linear programming in low dimensions

At SODA'93, Chazelle and Matousek presented a derandomization of Clarkson's sampling-based algorithm for solving linear programs with n constraints and d variables in d^{(7+o(1))d} n deterministic time. The time bound can be improved to d^{(5+o(1))d} n with subsequent work by Bronnimann, Chazelle, and Matousek [FOCS'93]. We first point out a much simpler derandomization of Clarkson's algorithm that avoids epsilon-approximations and runs in d^{(3+o(1))d} n time. We then describe a few additional ideas that eventually improve the deterministic time bound to d^{(1/2+o(1))d} n.
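For context on the randomized side of this line of work, here is a minimal Seidel-style randomized incremental 2-d LP, a close relative of (but not the same as) Clarkson's sampling algorithm or the derandomizations above. The bounding-box trick, names, and tolerances are my own illustration:

```python
import random

def solve_1d(line, prior, objective):
    # Maximize objective on the line a*x + b*y = c subject to prior halfplanes.
    a, b, c = line
    nx = a * a + b * b
    p0x, p0y = a * c / nx, b * c / nx          # a point on the line
    dx, dy = -b, a                             # direction along the line
    lo, hi = -float('inf'), float('inf')
    for pa, pb, pc in prior:
        coef = pa * dx + pb * dy
        rhs = pc - (pa * p0x + pb * p0y)
        if abs(coef) < 1e-12:
            if rhs < -1e-9:
                return None                    # line misses this halfplane
        elif coef > 0:
            hi = min(hi, rhs / coef)
        else:
            lo = max(lo, rhs / coef)
    if lo > hi:
        return None
    t = hi if objective[0] * dx + objective[1] * dy >= 0 else lo
    return (p0x + t * dx, p0y + t * dy)

def seidel_lp2d(constraints, objective, bound=1e6):
    """Maximize objective . (x, y) subject to a*x + b*y <= c, expected O(n).

    A bounding box of half-width `bound` keeps the optimum finite.  When a
    newly inserted constraint is violated, the new optimum lies on its
    boundary, so we drop to a 1-d LP over the constraints seen so far.
    Returns None if infeasible.
    """
    cons = list(constraints)
    random.shuffle(cons)
    cx, cy = objective
    x = bound if cx >= 0 else -bound
    y = bound if cy >= 0 else -bound
    seen = [(1.0, 0.0, bound), (-1.0, 0.0, bound),
            (0.0, 1.0, bound), (0.0, -1.0, bound)]
    for a, b, c in cons:
        if a * x + b * y > c + 1e-9:
            pt = solve_1d((a, b, c), seen, objective)
            if pt is None:
                return None
            x, y = pt
        seen.append((a, b, c))
    return (x, y)
```

The random insertion order is what gives the expected linear running time; the derandomizations discussed above achieve deterministic bounds by replacing exactly this use of randomness.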

Three problems about dynamic convex hulls

We present three results related to dynamic convex hulls...

Multi-pass geometric algorithms

(with Eric Y. Chen)

We initiate the study of exact geometric algorithms that require limited storage and make only a small number of passes over the input. Fundamental problems such as low-dimensional linear programming and convex hulls are considered.

Finding the shortest bottleneck edge in a parametric minimum spanning tree

Given a parametric graph with n vertices and m edges where edge weights change linearly over time, we show how to find the time value at which the heaviest edge weight in the minimum spanning tree is minimized in O(n(m/n)^epsilon log n + m) expected time...
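The decision-style subroutine underlying such a search is easy to state: at any fixed time t, the heaviest MST edge (the bottleneck) can be computed by Kruskal's algorithm, since the MST is also a minimum bottleneck spanning tree. A self-contained sketch (function name and edge format are mine; this is the naive per-t evaluation, not the paper's algorithm for optimizing over t):

```python
def bottleneck_at(t, n, edges):
    """Heaviest MST edge weight at time t for a parametric graph.

    edges: list of (u, v, w0, slope) with weight w0 + slope * t at time t.
    Kruskal with union-find; edges are tried in increasing weight order, so
    the last edge that joins two components is the bottleneck.  Assumes the
    graph on vertices 0..n-1 is connected.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    weighted = sorted((w0 + s * t, u, v) for u, v, w0, s in edges)
    bottleneck, joined = 0.0, 0
    for w, u, v in weighted:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            bottleneck = w
            joined += 1
            if joined == n - 1:
                break
    return bottleneck
```

A naive strategy would call this at candidate time values; the point of the result above is to locate the minimizing t without paying for many full MST computations.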

Three problems about simple polygons

We give three related algorithmic results concerning a simple polygon P...

An optimal randomized algorithm for maximum Tukey depth

We present the first optimal algorithm to compute the maximum Tukey depth (also known as location or halfspace depth) for a non-degenerate point set in the plane. The algorithm is randomized and requires O(n log n) expected time for n data points. In a higher fixed dimension d >= 3, the expected time bound is O(n^{d-1}), which is probably optimal as well. The result is obtained using an interesting variant of the author's randomized optimization technique, capable of solving "implicit" linear-programming-type problems; some other applications of this technique are briefly mentioned.
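For a single query point the definition unwinds into a simple O(n^2)-time brute force (my own illustration, and for the easier depth-evaluation problem rather than the depth-maximization problem treated above): the count of points in a closed halfplane through q changes only at directions perpendicular to some vector p - q, so it suffices to evaluate the count once per angular arc between consecutive critical directions.

```python
import math

def tukey_depth(q, points):
    """Brute-force Tukey (halfspace) depth of a query point q in the plane.

    depth(q) = min, over closed halfplanes whose boundary passes through q,
    of the number of data points in the halfplane.  Assumes q is not itself
    a data point and the points are in general position.  O(n^2) time.
    """
    qx, qy = q
    crit = []
    for px, py in points:
        a = math.atan2(py - qy, px - qx)
        crit.append((a + math.pi / 2) % (2 * math.pi))
        crit.append((a - math.pi / 2) % (2 * math.pi))
    crit.sort()
    best, m = len(points), len(crit)
    for i in range(m):
        # evaluate at the midpoint of an open arc between critical directions
        a1, a2 = crit[i], crit[(i + 1) % m]
        if i + 1 == m:
            a2 += 2 * math.pi
        u = (a1 + a2) / 2
        ux, uy = math.cos(u), math.sin(u)
        cnt = sum(1 for x, y in points if (x - qx) * ux + (y - qy) * uy >= 0)
        best = min(best, cnt)
    return best
```

For four points at the compass directions around the origin, the origin has depth 2; a query point far outside the point set has depth 0.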

Low-dimensional linear programming with violations

Two decades ago, Megiddo and Dyer showed that linear programming in 2 and 3 dimensions (and subsequently, any constant number of dimensions) can be solved in linear time. In this paper, we consider linear programming with at most k violations: finding a point inside all but at most k of n given halfspaces. We give a simple algorithm in 2-d that runs in O((n + k^2) log n) expected time; this is faster than earlier algorithms by Everett, Robert, and van Kreveld (1993) and
Matousek (1994) and is probably near-optimal for all k << n/2. A (theoretical) extension of our algorithm in 3-d runs in near O(n + k^{11/4}n^{1/4}) expected time. Interestingly, the idea is based on concave-chain decompositions (or covers) of the (<= k)-level, previously used in proving combinatorial k-level bounds.

Applications in the plane include improved algorithms for finding a line that misclassifies the fewest among a set of bichromatic points, and finding the smallest circle enclosing all but k points. We also discuss related problems of finding local minima in levels.
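A cubic-time brute force (mine, not the algorithm of the paper) makes the 2-d problem concrete: in general position a finite optimum of the k-violation LP lies at the intersection of two constraint boundaries, so one can enumerate all O(n^2) such vertices and count violations at each.

```python
def lp_with_violations(halfplanes, objective, k):
    """Brute-force 2-d LP with at most k violations.

    Minimize objective . (x, y) over points violating at most k of the
    halfplanes a*x + b*y <= c.  Candidates are pairwise boundary
    intersections (general position, finite optimum assumed).  O(n^3),
    versus the O((n + k^2) log n) expected-time algorithm described above.
    Returns (value, x, y), or None if no candidate has <= k violations.
    """
    cx, cy = objective
    eps = 1e-9
    best = None
    n = len(halfplanes)
    for i in range(n):
        a1, b1, c1 = halfplanes[i]
        for j in range(i + 1, n):
            a2, b2, c2 = halfplanes[j]
            det = a1 * b2 - a2 * b1
            if abs(det) < eps:
                continue                      # parallel boundaries
            x = (c1 * b2 - c2 * b1) / det     # Cramer's rule
            y = (a1 * c2 - a2 * c1) / det
            violated = sum(1 for a, b, c in halfplanes
                           if a * x + b * y > c + eps)
            if violated <= k:
                val = cx * x + cy * y
                if best is None or val < best[0]:
                    best = (val, x, y)
    return best
```

For instance, with the contradictory system x <= 0, y <= 0, x >= 2, y >= 2, every candidate vertex violates two constraints: with k = 2 the origin is optimal for minimizing x + y, while with k = 1 no candidate qualifies.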

More planar two-center algorithms

This paper considers the planar Euclidean two-center problem: given a planar n-point set S, find two congruent circular disks of the smallest radius covering S. The main result is a deterministic algorithm with running time O(n log^2 n log^2 log n), improving the previous O(n log^9 n) bound of
Sharir and almost matching the randomized O(n log^2 n) bound of Eppstein. If a point in the intersection of the two disks is given, then we can solve the problem in O(n log n) time with high probability.

Geometric applications of a randomized optimization technique

We propose a simple, general, randomized technique to reduce certain geometric optimization problems to their corresponding decision problems. These reductions increase the expected time complexity by only a constant factor and eliminate extra logarithmic factors in previous, often more complicated, deterministic approaches (such as parametric searching). Faster algorithms are thus obtained for a variety of problems in computational geometry: finding minimal k-point subsets, matching point sets under translation, computing rectilinear p-centers and discrete 1-centers, and solving linear programs with k violations.

Deterministic algorithms for 2-d convex programming and 3-d online linear programming

We present a deterministic algorithm for solving two-dimensional convex programs with a linear objective function. The algorithm requires O(k log k) primitive operations for k constraints; if a feasible point is given, the bound reduces to O(k log k / log log k). As a consequence, we can decide whether k convex n-gons in the plane have a common intersection in O(k log n min{log k, log log n}) worst-case time. Furthermore, we can solve the three-dimensional online linear programming problem in o(log^3 n) worst-case time per operation.

Fixed-dimensional linear programming queries made easy

We derive two results from Clarkson's randomized algorithm for linear programming in a fixed dimension d. The first is a simple general method that reduces the problem of answering linear programming queries to the problem of answering halfspace range queries. For example, this yields a randomized data structure with O(n) space and O(n^{1-1/floor(d/2)} 2^O(log* n)) query time for linear programming on n halfspaces (d > 3). The second result is a simpler proof of the following: a sequence of q linear programming queries on n halfspaces can be answered in O(n log q) time, if q <= n^{alpha_d} for a certain constant alpha_d > 0. Unlike previous methods, our algorithms do not require parametric searching.

Copyright Notice

The documents contained in this directory are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Timothy Chan (Last updated Aug 2023)