Modern algorithmic frameworks must contend with data that is both massive and inherently uncertain. This dissertation explores efficient geometric algorithms for handling such data across two regimes: temporal evolution and spatial classification.
In the first part, we address temporal uncertainty by developing algorithms for tracking evolving data. We introduce a framework for tracking labels on trees and moving points in the Euclidean plane, establishing the speedup factors required to maintain an accurate hypothesis of the ground truth. We extend this framework to track evolving probability distributions under a local-motion model, using the Kullback-Leibler divergence to bound the information loss over time.
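The Kullback-Leibler divergence mentioned above is the standard measure of how much information is lost when one distribution is used in place of another. As a minimal illustration (the distributions and values below are hypothetical, not drawn from the dissertation's experiments):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    Measures the information lost when Q is used to approximate P,
    i.e., how far a maintained hypothesis Q has drifted from the
    ground-truth distribution P.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: a ground-truth distribution vs. a slightly
# stale hypothesis maintained by a tracking algorithm.
truth = [0.5, 0.3, 0.2]
hypothesis = [0.4, 0.35, 0.25]
drift = kl_divergence(truth, hypothesis)  # small positive value
```

Note that the divergence is zero exactly when the hypothesis matches the truth, and grows as the two drift apart, which is what makes it a natural yardstick for information loss over time.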
In the second part, we address spatial uncertainty by developing classifiers for data constrained to bounded domains, such as the probability simplex. We investigate Support Vector Machines (SVMs) under the Hilbert metric, a projective metric that is sensitive to the boundary of the domain. We prove that the Hilbert SVM problem is of LP-type, which admits exact solutions. We then present a polynomial-time approximation algorithm that overcomes the curse of dimensionality inherent in the exact approach. Together, these results provide a robust algorithmic foundation for managing uncertainty in dynamic and bounded geometric systems.
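The Hilbert metric on a bounded convex domain is defined via the log cross-ratio of two points with the points where their chord meets the boundary. A minimal sketch of the one-dimensional case, the open unit interval, conveys the boundary sensitivity (conventions differ on the factor of 1/2; we include it here):

```python
import math

def hilbert_distance_interval(x, y):
    """Hilbert metric between x and y in the open unit interval (0, 1).

    One-dimensional instance of the Hilbert metric on a bounded convex
    domain: the (halved) log cross-ratio of x, y with the boundary
    points 0 and 1. Distances blow up as points approach the boundary.
    """
    x, y = min(x, y), max(x, y)
    return 0.5 * math.log((y * (1 - x)) / (x * (1 - y)))
```

The distance between a fixed interior point and a point moving toward the boundary grows without bound, which is precisely the boundary sensitivity that motivates using this metric for data confined to domains such as the simplex.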
Adi Acharya is a PhD candidate advised by Dr. David Mount. His research lies primarily in computational geometry, with a strong emphasis on theoretical foundations and their implications for machine learning. Recently, he has been developing efficient algorithms and optimized frameworks for analyzing high-dimensional categorical and stochastic data.

