Abstract

The paper makes both theoretical and implementational contributions. First, we prove that Kd-trees can be extended to ℝ^d with the distance measured by an arbitrary Bregman divergence. Perhaps surprisingly, this shows that the triangle inequality is not necessary for correct pruning in Kd-trees. Second, we offer an efficient algorithm and C++ implementation of nearest neighbour search for decomposable Bregman divergences. The implementation supports the Kullback-Leibler divergence (relative entropy), a popular distance between probability vectors that is commonly used in statistics and machine learning. This is a step toward broadening the usage of computational geometry algorithms. Our benchmarks show that the implementation efficiently handles both exact and approximate nearest neighbour queries. Compared to a linear search, we achieve a speedup of two orders of magnitude in practical scenarios in dimensions up to 100. Our solution is simpler and more efficient than competing methods.
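
To make the abstract's central claim concrete, below is a minimal C++ sketch of why a decomposable Bregman divergence admits Kd-tree pruning without a triangle inequality, using the generalized Kullback-Leibler divergence as the example. The function names (kl_term, kl_divergence, kl_lower_bound) are illustrative, not the paper's actual API, and the sketch assumes one of the two possible argument orders (query first, data points second). The key observation is that each per-coordinate term is convex in its second argument with its minimum at the query coordinate, so the divergence from a query to an axis-aligned Kd-tree cell is bounded below by clamping the query into the cell coordinate-wise.

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Per-coordinate term of the generalized KL divergence on nonnegative
// vectors: p*log(p/q) - p + q, with the convention 0*log(0) := 0.
// (Illustrative sketch; not the paper's API.)
double kl_term(double p, double q) {
    double t = (p > 0.0) ? p * std::log(p / q) : 0.0;
    return t - p + q;
}

// Decomposable form: the divergence is a sum of independent 1-d terms.
double kl_divergence(const std::vector<double>& p, const std::vector<double>& q) {
    double d = 0.0;
    for (std::size_t i = 0; i < p.size(); ++i) d += kl_term(p[i], q[i]);
    return d;
}

// Lower bound on the divergence from query x to ANY point in the
// axis-aligned box [lo, hi] -- the quantity a Kd-tree needs to decide
// whether a subtree can be pruned. Each term t -> kl_term(x_i, t) is
// convex in t with its minimum at t = x_i, so the closest box point is
// obtained by clamping x coordinate-wise; no triangle inequality is used.
double kl_lower_bound(const std::vector<double>& x,
                      const std::vector<double>& lo,
                      const std::vector<double>& hi) {
    double bound = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        double nearest = std::clamp(x[i], lo[i], hi[i]);
        bound += kl_term(x[i], nearest);
    }
    return bound;
}

A search routine in this style would skip a subtree whenever kl_lower_bound(query, cell_lo, cell_hi) is at least the divergence to the best candidate found so far, which is exactly the pruning test a Euclidean Kd-tree performs with squared distances.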

Keywords

Inference, Estimator, Upper and lower bounds, Latent variable, Differentiable function, Bayes' theorem, Computer science, Posterior probability, Bayesian inference, Algorithm, Probabilistic logic, Mathematics, Applied mathematics, Artificial intelligence, Bayesian probability, Statistics

Publication Info

Year: 2025
Type: Preprint
Citations: 1878
Access: Closed

Citation Metrics

1878 citations (source: OpenAlex)

Cite This

Diederik P. Kingma, Max Welling (2025). Fast Kd-Trees for the Kullback-Leibler Divergence and Other Decomposable Bregman Divergences. Wiardi Beckman Foundation. https://doi.org/10.4230/lipics.wads.2025.45

Identifiers

DOI
10.4230/lipics.wads.2025.45

Data Quality

Data completeness: 77%