Pondering k-Nearest Neighbor density estimation. This is appealing, as it can adapt to the local density of points. However, there is some subtlety in making the estimate smooth and ensuring the density function integrates to 1. Here is a simple scheme:
- Assign each point a radius equal to the distance to its kth nearest neighbor.
- Treat each point as a Gaussian splat with that radius.
- The density is the sum of all the splats.
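The scheme above can be sketched in a few lines. This is a minimal illustration, not Olsen et al.'s implementation; the function name `knn_kde` and the choice of SciPy's `cKDTree` for the neighbor queries are my own. Each splat is a normalized isotropic Gaussian, so averaging them gives a density that integrates to 1.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kde(points, k):
    """Return a density function built from Gaussian splats, one per point,
    each with bandwidth equal to the distance to its k-th nearest neighbor."""
    points = np.atleast_2d(points)
    n, d = points.shape
    tree = cKDTree(points)
    # Ask for k+1 neighbors because each point is its own nearest neighbor.
    dists, _ = tree.query(points, k=k + 1)
    radii = dists[:, -1]  # distance to the k-th nearest neighbor

    def density(x):
        x = np.atleast_2d(x)
        # Squared distances from each query point to each data point: (m, n).
        sq = ((x[:, None, :] - points[None, :, :]) ** 2).sum(-1)
        # Normalizing each Gaussian splat, then averaging over the n splats,
        # guarantees the mixture integrates to 1.
        norm = (2 * np.pi * radii**2) ** (d / 2)
        return (np.exp(-sq / (2 * radii**2)) / norm).mean(axis=1)

    return density
```

Because every splat is a proper probability density and the mixture weights are uniform (1/n), the unit-integral property comes for free; the only tuning knob is k.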
To compute the density at a query point, we need to find every splat whose ball reaches that point — a query over a set of balls of different radii. So this step is no longer exactly a k-Nearest Neighbors query, but it shouldn't be much harder computationally.
We can also use these splats as weights for local model fitting (similar to loess curves).
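As a sketch of that loess-like idea: below, each data point gets a Gaussian weight whose scale is its kth-nearest-neighbor distance, and a local linear model is fit by weighted least squares. The function name `local_linear_fit` is hypothetical, and the 1D setting is chosen for simplicity.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_linear_fit(x, y, x0, k=5):
    """Predict y at x0 by weighted least squares, weighting each point
    by a Gaussian splat scaled to its k-th nearest neighbor distance."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    tree = cKDTree(x[:, None])
    dists, _ = tree.query(x[:, None], k=k + 1)  # k+1: a point is its own NN
    radii = dists[:, -1]
    # Splat weights: how strongly each point's Gaussian covers x0.
    w = np.exp(-((x - x0) ** 2) / (2 * radii**2))
    # Local linear design, centered at x0 so the intercept is the fit there.
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]
```

Centering the design at x0 means the fitted intercept is directly the local prediction, the same trick loess uses.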
Olsen, Lindrup, and Mørup (2024) describe essentially this idea. They add some elaborations, such as giving each splat its own locally estimated covariance matrix. They use the term K-Nearest Neighbor Kernel Density Estimation (KNN-KDE), which is an apt description.