.NET library for fast approximate nearest neighbour search using the HNSW (Hierarchical Navigable Small World) algorithm, used in Pinecone, DataStax, Azure AI Search and other commercial vector databases.
Exact k nearest neighbour search algorithms tend to perform poorly in high-dimensional spaces. Approximate nearest neighbour (ANN) algorithms exist to overcome this curse of dimensionality. This library implements one such algorithm, described in the article "Efficient and robust approximate nearest neighbor search using Hierarchical Navigable Small World graphs". It provides a simple API for building nearest-neighbour graphs, (de)serializing them, and running k-NN search queries.
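For contrast, here is what exact search looks like: a full scan that evaluates the distance to every stored vector on every query. This is a minimal illustrative sketch (the `ExactKnn` and `CosineDistance` helpers are not part of the library's API); it is the O(n) baseline that HNSW's graph traversal avoids.

```csharp
using System;
using System.Linq;

// Exact k-NN: every query computes the distance to all n vectors,
// so latency grows linearly with the dataset size.
float[][] ExactKnn(float[][] vectors, float[] query, int k) =>
    vectors.OrderBy(v => CosineDistance(v, query)).Take(k).ToArray();

float CosineDistance(float[] a, float[] b)
{
    double dot = 0, na = 0, nb = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
    }
    return (float)(1.0 - dot / Math.Sqrt(na * nb));
}
```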
Benchmark search results from the https://github.com/bartczernicki/VectorMathAIOptimizations GitHub repository over 1 million real vectors from Wikipedia.
Linear - Non-optimized baseline with O(n) linear performance degradation
Complete - Optimized benchmark with multi-threading, DotProduct, .NET 8 Tensors and AVX extensions, still with O(n) linear performance degradation
CompleteRealDataANN - Optimized benchmark using the HNSW algorithm
| Method | Mean | Error | StdDev | Ratio | RatioSD | Search queries / second |
|-------------------- |--------------:|----------:|----------:|---------:|--------:|------------------------:|
| Linear | 1,664.7444 ms | 0.3980 ms | 0.3723 ms | baseline | | 0.63 queries / sec|
| Complete | 94.5493 ms | 1.8721 ms | 4.0299 ms | -94.3% | 3.5% | 10.69 queries / sec|
| CompleteRealDataANN | 0.6453 ms | 0.0012 ms | 0.0011 ms | -100.0% | 0.2% | 1,550.39 queries / sec|
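The last column follows directly from the mean latency: queries per second is just the reciprocal of the mean per-query time. A quick check against the CompleteRealDataANN row (the small difference from the table's 1,550.39 comes from rounding of the reported mean):

```csharp
using System;

// Queries/second = 1000 ms divided by the mean per-query latency in ms.
double meanMs = 0.6453;        // CompleteRealDataANN mean from the table above
double qps = 1000.0 / meanMs;  // roughly 1,550 queries / sec
Console.WriteLine($"{qps:F2} queries / sec");
```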
Check out the following code snippets once you've added the library reference to your project.
```csharp
var parameters = new SmallWorld<float[], float>.Parameters()
{
    M = 15, // number of neighbour connections per vector; more connections create denser graphs with potentially higher recall
    LevelLambda = 1 / Math.Log(15), // keep in sync with M: lambda = 1 / ln(M)
};

float[][] vectors = GetFloatVectors();
var graph = new SmallWorld<float[], float>(DotProductDistance.DotProductOptimized, DefaultRandomGenerator.Instance,
    parameters, threadSafe: true);
graph.AddItems(vectors);
```
```csharp
SmallWorld<float[], float> graph = GetGraph();

float[] query = Enumerable.Repeat(1f, 100).ToArray();
var best20 = graph.KNNSearch(query, 20);
var best1 = best20.OrderBy(r => r.Distance).First();
```
```csharp
SmallWorld<float[], float> graph = GetGraph();
byte[] buffer = graph.SerializeGraph(); // buffer stores the graph parameters and edges

// the distance function must be the same one used to build the original graph
var copy = new SmallWorld<float[], float>(DotProductDistance.DotProductOptimized);
copy.DeserializeGraph(vectors, buffer); // pass the original vectors to attach to the "copy" vertices
```
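Because the serialized graph is an ordinary `byte[]`, persisting it is a plain file round-trip. A minimal sketch with a stand-in buffer (the bytes and the `graph.bin` path are illustrative; in real code the buffer comes from `graph.SerializeGraph()`):

```csharp
using System;
using System.IO;
using System.Linq;

// Stand-in for graph.SerializeGraph(); any byte[] works the same way.
byte[] buffer = { 1, 2, 3, 4 };

string path = Path.Combine(Path.GetTempPath(), "graph.bin");
File.WriteAllBytes(path, buffer);           // persist the graph
byte[] restored = File.ReadAllBytes(path);  // load it back for DeserializeGraph
```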
The library supplies the cosine distance in four variants that trade universality for performance, plus an optimized dot-product distance.
```csharp
CosineDistance.NonOptimized  // most generic version, works in all cases
CosineDistance.ForUnits      // gives correct results only when the arguments are unit vectors
CosineDistance.SIMD          // uses SIMD instructions to speed up the calculation
CosineDistance.SIMDForUnits  // uses SIMD and requires the arguments to be unit vectors
DotProductDistance.DotProductOptimized // separate distance, optimized with .NET 8 tensor primitives for AVX (falls back to a software path on unsupported hardware)
```
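To use the faster `ForUnits` variants, normalize your vectors to unit length once at indexing time. A small helper sketch (the `ToUnit` function is not part of the library):

```csharp
using System;

// Scale a vector to unit length so CosineDistance.ForUnits /
// CosineDistance.SIMDForUnits produce correct results.
float[] ToUnit(float[] v)
{
    double sumSq = 0;
    foreach (var x in v) sumSq += (double)x * x;
    var norm = (float)Math.Sqrt(sumSq);

    var unit = new float[v.Length];
    for (int i = 0; i < v.Length; i++) unit[i] = v[i] / norm;
    return unit;
}
```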
The API also lets you inject any custom distance function tailored to your needs.
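As a sketch of what such an injected distance can look like, here is a plain Euclidean distance; it can be passed where the build example above passes `DotProductDistance.DotProductOptimized` (the `Euclidean` helper itself is not part of the library):

```csharp
using System;

// Any function with the (float[], float[]) -> float shape can serve as
// the distance. Pass it to the SmallWorld constructor, e.g.:
//   new SmallWorld<float[], float>(Euclidean, DefaultRandomGenerator.Instance,
//                                  parameters, threadSafe: true);
float Euclidean(float[] a, float[] b)
{
    double sum = 0;
    for (int i = 0; i < a.Length; i++)
    {
        double d = a[i] - b[i];
        sum += d * d;
    }
    return (float)Math.Sqrt(sum);
}
```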
Your contributions and suggestions are very welcome!
If you've found a bug or have a feature request, please open an issue with a detailed description. We will be glad to see your pull requests as well.
- Prepare the workspace.

```shell
git clone https://github.com/bartczernicki/hnsw-sharp
cd HNSW.Net
git checkout -b [username]/[feature]
```
- Update the library and add tests if needed.
- Build and test the changes.
```shell
cd Src
dotnet build
dotnet test
```
- Send a pull request from the `[username]/[feature]` branch to the `master` branch.
- Get approval and merge the changes.
The library is distributed as a bundle of sources.