A faster replacement for Dictionary<TKey, TValue>

I need a fast replacement for System.Collections.Generic.Dictionary<TKey, TValue>. My application has to be really fast, so the replacement only needs to support:

  • Generics
  • Add
  • Get
  • Contains

... and that's it. I don't need any LINQ support or anything else. It just has to be fast.

Simple code like this:

using System;
using System.Collections.Generic;
using System.Diagnostics;

Stopwatch stopWatch = Stopwatch.StartNew();

Dictionary<string, string> dictionary = new Dictionary<string, string>();
dictionary.Add("fieldName", "fieldValue");
dictionary.Add("Title", "fieldVaaaaaaaaaaaaaaaaalue");

Console.WriteLine(stopWatch.Elapsed);

... prints 00:00:00.0001274, which is a lot of time for me, because my application does many other things, some of them through old, slow libraries that I have to use and have no control over.

Any ideas on how to implement a faster one?

Thank you.

Answers


Chances are you're seeing JIT compilation. On my box, I see:

00:00:00.0000360
00:00:00.0000060

when I run it twice in quick succession within the same process - and not in the debugger. (Make sure you're not running it in the debugger, or it's a pointless test.)

Now, measuring any time that tiny is generally a bad idea. You'd need to iterate millions of times to get a better idea of how long it's taking.

Do you have good reason to believe it's actually slowing down your code - or are you basing it all on your original timing?

I doubt that you'll find anything significantly faster than Dictionary<TKey, TValue> and I'd be very surprised to find that it's the bottleneck.

EDIT: I've just benchmarked adding a million elements to a Dictionary<TKey, TValue> where all the keys were existing objects (strings in an array), reusing the same value (as it's irrelevant) and specifying a capacity of a million on construction - and it took about 0.15s on my two-year-old laptop.
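A benchmark along those lines might look something like the sketch below (the answer doesn't show its harness, so the specifics here are illustrative: keys are pre-built in an array, the same value is reused, and the capacity is specified up front):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class DictionaryBenchmark
{
    static void Main()
    {
        const int Count = 1000000;

        // Build the keys up front so string allocation isn't measured.
        string[] keys = new string[Count];
        for (int i = 0; i < Count; i++)
        {
            keys[i] = "key" + i;
        }

        // Specify the capacity so dynamic resizing isn't measured either.
        Dictionary<string, string> dictionary = new Dictionary<string, string>(Count);

        Stopwatch stopWatch = Stopwatch.StartNew();
        for (int i = 0; i < Count; i++)
        {
            dictionary.Add(keys[i], "value"); // same value every time; it's irrelevant
        }
        Console.WriteLine(stopWatch.Elapsed);
    }
}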

Is that really likely to be a bottleneck for you, given that you've already said you're using some "old slow libraries" elsewhere in your app? Bear in mind that the slower those other libraries are, the less impact an improved collection class will have. If the dictionary changes are only accounting for 1% of your overall application time, then even if we could provide an instantaneous dictionary, you'd only speed up your app by 1%.

As ever, get a profiler - it'll give you a much better idea of where your time is going.


I agree with Jon Skeet's supposition that this is most likely JIT compilation.

That being said, I wanted to add some other information here:

Most speed issues involving Dictionary<T,U> have nothing to do with the implementation of Dictionary itself. Dictionary<T,U> is VERY fast out of the box, and it would be difficult to beat.

Speed issues relating to Dictionary instances are almost always actually hash code implementation issues. If you're having speed issues when using Dictionary<MyCustomClass,MyValue>, revisit the GetHashCode() implementation you have defined on MyCustomClass. This is even more critical if you're using a custom struct as your key.

In order to get good performance out of Dictionary, GetHashCode() should be:

  1. Fast
  2. Able to provide hash codes that generate few collisions. Unique instances should, when possible, generate unique hash values.

If you get that right, I think you'll be very happy with the default Dictionary implementation.
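As an illustration, a hand-rolled struct key might look something like this (PointKey is a made-up type for the sketch; implementing IEquatable<T> also matters for struct keys, because it avoids boxing on every comparison):

using System;

// Hypothetical custom struct key. GetHashCode mixes both fields so
// that distinct points rarely collide, and it stays cheap to compute.
public struct PointKey : IEquatable<PointKey>
{
    public readonly int X;
    public readonly int Y;

    public PointKey(int x, int y)
    {
        X = x;
        Y = y;
    }

    public bool Equals(PointKey other)
    {
        return X == other.X && Y == other.Y;
    }

    public override bool Equals(object obj)
    {
        return obj is PointKey && Equals((PointKey)obj);
    }

    public override int GetHashCode()
    {
        // Classic prime-multiply combination: fast, and spreads the
        // bits of both fields across the result.
        unchecked
        {
            int hash = 17;
            hash = hash * 31 + X;
            hash = hash * 31 + Y;
            return hash;
        }
    }
}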


Don't forget, you're timing the Dictionary constructor in that code as well. I did a test, moving the call to the constructor out of the measurement, and looped 10 times. Here's my test code:

for (int i = 0; i < 10; i++)
{
    Dictionary<string, string> test = new Dictionary<string, string>();

    System.Diagnostics.Stopwatch watch = System.Diagnostics.Stopwatch.StartNew();

    test.Add("fieldName", "fieldValue");
    test.Add("Title", "fieldavlkajlkdjflkjalkjslkdjfiajwelkrjelrkjavoijl");

    Console.WriteLine(watch.Elapsed);
}

Console.ReadKey();

Below are the results:

00:00:00.0000607
00:00:00.0000025
00:00:00.0000015
00:00:00.0000015
00:00:00.0000016
00:00:00.0000017
00:00:00.0000016
00:00:00.0000016
00:00:00.0000016
00:00:00.0000015

I'm not sure how much faster you could get than that...

Update

Looks like this mirrors Jon Skeet's results too: JIT compilation.


If you really need better performance, you're going to have to give up something major, like generics or dynamic memory allocation; all of those features cost some performance.

I would also avoid ContainsKey followed by an indexer lookup wherever possible and use TryGetValue instead, which only probes the table once.
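For example, here's a sketch of the two patterns side by side (the method names are made up for illustration):

using System.Collections.Generic;

class LookupExample
{
    // Two hash lookups: one for ContainsKey, one for the indexer.
    static string GetTitleTwoLookups(Dictionary<string, string> d)
    {
        if (d.ContainsKey("Title"))
            return d["Title"];
        return null;
    }

    // One hash lookup: TryGetValue probes the table a single time.
    static string GetTitleOneLookup(Dictionary<string, string> d)
    {
        string value;
        return d.TryGetValue("Title", out value) ? value : null;
    }
}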


USE INTS AS KEYS FOR MAXIMUM PERFORMANCE:

For anyone who came here from Google: if you want to squeeze every last bit of performance out of a Dictionary, use ints as keys. Here's a benchmark comparing int vs. string keys: https://jacksondunstan.com/articles/2527

The author of the article even mentions that converting strings to ints is worthwhile if you have such a need.

Also note that the same behavior occurs in some other languages, such as PHP. PHP associative arrays are in fact dictionaries, and if you use ints in ascending order in PHP 7, they vastly outperform string keys.
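As a rough illustration (this is a sketch in the spirit of the linked benchmark, not a reproduction of it):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class KeyTypeComparison
{
    static void Main()
    {
        const int Count = 1000000;

        // Pre-build the string keys so allocation isn't part of the timing.
        string[] stringKeys = new string[Count];
        for (int i = 0; i < Count; i++)
        {
            stringKeys[i] = i.ToString();
        }

        Dictionary<int, int> byInt = new Dictionary<int, int>(Count);
        Dictionary<string, int> byString = new Dictionary<string, int>(Count);

        Stopwatch watch = Stopwatch.StartNew();
        for (int i = 0; i < Count; i++)
        {
            byInt.Add(i, i); // an int's hash code is just the int itself
        }
        Console.WriteLine("int keys:    " + watch.Elapsed);

        watch = Stopwatch.StartNew();
        for (int i = 0; i < Count; i++)
        {
            byString.Add(stringKeys[i], i); // hashing a string walks every character
        }
        Console.WriteLine("string keys: " + watch.Elapsed);
    }
}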


Odds are you are not going to find anything much faster than Dictionary. I would just use Dictionary. Then, when you see you are not meeting your performance goals and a profiler indicates that adding/removing from the Dictionary is your bottleneck, you can consider replacing it with a more targeted class.

Note that features such as LINQ do not incur any performance cost if you do not use them.


Could you use a List and define an enum such that, for example, fieldName = 0 and Title = 1, and use each property's unique index as a lookup index into the list? That would be the fastest solution, though the least flexible, since you'd be tied to an enum.
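Something like this sketch (Field and FieldTable are made-up names):

// The enum's numeric values index directly into an array, so a
// "lookup" is a single array access with no hashing at all.
enum Field
{
    FieldName = 0,
    Title = 1,
    Count = 2 // number of fields, used to size the array
}

class FieldTable
{
    private readonly string[] values = new string[(int)Field.Count];

    public string this[Field field]
    {
        get { return values[(int)field]; }
        set { values[(int)field] = value; }
    }
}

// Usage:
//   FieldTable table = new FieldTable();
//   table[Field.Title] = "fieldValue";
//   string title = table[Field.Title];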


How many items do you plan to add to the dictionary? While a Dictionary/Hashtable is usually the fastest, depending on what you are doing there may be something faster (i.e., better suited) than a Hashtable (the underlying structure of a Dictionary). Depending on usage, a SortedList could be faster if combined with some kind of skip list, or even a self-balancing tree or a trie, especially if you wish to return a range of values rather than a single value.
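For instance, here's a minimal sketch of an in-order range walk over a SortedList, something a Dictionary can't offer because its iteration order is undefined:

using System;
using System.Collections.Generic;

class RangeQueryExample
{
    static void Main()
    {
        SortedList<int, string> prices = new SortedList<int, string>();
        for (int i = 0; i < 100; i += 10)
        {
            prices.Add(i, "item" + i);
        }

        // Keys are kept sorted, so a contiguous key range can be walked
        // in order. A production version would binary-search for the
        // start index instead of scanning, but a scan keeps this short.
        for (int i = 0; i < prices.Count; i++)
        {
            int key = prices.Keys[i];
            if (key < 25) continue;
            if (key > 65) break;
            Console.WriteLine(key + " => " + prices.Values[i]);
        }
    }
}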

A Hashtable is a good fit when:

  1. You know how many items you intend to store before population of the table begins. Dynamic resizing will be very painful!
  2. You have a good hash algorithm with even distribution, which .NET does
  3. There is a good mechanism in place for collision resolution, which .NET does
  4. You are looking for a single value
  5. You can guarantee that all keys will be unique

If you're doing some compression, for example, an RB-tree is better than a Hashtable.

Source: http://en.wikipedia.org/wiki/Hashtable#Dynamic_resizing

