After months of working mostly in a business analyst/architect role and only occasionally writing code, I was finally given a project that now has me coding quite a bit for a change. The objective was to rewrite an existing Delphi application in C# and add new functionality. As with most of the applications I get to work on, performance was key.
Before embarking on this project, I bought a copy of Pro .NET Performance: Optimize Your C# Applications by Sasha Goldshtein et al. This book is a great read and I can recommend it without hesitation. You could probably stop reading this article and just get the book and you would be fine.
I also found Writing High-Performance .NET Code, but since I haven’t read it, I can’t comment on its merits.
You may also want to check out my collection of links on the matter Reading up on Concurrent Programming.
While .NET offers automatic memory management, this by no means implies that you do not have to think about it. In fact, for high-throughput applications, I’ve found garbage collection (GC) to be one of the most crucial determinants of performance. The Goldshtein book discusses memory management at length.
In addition, there is a wealth of more detailed information on various MSDN blogs, particularly the CLR Garbage Collector blog by Maoni Stephens.
- Understanding .NET Garbage Collection is not from Microsoft but from Telerik, a large .NET component vendor, and provides an excellent introduction to the workings of the .NET GC.
- More information on the large object heap touched upon in the article above.
- Two articles on using the GC efficiently (part 1, part 2): a good summary of information you’ll also find, in a bit more detail, in the book.
- The single most important GC setting: selecting a GC flavor. Contains an excellent table that summarizes the trade-offs between them.
- A video featuring Maoni Stephens discussing another aspect of GC flavor: background garbage collection.
- GC improvements in .NET 4.5.
Other, non-GC-related memory management topics:
- Know Thine Implicit Allocations highlights situations where the framework makes memory allocations you might be unaware of. The good news is that newer versions of the framework take steps to reduce these.
- Choosing between class and struct, as (generally speaking) instances of the latter are stored inline in the objects or arrays that contain them and do not result in new objects that need to be tracked during garbage collection.
I’ve actually become quite a fan of structs. One central type in my application, with tens of thousands of instances that store information in the form of a couple of ints and longs, is now implemented as a struct. Instances are stored in an array and instead of passing references, I pass the elements as reference parameters or refer to them by their index. The code in this part of the application might not be very object-oriented, but that is often the price to pay for excellent performance.
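To make the pattern concrete, here is a minimal sketch of the approach described above. The `Tick` type and its fields are hypothetical stand-ins for my actual type, but the mechanics (structs stored inline in an array, elements passed as `ref` parameters or addressed by index) are the same:

```csharp
using System;

// Hypothetical example type: a small value type holding a couple of
// ints and longs, standing in for the central type mentioned above.
struct Tick
{
    public int Id;
    public long Timestamp;
    public long Value;
}

static class Program
{
    // Passing the element as a reference parameter avoids copying the
    // struct and lets the callee update it in place inside the array.
    static void Update(ref Tick tick, long value)
    {
        tick.Value = value;
        tick.Timestamp++;
    }

    static void Main()
    {
        // One array allocation; the structs live inline in the array,
        // so the GC tracks a single object instead of tens of thousands.
        var ticks = new Tick[10_000];
        for (int i = 0; i < ticks.Length; i++)
            ticks[i].Id = i;           // indexing modifies the element in place

        Update(ref ticks[42], 123);    // pass the array element by reference
        Console.WriteLine(ticks[42].Value);  // 123
    }
}
```

Had `Tick` been a class, the array would hold 10,000 references to 10,000 separate heap objects, each of which the GC has to trace on every collection.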
As a general rule, I recommend using the well-known classes from the .NET framework, e.g. for collections. Every once in a while, however, there are situations where you have special (performance) requirements that may not be met by the standard collections. There is actually a chapter in the Goldshtein book about writing your own collections. Sometimes, however, there are already classes that do what you need; they are just hidden away in a different namespace.
- BitVector32 for storing bit flags in a single 32-bit field (see also this discussion on Stack Overflow).
- OrderedDictionary, which supports accessing values by index, so you don’t have to perform a lookup on the key every time. Unfortunately, this is a non-generic class, so you might want to look at this CodeProject article or this Stack Overflow discussion for a generic alternative.
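BitVector32 in particular is easy to miss because it lives in System.Collections.Specialized. A short sketch of how it packs boolean flags into a single int (the flag names here are made up for illustration):

```csharp
using System;
using System.Collections.Specialized;

class Demo
{
    static void Main()
    {
        // Create three single-bit masks; each call chains off the previous mask.
        int isDirty   = BitVector32.CreateMask();          // bit 0
        int isVisible = BitVector32.CreateMask(isDirty);   // bit 1
        int isLocked  = BitVector32.CreateMask(isVisible); // bit 2

        // All 32 flags are packed into one int; no heap allocation at all,
        // since BitVector32 is itself a struct.
        var flags = new BitVector32(0);
        flags[isDirty]  = true;
        flags[isLocked] = true;

        Console.WriteLine(flags[isDirty]);   // True
        Console.WriteLine(flags[isVisible]); // False
        Console.WriteLine(flags.Data);       // 5 (bits 0 and 2 set)
    }
}
```

Compared to a bool[] or a BitArray, this costs four bytes and zero objects, which matters when the flags are a field on a type with many instances.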
Speaking of different implementations of dictionaries, I’ve found this article about Choosing The Right Collection Class extremely helpful.
Finally, LazyInitializer is an excellent alternative to the Lazy<T> class, particularly when you are watching your memory footprint and/or the number of objects you create.
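A quick sketch of the difference: Lazy&lt;T&gt; is itself a heap object wrapping your value, whereas LazyInitializer works on a plain field. The `Cache` class below is a made-up example:

```csharp
using System;
using System.Threading;

class Cache
{
    // No Lazy<T> wrapper object: the only state is the field itself,
    // which stays null until first use.
    private string[] _data;

    public string[] Data =>
        // Thread-safe lazy initialization on a plain field. Note that with
        // this overload the factory may run more than once under contention,
        // but only one result is ever published.
        LazyInitializer.EnsureInitialized(ref _data,
            () => new[] { "a", "b", "c" });
}

class Program
{
    static void Main()
    {
        var cache = new Cache();
        Console.WriteLine(cache.Data.Length);                      // 3
        Console.WriteLine(ReferenceEquals(cache.Data, cache.Data)); // True
    }
}
```

If your factory has side effects and must run exactly once, there is an overload taking a `ref bool initialized` and a `ref object syncLock` that guarantees single execution, at the cost of the extra lock object.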