How to deal with costly building operations using MemoryCache?

For the conditional add requirement, I always use ConcurrentDictionary, which has an overloaded GetOrAdd method that accepts a delegate to fire if the object needs to be built.

ConcurrentDictionary<string, object> _cache = new
  ConcurrentDictionary<string, object>();

public object GetOrAdd(string key)
{
  return _cache.GetOrAdd(key, (k) => {
    //here 'k' is actually the same as 'key'
    return buildDataUsingGoodAmountOfResources();
  });
}

In reality I almost always use static concurrent dictionaries. I used to have 'normal' dictionaries protected by a ReaderWriterLockSlim instance, but as soon as I switched to .NET 4 (ConcurrentDictionary is only available from .NET 4 onwards) I started converting any of those that I came across.

ConcurrentDictionary's performance is admirable to say the least :)
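
For reference, the pre-.NET 4 pattern being replaced looks roughly like the sketch below. The names and structure are illustrative only, reusing the same hypothetical buildDataUsingGoodAmountOfResources() factory from the snippet above:

ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
Dictionary<string, object> _dict = new Dictionary<string, object>();

public object GetOrAddLocked(string key)
{
  _lock.EnterUpgradeableReadLock();
  try
  {
    object value;
    if (_dict.TryGetValue(key, out value))
      return value;

    _lock.EnterWriteLock();
    try
    {
      //re-check in case another thread added the item between the two locks
      if (!_dict.TryGetValue(key, out value))
      {
        value = buildDataUsingGoodAmountOfResources();
        _dict.Add(key, value);
      }
      return value;
    }
    finally { _lock.ExitWriteLock(); }
  }
  finally { _lock.ExitUpgradeableReadLock(); }
}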

Update: below is a naive implementation with expiration semantics based on age only. It should also ensure that individual items are only created once - as per @usr's suggestion.

Update again: as @usr has suggested, simply using a Lazy<T> is a lot simpler - you can just forward the creation delegate to that when adding it to the concurrent dictionary. I've changed the code, as my dictionary of locks wouldn't have worked anyway. But I really should have thought of that myself (past midnight here in the UK though and I'm beat. Any sympathy? No, of course not. Being a developer, I have enough caffeine coursing through my veins to wake the dead).

I do recommend implementing the IRegisteredObject interface with this, though, and then registering it with the HostingEnvironment.RegisterObject method - doing that would provide a cleaner way to shut down the poller thread when the application pool shuts down or recycles (a sketch of that registration follows after the class below).

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;

public class ConcurrentCache : IDisposable
{
  private readonly ConcurrentDictionary<string, Tuple<DateTime?, Lazy<object>>> _cache = 
    new ConcurrentDictionary<string, Tuple<DateTime?, Lazy<object>>>();

  private readonly Thread ExpireThread;

  public ConcurrentCache(){
    //a field initializer can't reference an instance method, so create the thread here
    ExpireThread = new Thread(ExpireMonitor) { IsBackground = true };
    ExpireThread.Start();
  }

  public void Dispose()
  {
    //yeah, nasty, but this is a 'naive' implementation :)
    ExpireThread.Abort();
  }

  private void ExpireMonitor()
  {
    while(true)
    {
      Thread.Sleep(1000);
      DateTime expireTime = DateTime.Now;
      //find any entries whose absolute expiry time has passed
      var toExpire = _cache.Where(kvp => kvp.Value.Item1 != null &&
        kvp.Value.Item1.Value < expireTime).Select(kvp => kvp.Key).ToArray();
      Tuple<DateTime?, Lazy<object>> removed;
      foreach(var key in toExpire)
      {
        _cache.TryRemove(key, out removed);
      }
    }
  }

  public object CacheOrAdd(string key, Func<string, object> factory, 
    TimeSpan? expiry)
  {
    return _cache.GetOrAdd(key, (k) => { 
        //here 'k' is actually the same as 'key'
        //the Lazy<object> guarantees the factory runs only once per key
        return Tuple.Create(
          expiry.HasValue ? DateTime.Now + expiry.Value : (DateTime?)null,
          new Lazy<object>(() => factory(k)));
    }).Item2.Value; 
  }
}
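
Below is a rough sketch of what that IRegisteredObject registration could look like. HostingEnvironment.RegisterObject/UnregisterObject and IRegisteredObject.Stop(bool) are the real System.Web.Hosting API; the wrapper class and its wiring are just an assumption for illustration:

using System.Web.Hosting;

//illustrative wrapper: lets ASP.NET tell us when the app pool is shutting down or
//recycling, so the poller thread is stopped without relying on Dispose timing
public class ConcurrentCacheRegistration : IRegisteredObject
{
  private readonly ConcurrentCache _cache;

  public ConcurrentCacheRegistration(ConcurrentCache cache)
  {
    _cache = cache;
    //ask the hosting environment to call Stop() on shutdown/recycle
    HostingEnvironment.RegisterObject(this);
  }

  public void Stop(bool immediate)
  {
    _cache.Dispose();
    HostingEnvironment.UnregisterObject(this);
  }
}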

We solved this issue by combining Lazy<T> with AddOrGetExisting to avoid the need for a lock object entirely. Here is some sample code (using infinite expiration):

public T GetFromCache<T>(string key, Func<T> valueFactory) 
{
    var newValue = new Lazy<T>(valueFactory);
    // the line below returns the existing item, or adds the new value if it doesn't exist yet
    var value = (Lazy<T>)cache.AddOrGetExisting(key, newValue, MemoryCache.InfiniteAbsoluteExpiration);
    return (value ?? newValue).Value; // Lazy<T> handles the locking itself
}

That's not complete. There are gotchas like "exception caching", so you have to decide what you want to do if your valueFactory throws an exception. One of the advantages, though, is the ability to cache null values too.
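
One way to handle the exception-caching gotcha - shown here only as a sketch, reusing the same hypothetical cache field as above - is to evict the entry when the factory throws, so later callers retry instead of receiving the cached failure:

public T GetFromCacheNoExceptionCaching<T>(string key, Func<T> valueFactory)
{
    var newValue = new Lazy<T>(valueFactory);
    var value = (Lazy<T>)cache.AddOrGetExisting(key, newValue, MemoryCache.InfiniteAbsoluteExpiration);
    try
    {
        return (value ?? newValue).Value;
    }
    catch
    {
        // the Lazy<T> has now cached the exception; remove the entry so the
        // next caller gets a fresh attempt rather than the stored failure
        cache.Remove(key);
        throw;
    }
}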