Caching reflection data

You should check out the Fasterflect library.

You could use normal reflection to dynamically generate new code, then emit/compile it and cache the compiled version. I think the collectible assembly idea is promising, since it avoids the memory leak without having to load and unload a separate AppDomain. That said, the leak should be negligible unless you're compiling hundreds of methods.
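For what it's worth, here's a rough sketch of the emit-and-cache idea using DynamicMethod, whose generated code is collectible on its own. The class and method names are made up for illustration, and it assumes instance properties declared on reference types:

using System;
using System.Collections.Concurrent;
using System.Reflection;
using System.Reflection.Emit;

public static class DynamicGetterCache
{
    private static readonly ConcurrentDictionary<PropertyInfo, Func<object, object>> cache =
        new ConcurrentDictionary<PropertyInfo, Func<object, object>>();

    // Compiles a getter the first time it's requested, then returns the cached delegate.
    public static Func<object, object> GetGetter(PropertyInfo property)
    {
        return cache.GetOrAdd(property, BuildGetter);
    }

    private static Func<object, object> BuildGetter(PropertyInfo property)
    {
        // DynamicMethod bodies are collectible by default, so there is no
        // dynamic-assembly leak to worry about in this variant.
        var method = new DynamicMethod(
            "get_" + property.Name,
            typeof(object),
            new[] { typeof(object) },
            property.DeclaringType,
            true); // skip visibility checks

        var il = method.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);                               // load the instance
        il.Emit(OpCodes.Castclass, property.DeclaringType);     // cast object -> declaring type
        il.Emit(OpCodes.Callvirt, property.GetGetMethod(true)); // call the property getter
        if (property.PropertyType.IsValueType)
            il.Emit(OpCodes.Box, property.PropertyType);        // box value-type results
        il.Emit(OpCodes.Ret);

        return (Func<object, object>)method.CreateDelegate(typeof(Func<object, object>));
    }
}

The same caching pattern works if you emit into a full AssemblyBuilder created with AssemblyBuilderAccess.RunAndCollect instead; the important part is that the compiled delegate, not the reflection lookup, is what gets reused.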

Here's a blog post on dynamically compiling code at runtime: http://introspectingcode.blogspot.com/2011/06/dynamically-compile-code-at-runtime.html

Below is a similar ConcurrentDictionary approach I've used in the past to store MethodInfo/PropertyInfo objects. It did seem to be faster, but I think that was in an old version of Silverlight; I believe .NET now has its own internal reflection cache that makes this unnecessary.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

namespace NetSteps.Common.Reflection
{
    public static class Reflection
    {
        // Caches each type's public properties, keyed by property name.
        private static readonly ConcurrentDictionary<Type, Dictionary<string, PropertyInfo>> reflectionPropertyCache =
            new ConcurrentDictionary<Type, Dictionary<string, PropertyInfo>>();

        public static List<PropertyInfo> FindClassProperties(Type objectType)
        {
            // GetOrAdd reflects over the type only on the first request;
            // subsequent calls return the cached dictionary.
            var properties = reflectionPropertyCache.GetOrAdd(
                objectType,
                type => type.GetProperties().ToDictionary(p => p.Name, p => p));

            return properties.Values.ToList();
        }
    }
}

A ConcurrentDictionary<WeakReference, CachedData> is incorrect in this case. Suppose we are trying to cache info for type T, so WeakReference.Target == typeof(T). CachedData will most likely also contain a reference to typeof(T). Since ConcurrentDictionary<TKey, TValue> stores its items in an internal collection of Node<TKey, TValue> objects, you end up with a chain of strong references: ConcurrentDictionary instance -> Node instance -> Value property (the CachedData instance) -> typeof(T). In general, it is impossible to avoid a memory leak with WeakReference when the values can hold references to their keys.
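To make that reference chain concrete, this is the shape of cache being warned against (CachedData here is a placeholder type I've invented for illustration):

using System;
using System.Collections.Concurrent;
using System.Reflection;

public sealed class CachedData
{
    // Typical cached reflection data holds strong references back to the type itself:
    public Type CachedType;
    public PropertyInfo[] Properties;
}

public static class LeakyCache
{
    // Even though the key is a WeakReference, the dictionary's internal node
    // strongly references the value, and the value references typeof(T), so
    // typeof(T) can never be collected:
    //   dictionary -> node -> CachedData -> typeof(T)
    private static readonly ConcurrentDictionary<WeakReference, CachedData> cache =
        new ConcurrentDictionary<WeakReference, CachedData>();
}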

Support for ephemerons had to be added to make this scenario possible without memory leaks. Fortunately, .NET 4.0 supports them, and we have the ConditionalWeakTable<TKey, TValue> class. It seems the reasons for introducing it are close to your task.

This approach also solves the problem mentioned in your update, since the reference to the Type will live exactly as long as its Assembly is loaded.
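A minimal sketch of what that could look like; the CachedTypeData class and the TypeDataCache name are illustrative placeholders, not an established API:

using System;
using System.Reflection;
using System.Runtime.CompilerServices;

public sealed class CachedTypeData
{
    // It's fine for the value to reference the key's type here:
    // ConditionalWeakTable entries are ephemerons, so the value alone
    // does not keep the Type alive.
    public PropertyInfo[] Properties;
}

public static class TypeDataCache
{
    private static readonly ConditionalWeakTable<Type, CachedTypeData> cache =
        new ConditionalWeakTable<Type, CachedTypeData>();

    public static CachedTypeData Get(Type type)
    {
        // GetValue builds the entry on first use; once the Type (and its
        // collectible assembly) becomes unreachable, the entry is collected too.
        return cache.GetValue(type, t => new CachedTypeData { Properties = t.GetProperties() });
    }
}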