Cecil performance issues 02 Sep 2008


At the beginning, Cecil was written to be an assembly manipulation library. The initial goal was to be able to read assemblies without loading them in an AppDomain, and also to expose more than what .net 1.1 provided. I implemented the writing parts of Cecil during my first Summer of Code, and barely touched them after, since, as it turned out, they just worked.

But this summer, two great hackers, Jeroen, author of IKVM, and Rolf, our VB 8 compiler author, decided that they had given enough blood and sweat working around System.Reflection.Emit issues, and prototyped versions of their projects based on Cecil instead. While Rolf went the full Cecil way and replaced the usage of System.Reflection and System.Reflection.Emit altogether (his branch is available), Jeroen wrote a layer on top of Cecil that mimics the System.Reflection.Emit API (short sketches of both APIs follow at the end of this post). Both wrote a sum-up of their findings, here for Jeroen, and there for Rolf. And both came to the same conclusion: Cecil performs a lot worse than System.Reflection.Emit, both in terms of speed and of memory consumption. Rolf also uses the delay loading branch of Cecil, created by Mainsoft for their CIL to Java bytecode translator.

So the point is that Cecil uses too much memory, and is not fast enough, in those scenarios, while it performs better in the reading-only or reading + manipulating ones. The good news is that, thanks to those hackers, we now have two amazing test cases to work on and to optimize against.

A few weeks ago, as a night hack, I also started a refactoring of Cecil that removes the intermediate structures the current version uses to read and write assemblies. Even if that makes reading and writing a little more complicated, it should save a lot of memory. And who knows, maybe one day, when Cecil is properly optimized and the SRE-on-top-of-Cecil layer is good enough, basing mcs on Cecil will only require a couple of changes.

Anyway, kudos to those hackers, let's see what I can do to catch up. We'll let you know!
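
For those who have never touched Cecil, here is roughly what the reading side looks like. This is a minimal sketch against the Cecil API of that era (the AssemblyFactory entry point); the key point is that the file is parsed directly from disk, so nothing has to be loaded into an AppDomain the way Assembly.Load forces you to with System.Reflection.

    using System;
    using Mono.Cecil;

    class Dump {
        static void Main (string [] args)
        {
            // Parse the assembly straight from the file:
            // no AppDomain, no runtime loading involved.
            AssemblyDefinition assembly = AssemblyFactory.GetAssembly (args [0]);

            foreach (TypeDefinition type in assembly.MainModule.Types)
                foreach (MethodDefinition method in type.Methods)
                    Console.WriteLine ("{0}::{1}", type.FullName, method.Name);
        }
    }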
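
And for comparison, this is the kind of System.Reflection.Emit code a compiler back-end has to produce, and the API Jeroen's layer reproduces on top of Cecil; the sketch below only saves a trivial assembly with one static method. Notice how everything is created through the current AppDomain: that coupling to the running runtime is part of what makes SRE painful to work with for compiler writers.

    using System;
    using System.Reflection;
    using System.Reflection.Emit;

    class EmitDemo {
        static void Main ()
        {
            // Every SRE assembly lives in the current AppDomain.
            AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly (
                new AssemblyName ("Demo"), AssemblyBuilderAccess.Save);
            ModuleBuilder module = assembly.DefineDynamicModule ("Demo", "Demo.dll");

            TypeBuilder type = module.DefineType ("Answer", TypeAttributes.Public);
            MethodBuilder method = type.DefineMethod ("Get",
                MethodAttributes.Public | MethodAttributes.Static,
                typeof (int), Type.EmptyTypes);

            ILGenerator il = method.GetILGenerator ();
            il.Emit (OpCodes.Ldc_I4, 42); // push 42
            il.Emit (OpCodes.Ret);        // and return it

            type.CreateType ();
            assembly.Save ("Demo.dll");
        }
    }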