When discussing TTFX and latency, we often talk about which compiled methods of a package were invalidated once another package was loaded. But to find invalidations this way, you need the “right invalidator” at hand: a package that actually defines the subtypes or new methods that throw a wrench into the compiled code. Some of your code may be vulnerable without you knowing it, simply because nothing has invalidated it yet.
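For context, the reactive workflow today looks roughly like this with SnoopCompileCore/SnoopCompile; `SomePkg` is a placeholder for whatever package you suspect of being an invalidator, and in newer SnoopCompile versions the macro is called `@snoop_invalidations` instead of `@snoopr`:

```julia
using SnoopCompileCore

# Record invalidations while loading the suspected "invalidator".
invalidations = @snoopr using SomePkg

using SnoopCompile

# Group the raw invalidation list into trees rooted at the method
# definitions that caused them, sorted by number of affected children.
trees = invalidation_trees(invalidations)
```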
As far as I know, the main reasons for invalidation are world splitting, where compiled code is optimized under the assumption that only a small number of applicable types exists in the current world, and the definition of new methods for existing functions that previously had only a few applicable ones.
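To make these mechanisms concrete, here is a minimal toy reproduction (all names made up, run in a fresh session): a function gets compiled while only one applicable method exists, and is then invalidated by a new subtype with its own method:

```julia
using SnoopCompileCore

abstract type MyAbstract end
struct A <: MyAbstract end
describe(::A) = "A"

# The call `describe(v[1])` only has the abstract type MyAbstract, but
# with a single applicable method in the world, the compiler can
# devirtualize it anyway.
h(v::Vector{MyAbstract}) = describe(v[1])
h(MyAbstract[A()])

# A new subtype on its own adds no methods to existing functions...
struct B <: MyAbstract end

# ...but a new applicable method makes the earlier optimization stale.
invalidations = @snoopr begin
    describe(::B) = "B"
end

using SnoopCompile
invalidation_trees(invalidations)  # should list h among the victims
```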
So what I’m wondering is whether there’s a way to query the compile cache of a package for which functions were compiled with each of those assumptions in effect. That would make it possible to compute some kind of score for how likely a package is to be invalidated. Ideally, one could group the compiled methods by the supertypes they are vulnerable to, and sort by highest impact.
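I don’t know whether that information is actually recoverable from the cache, but as a very crude approximation of the kind of query I mean, one can already walk a module’s compiled MethodInstances today, e.g. with MethodAnalysis.jl. The function `count_vulnerable` below is hypothetical, and matching on signatures is only a rough proxy for what the compiler really assumed:

```julia
using MethodAnalysis  # provides `visit` to walk the method table

# Hypothetical sketch: count compiled MethodInstances in `mod` whose
# signature has a non-concrete argument intersecting `T`. This only
# looks at top-level argument types; the real answer would have to
# come from the compiler's own cache metadata.
function count_vulnerable(mod::Module, T::Type)
    n = 0
    visit(mod) do item
        if item isa Core.MethodInstance
            sig = item.specTypes
            if sig isa DataType  # skip UnionAll signatures for simplicity
                for p in sig.parameters[2:end]  # drop typeof(f)
                    if p isa Type && !isconcretetype(p) &&
                            typeintersect(p, T) !== Union{}
                        n += 1
                        break
                    end
                end
            end
            return false  # nothing to descend into below a MethodInstance
        end
        return true  # keep descending: Module -> Function -> Method -> MI
    end
    return n
end

# e.g. count_vulnerable(SomePkg, AbstractString)
```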
For example, I imagine something like: “this package compiled 53 methods under the assumption that AbstractString has only the subtypes String and SubString, and is therefore prone to invalidation through the definition of another subtype of AbstractString.”
With such information, the fight against latency and invalidations would be much more manageable, because you wouldn’t have to wait for problems to crop up; you could proactively analyze what kind of problems to expect. If such a report returned no problems, you could be reasonably sure you’re fine. And if the vulnerable supertypes are your own, and you don’t intend for anybody to add more subtypes, you could ignore them.