Like many others I’m drooling over the new images from the JWST, e.g. this one:
https://webbtelescope.org/contents/media/images/2022/034/01G7DA5ADA2WDSK1JJPQ0PTG4A
So I thought it would be fun to try to align it with one of the old Hubble images:
https://hubblesite.org/contents/media/images/2009/25/2606-Image.html
I skimmed through a related thread and might give this a try over the weekend, but since I know very little about image processing I can’t even guess where this task lies on a scale from trivial to impossible. The full-size images are very large: 150 and 29 megapixels. Any experts want to chip in with their gut feeling? Is this feasible at all?
Not my domain, but I recall the Celeste project doing that or something similar at a pretty impressive scale.
I can’t say anything about the capability of existing packages as I have no familiarity with them, but the problem itself is entirely feasible, at least as far as the sizes of the images go. Image registration lends itself naturally to coarse-to-fine approaches and you can use random sampling techniques for the finer scales to cut down on computations. The biggest difficulties are usually in finding a good similarity measure and choosing an appropriate deformation model. If the model is too rigid you might not be able to register the images well and if it’s too flexible you might not get a robust estimate of its parameters. Given the nature of these images I suppose the deformation should be possible to model globally with rather few parameters, which helps a lot.
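To make the coarse-to-fine idea concrete, here is a minimal sketch in Julia of estimating a pure translation over an image pyramid. It is illustrative only: the function names, the use of restrict from Images.jl for downsampling, the mean-squared-difference similarity measure, and the pyramid depth and search radius are all my own choices, and I assume fixed and moving are plain Float64 matrices of the same size.

```julia
using Images  # restrict: ~2x downsampling with antialiasing

# Build a pyramid and return it coarsest-first.
function build_pyramid(img, levels)
    pyr = [img]
    for _ in 2:levels
        push!(pyr, restrict(pyr[end]))
    end
    return reverse(pyr)
end

# Mean squared difference over the overlap for an integer shift (dy, dx),
# i.e. comparing fixed[y, x] against moving[y - dy, x - dx].
function msd(fixed, moving, dy, dx)
    h, w = size(fixed)
    ys = max(1, 1 + dy):min(h, h + dy)
    xs = max(1, 1 + dx):min(w, w + dx)
    (isempty(ys) || isempty(xs)) && return Inf
    return sum(abs2, fixed[ys, xs] .- moving[ys .- dy, xs .- dx]) / (length(ys) * length(xs))
end

# Coarse-to-fine: brute-force a small window at each level, doubling the
# running estimate when moving on to the next (finer) level.
function coarse_to_fine_shift(fixed, moving; levels = 4, radius = 4)
    dy, dx = 0, 0
    for (f, m) in zip(build_pyramid(fixed, levels), build_pyramid(moving, levels))
        dy, dx = 2dy, 2dx
        best, bdy, bdx = Inf, dy, dx
        for ddy in -radius:radius, ddx in -radius:radius
            c = msd(f, m, dy + ddy, dx + ddx)
            if c < best
                best, bdy, bdx = c, dy + ddy, dx + ddx
            end
        end
        dy, dx = bdy, bdx
    end
    return dy, dx
end
```

A real registration would of course use a richer deformation model and a better similarity measure than this, but the skeleton of "estimate at the coarsest level, then refine" stays the same.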
Shouldn’t be too bad. I haven’t done any of this work in Julia before, but you could get a pretty simple solution with only a little bit of manual effort.
First, you’ll need to identify matched points in both images. The natural candidates in astronomy are point sources like unresolved, unsaturated stars. So pick a handful of stars, and get the (x, y) pixel coordinates for each star in both images.
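For example (the numbers below are placeholders just to show the shape of the data, not real measurements):

```julia
# One column per star; row 1 is x, row 2 is y, in pixel coordinates.
# Hubble image coordinates:
X = [ 512.3  1044.7  2981.0   733.5  1890.2
     1830.2   220.1  1502.8  2740.9   960.4]
# JWST image coordinates of the same stars, in the same order:
Y = [1490.6  3012.4  8677.2  2140.3  5488.0
     5314.0   655.8  4381.5  7973.1  2801.7]
```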
You’ll now have two 2-by-N matrices (one per image), where N is the number of chosen stars and each column holds the (x, y) coordinates of one star. You’re now looking for a transformation that takes one of those coordinate matrices and maps it to the coordinates in the other image. For a quick solution, you could just do an affine transform of the form Y = A * X .+ b, where A is a 2-by-2 matrix, b is a two-element translation vector, and X and Y are the coordinate matrices for your two images.
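In plain Julia that fit is essentially a one-liner once you append a row of ones to X, so A and b come out of a single least-squares solve. This is just one way to set it up, assuming X and Y are the 2-by-N matrices above and you’ve picked at least three non-collinear stars:

```julia
# Solve Y ≈ [A b] * [X; 1] in the least-squares sense.
Xaug = vcat(X, ones(1, size(X, 2)))   # 3×N: coordinates with a row of ones appended
M    = Y / Xaug                        # 2×3 least-squares solution (right division)
A    = M[:, 1:2]                       # 2×2 linear part (rotation/scale/shear)
b    = M[:, 3]                         # length-2 translation
Yhat = A * X .+ b                      # predicted positions of the X stars in the Y image
residuals = Yhat .- Y                  # worth eyeballing before warping anything
```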
For some off-the-shelf Julia implementations, you could use MultivariateStats to solve the linear regression problem of finding the affine transformation. The CoordinateTransformations package seems to play nicely with Images.jl. And if you wanted to really go off the deep end into astronomical image analysis, you could look at Photometry.jl to extract and measure point sources.
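Here is a hedged sketch of how those pieces might fit together; the image variable names are placeholders, and it’s worth double-checking the warp convention against the docs, since ImageTransformations expects the map from output coordinates back to input coordinates, and Julia images are indexed (row, column) rather than (x, y):

```julia
using CoordinateTransformations, ImageTransformations

# AffineMap(A, b) represents p -> A*p + b, with A and b estimated as above.
tform = AffineMap(A, b)

# warp fills each output pixel by looking up the input image through the map,
# so depending on which direction you fit, you may need inv(tform) here.
# hubble_img and jwst_img are placeholder names for the two loaded images.
aligned = warp(hubble_img, inv(tform), axes(jwst_img))
```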
For what it’s worth, a more robust solution would take into account the non-linear mapping between the two coordinate systems. But that might be beyond the work of a single weekend.
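If you do want to push past affine, one cheap step up (an assumption on my part, not the "right" astrometric model) is a low-order polynomial mapping fitted with the same least-squares machinery; you’d want at least six well-spread stars for the quadratic version:

```julia
# Quadratic basis in the source coordinates: [1, x, y, xy, x², y²].
function poly2_design(X)
    x, y = X[1, :], X[2, :]
    return hcat(ones(length(x)), x, y, x .* y, x .^ 2, y .^ 2)   # N×6
end

fit_poly2(X, Y)   = poly2_design(X) \ permutedims(Y)   # 6×2 coefficient matrix
apply_poly2(C, X) = permutedims(poly2_design(X) * C)   # mapped points, back to 2×N
```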