When working with ground-based telescope data, as opposed to satellite data, it is necessary to remove the distortion caused by the Earth's atmosphere. As a result, when you go from the raw images to the reconstructed images, the ones with the distortions removed, the same pixel no longer corresponds to the same structure across your little time series of images. For any serious analysis, the images have to be shifted with respect to each other so that the same pixel shows the same structure throughout the time series.
You can see in this gif what it looks like when you observe through a ground-based telescope. This particular set of data was taken in 2007 at the Dunn Solar Telescope, located in New Mexico.
You can see how the atmospheric distortions blur and shift the images.
At the beginning of my research career, I wrote a code that calculates this shift in Fourier space. (A special mathematical construct, very handy if one has to work with frequencies. It is used widely in science and engineering.)
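This is not the original code, but the core idea, finding the shift between two images as the peak of their Fourier cross-correlation, can be sketched in NumPy like this (image sizes and offsets here are made up for illustration):

```python
import numpy as np

def fourier_shift(ref, img):
    """Estimate the integer (row, col) shift between two images.

    A pure translation in image space becomes a phase ramp in Fourier
    space, so the inverse transform of the cross-power spectrum peaks
    at the offset between the two frames.
    """
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    corr = np.fft.ifft2(cross)
    # The correlation peak sits at the (wrapped) shift.
    peak = np.array(np.unravel_index(np.argmax(np.abs(corr)), corr.shape),
                    dtype=float)
    size = np.array(corr.shape)
    # Shifts past half the image size wrap around; map them to negatives.
    wrap = peak > size // 2
    peak[wrap] -= size[wrap]
    return peak

ref = np.random.rand(64, 64)
img = np.roll(ref, (3, -5), axis=(0, 1))  # fake "drifted" frame
estimated = fourier_shift(ref, img)       # recovers the (3, -5) offset
```

Real pipelines refine this to sub-pixel precision (for instance by fitting the correlation peak), but the integer version above shows the principle.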
Now, all I had to do was rewrite the same code in Python. It took me a day to do so. But then the code ended up calculating such big shifts that I knew something was wrong.
In coding, the worst kind of errors are the ones where your code runs, but gives the wrong result.
I started debugging, trying to find out what was wrong with the code, and despairing because everything seemed to be as it was supposed to be. But the result did not match the expected shift of the images.
This careful debugging took me almost a week.
In the end, I remembered that the original code had a built-in limitation. It worked perfectly only on images from the photospheric or lower chromospheric layers of the solar atmosphere. If one tries to coalign anything else, it gives wrong shifts.
So I went back to the SDO data repository. The AIA instrument takes images in several spectral lines covering the whole solar atmosphere, from the photosphere to the corona. And the images are taken almost simultaneously, definitely faster than the solar rotation shift. All I needed was to take images of the photosphere or lower chromosphere and use them to calculate the shift.
And that worked. That set of images gave a perfect shift, exactly the amount I would expect from solar rotation. The catch was that those images were sampled at a lower temporal resolution, requiring me to interpolate the shift to the higher temporal resolution of the AIA 171 images.
Again, this problem was solved with two lines of Python. Handy.
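Something like this, with `np.interp` doing the heavy lifting (the cadences and shift values below are invented for illustration, not the actual AIA numbers):

```python
import numpy as np

# Hypothetical example: shifts measured on the lower-cadence channel,
# interpolated onto the denser time grid of the AIA 171 series.
t_low = np.arange(0, 120, 24)                      # measurement times (s)
shifts_low = np.array([0.0, 1.1, 2.3, 3.4, 4.6])   # measured shifts (px)
t_high = np.arange(0, 120, 12)                     # AIA 171 frame times (s)

# The actual "two lines": linear interpolation of the shift curve.
shifts_high = np.interp(t_high, t_low, shifts_low)
```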
So now I had my nicely coaligned set of images, and I could start detecting the sound in them.
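Applying the measured shifts to put every frame back into register can be sketched as below. This uses integer shifts with `np.roll` for simplicity; applying sub-pixel shifts would need interpolation (for example `scipy.ndimage.shift`), and the frame data and offsets here are stand-ins:

```python
import numpy as np

frames = np.random.rand(5, 32, 32)                   # stand-in time series
offsets = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2)]   # measured (dy, dx) drifts

# Roll each frame back by its own offset so the same pixel shows the
# same structure throughout the series.
aligned = np.stack([
    np.roll(frame, (-dy, -dx), axis=(0, 1))
    for frame, (dy, dx) in zip(frames, offsets)
])
```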
Remember the view through the telescope in the gif above? Well, that data set, after speckle image reconstruction (which removes the atmospheric disturbance) and alignment of the resulting images, gives this.
What you see is the Sun’s photospheric layer boiling. Yep, the Sun boils. The small bubbles all over the images are convection cells, where hot plasma rises from the interior, cools down, and falls back into the hot interior of the Sun.
One of those bubbles is approximately the size of Spain, or for US-bound readers, twice the size of Oregon. The layer you see is the photosphere, the first layer of the Sun we can see. Although the Sun is basically a ball of hot gas and does not have a surface the way the Earth does, the photospheric layer is often referred to as the ‘surface’ of the Sun.