
It's worth noting that stars can be seen even in daylight with the right software.

That's because, while no individual star is visible, the exact angles between all stars are fixed, so you can do a brute-force search over all possible orientations of the sky to find the matching one. In a 6-megapixel image that is half blue sky, you have about 3 million sky pixels constraining just 3 orientation unknowns, an oversampling ratio of roughly a million to one. So even with very bright sunlight obscuring the stars to the naked eye, your algorithm will pick them out.
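
A toy sketch of that brute force in Python (my own illustration, using a flat 2-D "sky" in place of real celestial coordinates; a real solver would search RA/Dec/roll against a star catalog and then refine):

    import numpy as np

    rng = np.random.default_rng(0)
    H = W = 256
    catalog = rng.uniform(0, 1, size=(300, 2))  # toy star catalog on a unit square

    def render(dx, dy, roll):
        # Project the catalog under one candidate orientation (toy 2-D model).
        c, s = np.cos(roll), np.sin(roll)
        xy = (catalog @ np.array([[c, -s], [s, c]]).T + [dx, dy]) % 1.0
        img = np.zeros((H, W))
        px = (xy * [W, H]).astype(int)
        img[px[:, 1], px[:, 0]] = 1.0
        return img

    # "Daytime" frame: each star adds only 0.5 sigma per pixel, so no star
    # is visible pixel by pixel (negative per-pixel SNR).
    truth = (0.30, 0.15, 0.20)
    observed = 0.5 * render(*truth) + rng.normal(0.0, 1.0, (H, W))

    # Brute-force grid over orientations; the correct cell wins clearly.
    steps = np.linspace(0.0, 0.95, 20)
    score, best = max(
        (float(np.sum(render(dx, dy, r) * observed)), (dx, dy, r))
        for dx in steps for dy in steps for r in steps
    )
    print("best", best, "score %.1f" % score, "truth", truth)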



Very interesting! Any links for further reading?


Nope - years ago I accidentally discovered this while trying to align star images for stacking. Some of my images were taken in daylight, and I was surprised to find my rudimentary image aligner still worked just fine. Never wrote it up into a blog post.


Do you mind expanding a bit more? I don't quite follow. Even with the oversampling you describe, that would only apply after you know the locations of the stars. Also, how can you brute force every possible right ascension/declination/rotation? And without a calibrated camera, how do you account for lens distortions? Thanks


In my case, I had nighttime images from the same camera, so all the calibration was already done... I was just looking for a rotated version of the nighttime image in the daytime image.

But even in the general case, smallish image patches don't have enough distortion to matter, still have plenty of oversampling, and carry only 3 unknowns. You can probably use some FFT trickery to reduce it to 1 unknown, and some kind of hill climbing to further reduce the search effort.
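
The "FFT trickery" might be phase correlation: one FFT-based cross-correlation recovers both translation components in closed form, leaving only the rotation angle to scan. A minimal sketch (standard technique, my illustration):

    import numpy as np

    def phase_correlate(a, b):
        # Peak of the whitened cross-power spectrum gives the cyclic
        # (dy, dx) shift of a relative to b.
        F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
        F /= np.abs(F) + 1e-12          # keep phase only
        corr = np.fft.ifft2(F).real
        return np.unravel_index(np.argmax(corr), corr.shape)

    rng = np.random.default_rng(1)
    a = rng.normal(size=(128, 128))
    b = np.roll(a, (17, -5), axis=(0, 1))   # b is a, cyclically shifted
    print(phase_correlate(b, a))            # -> (17, 123), i.e. (17, -5 mod 128)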


What does an optimization algorithm have to do with a brute force search?

Even if your image patch is small, without previous images to compare against, don't you need to brute force the entire sky?

And how can "fft trickery" reduce the attitude state to 1 unknown ? 1 unknown with what units ?

Sorry, but I suspect you are making all this up.


The claims from pifm_guy are reasonable.

This is a scenario with extremely high coding gain [1]. The DC value of the sky can be removed by high-pass filtering. Finding the rigid transform (rotation + translation) between two images captured by the same camera, assuming no distortion [2] and substantial overlap between the pair, doesn't require calibration.

Dropping the "same camera" and "substantial intersection" assumptions requires searching the entire sky and also finding an unknown scaling factor. It all comes down to the images having low enough noise and high enough resolution, relative to the search space. The math is sound.

You're not entitled to a tutorial in signal processing, though.

[1] https://en.wikipedia.org/wiki/Coding_gain

[2] Astronomy lenses have very small FOV and very small distortion.


I’m not trying to call you guys out, just curious for more information. I have enough signal-processing knowledge to be dangerous, as it’s quite adjacent to my job, but this isn’t clicking for me.

Assuming we’re not including cameras outside the visible light spectrum, are there cameras that actually have enough dynamic range to capture a recoverable amount of information about stars against a bright sky? Beyond that, the original commenter talks about oversampling, but oversampling of what exactly? Intuition says that even with a high-resolution sensor (like a modern iPhone’s) the visual size of a star would still be less than a single pixel. If my intuition there is correct, aren’t we spatially undersampling? The shutter is open for some period of time, so I guess you could say we’re temporally oversampling, but then we’re applying the same gain to both the stars and the sky, so that doesn’t seem too helpful.

I feel like a high-pass filter of a daylight sky would leave you with next to nothing save some of the sensor’s inherent noise. Are you discussing exquisite sensors or optics here? Genuinely curious. I tried some basic Google searches on recovering stars in daytime images but didn’t get much back.


About planets and stars in daytime: https://www.skysurfer.eu/daystars.php

The concept here is processing gain, which is algorithmic. It's what enables the miracle of GPS: even though the satellites are 20,000 km away and there may be clouds, rain, etc., your phone doesn't need a satellite dish. It's also used in CDMA radio for civilian and military communications, where it allows reliable transmission at negative SNR by spreading the message over a huge space. In the case of GPS/radio, that space is frequency. Here, the message is the coordinates of a rectangular crop of the sky, encoded in the (huge) space of a few million pixels.

On a per-pixel basis the SNR can be negative: looking at a single pixel, I can't tell you with confidence whether it contains a star, because the shade of blue is just too close to average. But correlating the entire image against the correct "code" (a clean star field) tells me whether I got the location in the sky right. Intuitively: accumulated over the entire image, the noise statistics are such that the noise stays bounded while the tiny amount of signal adds up.
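
A back-of-the-envelope simulation of that accumulation (the numbers are made up, chosen only to show the scaling):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 2000                    # pixels in the frame that fall on catalog stars
    star, sigma = 0.2, 1.0      # per-pixel SNR 0.2 (about -14 dB): invisible

    frame = star + rng.normal(0.0, sigma, n)
    stat = frame.sum()          # correlate against the correct "code" (all ones)
    print("signal ~", n * star, "noise std ~ %.0f" % np.sqrt(n), "stat = %.0f" % stat)
    # Signal grows like n (~400) while the noise std grows like sqrt(n) (~45),
    # a ~9 sigma detection from pixels that individually look like plain sky.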

Appropriate optics will increase the SNR so that bright stars can actually become visible. Per the link above, magnitude 2-3 stars are visible in daytime through a 10 cm telescope (looking through the eyepiece; no filters or digital sensors needed). So if the conditions are right and the optics are good, you may actually see a few stars reliably, and the scenario reduces to the B-21 hangar scenario.

Finally, it's worth noting that even a consumer-grade CMOS sensor will sample at 12 bits per pixel (4096 linear levels) with low noise. The human eye cannot discriminate a luminance difference of 1/4096 of full scale, not even close. Using the naked eye for intuition here is misleading, because this is a scenario where a sensor is far better at registering the small additive contribution of a star against daylight, and that is what enables the processing gain through math.


> oversampling, but oversampling of what exactly?

If you have 3 million pixels of sky, and 3 variables to find, then you have oversampled by 1 million times.

In the ideal noise-free world with no other unknowns, any 3 of those pixels would be enough to solve the problem perfectly.
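
In least-squares terms (a generic illustration, not the actual star-matching model), m noisy equations in 3 unknowns pin the answer down to an error of about 1/sqrt(m):

    import numpy as np

    rng = np.random.default_rng(3)
    m, truth = 1_000_000, np.array([0.3, -1.2, 0.7])  # many equations, 3 unknowns
    A = rng.normal(size=(m, 3))
    y = A @ truth + rng.normal(size=m)                # unit noise on every sample
    est, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(est)  # matches truth to ~0.001 despite the heavy per-sample noise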


If the "DC value of the sky" could be removed with a simple high pass filter, then there would be no need of doing anything else. The positions of the stars in the sky would be known, and techniques like the one in the article would apply.

I have been doing fine so far working on star trackers for spacecraft without your tutorial in signal processing, but thanks for the offer


> If the "DC value of the sky" could be removed with a simple high pass filter, then there would be no need of doing anything else. The positions of the stars in the sky would be known, and techniques like the one in the article would apply.

This doesn't make any sense. Even if one could estimate the ground-truth DC, removing it would still leave noise in every other frequency bin.

But this is beside the point. Your questioning isn't coming from a place of curiosity.

I direct you to the HN guidelines: "Be kind. Don't be snarky. Have curious conversation; don't cross-examine."


You were the one suggesting "removing the DC value of the sky with high-pass filtering". Anyway, I'm done. Have a nice day.


It might be helpful if you read about what plate solving is: https://en.wikipedia.org/wiki/Astrometric_solving

It's a well-defined problem: take the star positions in an image and pinpoint the image's celestial coordinates. There's even a term for being able to do that without knowing anything about the camera, telescope, or distortion: blind solving. Plenty of tools to do it are available online, and they're an essential part of an astrophotographer's toolkit.
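
If you want to try blind solving yourself, the nova.astrometry.net web service can be driven from Python via astroquery, roughly like this (a sketch; the API key and filename are placeholders):

    from astroquery.astrometry_net import AstrometryNet

    ast = AstrometryNet()
    ast.api_key = "YOUR_API_KEY"    # free key from nova.astrometry.net
    # Upload the image; no camera, lens, or pointing metadata supplied.
    wcs_header = ast.solve_from_image("sky_photo.jpg", force_image_upload=True)
    if wcs_header:
        # CRVAL1/CRVAL2 are the solved RA/Dec (degrees) of the reference pixel.
        print(wcs_header["CRVAL1"], wcs_header["CRVAL2"])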

Fun fact: most of our ICBMs rely on plate solving to navigate, so they can hit their targets even when GPS signals are disrupted.


I don't understand what they are saying, but it's definitely true that, if you are careful, you can do this: look up the locations of the top-magnitude stars, point your camera at one of them, take a long exposure (~15 seconds, so you need a way to keep the moving star at exactly the same pixel over the whole exposure), do some contrast scaling, and you will see single pixels lit up in the right locations.

If you don't already know a star's location, I'm sure you could construct a very sensitive camera, apply some noise reduction, and do this with short exposures, without any rotation tracking.
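
The contrast scaling can be as simple as subtracting the sky level and stretching what's left; combined with stacking tracked short exposures it looks roughly like this (my sketch, assuming the frames are already aligned):

    import numpy as np

    def stack_and_stretch(frames):
        # Mean-stack aligned frames (noise drops roughly as sqrt(N)), subtract
        # the median sky level, then stretch the residual to full scale.
        stacked = np.mean(frames, axis=0)
        residual = stacked - np.median(stacked)
        return np.clip(residual / (residual.max() + 1e-12), 0.0, 1.0)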


GPS works like that too: the GPS signal is so weak that it's below the thermal noise floor when it arrives at the receiver, but you know each satellite's unique signal pattern, so you just search for every satellite at every code offset in the noise and see if you get a correlation (a fix).
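
A toy version of that acquisition search (a random +/-1 sequence stands in for a real C/A code; real receivers also have to search Doppler bins):

    import numpy as np

    rng = np.random.default_rng(4)
    code = rng.choice([-1.0, 1.0], size=1023)  # stand-in for a satellite's PRN code
    delay = 137
    rx = 0.3 * np.roll(code, delay) + rng.normal(0.0, 1.0, 1023)  # ~ -10 dB SNR

    # Check every code phase at once with an FFT-based circular correlation.
    corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(code))).real
    print(np.argmax(corr))   # -> 137: the code phase pops out of the noise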


How long does this take? Could this be run on, say, a smartphone to calculate its rough position from a photo of the sky?


I've never considered that, but it makes perfect sense. Might try to code something up given some free time.



