"actual image your camera sees" is a term that is hard to define with astrophotography, because it's kinda hard to define with regular digital photography, too.
The sensor collects raw data at each pixel: whatever light makes it past that pixel's color filter excites electrons on that particular pixel, and the resulting charge gets read out and processed on the image processing chip, where each pixel is assigned a color and the values are combined into the final image.
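As a rough sketch of that "one color sample per pixel" idea, here's a deliberately crude Python demosaic of an RGGB Bayer mosaic (the pattern, the 3×3 averaging, and the function name are all illustrative assumptions, not how any particular camera's processor actually works):

```python
import numpy as np

def simple_demosaic(raw: np.ndarray) -> np.ndarray:
    """Very simplified demosaic of an RGGB Bayer mosaic.

    Each sensor pixel sits behind a single-color filter, so the raw frame
    holds only one color value per pixel; the two missing colors are
    estimated here by averaging same-color samples in the 3x3 neighborhood.
    Assumes the frame is at least 2x2.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=float)

    # Masks marking where each color filter sits in an RGGB layout.
    r_mask = np.zeros((h, w), dtype=bool)
    r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool)
    b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    for channel, mask in enumerate((r_mask, g_mask, b_mask)):
        vals = np.where(mask, raw, np.nan)
        # Average the same-color samples in each pixel's 3x3 neighborhood
        # (a crude stand-in for real interpolation).
        padded = np.pad(vals, 1, constant_values=np.nan)
        windows = np.stack([padded[dy:dy + h, dx:dx + w]
                            for dy in range(3) for dx in range(3)])
        rgb[..., channel] = np.nanmean(windows, axis=0)
    return rgb
```

Real demosaicing algorithms are far smarter about edges and color fringing, but the shape of the problem is the same: one color sample per pixel, interpolated into three.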
So what does a camera "see"? It depends on how the lenses and filters in front of that sensor are set up, on how susceptible the sensor is to electrical noise, and on how long it's configured to expose each frame. Many of these sensors are sensitive to a wide range of wavelengths, so the filter determines whether any particular pixel sees red, blue, or green light. Some are configured to filter out all but ultraviolet or infrared wavelengths, at which point the camera can "see" what the human eye cannot.
A long exposure collects light over a longer stretch of time, revealing even very faint sources, at least against a dark sky.
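A back-of-the-envelope way to see why that works (the photon rates below are made up purely for illustration): the collected signal grows linearly with exposure time, while photon shot noise grows only as its square root.

```python
import math

# Made-up photon rates, purely for illustration.
faint_source_rate = 0.5    # photons/second from a faint target
sky_background_rate = 5.0  # photons/second of background glow

for exposure_s in (1, 10, 100, 1000):
    # Signal grows linearly with exposure time; photon (shot) noise grows
    # only as the square root of the total count, so faint sources climb
    # out of the noise on long exposures.
    signal = exposure_s * faint_source_rate
    noise = math.sqrt(exposure_s * (faint_source_rate + sky_background_rate))
    print(f"{exposure_s:5d} s: signal {signal:7.1f}  "
          f"noise {noise:6.1f}  SNR {signal / noise:5.1f}")
```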
There are all sorts of mechanical tricks at that point. Image stabilization tries to keep the focused beams of light steady on the sensor, compensating for movement with an offsetting movement of its own, so that each pixel keeps collecting light from the same direction over the course of the entire exposure. Or, some people rotate the camera along with the celestial subject, the star or planet they're trying to photograph, to compensate for the Earth's rotation over a long exposure.
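To see why that tracking matters, here's a small sketch of how far a star trails across an untracked sensor; the sidereal rate is a real figure (about 15 arcseconds of sky per second of time), but the plate scale in the example is just an assumed value:

```python
import math

SIDEREAL_RATE_ARCSEC_PER_S = 15.04  # apparent sky rotation at the celestial equator

def star_drift_pixels(exposure_s: float, declination_deg: float,
                      plate_scale_arcsec_per_px: float) -> float:
    """How far a star trails across the sensor during an untracked exposure.

    Drift slows toward the celestial poles by cos(declination); the plate
    scale (arcseconds of sky per pixel) depends on the lens and sensor,
    and the value used below is only an example.
    """
    drift_arcsec = (SIDEREAL_RATE_ARCSEC_PER_S * exposure_s
                    * math.cos(math.radians(declination_deg)))
    return drift_arcsec / plate_scale_arcsec_per_px

# Example: 30 s exposure near the celestial equator with a ~2 arcsec/px setup.
print(star_drift_pixels(30, declination_deg=0, plate_scale_arcsec_per_px=2.0))
# -> roughly 226 pixels of trailing unless the mount tracks the sky.
```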
And then there are computational tricks. Just as you might physically move the sensor or lens to compensate for motion, you can instead process the incoming sensor data with the knowledge that a particular subject's light will land on different pixels over time, and add it together in software rather than in the sensor's own charged pixels.
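A minimal sketch of that idea, often called "shift and add" stacking, assuming whole-pixel (integer) offsets that have already been estimated somehow, say by tracking a bright star from frame to frame (real stacking software does sub-pixel registration and interpolation):

```python
import numpy as np

def shift_and_add(frames, offsets):
    """Stack short exposures in software ("shift and add").

    `frames` is a list of 2D arrays and `offsets` the integer (row, col)
    drift of each frame relative to the first. Each frame is shifted back
    into alignment and the aligned frames are summed, the software
    equivalent of keeping the light on the same sensor pixels for the
    whole exposure.
    """
    stacked = np.zeros_like(frames[0], dtype=float)
    for frame, (dy, dx) in zip(frames, offsets):
        # np.roll is a crude whole-pixel shift; real tools interpolate.
        stacked += np.roll(frame, shift=(-dy, -dx), axis=(0, 1))
    return stacked
```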
So astrophotography is just an extension of what normal photography already does: filtering out the wavelengths you don't want and processing the data that hits the sensor. It just takes a lot more thought and configuration of those filters and processing algorithms than the defaults baked into a typical phone's camera app and hardware.