Think of it like seeing underwater, as an exaggerated example. You only have a certain range of visibility because of disturbances to the photons travelling through the water. You can see very clearly at close range, but far-away objects are faded and hard to make out. The photons are no longer arriving at your eyes: they have been blocked, attenuated and diffracted to the point where you have trouble identifying objects. How do you think technology could improve that vision underwater using light? How can technology reconstruct information that simply isn't there? It is like using a directional microphone to record a bird call and expecting it to also pick up the sound of ants crawling on the ground near the bird. That is physically impossible, because the ambient noise is louder than the sound of the ants. Aerial photography is like having the mic within millimetres of the ants and the bird: you can hear both. Skybox satellite photography at 600 km up, roughly 60 times higher than a jumbo jet at cruising altitude, can only supply the bird.
If Google Earth were to buy Skybox and use their images for capital-city areas, they would be seriously downgrading the resolution, as their current capital-city and built-up-area imagery comes from aerial photography shot from low-altitude planes.
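The altitude argument above can be put in rough numbers. For a simple pinhole-camera model, ground sample distance (metres of ground per image pixel) scales linearly with altitude for the same optics. This is only a back-of-the-envelope sketch; the pixel pitch and focal length below are assumed illustrative values, not the specs of any real Nearmap or Skybox camera (real satellites use much longer focal lengths to claw some resolution back):

```python
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Metres of ground covered by one pixel, simple pinhole model."""
    return altitude_m * pixel_pitch_m / focal_length_m

# Assumed camera for illustration: 5 micron pixels, 0.5 m focal length.
aerial = ground_sample_distance(3_000, 5e-6, 0.5)      # low-altitude survey plane
satellite = ground_sample_distance(600_000, 5e-6, 0.5)  # Skybox-class orbit

print(f"aerial GSD:    {aerial:.3f} m/pixel")   # 0.030 m/pixel
print(f"satellite GSD: {satellite:.1f} m/pixel")  # 6.0 m/pixel
print(f"satellite is {satellite / aerial:.0f}x coarser")  # 200x
```

With identical optics, flying 200 times higher means each pixel covers 200 times more ground, which is the basic physical reason low-altitude aerial imagery of cities stays sharper than anything shot from orbit.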