Human infra-red thermal vision
Monday, 15 September, 2008
Infra-red (IR) vision refers to the sensation and visual integration of light with wavelengths between 750 nm and 1 mm. In contrast, the human eye can only detect light with wavelengths from 400 nm (violet) to 700 nm (red). It would be useful to detect infra-red light because although only very hot objects emit visible light (hence the terms ‘red hot’ and ‘white hot’), even modestly warm objects emit infra-red light. For example, the human body emits light peaking at a wavelength of around 10 microns (10,000 nm), and if one could see these wavelengths, one would have the ability to see humans against a colder, room-temperature background. Much like Predator.
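That 10-micron figure can be sanity-checked with Wien's displacement law, which gives the peak emission wavelength of a warm body. Here's a rough sketch in Python (the constant is the standard Wien value; the temperatures are my own illustrative choices):

```python
# Wien's displacement law: peak black-body emission wavelength is
# lambda_peak = b / T, with b ~ 2.898e-3 m.K (Wien's constant).
WIEN_B = 2.898e-3  # m.K

def peak_wavelength_nm(temp_kelvin: float) -> float:
    """Peak black-body emission wavelength, in nanometres."""
    return WIEN_B / temp_kelvin * 1e9

body = peak_wavelength_nm(310.0)  # human body, ~37 deg C
room = peak_wavelength_nm(293.0)  # room temperature, ~20 deg C
print(f"body peak: {body:.0f} nm, room peak: {room:.0f} nm")
```

Both come out just under 10,000 nm, which is why the figure of "about 10 microns" is the one usually quoted for body heat.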
Biological IR vision has many problems, not the least of which is that no visual pigment exists that can pick up infra-red light. There are many pigments that can detect ultraviolet (UV) light: many birds, reptiles, fish and even some mammals (and insects, but their eyes are sufficiently different to be irrelevant here) have visual pigments that absorb maximally at around 350-360 nm, but can still detect light with a wavelength of 280 nm, well into the UV spectrum. It appears to be relatively easy to evolve UV vision, but IR vision is much harder. As far as I know, the deep-sea dragonfish, a fish that lives deep underwater where almost no light penetrates, has a porphyropsin that can detect into the near-infrared (these fish also produce their own deep red light, allowing them to see while no other fish can – not unlike man-made night-vision goggles), but even that pigment can barely detect photons with wavelengths of 750 nm. Just barely into the infra-red, and certainly far short of the 10,000 nm required for heat vision.
The animals most famous for their infra-red ‘heat vision’ are the pit vipers (Crotalinae), though boas and pythons (Boidae) have also independently evolved a more primitive version. They detect thermal radiation with wavelengths from 400 nm (i.e. violet) all the way to 10,600 nm, but they do so not with their eyes, but with loreal pits (in Crotalinae) or labial pits/scales (in Boidae). These are small indentations around the mouth – which in the Crotalinae are shaped very much like a pinhole camera – and they detect heat, not infra-red photons. In that sense, they are really just specialised structures that sense heat in the same way we humans can if we hold our hands over an open fire, and so probably utilise a heat-gated ion channel like the vanilloid receptor channel activated in human nerves by heat, pain, rubbing alcohol and chilli peppers. That said, the sensory information from these thermoreceptive pits (or scales) does integrate into the optic tectum, so it is very likely that the snakes are able to see the heat map of their world visually. But it is likely to have low spatial and temporal resolution, and would be very short-sighted (a few metres at most).
So, what is the issue with infra-red photoreceptors? Why don’t they exist in nature already? Well, I’ve come up with four issues, only one of which may be easily solved (or may not be a problem in the first place).
First, immediately in front of the photoreceptors in the eye of all vertebrates lie a few centimetres of water-based vitreous and aqueous fluid. Infra-red radiation is much lower in energy than visible light (remember that the energy of a photon is inversely proportional to its wavelength), and water is very good at absorbing infra-red light. Therefore the eye would not actually be transparent to infra-red light, and may absorb a substantial fraction of the photons (van der Berg, 1997) before they hit the retina. It might not be so bad, though, because at the low energies around 10,000 nm, the light may even be too weak to be absorbed by water.
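To put numbers on that inverse relationship between wavelength and photon energy, here's a quick Python sketch using the standard Planck relation E = hc/λ (the constants are textbook values; the two wavelengths are my own illustrative picks):

```python
# Photon energy E = h*c / lambda: inversely proportional to wavelength.
H = 6.626e-34   # Planck's constant, J.s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon, in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / EV

print(photon_energy_ev(700.0))    # red light: ~1.77 eV
print(photon_energy_ev(10000.0))  # body-heat infra-red: ~0.12 eV
```

A body-heat photon carries roughly fourteen times less energy than a red one, which is what makes both the absorption and the pigment-stability problems below so awkward.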
Second, neither of the two photo-active molecules in vertebrate photoreceptors, 11-cis-retinal (A1) or 11-cis-3,4-dehydroretinal (A2), would be stable if it absorbed at this wavelength. In order to activate these pigments, energy must be supplied either from the light or from thermal energy (Ala-Laurila et al, 2004). The longer the wavelength of light, the less energy its photons have, so the more thermal energy must contribute to the activation energy. Thermal energy, however, can activate the photoreceptor by itself – one of the reasons why cold-blooded creatures are theoretically more sensitive to single photons (because a single-photon event is indistinguishable from random thermal activation, and cold creatures have less thermal activity) (Aho et al, 1988). If we were trying to set up a molecule that would activate from an infra-red photon, even one with a much shorter wavelength than 10 microns, we’d have a molecule that would activate very easily due to thermal noise. In other words, we’d have a lot of ‘static’ in a thermal visual system – so much so that I doubt we’d be able to see anything at all.
The third problem is the neural circuitry that would be required to process another colour. The retina has colour-opponent ganglion cells, which feed into the colour-sensitive parvocellular layers of the lateral geniculate nucleus, which in turn stimulate sets of neurons called blob cells in the visual cortex (Dacey, D.M., 1996). These systems are responsible for processing colour differences (for example, the difference between green and blue). There is a rare syndrome known as cerebral achromatopsia, in which a person loses colour vision (seeing the world in shades of grey) despite fully functional cone photoreception, because of lesions to the parts of the brain responsible for this colour discrimination (Barbur et al, 1994). That said, it may be that an undamaged brain would be able to adapt to accommodate changes to colour-related inputs, even during adulthood (Neitz et al, 2002).
The last problem, and perhaps the greatest with allowing humans to visualise thermal emissions, is that the eye itself is an emission source. The eye is covered in warm blood vessels, each emitting at the same wavelengths as the other humans we would like to detect with our thermal vision. Thermal signals from behind the photoreceptors could perhaps be shielded, allowing the photoreceptor cells to detect only signals from directly in front of them, but even the aqueous and vitreous humours and corneal cells emit thermal signals. The internal signal would still likely be greater than that reaching the photoreceptors from external emission sources (like another person). Pythons and pit vipers are cold-blooded, so they don’t have this problem. But to a warm-blooded human, it would be like trying to look out into the night from inside a brightly-lit room.
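A rough Stefan-Boltzmann comparison makes the point: tissue at eye temperature radiates at least as strongly per unit area as the skin of the person we are trying to see, and it sits millimetres from the retina rather than metres away. The temperatures below are my own illustrative assumptions:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def emitted_flux(temp_k: float) -> float:
    """Total black-body flux per unit area, sigma * T^4."""
    return SIGMA * temp_k ** 4

eye = emitted_flux(310.0)     # the eye's own warm tissue, ~37 deg C
target = emitted_flux(306.0)  # another person's skin, a few degrees cooler
print(f"eye: {eye:.0f} W/m^2, target: {target:.0f} W/m^2")
```

The two fluxes are within a few percent of each other, but the eye's own emission is right on top of the photoreceptors while the target's signal is diluted by distance – hence the brightly-lit room.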
Therefore, I can only see two options for thermal vision in humans. One, we could have separate eyes mounted on cold antennae; or two, we could use thermal-vision goggles. Considering that the latter option is already in use, I’d say that method is the easiest to follow – one day, we may even integrate the output from such goggles into our visual cortex. Until then, despite its popularity as a possible enhancement, biologically integrated infra-red thermal vision in humans will have to wait.