Updated: Feb 22, 2019
Our eyes can only see a tiny part of the electromagnetic spectrum: light with a wavelength of roughly 400 nm to 700 nm. You might be surprised to find out that your camera can pick up much more than this; its sensitivity looks something like:
This is determined by the band gap of the doped-silicon detector, and you can see that it actually peaks in the near-infrared (NIR, ~700 - 2000 nm) and even extends a little into the ultraviolet (UV, ~10 - 400 nm).
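As a quick sanity check on that sensitivity curve, we can work out the longest wavelength silicon can detect from its band gap: a photon needs at least the gap energy to excite an electron, so the cutoff is λ = hc / E_g. The numbers below are standard physical constants and the textbook room-temperature band gap of silicon; this is a back-of-the-envelope sketch, not a spec from any sensor datasheet.

```python
# Longest wavelength a silicon sensor can detect, from its band gap.
# A photon must carry E >= E_g to excite an electron across the gap,
# so the cutoff wavelength is lambda = h*c / E_g.
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electronvolt

E_g = 1.12 * eV  # band gap of silicon at room temperature, ~1.12 eV

cutoff_nm = h * c / E_g * 1e9
print(f"silicon cutoff: {cutoff_nm:.0f} nm")  # ~1100 nm
```

That puts the cutoff at roughly 1100 nm, which is why the sensitivity curve dies off well before the far end of the NIR band.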
Manufacturers have spent a long time trying to suppress this, and for good reason: otherwise the colours would be completely wrong. They usually do this by placing a filter in front of the sensor that blocks UV and NIR light from passing through. But if you want to unleash the full power of the camera, that's not going to stop anyone with a little determination and a set of screwdrivers.
In this project I want to focus on photography in the NIR. It's far simpler than UV photography, and in my opinion there is a little more science to be seen.
The first thing to choose is the camera; I went with a Sony NEX-5, which can be had for around £50 and has a huge APS-C sized sensor. I also already have a bunch of E-mount lenses, but these can be pretty cheap on eBay.
The filter is a NIR long-pass type, meaning it blocks visible light and lets NIR through; you can see its transmittance, which I measured in a spectrometer. These cost around £10 and come in a range of cut-off wavelengths. Mine is 760 nm, but the exact value doesn't make much difference as features in NIR spectra tend to be broad.
Now the big task is to take apart the camera to get to the sensor. Unfortunately the sensor is buried right in the centre of the camera, which doesn't make it easy. But go slow and lay out all the parts so that they can be put back together in exactly the same order:
If you have been careful then you should get to something that looks like this:
This is the main sensor. It has a reddish sheen to it, which comes from the stack of filters we need to get rid of. Unfortunately, Sony don't make this easy either: the filters are attached to a piezo shaker, which is easy to remove, but they are also tightly glued to the frame with some kind of hard foam in between. This means that the filter has to be literally cut out with a scalpel.
The sensor can then be left exposed, or you can buy replacement filters which leave the bare sensor covered but let NIR light pass through. I went with the former as these filters cost more than my camera.
The filter taken out, usually called the 'hot mirror', is pretty interesting. It's a sandwich of two dichroic mirrors, meaning that if you are a photon of visible light then to you it looks like a clear piece of glass. But if you are a photon of NIR or UV light then it will look like a mirror, hence the reddish sheen:
After the filter is removed, the camera can be painstakingly put back together and with a bit of luck it still turns on. Next you can slap on the NIR-pass filter and see what the world looks like!
This actually shows the first use of NIR photography; spots and other skin blemishes are all the same "colour" beyond 700 nm so you can have perfect skin with #nofilter, apart from the NIR one.
But you might be thinking that this is just an expensive and over-complicated way to get black and white photos. Well, there are some objects that say otherwise; click the arrow on the right to see the difference between visible and NIR:
Some things appear black in the visible but become white or completely transparent in the NIR, and vice versa, which seems completely alien. The white spot on the phone is the 940 nm laser used to measure proximity; the £10 note is printed in different black inks to prevent fraud; the chemical is a strongly absorbing dye I had lying around, but it only absorbs in the visible; and finally the filters are NIR-pass, along with the original one removed from the camera.
A strange one that occurs naturally is chlorophyll. In the visible it absorbs strongly in the red and blue and reflects green, which is why most plants are green. But chlorophyll does not absorb in the NIR, so most plants appear white with this camera:
Interestingly enough, this is essentially the same technique that NASA uses to monitor the health of forests and plants in general from space.
Because the colours are so different you can also add some crazy colour grading, although mine is not quite as cool as JR Korpa's shot at the top.
You can also see that the sky appears much darker in NIR. This is because Rayleigh scattering, which usually makes the sky blue, scales with the inverse fourth power of the wavelength. So at longer NIR wavelengths there is much less light scattered.
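The inverse-fourth-power law makes the effect easy to quantify. Taking 450 nm as a typical blue wavelength and 850 nm as a typical wavelength passed by the filter (both round numbers of my choosing):

```python
# Rayleigh scattering intensity scales as 1/lambda^4, so the sky is far
# dimmer through a NIR filter than it is in blue light.
blue_nm = 450
nir_nm = 850

# Scattering at 850 nm relative to 450 nm
ratio = (blue_nm / nir_nm) ** 4
print(f"sky at 850 nm scatters ~{ratio:.2f}x as much light as at 450 nm")
```

So the NIR sky scatters less than a tenth of the light the blue sky does, which matches how dark it looks in the photos.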
You've probably heard of infrared imaging before as a way of measuring the temperature of objects. This works because anything above absolute zero emits thermal radiation. Sadly, for things around room temperature this peaks at around 10,000 nm, so it can't be picked up by this camera. However, if something is heated above roughly 300 degrees then it starts to emit a small amount of NIR light. Here's a picture of a soldering iron:
There is absolutely no visible light here, and even the NIR picture took a 20 second exposure.
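Those temperature figures can be sanity-checked with Wien's displacement law, which gives the peak wavelength of a black body's emission as λ_peak = b / T. The temperatures below are just the round numbers from the text:

```python
# Wien's displacement law: peak wavelength of thermal emission.
# lambda_peak = b / T, with Wien's constant b ~ 2.898e-3 m K.
b = 2.898e-3  # Wien's displacement constant, m K

room_peak_nm = b / 293.0 * 1e9  # 20 C: ~9,900 nm, far beyond silicon's reach
iron_peak_nm = b / 573.0 * 1e9  # 300 C: peak is still mid-IR, but the
                                # short-wavelength tail of the emission
                                # curve now reaches into the NIR

print(f"room temperature peak: {room_peak_nm:.0f} nm")
print(f"soldering iron peak:   {iron_peak_nm:.0f} nm")
```

Even at 300 degrees the peak is nowhere near the camera's ~1100 nm cutoff; only the faint short-wavelength tail of the emission lands in the NIR, which is why a 20 second exposure was needed.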