Not long ago, I discovered that by covering my camera’s lens with aluminum foil, I could take interesting pictures. Not pictures of actual objects, but pictures of all the pixel errors and thermal noise in my camera’s image chip. I figured that, since I have an iPhone 4S handy, I’d see what darkness looks like according to its camera. Apparently, the abyss is pink and green:
While I was messing around learning amazing things about my camera, I remembered another cool experiment I tried once. I covered the camera’s lens in two layers of aluminum foil to block as much light as possible, then took a picture. As you can imagine, the picture came out black. You are probably now wondering what’s wrong with me, but as always (okay, as usually), there is a method to my madness. For you see, when you take a photo without any external light, the image you get consists entirely of weird in-camera effects: thermal noise from where electrons jiggled loose and made the camera think a photon had hit the image sensor, and, if you’re very lucky, a bright streak where a particle of radiation struck the chip. In my first picture, I didn’t get any cosmic rays, but I got some surprisingly pretty thermal noise:
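The two effects described above (small random counts everywhere, plus the occasional bright streak) are easy to picture as a toy model. This is a sketch, not anything resembling real sensor physics: the frame size, noise level, and streak brightness are all made-up numbers.

```python
import random

random.seed(0)

WIDTH, HEIGHT = 64, 64

# Toy dark frame: with the lens covered, every pixel still reads out a small
# random count -- thermal noise from electrons jiggling loose on their own,
# which the camera can't tell apart from a photon hit.
dark_frame = [[random.gauss(mu=2.0, sigma=1.0) for _ in range(WIDTH)]
              for _ in range(HEIGHT)]

# And, if you're very lucky, a charged particle crosses the chip and
# leaves a bright streak across one row.
if random.random() < 0.05:
    row = random.randrange(HEIGHT)
    for x in range(10, 40):
        dark_frame[row][x] += 200.0

mean_count = sum(map(sum, dark_frame)) / (WIDTH * HEIGHT)
print(round(mean_count, 2))  # hovers near 2.0: noise, not signal
```

Averaging many such frames is, incidentally, how real cameras build the "dark frame" they subtract during long exposures.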
I've always liked things like this. It's like a sensory deprivation chamber, where you can suddenly see all the weird static in your retinas and eventually you start hallucinating. You learn a lot about the way the brain works when you stop giving it any input and see what happens. And you learn cool things about how cameras and their image chips work when you take pictures of nothing at all.
While I was taking a picture of a stinkbug in my room (Don’t judge me.), I noticed that, in closeup shots, my camera could resolve some pretty small details. My camera’s anything but top-of-the-line. It’s a little Sony DSC-W690 point-and-shoot thing. Perfectly good for my needs, but not the kind of thing a real photographer would touch, even on a dare. I got curious, though: just how high-def is my camera? So I did what anybody would do (Okay, maybe not anybody…): I took a closeup shot of the millimeter scale on my ruler:
This is hardly a lab-grade ruler, but I think we can trust the manufacturer to know what a millimeter is. Before I shrank it down to post on the Internet, the millimeter ticks measured 78 pixels across (from the center of one white line to the center of the next). If you take the inverse of 78 pixels per millimeter, you get 0.0128 millimeters per pixel. That’s 12.8 microns per pixel. Although it’s not quite microscope resolution, it still means that a paramecium would show up as a visible blob 5 to 25 pixels across. Which is pretty damn cool.
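The arithmetic here is just an inversion and a unit change. A quick sketch, using the 78-pixels-per-millimeter measurement from the ruler photo:

```python
# Resolution estimate from the ruler photo: 78 pixels from the center
# of one millimeter tick to the center of the next.
pixels_per_mm = 78
mm_per_pixel = 1 / pixels_per_mm          # invert to get size per pixel
microns_per_pixel = mm_per_pixel * 1000   # 1 mm = 1000 microns
print(round(microns_per_pixel, 1))        # 12.8
```

At that scale, anything bigger than a couple dozen microns lands on more than one pixel, which is why a paramecium would show up as a blob instead of vanishing entirely.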
But I thought that in my enthusiastic mania, I might have miscalculated something, so like a good boy, I checked my work. I pulled out one of my hairs (For you, Internet. I pulled out one of my very own hairs for you.) and took another photo with both ruler and hair in frame.
In this picture, I calculated the resolution at 56 microns per pixel. Working from the hair's visible width in pixels, that puts its diameter at around 168 microns, which is about right for a human hair.
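The same arithmetic run the other way. The hair's pixel width isn't stated outright, but 168 divided by 56 implies it spanned about 3 pixels, so that's the assumption in this sketch:

```python
# Hair cross-check: at 56 microns per pixel, a hair spanning 3 pixels
# (the width implied by 168 / 56) works out to:
microns_per_pixel = 56
hair_width_px = 3                              # assumed, not measured here
diameter = hair_width_px * microns_per_pixel
print(diameter)                                # 168 microns
```

Human hair runs very roughly 20 to 180 microns in diameter, so 168 lands comfortably inside that range.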
So now you know: even with a generic little point-and-shoot camera, you can see the microscopic. There are a lot of things about the modern world that suck, but then there are things like this, which are awesome.