Not all instant cameras are chemical, though. For years we've had tiny, battery-powered Bluetooth printers that can simulate the instant sensation by connecting to your phone. While this is technically brilliant, there are a few things that bug me about the arrangement:
Admittedly this could all be fixed with some duct tape and enabling auto-print somehow. But for a modern instant camera, I think we need to go in a different direction.
My first thought was to invest in a thermal receipt printer. In addition to being a small, low-quality printer, it has that perfectly satisfying method of extruding its printed goodness in the same fashion as a motorized Polaroid. I may still pursue this method (after all, it would just take a Raspberry Pi and half an hour's work) but my enthusiasm waned when I found out many other people had already had the same idea.
The concept of an instant camera for the modern human, digital yet retro, technologically advanced yet at the same time rather pathetic at producing pictures, has rattled around my brain for years now. Something spurred me into rethinking this lately, and I realized with the receipt printer we were perhaps aiming too high. There's a step further we could (and should) take.
Picture, if you will, a computer terminal from before the age of digital displays. Its only inputs and outputs are UART messages to a mainframe, a keyboard, and perhaps a daisywheel printer. The printer can only manage the basic ASCII charset. Now imagine if that serial data to the printer happened to describe a scene in not quite a thousand words, but in ASCII art.
The whole thing could be made portable, like an unwieldy box camera to be operated only from a tripod, its innards mostly the print mechanism and presumably, somewhere amongst the bits, a CCD. We pose for the scene and the flashbulb fires, and then we wait, as line-by-line the daisywheel types out its glorious message onto not-quite-premium paper, a picture in an 80-character wide terminal.
Technical details: The software in the link above is quite simple. First it looks at an image of the Courier New font and works out the average pixel brightness for each character. Then it builds a lookup table which maps 256 input shades of grey to the nearest character in terms of brightness. It's a very patchy table, since only a handful of distinct brightnesses exist among the printable characters. Once generated, the table could be cached, but I left the generation code in for educational reasons.
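To make the idea concrete, here's a minimal Python sketch of the table-building step. The brightness values below are made up for illustration; the real script measures them from a rendered image of the font.

```python
# Hypothetical per-character average brightness, 0.0 (black ink everywhere)
# to 1.0 (blank cell). The real script measures these from Courier New.
ramp = {
    '@': 0.25, '#': 0.30, '8': 0.40, 'o': 0.55,
    ':': 0.75, '.': 0.88, ' ': 1.00,
}

# Map each of the 256 grey levels to the character whose measured
# brightness is nearest. The result is "patchy": long runs of the same
# character with abrupt jumps between them.
table = [
    min(ramp, key=lambda c: abs(ramp[c] - g / 255))
    for g in range(256)
]
```

Because only seven characters cover the whole 0–255 range here, most grey levels collapse onto the same few glyphs, which is exactly the patchiness described above.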
The rest of the script jumps through the hoops needed to request camera access, scales the video feed down to 80 by 32 pixels (it assumes an 80-character-wide terminal, a video feed aspect ratio of 4:3, and a font character aspect ratio of 15:8) then squirts each pixel through the lookup table to get the right character.
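A rough Python sketch of that conversion step, assuming the frame arrives as a 2D list of grey values (0–255). Nearest-neighbour sampling stands in for whatever scaling the real script does, and a simple dark-to-light character ramp stands in for the measured table; `ascii_frame` and `RAMP` are my names, not the script's.

```python
# Stand-in for the measured brightness table: dark glyphs first.
RAMP = "@%#*+=-:. "

def ascii_frame(pixels, cols=80, rows=32):
    """Resample a greyscale frame (2D list of 0-255 values) down to a
    cols x rows character grid and map each sample through the ramp."""
    src_h, src_w = len(pixels), len(pixels[0])
    out = []
    for r in range(rows):
        y = r * src_h // rows  # nearest-neighbour row pick
        out.append(''.join(
            RAMP[pixels[y][c * src_w // cols] * (len(RAMP) - 1) // 255]
            for c in range(cols)
        ))
    return out
```

Feeding it a 160x120 frame (4:3, matching the assumed video aspect) yields 32 lines of 80 characters each, ready to print or pipe to a terminal.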
It could do with a better form of auto-contrast, since so few of the possible shades can be represented by ASCII characters. The next step is to have this program email you the output, or better yet, send it to a printer.
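One cheap improvement would be a linear contrast stretch applied to each frame before the lookup, so the available glyphs cover the scene's actual range of greys rather than the full 0–255 span. A hypothetical sketch (`stretch` is my name for it):

```python
def stretch(pixels):
    """Naive auto-contrast: linearly remap the frame's grey levels so
    its darkest pixel becomes 0 and its brightest becomes 255."""
    flat = [g for row in pixels for g in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:  # flat frame: nothing to stretch
        return pixels
    return [[(g - lo) * 255 // (hi - lo) for g in row] for row in pixels]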