Frankencamera 4: A Heterogeneous Platform for Computational Photography

Computational photography is rapidly breaking out of the laboratory and into the wild, from panoramas and high-dynamic-range imaging in smartphones to light-field cameras and wearable devices. However, these algorithms typically run on a CPU or GPU and consume large amounts of energy, which is at a premium on mobile devices.

Conversely, fixed-function hardware such as conventional image signal processors (ISPs) consumes relatively little energy, but its lack of programmability makes it almost useless for computational photography and computer vision applications. Unfortunately, developing custom hardware, and the software that drives it, is extremely time-consuming and expensive.

In this project, we’re building a programmable camera to explore answers to these challenges.

For the hardware, we’ve coupled a Xilinx Zynq FPGA SoC with an NVIDIA Tegra X1. The Zynq provides reconfigurable logic fabric for accelerating custom applications, while the Tegra provides a convenient platform for CPU- and GPU-based image processing. The Tegra also drives a responsive user interface via an HDMI touchscreen. On the front end, we have a Micro 4/3 lens mount and a 1080p image sensor, with plans to upgrade to a 10MP sensor in the future.

Frankencamera hardware

Our hardware currently works as a collection of parts on a desk, and we are in the process of assembling a complete physical prototype.

Frankencamera hardware

The name “Frankencamera” is inherited from previous iterations of the programmable camera project, which were named for their appearance as disfigured collections of camera parts. Those iterations focused on fine-grained control of the image sensor and peripherals; this iteration extends that control to the image processing itself.

To make this hardware useful, we’ve created a system that can generate FPGA bitstreams from Halide image processing pipelines, which is described in more detail on the Halide to Heterogeneous Computing project page.
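To give a sense of the workload involved, here is a minimal sketch in plain C++ of the kind of local stencil stage such a pipeline contains (a hypothetical 3x3 box blur; this is an illustration of the computation pattern, not the project's actual Halide code, which would express the same stage as a short `Func` definition and compile it to line-buffered FPGA logic):

```cpp
#include <cstdint>
#include <vector>

// A 3x3 box blur over an 8-bit grayscale image: a local stencil of the
// sort that maps well onto a line-buffered hardware pipeline.
// Image edges are handled by clamping coordinates.
std::vector<uint8_t> boxBlur3x3(const std::vector<uint8_t>& in,
                                int width, int height) {
    std::vector<uint8_t> out(in.size());
    auto clamp = [](int v, int lo, int hi) {
        return v < lo ? lo : (v > hi ? hi : v);
    };
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int sum = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    sum += in[clamp(y + dy, 0, height - 1) * width +
                              clamp(x + dx, 0, width - 1)];
            out[y * width + x] = static_cast<uint8_t>(sum / 9);
        }
    }
    return out;
}
```

Because each output pixel depends only on a small, fixed window of input pixels, stages like this can stream through reconfigurable logic with a few line buffers instead of round-tripping full frames through DRAM, which is where the energy savings over a CPU or GPU come from.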

On top of this, we’re building an API for controlling the camera’s processing pipeline, somewhat akin to the FCam API from the previous Frankencamera project, or to the Android camera2 interface, which evolved from it.
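To give a flavor of that style of control, here is a hypothetical sketch in the spirit of FCam's per-frame requests. All names and fields here (`Shot`, `exposureUs`, `pipeline`, `Camera::queue`) are illustrative assumptions, not the project's actual API:

```cpp
#include <cstdint>
#include <deque>
#include <string>

// Hypothetical per-frame capture request, loosely modeled on
// FCam::Shot and camera2's CaptureRequest. Field names are invented.
struct Shot {
    uint32_t exposureUs = 10000;         // exposure time in microseconds
    float    gain       = 1.0f;          // analog sensor gain
    std::string pipeline = "passthrough"; // which processing pipeline to apply
};

// A camera that consumes a queue of shots, one per sensor frame, so
// settings can change deterministically from frame to frame.
class Camera {
public:
    void queue(const Shot& s) { pending_.push_back(s); }
    bool hasPending() const { return !pending_.empty(); }
    Shot nextShot() {
        Shot s = pending_.front();
        pending_.pop_front();
        return s;
    }
private:
    std::deque<Shot> pending_;
};
```

The appeal of this request-queue model, as demonstrated by FCam, is that an application can enqueue a burst of shots with different exposures (for example, for HDR merging) and know exactly which settings applied to each returned frame.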

We were awarded “Best Student Demo” at the Center for Future Architectures Research (CFAR) semi-annual meeting in May 2016.