Optogenetics For GaN LEDs And CMOS Sensors
Imaging the brain of a mouse with arrays of 470 nm LEDs and silicon pixels.
RESEARCHERS at Nara Institute of Science and Technology, Japan, claim that they have built the first integrated optical neural stimulation and observation device incorporating an LED and a CMOS image sensor. This device could aid researchers in the field of optogenetics, which involves the use of light to alter the behaviour of cells.
It is possible to build a similar system with an avalanche photodiode array rather than a CMOS sensor, an approach that has been adopted by engineers at the University of Strathclyde, UK. According to lead author Takashi Tokuda from Nara Institute of Science and Technology, one advantage of the avalanche photodiode array is that it can deliver high-speed detection, which is essential for time-resolved fluorescence measurements. But he adds that this type of detector is unsuitable for on-chip imaging of biological cells and tissues, because each of the photodiodes has dimensions of the order of 10 μm.
“The resolution of a conventional CMOS image sensor can be as small as 1-2 μm,” says Tokuda, who admits that he and his co-workers have yet to shrink their pixels to such small dimensions.
The team builds its neural interface device by flip-chip bonding an LED-on-sapphire array to a CMOS image sensor. Thanks to very low levels of absorption of visible light in the sapphire and nitride layers of the LED, it is possible to place samples, such as a slice of brain, on the backside of the substrate. The neural interface device is formed by combining an array of 470 nm LEDs with a 128 by 268 array of detector pixels, each 15 μm by 7.5 μm. This has been used to image a slice of brain taken from a mouse.
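The pixel counts and dimensions quoted above imply a roughly 2 mm by 2 mm imaging area. A quick back-of-the-envelope sketch (assuming the pixels tile edge-to-edge with no gaps, and that the 15 μm dimension runs along the 128-pixel axis, neither of which is stated in the article):

```python
# Rough imaging area of the 128 x 268 pixel array described in the article.
# Assumes gap-free tiling and an arbitrary pairing of pitch with axis.
ROWS, COLS = 128, 268        # pixel counts quoted in the article
PITCH_ROW_UM = 15.0          # assumed pitch along the 128-pixel axis (um)
PITCH_COL_UM = 7.5           # assumed pitch along the 268-pixel axis (um)

height_mm = ROWS * PITCH_ROW_UM / 1000.0
width_mm = COLS * PITCH_COL_UM / 1000.0

print(f"Imaging area: {height_mm:.2f} mm x {width_mm:.2f} mm")
# -> Imaging area: 1.92 mm x 2.01 mm
```

On these assumptions the asymmetric 15 μm by 7.5 μm pixel gives a near-square field of view, large enough to cover a substantial region of a mouse brain slice.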
This work is still in its infancy, and Tokuda admits that there is much to do before he and his co-workers can acquire high-quality images. Realising such images will require an on-chip filter to distinguish fluorescence emission from light scattered from the excitation source. In addition, detector sensitivity must be improved so that very small changes in intensity can be measured, and the instrument needs to provide a higher spatial resolution, which will require reductions in pixel size and in the distance between the cells and the target.
Tokuda and his co-workers will try to tackle many of these issues. Their goals for the future include shrinking the size of their LEDs and improving image performance.
T. Tokuda et al., Electronics Letters 48, 312 (2012)