Very interesting work from the people at MIT on imaging through scattering media. Recently, multiple approaches have been published that take advantage of temporal focusing (TF) to increase excitation efficiency inside scattering media in two-photon microscopy, and this one goes a step further.
Here, the authors use wide-field structured illumination, in combination with TF, to obtain images with a large field-of-view from a small number of camera acquisitions. To do so, they sequentially project a set of random structured patterns using a digital micromirror device (DMD). Combining the picture acquired for each illumination pattern with the point-spread function (PSF) of the imaging system allows them to recover images of different biological samples without the typical scattering blur.
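The flavor of this kind of reconstruction can be sketched with a toy linear model (to be clear, this is not the authors' DEEP algorithm: it is just a generic forward model where each camera frame is the blurred product of a known pattern and the scene, inverted by least squares; the 1-D scene, Gaussian PSF, and pattern count are all made-up assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 32                       # pixels in a toy 1-D "scene"
K = 200                      # number of random illumination patterns
s = rng.random(N)            # ground-truth fluorophore distribution

# Scattering blur modelled as convolution with a known PSF
# (a hypothetical Gaussian kernel here)
psf = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)
psf /= psf.sum()

def blur(v):
    return np.convolve(v, psf, mode="same")

# Each camera frame is the blurred product of a random DMD pattern and the scene
P = rng.integers(0, 2, size=(K, N)).astype(float)   # binary (0/1) patterns
frames = np.array([blur(p * s) for p in P])

# The forward model is linear in s: frame_k = B @ diag(p_k) @ s.
# Stack all frames and solve a single least-squares problem.
B = np.array([blur(e) for e in np.eye(N)]).T        # blur as a matrix
A = np.vstack([B @ np.diag(p) for p in P])          # (K*N, N) stacked system
s_rec, *_ = np.linalg.lstsq(A, frames.ravel(), rcond=None)

print(np.max(np.abs(s_rec - s)))   # noiseless case: near machine precision
```

With noise and real 2-D data the inversion is of course far more involved, but the sketch shows why knowing the patterns and the PSF makes the blurred measurements invertible.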
De-scattering with Excitation Patterning (DEEP) Enables Rapid Wide-field Imaging Through Scattering Media
From multi-photon imaging penetrating millimeters deep through scattering biological tissue, to super-resolution imaging conquering the diffraction limit, optical imaging techniques have greatly advanced in recent years. Notwithstanding, a key unmet challenge in all these imaging techniques is to perform rapid wide-field imaging through a turbid medium. Strategies such as active wave-front correction and multi-photon excitation, both used for deep tissue imaging; or wide-field total-internal-reflection illumination, used for super-resolution imaging; can generate arbitrary excitation patterns over a large field-of-view through or under turbid media. In these cases, the throughput advantage gained by wide-field excitation is lost due to the use of point detection. To address this challenge, here we introduce a novel technique called De-scattering with Excitation Patterning, or ‘DEEP’, which uses patterned excitation followed by wide-field detection with computational imaging. We use two-photon temporal focusing microscopy (TFM) to demonstrate our approach at multiple scattering lengths deep in tissue. Our results suggest that millions of point-scanning measurements could be substituted with tens to hundreds of DEEP measurements with no compromise in image quality.
Last week I posted a recently uploaded paper on using positive-only patterns in a single-pixel imaging system.
Today I found another implementation pursuing the same objective. This time the authors (from the University of Warsaw, led by Rafał Kotyński) introduce the idea of simplexes: any point in an N-dimensional space can be located using only positive coordinates if you choose the correct coordinate system. Cool concept!
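The geometric idea is easy to see in two dimensions: the barycentric coordinates of any point inside a regular simplex (here, an equilateral triangle) are all nonnegative, even when its Cartesian coordinates have mixed signs. A minimal sketch of that fact (the vertex layout and test point are my own arbitrary choices, not taken from the paper):

```python
import numpy as np

# Vertices of a regular 2-simplex (equilateral triangle) centred at the origin
V = np.array([[1.0, 0.0],
              [-0.5, np.sqrt(3) / 2],
              [-0.5, -np.sqrt(3) / 2]])

p = np.array([0.2, -0.3])    # a point with mixed-sign Cartesian coordinates

# Solve for barycentric coordinates c: p = c @ V with sum(c) = 1
A = np.vstack([V.T, np.ones(3)])   # 3 equations, 3 unknowns
b = np.append(p, 1.0)
c = np.linalg.solve(A, b)

print(c)                     # all entries nonnegative
print(np.allclose(c @ V, p)) # → True: the signed point is exactly recovered
```

In the single-pixel context, the payoff is that signed sampling patterns can be re-expressed in such nonnegative coordinates and displayed directly on an intensity-only modulator.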
Single-pixel imaging with sampling distributed over simplex vertices
We propose a method of reduction of experimental noise in single-pixel imaging by expressing the subsets of sampling patterns as linear combinations of vertices of a multidimensional regular simplex. This method also may be directly extended to complementary sampling. The modified measurement matrix contains nonnegative elements with patterns that may be directly displayed on intensity spatial light modulators. The measurement becomes theoretically independent of the ambient illumination, and in practice becomes more robust to the varying conditions of the experiment. We show how the optimal dimension of the simplex depends on the level of measurement noise. We present experimental results of single-pixel imaging using binarized sampling and real-time reconstruction with the Fourier domain regularized inversion method.
A group of researchers working in France and the USA, led by N. Ducros, has uploaded an interesting paper this week.
When doing single-pixel imaging, one of the most important aspects you need to take into account is the kind of structured patterns (functions) you are going to use. This choice is quite relevant because it is closely tied to the acquisition speed you can achieve (the total number of measurements needed to obtain good images strongly depends on the set of functions you choose). Usually, the go-to solution for single-pixel cameras is to choose either random functions or a family of orthogonal functions (Fourier, DCT, Hadamard, etc.).
The problem with random functions is that they are not orthogonal (two different random patterns are strongly correlated with each other), so you usually need to project a large number of them, which is time consuming. Orthogonal functions that belong to a basis are a better choice, because you can send the full basis to get “perfect” quality (i.e., without losing information to undersampling). However, these functions usually take both positive and negative values, which is something you cannot directly implement on many Spatial Light Modulators (for example, on Digital Micromirror Devices). If you want to implement these patterns, there are multiple workarounds. The most common one is to display two complementary patterns sequentially on the SLM to synthesize one signed function. This solves the positive-negative problem, but doubles the time it takes to obtain an image.
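The splitting workaround is simple enough to simulate end to end. A minimal sketch, assuming a simulated bucket detector and a Sylvester-Hadamard basis (the scene and sizes are made up for illustration):

```python
import numpy as np

def hadamard(n):
    # Sylvester construction: n must be a power of two
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
N = 16                       # number of pixels (4x4 image, flattened)
x = rng.random(N)            # ground-truth scene

H = hadamard(N)              # +1/-1 patterns: not directly displayable on a DMD

# Splitting: each +1/-1 pattern becomes two binary (0/1) patterns
P_plus = (H > 0).astype(float)    # keep the +1 entries
P_minus = (H < 0).astype(float)   # keep the -1 entries (displayed as +1)

m_plus = P_plus @ x          # bucket-detector readings, first pattern set
m_minus = P_minus @ x        # readings for the complementary set
m = m_plus - m_minus         # equivalent to the ideal H @ x, at 2x the cost

x_rec = (H.T @ m) / N        # Hadamard matrix is orthogonal: H.T @ H = N * I
print(np.allclose(x_rec, x)) # → True
```

The two pattern sets together cost 2N projections for an N-pixel image, which is exactly the factor-of-two penalty discussed above.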
What Lorente-Mur et al. show in this paper is a method to generate a new family of positive-only patterns, derived from the original positive-negative family. This makes it possible to obtain images with a reduced number of measurements when compared to the dual or splitting approach I mentioned earlier, but still with high quality. Nice way to tackle one of the most limiting factors of single-pixel architectures.
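As a rough illustration of the general idea (not Lorente-Mur et al.'s actual construction), one simple way to trade the 2M split patterns for M + 1 positive ones is to add a common offset to every signed pattern and measure that offset once with a flat pattern:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                       # pixels
M = 8                        # number of target patterns

# Target patterns with negative values (stand-ins for a wavelet basis subset)
T = rng.standard_normal((M, N))
x = rng.random(N)            # scene

# Shift every pattern to be nonnegative and add a single flat pattern:
# M + 1 positive measurements instead of 2 * M
c = -T.min()                 # common offset that makes every pattern >= 0
P = T + c                    # nonnegative, displayable patterns
flat = np.ones(N)

m_pos = P @ x                # M measurements with the shifted patterns
m_flat = flat @ x            # one extra measurement with the flat pattern

m = m_pos - c * m_flat       # recover the signed target measurements
print(np.allclose(m, T @ x)) # → True
```

The paper's generalization is more sophisticated (it designs the positive family and the linear combinations jointly), but the bookkeeping above shows why far fewer than 2M positive projections can suffice to emulate M signed patterns.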
Handling negative patterns for fast single-pixel lifetime imaging
Pattern generalization was proposed recently as an avenue to increase the acquisition speed of single-pixel imaging setups. This approach consists of designing some positive patterns that reproduce the target patterns with negative values through linear combinations. This avoids the typical burden of acquiring the positive and negative parts of each of the target patterns, which doubles the acquisition time. In this study, we consider the generalization of the Daubechies wavelet patterns and compare images reconstructed using our approach and using the regular splitting approach. Overall, the reduction in the number of illumination patterns should facilitate the implementation of compressive hyperspectral lifetime imaging for fluorescence-guided surgery.