
DOI 10.24412/cl-37136-2023-1-181-185

NEUROMORPHIC LOCALIZATION MICROSCOPY FOR CELL BIOLOGY

ROHIT MANGALWEDHEKAR 1,2 AND DEEPAK NAIR 1

1 Centre for Neuroscience, Indian Institute of Science, India
2 Department of Atomic and Molecular Physics, Manipal Academy of Higher Education, India


ABSTRACT

Neuromorphic cameras have emerged as a novel sensor technology, inspired by dynamic vision, that works on the principle of detecting intensity changes as discrete events. These events are sampled asynchronously and independently at each pixel, resulting in sparse measurements. This inherent feature makes neuromorphic cameras ideal for imaging dynamic processes. By leveraging the unique properties of event-based sensors, we reconstruct event streams into images with temporal scales as low as 100 microseconds. We capitalize on the asynchronous recording of ON and OFF polarities, corresponding to increasing and decreasing intensity changes, respectively. This allows us to study the temporal variations of fluorescence emissions from single fluorescent particles, capturing their response to changes in excitation intensities. Moreover, we exploit the distinct characteristics of event-based sensors for precise localization of individual particles. By analyzing the ON and OFF processes independently, we achieve localization precision within the range of 10 nm. Additionally, through mathematical combinations of these independent processes from the same object, we surpass the diffraction limit and achieve sub-10 nm precision, pushing the boundaries of resolution. This white paper demonstrates the power of neuromorphic cameras in capturing dynamic processes and advancing the field of nanoscale imaging.

MAIN

In addition to its indispensable role in biological research, optical microscopy remains invaluable in the realms of materials science and nanotechnology. By enabling detailed observations and analyses, this technique contributes to advancements in various fields and continues to shape our understanding of the microscopic world. However, the resolution of the images is limited by the diffraction criterion, given by Abbe to be around 200 nm. While this resolution is sufficient for many applications at the cellular level, several applications concerning biomolecular interactions and organization require resolutions on the order of a few nanometers. Super-resolution microscopy encompasses a diverse range of advanced optical imaging techniques that surpass the diffraction limit of light, enabling the visualization and analysis of nanoscale structures and processes. The key principle underlying these techniques involves acquiring sparse data, either spatially or temporally, from fluorescently labelled samples, thus allowing individual resolution-limited molecules to be imaged. By circumventing the limitations imposed by diffraction, super-resolution microscopy opens up new avenues for investigating the intricate details of biological samples and other materials at unprecedented levels of spatial resolution [1].

While most microscopy techniques have been aided by state-of-the-art detectors and cameras such as electron-multiplying charge-coupled devices (EMCCD) and scientific complementary metal-oxide-semiconductor (sCMOS) sensors, these detectors do not actively create sparsity in the data [2]. Neuromorphic cameras, also known as event cameras or silicon retinas, revolutionize the field of light sensing by drawing inspiration from dynamic vision principles [3, 4]. Unlike conventional frame-based cameras, neuromorphic cameras asynchronously record intensity changes as spikes, with each pixel independent of the others in the visual field. Each pixel autonomously samples and detects events, resulting in two types of events: ON events (positive polarity) representing increases in light intensity, and OFF events (negative polarity) indicating decreases in intensity. Notably, pixels that perceive no change in intensity do not generate any data, minimizing the camera's digital footprint [3, 4]. The sparse nature of measurements obtained from neuromorphic cameras makes them particularly well-suited for imaging dynamic processes characterized by sporadic events, such as the stochastic emission of individual molecules [2].

The unique detection paradigm of event cameras sets them apart from conventional cameras, such as EMCCDs, in terms of data output. Instead of capturing a stream of images at a fixed frame rate, event cameras provide data in the form of arrays or tuples containing timestamps, pixel coordinates, and polarity information. These sparse and asynchronous events represent changes in intensity, with ON events denoted by 1 and OFF events by 0. To reconstruct frames from the event data, a Computer Vision-based algorithm is employed on the Python platform. The algorithm accumulates events of different polarities into separate channel frames within user-defined temporal windows. One notable advantage of this post-acquisition image reconstruction approach is its flexibility in analyzing phenomena at different time scales. By reconstructing frames at varying time windows, researchers can gain insights into the dynamics of the phenomenon under investigation. Furthermore, the frame reconstruction process allows for flexibility in defining unequal time windows or even operating in the event regime, where each frame represents an arbitrary number of events and may not align in the time domain. This versatility enables researchers to explore and analyze the event data from different perspectives, enhancing our understanding of dynamic processes captured by neuromorphic cameras [2, 4].

Immobilized 100 nm single fluorescent beads were illuminated by a 647 nm laser coupled with an acousto-optic tunable filter from Roper Scientific. The imaging process was controlled using MetaMorph software in streaming mode, with exposure times of 25 ms, 50 ms, 100 ms, and 200 ms. Fluorescence signals were captured using a highly sensitive EMCCD camera and a neuromorphic camera (DAVIS346, Inivation) using the DVS software. Freely diffusing beads in aqueous solution were also imaged using the same setup [2]. These cameras save continuously acquired data in the form of asynchronous event streams, typically in *.aedat4 or *.raw file formats depending on the camera model. To visualize and analyze the events, image reconstruction is performed. In this study, we developed Python 3.7 code using the Computer Vision library (commonly known as OpenCV or cv2) to reconstruct the event stream data. The reconstruction of events into images is achieved by accumulating the events within a specified time or event window [2]. The accumulation or integration of polarities occurs by visualizing the ON events in the green channel and the OFF events in the red channel. It is important to note that when there are no intensity fluctuations, no measurements are recorded. Unlike standard cameras, where the frame rate is predetermined, neuromorphic cameras allow the user to choose the frame rate post-acquisition. This flexibility enables the study of ensemble or single-particle fluorescence with a wide range of temporal resolutions [2].
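The reconstruction code itself is not reproduced here; the listing below is a minimal sketch of the accumulation step just described, assuming the event stream has already been decoded from the *.aedat4 or *.raw file into a NumPy array with columns (timestamp in microseconds, x, y, polarity). The function name reconstruct_frames, the array layout, and the default 260 × 346 sensor shape (that of the DAVIS346) are illustrative choices, not the original implementation.

import numpy as np
import cv2  # OpenCV (cv2), used here only to write the reconstructed frames to disk

def reconstruct_frames(events, sensor_shape=(260, 346), window_us=5000):
    # Accumulate (timestamp_us, x, y, polarity) events into BGR frames:
    # ON events (polarity 1) go into the green channel and OFF events
    # (polarity 0) into the red channel, within consecutive time windows
    # of window_us microseconds. The (N, 4) array layout is an assumption.
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    t0 = t.min()
    n_windows = int((t.max() - t0) // window_us) + 1
    frames = []
    for k in range(n_windows):
        in_window = (t >= t0 + k * window_us) & (t < t0 + (k + 1) * window_us)
        frame = np.zeros((*sensor_shape, 3), dtype=np.uint16)
        on = in_window & (p == 1)
        off = in_window & (p == 0)
        # np.add.at accumulates repeated events landing on the same pixel
        np.add.at(frame[:, :, 1], (y[on], x[on]), 1)   # green channel: ON
        np.add.at(frame[:, :, 2], (y[off], x[off]), 1) # red channel (BGR): OFF
        frames.append(frame)
    return frames

# example usage: write the frames as 16-bit PNGs for inspection
# for i, f in enumerate(reconstruct_frames(events, window_us=5000)):
#     cv2.imwrite("frame_%05d.png" % i, f)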

Figure 1: Schematic of the workflow showing spatial (above) and temporal analysis.

The image reconstruction process involved converting the recorded neuromorphic events into image frames at a rate of 5 ms per frame. The resulting time series clearly reveals the periodic nature of both the ON and OFF events, exhibiting a phase difference that accurately represents the true nature of the events. The minimal overlap observed between the two sets of events highlights the low-latency capabilities of neuromorphic cameras, further confirming their suitability for high-speed dynamic imaging applications [2].

To visualize the net polarity change occurring within a single particle over a specific time frame, the ON and OFF channels were summed together. This representation effectively demonstrates the overall polarity change exhibited by a single particle within the given time interval. Furthermore, the net polarities were cumulatively summed in order to gain insight into the nature of the integrated intensity of the fluorescence. In all three cases, the Fast Fourier Transform confirmed that the fluctuations in fluorescence were synchronous with the laser intensity [2].
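A sketch of this temporal analysis, under the same assumed event-array layout as above, is given below. The net polarity is taken here as the ON count minus the OFF count per window, and the FFT peak is used to check synchrony with the excitation modulation; this is an illustrative reading of the analysis, not the original code.

import numpy as np

def polarity_time_series(roi_events, window_us=5000):
    # Per-window ON and OFF event counts for one particle, assuming
    # roi_events uses the hypothetical (timestamp_us, x, y, polarity)
    # layout above and is already cropped to the particle's pixels.
    t = roi_events[:, 0]
    p = roi_events[:, 3]
    edges = np.arange(t.min(), t.max() + window_us, window_us)
    on, _ = np.histogram(t[p == 1], bins=edges)
    off, _ = np.histogram(t[p == 0], bins=edges)
    return on, off

def dominant_frequency_hz(signal, window_us=5000):
    # Peak of the magnitude spectrum, ignoring the DC component.
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=window_us * 1e-6)
    return freqs[1 + np.argmax(spectrum[1:])]

# on, off = polarity_time_series(roi_events)
# net = on - off                 # net polarity per window
# integrated = np.cumsum(net)    # proxy for the integrated fluorescence intensity
# print(dominant_frequency_hz(net))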

Figure 2: Distributions of ON and OFF localizations and their combinations by subtraction and multiplication; panels show <P_ON>, <P_OFF>, |<P_ON> - <P_OFF>| and |<P_ON> × <P_OFF>| (from [2]).

By reconstructing images from the neuromorphic events and analyzing the resulting time series, we gain valuable insights into the dynamic behavior of the captured phenomena. This approach not only showcases the power of neuromorphic cameras in high-speed imaging but also allows for a more comprehensive understanding of the underlying processes taking place at a microscopic level [2].

The localization precisions of the immobilized single fluorescent particles were calculated by localizing the particles using DeepTrack, a deep-learning-based centroid tracking algorithm, and using a wavelet algorithm. The localization was performed on ON and OFF events independently. Furthermore, we were also able to show that mathematically combining the distributions of ON and OFF localizations further improves the localization precision. The localization precisions achieved are given in Table 1 [2].

Table 1: Localization precisions [2]

Localization precision    ON       OFF      ON-OFF    ON×OFF
X (nm)                    18.63    21.57    8.69      13.2
Y (nm)                    17.8     21.65    22.54     11.47
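The exact combination procedure is detailed in [2]; the sketch below illustrates one plausible reading, in which time-averaged ON and OFF event-count images of the same particle (hypothetical on_img and off_img arrays) are combined pixel-wise by subtraction and multiplication before localization, as in Figure 2. The intensity-weighted centroid, frame_pairs, and pixel_size_nm are placeholders standing in for the DeepTrack and wavelet pipeline used in the paper.

import numpy as np

def centroid(img):
    # Intensity-weighted centroid; a simple stand-in for the DeepTrack /
    # wavelet localizers used in the paper.
    img = np.clip(np.asarray(img, dtype=float), 0, None)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

def combined_localizations(on_img, off_img):
    # Localize one particle on the ON image, the OFF image, and their
    # pixel-wise combinations |ON - OFF| and ON x OFF (cf. Figure 2).
    on_img = np.asarray(on_img, dtype=float)
    off_img = np.asarray(off_img, dtype=float)
    return {
        "ON": centroid(on_img),
        "OFF": centroid(off_img),
        "ON-OFF": centroid(np.abs(on_img - off_img)),
        "ONxOFF": centroid(on_img * off_img),
    }

# the precision quoted in Table 1 is then the spread of repeated localizations:
# locs = np.array([combined_localizations(a, b)["ONxOFF"] for a, b in frame_pairs])
# precision_x_nm, precision_y_nm = locs.std(axis=0) * pixel_size_nm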

Once the fidelity of the camera in the temporal and spatial regimes was confirmed, diffusing single particles were tracked, thereby allowing us to study phenomena varying in the spatiotemporal paradigm. The event data was accumulated into various time windows and the single particles were tracked across this range of time periods. A panel of 5 single particles representing various amounts of fractalization depending on the accumulation time is shown in Figure 3 [2].
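As a rough illustration of this step, the sketch below links per-frame localizations into trajectories by greedy nearest-neighbour association. The actual tracking in the study uses DeepTrack, so this generic linker, its max_step parameter, and the assumed input format are illustrative only.

import numpy as np

def link_nearest_neighbour(localizations, max_step=5.0):
    # localizations: list of (M_k, 2) arrays of (x, y) positions, one per
    # reconstructed frame; max_step is the largest allowed displacement
    # (in pixels) between consecutive frames.
    tracks = [[tuple(p)] for p in localizations[0]]
    for frame in localizations[1:]:
        frame = np.asarray(frame, dtype=float)
        unused = list(range(len(frame)))
        for track in tracks:
            if not unused:
                break
            last = np.asarray(track[-1])
            dists = np.linalg.norm(frame[unused] - last, axis=1)
            j = int(np.argmin(dists))
            if dists[j] <= max_step:
                track.append(tuple(frame[unused[j]]))
                del unused[j]
        # detections left unmatched start new tracks
        tracks.extend([tuple(frame[j])] for j in unused)
    return tracks

# example: trajectories at a 10 ms accumulation window
# trajectories = link_nearest_neighbour(per_frame_localizations_10ms)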

Figure 3: A panel of 5 particles representing increasing fractalization with decreasing accumulation time; trajectories are shown for accumulation windows of 10 ms, 20 ms, 25 ms, 50 ms and 100 ms (from [2]).

The unique capabilities of neuromorphic cameras enable them to effectively capture transient events, including the disappearance of fluorescence, making them a powerful tool for surpassing the diffraction limit. In our study, we have demonstrated that by combining the occurrence of ON/OFF events rather than analyzing them separately, we can achieve localization precision below the diffraction limit [2]. The localization precision of diffraction-limited single fluorescent particles relies on the number of photons (N) emitted per unit time (σ ∝ N^(-1/2)). Nonlinear detection methods such as neuromorphic cameras allow the localization of independent ON and OFF events at a higher precision, allowing these measurements to achieve the super-Heisenberg limit (σ ∝ N^(-Δ/2), where Δ > 2) [2].
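To make the two scaling laws concrete, the short calculation below compares them for a hypothetical hundred-fold increase in detected photons; the proportionality constants are set to one, so only the relative improvement is meaningful.

import numpy as np

N = np.array([1e2, 1e4])          # hypothetical photon counts
sigma_standard = N ** (-1 / 2)    # sigma proportional to N^(-1/2)
delta = 2.0                       # boundary of the Delta > 2 regime, chosen for illustration
sigma_super = N ** (-delta / 2)   # sigma proportional to N^(-Delta/2)

# a 100x increase in N tightens the standard estimate 10x ...
print(sigma_standard[0] / sigma_standard[1])   # 10.0
# ... but the Delta = 2 estimate 100x
print(sigma_super[0] / sigma_super[1])         # 100.0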

REFERENCES

[1] Hell SW, Sahl SJ, Bates M, Zhuang X, Heintzmann R, Booth MJ, Bewersdorf J, Shtengel G, Hess H, Tinnefeld P, Honigmann A. The 2015 super-resolution microscopy roadmap. Journal of Physics D: Applied Physics. 2015 Oct 14;48(44):443001.

[2] Mangalwedhekar R, Singh N, Thakur CS, Seelamantula CS, Jose M, Nair D. Achieving nanoscale precision using neuromorphic localization microscopy. Nature Nanotechnology. 2023 Jan 23.

[3] Gallego G, Delbrück T, Orchard G, Bartolozzi C, Taba B, Censi A, Leutenegger S, Davison AJ, Conradt J, Daniilidis K, Scaramuzza D. Event-based vision: A survey. IEEE transactions on pattern analysis and machine intelligence. 2020 Jul 10;44(1):154-80.

[4] Lakshmi A, Chakraborty A, Thakur CS. Neuromorphic vision: From sensors to event-based algorithms. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery. 2019 Jul;9(4):e1310.
