
BACTERIA DETECTION AND TRACKING IN LENSLESS PHASE MICROSCOPY

Bachelor's thesis

Faculty of Information Technology and Communication Sciences

Examiner: Peter Kocsis

December 2021


Ville Vartiainen: Bacteria detection and tracking in lensless phase microscopy Bachelor’s thesis

Tampere University

Bachelor’s Programme in Information Technology December 2021

When imaging transparent specimens (e.g., red blood cells, tissue, bacteria) using an ordinary light microscope, the captured images have low contrast due to low light absorption. However, the light diffracts and its phase changes when passing through the specimen, and this property can be used to determine the object. Since only the intensity of the light radiation can be measured, methods for reconstructing phase information from intensity values have been developed.

A phase-retrieval algorithm is additionally required to make use of the image signal.

Lenses are easy to manufacture today, and they are very useful for controlling the path of light in optical systems. However, they introduce aberrations to the system, and compound lenses tend to make systems rather bulky. This study discusses phase microscopy from the lensless imaging approach. A MATLAB program for tracking bacteria movement on video is developed as part of an existing lensless phase-retrieval system. The study also discusses various structural microscopy and imaging methods. Furthermore, different focus measures used in autofocus are compared.

The developed detection and tracking software is in its current state usable only in post-processing. In theory, the same principles can be applied to real-time processing. Machine learning and machine vision could be key to improving the performance of the software, since they would allow avoiding brute force, e.g. in autofocus. With the current approach, tracking in the depth direction is not feasible. For lateral tracking, finding a single focus distance is enough, which is a trade-off in terms of image sharpness if the tracked object moves along the z-axis.

Keywords: microscopy, bacteria, phase retrieval, autofocus, MATLAB

The originality of this thesis has been checked using the Turnitin OriginalityCheck service.


ABSTRACT (IN FINNISH)

Ville Vartiainen: Bacteria detection and tracking in lensless phase microscopy. Bachelor's thesis.

Tampere University

Bachelor's Programme in Information Technology, December 2021

When imaging transparent specimens (e.g. red blood cells, tissue, bacteria) with an ordinary light microscope, the problem is that the captured images have low contrast because the specimen does not absorb enough light. However, light diffracts and its phase changes when passing through the specimen, and this property can be exploited in determining the object. Since only the intensity of light radiation can be measured, methods have been developed for reconstructing phase information from intensity values. To make use of the image signal, a phase-retrieval algorithm is additionally required.

Lenses are easy to manufacture today, and they are very useful for controlling the path of light in optical systems. However, they introduce optical aberrations into the systems, and compound lenses make optical systems rather bulky. This thesis discusses phase microscopy from the lensless point of view. As part of an existing lensless phase-retrieval system, a MATLAB program is developed for tracking the movement of bacteria on video. The thesis covers various structural microscopy and imaging methods. In addition, different focus measures used in autofocus are compared.

The produced bacteria detection and tracking program is in its current state usable only in post-processing. In theory, the same principles could be applied to real-time processing. Machine learning and machine vision may be keys to improving the performance of the program, since they make it possible to avoid the need for brute force, for example in focusing the image. With the current approach, tracking in the depth direction is not feasible. For lateral tracking it suffices to find a single focus distance, which is, however, a trade-off in terms of image sharpness if the tracked object moves along the z-axis.

Keywords: microscopy, bacteria, phase retrieval, autofocus, MATLAB

The originality of this publication has been checked with the Turnitin OriginalityCheck service.


1. INTRODUCTION
2. BACKGROUND
2.1 Common methods in 3D imaging
2.1.1 Computed tomography
2.1.2 Cone beam computerized tomography
2.1.3 Structured illumination microscopy
2.1.4 Phase contrast microscopy
2.2 Diffraction-based methods
2.2.1 Coherent diffractive imaging
2.2.2 Digital holographic microscopy
2.3 Resolution and super-resolution
3. METHODS
3.1 Optical setup
3.2 Image formation
3.3 Simulations and movement detection
3.4 Object detection and tracking
3.5 Autofocus
3.5.1 Deep learning
4. EXPERIMENT RESULTS
4.1 Comparison of the focus measures
5. CONCLUSION
References
Appendix A: Focus Measures


LIST OF SYMBOLS AND ABBREVIATIONS

λ Wavelength

NA Numerical aperture

θ Angle

d Distance

f Frequency

BRIEF A general-purpose feature descriptor

FAST A computationally efficient corner detection method

ORB A robust keypoint detector combining oriented FAST and rotated BRIEF


1. INTRODUCTION

Most optical systems utilize lenses for controlling the path of light. Lenses have many useful features in addition to the ability to gather and focus light. Making spherical lenses from glass is relatively simple, and they were already used in classical optics. Modern lens manufacturers have a range of different materials to make lenses from; the type of material affects many of the lens properties, such as the refractive index and chromatic dispersion. In classical optics, lens design relied on manually calculating and approximating the path of light rays. Nowadays lens behaviour is well understood, and modern lens design software can model optical systems efficiently. Thereby, modern optics design is heavily assisted by software, and most designs can be automatically optimized given constraints and parameters.

Although lenses have many advantages, they have some disadvantages too. Optical aberrations are phenomena caused by lenses that change the focus in an unwanted way. The most common kind is spherical aberration, in which light refracts more strongly at the edge of the lens than at its center. Naturally, this can degrade the quality of the formed image. Compound lenses, which consist of consecutive lenses with opposite curvatures, cancel out the aberrations produced by each lens, which preserves the image quality. Since compound lenses require extra space, many lens systems tend to be rather bulky.

The most adequate solution to overcome the errors generated by the lenses is simply to omit them. The lensless approach allows a larger field of view [1], and the optical design can be much more compact. However, lensless imaging has some major disadvantages too. One of them is that the image is not focused on the sensor, because the sensor gathers all of the diffraction orders of light. To make use of the signal, some technique (e.g. an algorithm) is required to get rid of the unnecessary diffraction orders. Another issue with lensless imaging is that the lost-phase problem is ill-posed.

Bright-field microscopy encounters a particular problem when imaging transparent specimens (e.g., red blood cells, tissue, bacteria): the captured images have low contrast because the specimen does not absorb enough light. The phase of the light, however, changes when passing through the specimen, and this can be taken advantage of [2, ch. 7]. Since only the intensity of the light radiation can be measured, methods for reconstructing phase information from intensity values have been developed. In 1913 Lawrence Bragg described the interaction between a lattice and a quantum wave front (e.g. consisting of photons, neutrons, or electrons); this is known as Bragg's law or Bragg diffraction [3]. Bragg's law can be used to calculate the phase shift of the wave front upon scattering off a crystalline solid. In 1952, David Sayre found that when measuring intensities at a frequency higher than Bragg's law imposes, one can solve the crystallographic phase problem more easily [4]. There is a general consensus in the field of imaging that this was an important step in the development of lensless imaging methods. However, solving the phase problem was not feasible using Sayre's findings until computing power increased around 1980. At that time, James Fienup introduced the hybrid input-output (HIO) algorithm for phase retrieval, a modification of his earlier error reduction (ER) algorithm [5]. The HIO algorithm was used for retrieving the phases of a diffraction pattern, i.e., the Fourier transform of the object. Finally, in 1999 Jianwei Miao introduced computational methods for recovering lost phase information using a secondary image [6]. Along with phase retrieval, holography is also a precursor of today's coherent diffraction imaging (CDI), 3D imaging, and other phase imaging techniques. The theory behind holography was invented by Dennis Gabor in 1948 [7]. Gabor suggested that, using a specific optical arrangement, one could record the 3D information of an object rather than only its projection on the image plane. The pioneering works in digital holography were done in the early 1970s [8], but it took another 20 years until digital holography became feasible, owing to the dramatic increase in computational processing power and storage capacity, along with high-resolution image sensors becoming easier to manufacture.

In this study, a bacteria detection and tracking software [9] is developed in MATLAB and implemented as part of an existing lensless phase-retrieval system. The bacteria are tracked in 3D, since a support liquid allows omnidirectional movement. This study serves as a literature review of common methods in 3D imaging, phase imaging and other related topics, and proposes a MATLAB solution for the tracking problem. The thesis is structured as follows. Chapter 2 discusses the fundamentals of imaging methods which fall under the umbrella of phase imaging. Chapter 3 investigates the optical setup and the methods used for object detection and tracking. Chapter 4 presents the results of the autofocus experiments. The final chapter contains a feasibility analysis of the detection and tracking system and further development ideas.


2. BACKGROUND

2.1 Common methods in 3D imaging

2.1.1 Computed tomography

Computed tomography (CT), also known as computerized axial tomography (CAT), is a macroscopic 3D imaging method which uses X-rays to generate cross-section images of the body. An X-ray source and detector revolve around the patient and collect snapshots of the radiation passing through the body. The collected radiation data can be processed into 3D images on a computer. [10]

2.1.2 Cone beam computerized tomography

Cone beam computerized tomography (CBCT) is an improvement on the previously introduced CT imaging method. CBCT imaging allows better 3D reconstruction of soft and hard tissue, but it is more sensitive to patient movement and foreign objects in the body. [10]

2.1.3 Structured illumination microscopy

Structured illumination microscopy (SIM) is a fluorescence microscopy method. The method can be used for 3D imaging, which allows greater depth of focus (DOF), but it requires multiple exposures of the specimen. Therefore, the specimen should contain a high density of a fluorescent label and be reasonably illuminated. [2, ch. 15]

In general, the method extends the resolution of widefield microscopy twofold by modifying the diffraction pattern of the illumination using a fine grating. The highly structured illumination pattern is able to carry high-frequency image details even beyond the diffraction limit, since they are incorporated into the image of a grid. The details become visible in the reconstructed image after extracting the high-frequency information. [2, ch. 15]


2.1.4 Phase contrast microscopy

Contrary to what the name suggests, phase contrast microscopy (PCM) is not based on phase retrieval. Instead, its working principle is based on the interference between the surrounding background light S and the light D diffracted by the unstained specimen. A PCM system contains a positive (or negative) phase plate, which causes a half-period phase difference between the lights S and D and thereby allows destructive (or constructive) interference on the image plane. [2, ch. 7]

The results of both positive and negative phase contrast are compared in Figure 2.1.

Figure 2.1. Comparison of (a) positive and (b) negative phase contrast images. Bar = 20 µm. From [2, fig. 7.8].

2.2 Diffraction-based methods

Phase retrieval refers to the reconstruction of a complex-valued function using some transform of the function and other structural information. The complex function consists of an amplitude and a phase. [11], [5] Phase-retrieval algorithms can be divided into two main groups: deterministic and iterative. The aim of the deterministic approach is to find a closed-form relation between intensity and phase. The iterative algorithms use the intensity measurements as constraints under which the iterations are repeated until a satisfying solution is found.

Phase imaging covers all methods involving the reconstruction of phase such as digital holography, ptychography, transmission electron microscopy (TEM), and other imaging methods for phase retrieval.

2.2.1 Coherent diffractive imaging

Coherent diffractive imaging (CDI) is traditionally a 2D or 3D lensless structural microscopy technique which was pioneered by Miao et al. in 1999. In principle, CDI is a phase retrieval technique: the first versions of the CDI recovery algorithm needed an a priori secondary image as the support constraint. Later, this was solved by algorithms using an auto-correlation function, which allows imposing an iteratively evolving support constraint. [13]

Conventionally, X-rays were used in CDI since they have two important properties compared to other sources of illumination. X-rays have a smaller wavelength than visible light, which improves the resolving power. Also, the penetrative property of X-rays allows imaging specimens thicker than 0.5 µm, contrary to using a beam of electrons, for example. [14]

2.2.2 Digital holographic microscopy

Digital holographic microscopy (DHM) is closely related to phase retrieval, since both are derived from Bragg diffraction [3]. The working principle of digital holography is as follows. A light beam from a coherent light source (e.g. a laser) is split in two, creating an object beam and a reference beam. The object beam illuminates the (unstained) specimen, and the surface of the object generates a diffracted object wavefront. The reference beam is used to create a reference wavefront, which forms an interference pattern (hologram) with the object wavefront on the image sensor. Mirrors are used to reflect the reference beam so that the interfering wavefronts are at an angle (usually 90°). [8] The digital hologram can be used to reconstruct 3D images since it contains both intensity and phase information.

2.3 Resolution and super-resolution

In optics, resolution means the resolving power of an optical system, and it can be measured in different ways. Optical resolution refers to the ability to distinguish details from one another. Spatial resolution refers to the number of pixels an objective lens or a sensor can gather. Both the spatial and the optical resolution of an objective lens depend on its numerical aperture (NA), which is defined as

NA = n sin(θ),    (2.1)

where n is the refractive index of the surrounding medium. The angle θ is the half-angle of the maximum cone of light that the objective is able to capture at a fixed distance from the specimen. [2, ch. 6] Because the refractive index of air is approximately 1, the NA of so-called dry objectives relies entirely on the angle θ. Thus, the estimated maximum NA of dry objectives is approximately 0.95, which corresponds to an angle of 72°. However, oil immersion objectives surpass this limit due to the larger refractive index of the medium: using a modern immersion oil, the NA can be increased up to 1.49, or 1.65 if a quartz coverslip is used as the first medium. [2] [15]

The resolution of an objective in light microscopy is defined by Abbe's equation as

d = λ / (2 NA),    (2.2)

where d is the shortest distance between two resolved points, λ is the wavelength of the light, and NA is the numerical aperture of the objective. The equation defines diffraction-limited imaging in both the bright-field and fluorescence modes of widefield microscopy. [2, ch. 15] To give an example of scale, with an NA of 1 and green light with a wavelength of roughly 500 nm, the resulting resolution is 0.25 µm, which is larger than most viruses (0.1 µm) but smaller than most bacteria (0.5 to 5 µm) [16].
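As a quick numerical check of Equations 2.1 and 2.2, the numbers quoted above can be reproduced in a few lines. The thesis tooling is MATLAB; this sketch uses Python purely for illustration, and the function name is an invention of this example:

```python
import math

def abbe_resolution(wavelength, na):
    """Abbe diffraction limit d = wavelength / (2 NA), Equation 2.2."""
    return wavelength / (2.0 * na)

# Equation 2.1: a dry objective (n = 1, air) with a 72 degree half-angle
na_dry = 1.0 * math.sin(math.radians(72))   # approximately 0.95

# Green light (~500 nm) at NA = 1 resolves down to 0.25 um,
# between typical virus (~0.1 um) and bacterium (0.5 to 5 um) sizes
d = abbe_resolution(500e-9, 1.0)            # 2.5e-7 m = 0.25 um
```
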


3. METHODS

3.1 Optical setup

One of the main requirements for this optical setup is that it must work in the near field. Previously, the common approach to lensless imaging worked in the far field, where Fourier optics can be applied. The transfer function of the wavefront propagation is modelled using the angular spectrum method, which is detailed in Sec. 3.2.

The optical setup is simple. It consists of a laser illuminator, a specimen (object), a binary phase modulation mask and a camera sensor (Figure 3.1). The pattern on the phase mask is randomized.

Figure 3.1. Illustration of the optical setup. [17, Fig. 1]

The wavelength of the illuminating laser is 532 nm, and the pixel size of the mask is 1.73 µm, which is half of the camera sensor's. A cut of the binary phase mask is shown in Figure 3.2. The height difference between the mask pixels is 500 nm, with a 10% uniform deviation due to dullness errors in the mask surface. [17]


Figure 3.2. Cut of the phase mask [17, Fig. 7]

3.2 Image formation

Since the optical setup has relatively small distances between the components, the diffraction is considered in the near field. The propagation of the wavefronts is modelled with the angular spectrum (AS) method [8, ch. 1]. This allows predictions in both forward and backward propagation. The wavefront propagation and its transfer function are defined as

u(x, y, d) = F⁻¹{ H(fx, fy, d) · F{ u(x, y, 0) } },    (3.1)

H(fx, fy, d) = exp[ i (2π/λ) d √(1 − λ²(fx² + fy²)) ] when fx² + fy² ≤ 1/λ², and 0 otherwise.    (3.2)

The function u(x, y, d) is the result of free-space propagation of u(x, y, 0), where x and y describe the pixel location. The AS operator H(fx, fy, d) is defined by the distance d, the spatial frequencies fx and fy, and the wavelength λ. Some of the support constraints used in the algorithm are non-negativity and window size. The circular window in the center of the image determines how the image is cropped.
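Equations 3.1 and 3.2 translate directly into FFT code. Below is a minimal NumPy sketch of the angular spectrum propagator; the actual system is implemented in MATLAB [9], so the function and parameter names here are illustrative, not the thesis code:

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, d):
    """Propagate a complex field u0 over distance d using the angular
    spectrum method (Eqs. 3.1 and 3.2). dx is the pixel pitch in metres.
    A negative d gives backward propagation."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, dx)
    fy = np.fft.fftfreq(ny, dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength ** 2) * (FX ** 2 + FY ** 2)
    H = np.zeros_like(arg, dtype=complex)
    prop = arg >= 0.0  # evanescent components (fx^2 + fy^2 > 1/lambda^2) stay zero
    H[prop] = np.exp(1j * 2 * np.pi / wavelength * d * np.sqrt(arg[prop]))
    return np.fft.ifft2(H * np.fft.fft2(u0))
```

Because |H| = 1 on the propagating frequencies, a forward propagation followed by a backward one recovers the original field, which is exactly the property the reconstruction loop in Sec. 3.2 relies on.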


2. Until a criterion is met, repeat:

(a) Forward propagation: propagate the object wavefront (3.1) to the mask plane over distance d1, add the mask, then propagate it to the sensor plane over distance d2.

(b) Wavefront update: replace the wavefront amplitude with the amplitude of the illumination source.

(c) Backward propagation: propagate the updated wavefront back to the mask plane over distance d2 and subtract the mask. Then propagate the wavefront back from the mask plane to the object plane over distance d1.

(d) Filters: DRUNet, BM3D filter, apodization.

3. Output. [17]

The criterion for stopping the iteration can be an error measure such as RRMSE (Relative Root Mean Square Error).
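A heavily simplified sketch of this iteration loop is given below, assuming the mask acts as a pure phase modulation and omitting the filters of step (d). The amplitude constraint of step (b) is imposed here with a measured sensor amplitude, which is one common interpretation; all names are illustrative rather than the thesis implementation, and a small angular spectrum propagator is restated for self-containment:

```python
import numpy as np

def propagate(u, wavelength, dx, d):
    """Angular spectrum propagation (Eq. 3.1); negative d propagates backward."""
    ny, nx = u.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, dx), np.fft.fftfreq(ny, dx))
    arg = 1.0 - wavelength ** 2 * (FX ** 2 + FY ** 2)
    H = np.where(arg >= 0.0,
                 np.exp(1j * 2 * np.pi / wavelength * d * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(H * np.fft.fft2(u))

def phase_retrieval(measured_amplitude, mask_phase, wavelength, dx, d1, d2, n_iter=50):
    """Iterate forward propagation, amplitude replacement, and backward
    propagation (steps (a)-(c) above); the filters of step (d) are omitted."""
    obj = np.ones_like(measured_amplitude, dtype=complex)  # flat initial guess
    mask = np.exp(1j * mask_phase)                         # mask as a pure phase factor
    for _ in range(n_iter):
        # (a) forward: object -> mask plane, apply mask, -> sensor plane
        u = propagate(obj, wavelength, dx, d1) * mask
        u = propagate(u, wavelength, dx, d2)
        # (b) wavefront update: keep the phase, replace the amplitude
        u = measured_amplitude * np.exp(1j * np.angle(u))
        # (c) backward: sensor -> mask plane, remove mask, -> object plane
        u = propagate(u, wavelength, dx, -d2) / mask
        obj = propagate(u, wavelength, dx, -d1)
    return obj
```
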

3.3 Simulations and movement detection

Before the physical experiments, the first videos of bacteria were created artificially using a still image from the MATLAB sample image set. The image was translated laterally frame by frame in MATLAB to create a simulation of movement. The first iteration of the tracking algorithm was able to detect and track moving parts of the image distinguished from the background. This proof of concept encouraged us to take this approach. At this point the major disadvantage was that only moving objects were detected; by modifying the detection part of the algorithm we were able to fix this.
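The frame-by-frame translation and the difference-based movement detection can be sketched as follows. Python/NumPy stands in for the MATLAB code here, and np.roll is an illustrative substitute for the translation used in the thesis:

```python
import numpy as np

def make_translation_frames(image, n_frames, dx_per_frame, dy_per_frame):
    """Simulate lateral movement by shifting a still image frame by frame."""
    frames = [np.roll(image, (k * dy_per_frame, k * dx_per_frame), axis=(0, 1))
              for k in range(n_frames)]
    return np.stack(frames)

def moving_mask(frame_a, frame_b, threshold=0.0):
    """Crude movement detection: pixels whose intensity changed
    between two consecutive frames."""
    return np.abs(frame_b.astype(float) - frame_a.astype(float)) > threshold
```
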

3.4 Object detection and tracking

The object tracking algorithm uses the ORB key point detector and blob detection. The idea is to find groups of ORB points which correlate with each other. Morphological operations are applied to the ORB image, after which blob detection is able to find the supposed objects in the image. Part of the noise is filtered out from the binary image, but some remains. ORB detects details when there is not too much noise; otherwise there are false positives. Moreover, the algorithm tries to predict the objects' movement statistically using Kalman filters.
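The detection chain (key points, then morphology, then blobs) can be illustrated with SciPy. ORB itself is not reimplemented here, so the key points are given as input, and the dilation radius and area threshold are illustrative assumptions, not values from the thesis:

```python
import numpy as np
from scipy import ndimage

def blobs_from_keypoints(points, shape, dilation_radius=5, min_area=20):
    """Group keypoint detections into object candidates: rasterise the
    points, dilate to merge nearby ones (a stand-in for the morphological
    operations), then label connected blobs and return bounding boxes
    as (row, col, height, width). `points` is an (N, 2) array of
    (row, col) keypoint coordinates."""
    mask = np.zeros(shape, dtype=bool)
    mask[points[:, 0], points[:, 1]] = True
    # dilation merges keypoints that belong to the same object
    mask = ndimage.binary_dilation(mask, iterations=dilation_radius)
    labels, _ = ndimage.label(mask)
    boxes = []
    for sl in ndimage.find_objects(labels):
        h = sl[0].stop - sl[0].start
        w = sl[1].stop - sl[1].start
        if h * w >= min_area:   # drop small residual noise blobs
            boxes.append((sl[0].start, sl[1].start, h, w))
    return boxes
```
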

Figure 3.3 shows screen captures of three windows of the running tracking program. The red rectangle in Figure 3.3c is the region of interest (ROI) where the ORB key points are searched. Figure 3.3a shows the found ORB points. Figure 3.3b shows target 1, the detected object, bordered with the yellow rectangle. The rectangles can be drawn on the original video, as shown in Figure 3.3c, to illustrate the size and location of the object.

Figure 3.3. Still images of the windows: (a) ORB key points; (b) morphological operations and blob detection; (c) the detected bacteria from a video.

Sometimes the object is not perfectly inside the yellow rectangle, which is due to errors in the key points. The most common sources of error are noise and low contrast.
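The statistical movement prediction mentioned above can be sketched as a constant-velocity Kalman filter. The thesis uses the tracking tools available in MATLAB, so the state model and noise values below are illustrative choices:

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal 2D constant-velocity Kalman filter. State: [x, y, vx, vy]."""

    def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4)
        self.F = np.array([[1, 0, dt, 0],    # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],     # only the position is measured
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)               # process noise
        self.R = r * np.eye(2)               # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                    # predicted object position

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

The predict step gives the expected position of a temporarily invisible target, which is what allows the tracker to bridge a few frames of occlusion before declaring the object lost.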

3.5 Autofocus

Finding the correct focus level, i.e. the object-mask distance, is needed for reconstructing clear images of the bacteria. Autofocus is an algorithm that optimizes this distance according to a focus measure. The test image is the same as used in the moving-object simulations.

Figure 3.4. The phase image in and out of focus: (a) focused phase image of bacteria; (b) defocused phase image of bacteria.

To assess the feasibility of autofocus combined with the tracking part of the software, only six focus measures were first compared to each other: CURV [18], GLVA [19], LAPV [20], LAPD [21], sum of wavelets [22], and WAVR [23]. The initial comparison showed that GLVA [19] performed well compared to the other focus measures. However, after further experimenting, none of the above proved to be sufficient focus measures. The more in-depth comparison results are shown in Chapter 4.

Even though the comparison was not very informative at this point, it became apparent that even slightly blurry objects could be tracked passably. This became the leading idea for the autofocus, since focusing between frames can be expensive when using a brute-force method such as this.
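The brute-force search and two of the compared focus measures can be sketched as follows: GLVA as the gray-level variance and LAPV as the variance of the Laplacian. The `reconstruct` callable is a hypothetical stand-in for the reconstruction at a given distance, and the code is Python for illustration rather than the thesis' MATLAB implementation:

```python
import numpy as np
from scipy import ndimage

def glva(img):
    """GLVA focus measure: gray-level variance."""
    return np.var(img)

def lapv(img):
    """LAPV focus measure: variance of the Laplacian."""
    return np.var(ndimage.laplace(img.astype(float)))

def autofocus(reconstruct, distances, measure=glva):
    """Brute-force autofocus: reconstruct the image at each candidate
    distance and keep the one maximising the focus measure."""
    scores = [measure(reconstruct(d)) for d in distances]
    return distances[int(np.argmax(scores))], scores
```

The cost of this scheme is one full reconstruction per candidate distance, which is why applying it to every video frame is expensive.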

3.5.1 Deep learning

Deep learning (DL) is a possible solution for autofocusing, and a well-trained network could do it efficiently. In 2021, Wang et al. proposed an efficient DL pipeline for the autofocus problem. This solution outperforms traditional contrast maximization in terms of efficiency and is able to produce all-in-focus static images or video. [24]


4. EXPERIMENT RESULTS

4.1 Comparison of the focus measures

According to Idinyang and Russell (2011), GLLV [20], VOLA [25] and GRAT [26] are the most accurate focus measures [25]. Our experiments mostly agree with these results. However, the comparison of the measures showed that GRAT is not a reliable focus measure in this application, as its behavior differs noticeably from the other two. The results of the comparisons are compiled in Figure 4.2. The test image used in the focus tests was of a bacterium captured with the SSR-PR optical setup, reconstructed without using the phase mask and wavefront propagations. This way it was much faster to adjust the focus level in small intervals and calculate the corresponding focus values. A total of 20 different focus levels were used for finding the optimal values.

Although GLLV performed the best in this comparison, it leaves room for improvement. The visually sharpest image (Figure 4.1) had a focus distance of 0.1 mm, but according to Figure 4.2a the distance predicted by GLLV is 0.5 mm.

Figure 4.1. The visually sharpest image without using the phase mask.


Figure 4.2. Focus measures at different focus distances: (a) the relative focus values of GLLV; (b) the in-focus image according to GLLV; (c) the relative focus values of VOLA; (d) the in-focus image according to VOLA; (e) the relative focus values of GRAT; (f) the in-focus image according to GRAT.

The comparison of 28 different focus measures in Appendix A shows that GLLV and VOLA are the most reliable focus measures.


5. CONCLUSION

The developed tracking algorithm was able to follow most of the bacteria moving in the video. The main detection condition is that the target must be visible for a few consecutive frames. If detected objects remain invisible (out of frame, obstructed, out of focus) for long enough, they are lost. Lost objects are treated as new targets if they happen to be detected again.

The detection and tracking software is in its current state only usable in post-processing, but in theory the same principles can be applied to real-time processing. Due to the brute-force nature of the implemented autofocus algorithm, it is expensive to apply to each frame. A faster alternative for autofocus would be a deep learning system, as noted in Sec. 3.5.1. However, as long as real-time processing is not a requirement, the implemented autofocus may be sufficient.


[1] Ozcan, A. and McLeod, E. Lensless Imaging and Sensing. Annual Review of Biomedical Engineering 18.1 (2016), pp. 77–102. ISSN: 1523-9829.

[2] Murphy, D. B. and Davidson, M. W. Fundamentals of Light Microscopy and Electronic Imaging. 2nd ed. Hoboken, NJ: Wiley–Blackwell, 2013. ISBN: 1-283-64427-4.

[3] Bragg, W. H. and Bragg, W. L. The reflection of X-rays by crystals. Proc. R. Soc. Lond. A 88 (1913), pp. 428–438. DOI: 10.1098/rspa.1913.0040.

[4] Kirz, J. and Miao, J. David Sayre (1924–2012). Nature 484.7392 (2012), p. 38. ISSN: 1476-4687. DOI: 10.1038/484038a.

[5] Fienup, J. R. Phase retrieval algorithms: a comparison. Applied Optics 21.15 (Aug. 1982), pp. 2758–2769. DOI: 10.1364/AO.21.002758.

[6] Miao, J., Sayre, D. and Chapman, H. N. Phase retrieval from the magnitude of the Fourier transforms of nonperiodic objects. J. Opt. Soc. Am. A 15.6 (June 1998), pp. 1662–1669. DOI: 10.1364/JOSAA.15.001662.

[7] Gabor, D. A New Microscopic Principle. Nature 161.4098 (1948), pp. 777–778. ISSN: 1476-4687. DOI: 10.1038/161777a0.

[8] Picart, P. New Techniques in Digital Holography. 1st ed. London, UK; Hoboken, NJ: Wiley, 2015. ISBN: 1848217730.

[9] Vartiainen, V. Bacteria Detection. Version 1.0. Dec. 2021. URL: https://github.com/Vartiaiv/BacteriaDetection.

[10] Karatas, O. H. and Toy, E. Three-dimensional imaging techniques: A literature review. European Journal of Dentistry 8.1 (2014), pp. 132–140. ISSN: 1305-7456. DOI: 10.4103/1305-7456.126269.

[11] Jaming, P. and Pérez-Esteva, S. The phase retrieval problem for solutions of the Helmholtz equation. Inverse Problems 33.10 (2017), p. 105007. ISSN: 0266-5611. DOI: 10.1088/1361-6420/aa8640.

[12] Marchesini, S., Chapman, H. N., Hau-Riege, S. P., London, R. A., Szoke, A., He, H., Howells, M. R., Padmore, H., Rosen, R., Spence, J. C. H. and Weierstall, U. Coherent X-ray diffractive imaging: applications and limitations. Opt. Express 11.19 (Sept. 2003), pp. 2344–2353. DOI: 10.1364/OE.11.002344.

[13] Marchesini, S., He, H., Chapman, H. N., Hau-Riege, S., Noy, A., Howells, M. R., Weierstall, U. and Spence, J. C. H. X-ray image reconstruction from a diffraction pattern alone. Phys. Rev. B 68.14 (Oct. 2003), p. 140101. DOI: 10.1103/PhysRevB.68.140101.

[14] Miao, J., Charalambous, P., Kirz, J. and Sayre, D. Extending the methodology of X-ray crystallography to allow imaging of micrometre-sized non-crystalline specimens. Nature 400.6746 (July 1999), pp. 342–344. ISSN: 1476-4687. DOI: 10.1038/22498.

[15] Pluta, M. Variable wavelength microinterferometry of textile fibres. Journal of Microscopy (Oxford) 149.2 (1988), pp. 97–115. ISSN: 0022-2720. DOI: 10.1111/j.1365-2818.1988.tb04567.x.

[16] Velimirov, B. Nanobacteria, Ultramicrobacteria and Starvation Forms: A Search for the Smallest Metabolizing Bacterium. Microbes and Environments 16.2 (2001), pp. 67–77. DOI: 10.1264/jsme2.2001.67.

[17] Kocsis, P., Shevkunov, I., Katkovnik, V., Rekola, H. and Egiazarian, K. Single-shot pixel super-resolution phase imaging by wavefront separation approach. Opt. Express 29.26 (Dec. 2021), pp. 43662–43678. DOI: 10.1364/OE.445218.

[18] Helmli, F. S. and Scherer, S. Adaptive shape from focus with an error estimation in light microscopy. Proceedings of ISPA 2001, the 2nd International Symposium on Image and Signal Processing and Analysis, 2001, pp. 188–193. DOI: 10.1109/ISPA.2001.938626.

[19] Krotkov, E. and Martin, J.-P. Range from focus. Proceedings of the IEEE International Conference on Robotics and Automation 3 (1986), pp. 1093–1098. DOI: 10.1109/ROBOT.1986.1087510.

[20] Pech-Pacheco, J., Cristobal, G., Chamorro-Martinez, J. and Fernandez-Valdivia, J. Diatom autofocusing in brightfield microscopy: a comparative study. Proceedings of the 15th International Conference on Pattern Recognition (ICPR) 3 (2000), pp. 314–317. ISSN: 1051-4651. DOI: 10.1109/ICPR.2000.903548.

[21] Thelen, A., Frey, S., Hirsch, S. and Hering, P. Improvements in Shape-From-Focus for Holographic Reconstructions With Regard to Focus Operators, Neighborhood-Size, and Height Value Interpolation. IEEE Transactions on Image Processing 18.1 (2009), pp. 151–157. ISSN: 1057-7149. DOI: 10.1109/TIP.2008.2007049.

[22] Yang, G. and Nelson, B. J. Wavelet-based autofocusing and unsupervised segmentation of microscopic images. Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 3, 2003, pp. 2143–2148. DOI: 10.1109/IROS.2003.1249188.

Autofocus. IEEE Transactions on Computational Imaging 7 (2021), pp. 258–271. DOI: 10.1109/TCI.2021.3059497.

[25] Idinyang, S. U. and Russell, N. A. Real-time auto-focus implementation. 2011 Functional Optical Imaging, 2011, pp. 1–2. DOI: 10.1109/FOI.2011.6154844.

[26] Santos, A., Ortiz de Solórzano, C., Vaquero, J. J., Peña, J. M., Malpica, N. and del Pozo, F. Evaluation of autofocus functions in molecular cytogenetic analysis. Journal of Microscopy 188.3 (1997), pp. 264–272. DOI: 10.1046/j.1365-2818.1997.2630819.x.

[27] Shirvaikar, M. An optimal measure for camera focus and exposure. Proceedings of the Thirty-Sixth Southeastern Symposium on System Theory, Feb. 2004, pp. 472–475. ISBN: 0-7803-8281-1. DOI: 10.1109/SSST.2004.1295702.

[28] Nanda, H. and Cutler, R. Practical calibrations for a real-time digital omnidirectional camera. Proceedings of CVPR, Technical Sketch (Jan. 2001).

[29] Shen, C.-H. and Chen, H. Robust focus measure for low-contrast images. 2006 Digest of Technical Papers International Conference on Consumer Electronics. 2006, pp. 69–70. DOI: 10.1109/ICCE.2006.1598314.

[30] Lee, S.-Y., Yoo, J.-T., Kumar, Y. and Kim, S.-W. Reduced Energy-Ratio Measure for Robust Autofocusing in Digital Camera. IEEE Signal Processing Letters 16.2 (2009), pp. 133–136. DOI: 10.1109/LSP.2008.2008938.

[31] Geusebroek, J., Cornelissen, F., Smeulders, A. and Geerts, H. Robust autofocusing in microscopy. Cytometry 39.1 (Jan. 2000), pp. 1–9. ISSN: 0196-4763. DOI: 10.1002/(SICI)1097-0320(20000101)39:1<1::AID-CYTO2>3.0.CO;2-J.

[32] Subbarao, M., Choi, T.-S. and Nikzad, A. Focusing techniques. Machine Vision Applications, Architectures, and Systems Integration. Ed. by B. G. Batchelor, S. S. Solomon and F. M. Waltz. Vol. 1823. International Society for Optics and Photonics. SPIE, 1992, pp. 163–174. URL: https://doi.org/10.1117/12.132073.

[33] Eskicioglu, A. and Fisher, P. Image quality measures and their performance. IEEE Transactions on Communications 43.12 (1995), pp. 2959–2965. DOI: 10.1109/26.477498.

[34] Firestone, L., Cook, K., Culp, K., Talsania, N. and Preston, K., Jr. Comparison of autofocus methods for automated microscopy. Cytometry 12.3 (1991), pp. 195–206. ISSN: 0196-4763. DOI: 10.1002/cyto.990120302.

[35] Nayar, S. and Nakagawa, Y. Shape from Focus. Tech. rep. Nov. 1989.

[36] Minhas, R., Mohammed, A. A., Wu, Q. M. J. and Sid-Ahmed, M. A. 3D Shape from Focus and Depth Map Computation Using Steerable Filters. Image Analysis and Recognition. Ed. by M. Kamel and A. Campilho. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009, pp. 573–583. ISBN: 978-3-642-02611-9.

[37] Pertuz, S., Puig, D. and Garcia, M. A. Analysis of focus measure operators for shape-from-focus. Pattern Recognition 46.5 (2013), pp. 1415–1432. ISSN: 0031-3203. DOI: 10.1016/j.patcog.2012.11.011.

[38] Focus Measure. Aug. 31, 2017. URL: https://www.mathworks.com/matlabcentral/fileexchange/27314-focus-measure (visited on 08/04/2021).


APPENDIX A: FOCUS MEASURES

ACMO: Absolute central moment [27]

BREN: Brenner’s focus measure [26]

CONT: Image contrast [28]

CURV: Image curvature [18]

DCTE: DCT Energy measure [29]

DCTR: DCT Energy ratio [30]

GDER: Gaussian derivative [31]

GLVA: Gray-level variance [19]

GLLV: Gray-level local variance [20]

GLVN: Gray-level variance normalized [26]

GRAE: Energy of gradient [32]

GRAT: Thresholded gradient [26]

GRAS: Squared gradient [33]

HELM: Helmli’s measure [18]

HISE: Histogram entropy [19]

HISR: Histogram range [34]

LAPE: Energy of Laplacian [32]

LAPM: Modified Laplacian [35]

LAPV: Variance of Laplacian [20]

LAPD: Diagonal Laplacian [21]

SFIL: Steerable filters-based [36]

SFRQ: Spatial frequency [33]

TENG: Tenengrad [19]

TENV: Tenengrad variance [20]

VOLA: Vollath's correlation-based [26]

WAVS: Wavelet sum [22]

WAVV: Wavelet variance [22]

WAVR: Wavelet ratio [23]

Note: The implementations provided in this package are those of the author based on his interpretation of the original papers. For further details, refer to S. Pertuz et al. [37].

Code for Focus Measure available at MATLAB Central File Exchange [38].
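To give a concrete sense of how such focus measures are computed, the sketch below implements two of the listed measures, Brenner's measure (BREN) and variance of Laplacian (LAPV). The thesis software is written in MATLAB and the reference implementations are in the linked package [38]; this is an illustrative NumPy version based on the common textbook definitions of the two measures, not on the exact code in that package. In autofocus, the measure is evaluated at each candidate focus distance and the distance maximizing it is selected.

```python
import numpy as np

def brenner(img):
    """Brenner's focus measure (BREN): sum of squared differences
    between pixels two rows apart. Sharp edges give large differences."""
    diff = img[2:, :] - img[:-2, :]
    return float(np.sum(diff ** 2))

def variance_of_laplacian(img):
    """Variance of Laplacian (LAPV): apply the discrete 5-point
    Laplacian stencil to the image interior, then take the variance
    of the response. High-frequency detail yields high variance."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(np.var(lap))

# A detailed (in-focus) image should score higher than a flat one:
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))                # high-frequency content
flat = np.full((64, 64), sharp.mean())      # no detail at all
assert brenner(sharp) > brenner(flat)
assert variance_of_laplacian(sharp) > variance_of_laplacian(flat)
```

Both measures are contrast-based: they reward strong local intensity differences, which is why they peak at the in-focus distance where fine structure is resolved.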
