Optimal Spectral Bands for Instrument Detection in Microscope-Assisted Surgery

UEF // eRepository
DSpace https://erepo.uef.fi

Self-archived publications, Faculty of Science and Forestry

2020

Optimal Spectral Bands for Instrument Detection in Microscope-Assisted Surgery

Puustinen, Sami

IEEE

Articles and abstracts in scientific conference proceedings

© 2021 IEEE. All rights reserved.

http://dx.doi.org/10.1109/CBMS49503.2020.00068

https://erepo.uef.fi/handle/123456789/24811

Downloaded from the University of Eastern Finland's eRepository

(2)

Optimal spectral bands for instrument detection in microscope-assisted surgery

Sami Puustinen University of Eastern Finland

School of Medicine Kuopio, Finland

sampuu@uef.fi Piotr Bartczak University of Eastern Finland

Joensuu, Finland bartczak@uef.fi Antti-Pekka Elomaa Kuopio University Hospital

Eastern Finland Center of Microsurgery Kuopio, Finland Antti-Pekka.Elomaa@kuh.fi

Jani Koskinen University of Eastern Finland

School of Computing Joensuu, Finland

jankos@uef.fi Ahmed Hussein Kuopio University Hospital

Eastern Finland Center of Microsurgery Kuopio Finland Ahmed.Hussein@kuh.fi

Hana Vrzakova Kuopio University Hospital

Eastern Finland Center of Microsurgery Kuopio, Finland Hana.Vrzakova@kuh.fi

Samu Lehtonen University of Eastern Finland

School of Medicine Kuopio, Finland

samule@uef.fi

Abstract—Optic image-guidance systems enable minimally invasive (MIS) approaches in surgery. However, available MIS techniques limit both ergonomics and the field of view (FoV), which can be detrimental to anatomical awareness and safe tissue manipulation. Contemporary navigation techniques (i.e., neuronavigation) support spatial awareness during surgery; however, they require time-consuming instrumentation and lack the real-time precision needed in soft-tissue surgery. In this work, we utilize the operative microscope's FoV as an unobtrusive source to support MIS navigation with micro-instrument tracking. FoV instrument tracking has been investigated in laparoscopy; however, the high magnification, the selection of instruments, and the bimanually variant characteristics of microneurosurgery make current computational approaches difficult to adopt. In this work, we investigate the potential of spectral imaging for micro-instrument tracking. We present a spectral-imaging system suitable for use in the operating room. Using a hyperspectral camera mounted to the side ocular of an operation microscope and Xenon white-light illumination, we collected samples of standard microsurgical instruments (reflective and non-reflective) positioned on biological tissue (placenta). In the analysis of contrasts, we compared spectral images to traditional RGB. We observed an 8–13% contrast enhancement with the optimal wavelength bands and a 20.4% improvement in instrument-tracking time. Our results encourage the application of wavelength-tuned cameras to improve the efficiency of optic tracking in MIS systems.

Keywords—spectral imaging, tool detection, instrument recognition

I. INTRODUCTION

Miniaturization of surgical procedures and advances in imaging systems have led to the development of minimally invasive surgical techniques (MIS) [1]. Compared to the techniques used in open surgeries, MIS enables more precise approaches that cause less damage to healthy tissue. Consequently, patients may recover faster, experience less postoperative pain, and have a lower risk of infection. Thus, MIS approaches are at the frontier of most surgical procedures [2].

MIS techniques are technically demanding, since a surgeon must master not only microprecise dexterity but also precise spatial localization of instruments and bimanual eye-hand coordination in an already limited field of view (FoV). To alleviate these challenges, navigation techniques such as instrument tracking (i.e., neuronavigation) support anatomical orientation [3]. However, they require obtrusive external instrumentation and are inapplicable and inaccurate with most microinstruments. Therefore, prior research has explored machine-vision methods for real-time instrument tracking in the FoV [4], [5]. However, FoV tracking in MIS conditions is challenging for numerous reasons.

First, a surgical FoV is highly magnified, unevenly illuminated, and polluted with motion noise. In addition, the FoV is affected by approach-related aspects such as the depth and volume of the operative cavity, physiological changes (i.e., pulsation and perfusion), and fluids (i.e., bleeding). Consequently, traditional machine-vision algorithms and techniques for FoV tracking underperform in the MIS context [5].

To alleviate the current challenges in FoV tracking, we explore the use of spectral imaging in MIS. Spectral imaging extends the recorded spectrum beyond visible light, which can improve the visualization of normal and pathologic anatomy during surgery [6]. Spectral imaging could improve the recognition of surgical instruments and synthetic materials in the challenging FoV. Additionally, spectral imaging may help optimize various tasks, as removing noisy or non-informative bands can improve the classification stage, as demonstrated previously [7]. From an engineering point of view, the lower-resolution images of highly sensitive spectral bands, in contrast to the high-resolution images of less sensitive RGB cameras, could support the near-real-time computational demands of future applications.

© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

In this work, we propose a spectral imaging system (Fig. 1) for differentiating microsurgical instruments from biological tissues in minimally invasive surgeries. We present the following contributions:

• We test spectral imaging and optimal band selection for instrument detection in a contemporary optic operative microscope setting.

• We compare the optical properties of reflective and non-reflective microsurgical instruments and biological tissue surfaces.

Fig. 1. Proposed spectral imaging system for improving instrument detection. (A) Senop HSC-2 hyperspectral camera mounted to surgical microscope. Small physical size and mounting of the camera allow for effortless manipulation of the microscope. (B) Placement of instruments against the placental tissue with perfused colored fluid.

II. BACKGROUND

A. Prior research on instrument detection

Automated detection of surgical instruments has the potential to complement various surgical operations, such as ophthalmologic, neurosurgical, and MIS procedures. Distinct methods are used for tool detection, typically during live surgeries or using phantom tissue backgrounds. Generally, three main strategies have been used for detection in the FoV: discriminative, generative, and ad-hoc methods [8].

Prior research has examined various computational methods for automatic segmentation and localization of surgical tools [9], [10]. Garcia-Peraza-Herrera et al. used deep neural networks to segment non-rigid surgical tools under white light in ex-vivo and in-vivo conditions [9]. Their offline and real-time segmentation reached 89.6% and 78.2% accuracy, respectively. Sznitman et al. proposed an optimization method to both detect and track instruments during phantom retinal microsurgery [10]. Their results showed that in situations where the tool enters and leaves the field regularly, the active testing filter (ATF) outperforms other methods and is computationally efficient (>90 FPS). However, these methods require prior knowledge of the instrument's appearance and the FoV's background, and segment only one tool at a time at 66.5% accuracy.

Automatic instrument segmentation has also been beneficial in robot-assisted surgery [11], [12]. Recently, U-Net-based segmentation algorithms were applied to gastrointestinal-surgery images and videos [11], [12]. Tuned segmentation algorithms improved segmentation time and binary classification between the instrument's background and different parts of the instrument, in contrast to older versions of the algorithm. Leppänen et al. employed convolutional neural networks (CNN) in the context of microsurgical training. The algorithm was trained on microsurgical practice videos, for which experts manually gathered a large corpus of instrument positions. The CNN correctly detected a microsurgical instrument (needle holder) with 78.3% accuracy at a recognition speed above 15 FPS [5]. In video-based training, real-time instrument detection may help to understand an expert's actions and lead to faster learning of the skills.

Spectral imaging bears vital potential to enhance current computational methods for instrument detection. First, the majority of these systems are based on data sources with white-light illumination and use color as a primary or sole feature for detection [8]. Thus, extending the imaging spectrum may overcome some of the challenges, such as visual occlusion and uneven lighting. Second, spectral-imaging methods can be adopted without modification to the design of surgical instruments (i.e., visual markers or instrument coloring). Instrument modifications can be problematic, since they often violate regulatory and sterilization demands and prevent the methods' adoption into clinical practice. Therefore, in this work we investigate how spectral imaging can enhance instrument detection, namely instrument contrast and detection efficiency.

B. Spectral imaging in surgery

In principle, a spectral imaging system comprises a light source, lenses, dispersive elements such as a grating or a prism, and a detector [13]. The acquired data cube (or spectral cube) has three dimensions (x, y, λ), where the first two dimensions represent spatial information and the third the intensity of light at each wavelength (spectral band). The amount of data is voluminous compared to conventional RGB cameras; thus, applications of spectral imaging often require computational models for data manipulation [14].
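As an illustration, the (x, y, λ) layout of a spectral cube can be sketched with a NumPy array. This is a minimal sketch, not the camera's actual data format: a reduced 256×256 spatial size keeps the example small, while the band axis matches the 510–900 nm range in 5 nm steps used later in this paper.

```python
import numpy as np

# Spectral cube with the band axis last: 79 bands cover 510-900 nm
# in 5 nm steps (a reduced 256x256 spatial size for illustration).
wavelengths = np.arange(510, 901, 5)                  # (79,)
cube = np.zeros((256, 256, wavelengths.size), dtype=np.float32)

# Spatial slice: a monochrome image at a single band, e.g. 660 nm.
band_660 = cube[:, :, np.flatnonzero(wavelengths == 660)[0]]

# Spectral slice: the full reflectance spectrum of one pixel.
spectrum = cube[128, 128, :]
```

Slicing along the band axis yields a monochrome image per wavelength, while slicing at a pixel yields its spectral signature; both views are used in the analysis below.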

The main advantage of spectral cameras over RGB cameras lies in their ability to capture an object's reflectance at many wavelength bands, including those beyond visible light. Each object's material is unique and has its own spectral signature that characterizes the object's reflectance over the entire spectrum [15]. Such properties have been leveraged for enhanced object detection in various domains, such as industry and food-quality analysis [16]. Spectral imaging has also been particularly beneficial for diagnostic applications and border enhancement, such as detecting cancerous lesions and retinal disease [17], [18]. Intraoperative applications of spectral imaging have recently been investigated due to its non-invasiveness and design flexibility [19]. To the best of our knowledge, spectral imaging has not been investigated in microsurgical tasks to enhance surgical instrument contrast.

III. METHODS

We conducted a set of spectral-imaging experiments with reflective and non-reflective microsurgical instruments situated against a cannulated placenta under an operation microscope. We aimed to create authentic conditions that would mimic the setting in operating rooms.

A. Tissue preparation

A human placenta was used as the biological tissue in the experiment, as it presents a validated simulation model for authentic tissue handling and surgical training tasks [20]. The main advantage is that the placenta comprises a variety of distinct structures, such as stroma, membranes, blood vessels, and clots, which allows comprehensive training. The sample placenta was collected with consent from the donors at the department of obstetrics at the university hospital. Prior to the experiment, we cannulated the main vessels with artificial blood circulation to recreate the pulsation of the tissue. Such a setting provided an authentic biological tissue similar to that in an operation.

B. Microscope spectral imaging system

The spectral-imaging system consisted of a custom-made spectral camera (Senop HSC-2) attached to an operation microscope (Möller-Wedel VM-900) with a custom-made C-mount. The spectral camera captures reflectance in the wavelength range 510–903 nm with a tunable Fabry-Pérot interferometer. The camera employs a frame-based spectral system and a CMOS sensor to provide snapshot images in the visible (VIS) and near-infrared (NIR) spectral ranges with a resolution of 1024×1024 pixels. The camera was mounted to the operation microscope and did not limit the microscope's mobility, manipulation, or balance, as opposed to previous setups [21]. Fig. 1 illustrates the spectral-imaging system mounted to the microscope.

The operation microscope receives light from a Xenon illumination unit through an optical fiber (Fig. 2). The diameter of the light spot in the object plane changes upon adjustments to the degree of magnification. When viewed through the eyepiece, the image appears at a constant brightness.

Fig. 2. Möller-Wedel VM-900 microscope illumination: measured spectral power distribution from the OM optics. The microscope applies NIR filtering to, e.g., reduce tissue heating, shown as a sharp reduction of intensity above 700 nm. Wavelengths below 500 nm were also progressively attenuated. Measurements were obtained with a photonic multichannel analyzer (PMA-12, Hamamatsu).

C. Spectral calibration and data acquisition

We first calibrated the spectral-imaging system to account for the microscope's optics and illumination intensity [22]. We pointed the microscope vertically 30 cm above the measuring area, set the microscope lamp to 90% of its available intensity, and recorded a set of white and dark references. The spectral camera was set to capture spectral images in the wavelength range 510–900 nm (in 5 nm steps). The exposure time for both the dark and the white reference was set to 50 ms, and the white reference was tilted at a 45-degree angle towards the microscope. The setting was developed iteratively to ensure that the resulting spectral images were correctly exposed.

After calibrating the camera, we captured a set of images of reflective and non-reflective instruments, as illustrated in Fig. 3. The non-reflective instrument was captured together with the placenta using an exposure time of 200 ms for all wavelengths. The same exposure time was used to capture the dark reference. Given this exposure time, the duration of spectral imaging for one spectral cube was approximately 30 s. Because the white reference and the sample were recorded with different exposure times, the reflectance values had to be normalized by multiplying them by (twhite/tsample), where t is the corresponding exposure time.
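The normalization described above can be sketched as follows. This is a minimal illustration with function and variable names of our own choosing, not the authors' actual pipeline: each raw frame is dark-corrected, flat-fielded against the white reference, and scaled by the exposure-time ratio.

```python
import numpy as np

def reflectance(sample, dark_s, white, dark_w, t_sample, t_white):
    """Flat-field a raw frame into reflectance at one wavelength.

    sample, white     : raw sensor frames (sample and white reference)
    dark_s, dark_w    : dark frames at the matching exposure times
    t_sample, t_white : exposure times; the ratio t_white / t_sample
        compensates for the sample being exposed longer (200 ms)
        than the white reference (50 ms).
    """
    return (sample - dark_s) / (white - dark_w) * (t_white / t_sample)

# Toy check: a white surface imaged at 4x the reference exposure
# should come out near reflectance 1.0 after normalization.
white = np.full((4, 4), 1000.0)
dark = np.full((4, 4), 50.0)
sample = dark + 4.0 * (white - dark)   # same surface, 4x the signal
r = reflectance(sample, dark, white, dark, t_sample=200.0, t_white=50.0)
```

Without the exposure-ratio factor, the longer sample exposure would inflate all reflectance values by the same 4x factor.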

Fig. 3. Instrument and placenta at wavelength 600 nm. Green spots show the areas where the reflectances were randomly sampled.

D. Data analysis of contrast in spectral images

In the analysis of the spectral cubes, we examined which wavelength maximizes the contrast between the microsurgical instrument and the biological tissue. This is a multiparameter problem, since the observed contrast depends on the microscope's optics and illumination and on the specifications of the spectral camera.

For calculating the Michelson contrast [23] (see Fig. 4), we made the following simplifying assumptions: the light source illuminates the field of view with equal intensity over all wavelengths, and the camera is a monochrome camera that is equally sensitive at all wavelengths. Calculating the contrast then reduces to comparing the reflectance values at different wavelengths.

Fig. 4. Data processing steps. The spectra of the white reference and the samples were recorded with different exposure times t′ and t, respectively, and normalized to a common exposure time when calculating reflectance. The weighting function w(λ) = 1 if the wavelength λ is included in the optimized band and 0 otherwise.

First, we obtained the reflectances from a random sample of 600 data points from evenly lit areas of the placenta and the instrument, as demonstrated in Fig. 3. Using these data points, we calculated the mean reflectance of the instrument and the placenta. Then, we evaluated the contrast between these two at each wavelength to determine the optimal wavelength range with the highest contrast. Finally, we paired each sampled instrument point with a point from the placenta and compared the increase in contrast between the original wavelength range and the optimal wavelength range.
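The per-wavelength contrast search can be sketched as follows, using the Michelson definition C = (R_hi − R_lo)/(R_hi + R_lo) on the mean reflectances. The spectra below are synthetic stand-ins for the measured data, shaped only to resemble the qualitative result (a dark instrument against a placenta that reflects most strongly near 660 nm).

```python
import numpy as np

def michelson(a, b):
    """Michelson contrast C = (R_hi - R_lo) / (R_hi + R_lo)."""
    lo, hi = np.minimum(a, b), np.maximum(a, b)
    return (hi - lo) / (hi + lo)

wavelengths = np.arange(510, 901, 5)

# Synthetic stand-ins for the measured mean spectra: a dark, flat
# instrument and a placenta whose reflectance peaks near 660 nm.
instrument = np.full(wavelengths.shape, 0.05)
placenta = 0.25 + 0.25 * np.exp(-(((wavelengths - 660.0) / 60.0) ** 2))

contrast = michelson(instrument, placenta)    # one value per band
best = int(wavelengths[np.argmax(contrast)])  # single optimal band
```

With these stand-in spectra, the band of maximum contrast coincides with the placenta's reflectance peak, mirroring the 660 nm optimum reported in the results.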

After obtaining the optimal wavelength band, we examined whether using only this band improves the detection of surgical instruments. We used unsupervised k-means clustering to label each pixel in the image and compared the clustering results between the optimal band and an RGB image calculated from the original reflectance measurements. The calculation of the RGB values requires the reflectance of the sample, the illuminant spectrum, and the RGB sensitivities of the imaging system. For the first two components, we used the previously described reflectance measurements. For the third component, we used the RGB spectral sensitivities of a Nikon D800 digital camera (measured by one of the authors). The RGB values were then balanced by multiplying the R values by 2 and the G values by 1.5 to make the colors more natural. A total of 200,000 points from the spectral and RGB images were randomly sampled to calculate the cluster centers. The k-means++ algorithm was used to initialize the cluster centers [24].

The number of clusters was set to k = 4, determined by running the clustering on the RGB image at different k values and visually inspecting the outcome. One of the four clusters detected the instrument; the other three detected other structures. The sampling and clustering process was repeated 12 times. Before classification, the specular reflections from the placenta were masked and set to 0. The classification was run on Windows 10 on an Intel Core i7-7500U CPU (2 cores, 2.70 GHz) with 16.0 GB RAM.

Pixels labeled with k-means clustering were compared to a manually annotated image of tool borders.
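The comparison against the manual annotation reduces to pixelwise confusion counts. A small sketch with toy masks follows; the rate definitions mirror the layout of Table I, but the masks and numbers here are illustrative, not the study's data.

```python
import numpy as np

def confusion_rates(pred, truth):
    """Pixelwise rates for a binary instrument mask, in percent.

    TP/FN are fractions of true instrument pixels, TN/FP fractions
    of true background pixels, accuracy the overall agreement.
    """
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = (pred & truth).sum() / truth.sum()
    tn = (~pred & ~truth).sum() / (~truth).sum()
    return {
        "accuracy": 100 * (pred == truth).mean(),
        "tp": 100 * tp, "fn": 100 * (1 - tp),
        "tn": 100 * tn, "fp": 100 * (1 - tn),
    }

# Toy 4x4 example: one background column misclassified as instrument.
truth = np.array([[1, 1, 0, 0]] * 4)
pred = np.array([[1, 1, 1, 0]] * 4)
rates = confusion_rates(pred, truth)
```

Here all instrument pixels are recovered (TP 100%), but half of the background is mislabeled (FP 50%), giving 75% overall accuracy.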

IV. RESULTS

Fig. 6 displays the reflectance of the placenta and the instrument together with the Michelson contrast at each wavelength. The placenta can be seen to have high reflectance in the wavelength range 620–700 nm. The mean contrast was calculated from a random sample of 600 instrument and placenta reflectances. The contrast over the entire wavelength range was 0.68 (SD = 0.05). Within the range 620–700 nm, the mean contrast increased to 0.73 (SD = 0.04). Compared to the original wavelength range, the mean percentage increase in contrast was 8.4% (SD = 2.3%). The maximum contrast between the placenta and the instrument was 0.77 (a 13.2% increase compared to the entire wavelength range) at a wavelength of 660 nm.

The results of pixelwise k-means clustering applied to the RGB images and to the image at the 660 nm wavelength band are shown in Table I and visualized in Fig. 5. The main improvement using only the 660 nm band was observed in the prediction time (Table I), which decreased by 20.4%. Minor improvements were observed in every classification metric.

TABLE I. AVERAGE RESULTS FROM 10 REPETITIONS OF UNSUPERVISED K-MEANS (K=4) CLUSTERING OF RGB AND 660 NM 1024×1024 IMAGE PIXELS.

Performance                  RGB             660 nm
Accuracy (SD) [%]            90.3 (<0.1)     93.2 (0.1)
True positive (SD) [%]       95.3 (<0.1)     96.1 (0.1)
True negative (SD) [%]       87.9 (0.2)      91.8 (0.02)
False positive (SD) [%]      12.1 (0.2)      8.2 (0.2)
False negative (SD) [%]      4.7 (<0.1)      3.9 (0.2)
Training time (SD) [sec]     3.233 (0.329)   2.578 (0.223)
Prediction time (SD) [sec]   0.152 (0.008)   0.121 (0.008)

Fig. 6. Mean reflectances of the tool and the placenta, with the Michelson contrast as a function of wavelength. The red line shows the contrast over the entire wavelength range. Means were calculated using 600 points sampled from the areas shown in Fig. 3.

V. DISCUSSION AND CONCLUSION

In this work, we investigated the use of wavelength optimization for contrast enhancement between microsurgical instruments and biological tissues. Our results indicate that using the wavelength range 620–700 nm can improve the contrast between non-reflective surgical instruments and biological tissues by 8.4%, and by 13.2% using a single optimal wavelength (660 nm). In addition, when using the single wavelength, the time required for prediction was reduced by 20.4%.

Fig. 5. Original RGB image (left), clustering applied to the RGB image (center) and clustering applied to the 660 nm image (right).

Our results were not without limitations. The contrast improvement was affected by the already low and uniform reflectance of the non-reflective instruments. Although we aimed to compare the results to standard reflective instruments, strong specular reflections rendered those data unusable, and they were omitted from the analysis. Specular reflections have been one of the obstacles in prior attempts at instrument detection [5]. To overcome specular reflections in other fields, such as eye tracking, near-IR illumination has been employed to improve detection performance. However, such an approach is limited in microsurgery, since the microscope's optics are designed to block near-infrared light from the microscope's illumination system.

In operation microscopes that use broadband light sources such as Xenon or Tungsten bulbs, the generated heat is a potential source of iatrogenic damage to the operated tissues [25]. To avoid excessive heat, optic filters are standard in modern surgical microscopes. The filters (heat-absorbing glass) limit the amount of energy transmitted to the tissues and hence minimize heat damage. However, fluorescent applications such as indocyanine green (ICG), utilized for visualizing perfusion in vessels, are excited by red light and emit NIR light; therefore, standard filters are not applicable. On the other hand, other tracers such as 5-ALA, used in tumor surgery, are excited by blue light at around 400 nm. Therefore, modern microscopes utilize more complex illumination systems that allow using excitation light (in the UV or NIR range) that could be valuable in instrument-detection tasks.

In future research, we will investigate the spectral range 400–500 nm. Solutions to reduce the reflections of non-coated instruments will be considered to allow for spectral data collection. Additionally, we plan to extend our investigation into the near-infrared range; however, this might require modification of the microscope illumination.

REFERENCES

[1] A. Darzi and S. Mackay, "Recent advances in minimal access surgery," BMJ, vol. 324, (7328), pp. 31-34, 2002.

[2] E. P. Westebring-van der Putten, R. H. Goossens, J. J. Jakimowicz and J. Dankelman, "Haptics in minimally invasive surgery – a review," Minimally Invasive Therapy & Allied Technologies, vol. 17, (1), pp. 3-16, 2008.

[3] D. J. Mirota, M. Ishii and G. D. Hager, "Vision-based navigation in image-guided interventions," Annu. Rev. Biomed. Eng., vol. 13, pp. 297-319, 2011.

[4] M. Alsheakhali, "Machine Learning for Medical Instrument Detection and Pose Estimation in Retinal Microsurgery," Technische Universität München, 2017.

[5] T. Leppänen et al., "Augmenting Microsurgical Training: Microsurgical Instrument Detection Using Convolutional Neural Networks," in 2018 IEEE 31st International Symposium on Computer-Based Medical Systems (CBMS), pp. 211-216, 2018.

[6] E. L. Wisotzky, F. C. Uecker, P. Arens, S. Dommerich, A. Hilsmann and P. Eisert, "Intraoperative hyperspectral determination of human tissue properties," J. Biomed. Opt., vol. 23, (9), pp. 1-8, 2018.

[7] B. Martinez et al., "Most Relevant Spectral Bands Identification for Brain Cancer Detection Using Hyperspectral Imaging," Sensors (Basel), vol. 19, (24), 2019, doi: 10.3390/s19245481.

[8] D. Bouget, M. Allan, D. Stoyanov and P. Jannin, "Vision-based and marker-less surgical tool detection and tracking: a review of the literature," Med. Image Anal., vol. 35, pp. 633-654, 2017.

[9] L. C. García-Peraza-Herrera et al., "Real-time segmentation of non-rigid surgical tools based on deep learning and tracking," in International Workshop on Computer-Assisted and Robotic Endoscopy, 2016.

[10] R. Sznitman, R. Richa, R. Taylor, B. Jedynak and G. Hager, "Unified Detection and Tracking of Instruments during Retinal Microsurgery," IEEE Trans. Pattern Anal. Mach. Intell., vol. 35, pp. 1263-1273, 2013.

[11] D. Pakhomov, V. Premachandran, M. Allan, M. Azizian and N. Navab, "Deep residual learning for instrument segmentation in robotic surgery," in International Workshop on Machine Learning in Medical Imaging, 2019.

[12] A. A. Shvets, A. Rakhlin, A. A. Kalinin and V. I. Iglovikov, "Automatic instrument segmentation in robot-assisted surgery using deep learning," in 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), 2018.

[13] B. Boldrini, W. Kessler, K. Rebner and R. W. Kessler, "Hyperspectral Imaging: A Review of Best Practice, Performance and Pitfalls for in-line and on-line Applications," J. Near Infrared Spectrosc., vol. 20, (5), pp. 483-508, 2012.

[14] J. Han, J. Pei and M. Kamber, Data Mining: Concepts and Techniques. 2011.

[15] V. V. Tuchin, Tissue Optics: Light Scattering Methods and Instruments for Medical Diagnosis. 2007.

[16] D. Liu, X. A. Zeng and D. W. Sun, "Recent developments and applications of hyperspectral imaging for quality evaluation of agricultural products: a review," Crit. Rev. Food Sci. Nutr., vol. 55, (12), pp. 1744-1757, 2015.

[17] V. Zheludev et al., "Delineation of malignant skin tumors by hyperspectral imaging using diffusion maps dimensionality reduction," Biomedical Signal Processing and Control, vol. 16, pp. 48-60, 2015.

[18] V. Nourrit et al., "High-resolution hyperspectral imaging of the retina with a modified fundus camera," J. Fr. Ophtalmol., vol. 33, (10), pp. 686-692, 2010.

[19] H. Akbari and Y. Kosugi, "Hyperspectral imaging: A new modality in surgery," INTECH Open Access Publisher, pp. 223-240, 2009.

[20] M. M. Ribeiro de Oliveira et al., "Face, content, and construct validity of human placenta as a haptic training tool in neurointerventional surgery," J. Neurosurg., vol. 124, (5), pp. 1238-1244, 2016.

[21] P. Bartczak et al., "Spectral video in image-guided microsurgical applications: Integrating imaging technology into the clinical environment and ergonomic considerations," in 2018 IEEE 31st International Symposium on Computer-Based Medical Systems (CBMS), 2018.

[22] D. G. Pelli and P. Bex, "Measuring contrast sensitivity," Vision Res., vol. 90, pp. 10-14, 2013.

[23] H. Kukkonen, J. Rovamo, K. Tiippana and R. Näsänen, "Michelson contrast, RMS contrast and energy of various spatial stimuli at threshold," Vision Res., vol. 33, (10), pp. 1431-1436, 1993.

[24] D. Arthur and S. Vassilvitskii, "k-means++: The advantages of careful seeding," in Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, 2007.

[25] P. Gayatri, G. G. Menon and P. R. Suneel, "Effect of operating microscope light on brain temperature during craniotomy," J. Neurosurg. Anesthesiol., vol. 25, (3), pp. 267-270, 2013.
