
5.1. The FZ5-system

To begin this project, images were first logged with a single camera and used to perform offline tests. Only images of Autonet and Gold material were logged. Figures 27 – 30 show some images with defects on which the “Gravity and Area” and “Defect” methods were tested in offline measurements. Parameters were set so that the system correctly judged all faultless images.

(a) (b)

(c) (d)

Figure 27. These large area defects with deviant intensities were reliably detected by the Omron system in the offline tests. (a) a joint where the layers do not lie flat against each other casts a shadow visible as a dark band. (b) another joint similar to the one in (a). (c) the net material has been joined by stitching. (d) a hole has been ripped in the material.

(a) (b)

Figure 28. These defects were detected in some images by the Omron system in offline tests (several images were captured of the same defect as it moved by). (a) a band with slightly higher intensity than normal. The cause of this band is unknown. (b) someone has used a marker pen to mark a specific spot.

(a) (b)

Figure 29. Defects in Gold material that could not be detected by the FZ5-system.

(a) a glued joint with nearly identical intensities on both sides. After a bit of processing, a very thin zigzag line can be seen, but texture analysis of the prints would be required to detect the joint in software. (b) a thin line drawn by a marker pen. The line is too narrow, and its intensity too close to that of the printings, to be detected. Shape analysis would be required to detect this line.

(a) (b)

(c) (d)

Figure 30. Defects in Autonet that could not be detected by the FZ5 system. (a) a scratch visible as a bright line. (b) a horizontal fold. (c) these bright marks are caused by unsuccessful coating processes. Glue or some other chemical may have formed small local accumulations. Image processing has been used to enhance visibility. (d) another fold in the material.

When the results were satisfactory, the online system was programmed and taken into use. Both the algorithm and the parameters were slightly adjusted to match the two-camera setup and to work for all materials. A two-hour test was performed on Autonet.

Not a single false alarm stopped the machine. During that time, two different defects were detected: a joint similar to the one in Figure 27 (c) and a hole similar to the one in Figure 27 (d). At least one defect passed by undetected, a small spot with a size of about 1 cm².

A several hours long test was also done on Gold material. During this time, several defects were detected, but the machine was also stopped a couple of times due to false alarms. Parameters were adjusted to lower the sensitivity. Figure 31 shows an example of a typical case that causes problems. Figures 32 – 39 show defects that were detected during the test. Check the list on the right side of the GUI to see which methods detected the defect; the icon of a method that detects a defect turns red.

Figure 31. A dark shadow is formed as the material is slightly folded near the right edge. If this happens frequently, the sensitivity must be lowered.

Figure 32. A joint is detected, but only by one of the six methods.

Figure 33. A different kind of joint, detected by the “Defect” method in both images.

Figure 34. The “Defect” method detects a flag at the left edge of the material. Flags are used to mark defects that have been detected in earlier stages.

Figure 35. Loose grip cloth is detected by the “Defect” method when a dark shadow is formed near the left edge. Uniformly coloured defects cannot be detected; since the defect covers the second image entirely, it is not detected there.

Figure 36. The edge of overlapping grip cloth is detected, since now there is a notable difference in intensities and the “Gravity and Area” method reacts.

Figure 37. A bright band is detected in the left image by “Gravity and Area”.

Figure 38. The width of this stripe is near the limit of what can be detected. It is not detected in the upper image, but in the bottom image it is slightly broader in a short section. At this location it is detected by the “Defect” method.

Figure 39. A joint has been strengthened with some tape, which is detected.

5.2. Texture analysis with the Fourier transform

Testing of this algorithm was done on the images of Gold and Autonet logged in the beginning of the project. It became clear that this algorithm would also require parameter adaptation over time when it runs. Two images of Gold material logged on different days are shown in Figure 40. The majority of Gold images were logged during a single run lasting a couple of hours. Five images were logged on another day to see if the results were consistent. They were not. One reason for this appeared to be the amount of dust on the lights. The algorithm should not be sensitive to the total light intensity, but it seems that the dust scatters the light so that the illumination becomes more diffuse and uniform.
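The Matlab implementation itself is not listed here. As a rough illustration of how the axis-profile features named in Figures 41 – 48 (peak magnitude, power sum, standard deviation and centre of gravity along each axis of the spectrum) might be computed, consider the following Python sketch. The exact formulas, any log-scaling, and the handling of the DC component are assumptions, not the thesis's definitions.

```python
import numpy as np

def axis_profile_features(img):
    """Illustrative sketch (not the thesis code): compute statistics of
    the x- and y-axis profiles of the 2-D magnitude spectrum."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    cy, cx = spec.shape[0] // 2, spec.shape[1] // 2
    # Positive-frequency halves of the two axes, DC component excluded.
    x_prof = spec[cy, cx + 1:]
    y_prof = spec[cy + 1:, cx]

    def stats(p):
        idx = np.arange(1, p.size + 1)
        return {
            "peak_magnitude": p.max(),                       # cf. features 2 / 4
            "power_sum": p.sum(),                            # cf. feature 3
            "std_dev": p.std(),                              # cf. features 5 / 6
            "centre_of_gravity": (idx * p).sum() / p.sum(),  # cf. features 7 / 8
        }

    return {"x": stats(x_prof), "y": stats(y_prof)}
```

A print that repeats every 8 pixels along x, for example, would concentrate its x-profile power 8 bins from DC, and the centre of gravity would sit near that bin.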

The generated membership functions are shown in Figures 41 – 48. Feature one is not mapped through a membership function, so the plots show features 2 – 8 for both Gold and Autonet. Data points for Gold are shown as green crosses, while data points for Autonet are shown as blue circles. As discussed earlier, only the feature values for a single material are used at a time. The data points for the other material are included in the plots merely for visualization. The y-value of a data point is the membership assigned to the point by the fuzzy c-means clustering algorithm. Note that for Gold, five points have been assigned significantly lower values than the rest. For Autonet, a couple of points also cause problems. A k-value of 10 is used.
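The clustering step can be sketched as follows. This is a generic 1-D fuzzy c-means implementation for illustration, not the thesis's Matlab code; the fuzziness exponent m = 2 and the spread initialization are assumptions, and the membership assigned to a feature value is taken as its strongest cluster membership.

```python
import numpy as np

def fuzzy_cmeans_1d(x, k=10, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means sketch. Returns cluster centres and
    the membership matrix u (points x clusters)."""
    c = np.linspace(x.min(), x.max(), k)              # spread initial centres
    for _ in range(iters):
        d = np.abs(x[:, None] - c[None, :]) + 1e-12   # point-centre distances
        u = 1.0 / d ** (2.0 / (m - 1.0))              # standard FCM memberships
        u /= u.sum(axis=1, keepdims=True)
        c = (u ** m).T @ x / (u ** m).sum(axis=0)     # weighted centre update
    return c, u

def membership(value, centres, m=2.0):
    """Membership of a new feature value = its strongest cluster membership."""
    d = np.abs(value - centres) + 1e-12
    u = 1.0 / d ** (2.0 / (m - 1.0))
    return (u / u.sum()).max()
```

A faultless feature value close to a cluster centre receives a membership near 1, while an outlier far from every centre is shared between clusters and gets a low maximum membership.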

(a) (b)

Figure 40. Images of Gold logged on different days. The environment is quite dirty and the difference in illumination between the two images is probably due to the amount of dust on the line lights.

Figure 41. Membership functions for feature 2, magnitude of peak frequency in x-direction, and feature 3, sum of frequency power in x-direction, for Gold.

Figure 42. Membership functions for feature 4, magnitude of peak frequency in y-direction, and feature 5, standard deviation of frequency power in x-direction, for Gold.


Figure 43. Membership functions for feature 6, standard deviation of frequency power in y-direction, and feature 7, centre of gravity for frequency power in x-direction, for Gold.

Figure 44. Membership function for feature 8, centre of gravity for frequency power in y-direction, for Gold.


Figure 45. Membership functions for feature 2, magnitude of peak frequency in x-direction, and feature 3, sum of frequency power in x-direction, for Autonet.

Figure 46. Membership functions for feature 4, magnitude of peak frequency in y-direction, and feature 5, standard deviation of frequency power in x-direction, for Autonet.


Figure 47. Membership functions for feature 6, standard deviation of frequency power in y-direction, and feature 7, centre of gravity for frequency power in x-direction, for Autonet.

Figure 48. Membership function for feature 8, centre of gravity for frequency power in y-direction, for Autonet.

A large k-value, in this case 10, is needed to generate membership functions that assign acceptable memberships to all data points, see for example features 4 and 6 for Gold.

However, this large a value makes the algorithm quite insensitive, and in an online application, where images taken on different days are not analysed at the same time, a smaller value could likely be used.


The generated parameters are first used to analyse a set of faultless images. The total number of faultless images analysed is 108 of Gold and 118 of Autonet. The ten lowest memberships for both materials are listed in Table 2. To correctly judge all images, quite low limit values would have to be used.

Table 2. Lowest memberships for faultless images.

Gold     0.19  0.20  0.21  0.24  0.31  0.49  0.52  0.63  0.79  0.79
Autonet  0.13  0.20  0.21  0.32  0.37  0.37  0.40  0.62  0.67  0.69

The same parameters are then used to analyse a set of images that contain defects. The limit value is set to 0.3 for both materials, which would have caused some faultless images to be misjudged. However, the attention should be on the membership values and not on the final conclusion. What follows is a summary of the results. A series of images used in this test, along with results from the algorithm, can be found in Appendix D.
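A limit check of this kind could look like the hypothetical sketch below. How the per-feature memberships are combined into one judgement is not stated above, so combining them by their minimum is an assumption.

```python
def judge_image(feature_memberships, limit=0.3):
    """Hypothetical judging rule: the image passes only if every feature
    membership reaches the limit value (0.3 was used in this test).
    Combining features by their minimum is an assumption."""
    worst = min(feature_memberships.values())
    return ("OK" if worst >= limit else "NG"), worst

# Example: one weak feature is enough for an "NG" judgement.
verdict, worst = judge_image({"f2": 0.8, "f5": 0.19, "f7": 0.6})
```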

Large area intensity changes are detected, such as holes and tape joints. Even small changes in intensity are enough as long as the area is large enough to have an impact on the magnitude spectrum. An example is shown in Figure 49.

(a) (b)

Figure 49. This image contains three bands that are just slightly brighter than normal and barely visible to the eye. The area is large enough for them to be detected. (a) the original image used in the test. (b) the same image processed for better visibility of the defects.

Small area intensity changes are usually not detected, even if the defect is significantly brighter or darker than the surrounding material. The total number of pixels in the image is quite large, so if only a few differ from normal, the impact on the spectrum is unnoticeable. An example is shown in Figure 50.

Figure 50. The total area of this defect is quite small, and the algorithm has problems detecting it, even though the intensity values are significantly lower than in the rest of the image.
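The difference between the two cases can be reproduced on synthetic data. In the sketch below, the image size, noise level and defect dimensions are invented for illustration: a wide, low-contrast vertical band changes the x-axis profile of the magnitude spectrum far more than a small, high-contrast spot does.

```python
import numpy as np

rng = np.random.default_rng(0)
base = rng.normal(100.0, 2.0, (128, 128))   # faultless textured background

spot = base.copy()
spot[60:64, 60:64] -= 50.0                  # small (4 x 4 px) but strong defect

band = base.copy()
band[:, 40:80] += 3.0                       # wide but weak vertical band

def x_profile_change(img, ref):
    """Largest change in the x-axis profile of the magnitude spectrum."""
    prof = lambda a: np.abs(np.fft.fft2(a))[0, 1:64]  # ky = 0 row, DC excluded
    return np.abs(prof(img) - prof(ref)).max()
```

Despite having a fraction of the spot's contrast, the band dominates the axis profile, while the small spot barely rises above the noise floor of the spectrum.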

Folds that change the positions of prints are reliably detected, at least in the Autonet material. An example of this kind of defect was shown in Figure 21. Some images of Gold material with this kind of defect were also analysed, but only very few. Every one of these was detected, but they all also contained abnormal intensity values, and it is therefore hard to tell the cause of detection.

Scratches in the material are occasionally detected if their relative positions happen to cause a strong response at some particular frequency, but usually they are not. An example of scratches in Autonet is shown in Figure 51.

Figure 51. Example of scratches in Autonet.

5.3. Summary of the results

The FZ5-system detects intensity variations reliably, as long as the difference in intensity compared to the background is large enough. Some false alarms are caused by shadows near the edges, especially on paper-type materials. Better detection capabilities and more stable performance could be achieved by using other available methods in the library, but this would require a system version with more processing power. Changes in texture cannot be detected by the available methods.

The algorithm developed in Matlab reacts strongly to changes in the relative positions of prints. It detects folds and joints that shift the prints, and also large area intensity changes. Defects that differ significantly in brightness are not necessarily detected if the area is too small.

6. CONCLUSIONS

A complete machine vision system, based on the Omron FZ5, has been implemented for defect detection in KWH Mirka’s production line. It fulfils the minimum requirements listed in Section 1.1. The algorithm reacts only to intensity variations, since there were no methods available in the library that were well suited for texture analysis. If improvements are to be made, the first step would be to improve the lighting near the edges of the material. More uniform lighting would allow a more sensitive system without the risk of stops due to misclassification of faultless material. Investing in a high-speed version of the FZ-controller would give better results, but it would probably be only a minor and expensive improvement.

The Omron FZ5-system is not at all suitable for detecting changes in texture. If sensitivity to defects that do not cause intensity variations is needed, another platform must be used. A program was developed in Matlab to test whether it is possible to use the Fourier transform to analyse the printings on the backside of the abrasive materials. The method was able to detect shifts in the relative positions of printings, which usually indicate folds or joints. This method relies on parameters extracted from model images.

A function was written that automatically calculates these parameters, and in an online version, this function should be used to continuously adapt them.

6.1. Future work

More research on the effects of different defects on the frequency spectrum would make it possible to optimize the features that are used. Correct choice of features is the key to successful detection. Limiting the use of the spectrum to only the x- and y-axes was a way to simplify the selection, but it would be interesting to analyse other regions of the spectrum as well. Research on the behaviour of the FFT itself could also be beneficial, for example how border effects affect the result and whether the use of window functions could improve it. An implementation that performs the Fourier transform over several smaller areas would also be interesting to experiment with.
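The border-effect idea can be illustrated with a small experiment. A frame cut from a continuously moving web almost never contains a whole number of print periods, so its borders are discontinuous and power leaks across the spectrum; tapering the frame with a window function concentrates the power again. The signal length and frequency below are arbitrary illustration values.

```python
import numpy as np

def leakage_fraction(signal):
    """Fraction of spectral power further than 3 bins from the peak -
    a rough measure of leakage caused by border discontinuities."""
    p = np.abs(np.fft.rfft(signal)) ** 2
    k = int(p.argmax())
    lo, hi = max(k - 3, 0), min(k + 4, p.size)
    return 1.0 - p[lo:hi].sum() / p.sum()

n = 256
t = np.arange(n)
# A tone whose period does not divide the frame length, so the frame
# borders are discontinuous and smear power across the spectrum.
tone = np.sin(2 * np.pi * 10.37 * t / n)

raw = leakage_fraction(tone)
windowed = leakage_fraction(tone * np.hanning(n))
# Tapering the borders with a Hann window concentrates the power back
# into a few bins around the true frequency.
```

A sharper spectral peak of this kind would directly benefit features such as the magnitude of the peak frequency.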

A combination of both texture and intensity analysis would probably yield the best results for an online system. This requires replacing the FZ5-system with a more powerful platform. I would use two line scan cameras with relatively low resolution, along with a high quality, focused line illumination. The flexibility of a standard PC would be desirable. Measuring and plotting the results for one image in the Matlab algorithm took about half a second on my computer with an Intel Core i5-3337U 1.80 GHz processor and the Windows 8.1 operating system. Parallel processing or pipelining would be needed to implement both intensity and texture analysis along with parameter updates in online measurements. This could be realized by using a DSP or an FPGA located on a frame grabber board to implement the FFT, before sending the transformed images to the CPU, which would handle feature calculations, mapping through membership functions and parameter updates.


APPENDIX A

Figure 1A shows the blocks used in the Omron FZ program. The figure is a screenshot of the flow viewer window.

0. Camera Image Input. This block communicates with the connected cameras. Used to set camera parameters and acquire images from both cameras.

1. Advanced Filter. This block can apply several sequential filter operations to an image. Used for dilating the image.

2. Color Data. Calculates mean intensity and standard deviation when used on monochrome images. Used to classify materials based on the intensity value.

3. Conditional Branch. Used to make the program jump to specific locations depending on the result of a Boolean expression.

4. Calculation Macro. Can be used to write several macros that output numerical values. This block has a graphical window where one can set limits for the results of the implemented calculations and thus get an output judgement of “OK” or “NG”. This instance of the block is used to calculate and set parameters for the “Gravity and Area” and “Defect” methods if the material is of net-type.

5. Advanced Filter. Used to remove a single bright spot if the material is of net-type. Only one ROI can be specified, so several blocks are needed to remove several spots.

6. Advanced Filter. Used to remove a bright spot.

7. Conditional Branch. Skips block number 8 if the material is of net-type. (The programming of the system is performed by dragging every block into a single list.)

8. Calculation Macro. Calculates and sets parameters for the “Gravity and Area” and “Defect” methods if the material is of paper-type.

9. Gravity and Area. Measures one half of the input image.

10. Gravity and Area. Measures the other half of the input image.

11. Defect. Measures the input image for local intensity variations.

12. Camera Switching. Used to change the measurement image when several images have been acquired into memory. Used to change to the second camera image.

13 – 24. Perform the same steps on the second camera image.

25. Macro. Block for writing program macros in a text editor. Used to check the total judgement result and perform a wait instruction if the result is “NG”.

Figure 1A. The flow of the FZ program as viewed in the flow viewer window.