1. Introduction

Yarn hairiness refers to the number of fibers protruding from the yarn core, and it directly affects yarn properties, weaving efficiency, and fabric appearance.1 Fiber properties (e.g., fiber length), spinning methods (e.g., ring spinning), and spinning conditions (e.g., temperature) are the key factors that influence the severity of yarn hairiness. As an important indicator, the yarn hairiness index has been included in yarn quality requirements. Despite its importance, yarn hairiness has not been used in yarn quality evaluations as widely as other properties, such as yarn strength and fineness, partly because it is more difficult to measure accurately.2 Since the 1950s, many different methods have been developed to measure hairiness.3 These methods include appearance comparison, microscopy, singeing, and electrical and optical detection. In the appearance method, inspectors observed the yarn with the naked eye and assigned a hairiness level by comparison with a control board.3 Pillay4 examined the yarn under a microscope and traced hairy fibers using a curve measuring instrument. Boswell and Townend5 estimated the total weight of the hairy fibers by comparing the yarn's weight before and after singeing. The singeing method has a shortcoming: it is difficult to control the timing so that only free fiber ends are burnt without damaging the yarn core. In the past several decades, electrical and optical methods became the dominant way to measure yarn hairiness, in two basic types: projection counting and the full feather photoelectric test. The latter is also called the diffuse reflectance method.6 The German Zweigle G566,7 the Chinese Changling YG172L,8 and the Japanese Keisokki Laserspot LST-III are examples of instruments based on projection counting.
The yarn was projected onto a plane, and a light-activated triode was placed at a monitoring point set at a fixed distance from the yarn core. When the yarn passed the monitoring point, hairy fibers longer than this distance blocked the light, causing the triode to generate electrical pulses. The yarn hairiness was measured by counting the pulses. In the full feather photoelectric method, as applied in the Swiss Uster tester,7 a monochrome light source was collimated through a convex lens to irradiate the yarn. The yarn hairiness signals could then be acquired from the change in light flux.9 As shown in Fig. 1, the photoelectric approach has a common drawback in principle: the instruments can only perceive the projected lengths of protruding fibers in a two-dimensional plane, which distorts the hairiness data.10 Since the late 1990s, yarn hairiness measurement has been advanced by adopting image processing technology,11 which allows individual fibers to be examined directly. Cybulska12 developed special methods to estimate the external structure of yarns and to provide both global and local numeric characteristics of hairiness and twist. Kuzanski and Jackowska-Strumiłło13 presented an edge-detection algorithm for measuring the true lengths and number of the protruding fibers. Carvalho et al.14 used the LabVIEW platform to develop a custom-made application that automatically determines yarn hairiness. Zhang15 captured images of a yarn placed on a black velvet board and processed the images with gray transformation, tilt correction, denoising, and edge sharpening. With the use of morphological operations, the characteristic parameters of yarn hairiness, such as yarn perimeter, yarn area, and shape, could be calculated.
Guha et al.16 developed an algorithm capable of analyzing yarn images taken under varying lighting conditions and varying yarn positions, and defined a new hairiness index to replace the traditional hairiness indicators. Fabijanska et al.17 adopted new image analysis algorithms to process yarn images, such as the graph cut method for yarn core extraction and high-pass filtering for fiber extraction. They also proposed two new measures, hair area index and hair length index, to quantify yarn hairiness. Since tracing hairy fibers requires the yarn to be observed under a microscope, the imaging system used to acquire a yarn image often has a limited depth of field (DOF) because of its high magnification. Protruding fibers beyond the DOF will be defocused and blurry in the image. If the fibers are directly extracted and traced with the known algorithms, their true lengths are often miscalculated, introducing errors into the hairiness data. Thus, coping with fuzzy curly fibers in the yarn image is the common problem in developing effective image processing algorithms for hairiness measurement. This paper is aimed at solving the off-focus problem of hairy fibers in a yarn image by using an imaging scheme: multifocus image fusion. Since the 1950s, image fusion concepts and methods have been developed and applied in a wide range of fields,18 such as remote sensing, photogrammetry, medical imagery, feature extraction, and object recognition.
Multifocus image fusion uses a set of images of the same scene captured at various focal planes of a camera to construct a clear and sharp image in which all objects are in focus.19 Multifocus images comprise complementary information that can be seamlessly merged into one image more suitable for microscopic analysis of material properties; they have been successfully used to measure fiber diameters, improving the measurement accuracy by more than 20%,20 and to measure fiber orientation in nonwoven structures.21 In this new application for yarn hairiness measurement, the yarn passing through the microscope is advanced intermittently, and at each stop the yarn segment is imaged multiple times on different focal planes. The sequential images of the same view, called multifocus images, contain different sharp portions of the curving fibers, which are complementary in constructing a fused image. The new fused image consists of the sharpest pixels selected from the multifocus images, and permits more accurate image segmentation, which is fundamental to true measurements of the fiber lengths sticking out of the yarn core. This paper will first introduce the acquisition method for multifocus images of yarn, briefly describe the algorithms for image fusion, segmentation, and integration, and finally present the experimental results of six different yarns and the comparisons with the Uster measurements.

2. Method

The yarn hairiness measurement system we developed in this project involved capturing microscopic images, multifocus image fusion and other processing algorithms, and modified hairiness indices.

2.1. Capturing Yarn Images

In this study, we utilized an optical microscope (M318, BEION, Shanghai) to obtain images of yarns unwrapped from a bobbin.
A driving device was installed to move the microscope platform along the yarn's axial direction so that the yarn could be advanced through the field of view.20 The magnification of the objective lens was set at . A digital camera (C200, BEION) was mounted on the top of the microscope to capture images with a size of , which approximately covered an actual area of on the yarn surface. According to the definition of the yarn hairiness index (H) as the total fiber length relative to the measurement field length of 1 cm,22 the measured yarn needs to be divided into 1-cm sections for the calculation. Therefore, the yarn sample was advanced 10 times at a 1-mm interval to yield the measurements in 1-cm sections. At each stop, 10 images were grabbed at different focal depths by adjusting the z-axis of the microscope,20 and then fused into one sharp image in which the protruding fibers were all well focused. Figure 2 shows two example images of a polyester yarn with hairy fibers protruding from the yarn core, captured at different focal points. Figure 2(a) was taken at a closer focal plane than Fig. 2(b), demonstrating that different partially focused images possess complementary information for constructing a fully focused image.

2.2. Image Fusion

Image fusion is a process that integrates information from multiple images to alleviate the multifocal problem of microscopic imaging. We employed a point-based image fusion algorithm to fuse multifocus images of fibers.21 Multifocus images refer to a series of images of the same view focused at different depths. Each multifocus image represents one image layer in which the image is only partially focused due to the limited DOF of the microscope. The magnitude of the intensity gradient at a pixel is often regarded as a measure of the sharpness of the pixel. Let N denote the number of image layers.
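The pixel-level selection rule, choosing at each coordinate the image layer with the largest gradient magnitude, can be sketched in Python with NumPy. This is an illustrative sketch, not the authors' implementation; in particular, the use of np.gradient as the gradient operator is an assumption.

```python
import numpy as np

def fuse_multifocus(layers):
    """Fuse a stack of grayscale multifocus images of shape (N, H, W).

    For every pixel, keep the value from the image layer whose local
    gradient magnitude (used as a sharpness measure) is largest.
    """
    stack = np.asarray(layers, dtype=float)
    sharpness = np.empty_like(stack)
    for i, img in enumerate(stack):
        gy, gx = np.gradient(img)          # intensity gradients per layer
        sharpness[i] = np.hypot(gx, gy)    # gradient magnitude
    # index matrix: for each pixel, the layer with the sharpest response
    best = np.argmax(sharpness, axis=0)
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```

Ties in sharpness (e.g., featureless regions) fall back to the first layer, which is harmless because such pixels carry no focus information in any layer.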
The magnitudes of the image gradients in each image layer are calculated, and the gradients at the same coordinate are compared across the layers. A matrix can be constructed by filling each element with the number of the image layer on which the image gradient has the largest value among all the layers. Through this gradient selection, the information extracted pixel by pixel from the multifocus images composes a new image whose pixels have the highest sharpness values among all the image layers. The new image is called the fused image. Figure 3 displays the fused image of the multifocus images of the polyester yarn in Fig. 2. The circles in the image mark a fiber end where, after the image fusion, the previously fuzzy fiber becomes sharply focused. The fused image warrants more complete detection of fringe fibers in the subsequent processing.

2.3. Image Segmentation

Image segmentation is the process of extracting the regions of interest from the image background. When the intensity distributions of object and background pixels are sufficiently distinct, a single threshold can be used to segment an image. However, determining a threshold suitable for the entire image is always a challenge. An iterative algorithm17 capable of automatically estimating the threshold was adopted in this project: starting from an initial threshold (e.g., the mean image intensity), the pixels are split into foreground and background groups, the threshold is updated to the average of the two group means, and the procedure repeats until the threshold converges.
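A minimal sketch of this iterative threshold estimation follows; the convergence tolerance of 0.5 gray levels is an assumption, not a value from the paper.

```python
import numpy as np

def iterative_threshold(image, eps=0.5):
    """Estimate a global threshold by the classic iterative scheme:
    split pixels at the current threshold, average the two class means,
    and repeat until the threshold stabilizes."""
    t = image.mean()                      # initial guess: mean intensity
    while True:
        fg = image[image > t]
        bg = image[image <= t]
        # guard against an empty class
        new_t = 0.5 * ((fg.mean() if fg.size else t) +
                       (bg.mean() if bg.size else t))
        if abs(new_t - t) < eps:
            return new_t
        t = new_t
```

On a strongly bimodal image the scheme converges in a few iterations to a threshold between the two modes.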
Figure 4 shows the polyester yarn of Fig. 3 after the image segmentation. The fuzzy fiber in the multifocus image [Fig. 2(a)] was missed in the image segmentation [Fig. 4(a)] because of its low contrast with the background. Having been enhanced by the image fusion, that fiber was fully segmented from the background, as shown in Fig. 4(b).

2.4. Image Integration

As explained in the previous section, the successive images of a yarn need to be integrated to form 1-cm long sections for hairiness measurement. We performed morphological erosion23 on the binary image to remove thin hairy fibers, and then skeletonization to extract the middle axis of the yarn core.23 The middle axes in the separate images served as the reference lines for the integration. The reference lines from 10 images were successively linked, and finally a yarn segment with a length of 1 cm could be formed. Figure 5(a) displays the integrated image formed from 10 original images taken at the same location under the microscope, which contain both clear and blurred areas of yarn hairiness. As shown in Fig. 5(c), the same fibers falling into adjacent images are perfectly connected, without the missing fibers seen in Fig. 5(b). In the literature,12 Cybulska divided the yarn into two basic elements, the yarn core and hairiness, according to compact agglomeration and mechanical properties. In this paper, in order to obtain the information about the yarn hairiness, the orientation of the yarn core in each image needs to be identified as the integration progresses. The morphological opening, that is, erosion followed by dilation, can remove thin objects such as hairy fibers and retain thick objects such as the yarn core. Figure 5(d) shows the separated yarn core and protruding fibers, marked in red and green, respectively. From Fig.
5(d), which represents a 1-cm yarn segment, we can measure the area of the yarn core, the yarn hairiness area, and the total hairiness length of the protruding fibers. Then, we can calculate the hairiness area ratio (A) and the hairiness value (H). A is defined as the area of the protruding fibers divided by the total area of the yarn within the 1-cm segment. H refers to the total length of the protruding fibers on the 1-cm segment; that is, H is numerically equal to the total protruding-fiber length expressed in centimeters.

3. Experiments

Table 1 shows the specifications of the six yarns, which were made of polyester, viscose rayon, cotton, and their blends, and spun with ring, siro, rotor, and compact spinning machines, respectively. In this experiment, the yarn hairiness measurements were obtained via three methods: (1) image processing without multifocus image fusion, (2) image processing with multifocus image fusion, and (3) the Uster tester. Since the image processing system was not fully automated, particularly in the yarn advancing device, and the yarn had to remain stationary for the acquisition of sequential images, we were only able to capture and analyze approximately 10 consecutive images in 1 min, that is, one 1-cm yarn segment per minute. Therefore, we measured one 1-cm segment per meter for a total length of 10 m of yarn per sample, and calculated the yarn core diameter, hairiness area ratio (A), and hairiness value (H). On the other hand, the Uster tester processed 1000 m of yarn per sample at a speed of with an output.

Table 1 Specifications of test yarns.
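The core/hair separation by morphological opening and the two indices defined above can be sketched as follows. This is illustrative only: the structuring-element size, the pixel size, and the pixel-count approximation of fiber length (in place of the paper's skeleton-based lengths) are all assumptions.

```python
import numpy as np
from scipy import ndimage

def hairiness_metrics(binary, pixel_size_cm, core_width_px):
    """Split a binary yarn image into core and hairs, then compute
    the hairiness area ratio A and the hairiness value H.

    A = hair area / total yarn area within the 1-cm segment.
    H = total protruding-fiber length per 1-cm segment (approximated
        here by counting pixels of ~1-px-wide fibers).
    """
    # morphological opening removes objects thinner than the yarn core
    structure = np.ones((core_width_px, core_width_px), dtype=bool)
    core = ndimage.binary_opening(binary, structure=structure)
    hairs = binary & ~core
    A = hairs.sum() / max(binary.sum(), 1)
    H = hairs.sum() * pixel_size_cm      # fiber length in cm per segment
    return core, hairs, A, H
```

A larger structuring element removes thicker protrusions; in practice it would be matched to the core diameter measured from the skeletonized image.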
Figure 6 shows the comparisons of the total areas and the yarn core areas of the six yarns. It can be seen that the total area of a fused image is larger than that of the corresponding image without fusion, whereas the yarn core area exhibits the opposite trend: the core areas of the fused images are slightly lower than those of the unfused images. This is because, in the fused image, hairy fibers are more complete and the edges of the yarn core are sharper than in the unfused counterpart. Tables 2–4 provide the averages and the standard deviations of the three yarn measurements (D, A, and H) for the 10 samples of each yarn, and the Pearson correlation coefficients24 between these measurements from the unfused and fused images. From Table 2, it was clear that the diameters (D) of the yarn cores became 2 to thinner after the image fusion, which was consistent with the change in the core area. However, the two sets of diameter data for each yarn were highly correlated, with a minimum correlation of 0.982 for yarn 4 and a maximum of 0.999 for yarn 5. In Table 3, the hairiness area ratios (A) increased by 4% to 10% among the six yarns after applying the image fusion. The standard deviations (SDs) of the measurements from the fused images also tended to be higher than those from the unfused images. The two sets of measurements had fairly high correlation coefficients (from 0.834 to 0.992). In Table 4, the H measurements increased roughly 0.5 to 1.1 units after applying the image fusion technique, meaning that approximately 1 cm more of hairy fibers was detected per 1-cm yarn segment. The correlations of H between the fused and unfused images ranged from 0.884 (yarn 3) to 0.974 (yarn 5). The SDs of the six yarns were also slightly increased in the fused images.

Table 2 Average diameters (D) of yarn cores with the image system.
Table 3 Measurements of hairiness area ratios (A) with the image system.
Table 4 Measurement of hairiness values (H) with the image system.
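The agreement between the unfused and fused readings is quantified with Pearson's correlation coefficient, which can be computed directly from its standard definition (shown here for completeness, not as the authors' code):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()       # center both samples
    return (xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum())
```

Values near 1 (as in Tables 2–4) indicate that fusion shifts the measurements systematically rather than scattering them.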
To verify whether the mean values of the yarn data from the two image processing methods (fusion or no fusion) were significantly different, t-tests were performed, as reported in Tables 5–7. Since the t-test requires the data to follow normal distributions and to have homogeneous variances, we chose the Shapiro–Wilk test25 and the F-test26 to verify the normality and the homogeneity of variances of the data. In the Shapiro–Wilk test, all the test statistic values of the three measurements (D, A, and H) of the six yarns were larger than the critical value at the significance level α = 0.05. Therefore, these data satisfied the normality requirement. In the F-test, the degrees of freedom were 9 (n − 1, with n = 10 samples). The test statistic values of the three measurements (D, A, and H) of the six yarns with the two image processing methods were below the critical value at the significance level α = 0.05. Thus, there was no significant difference in variances at the 95% confidence level; that is, the sample data conformed to the homogeneity of variances. Since the data of the two image processing methods (with and without image fusion) were collected from the same sets of images, paired t-tests27 were suitable. In this case, the degrees of freedom were 9, and the critical value was 2.262 for the two-tailed test (α = 0.05). From Tables 5–7, the t-statistic values of the six yarns were all above 2.262. Therefore, the null hypothesis was rejected; i.e., the mean values of the yarn core diameters, hairiness area ratios, and hairiness values from the fused images and the unfused images were statistically different. As demonstrated in Fig. 5, the multifocus image fusion technique enhanced the quality of yarn images, permitting more complete detection of hairy fibers, which led to significant increases in the yarn hairiness measurements.

Table 5 Statistic analysis of yarn core diameters (D).
Table 6 Statistic analysis of hairiness area ratios (A).
Table 7 Statistic analysis of hairiness values (H).
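The paired t-test used for these comparisons reduces to a one-sample t-test on the per-segment differences, with df = n − 1 = 9 for the 10 paired segments. A plain-Python sketch of the statistic:

```python
import math

def paired_t(x, y):
    """Paired t-statistic for two equal-length samples (df = n - 1)."""
    d = [a - b for a, b in zip(x, y)]           # per-pair differences
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)   # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1
```

The null hypothesis (equal means) is rejected when |t| exceeds the two-tailed critical value, 2.262 at df = 9 and α = 0.05.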
The hairiness values of the six yarns were also measured using an Uster tester (Table 8). Each yarn was tested 10 times, with a length of 1000 m per test and at a speed of . Compared with the image fusion system (Table 4), the Uster tester output much lower H values, which suggested that the Uster tester could underestimate yarn hairiness because of the drawback pointed out in the introduction. Of course, this discrepancy also arose from the differences in the test conditions (sampling rate, speed, air drag, etc.) of the two systems. The image fusion system measured a shorter length, operated at a lower speed, and kept hairy fibers in more natural positions than the Uster tester did. This also explains why the correlation between the two sets of data is fairly low. Through the Shapiro–Wilk test,25 the Uster tester data also satisfied the normality requirement; however, the F-test verified that there was a significant difference between the variances of the yarn samples measured by the image fusion system and by the Uster tester. In addition, since the data for the same kind of yarn from the two methods came from different yarn sections and different testing conditions, unpaired, unequal-variance t-tests were performed to check whether the mean values of the two sets were significantly different. In this case, the degrees of freedom (df) for each yarn test were computed with the Welch–Satterthwaite formula:28

df = (s1^2/n1 + s2^2/n2)^2 / [ (s1^2/n1)^2/(n1 − 1) + (s2^2/n2)^2/(n2 − 1) ],

where s1^2 and s2^2 were the variances of each yarn sample measured with the image fusion system and the Uster tester, and n1 and n2 were the numbers of measured yarn samples (both n1 and n2 were 10 in this experiment). It was found that the df of yarn 5 was 10 (the SD of yarn 5 was lower than those of the rest, so the calculated value of 9.84 was rounded to 10) and the dfs of the remaining yarns were 9. At the significance level α = 0.05, the critical values are 2.228 at df = 10 and 2.262 at df = 9. The t-statistic values in Table 8 were all above their critical values. Thus, the null hypothesis was rejected.
These two sets of data significantly differed in their mean values.

Table 8 Hairiness values from the Uster tester and comparisons with the image system.
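The unequal-variance t-statistic and the Welch–Satterthwaite degrees of freedom can be sketched as follows (a standard textbook formulation; variable names are illustrative):

```python
import math

def welch_t(x, y):
    """Unequal-variance (Welch) t-statistic and Welch-Satterthwaite df."""
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    v1 = sum((v - m1) ** 2 for v in x) / (n1 - 1)   # sample variances
    v2 = sum((v - m2) ** 2 for v in y) / (n2 - 1)
    t = (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df
```

When the two variances happen to be equal and n1 = n2 = n, the df reduces to 2(n − 1); unequal variances pull it down toward min(n1, n2) − 1, which is why the dfs here fall between 9 and 10.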
4. Conclusions

This paper presented a new method for measuring yarn hairiness with image processing based on the multifocus image fusion technique. Because hairy fibers on the yarn core can easily extend outside the DOF of the imaging system, they cannot be fully focused in one image. Therefore, hairiness measurements are often underestimated due to fibers missing from the image. We used a microscope focusable at various depths to capture a series of partially focused images at each examining position, and a pixel-based image fusion algorithm to merge the sharp pixels selected from the image series into a well-focused image. In the fused image, hairy fibers protruding in different directions and with different lengths all become sharp and complete, which warrants more accurate segmentation of the fibers from the background. We chose six yarns of different fiber contents and spinning methods and measured their hairiness with three methods for comparison: (1) the image processing system without image fusion, (2) the image processing system with image fusion, and (3) the Uster tester. Through the paired t-tests, it was found that the hairiness area ratios (A) and the hairiness values (H) with image fusion were significantly higher than those without image fusion. Therefore, we demonstrated that the application of the image fusion technique could yield more yarn hairiness information. The yarn hairiness values from the Uster tester were lower than the values from the image processing system, and the two sets of values had low correlation coefficients and significantly different means according to the t-test results. This preliminary test suggests that the Uster tester could underestimate yarn hairiness because of the drawback of the photoelectric approach used. The presented image fusion technique appears to be an effective way to correct the measurement of yarn hairiness in current image analysis or other photoelectric systems.
Acknowledgments

This research was supported by the Natural Science Foundation of China (Grant No. 61172119), the Fundamental Research Funds for the Central Universities, China, the Foundation for the Author of National Excellent Doctoral Dissertation of PR China (Grant No. 201168), and the Program for New Century Excellent Talents in University (Grant No. NCET-12-0825).

References

1. Z. Qin and Y. Chen, "The overview of yarn hairiness testing methods," Wool Technol. J., 4, 48–51 (1999).
2. R. S. Chauhan, "Yarn hairiness: measurement, effect and consequences," Indian Text. J., 119(5), 36–43 (2009).
3. A. Barella, Yarn Hairiness, Textile Institute, Manchester, England (1983).
4. K. P. R. Pillay, "A study of yarn hairiness in cotton yarns: part I, effect of fiber and yarn factors," Text. Res. J., 34, 663 (1964). http://dx.doi.org/10.1177/004051756403400802
5. H. R. Boswell and P. P. Townend, "Some factors affecting the hairiness of worsted yarns," Text. Inst. J., 48, T135 (1957).
6. F. Le and D. Zhang, "The performance of the two kinds of hairiness tester," Cotton Text. Technol. J., 4(38), 268 (2010).
7. H. Guo and J. Wang, "Analyzing several hairiness testing methods," Heibei Text. J., 3, 75–80 (2007).
8. L. Xie and A. Jiao, "Properties and the application of the yarn hairiness tester YG17IL," in Proc. National Modern Spinning Technology Seminar, 69–77 (2010).
9. Uster Statistics Application Handbook, Uster Technologies Ltd., Switzerland (2013).
10. N. Brunk, "Methods of testing yarn hairiness," Africa Mid. East Text., 1, 20 (2005).
11. M. Kuzanski, "Measurement methods for yarn hairiness analysis—the idea and construction of research standing," in Proc. 2nd Int. Conf. on Perspective Technologies and Methods in MEMS Design (MEMSTECH 2006), 87–90 (2006).
12. M. Cybulska, "Assessing yarn structure with image analysis methods," Text. Res. J., 69(5), 369–373 (1999). http://dx.doi.org/10.1177/004051759906900511
13. M. Kuzanski and L. Jackowska-Strumiłło, "Yarn hairiness determination: the algorithms of computer measurement methods," in Proc. IEEE Int. Conf. on Perspective Technologies and Methods in MEMS Design, 154–157 (2007).
14. V. Carvalho et al., "Yarn hairiness determination using image processing techniques," in Proc. IEEE 16th Int. Conf. on Emerging Technologies and Factory Automation (ETFA) (2011).
15. J. Zhang, "Application study on the yarn hairiness detection based on image processing technology," Hebei Univ. of Science and Technology (2011).
16. A. Guha et al., "Measurement of yarn hairiness by digital image processing," J. Text. Inst., 101(3), 214–222 (2010). http://dx.doi.org/10.1080/00405000802346412
17. A. Fabijanska and L. Jackowska-Strumiłło, "Image processing and analysis algorithms for yarn hairiness determination," Mach. Vis. Appl., 23, 527–540 (2012). http://dx.doi.org/10.1007/s00138-012-0411-y
18. Z. Wang et al., "A comparative analysis of image fusion methods," IEEE Trans. Geosci. Remote Sens., 43(6), 1391–1402 (2005). http://dx.doi.org/10.1109/TGRS.2005.846874
19. I. De and B. Chanda, "A simple and efficient algorithm for multifocus image fusion using morphological wavelets," Signal Process., 86, 924–936 (2006). http://dx.doi.org/10.1016/j.sigpro.2005.06.015
20. R. Wang et al., "Multi-focus image fusion for enhancing fiber microscopic images," Text. Res. J., 82(4), 352–361 (2012). http://dx.doi.org/10.1177/0040517511407377
21. R. Wang, B. Xu and C. Li, "Accurate fiber orientation measurements in nonwovens using multi-focus image fusion technique," Text. Res. J., 84(2), 115–124 (2014). http://dx.doi.org/10.1177/0040517513490056
22. F. Richard, Textile Measuring Technology and Quality Control, 1st ed., Donghua University Press, Shanghai, China (2006).
23. R. C. Gonzalez and R. E. Woods, Digital Image Processing, 3rd ed., Publishing House of Electronics Industry, Beijing (2012).
24. P. Ahlgren, B. Jarneving and R. Rousseau, "Requirements for a cocitation similarity measure, with special reference to Pearson's correlation coefficient," J. Am. Soc. Inform. Sci. Technol., 54(6), 471–592 (2003). http://dx.doi.org/10.1002/(ISSN)1532-2890
25. P. Royston, "Approximating the Shapiro-Wilk W-test for non-normality," Stat. Comput., 2, 117–119 (1992). http://dx.doi.org/10.1007/BF01891203
26. T.-S. Lim and W.-Y. Loh, "A comparison of tests of equality of variances," Comput. Stat. Data Anal., 22(3), 287–301 (1996). http://dx.doi.org/10.1016/0167-9473(95)00054-2
27. S. Daya, "Understanding statistics: paired t-test," Evidence-Based Obstet. Gynecol., 5(13), 105–106 (2003).
28. G. D. Ruxton, "The unequal variance t-test is an underused alternative to Student's t-test and the Mann–Whitney U test," Behav. Ecol., 17(4), 688–690 (2006). http://dx.doi.org/10.1093/beheco/ark016
Biography

Rongwu Wang received his PhD degree from Donghua University in Shanghai, China, in 2008 and joined the faculty of the College of Textiles, Donghua University, after 2 years of postdoctoral research at the same university. Currently, he is an associate professor at the College of Textiles and conducts research on new applications of image processing and microimage recognition systems.