Why Sample Size Matters and NanoMet Is Your Solution
If you’re like most people, your nanomaterials metrology is based on high-resolution images of sparse image fields that allow particles to be individually counted and measured by a human operator. In this 300 kX image of Au nanoparticles, there are 15 particles that can be readily inspected for shape characteristics and diameter. In five or ten minutes, one can acquire the image in the SEM and use ImageJ to measure the particle diameters. The mean diameter is 47.5 nm with a standard deviation of 3.8 nm, and visual inspection shows that 13% (2 of 15) of the particles have triangular habits.
That seems like a great use of time until we look at the effects of sample size. With N = 15, counting statistics based on the binomial distribution tell us that the 13% triangular-habit fraction is known only to within about 66% relative uncertainty. Measuring that frequency to within 10% would require a sample size of about 650 particles. Accomplishing that would require 44 comparable images, and allowing an optimistic 5 minutes of image acquisition and processing per image, it would demand 3 hours and 40 minutes of a human operator’s time. Sample size likewise limits how confidently we can assess the diameters at levels appropriate for commercial product development. For example, N = 15 is perfectly acceptable for a 68.27% (1σ) confidence level that the particle diameter is within 1 nm of the measured mean. On the other hand, a 98% confidence level that the particle diameter is within 0.5 nm of the measured mean would require N = 321, or about 21 comparable images, for a total human operator time of 1 hour 45 minutes.
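For readers who want to check the arithmetic, a minimal sketch of these counting-statistics estimates (using the normal approximation to the binomial and the standard error of the mean, with the values quoted above) might look like this:

```python
# Sketch of the sample-size arithmetic above, using the values quoted in the text.
import math

# --- Fraction of triangular particles ---
n, k = 15, 2                        # particles counted, triangular particles observed
p = k / n                           # observed fraction, ~0.13
se_p = math.sqrt(p * (1 - p) / n)   # binomial standard error on the fraction
print(f"relative uncertainty on p: {se_p / p:.0%}")            # ~66%

# Sample size needed for a 10% relative uncertainty on the fraction:
# sqrt((1 - p) / (p * N)) = r  ->  N = (1 - p) / (p * r^2)
r = 0.10
print(f"particles for 10% relative uncertainty: {(1 - p) / (p * r**2):.0f}")  # ~650

# --- Mean diameter ---
sigma = 3.8                         # measured standard deviation, nm
# 68.27% (1-sigma) confidence of being within 1 nm needs sigma / sqrt(N) <= 1 nm
print(f"standard error of mean at N = 15: {sigma / math.sqrt(15):.2f} nm")    # ~0.98 nm

# 98% confidence within E = 0.5 nm needs N >= (z * sigma / E)^2
z_98, E = 2.326, 0.5
print(f"N for 98% / 0.5 nm: {(z_98 * sigma / E)**2:.0f}")  # ~313 with z = 2.326, same order as N = 321 above
```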
As soon as a nanomaterial’s commercial application demands high-accuracy statistics on low-frequency nanofabrication defects, or alternatively, high-precision, high-confidence statistical hypothesis testing, human operator processing of nanoscale images becomes ineffective. Setting aside the time and cost involved, there is substantial human bias in such an approach. Dragging cursors across the diameters of objects does not meet any standard for dimensioning objects in SEM or TEM images, where precise and repeatable contrast thresholds are required.
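To make the contrast-threshold point concrete, here is a minimal, hypothetical sketch (not NanoMet’s algorithm) of dimensioning an object from an intensity line profile with an explicit, repeatable edge criterion; the profile values, pixel size, and threshold are invented for illustration:

```python
# Width of a bright object defined by a fixed intensity threshold, so the edge
# criterion is explicit and repeatable rather than judged by eye.
import numpy as np

def diameter_from_profile(profile, pixel_size_nm, threshold):
    """Width (nm) of the region where the line profile meets or exceeds the threshold."""
    above = np.flatnonzero(np.asarray(profile) >= threshold)
    if above.size == 0:
        return 0.0
    return float(above[-1] - above[0] + 1) * pixel_size_nm

# Hypothetical profile across one bright particle on a dark background
profile = [10, 12, 15, 80, 140, 150, 148, 135, 70, 14, 11]
print(diameter_from_profile(profile, pixel_size_nm=2.0, threshold=75))  # 10.0 nm
```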
In this NanoMet analysis of a fiber material, 20,793 measurements were made in 204 seconds of CPU time, a rate of roughly 10 ms per measurement on a conventional Core i7 laptop. The image is automatically segmented, processed, post-analyzed, and reported by NanoMet. The only operator input is setting nominal ranges for fiber diameters and lengths; in the case of particle statistics this would be a range of diameters. Global statistical outliers are automatically identified and located on the SEM or TEM image. Unlike the DiameterJ plugin for ImageJ, the statistics generated are fiber-specific for 1,767 individual fibers, a rate of roughly 115 ms per fiber. This statistical correlation to individual objects allows a much deeper investigation into how individual objects affect population statistics. That understanding accelerates the characterization of new nanomaterials and the correlation of their performance with the commercial application, resulting in faster time to market at lower cost.
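NanoMet’s internals are proprietary, so the following is only a minimal sketch of what this kind of automated per-object pipeline looks like in general (here using scikit-image); the function name, parameters, and 3σ outlier rule are illustrative assumptions, not NanoMet’s API:

```python
# Illustrative per-object pipeline: segment with a repeatable threshold, label,
# measure every object, filter by the operator's nominal range, flag outliers.
import numpy as np
from skimage import filters, measure, morphology

def measure_particles(image, pixel_size_nm, d_min_nm, d_max_nm):
    """Segment a grayscale SEM/TEM image and return per-object equivalent diameters (nm)."""
    mask = image > filters.threshold_otsu(image)           # automatic, repeatable contrast threshold
    mask = morphology.remove_small_objects(mask, min_size=20)
    labels = measure.label(mask)
    diam = np.array([r.equivalent_diameter * pixel_size_nm
                     for r in measure.regionprops(labels)])
    # Keep only objects inside the operator-supplied nominal diameter range
    diam = diam[(diam >= d_min_nm) & (diam <= d_max_nm)]
    # Flag global statistical outliers, here anything beyond 3 sigma of the population mean
    mu, sd = diam.mean(), diam.std()
    outliers = diam[np.abs(diam - mu) > 3 * sd]
    return diam, outliers
```

A per-object table like the one this sketch returns is what allows a population statistic to be traced back to the specific objects, and image locations, that drive it.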