Why Do I Need So Many Measurements?
Most people involved in nanomaterials metrology can’t appreciate what large sample sizes give them because they’ve never had the luxury. Time is a key limitation, not only in the cost of producing measurements: quality assurance decisions must be made on a short time scale that doesn’t allow for minutes or hours of data analysis.
This first image shows an SEM image of SiO2 nanoparticles analyzed by NanoMet. In this 100KX image, taken in field immersion with a through-lens detector, NanoMet dimensions 54 spheres (all of the complete spheres in the field of view) in 2.8 seconds, a net analysis rate of 52 ms/measurement. Statistics are automatically reported: a sample mean of 325.62 nm and a standard deviation of 25.00 nm.
In this second image, taken at 50KX under the same conditions, NanoMet dimensions 224 spheres in 15.8 seconds, a rate of 70 ms/measurement. Statistics are again automatically generated and reported, in this case a sample mean of 327.60 nm and a standard deviation of 31.59 nm. In both images the diameters are shown with colored circles; although not shown here, the measurements are also numbered, allowing the user to identify and remove outliers from the automatically generated report, which includes diameter histograms.
While the standard deviation increased slightly with the additional measurements, the error range for the estimate of the mean particle diameter actually decreases according to the formula

μ = x̄ ± z·s/√N

where x̄ is the estimated mean, s the standard deviation, z the confidence coefficient, N the number of measurements, and μ the actual mean. For a 95% confidence level, z equals 1.96, so the 100KX image gives an average diameter of 325.62 ± 6.67 nm, while the more densely populated 50KX image gives an average diameter of 327.60 ± 4.14 nm. That’s a reduction of uncertainty in particle diameter of almost 40%. Manually obtaining metrics from those additional 170 particles would take far more time, and the results would carry significant human bias along with the risk of operator error and fatigue. Here the cost of decreasing the error in particle diameter by almost 40% was just 13 seconds of CPU time.
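The arithmetic above is easy to reproduce. This short sketch (not part of NanoMet; the function name `mean_ci` is illustrative) computes the half-width of the confidence interval, z·s/√N, from the reported sample statistics:

```python
import math

def mean_ci(std, n, z=1.96):
    """Half-width of the confidence interval of the mean: z * s / sqrt(N).
    z = 1.96 corresponds to a 95% confidence level."""
    return z * std / math.sqrt(n)

# Sample statistics reported above
hi_mag = mean_ci(25.00, 54)     # 100KX image, 54 spheres
lo_mag = mean_ci(31.59, 224)    # 50KX image, 224 spheres

print(f"100KX: 325.62 ± {hi_mag:.2f} nm")   # ± 6.67 nm
print(f" 50KX: 327.60 ± {lo_mag:.2f} nm")   # ± 4.14 nm
print(f"uncertainty reduction: {100 * (1 - lo_mag / hi_mag):.0f}%")  # 38%
```

Quadrupling the particle count roughly halves the uncertainty, since the interval shrinks as 1/√N; the modest rise in standard deviation at 50KX is why the reduction here is 38% rather than a full 50%.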