NanoMet: More Than Just Counting
The most obvious application of NanoMet is an increased measurement sample size. Thousands, even tens of thousands, of measurements are possible from a single image. A larger sample size raises confidence in the average particle size, as well as in the maximum and minimum particle sizes. This makes NanoMet a useful tool for gaining statistical confidence quickly, without the time investment it would otherwise require.
But more can be accomplished with NanoMet and the massive amount of data it provides if one considers the nature of the statistical distributions themselves, not just the summary metrics NanoMet reports, such as the mean diameter and its standard deviation.
In this study, tin spheres on HOPG were imaged using SEM. This is a classic SEM test standard, as test objects can be found from tens of nanometers to well over tens of millimeters. The size distribution of these Sn spheres can be controlled by tuning the metal deposition rate and the HOPG substrate temperature. At any magnification there are objects available to test image resolution and gauge aberrations such as astigmatism. Both the NanoMet particle module and area module were used to study the size of these nucleated particles. In the case of the NanoMet area module, a presumed circular projected area was used to convert particle areas to diameters.
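The area-to-diameter conversion is simple geometry: if the projected area is assumed circular, A = pi(d/2)^2, so d = 2*sqrt(A/pi). A minimal sketch (the function name is illustrative, not part of the NanoMet API):

```python
import math

def area_to_diameter(area_nm2):
    """Convert a projected particle area (nm^2) to an equivalent
    circular diameter (nm), assuming the projection is a circle:
    A = pi * (d/2)^2  =>  d = 2 * sqrt(A / pi)."""
    return 2.0 * math.sqrt(area_nm2 / math.pi)

# A circle of diameter 100 nm has projected area pi * 50^2 nm^2,
# so the conversion should recover 100 nm exactly.
print(area_to_diameter(math.pi * 50.0**2))
```
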
In Image 1, taken at 25,000X, 924 particle diameters ranging from 77 nm to 1,173 nm were measured in the region between two large Sn clusters. In Image 2, taken at 2,500X, 2,239 particle areas were measured, corresponding to diameters from 94 nm to 18,632 nm. These images were taken at 20 kV using the backscattered electron detector (BSED), and the larger particles all appear as statistical outliers. Since the nucleation of the Sn particles provides spheres at every magnification, the average particle diameter scales with the image magnification, or horizontal field of view (HFOV), as different magnifications highlight particles of different average size.
Image 3 was taken at 20,000X, comparable to the first image, but in a region of the Sn-covered HOPG substrate that contained primarily smaller tin clusters. Here 1,424 diameters ranged from just 11 nm to 1,686 nm, so the smaller tin clusters are sampled.
A naive approach is to use NanoMet simply to compute the statistical moments of the particle distribution, the most important being the sample mean and standard deviation. For Image 1, measured with the NanoMet diameter module, the mean particle diameter is 175.3 nm with a statistical uncertainty of 4.5 nm based on the variance and sample size. Using the NanoMet area module, Image 2 has a mean particle diameter of 1,992.5 nm with a statistical uncertainty of 126.1 nm. Image 3, which focuses on the smaller nucleated tin particles and was also measured with the area module, yields a mean particle diameter of 213.2 nm with a statistical uncertainty of 8.2 nm.
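The quoted uncertainties follow from the standard error of the mean, SE = s/sqrt(n), where s is the sample standard deviation and n the number of measurements. A minimal sketch using synthetic data (the diameters below are randomly generated stand-ins, not the actual measurements from Image 1):

```python
import math
import random

random.seed(0)

# Synthetic stand-in for a NanoMet diameter export: one diameter per
# detected particle. Values are illustrative only, not SEM data.
diameters_nm = [random.lognormvariate(math.log(175.0), 0.5) for _ in range(924)]

n = len(diameters_nm)
mean = sum(diameters_nm) / n
# Sample standard deviation (Bessel's correction), then the standard
# error of the mean -- the "statistical uncertainty" quoted in the text.
s = math.sqrt(sum((d - mean) ** 2 for d in diameters_nm) / (n - 1))
se = s / math.sqrt(n)
print(f"mean = {mean:.1f} nm +/- {se:.1f} nm (n = {n})")
```

Note how the uncertainty shrinks with sqrt(n): quadrupling the number of measured particles halves the error bar, which is why sample sizes in the thousands matter.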
Taken in isolation, these statistical moments indicate that the three images reflect statistically different samples of the nucleated tin spheres. Given that different regions of the sample contain Sn spheres at different size scales, this is not very interesting. What is more interesting is exploring the size distributions of these objects. Both NanoMet and FullScaleNANO's Histogram On Demand (HOD) service provide size distributions of the objects measured, and these distributions can themselves be used to critically study the materials being imaged.
Particle sizes are often described by statistical distributions whose specific analytical forms are informed by the physical processes that generate the particles. In practice, log-normal distributions are often used to describe particles in aerosols, while Weibull distributions are useful for particles generated by processes such as grinding or ball-milling. Other distributions, such as log-hyperbolic distributions, are commonly used in modelling particles in geological sediments. Using the particle size histograms produced by FullScaleNANO's NanoMet application or Histogram On Demand (HOD) service, particle size distributions can be fit to theoretical models, or subjected to nonparametric statistical tests such as the Kolmogorov-Smirnov test, which can determine whether two distributions are the same within a given confidence level.
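As an illustration of the kind of test such histograms enable, here is a minimal pure-Python two-sample Kolmogorov-Smirnov statistic, applied to synthetic log-normal samples (in practice a library routine such as `scipy.stats.ks_2samp` would be used, and the data here are illustrative, not NanoMet output):

```python
import math
import random

def ks_two_sample(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    na, nb = len(a), len(b)
    i = j = 0
    d = 0.0
    while i < na and j < nb:
        x = min(a[i], b[j])
        # Advance past all values equal to x in each sample (handles ties),
        # then compare the two empirical CDFs at that point.
        while i < na and a[i] == x:
            i += 1
        while j < nb and b[j] == x:
            j += 1
        d = max(d, abs(i / na - j / nb))
    return d

random.seed(1)
# Two synthetic log-normal "particle size" samples (illustrative only).
ref   = [random.lognormvariate(5.0, 0.4) for _ in range(1000)]
same  = [random.lognormvariate(5.0, 0.4) for _ in range(1000)]
other = [random.lognormvariate(5.5, 0.4) for _ in range(1000)]

print(ks_two_sample(ref, same))   # small: same underlying distribution
print(ks_two_sample(ref, other))  # much larger: distributions differ
```

Comparing the statistic against a critical value for the chosen confidence level then decides whether two imaged regions can be treated as the same population.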
Generating particle size histograms for statistical analysis more subtle than computing moments requires hundreds, if not thousands, of measurements. NanoMet can provide that in seconds, allowing critical analysis of statistical distributions to become routine. The particle size distributions for the three images were plotted on a log-log graph to explore the presence of power-law, or scaling, behavior. Such behavior is common in stochastic processes, and power-law models often describe the "tail" region of a statistical distribution, in this case the larger particles. Linear behavior on a log-log plot indicates a power law, with the slope yielding the scaling exponent.
The log-log plot of the particle size distributions shows clear power-law behavior. Images 2 and 3 sampled very small particles and yield similar scaling exponents of 1.9 to 2.3 in the small-particle range. Even though these images differ in magnification by a factor of 8, they produce measurements that reflect the same statistical behavior. Image 1 did not produce sufficient measurements of the smallest particles because it was processed with the particle diameter module, which fits object shapes to circles and requires a minimum number of pixels per object. Power-law behavior was also present at the larger diameters, in the "tails" of the particle size distributions. Images 1 and 3, taken at roughly the same magnification (25,000X and 20,000X respectively), have comparable scaling exponents of -1.7 and -1.8. Image 2, taken at a much lower magnification of 2,500X and sampling a different type of region of the Sn clustering, has a scaling exponent of -1.0.
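Scaling exponents like those above can be estimated by binning the measurements into a histogram and fitting a straight line to log(density) versus log(size). A sketch of that procedure, applied to a synthetic sample drawn from a d^-2 power law (illustrative data, not the SEM measurements; the function name is hypothetical):

```python
import math
import random

def scaling_exponent(sizes, n_bins=20):
    """Estimate a power-law exponent: bin sizes into a log-spaced
    histogram, then fit log(density) vs log(bin center) by ordinary
    least squares. The slope is the scaling exponent."""
    lo, hi = min(sizes), max(sizes)
    # Logarithmically spaced bin edges suit data spanning decades.
    edges = [lo * (hi / lo) ** (k / n_bins) for k in range(n_bins + 1)]
    counts = [0] * n_bins
    for s in sizes:
        for k in range(n_bins):
            if edges[k] <= s <= edges[k + 1]:
                counts[k] += 1
                break
    xs, ys = [], []
    for k in range(n_bins):
        if counts[k] > 0:
            center = math.sqrt(edges[k] * edges[k + 1])
            xs.append(math.log(center))
            # Normalize by bin width so log bins don't bias the slope.
            ys.append(math.log(counts[k] / (edges[k + 1] - edges[k])))
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(2)
# Synthetic sample from p(d) ~ d^-2 on [10, 10000] nm via
# inverse-transform sampling (illustrative only).
a, b = 10.0, 10000.0
sample = [a * b / (b - random.random() * (b - a)) for _ in range(5000)]
print(scaling_exponent(sample))  # slope should be close to -2
```

The same fit applied per image, restricted to the tail region, recovers the exponents compared in the text.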
What does this mean? Answering that requires a study of the physical processes that drive Sn mobility and nucleation on HOPG. What is certain is that answering such questions is beyond particle size metrics like the average diameter and standard deviation. Particle size distributions built from hundreds, if not thousands, of measurements are required, and NanoMet and the Histogram On Demand service provide them in the same time it takes to get basic statistical moments and quartiles. In seconds.
Use NanoMet and Histogram on Demand to make the full power of statistics work for you.