Ask an Expert: How well can you present your DLS results?


Dynamic Light Scattering – recap

Dynamic light scattering (DLS) measures the Brownian motion of particles in solution and, from it, their size distribution. In a DLS instrument, a digital signal processor compares the intensity of the light scattered by the particles at multiple time points to generate the sample's autocorrelation function. This correlation function, or correlogram, contains all the information about the motion, or diffusion, of the particles in the solution.
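To make the idea of a correlogram concrete, here is a minimal sketch of how an intensity autocorrelation function could be computed from a photon-count time series. This is only an illustration of the principle: a real correlator does this in hardware, typically with logarithmically spaced lag times, and the function name and synthetic Poisson trace below are assumptions for the example.

```python
import numpy as np

def intensity_autocorrelation(intensity, max_lag):
    """Normalized intensity autocorrelation g2(tau) of a count trace.

    g2(lag) = <I(t) * I(t + lag)> / <I>^2, estimated at linear lags.
    """
    mean_sq = np.mean(intensity) ** 2
    g2 = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        # Average the product of the signal with a lagged copy of itself.
        g2[lag - 1] = np.mean(intensity[:-lag] * intensity[lag:]) / mean_sq
    return g2

# Synthetic, uncorrelated shot-noise trace: g2 stays near 1 at all lags.
# For a real diffusing sample, g2 would decay from ~2 toward the baseline of 1.
rng = np.random.default_rng(0)
trace = rng.poisson(100, size=100_000).astype(float)
g2 = intensity_autocorrelation(trace, max_lag=50)
```

The decay of g2 toward its baseline is what the fitting algorithms discussed below operate on.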

The hydrodynamic diameter of a collection of particles is calculated by applying an exponential fitting algorithm to the measured correlogram. A common question from dynamic light scattering users is: which algorithm is best?
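The link between the fitted exponential decay and the hydrodynamic diameter is the Stokes-Einstein relation: the decay rate Gamma of the field correlation function equals D·q², where D is the diffusion coefficient and q the scattering vector, and d_H = kB·T / (3·pi·eta·D). The sketch below shows that arithmetic; the instrument parameters (633 nm laser, 173° backscatter detection, water at 25 °C) are typical assumed values, not a statement about any particular instrument.

```python
import numpy as np

# Assumed, typical instrument and dispersant parameters:
wavelength = 633e-9      # laser wavelength in vacuum, m
n = 1.33                 # refractive index of water
theta = np.deg2rad(173)  # detection angle (backscatter)
T = 298.15               # temperature, K
eta = 0.8872e-3          # viscosity of water at 25 C, Pa*s
kB = 1.380649e-23        # Boltzmann constant, J/K

# Scattering vector magnitude, 1/m
q = (4 * np.pi * n / wavelength) * np.sin(theta / 2)

def hydrodynamic_diameter(gamma):
    """Convert a fitted decay rate Gamma (1/s) to d_H (m).

    Gamma = D * q^2, then Stokes-Einstein: d_H = kB*T / (3*pi*eta*D).
    """
    D = gamma / q**2
    return kB * T / (3 * np.pi * eta * D)

# A decay rate of ~3400 1/s corresponds to roughly 100 nm at these settings.
d = hydrodynamic_diameter(3418.0)
```

Faster decays (larger Gamma) mean faster diffusion and therefore smaller particles.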

In this webinar, we will briefly explain the different sizing algorithms, so that you know which size parameters to report and how to present your DLS results with confidence.

Cumulants Analysis

The Cumulants analysis assumes a single particle size family and applies a single exponential fit to the autocorrelation function.

The Z-average is the intensity-weighted mean hydrodynamic diameter of the collection of particles in the solution. The polydispersity index, or PI, describes the width of the assumed Gaussian distribution around that mean.

Distribution Analysis

The particle size distribution from DLS is derived by deconvolution of the sample's measured intensity autocorrelation function. Generally, DLS instruments use a non-negative least squares (NNLS) fitting algorithm, such as the General Purpose, Multiple Narrow Mode, and L-curve algorithms included in the Zetasizer software. This analysis provides a distribution of sizes and identifies the separate peaks of multimodal samples.
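The core of such a deconvolution can be sketched with SciPy's generic NNLS solver: build a kernel matrix whose columns are the decay curves that particle families of different sizes would produce, then find non-negative weights that reproduce the measured correlogram. The grid, the noise-free single-exponential "measurement", and the single-angle setup are all assumptions for illustration; production algorithms add regularization and weighting schemes not shown here.

```python
import numpy as np
from scipy.optimize import nnls

# Lag times and a hypothetical measured field correlation g1(tau):
tau = np.logspace(-6, -2, 60)          # s
gamma_true = 3000.0                    # s^-1, the single "true" decay rate
g1_meas = np.exp(-gamma_true * tau)

# Grid of candidate decay rates (equivalently, candidate sizes). Each
# column of A is the decay curve one particle family would produce.
gammas = np.logspace(2, 5, 40)         # s^-1
A = np.exp(-np.outer(tau, gammas))

# Non-negative least squares: weights >= 0 with A @ weights ~= g1_meas.
weights, residual = nnls(A, g1_meas)
peak = gammas[np.argmax(weights)]      # decay rate of the dominant peak
```

The non-negativity constraint is what keeps the recovered distribution physical (no negative amounts of material), and the recovered peak positions are what get converted to hydrodynamic diameters.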

Additionally, the software applies Mie theory to the intensity-weighted (primary) results to obtain volume- and number-weighted distributions.
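The effect of this re-weighting is easiest to see in the Rayleigh limit (particles much smaller than the wavelength), where scattered intensity scales as d^6 and volume as d^3, so dividing intensity weights by d^3 gives volume weights, and dividing again by d^3 gives number weights. The sketch below uses that simplification with an invented three-peak distribution; the actual software uses full Mie theory, which also requires the particles' optical properties.

```python
import numpy as np

# Hypothetical intensity-weighted distribution over a coarse size grid (nm):
d = np.array([50.0, 100.0, 300.0])
intensity_w = np.array([0.2, 0.3, 0.5])

# Rayleigh-limit conversion: intensity ~ d^6, volume ~ d^3, number ~ 1.
volume_w = intensity_w / d**3
volume_w /= volume_w.sum()     # renormalize to a distribution
number_w = volume_w / d**3
number_w /= number_w.sum()
```

Note how the weighting shifts: the 300 nm population dominates by intensity, while the 50 nm population dominates by number, which is why the same sample can look very different depending on which distribution you report.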

Which size value should I use?

In truth, neither algorithm makes the other redundant. The cumulants and distribution analyses are complementary and together provide a better understanding of your samples.

The cumulants analysis, being an intensity-weighted average, is extremely sensitive to changes in the sample: if a sample tends to aggregate, cumulants will flag this even for minute quantities of larger material. If, on the other hand, the goal is to determine the size of the aggregates, the distribution analysis provides the position and relative abundance of the peaks it identifies.

Adaptive correlation can also be a useful tool to detect oversized particles. This topic is outside the scope of this webinar, but for more information see Adaptive Correlation: Better Insight To The Presence Of Aggregates.

Other topics covered…

It is very easy to obtain a size result from a Zetasizer: add your sample to a cuvette, select your method, press start, and you will get a size value. But how do you demonstrate the quality of your DLS measurements? What other parameters can you highlight to show that your results are valid?

Tune in to this webinar to find out. And do not forget to submit your questions in advance to askanexpert@malvernpanalytical.com, or during the webinar's Q&A session.

Previous ‘Ask an Expert!’ posts: