Correlation intercept – what is it and what does it mean?

Comparing data

I saw a recent discussion about exporting data from the Zetasizer for further analysis. Both the Zetasizer Nano and Zetasizer Pro/Ultra software packages include a range of analysis tools, so I wondered: what was missing?

When we compare data, it is often tempting to simplify the data to make this process easier. This is what triggered the exporting data question. The user wanted to normalize the autocorrelation function (ACF) data from their Dynamic Light Scattering (DLS) measurements.

We know from theory that the ACF should start close to 1 and end at zero. So scaling the data to 1 seems sensible, right? Some DLS instruments do this automatically. By doing so, however, we might be missing an important piece of our data quality jigsaw.
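To see what normalization throws away, here is a minimal sketch (with illustrative numbers, not real instrument data) of a single-exponential correlation function following the Siegert relation. Scaling the curve so it starts at 1 makes curves easy to compare, but it erases the measured intercept:

```python
import numpy as np

# Synthetic single-exponential correlation data (Siegert relation):
# g2(tau) - 1 = beta * exp(-2 * Gamma * tau), with intercept beta < 1.
tau = np.logspace(-7, -1, 200)   # lag times in seconds (illustrative)
gamma = 5e3                       # decay rate in 1/s (illustrative)
beta = 0.65                       # real-world intercept, below the ideal 1

g2_minus_1 = beta * np.exp(-2 * gamma * tau)

# Normalizing to 1 makes curves comparable, but discards the intercept:
normalized = g2_minus_1 / g2_minus_1[0]

print(round(g2_minus_1[0], 2))   # 0.65 -> the intercept we measured
print(round(normalized[0], 2))   # 1.0  -> intercept information is gone
```

Both curves decay identically, but only the raw one tells us the intercept was 0.65 rather than 1.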

What does the intercept show us?

For example, below are two sets of normalized correlation data for samples of polystyrene latex. Both show “good” correlation functions, but we can see a difference in the decay, and the reported hydrodynamic size values differ. They must be different sizes, right?

Normalized correlation functions for two samples of polystyrene latex. The red data shows a slightly faster decay.

In reality, we have missed something by re-scaling the data: the correlation intercept. This is the value of the plateau of the autocorrelation function at short lag times. In theory, the correlation intercept is 1, but in practice a number of noise effects couple into this value. Some of this noise comes from the instrument (the laser and detection optics), but some can come from the sample itself.
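Since the intercept is the short-lag plateau, a simple way to read it off raw data is to average the first few correlator channels, before the decay begins. The helper below is a hypothetical sketch on synthetic data, not the instrument's own algorithm:

```python
import numpy as np

# Hypothetical helper: estimate the intercept as the average of the
# plateau at the shortest lag times (here, the first few channels).
def estimate_intercept(g2_minus_1, n_plateau=5):
    return float(np.mean(g2_minus_1[:n_plateau]))

# Illustrative raw data: a plateau near 0.85 before the decay begins.
tau = np.logspace(-7, -2, 100)            # lag times in seconds
g2_minus_1 = 0.85 * np.exp(-2 * 1e3 * tau)

print(round(estimate_intercept(g2_minus_1), 2))  # 0.85
```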

If we take our two latex samples and look at the raw data, we can see a striking difference between the two data sets. One has a much lower intercept than the other.

Raw correlation functions, for the same two samples of polystyrene latex. The red data has a lower intercept. What is causing this?

The low intercept of the red data was caused by “multiple scattering”. This is where light is scattered by several particles before it is detected. DLS analysis theory assumes the detected light has been scattered only once, so here we have an inaccurate measurement. To remedy this, the sample concentration could be reduced. If using a Zetasizer capable of backscatter measurements, the measurement position can also be adjusted: measuring close to the cell wall reduces multiple scattering. We call this technique Non-Invasive Back Scatter.

What affects the intercept value?

Other examples of optical noise from the sample include:

  • Flare: scattering from the cuvette itself, typically because it is dirty or scratched.
  • Fluorescence: the sample emits its own light, which is detected in addition to the scattered light.
  • Number fluctuations: the number of particles within the laser beam changes significantly over time, so the average scattering intensity is not stable.

The correlation intercept helps us identify these effects. The extra light from flare, fluorescence or multiple scattering all result in a lower intercept. Number fluctuations lead to a higher intercept, which may be much greater than 1.
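Why does extra light lower the intercept? Uncorrelated light (flare or fluorescence) adds to the detected intensity but carries no correlated signal, so it dilutes the intercept. A common simplified model, sketched below with illustrative numbers and a hypothetical function name, suppresses the intercept by the squared fraction of correlated signal:

```python
# Simplified model: uncorrelated extra light adds to the detected
# intensity but not to the correlated signal, so
#   beta_measured ~= beta_ideal * (I_signal / (I_signal + I_extra))**2

def suppressed_intercept(beta_ideal, i_signal, i_extra):
    """Intercept after dilution by uncorrelated background light."""
    fraction = i_signal / (i_signal + i_extra)
    return beta_ideal * fraction**2

print(round(suppressed_intercept(0.9, 100.0, 0.0), 2))   # 0.9  -> clean sample
print(round(suppressed_intercept(0.9, 100.0, 25.0), 2))  # 0.58 -> flare at 20% of the total lowers it
```

Even a modest amount of stray light makes a visible dent in the intercept, which is exactly why it is such a useful quality indicator.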

Examples of correlation functions for measurements of different quality.

Still not sure if you have a good quality measurement?

Interpreting DLS data is tricky, and there is more to look at than just the intercept. We can also interpret the correlation baseline and decay, as well as the count rate and other measurement settings. Our software has features to give advice about data quality. In the Zetasizer Nano software, look for the Expert Advice report. If you are using the new Zetasizer Pro or Ultra, look at the Data Quality Guidance window.

The Data Quality Guidance feature uses artificial intelligence to identify measurement problems. It looks at the correlation function and, based on training from tens of thousands of other examples, classifies the data.

The algorithm labels each record according to how it can be used. Advice is also given on how to solve any measurement issues; if more than one issue is detected, advice for the most significant is given. This enables a robust, guided workflow for improving sample and measurement conditions.

For more information

To find out more about some of the artifacts that can affect a DLS measurement, check out this webinar: When is a particle not a particle?

Further reading