Accurate particle size analysis is crucial in all scientific disciplines and industrial processes. Whether you work in pharmaceutical formulation or environmental monitoring, the quality of your data directly impacts how smoothly your work plan progresses. With poor background and/or sample data, there is an increased likelihood of recording an incorrect particle size result, and this can impact product or process quality, time and effort spent on troubleshooting your analysis and buildup of business and environmental costs with waste of dispersant and sample usage for repeat measurements.
Data Quality Guidance is a tool available for use on the Mastersizer 3000 to make it easy for anyone to achieve accurate and reliable data. This add-on feature immediately flags if you’re experiencing issues with your data. It also provides clear information on the potential cause and outlines the next steps you should take to resolve these issues. And all this feedback is provided live within the measurement window, so no time is wasted on analyzing the quality of your data. With this tool, you can spend less time troubleshooting issues and more time using your advanced knowledge working with results you are confident in.
To achieve meaningful results, a laser diffraction measurement requires meticulous attention to detail throughout both the background and sample measurements, as a careful evaluation of data quality is required at each step. Common challenges range from background contamination and air bubbles to unsuitable sample concentration.
Fortunately, it is often possible to detect potential issues early on during the measurement as we know that achieving good data requires two main components: a clean and stable background, showing a progressive decrease across the detector range, as shown in Figure 1, and a stable sample measurement.
To achieve a clean and stable background we must verify that:
1. No hump is observed in the data.
2. No intermittent peaks appear across the detectors.
3. No unexpected large fluctuations occur.

And to achieve a stable sample measurement, we must verify that:

1. The laser obscuration is suitable for the measured particle size.
2. Minimal negative data is recorded.
3. Dataset variability is minimal.
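The background criteria above can be sketched as automated checks on the raw detector data. The following Python sketch is illustrative only: the array layout, threshold values and function name are assumptions for the example, not the Mastersizer 3000's actual algorithm.

```python
import numpy as np

def check_background(intensities, max_fluctuation=0.2):
    """Hypothetical checks mirroring the three background criteria.

    `intensities` is a 2D array: background snapshots (rows) of light
    energy per detector (columns). Thresholds are illustrative.
    """
    mean = intensities.mean(axis=0)
    issues = []
    # 1. No hump: the signal should decrease progressively across detectors.
    if np.any(np.diff(mean) > 0):
        issues.append("hump: background does not decrease across detectors")
    # 2. No intermittent peaks: flag snapshots spiking far above a
    # detector's mean level.
    if np.any(intensities > 3 * mean + 1e-9):
        issues.append("intermittent peaks on one or more detectors")
    # 3. No large fluctuation: the relative spread between snapshots
    # should stay small on every detector.
    spread = intensities.std(axis=0) / (mean + 1e-9)
    if np.any(spread > max_fluctuation):
        issues.append("large fluctuation between background snapshots")
    return issues
```

A clean, stable background (progressively decreasing, with little variation between snapshots) returns an empty list; a contaminated one returns one or more flagged issues.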
However, even with the best knowledge and experience, data quality issues can easily go undetected. If this happens, the wrong particle size distribution might be obtained for a sample, which could impact batch release and sample specification, or cause inconsistent results due to reduced accuracy and/or precision.
In this application note we will explore how the Mastersizer 3000 Data Quality Guidance tool can ensure poor-quality data is detected and resolved promptly, simplifying or eliminating the challenges users face.
One very common issue that often goes undetected is large fluctuation in the background due to contamination from a previous sample or air bubbles. Samples prone to this include metal powders, paints and lactose. A user can easily miss such an issue, as detecting it requires constant vigilance: keeping a close eye on the background fluctuation and on the dispersant within the accessory unit during the measurement.
In this example, a background measurement was completed. Upon completion, the data quality tab turned yellow, as shown in Figure 2, indicating that a new message is available to view.
When the data quality tab is selected, a message indicating a poor background is shown. The tool uses a range of algorithms to analyze the light scattering data, so it can detect a comprehensive range of common background data quality issues. If an issue is detected, the user is presented with a list of potential causes, ordered from most to least likely, and given guidance on how to resolve the issue, as shown in Figure 3.
For each individual sample measurement, the tool performs data quality checks on the level of obscuration, alignment, negative data, data fit, optical model and fine powder mode. As with the background quality check, once an issue is detected, the user is informed of the cause and given guidance on how to resolve it.
In this case study, we measured a yellow ink sample to which too much sample had been added in the dispersion unit: the laser obscuration measured 11.42% and the mean Dv50 was 0.043 μm across 6 measurement records. For this sample, the Data Quality Guidance tool detected high obscuration, as shown in Figure 4. During measurement, the obscuration and particle size are checked to determine whether the sample concentration is appropriate, and a warning is displayed if too much or too little sample is added.
The Data Quality Guidance advice was followed, and the yellow ink sample was remeasured at a lower obscuration of 2.45%. This time, the tool detected no issues. To understand the impact of the change in obscuration on the particle size distribution results, we can compare the two datasets. Figure 5 shows that at the higher obscuration a smaller particle size distribution is measured: a skew is observed on the results graph (a), and lower mean Dv10, Dv50 and Dv90 values are measured at 11.42% obscuration than at 2.45% (b). With too much sample, multiple scattering causes exaggerated fines to be interpreted within the data, so a smaller particle size distribution is reported. The impact of multiple scattering is more significant for samples smaller than 10 μm.
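The idea behind the obscuration check can be illustrated with a short sketch. The working bands below are examples chosen to match this case study (fine particles tolerate less sample before multiple scattering sets in); they are assumptions, not Malvern's actual limits.

```python
def check_obscuration(obscuration_pct, dv50_um):
    """Illustrative obscuration sanity check.

    Assumed working bands (examples only): finer materials (Dv50 below
    10 um) need a lower obscuration to avoid multiple scattering.
    """
    low, high = (0.5, 5.0) if dv50_um < 10.0 else (5.0, 20.0)
    if obscuration_pct > high:
        return "too much sample: risk of multiple scattering"
    if obscuration_pct < low:
        return "too little sample: poor signal-to-noise"
    return "ok"
```

With these assumed bands, the yellow ink measurement at 11.42% obscuration (Dv50 ≈ 0.043 μm) would be flagged as too concentrated, while the repeat at 2.45% would pass.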
Once the full sequence of sample measurements is completed, the Data Quality Guidance tool performs dataset variability checks in accordance with ISO and USP standards. If the percentage relative standard deviation (%RSD) values fall outside the ISO and/or USP limits, this is indicated within the data quality tab, as shown in Figure 6.
For additional sample data stability checks, the user can also input their own manual %RSD limits, which are available within the Data Quality Guidance report tab post-measurement (Figure 7).
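The variability check itself is straightforward to express. The sketch below computes %RSD over repeat measurements and compares it against limits in the spirit of ISO 13320 (coefficient of variation below 3% for Dv50 and 5% for Dv10/Dv90, doubled for fine materials); treat these limits and function names as illustrative and consult the standard for the normative values.

```python
import statistics

def percent_rsd(values):
    """%RSD = 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def check_variability(dv10s, dv50s, dv90s, fine=False):
    """Return {percentile: (%RSD, within_limit)} for repeat measurements.

    Assumed limits: 3% for Dv50, 5% for Dv10/Dv90, doubled when
    `fine` is True (material below 10 um), following ISO 13320's scheme.
    """
    scale = 2.0 if fine else 1.0
    limits = {"Dv10": 5.0, "Dv50": 3.0, "Dv90": 5.0}
    measured = {"Dv10": percent_rsd(dv10s),
                "Dv50": percent_rsd(dv50s),
                "Dv90": percent_rsd(dv90s)}
    return {k: (rsd, rsd <= limits[k] * scale) for k, rsd in measured.items()}
```

A stable set of repeat measurements (sub-percent %RSD) passes on all three percentiles; a drifting dataset fails, mirroring the warning raised in the data quality tab.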
With prompt data quality feedback available during the measurement, you will be the first to know if something isn't right with your data, so you can fix it early and avoid mistakes in your results. You will spend less time manually identifying and troubleshooting issues, and more time working with results you're confident in. Not only will you save time and effort, but you will also waste less sample and dispersant by avoiding repeat measurements.
For an expert user, the tool supports you in verifying your data quality; for a novice user, it guides you through the measurement like a built-in training session – your results will be more accurate this time, and you'll know the pitfalls to avoid next time!
With Data Quality Guidance, it is now easier than ever to achieve accurate and reliable data.