I was able to size the lactose particles after sieving using the SympaTEC HELOS laser diffractor in the lab. I unfortunately had to do this step multiple times as I was having some difficulty interpreting the results. Nonetheless, I believe I understand how the device works now.
The sizer uses a laser to illuminate particles suspended in mineral oil containing 1% Span-85. The laser is collimated, with a beam width of about 2 cm. A lens then focuses the light from the sample onto a detector placed in the Fourier plane of the lens, so that it records the diffraction patterns from the particles. Using Fraunhofer diffraction theory, a particle size is estimated from its diffraction pattern.
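To get a feel for how diffraction angle maps to particle size, here is a minimal sketch of the Fraunhofer relation for a circular particle, where the first minimum of the Airy pattern falls at sin θ = 1.22 λ/d. The He-Ne wavelength is my assumption about the HELOS light source, and the diameters are illustrative, not measured values.

```python
import math

WAVELENGTH = 632.8e-9  # meters; assuming a He-Ne laser source

def first_minimum_angle(diameter_m):
    """Angle (degrees) of the first Airy minimum for a circular particle,
    per Fraunhofer theory: sin(theta) = 1.22 * lambda / d."""
    return math.degrees(math.asin(1.22 * WAVELENGTH / diameter_m))

# Smaller particles diffract light to larger angles, which is how the
# detector arrays at different radii encode different size classes.
for d_um in (10, 50, 100):
    theta = first_minimum_angle(d_um * 1e-6)
    print(f"{d_um:3d} um particle -> first minimum at {theta:.3f} deg")
```

The inverse relationship is the whole trick: the ring pattern on the Fourier-plane detector shrinks as the particle grows, so the radial intensity profile can be inverted into a size distribution.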
What caused my initial hesitation about the results was the instructions I was given: I was told not to put a lot of particles in the cuvette, so I didn't, but I was also told to add more particles over time and take successive measurements. I found documentation on the computer stating that one should shoot for a 10–15% optical density, which, at the time of my first run with the machine, meant nothing to me. The main cause of my confusion, and the reason I redid the sizing, was that adding particles to the cuvette caused the sizing results to shift slightly.
My second go with the machine brought more clarity. I now understand the caution against putting a ton of particles in the cuvette: you don't want to saturate the detector. If there are too many particles, then the detector (which I'm assuming is nothing more than a camera with multiple arrays) is unable to resolve individual diffraction patterns through the software. I did oversaturate the detector once, but I just waited a couple of minutes until the particles settled in the cuvette, then retook the measurement, and sure enough the detector wasn't saturated. Of course, doing this skews the particle size results, since the detector is then seeing only the particles small enough to stay suspended. The sizer reports the "optical density" of a measurement, which I believe is a measure of the turbidity of the solution in the cuvette. I was able to get it as high as 44% optical density without the software complaining.
The data I obtained shifted depending on how long the particles had been in the cuvette, since the larger ones sank. So I made sure to load the cuvette with plenty of particles, wait until the instrument could detect them, and then take measurements as the particles sank. I also added particles to the cuvette over time, in small amounts, as I was instructed to do. Doing this gave a spread in sizing that I'm assuming comes from the spread in particle sizes after sieving.
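The settling behavior is consistent with Stokes' law: terminal settling velocity scales with the square of the diameter, so the large particles drop out of the measurement zone much faster than the fines. A rough sketch, where the density and viscosity values are my assumptions rather than measured properties of the actual suspension:

```python
# Stokes' law settling velocity: v = (rho_p - rho_f) * g * d^2 / (18 * mu).
# Material properties below are rough assumptions, not measured values.
RHO_LACTOSE = 1520.0  # kg/m^3, approximate density of alpha-lactose
RHO_OIL = 850.0       # kg/m^3, typical light mineral oil
MU_OIL = 0.03         # Pa*s, assumed oil viscosity
G = 9.81              # m/s^2

def settling_velocity(diameter_m):
    """Terminal settling velocity (m/s) of a sphere in the Stokes regime."""
    return (RHO_LACTOSE - RHO_OIL) * G * diameter_m**2 / (18 * MU_OIL)

for d_um in (10, 50, 100):
    v = settling_velocity(d_um * 1e-6)
    print(f"{d_um:3d} um particle -> {v * 1e3:.4f} mm/s")
```

The d² dependence means a 100 µm particle settles a hundred times faster than a 10 µm one, which is why successive measurements on a settling cuvette drift toward smaller sizes.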
The SympaTEC software gives a plot of the sizing data and reports the 10th, 50th, and 90th percentiles of the size distribution. I believe it does this by fitting a sigmoid to the cumulative size data and reading the corresponding percentiles off the curve. The plot itself, of course, is only exported as a .bmp image, which is useless for further analysis. So I'm working on a script to plot the data myself, which, thankfully, was also output to a .txt file. For all intents and purposes, the data output by the device is sufficient for sizing; I just want to have a little fun with it.
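As a starting point for that script, here is a sketch of extracting D10/D50/D90 from cumulative size data. I use plain linear interpolation between bins rather than the sigmoid fit the software presumably does, and the sample data below is made up; the real values would be parsed from the HELOS .txt export, whose exact format I haven't verified.

```python
# Hypothetical cumulative size distribution (bin size in um, cumulative %)
sizes_um = [5, 10, 20, 40, 63, 90, 125, 180]
cum_pct = [2, 8, 22, 48, 71, 88, 96, 100]

def percentile_size(p, sizes, cumulative):
    """Size at which the cumulative distribution crosses p percent,
    found by linear interpolation between adjacent bins."""
    for i in range(1, len(cumulative)):
        if cumulative[i] >= p:
            lo_c, hi_c = cumulative[i - 1], cumulative[i]
            lo_s, hi_s = sizes[i - 1], sizes[i]
            frac = (p - lo_c) / (hi_c - lo_c)
            return lo_s + frac * (hi_s - lo_s)
    return sizes[-1]

for p in (10, 50, 90):
    print(f"D{p}: {percentile_size(p, sizes_um, cum_pct):.1f} um")
```

A sigmoid (e.g. log-normal CDF) fit would smooth over bin-edge noise, but for sieved fractions with plenty of bins, interpolation on the cumulative curve should land close to the values the instrument reports.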