
Taking It One Particle at a Time

Tue, 08/10/2010 - 11:04am
Paul Livingstone

Two new characterization platforms reveal that there’s more than one way to size up a nanoparticle.

Archimedes Sensor

The microfluidic resonator in Affinity Biosensors' Archimedes particle characterizer is sensitive enough to detect the frequency change caused by individual particles moving through its channel. Image: Affinity Biosensors

Once upon a time, when particles were measured in micrometers and an optical microscope could do the heavy lifting, the size and distribution of particles in a solution were discovered through the strength of a researcher’s eyesight.

To see truly small particles, however, less direct approaches had to be invented. After the invention of the laser, researchers determined that Rayleigh scattering could accomplish the task. As long as a particle is small compared to the wavelength of light, it scatters light in all directions. If the illumination is highly collimated, as with a laser, the scattered intensity can be measured accurately, and particle characteristics can be determined by watching how that intensity changes over time. It is guaranteed to change, too, because such small particles are constantly jostled about in Brownian motion.

This method, dynamic light scattering (DLS), has become the gold standard for characterization of very small particles since its invention not long after the laser. It is not, of course, the only way to analyze particles. Attenuators, sieves, filters, and centrifuges are all employed for larger particle sizes. But for analyzing samples containing broad distributions of species of widely differing molecular masses (such as proteins and aggregates), DLS is reliably the cost-effective choice.
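For readers who want the arithmetic behind the technique, the logic can be sketched in a few lines of Python. This is a minimal illustration only, assuming a single, well-behaved particle population; the function name, optics values, and decay rate are invented for the example and do not represent any vendor's software. The idea is that the autocorrelation of the fluctuating scattered intensity decays at a rate Gamma = D*q^2, where q is the scattering vector set by the laser wavelength, scattering angle, and solvent refractive index, and the Stokes-Einstein relation then converts the diffusion coefficient D into a hydrodynamic diameter.

import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def dls_diameter(decay_rate_hz, wavelength_m, scatter_angle_rad,
                 refractive_index, temperature_k, viscosity_pa_s):
    """Hydrodynamic diameter (m) from a fitted autocorrelation decay rate."""
    # Scattering vector magnitude for the given optics
    q = (4.0 * np.pi * refractive_index / wavelength_m) * np.sin(scatter_angle_rad / 2.0)
    diffusion = decay_rate_hz / q**2  # Gamma = D * q^2
    # Stokes-Einstein: d = kT / (3 * pi * eta * D)
    return K_B * temperature_k / (3.0 * np.pi * viscosity_pa_s * diffusion)

# Example: a 5,000 per-second decay measured at 90 degrees with a 633-nm
# laser in water at 25 C (viscosity 0.89 mPa*s) works out to roughly 34 nm
print(dls_diameter(5.0e3, 633e-9, np.radians(90.0), 1.33, 298.15, 0.89e-3))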

But the demands of R&D have pushed DLS to its limits. While excellent for collecting data on the dynamic properties of softer materials, DLS often cannot cope with the scattering properties of large particles or those with a high refractive index (such as gold). Such particles cause multiple scattering, especially at high concentrations, clouding the data.

Compensatory techniques such as cross-correlation light scattering, or 3-D dynamic light scattering, can isolate the scattering of interest. Additional technologies have allowed DLS to accommodate high-throughput analysis, or even to obtain information about zeta potential. And variations such as photon correlation spectroscopy (PCS) use the Stokes-Einstein equation and diluted sample solutions to enhance sensitivity. However, new methods have hit the market that may soon give DLS a run for its money.

In Brownian motion
Robert Brown first observed the seemingly random motion of small particles in a solution nearly two centuries ago. The long-accepted explanation for why such particles constantly move, bombardment by the molecules of the surrounding fluid, is often described by envisioning a particle as a large balloon moving over a stadium crowd: multiple people push up on the balloon at once and send it in seemingly random directions. This continuous-time stochastic process is detected in ensemble fashion by DLS, with either a photomultiplier or a CCD detector.

However, researchers also want to know the size and location of individual nanoparticles in solution. Nanoparticle tracking analysis (NTA), invented by Bob Carr, now the CTO of NanoSight, Amesbury, UK, is the first technology that can directly track the Brownian motion that makes DLS work.

“The big difference between the two methods is that dynamic light scattering is an ensemble technique and nanoparticle tracking works on a particle-by-particle basis,” says Jeremy Warren, CEO of NanoSight. With DLS results, he says, a single measure describing the particle size distribution must stand in for a much more complicated situation.

“What you end up with is something like an average that is biased toward larger particles,” he says.

NTA, however, builds its data by tracking the speed of individual particles through the solution. Carr accomplished this by sending a laser beam through a cell containing the particle suspension. The laser light produces a scatter pattern.

“We’re not really producing a resolved image where you can see the shape of each particle. Instead we are producing a visualization of the scatter that’s produced,” he says. This appears as a series of bright spots whose brightness is dictated by the size of the particle and its refractive index. Big particles look bigger and are easily visualized. Under Brownian motion, small particles move quickly and large particles move slowly. NanoSight’s NTA measures the speed of each particle over several tens of video frames. That speed is directly related to the particle’s size, with smaller particles moving faster, and the results are then presented as a particle size distribution.

The technology relies on a 635-nm laser beam passed through a prism-edged optical unit that causes the laser to refract at the interface between the flat optic and a liquid layer placed above it. This compresses the beam within the liquid film that contains the sample. A 20x magnification objective is fitted to an otherwise conventional microscope, which is in turn fitted with a 30-fps CCD camera.

Each particle is simultaneously, but separately, visualized and tracked by software. The average distance each particle moves in x and y in the image is automatically calculated. From this value the particle’s diffusion coefficient can be obtained and, knowing the sample temperature and solvent viscosity, the particle’s hydrodynamic diameter identified. The Stokes-Einstein equation reveals the 3-D Brownian movement even though it is tracked in only two dimensions.
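That chain of calculations is compact enough to sketch directly. The Python snippet below is an illustrative version of the logic just described, not NanoSight's software: the mean-squared displacement of a two-dimensional track gives a diffusion coefficient (in 2-D, the mean-squared step over a time t is 4Dt), and the Stokes-Einstein relation converts that coefficient into a hydrodynamic diameter. The simulated 100-nm particle, 30-fps frame rate, and water viscosity are assumptions made for the example.

import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(track_xy_m, frame_interval_s, temperature_k, viscosity_pa_s):
    """Estimate hydrodynamic diameter (m) from a 2-D particle track in meters."""
    steps = np.diff(track_xy_m, axis=0)         # frame-to-frame displacements
    msd = np.mean(np.sum(steps**2, axis=1))     # mean-squared displacement per frame
    diffusion = msd / (4.0 * frame_interval_s)  # 2-D Brownian motion: <r^2> = 4*D*t
    # Stokes-Einstein: d = kT / (3 * pi * eta * D)
    return K_B * temperature_k / (3.0 * np.pi * viscosity_pa_s * diffusion)

# Simulate a 100-nm particle in water at 25 C tracked for 500 frames at 30 fps
rng = np.random.default_rng(0)
d_true, eta, temp, dt = 100e-9, 0.89e-3, 298.15, 1.0 / 30.0
d_coef = K_B * temp / (3.0 * np.pi * eta * d_true)
track = np.cumsum(rng.normal(0.0, np.sqrt(2.0 * d_coef * dt), size=(500, 2)), axis=0)
print(hydrodynamic_diameter(track, dt, temp, eta))  # close to 1e-7 m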

The range of particle sizes that can be analyzed by NTA depends on the particle type. For very high refractive index (RI) particles, such as colloidal gold, accurate determination of size can be achieved down to 10 nm diameter. For lower refractive index particles, such as those of biological origin, the smallest detectable size might only be between 25 and 35 nm, about the size of a virus. The upper size limit is reached when the Brownian motion of a particle becomes too limited to track accurately, typically at 1 to 2 µm diameter.

Work by graduate student Iker Montes-Burgos of Professor Kenneth Dawson’s group at University College Dublin has shown the utility of the NTA approach for characterizing silica nanoparticles before studying how they enter cells and decrease cell viability. Monomodal particle sizes obtained using NTA broadly agreed with sizes obtained with PCS. And several universities are now using the recently introduced NS500 to determine the concentration of metallic or polymer nanoparticles produced through the wear-and-tear of artificial joints. The NS500 also adds fluorescence capability.

In addition, gold nanoparticles are a rich field for research with NTA because they are convenient in biological systems as a way to deliver molecules to specific locations in the body. Non-toxic and featuring a high refractive index, they are ideal objects for study by NTA.

One constraint of the system is that a sufficient number of particles must be analyzed within a certain time period so that a statistically meaningful and reproducible particle size distribution can be obtained. Diluting the sample is usually enough to reach a suitable concentration.

Measuring Particles

NanoSight's nanoparticle tracking analysis uses Brownian motion to locate and follow individual particles in solution. Image: NanoSight

The benefit of being able to simultaneously measure two independent parameters, such as particle scattering intensity and particle diameter (from dynamic behavior), can prove valuable in resolving mixtures of different particle types. Similarly, small differences in particle size within a heterogeneous population can be resolved with far higher accuracy than would be achieved by other ensemble light scattering techniques.

“We also provide a measure of concentration. You don’t need to know the density or the refractive index of the particles because the diffusion rate of the particles is independent of density,” says Warren.

Particle characterization without photons
At Pittcon 2010, Archimedes, the award-winning product of the vendor conference, was tucked away in a small booth next to Particle Sizing Systems, Willow Grove, Pa., which serves as its distributor.

It resembled an atomic force microscope. Launched in 2009, the Archimedes particle metrology system from Affinity Biosensors, Santa Barbara, Calif., takes advantage of the precision of a cantilever-based system without relying on a particle’s optical properties. Instead of reading a surface, it reads the change in resonance of a microfluidic system. By adopting a non-optical, quantitative approach, the Affinity team has brought strong nanoscale resolution to both particle sizing and particle density measurements.

Cantilever-based sensing platforms have long been used to profile surface characteristics, and Affinity Biosensors’ CEO, Ken Babcock, appreciated the accuracy of such systems enough to want to apply it to other types of samples. Babcock was formerly a technologist at Digital Instruments (DI) and Veeco, which is well known for its AFMs. Starting around 2001, Veeco tried to develop a cantilever sensing platform for use in fluid, but the sensitivity was never very good.

However, the work at Veeco caught the attention of Scott Manalis at MIT, who had been an intern at DI when Babcock was starting his career. “He called me one day and told me, ‘You’re doing it wrong. Don’t put the cantilever in the fluid; put the fluid in the cantilever,’” says Babcock. Manalis and his postdoc, Thomas Burg, had realized that enclosing the fluid inside the cantilever, while placing the exterior in a vacuum environment, would produce the high quality factor needed for high-resolution frequency measurements.

Manalis and Burg began designing a solution. But their breakthrough required outside help. Together, Babcock and Manalis successfully applied for a grant from the Institute for Collaborative Biotechnologies in conjunction with a U.S. Army project on food safety, and with Innovative Micro Technologies Inc., Santa Barbara, Calif., a leading MEMS company, were able to build the complex but tiny chip.

What they developed is a microchannel resonator, a departure from previous MEMS chips in that it is designed both to resonate mechanically and to handle samples in solution. The embedded sensors are sensitive enough to detect masses as small as 1 femtogram.

“The way I like to describe it is to imagine a tuning fork that is hollow. Imagine that the fork is vibrating at a certain frequency. If, say, a fly lands on the tuning fork, the added mass will cause a change in the resonant frequency. We’re using that same principle to detect particles by suspending them in fluid and flowing them through the hollow tuning fork. That’s how we use resonant mass detection,” says Babcock. The result is a way to weigh particles one by one at high resolution.

In addition to enabling ultra-high-resolution measurements, the microfluidic channel is a built-in way to deliver a sample to the resonator. And the architecture of the chip allows for high-throughput tasks.

Frequency readings are taken 1,000 times per second with a resolution of 40 ppb, allowing the particles to be weighed as they pass through the resonator in a few milliseconds.
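The arithmetic behind those readings is simple enough to sketch. In the hedged Python example below, a transient frequency dip is converted to a buoyant mass (the particle’s mass minus the mass of fluid it displaces) using the small-shift approximation for a point mass at the resonator’s antinode, and then, if the particle’s density is known, to an equivalent spherical diameter. The resonator numbers, densities, and function names are illustrative assumptions, not Affinity Biosensors’ specifications.

import math

def buoyant_mass_fg(freq_shift_hz, resonant_freq_hz, effective_mass_ng):
    """Buoyant mass in femtograms from a transient frequency dip.

    Assumes the small-shift approximation df/f0 = -dm / (2 * M_eff)
    for a point mass at the antinode of the resonator.
    """
    dm_ng = -2.0 * effective_mass_ng * (freq_shift_hz / resonant_freq_hz)
    return dm_ng * 1.0e6  # 1 ng = 1e6 fg

def equivalent_diameter_um(mass_fg, particle_density_g_ml, fluid_density_g_ml=1.0):
    """Equivalent spherical diameter (micrometers) given particle and fluid densities."""
    # buoyant mass = (rho_particle - rho_fluid) * volume; 1 um^3 = 1e-12 mL
    volume_um3 = (mass_fg * 1.0e-15) / ((particle_density_g_ml - fluid_density_g_ml) * 1.0e-12)
    return (6.0 * volume_um3 / math.pi) ** (1.0 / 3.0)

# Example: a -0.02 Hz dip on a hypothetical 200-kHz resonator with 100-ng effective mass
m_fg = buoyant_mass_fg(-0.02, 2.0e5, 100.0)
print(m_fg)                                  # about 20 fg of buoyant mass
print(equivalent_diameter_um(m_fg, 1.05))    # about 0.9 um for a 1.05 g/mL particle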

According to Babcock, particles as small as 50 nm and as large as 5 µm have been regularly detected by Archimedes, and diverse samples have been measured, from nanoparticles to single bacterial cells.

The system’s obvious strengths are its particle-by-particle resolution and accuracy, but it is also well suited to distinguishing between particles close in size and to determining particle concentration through control of the fluid flow rate.

This ability can help the user determine, for example, how many millions of bacteria are in a vial. An additional benefit, Babcock says, is that Archimedes is calibrated to within 1% and measurements are NIST-traceable.
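The concentration measurement itself is straightforward bookkeeping once the flow is controlled, as the short sketch below illustrates; the count, flow rate, and duration are invented for the example.

def concentration_per_ml(particle_count, flow_rate_nl_per_min, minutes):
    """Particles per milliliter from a count over a known sampled volume."""
    sampled_volume_ml = flow_rate_nl_per_min * minutes * 1.0e-6  # 1 nL = 1e-6 mL
    return particle_count / sampled_volume_ml

# Example: 12,000 particles counted over 10 minutes at 50 nL/min -> 2.4e7 per mL
print(concentration_per_ml(12000, 50.0, 10.0))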

“There’s a future here. We’re just starting out,” says Babcock. A recent paper in Nature Methods by the Manalis group discussed a method to use this technology to trap an individual cell and observe its growth. He anticipates Archimedes eventually being used for mass-based flow cytometry and direct detection of viruses, in addition to the sizing and mass density measurement of particles.

But how will Archimedes fare in a mature DLS marketplace? Accustomed to photonics-based systems, some users may be reluctant to transition immediately. With the first production systems now in the field, Babcock says that, for now, the instrument will occupy a niche for applications demanding the resolution, accuracy, and repeatability that Archimedes can provide, and it has received strong interest from a diverse set of potential customers. Applications are already being found: Archimedes is helping several biotech firms quantify protein aggregation. In the future, even smaller resonators could push detection limits further still.

Published in R&D Magazine, Vol. 52, No. 4, August 2010, pp. 14-17.
