Hope for Alzheimer’s Patients?

PET image of brain using PIB
Courtesy of Dr. Jagust Lab, UC Berkeley

My mother died of Alzheimer’s at the age of 69, so I can personally attest to the horror of this disease. I can think of few things worse than slowly watching your cognitive abilities decline, particularly if you are aware of the progressive deterioration, as my mother was. So I’m keeping a close watch on the latest Alzheimer’s research, including the work of my colleague William Jagust, a neuroscientist at UC Berkeley.

Dr. Jagust is participating in the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a large multicenter project supported by the NIH, private pharmaceutical companies, and nonprofit organizations. The primary goal of ADNI is to discover indicators (biomarkers) that can track disease progression and, hopefully, diagnose Alzheimer’s early on. Basically, they want to speed up and streamline drug and clinical trials by developing biomarkers that track Alzheimer’s more reliably.

The initial five-year ADNI research project was completed last fall. It studied cognition, function, brain structure and biomarkers in 800 subjects (200 elderly controls, 400 subjects with mild cognitive impairment, and 200 subjects with Alzheimer’s). The clinical data from the patients, including MRI scans, PET scans, blood tests, neuropsychological tests, and genetic tests, went into a large database. The truly unique thing is that this database can be accessed by the public through a website. Basically, the raw data (with patient personal information removed) is made available for everyone to use, in the hope that this will help scientists more rapidly understand and treat Alzheimer’s. The ADNI project just received a second phase of funding, so the studies will be expanded.

Although the cause and progression of Alzheimer’s disease are not fully understood, current research indicates that the disease is associated with the formation of “amyloid plaques” and “neurofibrillary tangles” in the brain that damage nerve cells. What does this mean? Amyloid plaques are built from beta-amyloid, a protein fragment that the body produces naturally. In a healthy brain, these protein fragments are broken down and eliminated. In a brain with Alzheimer’s, the fragments instead accumulate into hard, insoluble plaques between nerve cells. This excess amyloid buildup occurs before clinical Alzheimer’s symptoms appear, so it may be used as a predictor of disease. Neurofibrillary tangles are insoluble twisted fibers found inside the brain’s cells. These tangles mainly consist of a protein called tau, which helps form the microtubules that transport nutrients from one part of the nerve cell to another. In an Alzheimer’s brain, the tau protein is abnormal and the tangles collapse this vital transport system.

Dr. Jagust and other researchers are studying this beta-amyloid buildup using medical imaging, including PET imaging with a new drug called [11C]Pittsburgh Compound B. This new PET drug binds to beta-amyloid plaques and indicates their size and position. “With PET, we’re able to study the biochemistry of the brain, and with MRI we can study both the anatomy and structure of the brain,” Jagust said. “We can also study some of the function of the brain to see what parts of the brain are active during different cognitive tests. So when you put all this information together, you can get a very detailed picture of how the brain is functioning and how function and structure might change with age.” Last fall Jagust published an article on the relationships between biomarkers in aging and dementia. The group found that the confluence of three factors — beta-amyloid deposition, atrophy of the hippocampus (the part of the brain that stores and sorts memories), and episodic memory loss — signals an early stage of Alzheimer’s. Hopefully this new understanding will ultimately lead to earlier and more accurate diagnosis.

I don’t have room here to summarize all the results from the Jagust lab, let alone all the other labs doing Alzheimer’s research. But I must say that I’m optimistic given the recent progress they have made in understanding the disease. There are also many clinical trials underway for new Alzheimer’s drugs, including ones that hope to stop cognitive deterioration instead of just reducing symptoms. I’m encouraged but I still tell my friends who are doing the research that they need to find a cure within the next 10 years, because I do not want to suffer through this frightening disease like my mother did.

PET Imaging — Not for Cats or Dogs

PET ring drawing

As a medical imaging researcher, I notice when medical imaging technologies are mentioned by popular news media or medical-themed television shows. Lately I’ve been seeing PET imaging mentioned more frequently, including on TV shows like House and Grey’s Anatomy. This probably just reflects the fact that dramatically increasing numbers of PET scans are being performed in real life in clinics and hospitals. So what is PET imaging? Funny you should ask, because I just happen to do research in this field.

In this context, PET stands for Positron Emission Tomography. During a PET scan, a trace amount of a biologically active, radioactive drug is injected into the patient’s vein. The drug localizes somewhere in the patient, depending on its metabolic properties. The drug then emits a positron (the anti-particle of the electron), and the positron annihilates with an electron in the patient’s body. The annihilation energy emerges as a pair of gamma rays traveling in nearly opposite directions; these pass through the patient and are detected by the PET scanner. The detected gamma ray signals are used to create a 3-D volumetric image, or picture, of the drug’s concentration in the body.
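To make the geometry concrete, here is a toy sketch in Python (my own illustration, not any scanner’s actual software) of how one annihilation event produces a pair of detector hits on the ring. The ring radius and event location are made-up numbers; the point is that the line between the two hits passes right through the event, which is what lets the scanner localize the drug.

```python
import math
import random

# Toy geometry of PET coincidence detection: an annihilation event emits
# two back-to-back gammas; each hits the detector ring, and the chord
# between the two hits (the "line of response") passes through the event.

R = 40.0  # detector ring radius in cm (assumed value for illustration)

def ring_hit(px, py, dx, dy):
    """Where a ray from (px, py) along unit direction (dx, dy) crosses |r| = R."""
    # Solve |p + t*d|^2 = R^2 for the positive root t.
    b = px * dx + py * dy
    c = px * px + py * py - R * R
    t = -b + math.sqrt(b * b - c)
    return (px + t * dx, py + t * dy)

# One simulated annihilation inside the patient, at a random emission angle:
ex, ey = 5.0, -3.0
theta = random.uniform(0, 2 * math.pi)
hit1 = ring_hit(ex, ey, math.cos(theta), math.sin(theta))
hit2 = ring_hit(ex, ey, -math.cos(theta), -math.sin(theta))  # opposite gamma

# The event lies on the segment between the two detector hits:
dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
assert abs(dist(hit1, (ex, ey)) + dist((ex, ey), hit2) - dist(hit1, hit2)) < 1e-9
```

A real scanner collects millions of these coincidence lines and reconstructs the 3-D drug concentration from them; this sketch only shows why a single detected pair constrains where the event happened.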

PET imaging technology is unique because it images a patient’s metabolism, whereas most other medical imaging techniques measure anatomical structure. For example, X-ray CT or MRI scans can be used to identify a tumor because they show the patient’s anatomy in detail. PET imaging, however, can help determine whether the tumor is benign or cancerous by measuring whether or not the tumor takes up the radioactive drug. In practice, you’d really like to know both: detailed anatomical structure and metabolic function. Recent work has demonstrated the increased clinical diagnostic value of fusing imaging technologies based on function (e.g., PET, SPECT or functional MRI) with those based on structure (e.g., CT, MRI, or ultrasound). As a result, PET and CT scanners are now typically combined into a single gantry system, so that images can be taken from both devices sequentially during a single procedure.

Since PET measures metabolism instead of anatomical structure, it is mostly used to image organs whose size or shape does not indicate whether they are functioning properly, such as the brain or heart. It is also used to diagnose diseases that exhibit an abnormal metabolism, such as cancer.

Stay tuned this week when I discuss some Alzheimer’s research that utilizes PET imaging.

Obama’s Carbon Tax Cools Down

Although there is some controversy over the issue, human-induced greenhouse gas emissions are generally considered to be the primary cause of global warming. Carbon dioxide is considered the most important of these “greenhouse” air pollutants, and the burning of fossil fuels a main source. This is because fossil fuels contain carbon atoms that are released as carbon dioxide when the fuels are burned. For example, gasoline consists of atoms of hydrogen and carbon (about two hydrogens per carbon). When it burns, the hydrogen combines with oxygen to make water, and the carbon combines with oxygen to make carbon dioxide. If the combustion is incomplete, it also makes carbon monoxide.
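The chemistry above is enough for a back-of-the-envelope estimate of how much carbon dioxide gasoline produces. Here is a quick calculation, modeling gasoline as pure octane (C8H18), which is an approximation on my part since real gasoline is a blend of hydrocarbons:

```python
# CO2 emitted per kilogram of gasoline burned, modeling gasoline as
# pure octane (C8H18) -- an assumption, since gasoline is really a blend.

M_C, M_H, M_O = 12.011, 1.008, 15.999   # atomic masses in g/mol
M_octane = 8 * M_C + 18 * M_H           # ~114.2 g/mol
M_co2 = M_C + 2 * M_O                   # ~44.0 g/mol

# Complete combustion: 2 C8H18 + 25 O2 -> 16 CO2 + 18 H2O,
# i.e., each mole of octane yields 8 moles of CO2.
kg_co2_per_kg_fuel = (8 * M_co2) / M_octane
print(f"{kg_co2_per_kg_fuel:.2f} kg of CO2 per kg of gasoline")  # ~3.08
```

The striking part is that a kilogram of fuel yields about three kilograms of carbon dioxide, because each light carbon atom picks up two heavier oxygen atoms from the air when it burns.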

In contrast, renewable non-combustible energy sources (such as wind or sunlight) do not convert hydrocarbons into carbon dioxide. That means that renewable energy technologies have a better chance at competing with fossil fuels (such as petroleum and coal) if you take into account the full energy cost, including the amount of carbon dioxide emitted during energy production. Solar plants cost more to construct than coal plants, but they don’t pollute. So economists argue that solar might actually be cheaper if you include the damage done by carbon dioxide pollution in the cost of coal. Basically, if you give a monetary value to the cost of polluting the air, then emissions become an internal cost of doing business that is visible on the balance sheet.

In order to address this issue, President Obama has pushed to establish carbon emission trading (known as “cap-and-trade”). The basic idea of this carbon exchange market is that the government sets a limit (cap) on the total amount of greenhouse gases that can be emitted nationally, and the market sets the price. In other words, a business would in effect pay a carbon tax, buying from the government the right to emit carbon dioxide. The goal is to create a financial incentive to reduce greenhouse gas emissions, while boosting energy efficiency and renewable energy efforts.

President Obama specifically proposes a 14% emission reduction from 2005 levels by 2020 (and an 83% reduction by 2050). He also proposes that companies buy an allowance, or permit, for each ton of carbon emitted, at an estimated cost of $13 to $20 per ton to start. Rather than following Obama’s proposed system of “full permit auctioning,” the House of Representatives passed a bill last June that would establish a variant of a cap-and-trade plan for greenhouse gases. I’m not going to get into the details of this American Clean Energy and Security Act of 2009, which is still under consideration in the Senate. However, I did want to note that this bill’s cap-and-trade program allocates 85% of allowances to industry for free, auctioning only the remaining 15%. There is a tremendous amount of debate on the best system to curb emissions of climate-changing gases, and there are other competing bills currently in the House and Senate. However, it appears that none of these bills are likely to pass Congress. “Realistically, the cap-and-trade bills in the House and the Senate are going nowhere,” said Senator Lindsey Graham, who is trying to create bipartisan climate and energy measures. “They’re not business-friendly enough, and they don’t lead to meaningful energy independence.”
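To get a feel for the money involved under a full-auction system, here is a back-of-the-envelope calculation using the $13 to $20 per ton starting estimates quoted above. The plant’s annual emissions figure is my own assumed example (roughly the scale of a large coal-fired power plant), not a number from any bill:

```python
# Rough permit-cost sketch under a full-auction cap-and-trade system,
# using the $13-$20/ton starting price estimates cited in the post.
# The emissions figure below is an assumed example, not from the article.

annual_emissions_tons = 3_000_000   # hypothetical large coal plant

costs = {price: annual_emissions_tons * price for price in (13, 20)}
for price in sorted(costs):
    print(f"At ${price}/ton: ${costs[price]:,} per year in permits")
```

Even at the low end, that is tens of millions of dollars a year now showing up on the balance sheet, which is exactly the internal cost of polluting that the scheme is designed to create.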

Airport Insecurity

airplane
Courtesy of Yuna via Creative Commons

As you know, there was an aborted attempt to bomb Northwest Airlines Flight 253 last Christmas. The explosive (a.k.a., “crotch bomb”) was hidden in Mr. Abdulmutallab’s underwear to avoid detection. In the wake of this attempted terrorist attack, there was a lot of press on using full-body scanners at airport security checkpoints because such a scanner would have revealed the bomb. The news coverage focused primarily on the issue of privacy invasion weighed against the benefit of increased airport security.

Usually the news coverage included a frequent flier pleading for us to spend the money on full-body scanners to improve safety. However, I saw very little coverage of the potential health effects of frequent full-body scans. As a research scientist, I know just how much paperwork is required to scan humans even for medical purposes, so I know that these airport security scanners must pose very little health risk. Still, all the hype got me curious about the technology used. There are two different technologies used for airport security full-body scanners — millimeter wave technology and backscatter technology.

Millimeter wave technology uses low-level electromagnetic waves. The millimeter wave is transmitted from two antennas simultaneously as they rotate around the body at high speed. A person walks into a large portal that resembles a glass elevator, pauses, and lifts his arms while being scanned for about 2 seconds. The wave energy reflected back from the body is then used to construct a 3-D image, which resembles a fuzzy photo negative that is displayed on a monitor in a nearby room. According to the Transportation Security Administration, the energy projected by a millimeter wave system is thousands of times less intense than a cell phone transmission.

Backscatter technology projects a very weak, ionizing X-ray beam over the body surface. A person stands against a refrigerator-sized backscatter machine as a narrow, low-intensity X-ray beam scans his entire body at high speed. The beam is swept rapidly across the person’s body in the horizontal direction while simultaneously moving down at a slower rate; the entire scan takes a few seconds. The reflection (the “backscatter”) of the beam is detected, digitized and displayed on a monitor in a nearby room. The images look like a chalk drawing.

Many reports have erroneously stated that the X-rays from this backscatter technology penetrate clothing but not skin. In fact, the X-rays penetrate the skin but not much beyond it. However, the dose from the X-ray beam is truly negligible by any standard. It is roughly equal to the dose you receive from 15 minutes of natural background radiation (cosmic rays, radon, and the like). The dose from each scan is less than 10 microrem, which is equivalent to the dose you receive from two minutes of flying in an airplane at 30,000 feet. Put another way, the dose from a backscatter scan is less than 0.2% of the radiation received from a medical chest X-ray. Doctors and radiation experts argue that such a dose is inconsequential even for pregnant women.
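For the curious, here is how those dose comparisons stack up numerically. The 10 millirem chest X-ray and the 300 millirem/year average U.S. background figures are commonly cited round numbers that I am assuming here, not values from the post itself:

```python
# Putting the backscatter dose numbers side by side.
# Chest X-ray (~10 mrem) and average U.S. background (~300 mrem/year)
# are assumed round figures for the comparison.

scan_urem = 10                                        # one scan, in microrem
chest_xray_urem = 10_000                              # ~10 millirem
background_urem_per_min = 300_000 / (365 * 24 * 60)   # ~0.57 microrem/min

print(f"Scan as a fraction of a chest X-ray: {scan_urem / chest_xray_urem:.1%}")
print(f"Minutes of background radiation per scan: "
      f"{scan_urem / background_urem_per_min:.0f}")
```

The result, about 0.1% of a chest X-ray and on the order of 15 to 20 minutes of ordinary background exposure, is consistent with the figures quoted above.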

The American College of Radiology states, “An airline passenger flying cross-country is exposed to more radiation from the flight than from screening by one of these devices.”  However, these scans do have some limitations in terms of security effectiveness. For instance, these backscatter scanners cannot find weapons hidden in body cavities since the X-rays don’t penetrate much beyond the skin. Presumably the terrorists will adapt to account for the technology.

U.S. Fuel Usage

When thinking about energy alternatives to fossil fuels, we need to keep in mind how the United States uses its current fuel supplies. Based on a report by the Lawrence Livermore National Laboratory:

  • 28% is used for transportation (gasoline and jet fuel)
  • 40% is used to generate electric power
  • 20% is used for direct heating (natural gas, coal)
  • 32% is used by industry.

(Some of you will immediately notice that this list adds up to more than 100% — that is because of overlap. For instance, some of the electric power is used by industry.)

The reported numbers vary, but the bottom line is that we use fuel for transportation, electricity, heat and industry in comparable amounts. We need to keep this in mind when discussing energy policies. If we miraculously replace all gasoline with biofuels (alcohol made from plants), we will affect only 28% of the total. So we need to address several “sectors” or uses if we want to significantly reduce fossil fuel emissions.

Health News Blog

For those interested in health news blogs, I highly recommend “Health News from NHS.” This science blog looks at the science behind international news headlines. The unique thing about this blog is its format:

  • Summary of news reports
  • Where did the story come from?
  • What kind of research was this?
  • What did the research involve?
  • What were the basic results?
  • How did the researchers interpret the results?
  • Conclusion
  • Links to Headlines
  • Links to Science

Given the long list above, you’ll justifiably conclude that the blog is a little bit longer than the average science news blog. However, you’ll come away with a more thorough understanding of the topic as a result. It may appeal particularly to scientists, but it could appeal to everyone on topics of specific interest.
