I want to let my Bay Area readers know about an upcoming cool science lecture for the general public. Berkeley Lab presents “Just Say No To Carbon Emissions” on April 26 from 7-9 pm at the Berkeley Repertory Theatre. Admission is free. There will be three dynamic speakers from Berkeley Lab discussing renewable energy topics. Ramamoorthy Ramesh, a materials scientist, will describe current research efforts to make solar power cheap. Nan Zhou will discuss efforts to increase energy efficiency and reduce carbon dioxide emissions in China. Lastly, the geologist Curt Oldenburg will explain a strategy to reduce carbon emissions from coal and natural gas by capturing the carbon dioxide and storing it deep underground. This event is co-sponsored by “Friends of Berkeley Lab” and the “Berkeley Energy and Resources Collaboration.” For more information, check out their website.
My mother died of Alzheimer’s at the age of 69, so I can personally attest to the horror of this disease. I can think of few things worse than slowly watching your cognitive abilities decline, particularly if you are aware of the progressive deterioration, as my mother was. So I’m keeping a close watch on the latest Alzheimer’s research, including the work of my colleague William Jagust, a neuroscientist at UC Berkeley.
Dr. Jagust is participating in the Alzheimer’s Disease Neuroimaging Initiative (ADNI), which is a large multicenter project supported by the NIH, private pharmaceutical companies, and nonprofit organizations. The primary goal of ADNI is to discover indicators (biomarkers) that can track disease progression and hopefully diagnose Alzheimer’s early on. Basically, they want to help speed up and streamline drug and clinical trials by developing biomarkers that track Alzheimer’s more reliably.
The initial five-year ADNI research project was completed last fall. It studied cognition, function, brain structure, and biomarkers for 800 subjects (200 elderly controls, 400 subjects with mild cognitive impairment, and 200 subjects with Alzheimer’s). The clinical data from these subjects (MRI scans, PET scans, blood tests, neuropsychological tests, and genetic tests) went into a large database. The truly unique thing is that this database can be accessed by the public through a website. Basically, the raw data (with personal patient information removed) is made available for everyone to use, in hopes that this will help scientists more rapidly understand and treat Alzheimer’s. ADNI just received a second phase of funding, so the studies will be expanded.
Although the cause and progression of Alzheimer’s disease are not fully understood, current research indicates that the disease is associated with the formation of “amyloid plaques” and “neurofibrillary tangles” in the brain that damage nerve cells. What does this mean? Amyloid plaques are accumulations of protein fragments (beta-amyloid) that the body produces naturally. In a healthy brain, these protein fragments are broken down and eliminated. In a brain with Alzheimer’s, the fragments instead accumulate to form hard, insoluble plaques between nerve cells. This excess amyloid buildup occurs before clinical Alzheimer’s symptoms appear, so it may be used as a predictor of disease. Neurofibrillary tangles are insoluble twisted fibers found inside the brain’s cells. These tangles mainly consist of a protein called tau, which helps form microtubules that transport nutrients from one part of the nerve cell to another. In an Alzheimer’s brain, the tau protein is abnormal and the tangles collapse this important transport system.
Dr. Jagust and other researchers are studying this beta-amyloid buildup using medical imaging, including PET imaging with a new drug called [11C]Pittsburgh Compound B. This new PET drug binds to beta-amyloid plaques and indicates their size and position. “With PET, we’re able to study the biochemistry of the brain, and with MRI we can study both the anatomy and structure of the brain,” Jagust said. “We can also study some of the function of the brain to see what parts of the brain are active during different cognitive tests. So when you put all this information together, you can get a very detailed picture of how the brain is functioning and how function and structure might change with age.” Last fall Jagust published an article on the relationships between biomarkers in aging and dementia. The group found that three factors together signal an early stage of Alzheimer’s: beta-amyloid deposition, atrophy of the hippocampus (the part of the brain that stores and sorts memories), and episodic memory loss. Hopefully this new understanding will ultimately enable earlier and more accurate diagnosis.
I don’t have room here to summarize all the results from the Jagust lab, let alone all the other labs doing Alzheimer’s research. But I must say that I’m optimistic given the recent progress they have made in understanding the disease. There are also many clinical trials underway for new Alzheimer’s drugs, including ones that hope to stop cognitive deterioration instead of just reducing symptoms. I’m encouraged but I still tell my friends who are doing the research that they need to find a cure within the next 10 years, because I do not want to suffer through this frightening disease like my mother did.
“5.3 million people have Alzheimer’s. 172 billion dollars will be spent this year on health care services for people with Alzheimer’s.”
— 2010 Alzheimer’s Disease Facts and Figures
“Approximately 900,000 PET scans were performed in 2004. It is estimated that over 2 million PET scans will be performed in 2010.”
— PETNET Solutions
As a medical imaging researcher, I notice when medical imaging technologies are mentioned by popular news media or medical-themed television shows. Lately I’ve been seeing PET imaging mentioned more frequently, including on TV shows like House and Grey’s Anatomy. This probably just reflects the fact that dramatically increasing numbers of PET scans are being performed in real life in clinics and hospitals. So what is PET imaging? Funny that you ask, because I just happen to do research in this field.
In this context, PET stands for Positron Emission Tomography. During a PET scan, a trace amount of a biologically active, radioactive drug is injected into the patient’s vein. The drug localizes somewhere in the patient, depending on its metabolic properties. The drug then emits a positron (the antiparticle of the electron), which annihilates with an electron in the patient’s body. The annihilation converts their mass into a pair of gamma rays that pass through the patient and are detected by the PET scanner. These detected gamma ray signals are used to reconstruct a 3-D image of the drug’s concentration in the body.
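The energy of those annihilation gamma rays follows directly from Einstein’s E = mc²: each photon carries the rest-mass energy of one electron, about 511 keV, which is the energy PET detectors are tuned to look for. A quick back-of-the-envelope check in Python (physical constants hardcoded for illustration):

```python
# Rest-mass energy of an electron: E = m * c^2, expressed in keV.
# When a positron and electron annihilate, each of the two resulting
# photons carries away one particle's worth of rest-mass energy.
ELECTRON_MASS_KG = 9.109e-31   # electron rest mass
C_M_PER_S = 2.998e8            # speed of light
JOULES_PER_EV = 1.602e-19      # conversion factor, joules per electron-volt

energy_joules = ELECTRON_MASS_KG * C_M_PER_S ** 2
energy_kev = energy_joules / JOULES_PER_EV / 1e3

print(f"Each annihilation photon carries about {energy_kev:.0f} keV")
```

The two photons fly off in nearly opposite directions, which is what lets the scanner trace each detected pair back to a line through the patient and build up the 3-D image.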
PET imaging technology is unique because it images a patient’s metabolism, whereas most other medical imaging techniques measure anatomical structure. For example, X-ray CT or MRI scans can be used to identify a tumor because they show the patient’s anatomy in detail. However, PET imaging can indicate whether the tumor is benign or cancerous, by measuring whether or not the tumor takes up the radioactive drug. In practice, you’d like to know both: detailed anatomical structure and metabolic function. Recent work has demonstrated the increased clinical diagnostic value of fusing imaging technologies based on function (e.g., PET, SPECT or functional MRI) with those based on structure (e.g., CT, MRI, or ultrasound). As a result, PET and CT scanners are now typically combined into a single gantry system, so that images can be taken from both devices sequentially during a single procedure.
Since PET measures metabolism instead of anatomical structure, it is mostly used to image organs whose size or shape does not indicate whether they are functioning properly, such as the brain or heart. It is also used to diagnose diseases that exhibit an abnormal metabolism, such as cancer.
Stay tuned this week when I discuss some Alzheimer’s research that utilizes PET imaging.
“Writing a blog depends on your belief that you have something meaningful to say, as well as the confidence to put it out there.”
— Jennifer Huber
Although there is some controversy over the issue, human-induced greenhouse gas emissions are generally considered to be the primary cause of global warming. Carbon dioxide is considered the most important of these greenhouse pollutants, and the burning of fossil fuels a main source. This is because fossil fuels contain carbon atoms that are released as carbon dioxide when they are burned. For example, gasoline consists of atoms of hydrogen and carbon (about two hydrogens per carbon). When it burns, the hydrogen combines with oxygen to make water, and the carbon combines with oxygen to make carbon dioxide. If the combustion is incomplete, it also makes carbon monoxide.
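That chemistry also explains why the carbon dioxide adds up so quickly: every carbon atom (atomic mass about 12) picks up two oxygen atoms (about 16 each) from the air, so 12 grams of carbon becomes 44 grams of CO2. A rough illustration in Python, where the 87% carbon mass fraction for gasoline is an assumed round number, not a measured value:

```python
# Rough mass of CO2 produced per kilogram of gasoline burned.
# The carbon mass fraction of gasoline is an assumed illustrative value.
C_MOLAR = 12.011   # g/mol, carbon
O_MOLAR = 15.999   # g/mol, oxygen

# Each kg of carbon burned becomes (12 + 2*16)/12 ~ 3.67 kg of CO2.
co2_per_kg_carbon = (C_MOLAR + 2 * O_MOLAR) / C_MOLAR

carbon_fraction = 0.87  # assumed: gasoline is roughly 87% carbon by mass
co2_per_kg_fuel = carbon_fraction * co2_per_kg_carbon

print(f"Burning 1 kg of gasoline releases roughly {co2_per_kg_fuel:.1f} kg of CO2")
```

The counterintuitive part is that the CO2 weighs about three times as much as the fuel itself, because most of its mass comes from oxygen pulled out of the air during combustion.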
In contrast, renewable non-combustible energy sources (such as wind or sunlight) do not convert hydrocarbons into carbon dioxide. That means renewable energy technologies have a better chance of competing with fossil fuels (such as petroleum and coal) if you take into account the full energy cost, including the amount of carbon dioxide emitted during energy production. Solar plants cost more to construct than coal plants, but they don’t pollute. So economists argue that solar might actually be cheaper if you include the damage done by carbon dioxide pollution in the cost of coal. Basically, if you give a monetary value to the cost of polluting the air, then emissions become an internal cost of doing business that is visible on the balance sheet.
To address this issue, President Obama has pushed to establish carbon emission trading (known as “cap-and-trade”). The basic idea of this carbon exchange market is that the government sets a limit (cap) on the total amount of greenhouse gases that can be emitted nationally, and the market sets the price. In other words, a business would in effect pay for the right to emit carbon dioxide by buying emission allowances. The goal is to create a financial incentive to reduce greenhouse gas emissions, while boosting energy efficiency and renewable energy efforts.
President Obama specifically proposes a 14% emission reduction from 2005 levels by 2020 (and 83% reduction by 2050). He also proposes that companies buy an allowance, or permit, for each ton of carbon emitted at an estimated cost of $13 to $20 per ton to start. Rather than following Obama’s proposed system of “full permit auctioning,” the House of Representatives passed a bill last June that would establish a variant of a cap-and-trade plan for greenhouse gases. I’m not going to get into the details of this American Clean Energy and Security Act of 2009, which is still under consideration in the Senate. However, I did want to note that this bill’s cap-and-trade program allocates 85% of allowances to industry for free, auctioning only the remaining 15%. There is a tremendous amount of debate on the best system to curb emissions of climate-changing gases. There are other competing bills currently in the House and Senate. However, it appears that none of these bills are likely to pass Congress. “Realistically, the cap-and-trade bills in the House and the Senate are going nowhere,” said Senator Lindsey Graham, who is trying to create bipartisan climate and energy measures. “They’re not business-friendly enough, and they don’t lead to meaningful energy independence.”