Fermilab and partners achieve sustained, high-fidelity quantum teleportation

Quantum information ‘teleported’ at Fermilab and Caltech represents a step toward a quantum internet

A viable quantum internet—a network in which information stored in qubits is shared over long distances through entanglement—would transform the fields of data storage, precision sensing and computing, ushering in a new era of communication.

This month, scientists at Fermi National Accelerator Laboratory—a U.S. Department of Energy national laboratory affiliated with the University of Chicago—along with partners at five institutions took a significant step in the direction of realizing a quantum internet.

In a paper published in PRX Quantum, the team presents for the first time a demonstration of a sustained, long-distance teleportation of qubits made of photons (particles of light) with fidelity greater than 90%.

The qubits were teleported over a fiber-optic network 27 miles (44 kilometers) long using state-of-the-art single-photon detectors, as well as off-the-shelf equipment.

“We’re thrilled by these results,” said Fermilab scientist Panagiotis Spentzouris, head of the Fermilab quantum science program and one of the paper’s co-authors. “This is a key achievement on the way to building a technology that will redefine how we conduct global communication.”

The achievement comes just a few months after the U.S. Department of Energy unveiled its blueprint for a national quantum internet at a press conference at the University of Chicago.

Linking particles

Quantum teleportation is a “disembodied” transfer of quantum states from one location to another. The quantum teleportation of a qubit is achieved using quantum entanglement, in which two or more particles are inextricably linked to each other. If an entangled pair of particles is shared between two separate locations, a joint measurement at the sender’s end, together with the classically transmitted measurement results, re-creates the encoded information at the receiver’s end, no matter the distance between them.
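The protocol described above can be illustrated with a textbook state-vector simulation. The sketch below models only the idealized three-qubit teleportation circuit, not the photonic hardware used in the experiment:

```python
import numpy as np

# Single-qubit gates
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
I2 = np.eye(2)

def apply(gate, qubit, state, n=3):
    """Apply a one-qubit gate to `qubit` (0 = leftmost) of an n-qubit state."""
    full = np.array([[1.0]])
    for q in range(n):
        full = np.kron(full, gate if q == qubit else I2)
    return full @ state

def cnot(control, target, state, n=3):
    """CNOT as a basis permutation: flip the target bit when the control bit is 1."""
    new = np.zeros_like(state)
    for i in range(2 ** n):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        new[j] = state[i]
    return new

# Qubit 0 holds the state to teleport; qubits 1 and 2 share a Bell pair.
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta])
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)

# Sender: entangle qubit 0 with her half of the pair, then measure both.
state = cnot(0, 1, state)
state = apply(H, 0, state)

probs = np.abs(state) ** 2
rng = np.random.default_rng(0)
outcome = rng.choice(8, p=probs)          # Born-rule sample of the 3 bits
a, b = (outcome >> 2) & 1, (outcome >> 1) & 1

# Collapse onto the measured values of qubits 0 and 1, then renormalize.
keep = np.array([((i >> 2) & 1) == a and ((i >> 1) & 1) == b for i in range(8)])
state = np.where(keep, state, 0)
state /= np.linalg.norm(state)

# Receiver: the two classical bits (a, b) select the correction X^b then Z^a.
if b:
    state = apply(X, 2, state)
if a:
    state = apply(Z, 2, state)

# Qubit 2 now carries the original state.
base = (a << 2) | (b << 1)
out = np.array([state[base], state[base + 1]])
fidelity = abs(np.vdot(psi, out)) ** 2
print(f"measured (a, b) = ({a}, {b}); teleportation fidelity = {fidelity:.6f}")
```

In the ideal circuit the fidelity is exactly 1; the greater-than-90% figure reported in the paper reflects the losses and noise of real photon sources, fibers and detectors.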

The joint team—researchers at Fermilab, AT&T, Caltech, Harvard University, NASA Jet Propulsion Laboratory and University of Calgary—successfully teleported qubits on two systems: the Caltech Quantum Network and the Fermilab Quantum Network. The systems were designed, built, commissioned and deployed by Caltech’s public-private research program on Intelligent Quantum Networks and Technologies, or IN-Q-NET.

“We are very proud to have achieved this milestone on sustainable, high-performing and scalable quantum teleportation systems,” said Maria Spiropulu, the Shang-Yi Ch’en professor of physics at Caltech and director of the IN-Q-NET research program. “The results will be further improved with system upgrades we are expecting to complete by the second quarter of 2021.”

Both the Caltech and Fermilab networks, which feature near-autonomous data processing, are compatible both with existing telecommunication infrastructure and with emerging quantum processing and storage devices. Researchers are using them to improve the fidelity and rate of entanglement distribution, with an emphasis on complex quantum communication protocols and fundamental science.

“With this demonstration we’re beginning to lay the foundation for the construction of a Chicago-area metropolitan quantum network,” Spentzouris said.

The Chicagoland network, called the Illinois Express Quantum Network, is being designed by Fermilab in collaboration with Argonne National Laboratory, Caltech, Northwestern University and industry partners.

“The feat is a testament to the success of collaboration across disciplines and institutions, which drives so much of what we accomplish in science,” said Fermilab Deputy Director of Research Joe Lykken. “I commend the IN-Q-NET team and our partners in academia and industry on this first-of-its-kind achievement in quantum teleportation.”

Citation: “Teleportation Systems Towards a Quantum Internet.” Valivarthi et al., PRX Quantum, Dec. 4, 2020, DOI: 10.1103/PRXQuantum.1.020317

Funding: U.S. Department of Energy Office of Science QuantISED program



from ScienceBlog.com https://ift.tt/3rIGtWT

Gum disease-causing bacteria borrow growth molecules from neighbors to thrive

The human body is filled with friendly bacteria. However, some of these microorganisms, such as Veillonella parvula, may be too nice. These peaceful bacteria engage in a one-sided relationship with pathogen Porphyromonas gingivalis, helping the germ multiply and cause gum disease, according to a new University at Buffalo-led study.

The research sought to understand how P. gingivalis colonizes the mouth. The pathogen is unable to produce its own growth molecules until it achieves a large population in the oral microbiome (the community of microorganisms that live on and inside the body).

The answer: It borrows growth molecules from V. parvula, a common yet harmless bacterium in the mouth whose growth is not population dependent.

In a healthy mouth, P. gingivalis makes up a minuscule amount of the bacteria in the oral microbiome and cannot replicate. But if dental plaque is allowed to grow unchecked due to poor oral hygiene, V. parvula will multiply and eventually produce enough growth molecules to also spur the reproduction of P. gingivalis.
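The one-sided dynamic described above can be caricatured with a toy simulation. Every number here is invented for illustration; this is not the study's model:

```python
# Toy discrete-time sketch: V. parvula grows freely and sheds growth molecules;
# P. gingivalis replicates only once that signal crosses a threshold.
def simulate(days=30, growth_threshold=50.0):
    vp, pg = 1.0, 0.01         # starting abundances (arbitrary units)
    expansion_day = None
    for day in range(days):
        vp *= 1.3              # V. parvula growth is not population dependent
        signal = 0.1 * vp      # growth molecules it releases scale with abundance
        if signal >= growth_threshold:
            pg *= 1.5          # P. gingivalis replicates only above the threshold
            if expansion_day is None:
                expansion_day = day
    return vp, pg, expansion_day

vp, pg, day = simulate()
print(f"P. gingivalis began expanding on day {day}; final abundance {pg:.2f}")
```

Setting `growth_threshold` to infinity mimics removing V. parvula from the system: the pathogen never expands, echoing the halt in P. gingivalis growth the study observed.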

More than 47% of adults 30 and older have some form of periodontitis (also known as gum disease), according to the Centers for Disease Control and Prevention. Understanding the relationship between P. gingivalis and V. parvula will help researchers create targeted therapies for periodontitis, says Patricia Diaz, DDS, PhD, lead investigator on the study and Professor of Empire Innovation in the UB School of Dental Medicine.

“Having worked with P. gingivalis for nearly two decades, we knew it needed a large population size to grow, but the specific processes that drive this phenomenon were not completely understood,” says Diaz, also director of the UB Microbiome Center. “Successfully targeting the accessory pathogen V. parvula should prevent P. gingivalis from expanding within the oral microbial community to pathogenic levels.”

The study, which was published on Dec. 28 in the ISME Journal, tested the effects of growth molecules exuded by microorganisms in the mouth on P. gingivalis, including molecules from five species of bacteria that are prevalent in gingivitis, a condition that precedes periodontitis.

Of the bacteria examined, only growth molecules secreted by V. parvula enabled the replication of P. gingivalis, regardless of the strain of either microbe. When V. parvula was removed from the microbiome, growth of P. gingivalis halted. However, the mere presence of any V. parvula was not enough to stimulate P. gingivalis, as the pathogen was only incited by a large population of V. parvula.

Data suggest that the relationship is one-directional as V. parvula received no obvious benefit from sharing its growth molecules, says Diaz.

“P. gingivalis and V. parvula interact at many levels, but the beneficiary is P. gingivalis,” says Diaz, noting that V. parvula also produces heme, which is the preferred iron source for P. gingivalis.

“This relationship that allows growth of P. gingivalis was not only confirmed in a preclinical model of periodontitis, but also, in the presence of V. parvula, P. gingivalis could amplify periodontal bone loss, which is the hallmark of the disease,” says George Hajishengallis, DDS, PhD, co-investigator on the study and Thomas W. Evans Centennial Professor in the University of Pennsylvania School of Dental Medicine.

“It is not clear whether the growth-promoting cues produced by P. gingivalis and V. parvula are chemically identical,” says Diaz. “Far more work is needed to uncover the identity of these molecules.”

Additional investigators include Anilei Hoare, PhD, assistant professor, University of Chile; Hui Wang, PhD, postdoctoral researcher, University of Pennsylvania; Archana Meethil, resident, University of Connecticut; Loreto Abusleme, PhD, assistant professor, University of Chile; Bo-Young Hong, PhD, associate research scientist, Jackson Laboratory for Genomic Medicine; Niki Moutsopoulos, DDS, PhD, senior investigator, National Institute of Dental and Craniofacial Research; and Philip Marsh, PhD, professor, University of Leeds.

The research was funded by the National Institute of Dental and Craniofacial Research of the National Institutes of Health.



from ScienceBlog.com https://ift.tt/382LYYH

NIH study uncovers blood vessel damage and inflammation in COVID-19 patients’ brains but no infection

Results from a study of 19 deceased patients suggest brain damage is a byproduct of a patient’s illness.

In an in-depth study of how COVID-19 affects a patient’s brain, National Institutes of Health researchers consistently spotted hallmarks of damage caused by thinning and leaky brain blood vessels in tissue samples from patients who died shortly after contracting the disease. In addition, they saw no signs of SARS-CoV-2 in the tissue samples, suggesting the damage was not caused by a direct viral attack on the brain. The results were published as a correspondence in the New England Journal of Medicine.

“We found that the brains of patients who contract infection from SARS-CoV-2 may be susceptible to microvascular blood vessel damage. Our results suggest that this may be caused by the body’s inflammatory response to the virus,” said Avindra Nath, M.D., clinical director at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS) and the senior author of the study. “We hope these results will help doctors understand the full spectrum of problems patients may suffer so that we can come up with better treatments.”

Although COVID-19 is primarily a respiratory disease, patients often experience neurological problems including headaches, delirium, cognitive dysfunction, dizziness, fatigue, and loss of the sense of smell. The disease may also cause patients to suffer strokes and other neuropathologies.

Several studies have shown that the disease can cause inflammation and blood vessel damage. In one of these studies, the researchers found evidence of small amounts of SARS-CoV-2 in some patients’ brains. Nevertheless, scientists are still trying to understand how the disease affects the brain.

In this study, the researchers conducted an in-depth examination of brain tissue samples from 19 patients who had died after experiencing COVID-19 between March and July 2020. Samples from 16 of the patients were provided by the Office of the Chief Medical Examiner in New York City, while the other three cases were provided by the department of pathology at the University of Iowa College of Medicine, Iowa City. The patients died at a wide range of ages, from 5 to 73 years old, and within a few hours to two months after reporting symptoms. Many patients had one or more risk factors, including diabetes, obesity, and cardiovascular disease. Eight of the patients were found dead at home or in public settings; another three collapsed and died suddenly.

Initially, the researchers used a special, high-powered magnetic resonance imaging (MRI) scanner, 4 to 10 times more sensitive than most MRI scanners, to examine samples of the olfactory bulbs and brainstems from each patient. These regions are thought to be highly susceptible to COVID-19: the olfactory bulbs control our sense of smell, while the brainstem controls breathing and heart rate. The scans revealed that both regions had an abundance of bright spots, called hyperintensities, that often indicate inflammation, and dark spots, called hypointensities, that represent bleeding.

The researchers then used the scans as a guide to examine the spots more closely under a microscope. They found that the bright spots contained blood vessels that were thinner than normal and sometimes leaking blood proteins, like fibrinogen, into the brain. This appeared to trigger an immune reaction. The spots were surrounded by T cells from the blood and the brain’s own immune cells called microglia. In contrast, the dark spots contained both clotted and leaky blood vessels but no immune response.

“We were completely surprised. Originally, we expected to see damage that is caused by a lack of oxygen. Instead, we saw multifocal areas of damage that is usually associated with strokes and neuroinflammatory diseases,” said Dr. Nath.

Finally, the researchers saw no signs of infection in the brain tissue samples even though they used several methods for detecting genetic material or proteins from SARS-CoV-2.

“So far, our results suggest that the damage we saw may not have been caused by the SARS-CoV-2 virus directly infecting the brain,” said Dr. Nath. “In the future, we plan to study how COVID-19 harms the brain’s blood vessels and whether that produces some of the short- and long-term symptoms we see in patients.”

This study was supported by NIH Intramural Research Program at the National Institute of Neurological Disorders and Stroke (NS003130) and an NIH grant (NS109284).



from ScienceBlog.com https://ift.tt/2MpcdjJ

Allergists offer reassurance regarding potential allergic reactions to COVID-19 vaccines

Reports of possible allergic reactions to the COVID-19 vaccines produced by Pfizer-BioNTech and Moderna, both recently approved for emergency use by the U.S. Food and Drug Administration (FDA), have raised public concern. A team of experts led by allergists at Massachusetts General Hospital (MGH) has now examined all relevant information to offer reassurance that the vaccines can be administered safely even to people with food or medication allergies. The group’s review is published in the Journal of Allergy and Clinical Immunology: In Practice.

In response to accounts of potential allergic reactions in some people following COVID-19 vaccination in the United Kingdom, that country’s medical regulatory agency advised that individuals with a history of anaphylaxis to a medicine or food should avoid COVID-19 vaccination. After closer review of the data related to allergic reactions, however, the FDA recommended that the vaccines be withheld only from individuals with a history of severe allergic reactions to any component of the COVID-19 vaccine, and the Centers for Disease Control and Prevention advised that all patients be observed for 15 minutes post-vaccination by staff who can identify and manage such reactions. The U.S. agencies do not recommend that people with food or medication allergies avoid vaccination.

To provide insights from allergists’ perspectives, Aleena Banerji, MD, clinical director of the Allergy and Clinical Immunology Unit at MGH and associate professor at Harvard Medical School, and her colleagues have summarized what’s currently known about allergic reactions to vaccines like those developed against COVID-19, and they have proposed detailed advice so that individuals with different allergy histories can safely receive their first COVID-19 vaccine. They also outline steps on safely receiving the second dose in individuals who develop a reaction to their first dose of COVID-19 vaccine.

“As allergists, we want to encourage vaccination by reassuring the public that both FDA-approved COVID-19 vaccines are safe. Our guidelines are built upon the recommendations of U.S. regulatory agencies and provide clear steps to the medical community on how to safely administer both doses of the vaccine in individuals with allergic histories,” says Banerji.

The experts note that allergic reactions to vaccines are rare, with a rate of about 1.3 per 1 million people. They expect allergic reactions to the Pfizer-BioNTech and Moderna COVID-19 vaccines to occur at a similarly low rate. They stress that vaccine clinics will be monitoring all patients for 15 to 30 minutes and can manage any allergic reactions that occur. Banerji and her co-authors recommend that individuals with a history of anaphylaxis to an injectable drug or vaccine containing polyethylene glycol or polysorbate speak with their allergists before being vaccinated. They emphasize that patients with severe allergies to foods, oral drugs, latex, or venom can safely receive the COVID-19 vaccines.



from ScienceBlog.com https://ift.tt/3aXNJYY

Blood vessel cells implicated in chronic inflammation of obesity

When fat cells in the body are stuffed with excess fat, the surrounding tissue becomes inflamed. That chronic, low-level inflammation is one of the driving factors behind many of the diseases associated with obesity. Now, UT Southwestern scientists have discovered a type of cell responsible, at least in mice, for triggering this inflammation in fat tissue. Their findings, published in Nature Metabolism, could eventually lead to new ways to treat obesity.

“The inflammation of fat cells in obese individuals is linked to many of the comorbidities we associate with being overweight – cancer, diabetes, heart disease, and infection,” says study leader Rana Gupta, Ph.D., associate professor of internal medicine. “By identifying these cells, we’ve taken a step toward understanding some of the initial events that contribute to that inflammation.”

When a person consumes more calories than needed, the excess calories are stored in the form of triglycerides inside fat tissue, also known as white adipose tissue (WAT). Researchers know that in obese people, WAT becomes overworked, fat cells begin to die, and immune cells become activated. But the exact mechanism by which this inflammation occurs isn’t fully understood.

While many studies have focused on the signaling molecules produced by the fat cells or immune cells in WAT that might contribute to inflammation, Gupta’s team took a different approach. They focused instead on the vessels that carry blood – as well as immune cells and inflammatory molecules – into WAT.

In 2018, Gupta and his colleagues identified a new type of cell lining these blood vessels in mice – an adipose progenitor cell (APC), or precursor cell that goes on to generate mature fat cells. But unlike most APCs, the new cells – dubbed fibro-inflammatory progenitors, or FIPs – produced signals that encouraged inflammation. In the new work, the researchers looked more closely at the role of the FIPs in mediating inflammation.

Within just one day of switching young male mice to a high-fat diet, Gupta and his colleagues discovered that the FIPs quickly increased the number of inflammatory molecules produced. After 28 days on a high-fat diet, they found a substantial increase in the proportion of FIPs compared with other APCs.

“This is the first study to demonstrate that these cells play a very active, early role in being gatekeepers of inflammation in fat tissue,” says Gupta.

To show that the increase in the number and activity of the FIPs was not just a side effect of already-inflamed fat cells, the team removed a key immune signaling gene, Tlr4, from the FIPs in some mice. After five months on a high-fat diet, the mice lacking Tlr4 had gained just as much weight, and just as much fat, as other mice on a high-fat diet. But the genetically engineered mice – with FIPs that could no longer generate the same signals – no longer had high levels of inflammation. Instead, the levels of inflammatory molecules in their WAT were closer to the levels seen in mice on low-fat diets.

Gupta and his colleagues went on to show that increasing levels of a related signaling molecule, ZFP423, in FIPs can also ameliorate the inflammation in mouse fat cells. The findings point toward possible avenues to pursue to lower the risk of disease in people with obesity.

“It looks like ZFP423 could be an important brake in terms of slowing the inflammatory signals in these cells,” says Gupta. “Of course, it remains to be seen if that’s true in humans as well as mice.”

Gupta’s group is planning future experiments to better understand what aspect of a high-fat diet initiates the increased inflammatory signaling in FIPs, as well as whether the results hold true in human fat.



from ScienceBlog.com https://ift.tt/3n6GU9U

Scientists explore deficits in processing speed in individuals with spinal cord injury

A team of rehabilitation researchers has studied processing speed deficits in individuals with spinal cord injury (SCI), comparing their brain activation patterns with those of healthy age-matched controls and older healthy individuals. They found that the SCI group and older controls had similar activation patterns, but the SCI group differed significantly from their age-matched controls.

The article, “The neural mechanisms underlying processing speed deficits in individuals who have sustained a spinal cord injury: A pilot study” (doi: 10.1007/s10548-020-00798-x) was epublished on September 25, 2020 by Brain Topography. The authors are scientists with expertise in research in cognitive rehabilitation and SCI rehabilitation: Glenn Wylie, DPhil, Nancy D. Chiaravalloti, PhD, Erica Weber, PhD, Helen Genova, PhD, and Trevor Dyson-Hudson, MD, from Kessler Foundation, and Jill M. Wecht, EdD, from the James J. Peters VA Medical Center.

Individuals with chronic SCI have an increased risk for cognitive deficits that resemble the deficits associated with the aging process, giving rise to the theory of “accelerated cognitive aging.” As reported previously by this team, the deficits affect processing speed, new learning and memory, and verbal fluency, which are the domains affected during aging. This study is the first to examine the neural mechanisms of higher order cognitive tasks of individuals with SCI. The focus was on processing speed, which is known to be affected by SCI and aging, and is integral to cognitive function and everyday life activities.

The 30 participants were drawn from a larger study and underwent optional neuroimaging at the Rocco Ortenzio Neuroimaging Center at Kessler Foundation — 10 individuals with cervical SCI, 10 age-matched controls, and 10 healthy older individuals. In addition to traditional neuropsychological testing methods, processing speed was tested in the scanner, using timed letter comparison tasks during functional magnetic resonance imaging (fMRI). This study was the first to use the modified letter comparison test.

Significant differences in brain activation were found between the SCI group and the age-matched control group, but the SCI and older groups had similar patterns, including activation of the hippocampal, frontal and parietal areas. “This suggests that individuals with SCI are compensating for deficits in processing speed by relying on the areas of the brain involved in executive control and memory,” noted Dr. Chiaravalloti, “which supports the theory of accelerated brain aging after SCI.”

Despite the limitations of sample size and level of injury, the study is an important contribution to our understanding of the impact of SCI on cognition, according to Dr. Wylie, director of the Ortenzio Center. “Our ability to observe brain activation while the individual performs specific cognitive tasks provides new information on the mechanisms that underlie the cognitive deficits that we now know affect a substantial proportion of the SCI population,” Dr. Wylie said. “Developing treatments targeted to these deficits depends on our pursuit of this line of research, which may benefit other populations affected by delayed processing speed.”



from ScienceBlog.com https://ift.tt/34Z0y1m

New Theory on ‘Venus’ Figurines

Among the world’s earliest examples of art, the enigmatic ‘Venus’ figurines, carved some 30,000 years ago, have intrigued and puzzled scientists for nearly two centuries. Now a researcher from the University of Colorado Anschutz Medical Campus believes he’s gathered enough evidence to solve the mystery behind these curious totems.

The hand-held depictions of obese or pregnant women, which appear in most art history books, were long seen as symbols of fertility or beauty. But according to Richard Johnson, MD, lead author of the study published today in the journal Obesity, the key to understanding the statues lies in climate change and diet.

“Some of the earliest art in the world are these mysterious figurines of overweight women from the time of hunter gatherers in Ice Age Europe where you would not expect to see obesity at all,” said Johnson, a professor at the University of Colorado School of Medicine specializing in renal disease and hypertension. “We show that these figurines correlate to times of extreme nutritional stress.”

Early modern humans entered Europe during a warming period about 48,000 years ago. Known as Aurignacians, they hunted reindeer, horses and mammoths with bone-tipped spears. In summer they dined on berries, fish, nuts and plants. But then, as now, the climate did not remain static.

Clues in body ratios

As temperatures dropped, ice sheets advanced and disaster set in. During the coldest months, temperatures plunged to minus 10-15 degrees Celsius. Some bands of hunter gatherers died out, others moved south, some sought refuge in forests. Big game was overhunted.

It was during these desperate times that the obese Venus figurines appeared. They ranged between 6 and 16 centimeters in length and were made of stone, ivory, horn or occasionally clay. Some were threaded and worn as amulets.

Johnson and his co-authors, Professor (ret.) of Anthropology John Fox, PhD, of the American University of Sharjah in the United Arab Emirates, and Associate Professor of Medicine Miguel Lanaspa-Garcia, PhD, of the CU School of Medicine, measured the statues’ waist-to-hip and waist-to-shoulder ratios. They discovered that those found closest to the glaciers were the most obese compared to those located farther away. They believe the figurines represented an idealized body type for these difficult living conditions.

“We propose they conveyed ideals of body size for young women, and especially those who lived in proximity to glaciers,” said Johnson, who in addition to being a physician has an undergraduate degree in anthropology. “We found that body size proportions were highest when the glaciers were advancing, whereas obesity decreased when the climate warmed and glaciers retreated.”
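The kind of comparison involved is simple to sketch. The measurements below are entirely invented (none of the study's data) and serve only to show how body-size ratios can be computed and related to glacier distance:

```python
import numpy as np

# Hypothetical measurements (cm) for six figurines, ordered by distance
# from the nearest glacier front (km). Invented numbers, not the study's.
distance_km = np.array([50, 120, 300, 450, 700, 900])
waist       = np.array([9.0, 8.6, 7.8, 7.2, 6.5, 6.0])
hip         = np.array([10.0, 9.8, 9.5, 9.4, 9.2, 9.1])

waist_to_hip = waist / hip
r = np.corrcoef(distance_km, waist_to_hip)[0, 1]
print(f"waist-to-hip ratios: {np.round(waist_to_hip, 2)}")
print(f"correlation with glacier distance: r = {r:.2f}")  # strongly negative here
```

With data shaped like this, figurines nearer the ice front show higher (more obese) ratios, which is the pattern the authors report.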

Obesity as a desired condition

Obesity, according to the researchers, became a desired condition. An obese female in times of scarcity could carry a child through pregnancy better than one suffering malnutrition. So the Venus figurines may have been imbued with a spiritual meaning – a fetish or magical charm of sorts that could protect a woman through pregnancy, birth and nursing.

Many Venus figurines are well-worn, indicating that they were heirlooms passed down from mother to daughter through generations. Women entering puberty or in the early stages of pregnancy may have been given them in the hopes of imparting the desired body mass to ensure a successful birth.

“Increased fat would provide a source of energy during gestation through the weaning of the baby, as well as much-needed insulation,” the authors said.

Promoting obesity, said Johnson, ensured that the band would carry on for another generation in these most precarious of climatic conditions.

“The figurines emerged as an ideological tool to help improve fertility and survival of the mother and newborns,” Johnson said. “The aesthetics of art thus had a significant function in emphasizing health and survival to accommodate increasingly austere climatic conditions.”

The team’s success in amassing evidence to support its theory came from applying measurements and medical science to archaeological data and behavioral models of anthropology.

“These kinds of interdisciplinary approaches are gaining momentum in the sciences and hold great promise,” Johnson said. “Our team has other subjects of Ice Age art and migration in its research sights as well.”



from ScienceBlog.com https://ift.tt/3aTrfIE

Water May be an Effective Treatment for Metabolic Syndrome

Researchers at the University of Colorado Anschutz Medical Campus have discovered that fructose stimulates the release of vasopressin, a hormone linked to obesity and diabetes. They also found that water can suppress the hormone and alleviate these conditions in mice.

“The clinical significance of this work is that it may encourage studies to evaluate whether simple increases in water intake may effectively mitigate obesity and metabolic syndrome,” said the study’s lead author Miguel A. Lanaspa, PhD, an associate professor at the University of Colorado School of Medicine specializing in renal disease and hypertension.

The study was published today in the journal JCI Insight.

Lanaspa and his colleague, Richard Johnson, MD, also a professor at the CU School of Medicine, wanted to understand why vasopressin, which maintains the body’s water levels, was elevated in those with obesity and diabetes.

They fed mice sugar water, specifically fructose, and found that it stimulated the brain to make vasopressin. The vasopressin in turn stored the water as fat, causing dehydration, which triggered obesity. Treating the mice with plain, non-sugary water reduced the obesity.

According to Lanaspa, this is the first time scientists have shown how vasopressin acts on dietary sugar to cause obesity and diabetes.

“We found that it does this by working through a particular vasopressin receptor known as V1b,” he said. “This receptor has been known for a while but no one has really understood its function. We found that mice lacking V1b were completely protected from the effects of sugar. We also show that the administration of water can suppress vasopressin and both prevent and treat obesity.”

The researchers also discovered that dehydration can stimulate the formation of fat.

“This explains why vasopressin is so high in desert mammals as they do not have easy access to water,” Johnson said. “So vasopressin conserves water by storing it as fat.”

This finding fits with observations showing that obese people often have signs of dehydration. It may also explain why high-salt diets can cause obesity and diabetes.

The researchers found that water therapy in mice effectively protected against metabolic syndrome – a collection of conditions including high blood pressure, high blood sugar and high triglyceride levels that increase the risk of heart disease, stroke and type 2 diabetes.

“The best way to block vasopressin is to drink water,” Lanaspa said. “This is hopeful because it means we may have a cheap, easy way of improving our lives and treating metabolic syndrome.”

Johnson summed up the findings this way.

“Sugar drives metabolic syndrome in part by the activation of vasopressin. Vasopressin drives fat production likely as a mechanism for storing metabolic water,” he said. “The potential roles of hydration and salt reduction in the treatment of obesity and metabolic syndrome should be considered.”



from ScienceBlog.com https://ift.tt/2MgG39X

A pursuit of better testing to sort out the complexities of ADHD

The introduction of computer simulation to the identification of symptoms in children with attention deficit/hyperactivity disorder (ADHD) has potential to provide an additional objective tool to gauge the presence and severity of behavioral problems, Ohio State University researchers suggest in a new publication.

Most mental health disorders are diagnosed and treated based on clinical interviews and questionnaires – and, for about a century, data from cognitive tests has been added to the diagnostic process to help clinicians learn more about how and why people behave in a certain way.

Cognitive testing in ADHD is used to identify a variety of symptoms and deficits, including selective attention, poor working memory, altered time perception, difficulties in maintaining attention and impulsive behavior. In the most common class of performance tests, children are told to either press a computer key or avoid hitting a key when they see a certain word, symbol or other stimulus.

Image: Nadja Ging-Jehli

For ADHD, however, these cognitive tests often don’t capture the complexity of symptoms. The advent of computational psychiatry – comparing a computer-simulated model of normal brain processes to dysfunctional processes observed in tests – could be an important supplement to the diagnostic process for ADHD, the Ohio State researchers report in a new review published in the journal Psychological Bulletin.

The research team reviewed 50 studies of cognitive tests for ADHD and described how three common types of computational models could supplement these tests.

It is widely recognized that children with ADHD take longer to make decisions while performing tasks than children who don’t have the disorder, and tests have relied on average response times to explain the difference. But there are intricacies to that dysfunction that a computational model could help pinpoint, providing information clinicians, parents and teachers could use to make life easier for kids with ADHD.

“We can use models to simulate the decision process and see how decision-making happens over time – and do a better job of figuring out why children with ADHD take longer to make decisions,” said Nadja Ging-Jehli, lead author of the review and a graduate student in psychology at Ohio State.

Ging-Jehli completed the review with Ohio State faculty members Roger Ratcliff, professor of psychology, and L. Eugene Arnold, professor emeritus of psychiatry and behavioral health.

The researchers offer recommendations for testing and clinical practice to achieve three principal goals: better characterizing ADHD and any accompanying mental health diagnoses such as anxiety and depression, improving treatment outcomes (about one-third of patients with ADHD do not respond to medical treatment), and potentially predicting which children will “lose” the ADHD diagnosis as adults.

Decision-making behind the wheel of a car helps illustrate the problem: Drivers know that when a red light turns green, they can go through an intersection – but not everyone hits the gas pedal at the same time. A common cognitive test of this behavior would repeatedly expose drivers to the same red light-green light scenario to arrive at an average reaction time and use that average, and deviations from it, to categorize the typical versus disordered driver.

This approach has been used to determine that individuals with ADHD are typically slower to “start driving” than those without ADHD. But that determination leaves out a range of possibilities that help explain why they’re slower – they could be distracted, daydreaming, or feeling nervous in a lab setting. The broad distribution of reactions captured by computer modeling could provide more, and useful, information.

“In our review, we show that this method has multiple problems that prevent us from understanding the underlying characteristics of a mental-health disorder such as ADHD, and that also prevent us from finding the best treatment for different individuals,” Ging-Jehli said. “We can use computational modeling to think about the factors that generate the observed behavior. These factors will broaden our understanding of a disorder, acknowledging that there are different types of individuals who have different deficits that also call for different treatments.

“We are proposing using the entire distribution of the reaction times, taking into consideration the slowest and the fastest reaction times to distinguish between different types of ADHD.”
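The idea of simulating the decision process and examining the entire reaction-time distribution, rather than just the average, can be illustrated with a toy drift-diffusion simulation. This is a generic sketch of that class of model, not the specific models reviewed by the Ohio State team, and all parameter values here are invented for illustration:

```python
import random

def simulate_trial(drift, boundary=1.0, noise=1.0, dt=0.001, max_t=5.0):
    """Accumulate noisy evidence until it crosses +/-boundary; return the RT."""
    evidence, t = 0.0, 0.0
    while abs(evidence) < boundary and t < max_t:
        # Euler-Maruyama step: mean drift plus Gaussian diffusion noise.
        evidence += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
        t += dt
    return t

def rt_distribution(drift, n_trials=1000):
    """Simulate many decisions and return the sorted reaction times."""
    return sorted(simulate_trial(drift) for _ in range(n_trials))

random.seed(0)
typical = rt_distribution(drift=1.5)  # hypothetical "typical" evidence-accumulation rate
reduced = rt_distribution(drift=0.7)  # hypothetical reduced rate
for name, rts in (("typical", typical), ("reduced", reduced)):
    mean_rt = sum(rts) / len(rts)
    print(f"{name}: mean={mean_rt:.2f}s  median={rts[len(rts) // 2]:.2f}s  "
          f"slowest 10% start at {rts[int(0.9 * len(rts))]:.2f}s")
```

The slower group's distribution is not just shifted but also more skewed, so its median and slow tail diverge from the mean — exactly the kind of structure a single average reaction time throws away.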

The review also identified a complicating factor for ADHD research going forward – a broader range of externally evident symptoms as well as subtle characteristics that are hard to detect with the most common testing methods. Understanding that children with ADHD have so many biologically based differences suggests that a single task-based test is not sufficient to make a meaningful ADHD diagnosis, the researchers say.

“ADHD is not only the child who is fidgeting and restless in a chair. It’s also the child who is inattentive because of daydreaming. Even though that child is more introverted and doesn’t express as many symptoms as a child with hyperactivity, that doesn’t mean that child doesn’t suffer,” Ging-Jehli said. Daydreaming is especially common in girls, who are not enrolled in ADHD studies nearly as frequently as boys, she said.

Ging-Jehli described computational psychiatry as a tool that could also take into account – continuing the analogy – mechanical differences in the car, and how that could influence driver behavior. These dynamics can make it harder to understand ADHD, but also open the door to a broader range of treatment options.

“We need to account for the different types of drivers and we need to understand the different conditions to which we expose them. Based on only one observation, we cannot make conclusions about diagnosis and treatment options,” she said.

“However, cognitive testing and computational modeling should not be seen as an attempt to replace existing clinical interviews and questionnaire-based procedures, but as complements that add value by providing new information.”

According to the researchers, a diagnosis should rest on a battery of tasks gauging social and cognitive characteristics rather than on a single task, and more consistency is needed across studies to ensure the same cognitive tasks are used to assess the appropriate cognitive concepts.

Finally, combining cognitive testing with physiological tests – especially eye-tracking and EEGs that record electrical activity in the brain – could provide powerful objective and quantifiable data to make a diagnosis more reliable and help clinicians better predict which medicines would be most effective.

Ging-Jehli is putting these suggestions to the test in her own research, applying a computational model in a study of a specific neurological intervention in children with ADHD.

“The purpose of our analysis was to show there’s a lack of standardization and so much complexity, and symptoms are hard to measure with existing tools,” Ging-Jehli said. “We need to understand ADHD better for children and adults to have a better quality of life and get the treatment that is most appropriate.”

This research was supported by the Swiss National Science Foundation and the National Institute on Aging.



from ScienceBlog.com https://ift.tt/3rDFwzd

One psychedelic experience may lessen trauma of racial injustice


A single positive experience on a psychedelic drug may help reduce stress, depression and anxiety symptoms in Black, Indigenous and people of color whose encounters with racism have had lasting harm, a new study suggests.

The participants in the retrospective study reported that their trauma-related symptoms linked to racist acts were lowered in the 30 days after an experience with either psilocybin (Magic Mushrooms), LSD or MDMA (Ecstasy).

Image: Alan Davis

“Their experience with psychedelic drugs was so powerful that they could recall and report on changes in symptoms from racial trauma that they had experienced in their lives, and they remembered it having a significant reduction in their mental health problems afterward,” said Alan Davis, co-lead author of the study and an assistant professor of social work at The Ohio State University.

Overall, the study also showed that the more intensely spiritual and insightful the psychedelic experience was, the more significant the recalled decreases in trauma-related symptoms were.

A growing body of research has suggested psychedelics have a place in therapy, especially when administered in a controlled setting. What previous mental health research has generally lacked, Davis noted, is a focus on people of color and on treatment that could specifically address the trauma of chronic exposure to racism.

Davis partnered with co-lead author Monnica Williams, Canada Research Chair in Mental Health Disparities at the University of Ottawa, to conduct the research.

“Currently, there are no empirically supported treatments specifically for racial trauma. This study shows that psychedelics can be an important avenue for healing,” Williams said.

The study is published online in the journal Drugs: Education, Prevention and Policy.

The researchers recruited participants in the United States and Canada using Qualtrics survey research panels, assembling a sample of 313 people who reported they had taken a dose of a psychedelic drug in the past that they believed contributed to “relief from the challenging effects of racial discrimination.” The sample comprised adults who identified as Black, Asian, Hispanic, Native American/Indigenous Canadian, Native Hawaiian and Pacific Islander.

Once enrolled, participants completed questionnaires collecting information on their past experiences with racial trauma, psychedelic use and mental health symptoms, and were asked to recall a memorable psychedelic experience and its short-term and enduring effects. Those experiences had occurred as recently as a few months before the study and as long ago as 10 years or more earlier.

The discrimination they had encountered included unfair treatment by neighbors, teachers and bosses, false accusations of unethical behavior and physical violence. The most commonly reported issues involved feelings of severe anger about being subjected to a racist act and wanting to “tell someone off” for racist behavior, but saying nothing instead.

Researchers asked participants to recall the severity of symptoms of anxiety, depression and stress linked to exposure to racial injustice in the 30 days before and 30 days after the experience with psychedelic drugs. Considering the probability that being subjected to racism is a lifelong problem rather than a single event, the researchers also assessed symptoms characteristic of people suffering from discrimination-related post-traumatic stress disorder (PTSD).
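The pre/post design described here amounts to a paired comparison of each participant's recalled scores. The sketch below illustrates the idea with invented numbers — these are not study data, and the scale and effect-size formula (the standard paired Cohen's d) are assumptions for illustration only:

```python
from statistics import mean, stdev

# Invented recalled-severity scores, one pair per hypothetical participant:
pre  = [14, 18, 11, 16, 20, 13, 17, 15, 19, 12]  # 30 days before the experience
post = [ 9, 12,  8, 10, 15,  9, 11, 10, 13,  8]  # 30 days after the experience

diffs = [b - a for b, a in zip(pre, post)]        # positive = improvement
# Paired Cohen's d: mean of the within-person differences over their SD.
d = mean(diffs) / stdev(diffs)
print(f"mean improvement: {mean(diffs):.1f} points; paired Cohen's d = {d:.2f}")
```

Pairing each person with themselves is what lets a retrospective design like this separate the change across the experience from stable between-person differences.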

“Not everybody experiences every form of racial trauma, but certainly people of color are experiencing a lot of these different types of discrimination on a regular basis,” said Davis, who also is an adjunct faculty member in the Johns Hopkins University Center for Psychedelic and Consciousness Research. “So in addition to depression and anxiety, we were asking whether participants had symptoms of race-based PTSD.”

Participants were also asked to report on the intensity of three common kinds of experiences people have while under the influence of psychedelic drugs: a mystical, insightful or challenging experience. A mystical experience can feel like a spiritual connection to the divine, an insightful experience increases people’s awareness and understanding about themselves, and a challenging experience relates to emotional and physical reactions such as anxiety or difficulty breathing.

All participants recalled that their anxiety, depression and stress symptoms were lower after the memorable psychedelic experience than they had been before the drug use. The magnitude of the psychedelics’ positive effects tracked the size of the reduction in symptoms.

“What this analysis showed is that a more intense mystical experience and insightful experience, and a less intense challenging experience, is what was related to mental health benefits,” Davis said.

The researchers noted in the paper that the study had limitations because the findings were based on participant recall and the entire sample of recruited research volunteers had reported benefits they associated with their psychedelic experience – meaning it cannot be assumed that psychedelics will help all people of color with racial trauma. Davis and Williams are working on proposals for clinical trials to further investigate the effects of psychedelics on mental health symptoms in specific populations, including Black, Indigenous and people of color.

“This was really the first step in exploring whether people of color are experiencing benefits of psychedelics and, in particular, looking at a relevant feature of their mental health, which is their experience of racial trauma,” Davis said. “This study helps to start that conversation with this emerging treatment paradigm.”

This work was funded by the University of Ottawa, the Canada Research Chairs Program and the National Institutes of Health. Additional co-authors included Yitong Xin of Ohio State’s College of Social Work; Nathan Sepeda of Johns Hopkins; Pamela Grigas and Sinead Sinnott of the University of Connecticut; and Angela Haeny of Yale School of Medicine.



from ScienceBlog.com https://ift.tt/3pAwgdo

A single gene ‘invented’ haemoglobin several times


Thanks to the marine worm Platynereis dumerilii, an animal whose genes have evolved very slowly, scientists from CNRS, Université de Paris and Sorbonne Université, in association with others at the University of Saint Petersburg and the University of Rio de Janeiro, have shown that while haemoglobin appeared independently in several species, it actually descends from a single gene transmitted to all by their last common ancestor. These findings were published on 29 December 2020 in BMC Evolutionary Biology.

Having red blood is not peculiar to humans or mammals. This colour comes from haemoglobin, a complex protein specialized in transporting the oxygen found in the circulatory system of vertebrates, but also in annelids (a worm family whose most famous members are earthworms), molluscs (especially pond snails) and crustaceans (such as daphnia or ‘water fleas’). It was thought that for haemoglobin to have appeared in such diverse species, it must have been ‘invented’ several times during evolution. But recent research has shown that all of these haemoglobins born ‘independently’ actually derive from a single ancestral gene.

Researchers from the Institut Jacques Monod (CNRS/Université de Paris), the Laboratoire Matière et Systèmes Complexes (CNRS/Université de Paris), the Station Biologique de Roscoff (CNRS/Sorbonne Université), the Universities of Saint Petersburg (Russia) and Rio de Janeiro (Brazil), conducted this research on Platynereis dumerilii, a small marine worm with red blood.

It is considered to be an animal that evolved slowly, because its genetic characteristics are close to those of the marine ancestor of most animals, Urbilateria (1). Studying these worms by comparing them with other species with red blood has helped in tracing back to the origins of haemoglobins.

The research focused on the broad family to which haemoglobins belong: globins, proteins present in almost all living beings that ‘store’ gases like oxygen and nitric oxide. But globins usually act inside the cells because they do not circulate in the blood like haemoglobin.

This work shows that in all species with red blood, it is the same gene that makes a globin called ‘cytoglobin’ that independently evolved to become a haemoglobin-encoding gene. This new circulating molecule made oxygen transport more efficient in their ancestors, who became larger and more active.

Scientists now want to change scale and continue this work by studying when and how the different specialized cells of bilaterian vascular systems emerged.

(1) Urbilateria is the last common ancestor of bilaterians, i.e. animals with bilateral (left-right) symmetry and complex organs, apart from species with simpler organization such as sponges and jellyfish.



from ScienceBlog.com https://ift.tt/2M6wLx9

How and why privileged defendants fare better in criminal court than non-privileged ones.


Race and class make a difference in experiences and outcomes for criminal defendants in a system that emphasizes control and getting defendants to give in, according to sociologist Matthew Clair.



from ScienceBlog.com https://ift.tt/3pyv9L2

The science behind extinction


A collection of research and insights from Stanford experts who are deciphering the mysteries and mechanisms of extinction and survival in Earth’s deep past and painting an increasingly detailed picture of life now at the brink.

An estimated 8 million animal and plant species live on planet Earth. But extinction rates are accelerating. Gorillas, gazelles, frogs, rhinos and whales are among the species now critically endangered, and human activities present the biggest threat.

In mass extinctions, a huge portion of the planet’s species die off over thousands or even millions of years – a geological blink. Scientists have identified five of these events in fossil data going back roughly half a billion years.

Scientists who study past extinction events can find clues about not only the evolution of life on Earth, but also about the effects of extreme changes in our planet’s atmosphere, and how life finds ways to rebound. Stanford scientists and colleagues have uncovered evidence, for example, that the biggest extinction in Earth’s history was caused by global warming that left ocean animals unable to breathe.

Other research, coauthored by Stanford geophysicist Sonia Tikoo-Schantz, suggests the crater from the giant asteroid impact linked to the dinosaur extinction some 66 million years ago may have provided niches for life.

“The fossil record is our only archive of past extinction events,” Stanford paleobiologist Jonathan Payne has said. It allows researchers to examine directly which biological traits tend to lead to higher extinction risk under different circumstances, whether in the wake of an asteroid impact or volcanic eruption, or amid global warming.

Many scientists say a sixth mass extinction is now under way. In 2019, following a review of thousands of scientific and government sources, the United Nations’ Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services reported that approximately 1 million animal and plant species are threatened with extinction. According to the report, more than 40 percent of amphibian species, nearly 33 percent of sharks, shark relatives and reef-forming corals, and upwards of 33 percent of all marine mammals are threatened.

Even parasites are under threat. Up to one-third of the world’s parasite species could go extinct within a few decades – potentially opening new niches for other, invasive parasites to exploit. And losses can snowball. As Stanford biologist Paul Ehrlich and colleagues wrote in a recent study suggesting the extinction rate is likely much higher than previously thought, “Extinction breeds extinction.”

When species vanish, benefits to humanity can be lost, too – from economic opportunities related to ecotourism to keeping in check populations of species that can spread infectious disease. The UN report estimated that as much as $577 billion in annual global crops are now at risk from loss of pollinators. The consequences do not fall equally across society. The areas projected to see some of the worst negative effects from biodiversity loss and related changes to ecosystem functions are also home to many of the globe’s poorest communities as well as large concentrations of Indigenous peoples.

This collection covers how scientists are deciphering the mysteries and mechanisms of extinction and survival in Earth’s deep past and painting an increasingly detailed picture of life now at the brink.

Scroll down for extinction research news and insights spanning the future of our own species; the disappearance of our hominid relatives and why the Neanderthals’ fate could have been ours; expanding knowledge of past extinctions; the root causes and ripple effects of the Earth’s ongoing biodiversity crisis; and connections between extinctions and pandemics.



from ScienceBlog.com https://ift.tt/2L7V0uj

And, by the way, what does climate have to do with it all, anyway?!


I can’t tell you how many times I’ve heard the word “sustainability” come up in conversation lately. The word seems to be all the rage. That straightforward construct here undoubtedly needs no elaboration, no explanation.

Not so with climate, another buzzword and catchphrase of this modern era we’re living in. While the number of times I hear “climate” is, let’s just say, innumerable, there is more to climate than just its connection to meteorology. Don’t forget there is such a thing as political climate, social climate, etc.

As we are currently in a period of transition, if not uncertainty, going forward, climate – or rather the atmosphere, to be perfectly clear – is tenuous; the air is pretty thin, if you get my drift. And this is nothing new, as we’ve seen this sort of thing before.

We saw it prior to the 20th-to-21st-century odometer rollover, with the Y2K scare. But here’s the thing: the technological problem-solving community was on it early – it had at least five years to get it figured out. The bottom line is, it all got resolved, and there was no serious backlash as a result.

Even preceding that, there was trepidation, or angst, among scientists familiar with the 20th-century environmental threats of acid rain and the even more damaging stratospheric ozone layer depletion going on at the time. So, you’ve heard the word disruption, right? Not only were these “disruptions” identified, they were accordingly dealt with.

Since that time, many tools have come to our disposal that can better help humankind identify new, potentially life-altering, ecologically disastrous situations and consequently stave off their advance – and, on top of this, they are at a level of sophistication that, if nothing else, inspires awe. Technology has indeed evolved.

With all the resources in the toolkit we can draw upon to help us problem-solve, if the will of the majority of people just isn’t there to render such problems null and void, then what does it matter how effective the available defenses are? I’m thinking of the coronavirus outbreak: specifically, the vaccine to combat if not eradicate it will do little good if too few get vaccinated.

Such, I fear, is also the case with climate change, whose abstract nature makes the prospect of climate normalization that much more difficult. The same goes for the idea that climate change poses an existential threat: the reality is that many people simply don’t subscribe to that notion. On the other hand, among those who believe climate change is human-driven, some take the position that, in terms of progress made to right the ship, either not enough has been done or what has been done has come too late.

This, thankfully, is not the case with air pollution and its effects. It’s more a case here of: if it talks like a duck and walks like a duck, it must be a duck. There is simply no denying that air pollution exists or that it poses a clear and present danger to human health.

In entering the 21st century’s third decade, what we need is to feel inspired and empowered that we will get these crises – the coronavirus and climate change – under control. COVID-19 has shown us what can happen if we simply drive less, the result being cleaner air. That very prospect should put us on the same page in terms of working in a cohesive, collective, cooperative and coordinated manner to bring air pollution down to safe levels everywhere lives are negatively impacted by it. Why? Because it’s the correct thing to do, not to mention that the benefits to be enjoyed in so doing are, well, priceless.

And, you just never know, but it just could be that at the end of the day that not only climate but the economy will be beneficiaries also. Here’s hoping!

Image: Mchavez, Cornfield, Fallon, Nev.

– Alan Kandel



from ScienceBlog.com https://ift.tt/3b0UYzF

‘Scorecard’ highlights building, transportation energy efficiency progress in 100 large U.S. cities in 2020


It is no secret that all across the world in the past couple of decades the area of clean energy has experienced leaps-and-bounds growth. One organization, the American Council for an Energy-Efficient Economy (ACEEE), annually compiles what it calls a “City Clean Energy Scorecard.”

In its Oct. 6, 2020 press release, “Scorecard: Leading U.S. Cities Grow Clean Energy Efforts but Many More Lag Far Behind,” the ACEEE explains, “[The scorecard] provides the most comprehensive national measuring stick for climate progress and a roadmap for future improvements.”

The report presents the ACEEE’s analysis of data acquired from publicly available sources, utilities and communities alike. A hundred major American cities – home to nearly 20 percent of the national population – were evaluated in this year’s scorecard, up from the 75 analyzed in 2019.

What’s important to note is, “The report assesses policies adopted by May 1, 2020. The public health and economic devastation wrought by COVID-19, as well as the growing outrage over racial disparities and their impacts on communities of color, could cause city policy priorities to change. The report argues that as cities focus on economic recovery in the context of COVID-19, energy efficiency and renewable energy remain a crucial strategy for creating jobs and keeping investment within local communities. Furthermore, a closer focus on equitable planning and investment can yield benefits that have historically been unavailable to communities of color.”

Expanding on this idea in the release, lead report author and ACEEE local policy program director David Ribeiro said: “Many cities are really seizing the moment and embracing policies that help them fight climate change, while too many others are, frankly, doing very little. We want to show all the cities, even the leaders, the further steps they can take to cut carbon emissions most effectively and equitably.”

Scorecard highlights

“Among the report’s findings:

  • Washington, DC; Denver; Los Angeles; San José; and Oakland rounded out the top 10 highest-ranked cities, with San José and Oakland making the top 10 for the first time.
  • The top 10 cities embraced new actions. Boston and Los Angeles updated codes to require new buildings be pre-wired for electric vehicle charging stations at more parking spaces, and San Francisco convened a network to work with marginalized communities to establish equitable zero-emissions residential building strategies.
  • St. Paul (#16) was the most-improved city, taking key steps to improve efficiency of existing buildings, reduce total vehicle miles traveled, and embrace renewable energy. St. Louis (#28) was the second-most-improved city; in April, it became the third city in the country to require large existing buildings to meet a performance standard, which will drive energy efficiency upgrades.
  • More cities are making efforts to increase community engagement with, and clean energy investments in, low-income communities and communities of color. Washington, DC, formed an equity advisory group to develop recommendations to be incorporated in its clean energy plans. But nearly all cities have substantial room to ramp up their efforts.
  • Bottom-scoring cities’ policy efforts have either stagnated or not started; these cities are years behind the leaders. To scale up climate efforts across the country, more cities will need to adopt and implement effective clean energy policies.
  • Many cities are encouraging electric utilities and state regulators to increase the use of renewable energy in the power system. Twenty-four cities submitted comments on public utility commission proceedings, entered into utility partnerships, enacted community choice aggregation programs, or participated in planning efforts with utilities.”

“… New York City leaped to first place in the ranking—spurred in part by a new law ensuring upgrades to many inefficient buildings—followed by Boston and Seattle (tied for second place) and Minneapolis and San Francisco (tied for fourth place),” as reported in the release by the ACEEE.

City policy areas assessed by the scorecard include building policies, community-wide initiatives, energy and water utilities, local government operations and transportation policies, the ACEEE reported in the release.

Information on the 2020 City Clean Energy Scorecard is available here while there is more about the American Council for an Energy-Efficient Economy here.

Related article: “States see forward progress in areas of energy and transportation efficiency.”

Published by Alan Kandel



from ScienceBlog.com https://ift.tt/3puRUiT

A critical look at surface-air-temperature change: Influencing ground-level ozone or not is the question


Ozone (O3) is a gas with a pungent, chlorine-bleach-like odor. It is also sensitive to heat: temperature has a direct influence on how ozone forms and behaves, which is why a changing climate matters for the ozone picture.

In the stratospheric layer high above the earth’s surface, ozone is an asset: it blocks much of the ultraviolet (UV) radiation coming from the sun from reaching the surface. When it was discovered that ozone-depleting substances released on earth were making their way to the stratosphere, destroying stratospheric ozone and eventually opening a gaping hole over the Antarctic region, it became tremendous cause for concern. The problem having been identified and addressed, that hole is now shrinking, and the integrity of stratospheric ozone is being restored.

Ozone on the ground, in the tropospheric layer, is a completely different story. This is the so-called “bad ozone,” the kind that causes damage to human health when inhaled. Due to its corrosive nature, ozone more or less burns away the delicate tissue in the lungs. It can trigger asthmatic responses in some people, while in others it can cause wheezing, coughing, chronic obstructive pulmonary disease (COPD) or worse, and it can even lead to death.

In combatting the scourge of ground-level ozone, it is extremely helpful to understand how ozone forms in the troposphere, what facilitates its persistence, and how best to approach its mitigation and, possibly, its complete elimination at ground level.

In regard to ozone’s atmospheric or tropospheric creation, there are three contributing factors enabling its formation, these being chemical, light and temperature.

Chemical: Ozone’s chemical precursors are hydrocarbons (HC) and oxides of nitrogen (NOx). The mixing of these ozone-forming chemicals in the presence of sunlight and, yes, heat, prompts ground-level ozone formation.

Sunlight: Light from the sun is a huge determining factor, which is why ozone, and consequently smog, is a problem only in daytime, when the sun is shining. When the sun is absent from the sky, so is smog.

Temperature: Ozone is also a warm-weather phenomenon, which is why it isn’t evident when the air temperature drops below a certain threshold.

While humans can control the chemical precursor inputs, the contributing sunlight and temperature factors are beyond human control.
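The three factors just described can be sketched as a toy model. To be clear, this is purely illustrative: the function, the 70-degree threshold and the coefficients are invented here for the sake of the sketch and do not come from any real photochemical model.

```python
# Illustrative toy model of the three ozone-forming factors the article
# names: precursor chemicals (NOx + HC), sunlight and temperature.
# All numbers below are made up for illustration only.

def ozone_formation_index(precursors: float, sunlight: float, temp_f: float) -> float:
    """Return a unitless relative ozone-formation index.

    precursors: relative NOx/HC loading, 0 (none) to 1 (heavy)
    sunlight:   relative solar intensity, 0 (night) to 1 (midday sun)
    temp_f:     surface air temperature, degrees Fahrenheit
    """
    # No sunlight or no precursors means essentially no ozone formation,
    # matching the article's point that smog is a daytime problem.
    if sunlight <= 0 or precursors <= 0:
        return 0.0
    # Temperature as a threshold-plus-ramp: little formation below ~70 F,
    # more with heat (a stand-in for the warm-weather dependence).
    temp_factor = max(0.0, (temp_f - 70.0) / 30.0)
    return min(1.0, precursors * sunlight * temp_factor)

# Same precursor load: a hot sunny day vs. a mild day vs. nighttime.
hot = ozone_formation_index(precursors=0.8, sunlight=1.0, temp_f=95)
cool = ozone_formation_index(precursors=0.8, sunlight=1.0, temp_f=72)
night = ozone_formation_index(precursors=0.8, sunlight=0.0, temp_f=95)
```

The point of the sketch is the interaction: the index only rises when all three inputs are present at once, which is why the same emissions can produce smog on a hot, sunny afternoon and none overnight.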

More food for thought

What we want to concentrate on next is temperature, or more precisely temperature change, as a potential influence on the atmospheric ozone picture.

Okay, so check this out, from the opening paragraph of the “Executive Summary” of Chapter 13: “Air Quality” in the Impacts, Risks, and Adaptation in the United States: Fourth National Climate Assessment, Volume II document: “Unless offset by additional emissions reductions of ozone precursor emissions, there is high confidence that climate change will increase ozone levels over most of the United States, particularly over already polluted areas, thereby worsening the detrimental health and environmental effects due to ozone. The climate penalty results from changes in local weather conditions, including temperature and atmospheric circulation patterns, as well as changes in ozone precursor emissions that are influenced by meteorology. Climate change has already had an influence on ozone concentrations over the United States, offsetting some of the expected ozone benefit from reduced precursor emissions. The magnitude of the climate penalty over the United States could be reduced by mitigating climate change.”1

But what exactly should be made of or taken away from this?

So, in the next section, “State of the Sector,” under the “Key Message 1: Increasing Risks from Air Pollution” subheading, there is this qualifying statement: “Although competing meteorological effects determine ozone levels, temperature is often the largest single driver.”2

Temperature as atmospheric ozone-concentration driver, and “the largest single driver” at that, hmmm!

And there is this qualifying statement: “Assessments of climate change impacts on ozone trends are complicated by year-to-year changes in weather conditions and require multiple years of model information to estimate the potential range of effects.”3

Okay, so we know that meteorology and chemicals play a role in the formation of ground-level ozone. But climate change – how exactly does this factor in?

It would appear the climate is changing; this change seems to have prompted a change in air-circulation patterns, and indications are that fossil-fuel burning is a driver, though how much of one is not known definitively. What is known definitively is that the average temperature at the earth’s surface has been increasing: up 1.9 degrees Fahrenheit since 1800. It has also been established that 2011 to 2020 was the warmest decade on record, and that Arctic sea-ice extent has shrunk.

Established also, according to the same source, are increasing durations of drier weather.4 Longer periods of hotter temperatures could mean more opportunity for daily ozone (smog) formation. As can be seen, though, this is more about weather than about climate.

Expanding on that: with hotter summer temperatures covering a broader swath, and a sufficient amount of ozone precursor emissions in the atmosphere, more and more locations, big cities, small towns and rural tracts alike, could become smog-impacted where that might not otherwise have been the case.

If there was ever any question about a connection between changing surface air temperature and ground-level ozone, that doubt can now be removed. Hopefully, this has brought further clarity to the notion of surface air temperature change as an influencer of ground-level ozone.

And now, with any luck, the statement we saw in “Scientific analysis: Facts-driven. Any questions?!”, that “concomitant with a rise in average ambient air temperature at the earth’s surface, in this air, coming also is a corresponding jump in pollution,” carries far more meaning.

Notes

  1. Nolte, C.G., P.D. Dolwick, N. Fann, L.W. Horowitz, V. Naik, R.W. Pinder, T.L. Spero, D.A. Winner, and L.H. Ziska, 2018: Air Quality. In Impacts, Risks, and Adaptation in the United States: Fourth National Climate Assessment, Volume II [Reidmiller, D.R., C.W. Avery, D.R. Easterling, K.E. Kunkel, K.L.M Lewis, T.K. Maycock, and B.C. Stewart (eds.)]. U.S. Global Change Research Program, Washington, DC, USA, pp. 512-538. doi: 10.7930/NCA4.2018.CH13
  2. Ibid., p. 519
  3. Ibid.
  4. Ibid., p. 514


Published by Alan Kandel



from ScienceBlog.com https://ift.tt/37OdAkd

38% of Americans lack confidence in election fairness


With the Georgia Senate runoff elections set for Jan. 5, 2021, a nationwide survey conducted post-election could provide insights about voter perceptions of fairness in the U.S. election and trust in democratic institutions.

Researchers from a university consortium of Northwestern, Harvard, Northeastern and Rutgers surveyed more than 24,000 individuals across the nation between Nov. 3 and 30. The survey found that overall, 38% of Americans lack confidence in the fairness of the 2020 presidential election. That number is especially high among Republicans (64%) and Trump voters (69%) compared to Democrats (11%) and Biden voters (8%).

“This level of distrust is not surprising, given political rhetoric, but it certainly is concerning. Elections are the foundation of our democracy and loss of faith in the process could undermine the new administration’s legitimacy and ability to get things done,” said James Druckman, the Payson S. Wild Professor of political science in the Weinberg College of Arts and Sciences at Northwestern and associate director of the University’s Institute for Policy Research.

The survey showed large partisan gaps of over 40 percentage points in public concern about mail-in fraud (85% of Republicans and 38% of Democrats), inaccurate or biased vote counts (84% of Republicans and 44% of Democrats) and illegal votes from non-citizens (81% of Republicans and 34% of Democrats).

To better understand the reasons why some Americans distrust the election process, respondents were asked about their level of concern regarding voter suppression, intimidation, inaccurate or biased counts and interference.

The problem most people found troubling was voter suppression (making it harder for certain groups to vote), with over two-thirds of respondents (67%) saying they were somewhat or very concerned about it. Voter intimidation was a concern for 62% of respondents, while inaccurate or biased vote counts concerned 60% of Americans. Foreign country interference was a concern for 59%, mail-in ballot fraud for 57% and illegal votes from non-citizens was a concern for 52%.

“These numbers create a puzzle for the current Senate elections in Georgia,” said Druckman. “For some, the concerns may de-mobilize but for others it may be a mobilizing factor to get your vote in, especially to combat concerns about suppression and intimidation.”

The three most polarizing election-process issues, those with partisan gaps of over 40 percentage points (mail-in fraud, inaccurate or biased vote counting and illegal votes from non-citizens), had been heavily promoted by President Trump and received attention from right-wing media.

Partisan differences were lowest with regard to foreign interference in the election (60% of Republicans and 63% of Democrats), voter intimidation (60% of Republicans and 67% of Democrats) and voter suppression (63% of Republicans and 73% of Democrats).

“The results make clear that we have a long way to go to restore faith in our electoral process,” Druckman said.

Additional survey findings:

  • Asked about acceptable reactions to an unfair election, 45% of Americans approved of protesting on social media, 38% approved of protesting in person, 18% approved of violating laws without violence, and 8% approved of using violence. Non-violent law-breaking was approved of by 23% of Democrats and 17% of Republicans, violence by 10% of Democrats and 8% of Republicans.
  • 69% of Americans trust the Supreme Court to handle the election, 43% trust the news media and 31% trust social media companies. Trump is trusted by 39%, Biden by 59%.
  • Considerable partisan gaps emerged for all institutions except Congress, which was trusted by 57% of Republicans and 53% of Democrats.
  • When asked who won the election, 67% of Americans say President-elect Biden won and 17% suggest Donald Trump is the winner. Thirty-nine percent of Republicans and 3% of Democrats reported thinking Trump probably or definitely won.

Read the current report here as well as previous reports by the COVID-19 Consortium.



from ScienceBlog.com https://ift.tt/2KqMmrb
