New Skeletal Disease Found And Explained

Researchers at Karolinska Institutet in Sweden have discovered a new and rare skeletal disease. In a study published in the journal Nature Medicine they describe the molecular mechanism of the disease, in which small RNA molecules play a role that has never before been observed in a congenital human disease. The results are important for affected patients but can also help scientists to understand other rare diagnoses.

The newly identified skeletal disease was first observed in a parent and a child from a Swedish family.

“They came to my clinic,” says the study’s lead author Giedre Grigelioniene, physician and associate professor at the Department of Molecular Medicine and Surgery, Karolinska Institutet.

“They’d received a different diagnosis previously, but it didn’t fit with what we were seeing in the X-rays. I was convinced that we were looking at a new diagnosis that had not been described before.”

A long, arduous process then began to examine the finding further. The results of these efforts are now published in a study in Nature Medicine, in which Giedre Grigelioniene and her colleagues describe the new skeletal disease – a type of skeletal dysplasia – and its mechanism.

Mutation identified

Together with Fulya Taylan, assistant professor at the same department at Karolinska Institutet, Dr Grigelioniene identified the disease-causing mutation in a gene called MIR140. The gene does not give rise to a protein but to a so-called micro-RNA (miR-140), a small RNA molecule that regulates other genes.

Working alongside Tatsuya Kobayashi, associate professor at Massachusetts General Hospital, Harvard Medical School in Boston, USA, the researchers produced a mouse model of the disease, using the CRISPR-Cas9 “molecular scissors” technique to create a strain carrying the identified mutation. They subsequently observed that the animals’ skeletons displayed the same aberrations as the three patients in the study.

The researchers also show that the identified mutation leads to an abnormal expression of several important genes in the cartilaginous growth plates and the ends of the long tubular bones. These studies were done in collaboration with Hiroshi Suzuki, researcher at the Massachusetts Institute of Technology in the lab of Phillip Sharp, Nobel laureate in medicine. Some genes that are normally suppressed by miR-140 are expressed, while others are down-regulated.

“This causes a change in skeletal growth, deformed joints and the delayed maturation of cartilage cells in the patients, who have short stature, small hands and feet and joint pain,” says Dr Grigelioniene.

Normal function knocked out

The identified mutation knocks out a normal function of the micro-RNA, which is replaced by a different function. This mechanism, known as neomorphic, has never before been described for small RNAs in a human congenital disease. A similar mechanism in cancer cells was described last year in a paper in Nature Genetics by researchers who were also involved in the present study.

According to Dr Grigelioniene, the results now published are important both for patients with the disease and for scientists interested in how small regulatory RNA molecules are involved in the development of human congenital disease.

“We plan to examine whether similar mechanisms with mutations in small RNA genes are involved in the development of other rare congenital disorders,” she says.

“As for patients who already have this disease, the results mean that they can choose to use prenatal fetal diagnostics in order not to pass the disease on to their children.”

Face It. Our Faces Don’t Always Reveal Our True Emotions

Actor James Franco looks sort of happy as he records a video diary in the movie “127 Hours.” It’s not until the camera zooms out, revealing his arm is crushed under a boulder, that it becomes clear his goofy smile belies his agony.

That’s because when it comes to reading a person’s state of mind, visual context — as in background and action — is just as important as facial expressions and body language, according to a new study from UC Berkeley.

The findings, to appear online this week in the journal Proceedings of the National Academy of Sciences, challenge decades of research positing that emotional intelligence and recognition are based largely on the ability to read micro-expressions signaling happiness, sadness, anger, fear, surprise, disgust, contempt and other positive and negative moods and sentiments.

Implications include the potential limitations of facial recognition software.

“Our study reveals that emotion recognition is, at its heart, an issue of context as much as it is about faces,” said study lead author Zhimin Chen, a doctoral student in psychology at UC Berkeley.

Researchers blurred the faces and bodies of actors in dozens of muted clips from Hollywood movies and home videos. Despite the characters’ virtual invisibility, hundreds of study participants were able to accurately read their emotions by examining the background and how they were interacting with their surroundings.

The “affective tracking” model that Chen created for the study allows researchers to track how people rate the moment-to-moment emotions of characters as they view videos.

Chen’s method is capable of collecting large quantities of data in a short time, and could eventually be used to gauge how people with disorders like autism and schizophrenia read emotions in real time and help with their diagnoses.

“Some people might have deficits in recognizing facial expressions, but can recognize emotion from the context,” Chen said.

“For others, it’s the opposite.”

Moreover, the findings, based on statistical analyses of the ratings collected, could inform the development of facial recognition technology.

“Right now, companies are developing machine learning algorithms for recognizing emotions, but they only train their models on cropped faces and those models can only read emotions from faces,” Chen said.

“Our research shows that faces don’t reveal true emotions very accurately and that identifying a person’s frame of mind should take into account context as well.”

For the study, Chen and study senior author David Whitney, a UC Berkeley vision scientist, tested the emotion recognition abilities of nearly 400 young adults. The visual stimuli they used were video clips from various Hollywood movies as well as documentaries and home videos that showed emotional responses in more natural settings.

In the first of three experiments, 33 study participants viewed interactions in movie clips between two characters, one of which was blurred, and rated the perceived emotions of the blurred character. The results showed that study participants inferred how the invisible character was feeling based not only on their interpersonal interactions, but also from what was happening in the background.

Study participants went online to view and rate the video clips. A rating grid was superimposed over the video so that researchers could track each study participant’s cursor as it moved around the screen while they processed visual information and rated moment-to-moment emotions.

Next, approximately 200 study participants viewed video clips showing interactions under three different conditions: one in which everything was visible, another in which the characters were blurred, and another in which the context was blurred. The results showed that context was as important as facial recognition for decoding emotions.

In the final experiment, 75 study participants viewed clips from documentaries and home videos so that researchers could compare emotion recognition in more naturalistic settings. Again, the context was as critical for inferring the emotions of the characters as were their facial expressions and gestures.

“Overall, the results suggest that context is not only sufficient to perceive emotion, but also necessary to perceive a person’s emotion,” said Whitney, a UC Berkeley psychology professor.

“Face it, the face is not enough to perceive emotion.”

Working Long Hours Linked To Depression In Women

Women who work more than 55 hours a week are at a higher risk of depression but this is not the case for men, according to a new UCL-led study with Queen Mary University of London.

The study of over 20,000 adults, published today in the BMJ’s Journal of Epidemiology & Community Health, found that after taking age, income, health and job characteristics into account, women who worked extra-long hours had 7.3% more depressive symptoms than women working a standard 35-40 hour week. Weekend working was linked to a higher risk of depression among both sexes.

Women who worked for all or most weekends had 4.6% more depressive symptoms on average compared to women working only weekdays. Men who worked all or most weekends had 3.4% more depressive symptoms than men working only weekdays.

“This is an observational study, so although we cannot establish the exact causes, we do know many women face the additional burden of doing a larger share of domestic labour than men, leading to extensive total work hours, added time pressures and overwhelming responsibilities,” explained Gill Weston (UCL Institute of Epidemiology and Health Care), PhD candidate and lead author of the study.

“Additionally women who work most weekends tend to be concentrated in low-paid service sector jobs, which have been linked to higher levels of depression.”

The study showed that men tended to work longer hours in paid work than women, and having children affected men’s and women’s work patterns in different ways: while mothers tended to work fewer hours than women without children, fathers tended to work more hours than men without children.

Two thirds of men worked weekends, compared with half of women. Those who worked all or most weekends were more likely to be in low skilled work and to be less satisfied with their job and their earnings than those who only worked Monday to Friday or some weekends.

Researchers analysed data from Understanding Society, the UK Household Longitudinal Study (UKHLS), which has been tracking the health and wellbeing of a representative sample of 40,000 households across the UK since 2009.

Information about working hours, weekend working, working conditions and psychological distress was collected from 11,215 working men and 12,188 working women between 2010 and 2012. Depressive symptoms such as feeling worthless or incapable were measured using a self-completed general health questionnaire.

“Women in general are more likely to be depressed than men, and this was no different in the study,” Weston said.

“Independent of their working patterns, we also found that workers with the most depressive symptoms were older, on lower incomes, smokers, in physically demanding jobs, and dissatisfied at work.”

She added: “We hope our findings will encourage employers and policy-makers to think about how to reduce the burdens and increase support for women who work long or irregular hours — without restricting their ability to work when they wish to.

“More sympathetic working practices could bring benefits both for workers and for employers — of both sexes.”

Brain Response To Mom’s Voice Differs In Kids With Autism

For most children, the sound of their mother’s voice triggers brain activity patterns distinct from those triggered by an unfamiliar voice. But the unique brain response to mom’s voice is greatly diminished in children with autism, according to a new study from the Stanford University School of Medicine.

The diminished response was seen on fMRI brain scans in face-processing regions and learning and memory centers, as well as in brain networks that process rewards and prioritize different stimuli as important.

The findings were published Feb. 26 in eLife.

“Kids with autism often tune out from the voices around them, and we haven’t known why,” said the study’s lead author, Dan Abrams, PhD, clinical assistant professor of psychiatry and behavioral sciences at Stanford.

“It’s still an open question how this contributes to their overall difficulties with social communication.”

The results suggest that the brains of children with autism are not wired to easily tune into mom’s voice, Abrams said. The study also found that the degree of social communication impairment in individual children with autism was correlated with the degree of abnormality in their brain responses to their mother’s voice.

“This study is giving us a handle on circuits and vocal stimuli that we have to make more engaging for a child with autism,” said the study’s senior author, Vinod Menon, PhD, the Rachael L. and Walter F. Nichols, MD, Professor and professor of psychiatry and behavioral sciences.

“We now have a template for targeting specific neural circuits with cognitive therapies.”

An important social cue

Mom’s voice is an important social cue for most children. For instance, tiny babies recognize and are soothed by their mother’s voice, while young teenagers are more comforted by words of reassurance spoken by their mothers than the same words sent by their mothers via text message, prior research has shown. The response to mom’s voice has a distinct brain-activation signature in children without autism, a 2016 paper co-authored by Abrams and Menon demonstrated.

Autism is a developmental disorder that affects 1 in 59 children. It is characterized by social and communication difficulties, restricted interests and repetitive behaviors. The disorder exists on a spectrum, with some children more impaired than others.

The new study included 42 children ages 7 to 12. Half had autism, and the other half didn’t. The children had their brains scanned using functional magnetic resonance imaging while listening to three different recorded sounds: their mother’s voice; the voices of unfamiliar women; and nonvocal environmental sounds. In the voice recordings, the women said nonsense words to avoid activating language comprehension regions in the brain.

The researchers compared patterns of brain activation and brain network connectivity between the two groups of children. They also asked the children to identify whether each brief (956-millisecond) voice recording they heard came from their mother or an unfamiliar woman. Children without autism correctly identified their mothers’ voices 97.5 percent of the time; those with autism identified their mothers’ voices 87.8 percent of the time, a statistically significant difference.

The brain response to unfamiliar voices, when compared with the response to environmental sounds, was fairly similar in children with and without autism, although those with autism had less activity in one area of the auditory association cortex.

When comparing the brain response to mom’s voice versus unfamiliar voices, children without autism had many more brain areas activated: Mom’s voice preferentially lit up part of the hippocampus, a learning and memory region, as well as face-processing regions. Brain-connectivity patterns measured in a network that included auditory-processing regions, reward-processing regions and regions that determine the importance, or salience, of incoming information also distinguished children with autism from children without autism. The network impairments in individual children with autism were also linked to their individual level of social communication impairment.

‘Really striking relationship’

“There is this really striking relationship between the strength of activity and connectivity in reward and salience regions during voice processing and children’s social communication activity,” Abrams said.

This suggests that brain responses to mom’s voice are a key element for building social communication ability, he added.

The findings support the social motivation theory of autism, which suggests that social interaction is intrinsically less engaging for children with the disorder than for those without it.

Many current autism therapies involve motivating children to engage in specific types of social interaction. It would be interesting to conduct future studies to see whether these therapies change the brain characteristics uncovered in this study, the researchers said.

“Mom’s voice is the primal cue for social and language communication and learning,” Menon said.

“There is an underlying biological difference in the brain circuitry in autism, and this is a precision-learning signal we can target.”
