2020Health | May 2018 | Tackling obesity – What the UK can learn from other countries
This report examines topical obesity intervention strategies from around the world to frame the question: can the UK learn from policy abroad? (2020 Health)
The report includes ten case studies from across three continents which provide some limited learning for the UK at a national level, and stronger evidence-based learning (as supported by the literature) at the local level. These include:
A health-in-all-policies approach is vital at both the national and local level.
Compulsory national policies on school-based education, health and wellbeing can give greater strength and support to local action on obesity prevention.
Taxation, whether considered, planned or implemented, can encourage manufacturer reformulation of products to healthier options.
Taxation can create additional revenues for government; the amount will vary according to the extent of product reformulation.
The report also recognises a number of barriers:
Initiatives undertaken without evaluation processes have limited opportunity to encourage buy-in and support for similar strategies elsewhere.
A widening price gap between cheap junk food and more expensive healthier options is creating a barrier to healthier diets in the poorest households.
Obesity prevention with a school-only focus often shows no effect in the long term, leading to stakeholder discouragement and possible disinclination to pursue further strategies.
Opt-out by many publicly funded schools (mainly academies and free schools) from health-supporting initiatives, such as food technology (cooking), is a major impediment to health literacy and the implementation of schemes in the vein of EPODE*. (2020Health)
*EPODE is a community-based intervention to prevent obesity, originating in France, that has now been piloted in over 17 countries.
The full press release is available from 2020Health
University of Huddersfield | May 2018 | Children with Autism are able to create imaginary friends
A new study demonstrates that children with Autistic Spectrum Disorder (ASD) are able to create imaginary friends. This challenges previous research that suggests children with autism are unable to engage in imaginary play (via Science Daily).
The researchers used over 200 questionnaires completed by UK and US parents of children diagnosed with ASD and parents of children with typical development (TD). The findings show that children with a diagnosis of ASD were less likely to create an imaginary friend (16.2 per cent, compared with 42 per cent of their TD peers); they were older when they began engaging in this kind of play, and were also more likely to play with a “personified object” such as a stuffed toy or doll. The researchers argue, however, that there is no difference in the quality of the play the children engage in.
According to Dr Paige Davis, of the University of Huddersfield and the lead author of the study: “The finding that children diagnosed with ASD even spontaneously create such imaginary companions refutes existing beliefs that they are not imagining in the same way as typically developing children.”
Office for National Statistics | April 2018 | Measuring National Well-being: Quality of Life in the UK, 2018
The latest review covers specific indicators of well-being, including personal well-being, relationships, health, mental well-being and job satisfaction, and focuses on the main differences between age groups. The measures include both objective data (for example, unemployment rate) and subjective data (for example, satisfaction with job) to provide a more complete view of the nation’s progress than economic measures such as gross domestic product (GDP) can do alone.
The latest update of the Measuring of National Well-being programme provides a broadly positive picture of life in the UK, with most indicators either improving or staying the same over the short-term (one year) and long-term (five years).
Younger people (mainly aged 16 to 24) were more likely to report higher ratings of satisfaction with their health and engage in physical activities.
The main challenges for younger people include unemployment, loneliness, lacking someone to rely on and a weak sense of belonging to their neighbourhood.
People in their early and middle years (mainly aged 25 to 54) were generally more likely to be in employment, but less likely to be satisfied with their leisure time.
Older people (mainly aged 75 and over) were more likely to be satisfied with their income and leisure time, to feel they can cope financially and to feel they belong to their neighbourhood.
The main challenges for older people are lower satisfaction with their health and lower engagement with an art or cultural activity. (via ONS)
Children who have less exposure to animals and dust are less stress-resilient and may be at higher risk of mental illness than children who are raised in rural environments (via Science Daily).
The study, a collaboration between universities in Germany and the US, recruited 40 males aged between 20 and 40. Twenty had grown up on farms with farm animals; the other half were raised in an urban environment, the majority without pets. As part of the study the men were given two tests: a speech in front of an audience and a maths problem to solve under timed conditions. Blood and saliva samples were taken before and then at intervals after the tests. The researchers found that those who grew up in a city had an exaggerated immune response to this mild stressor; despite this, participants in this group reported feeling less stressed than those raised in a rural environment.
Urbanization is on the rise, and environments offering a narrow range of microbial exposures are linked to an increased prevalence of both physical and mental disorders. Human and animal studies suggest that an overreactive immune system not only accompanies stress-associated disorders but might even be causally involved in their pathogenesis. Here, we show in young [mean age, years (SD): rural, 25.1 (0.78); urban, 24.5 (0.88)] healthy human volunteers that urban upbringing in the absence of pets (n = 20), relative to rural upbringing in the presence of farm animals (n = 20), was associated with a more pronounced increase in the number of peripheral blood mononuclear cells (PBMCs) and plasma interleukin 6 (IL-6) concentrations following acute psychosocial stress induced by the Trier social stress test (TSST). Moreover, ex vivo-cultured PBMCs from urban participants raised in the absence of animals secreted more IL-6 in response to the T cell-specific mitogen Con A. In turn, antiinflammatory IL-10 secretion was suppressed following TSST in urban participants raised in the absence of animals, suggesting immunoregulatory deficits, relative to rural participants raised in the presence of animals. Questionnaires, plasma cortisol, and salivary α-amylase, however, indicated the experimental protocol was more stressful and anxiogenic for rural participants raised in the presence of animals. Together, our findings support the hypothesis that urban vs. rural upbringing in the absence or presence of animals, respectively, increases vulnerability to stress-associated physical and mental disorders by compromising adequate resolution of systemic immune activation following social stress and, in turn, aggravating stress-associated systemic immune activation.
Böbel, T.S., et al. | Less immune activation following social stress in rural vs. urban participants raised with regular or no animal contact, respectively | Proceedings of the National Academy of Sciences | Apr 2018 | 201719866 | DOI: 10.1073/pnas.1719866115
University College London | May 2018 | Mid-life anxiety may be linked to later life dementia
A new research paper published in BMJ Open analysed studies looking at the association between mid-life anxiety, depression and the development of dementia. The researchers, from University College London (UCL) and the University of Southampton, searched databases for published studies. While only four of over 3,500 studies met their criteria for inclusion, together these covered nearly 30,000 people. They suggest that an abnormal stress response, experienced in moderate to severe anxiety, may increase brain cell ageing and degenerative changes in the central nervous system, so increasing vulnerability to dementia. For this reason they suggest that anxiety should be considered by doctors as a risk factor for dementia. Currently, it remains unclear whether treatment for anxiety could potentially curb dementia (via UCL).
Objectives Anxiety is an increasingly recognised predictor of cognitive deterioration in older adults and in those with mild cognitive impairment. Often believed to be a prodromal feature of neurodegenerative disease, anxiety may also be an independent risk factor for dementia, operationally defined here as preceding dementia diagnosis by more than or equal to 10 years.
Design A systematic review of the literature on anxiety diagnosis and long-term risk for dementia was performed following published guidelines.
Setting and participants Medline, PsycINFO and Embase were searched for peer-reviewed journals until 8 March 2017. Publications reporting HR/OR for all-cause dementia based on clinical criteria from prospective cohort or case–control studies were selected. Included studies measured clinically significant anxiety in isolation or after controlling for symptoms of depression, and reported a mean interval between anxiety assessment and dementia diagnosis of at least 10 years. Methodological quality assessments were performed using the Newcastle-Ottawa Scale.
Outcome measure HR/OR for all-cause dementia.
Results Searches yielded 3510 articles, of which 4 (0.02%) were eligible. The studies had a combined sample size of 29 819, and all studies found a positive association between clinically significant anxiety and future dementia. Due to the heterogeneity between studies, a meta-analysis was not conducted.
Conclusions Clinically significant anxiety in midlife was associated with an increased risk of dementia over an interval of at least 10 years. These findings indicate that anxiety may be a risk factor for late-life dementia, excluding anxiety that is related to prodromal cognitive decline. With increasing focus on identifying modifiable risk factors for dementia, more high-quality prospective studies are required to clarify whether clinical anxiety is a risk factor for dementia, separate from a prodromal symptom.
Gimson A, Schlosser M, Huntley JD, et al | Support for midlife anxiety diagnosis as an independent risk factor for dementia: a systematic review| BMJ Open | 2018| 8|e019399| doi: 10.1136/bmjopen-2017-019399
Stroke Association | April 2018 | New ‘brain health index’ could predict memory and thinking problems after stroke
The Stroke Association has released a press release highlighting recent research published in the International Journal of Stroke. The researchers recruited 288 participants in Edinburgh, including stroke and lupus patients and healthy working-age volunteers. Brain scans currently enable doctors to see when a stroke has occurred, but a new computer programme, the Brain Health Index (BHI), is more sophisticated at predicting the level of memory and thinking (cognitive) problems patients will experience after stroke than more time-consuming current methods (via Stroke Association).
The programme was created and developed by scientists at the Universities of Edinburgh and Glasgow and co-funded by the Stroke Association. It is up to ten times more effective in assessing whole-brain deterioration and helping to predict cognitive function than current tools. The programme is able to translate many pieces of information from brain scans into a single measure, the Brain Health Index.
Dr David Alexander Dickie, from the University of Glasgow’s Institute of Cardiovascular and Medical Sciences, said: “We recognised a need for a more inclusive approach to assessing common brain disorders of ageing. Our new method allows us to use every piece of information from a brain scan, rather than just individual features of the brain that can only tell us so much about a person’s risk for cognitive problems.” (University of Glasgow)
The BHI method will now be tested in newly developed brain scanners, and in larger groups of patients.
According to new research from MIT (Massachusetts Institute of Technology) published in the journal Cognition, the ability to acquire a new language declines after the age of 10. The researchers found that to achieve an understanding of English grammar equivalent to that of a native speaker, language acquisition must start before the age of 10. They also found that a ‘critical period’ for learning a language exists up to the age of 17 or 18, after which the ability decreases. Almost 670,000 people completed a quiz which tested non-native speakers of English (other languages were not included); participants were asked to determine whether a sentence was grammatically correct.
Children learn language more easily than adults, though when and why this ability declines have been obscure for both empirical reasons (underpowered studies) and conceptual reasons (measuring the ultimate attainment of learners who started at different ages cannot by itself reveal changes in underlying learning ability). We address both limitations with a dataset of unprecedented size (669,498 native and non-native English speakers) and a computational model that estimates the trajectory of underlying learning ability by disentangling current age, age at first exposure, and years of experience. This allows us to provide the first direct estimate of how grammar-learning ability changes with age, finding that it is preserved almost to the crux of adulthood (17.4 years old) and then declines steadily. This finding held not only for “difficult” syntactic phenomena but also for “easy” syntactic phenomena that are normally mastered early in acquisition. The results support the existence of a sharply-defined critical period for language acquisition, but the age of offset is much later than previously speculated. The size of the dataset also provides novel insight into several other outstanding questions in language acquisition.
Hartshorne, J.K., Tenenbaum, J.B., & Pinker, S. | 2018 | A critical period for second language acquisition: Evidence from 2/3 million English speakers | Cognition | Available online 2 May 2018 | https://doi.org/10.1016/j.cognition.2018.04.007
University of Leeds | April 2018 | The effects of diet on the start of the menopause
University of Leeds researchers have led research into the link between a woman’s diet and the onset of the menopause. The study is the first of its kind to look at the relationship between diet and the start of the menopause. Researchers used data from more than 14,150 women living in the UK (via University of Leeds).
The 14 172 participants in the cohort study completed a detailed diet questionnaire and were also surveyed about their reproductive history and health. These women were followed up four years later so that researchers could establish whether they had since experienced the natural start of the menopause. They found over 900 women between the ages of 40 and 65 had experienced the natural start of their menopause at the time of the follow-up survey, meaning they had not had menstrual periods for at least a year and their menopause had not been brought on by cancer, surgery or pharmaceutical treatments. The average age at the start of the menopause for women in the UK is 51 years.
A high intake of refined carbohydrates (pasta and rice) was associated with an earlier menopause by 1.5 years, whereas a diet rich in oily fish was associated with a delay in the natural menopause of up to 3.3 years. A high intake of zinc and vitamin B6 was also associated with a later menopause.
Earlier research findings have suggested that earlier onset of menopause is associated with lower bone density, osteoporosis and an increased risk of cardiovascular disease, while a later menopause has been linked to a higher risk of cancers such as breast, ovarian and endometrial cancer.
Background Age at natural menopause is a matter of concern for women of reproductive age as both an early or late menopause may have implications for health outcomes.
Methods Study participants were women aged 40–65 years who had experienced a natural menopause from the UK Women’s Cohort Study between baseline and first follow-up. Natural menopause was defined as the permanent cessation of menstrual periods for at least 12 consecutive months. A food frequency questionnaire was used to estimate diet at baseline. Reproductive history of participants was also recorded. Regression modelling, adjusting for confounders, was used to assess associations between diet and age at natural menopause.
Results During the 4-year follow-up period, 914 women experienced a natural menopause. A high intake of oily fish and fresh legumes were associated with delayed onset of natural menopause by 3.3 years per portion/day (99% CI 0.8 to 5.8) and 0.9 years per portion/day (99% CI 0.0 to 1.8), respectively. Refined pasta and rice was associated with earlier menopause (per portion/day: −1.5 years, 99% CI −2.8 to −0.2). A higher intake of vitamin B6 (per mg/day: 0.6 years, 99% CI 0.1 to 1.2) and zinc (per mg/day: 0.3 years, 99% CI −0.0 to 0.6) was also associated with later age at menopause. Stratification by age at baseline led to attenuated results.
Conclusion Our results suggest that some food groups (oily fish, fresh legumes, refined pasta and rice) and specific nutrients are individually predictive of age at natural menopause.
Dunneram Y, Greenwood DC, Burley VJ, et al | Dietary intake and age at natural menopause: results from the UK Women’s Cohort Study | J Epidemiol Community Health | Published Online First: 30 April 2018| doi: 10.1136/jech-2017-209887