Blog Archive


Evidence use in public health – make-do and mend?

Dylan Kneale and Antonio Rojas-García reflect on recent work exploring the use of evidence in local public health decision-making. In new climates of public health decision-making, where the local salience of research evidence becomes an even more important determinant of its use, they question how much research is being wasted because it is not generalisable to local settings.


Broadening our understanding of ‘evidence’ for humanitarian aid to maximise learning where we currently know least

Sandy Oliver discusses whether the worlds of academia and humanitarianism can combine to improve the delivery and understanding of the processes and benefits of humanitarian aid through the use of evidence.
Interest is growing in the humanitarian sector in drawing on systematic reviews of studies that assess the effects of different policies or practices when making decisions. As in other sectors, such research evidence is considered alongside what else is known about, for instance, competing priorities, social norms, available resources or the ease of implementing a programme. Professor Sandy Oliver argues that in contexts where rigorous studies of effects are few and far between, perhaps because conducting research is difficult in such circumstances, it is useful to learn from systematic reviews that encompass other forms of knowledge commonly held by the individuals and organisations delivering humanitarian aid. These broader systematic reviews increasingly come from partnerships of academics and humanitarian organisations. Strengthening the links between research and work in the field helps create evidence-informed policy/practice, and policy/practice-informed evidence.


Producing evidence synthesis for the humanitarian sector: challenges and solutions

Many humanitarians are evidence-aware, but may find it difficult to draw on what is known or to find knowledge that speaks to their context. They may also be pressed for time to locate, or judge the relevance of, what is often a dispersed literature. To address this gap, the Humanitarian Evidence Programme, a partnership between Oxfam and the Feinstein International Center at Tufts University, published eight systematic reviews in areas identified as priorities by humanitarian policy and practitioner stakeholders. As is typical of the sector, and as in international development, decision-makers ask very broad questions. Kelly Dickson and Mukdarut Bangpan reflect on the challenges they encountered when producing a mixed-methods evidence synthesis for this programme, on mental health and psychosocial programmes for people affected by humanitarian emergencies.


Scientific reliability and the role of theory

The replication crisis, publication bias, p-hacking, HARKing, bad incentives, undesirable pressures and probably other factors all combine to diminish the trustworthiness of published research, with obvious implications for research synthesis. Sergio Graziosi asks whether demanding simple theoretical clarity might be part of the solution.


Believing

[Warning: do not read this with small kids around!] Mark Newman poses some questions in keeping with the seasonal festivities: what does it mean to believe in Father Christmas? Does it really differ that much from belief in the role of evidence? We at the EPPI-Centre are happy to rise to the occasion and wish all of our readers a very Merry Christmas and a happy and prosperous New Year.



The search for significant others: p-values rarely engage

It is conventional in the social sciences to report p-values when communicating the results of statistical analyses. There are, however, increasing criticisms of the p-value for being open to misinterpretation and – worse – at risk of falsely indicating the presence of an effect. Alison O’Mara-Eves considers a further problem: failing to engage readers with the meaning behind the numbers. Some alternative ways of reporting the results of analyses are considered.
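The contrast the post points to can be sketched in a few lines of Python (a hypothetical example using numpy and scipy, with invented data; it is not drawn from the post itself): the same two-group comparison reported first as a bare p-value, and then as a mean difference with a 95% confidence interval that readers can interpret directly.

```python
import numpy as np
from scipy import stats

# Hypothetical data: outcome scores for two groups (invented purely for illustration).
rng = np.random.default_rng(42)
control = rng.normal(loc=50.0, scale=10.0, size=120)
treatment = rng.normal(loc=53.0, scale=10.0, size=120)

# Conventional report: a p-value from a two-sample t-test.
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A report that foregrounds the effect itself: mean difference with a 95% confidence interval.
diff = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / len(treatment) + control.var(ddof=1) / len(control))
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se
print(f"Mean difference = {diff:.1f} points (95% CI {ci_low:.1f} to {ci_high:.1f})")
```

The second report tells the reader how large the difference is and how uncertain the estimate is, which is the kind of meaning behind the numbers the post is concerned with.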


Note: Articles on the EPPI-Centre Blog reflect the views of the author and not necessarily those of the EPPI-Centre or UCL. The editorial and peer review process used to select blog articles is intended to identify topics of interest. See also the comments policy.
