GLOBAL RESEARCH SYNDICATE

Introducing: A New Series on the Practical Takeaways From Research

January 22, 2020

Opinion

Image: Vanessa Solis/Education Week. Source images: OstapenkoOlena/Getty

A new series explores what educators need to know

By Heather C. Hill & Susanna Loeb

Welcome to “What Works, What Doesn’t.”

Educators and policymakers want to make good choices for schools and districts. And research can help. For people in charge of schools and classrooms, starting with “what the research says” can be critical in navigating the challenges of boosting student learning and creating environments where children thrive.

Research brings to bear facts that have been collected and analyzed in purposeful, systematic, and often public ways. Its power to rise above the anecdotal is why people in medicine, business, and every type of public policy increasingly refer to it.

Yet rarely does a single research study provide irrefutable evidence that one choice is better than another, because each study takes place with a particular set of schools, teachers, and students, and because each study tests a specific policy or program. And findings from single research studies often conflict; for instance, school turnaround policies may improve student achievement in one context but not in another.

Although the cries for “evidence” are frequent in the education space, evidence can prove elusive to practitioners: Where is it? How sound is it? What does it tell us about real-life situations? This essay is the first in a series that aims to put the pieces of research together so they can be used by those charged with choosing which policies and practices to implement.

The conveners of this project—Susanna Loeb, the director of Brown University’s Annenberg Institute for School Reform, and Harvard education professor Heather Hill—have received grant support from the Annenberg Institute for this series.

In order to be useful, then, research evidence needs synthesis. Each study is a puzzle piece, telling us where different approaches to improving teaching or schools have reached their goals and where they haven’t. Some also tell us why the approach worked or didn’t work in a particular context. Only when brought together can the studies help us predict what factors are likely to be in play when we make one choice of practice or program or policy over another. While the flaws in research can be frustrating, once synthesized the evidence in many cases points us quite clearly to one choice over another. Unfortunately, those eager to learn what research has to teach often lack this sort of overview.

In this Education Week twice-a-month series, we will piece together the evidence on issues facing state and district policymakers, principals, and teachers. Look for topics such as having teachers examine student data, homework management, principal leadership-development programs, parent engagement, social-emotional learning, school finance, and STEM instructional improvement in these pages and online. (And we will be looking for you to suggest other topics by using #EdResearchToPractice on Twitter or submitting a comment to the online version of this essay and the others in this series.)

Policymakers and school and district leaders should be forewarned that some of the evidence we present—even for very popular programs and policies—will seem grim. But even grim findings are useful: pruning back unhelpful practices creates room for new and better ones, and breathing space for teachers and administrators who right now feel like they operate under the gun of too many (and sometimes competing) policies and programs.

In synthesizing research to see what practices are effective, we will care about both outcomes for students (such as achievement) and outcomes for classrooms (such as the quality of instruction)—we view classroom outcomes as important in their own right. Children deserve safe, caring, and intellectually stimulating classrooms, both because that’s their home for up to six hours a day, and because classroom instruction is the primary in-school mechanism through which children learn. Thus, we will include, when possible, studies of programs that improve classroom outcomes, even if we do not know their effects on student outcomes.

When we do write about child outcomes, we will consider more than state standardized test scores. While such scores are important—and often critically important to the policymakers and leaders charged with raising them—many other observable child outcomes shape later-life experiences and matter in their own right. Engagement in school, for example, reflects the richness of students’ day-to-day experiences and allows them to develop the capacities they need for later success. We can measure this engagement through both concrete actions, such as students’ choices to stay in school or take more challenging classes, and more abstract measures like students’ sense of connectedness to school, efficacy with difficult content, and hope for the future. Increasingly, studies address these types of important outcomes, and, where possible, we will give preference to work that sheds light on how to provide students with opportunities to develop in these ways.

A few notes about what you can expect from this project:

• We are aiming to provide accessible information. Principals, district leaders, state policymakers, teachers—anyone who makes decisions about the practices and policies we discuss here—should all find something relevant.

• We will try—and fail—at being comprehensive. While we will make an attempt to find every study on every topic, we will surely miss some, particularly unpublished reports and lesser-known published articles.

• We will give preference to studies with rigorous designs. By rigorous designs, we generally mean studies that examine the effects of a policy or program by comparing participants with a plausibly similar group of non-participants. (Plausible similarity may be achieved through random assignment of participants to the program or to a no-treatment comparison group, but it can be achieved in other ways as well.)

• Not all topics will be ripe for review. Emerging policy questions and programs, such as teacher anti-bias initiatives and personalized learning, may not have sufficient evidence to warrant a review. While we hope that we can find an evidence base on these and other similarly hot topics, we may have to leave some aside or address them later. When we do address questions without clear evidence, we will be sure to emphasize the resulting limits.

• We will not be experts in all the topics you want to know about. After 40-plus combined years in academia, however, we have a lot of practice reading empirical studies. If the question is timely and the evidence supports a synthesis, we will summarize it.

• We will be wrong some of the time. The only way we could be consistently correct would be to avoid addressing the complex issues that are most important for education improvement. When we are wrong (and even when we are correct), we invite discussion from other scholars, policymakers, and practitioners in the online spaces we have suggested.

• Expect some guest authors. We will occasionally hand over a review to a scholar with a particularly innovative and impactful study, new and thoughtful ideas, or long experience in a field.

If research is to fulfill its promise to education, researchers have to scoop up the puzzle pieces and put together a meaningful picture for policymakers and practitioners. We look forward to writing this series, and to engaging in dialogue with you as we move forward.

Heather C. Hill is a professor of education at the Harvard Graduate School of Education and studies teacher quality, teacher professional learning, and instructional improvement. Her broader interests include educational policy and social inequality. Susanna Loeb is a professor of education and of public affairs at Brown University and the director of the university’s Annenberg Institute for School Reform. She studies education policy, and her interests include social inequality.

Vol. 39, Issue 19, Page 22

Published in Print: January 22, 2020, as Weighing the Research: What Works, What Doesn’t





Copyright © 2024 Globalresearchsyndicate.com
