
How engaged are we? Measuring community engagement and partnership


For this month’s discussion on implementation science, I am joined by Dr. Kurt Stange to share our thoughts on an important topic for the field: stakeholder engagement. There is an emerging consensus that a different type of translational or implementation science is needed to address many seemingly intractable problems and to translate health research into practice and policy. This science should be relationship-centered and should use partnership-based methods that combine rigor, relevance, and stakeholder engagement.

According to many, this type of approach is already in practice. However, there seems to be a discrepancy between what is claimed and what is actually found when research is assessed on objective measures of engagement and partnership (e.g., asking questions of research partners, team members, or patients; observing or documenting interaction characteristics). It’s about ‘walking our talk’, not about using vague, politically correct terms in our proposals. It’s about reflecting on how well we are doing at implementing true partnerships. It takes time to build relationships, but the investment makes all the difference, and translational science needs to better identify and use methods for measuring and reporting on these levels of participation and engagement.

These principles and related actions operate at multiple levels and involve different partners and stakeholders at each level. In this brief call to action, we simplify this multi-level challenge into three prototypic levels of partnership: (1) patient-practitioner; (2) research team; and (3) community research. Important at each level is the ability to listen rather than immediately advise; to share rather than control (and often to give up large amounts of control); and to modify approaches based on input from others. Below, we speculate about how to tell when one is ‘walking the talk’.

At the Individual (e.g., consumer or end-user; practitioner) level, the factors that indicate the degree to which research is patient-centered include: who makes the final decisions; the distribution of talk time and interruptions; how tailored an intervention is to an individual's values, preferences, and culture; and whether there is any systematic collection and documentation of individual/patient perspectives, preferences, values, or person-directed goals.

At the Team level, another set of factors indicates whether research teams are transdisciplinary and team-based, including: the distribution of talk time; who proposes ideas versus who refines them; who prepares the agenda; and how equitably budgets, outputs, and credit are distributed. Please see the recent issue of Translational Behavioral Medicine to learn more about team-based science.

At the Community Engagement level, there are several potential ‘bottom line’ indicators, including: which entity controls the budget and the percentage allocated to each partner; which entity develops the agenda for academic-community meetings; the number and openness of meetings; and how research changes (or does not change) as a result of community input.

At each level, a key measure of success is the degree to which partners are committed, and whether the relationship among partners has the potential to transcend the specific project. In summary, the IS field needs both ‘gold standard’ measures (for when the level of partnership is the central issue) and pragmatic measures (for standard use) to assess and transparently report on the level of partnership.
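
As a very rough, minimal sketch of what one pragmatic measure might look like in practice (our hypothetical illustration, not an established or validated instrument), a partnership could tabulate each partner's share of meeting talk time or budget and summarize how evenly it is distributed. The partner names, numbers, and helper functions below are invented for illustration only.

```python
# Hypothetical sketch: summarizing how evenly talk time (or budget) is
# distributed across partners. Names and numbers are invented for illustration;
# this is not a validated engagement instrument.

def shares(amounts):
    """Convert raw amounts (minutes of talk time, budget dollars) to shares of the total."""
    total = sum(amounts.values())
    return {partner: amount / total for partner, amount in amounts.items()}

def gini(values):
    """Gini coefficient: 0 = perfectly even distribution, approaching 1 = one partner holds nearly all."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    cumulative = sum((rank + 1) * x for rank, x in enumerate(xs))
    return (2 * cumulative) / (n * total) - (n + 1) / n

# Example: minutes of talk time per partner in one academic-community meeting (hypothetical data)
talk_time = {"academic PI": 35, "community org A": 10, "community org B": 8, "patient rep": 7}

for partner, share in shares(talk_time).items():
    print(f"{partner}: {share:.0%} of talk time")
print(f"Inequality (Gini): {gini(list(talk_time.values())):.2f}")
```

Tracking a simple summary like this over successive meetings, alongside the more qualitative indicators above, is one possible way for a partnership to report transparently on whether it is ‘walking the talk’.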

We invite those interested in this topic of community engagement and measurement to join us on May 20th, 3:00-4:00pm ET for our next Advanced Topics in IS Research webinar, which will focus on Community-Based Participatory Research (CBPR) measures and will be led by Drs. Nina Wallerstein and Bonnie Duran. Registration and details are coming soon. An archive of this session is now available below.

In the meantime, we should be self-reflective and cautious when claiming to be patient-centered, team-based, and community-engaged, especially if we lack consistent feedback or documentation to confirm such representations. We offer the above reflections not as any definitive word, but as our initial thinking, in the hope of stimulating debate and of sharing practical measures used by partners at multiple levels. Please share your thoughts here: Are we already "walking our talk"? Where do you see the field needing to go? How are you measuring levels of partnership and engagement?

Best,

Russell E. Glasgow, PhD and Kurt C. Stange, MD, PhD


Posts/Comments

Such a challenge.

We have a center of excellence grant from AHRQ to conduct community-based research on using technology to keep me (and other older adults) out of nursing homes. We are working with three counties in Wisconsin: urban, rural, and suburban. By year five we need to be in another 20 counties, widely disseminating technologies to thousands of older adults and family caregivers.

We began almost two years ago using community organizing approaches to identify assets and challenges in the counties. Literally hundreds of volunteers have interviewed elders and caregivers. We have held Celebrations where mayors, police chiefs, and other community leaders have participated. We have steering committees and advisory committees. We have worked on establishing relationships and identifying existing ties and conflicts.

I agree with the points made above. These are important. But something inside me says they only scratch the surface. What I am about to say is a very simple example, so I don't want to suggest this is all there is.

Time (for instance) is such a significant consideration in so many ways. People start out very committed, but they retire (or, in the aging context, die) or other priorities get in the way. In quality improvement work I emphasize how important it is to move fast so that momentum is not lost. We do that in this project as well, but I must recognize that things get in the way; that reputations are very important and they are on the line. Some people (understandably) feel that one can't act until all the ducks are in a row, and that trying something new without being sure ahead of time that it will work is too risky, even though the research is being done to answer many of those questions.

We also need to keep "our eyes on the prize." For instance, we are being encouraged to disseminate our technologies through senior centers. But in a sense that is the easy way out. What we really want to do is to reach the isolated elderly: people who have never set foot in a senior center and probably never will. Sure, we can go to senior centers and probably look good with numbers and supposed impact. But we gotta remember that we set out to serve someone different from that.

I am starting my own political action committee and have decided to enter politics to deal with these challenges (just kidding). To do this well is a very difficult but very exciting challenge. We really need to dig very deeply to come up with useful predictive models or criteria. I am glad we are starting, but we have a long haul ahead. I hate saying this (because patience is not my strong suit), but it will take time to really think this through well.

dave

@Dave - Thanks so much for your thoughtful, multifaceted reply and reflections from your experiences putting these principles into community practice.

I agree with almost all of these comments (including patience not being my strong suit either). I think raising the issue of needing to move relatively rapidly, and of providing frequent feedback to the community and stakeholders on progress, is an important element in sustaining community engagement.

As you know, our review, approval, funding, and other mechanisms make it very challenging to move rapidly. Still, as folks from your parent discipline of engineering and the ‘Berwickians’ have taught us, I think the key to progress is often not to ‘try to get it perfect the first time’ (and never get anything done) but to ‘fail fast and informatively’.

Also, I thought for a moment you were serious about the political action committee... Do you remember Mitch Greenlick of KPNW, who did exactly that after two other successful careers in research and research administration? He recognized that policy sets the context for all of this work, not to mention the allocation of resources.

Again, I appreciate your reflections and lessons learned, and encourage others to also share their perspectives and experiences as well as reflect on what Dave and I have shared here.

In order for our research to have the biggest impact, we must engage community partners and stakeholders in the process. The May 20th NCI Implementation Science webinar explored issues of community engagement and partnership and how we can improve our assessment of these principles. We were joined by Dr. Bonnie Duran and Dr. Nina Wallerstein, who presented their recent work on CBPR measurement and assessment and explored the implications and opportunities for dissemination and implementation science.

Now we would like to hear from you all.  How do you measure community engagement and partnership in your own research? What opportunities do you see in this field or in what direction should the field be moving? What research have you been conducting in this area? Share your thoughts and/or your research/experience with implementation science and engagement.

If you would like to review the measures and surveys that Nina and Bonnie mentioned on the call, there are several places where the information is located:

You can also review recent articles by the research team including:

Missed the live session?  Don't fret, a recording of the session is available below. Watch the recording and then share your thoughts or questions.

Found the discussion of "trust" interesting. For years we have talked about "paper trust" in our partnerships. This is where partners cannot trust each other on some issue, so a memorandum of agreement or contract is drawn up to put the trust agreement into writing. I think this fits with the process of the evolution of trust in partnerships.

With research in these areas of high variability, there is a limit to how far one can go with linear research methods. Context changes so rapidly and drastically that generalization is not realistic. Models and frameworks seem to represent this limit. Science can give us models (they used to be called theories), and then it is up to the artistic talents of the practitioners to adapt, guide, and facilitate a successful process. Perhaps as we better understand quantum theory we will better understand change under conditions of apparent chaos.

Maybe you could put together a webinar on systems and chaos theories as they apply to translational research?

One other important thing that communities have taught me: they know what the problems are. We do not need to spend inordinate amounts of time documenting the problems. We just need a big picture. We need to work more on building the solutions. Best quote of this year: "Ambition is fired by possibility, not by deprivation, as a tour through the world’s poorest regions makes clear." (David Brooks, New York Times Op-Ed, September 17, 2012)

Can you direct research in this direction? We waste so much time telling the community what they already know, and this probably decreases self-esteem and self-efficacy, which is not helping.

Peyton asks: "What

Peyton asks: "What opportunities do you see in this field or in what direction should the field be moving?" 

We need to invest in the infrastructure and support needed in communities to conduct research and engage academic partners in authentic and equitable partnerships. 

With funding support from NIH, community-based organizations (CBOs) involved in research from across the U.S. have convened two years in a row and have formed the Community Network for Research Equity & Impact. Over 250 CBOs are involved in the network, most of which are partners in federally funded research. Its mission is to ensure that communities have a significant voice in decisions about research practice and policy, are true partners in research, and fully benefit from the knowledge gained through research. The network has issued a report (see http://bit.ly/Z29oj1), is holding a series of coaching conference calls (see http://bit.ly/13CFpAk), and will be gathering next in Chicago on April 29-30, 2014, immediately prior to Community-Campus Partnerships for Health's conference, which runs April 30-May 3, 2014 (see http://depts.washington.edu/ccph/conf14-overview.html). The network meeting in Chicago will include focused opportunities for community partners from Clinical and Translational Science Awards (CTSAs) to meet, as well as for community-based organizations that operate IRBs or other types of research review mechanisms. I hope researchers reading this will share this information with their community partners, and that community partners reading this will get involved by signing up at http://bit.ly/XhMuY6

Another direction the field is moving in is recognizing that journal articles aren't the only way to disseminate research findings and tools. Increasingly, community-based research partnerships are generating products that communities need and want: policy reports, videos, digital stories, assessment tools, training manuals, and so forth. CES4Health was set up in 2009 to peer-review and publish these sorts of products, and over 50 have been published to date, including the CBPR matrix authored by webinar presenters Nina Wallerstein and Bonnie Duran! See http://ces4health.info/find-products/view-product.aspx?code=FWYC2L2T and http://CES4Health.info for details. If you're reading this and have products of community-based research in development, free technical assistance is available to think through how to prepare a submission to CES4Health. Just contact CES4Health Fellow Marlynn May at may@CES4Health.info

@John, thank you for your interesting and thought-provoking comments and suggestions. I have shared these comments with the speakers, and hopefully they will also be able to share their own thoughts and reactions. While I can't say we have done a webinar on chaos theories, we have previously hosted a cyber-seminar on systems thinking -- you can access the archive here. You might also find this recent article on rapid research by Riley et al. of interest and relevance. I think there are certainly people who, while acknowledging the complexity of working and conducting research in an ever-changing context, would argue that there are methods and approaches to research design that can help alleviate some of those issues. This includes using rapid and recursive designs, as shared in the Riley et al. article. This is certainly a direction that we at the NCI Implementation Science Team are interested in exploring further, and it will actually be the topic of our next webinar in June.

@Susan, thank you for sharing the resources your organization has and your thoughts on the directions the field should be heading.

I welcome Bonnie, Nina, and Russ (and others) to share their thoughts on these topics as well.

 

 

For those who missed the live session, here is the archive. We hope you will view it and then engage in the discussion and share your thoughts and comments on R2R.

I think this is great work.  The model you have put up makes sense.  The objectives of measurement are laudable.

What I need is advice on what to do about issues as they arise. A structured model is OK, but when I am under the gun, I need advice and answers quickly. I don't have time to go through extensive surveying. I realize one could say that you need to do so in order to properly diagnose the situation. Maybe so. But I think more often than not the problem is rather obvious. The issue is what to do about it and how to do it.

dave

Hi Dave: I appreciate your thinking that the model makes sense, and also the need to act on these issues without waiting for extensive assessment tools.

Our team has also developed a focus group guide for using the model as a reflection device at critical times or for annual or periodic reviews: http://hsc.unm.edu/SOM/fcm/cpr/docs/CBPRmodel-FGguide041612.pdf  It is located on our University of New Mexico Center for Participatory Research website: http://hsc.unm.edu/SOM/fcm/cpr/research.shtm

A partnership would choose an issue from the model (e.g., trust), or, because the model is also dynamic, partners can add their own constructs to the oval categories. They would then ask a set of assessment questions, e.g., at the beginning of your partnership, how would you describe your level of trust? Where are you now? And then ask a future-oriented question: where would you like to be, and how can all the partners, working together, get there?

This last question is the one about "best or promising" practices. Our national study team is in the process of analyzing our data, and we are coming up with a set of best practices, supported by the internet survey data and complementary qualitative data, that we will be disseminating. All partnerships, of course, have to find their own practices that work for them, but hopefully some of our results will be illuminating for others.

Thanks again for your interest in this work.

 

Nina Wallerstein