
Researchers find a ‘people-centric’ approach to surveys yields better data on diverse communities

As part of a study of Toronto's suburban residents, U of T Scarborough graduate students created a handbook that details how they achieved a higher-than-average response rate for their survey by using a community-oriented approach (supplied photo)

A recent survey by University of Toronto Scarborough students not only gleaned important information from hundreds of households across Toronto, it also provided critical insights into surveying diverse communities that could help other researchers boost participation in future projects.

The researchers involved in the Community Voices study – an effort to learn what residents of Toronto’s inner suburbs value most in their neighbourhoods and how they view the policy-makers who influence them – created a handbook that details how to bring the voices of those being surveyed into the design of the surveys themselves.

The resource also explains how to strike a balance between science and cultural sensitivity.

“We’ve documented so many of these processes and results in this handbook,” says Umair Majid, a PhD candidate and co-author of the study and handbook. “My hope is other organizations get on this journey and continue designing people-centric surveys.”

With about 750 respondents, the survey’s results were the backbone of Community Voices. The handbook explains how the team created a survey that was both scientifically significant and culturally appropriate for Toronto’s diverse suburban neighbourhoods.

About 40 per cent of residents who were approached completed the survey, exceeding the average response rate.

Extensive community feedback crucial in survey design

Two groups were created to continuously advise the survey’s design and execution: one comprised leaders from community organizations in the studied neighbourhoods, while the other included researchers and professional survey designers.


“The community advisory group offered their perspective on community priorities and guided us on language inclusivity, accessibility and clarity,” says Yvonne Daoleuxay, a PhD candidate and one of the study’s co-authors. “The technical advisory board made sure the survey design was methodologically sound and that the data collected would be useful for other researchers.”

It was tempting to use questions from national surveys, which have been thoroughly tested and make it easier to compare results across studies. But Daoleuxay says these questions often include several detailed categories of European ancestry and Christianity, rather than reflecting Toronto’s diversity.

Researchers instead worked backwards from those standardized questions and tweaked them based on feedback from the community advisory group. They then tested the questions with focus groups made up of residents, which sent them back to the drawing board (and to their advisory boards) on some sections.

“We heard very legitimate, genuine concerns that all too often researchers come into communities, extract data, write papers and never engage with communities again. And we want to make sure that the study wasn't going to be another one of those,” Daoleuxay says. 

The handbook includes multiple checklists for organizations conducting surveys, emphasizing input from communities, researchers and local organizations (supplied photo)

Meanwhile, 20 undergraduate students from diverse backgrounds were recruited as surveyors and underwent weeks of training. Many had grown up around the communities they were surveying.

“The diversity of the team proved to be beneficial – they could interact with lots of people with different backgrounds. The connections were often immediate, especially with shared ethnicity or language,” says Majid.

Updating approach helps survey evolve

The surveys had to be engaging enough to keep participants’ interest for about 20 minutes while remaining accessible to those who spoke different languages or had limited experience with technology. Throughout the six months of surveying, feedback from the surveyors was also incorporated into the team’s approach.

In the city’s eastern neighbourhoods, the team initially mailed out flyers with links to an online version of the survey, whereas in neighbourhoods to the west the flyers were delivered in person. The researchers found more engagement in the west end, which they attribute partly to interpersonal connections. Surveyors then went door-to-door with tablets loaded with the survey and carried slips linking to the online version.

During each of the surveyors’ shifts, about a quarter of residents said they did not want to complete the survey. However, the team found that in about half of these rejections, residents were still willing to do the survey, just not at that moment, which the surveyors navigated by offering the online version. At the end of their shifts, surveyors revisited houses where no one had answered the door, which doubled the number of responses.

Researchers also continued to draw on their advisory groups throughout the study. When the landlord of an apartment building threatened to call the police on two surveyors for trespassing, the team learned that a member of the community advisory board had an existing relationship with the building and was able to help them gain entry.

“This kind of work takes a lot of thinking,” says Majid, who co-ordinated the survey. “This handbook is a good foundation for any group or organization to do something like this, but they’ll have to put that time in.”
