You just scored 51 points to your opponent’s 49. “If you’re playing basketball, that’s a win,” says Chris Findlay, chair and CEO of Compusense Inc. in Guelph. But if you’re tabulating scores from a consumer testing panel for various wines or salsas, 51-49 is hardly “game over.”
Statistical uncertainty. Subjective preferences. Inability to replicate results perfectly. No wonder they call Findlay’s business “sensory and consumer science.” And small wonder that his company has turned to University of Guelph statisticians to help make sense of its consumer testing data.
From food science to genomics, research data are bursting out of labs all over, says Prof. Paul McNicholas, Mathematics and Statistics. That explosion is driving higher demand for statisticians, particularly experts in his data-rich field of computational statistics.
This spring, he received a $125,000 federal grant to devise new tools to help design sensory studies and interpret research data. He will continue to work with the Guelph company, but he expects that his work will also help in designing and analyzing studies in other fields on and off campus.
His grant was part of a total of $14 million in new funding for Guelph researchers announced in May by the Natural Sciences and Engineering Research Council (NSERC).
Most of us know that food and beverage makers run product tests before releasing their goods to market. But fewer people realize how much complex statistical analysis it takes to make sense of those test results, says McNicholas.
That’s especially true for “high-fatigue” products such as hard-to-differentiate wines and beers, intensely flavoured or seasoned items (think chocolate, coffee or curry), or even white breads. A connoisseur might distinguish one choice from the next among a dozen or more products. But for the average panellist, taste buds and noses soon wear out.
“Suppose you are working with 12 red wines. You can’t get data from one person on all 12 products,” says McNicholas, holder of the inaugural University Research Chair in Computational Statistics. “Each person might do four instead of 12. It’s a difficult data analysis problem. You want dissimilar products beside each other. We’ve been designing incomplete blocks based on how far apart products are and then analyzing the results.”
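The design problem McNicholas describes can be sketched in a few lines of code. This is a hedged illustration, not Compusense's actual method: `incomplete_blocks` hands each panellist a small subset of the products while keeping replication balanced, and `serve_order` orders a block so that consecutive products are as dissimilar as possible, easing palate fatigue. The function names, the greedy balancing rule, and the brute-force ordering are all assumptions made for the sake of a small example.

```python
import itertools
import random

def incomplete_blocks(n_products, block_size, n_panellists, seed=0):
    """Assign each panellist an incomplete block (subset) of products,
    preferring the least-served products so replication stays balanced."""
    rng = random.Random(seed)
    counts = [0] * n_products
    blocks = []
    for _ in range(n_panellists):
        # rank products by how rarely they've been served, breaking ties randomly
        order = sorted(range(n_products), key=lambda p: (counts[p], rng.random()))
        block = order[:block_size]
        for p in block:
            counts[p] += 1
        blocks.append(block)
    return blocks

def serve_order(block, dissim):
    """Order one block so consecutive products are maximally dissimilar,
    by brute force over permutations (fine for blocks of 4)."""
    best, best_score = None, -1.0
    for perm in itertools.permutations(block):
        score = sum(dissim[a][b] for a, b in zip(perm, perm[1:]))
        if score > best_score:
            best, best_score = list(perm), score
    return best
```

With 12 wines, blocks of 4, and 9 panellists, this scheme serves every wine exactly three times; a real design would also balance which *pairs* of wines appear together, which is where the formal theory of balanced incomplete block designs comes in.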
Adds Findlay, “With Paul, we’re looking at techniques to get better targets for product development. With this type of research, you have to give a broad range of products to segment on the basis of differences, usually among 12 to 30 products.”
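The segmentation Findlay mentions, grouping consumers by their patterns of liking across products, could be sketched with a plain k-means clustering. This is a stand-in example only; it is not the technique the researchers use (McNicholas's field of model-based clustering relies on mixture models, which are considerably more sophisticated). Here each row of `scores` is one consumer's liking ratings across the products:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: group consumers whose liking scores are similar
    into k preference segments."""
    rng = random.Random(seed)
    centres = [list(p) for p in rng.sample(points, k)]
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each consumer to the nearest segment centre
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])),
            )
        # move each centre to the mean of its current members
        for c in range(k):
            members = [points[i] for i in range(len(points)) if labels[i] == c]
            if members:
                centres[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels, centres
```

Once consumers fall into segments, each segment's average scores point toward a different "target" product profile, which is roughly what Findlay means by better targets for product development.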
They met while serving together on a PhD advisory committee in the Department of Food Science, where Findlay had earlier completed his own graduate degrees.
Early this month, U of G statisticians and Compusense representatives discussed sensory science at the annual meeting of the Statistical Society of Canada, held at Guelph June 3 to 6.
Says statistics professor Gary Umphrey, who organized the conference along with retired professor Brian Allen: “The world is awash in data, but it takes a lot of expertise to turn that data into useful information.”
Similar challenges lie in other areas of life science, notably health and genomics. That’s good news for students pursuing statistics as a career, says McNicholas. As director of U of G’s bioinformatics master’s programs, he hopes to lure more of Guelph’s life sciences grads – more than 1,000 students a year – to explore the field.
About 50 grad students are enrolled in traditional statistics programs in Mathematics and Statistics.
McNicholas already works with geneticists and human health researchers on campus; this spring, he was looking to explore common ground with a Guelph microbiologist.
Under a separate NSERC grant this year, his department will install a second high-performance computer on campus. Nicknamed “Oscar,” the equipment will join another computer called “Wilde” installed two years ago.
Reflecting his own Irish roots, McNicholas had hoped to name the newcomer “Joyce,” but he was overruled by the department’s IT manager.