Almost as powerful as a product recommendation from Oprah, fitness equipment reviews in several magazines consistently send consumers scurrying to buy the top products. Considering the influence of fitness product evaluations on consumers, retailers and manufacturers, SNEWS checked into how Consumer Reports, Runners World and Prevention magazines — viewed as the most read and most influential in fitness — roll up their sleeves, break a sweat, sharpen a pencil, and dive into assigning judgments.
Consumer Reports: Known for methodical testing
No surprise – the testing veterans at Consumer Reports are viewed as the most objective – from investigating, purchasing and assembling units on their own (the magazine adamantly refuses any donations from manufacturers) to precisely measuring details under the hood such as caloric expenditure and shock absorption. Not to mention fairly intense durability testing – including subjecting treadmills to the equivalent of about one year’s use by a 185-pound runner. And since starting to test treadmills in 1989, CR has expanded to other equipment such as ellipticals and home gyms.
For CR’s tread review, three men and three women were methodically guided through the paces on 11 treadmills over three months, and each had to use all units before recording any opinions. Reviewers evaluate ease of use, ergonomics (handrail placement and how well the treadmill fits people of different sizes – heck, Consumer Reports even consults anthropometric charts to determine what best fits most people!), exercise (incline, speed, stability, heart rate monitoring) and construction (continuous-duty horsepower rating and weld quality). After formal testing, treads are moved to the company’s fitness center, where anecdotal comments from employees also are considered.
Data is scored based on the project engineer’s specific protocol and even reviewed by a statistician; scores then are translated into the purely qualitative “excellent” to “poor” rankings seen in the magazine.
“It’s hard to argue with our results because in many cases we are just giving data,” said John Galeotafiore, the director of testing for recreation, power and auto equipment and a 15-year veteran of Consumer Reports. “Everything we test goes through a rigorous procedure, and we are very objective. To maintain our credibility, we want people to have the same experience we have.”
Runners World: Considers “runability” and gut feelings
Runners World concedes that although it doesn’t conduct the same nuts-and-bolts testing as Consumer Reports, it has more realistic expertise for tread reviews focused on “runability.” About 15 testers (RW editors and area running club members) literally hop down the line of 12 treads side-by-side punching in a pre-selected series of running paces and inclines and using a 1-10 rating system for cushioning, stability, controls (user-friendliness) and display (console visibility, layout and readout relevance). An overall rating is based on individual evaluations according to how well each treadmill met that specific tester’s needs.
“The strength of our review is in measuring the actual running experience – how does this group of real runners respond viscerally to running on these treadmills,” says Dave Sellers, new product editor of Runners World.
Acknowledging its mainly subjective process, Sellers makes the point that although some readers felt that the rating system connotes winners and losers, the articles try to explain that every tread reviewed has something to offer people with different preferences.
“We educate consumers on models, but we think it is really important for them to try out different treadmills themselves,” Sellers says. “It’s like buying a car – theoretically it might have all the features that you like, but then when you actually sit in it and drive it, you discover something isn’t quite right. There are just some intangibles that are hard to quantify, but still are important to the user.”
Prevention: Less defined protocol
As for Prevention, a sister publication of Runners World also published by Rodale Inc. and a relative newcomer in the fitness equipment testing business, it doesn’t test equipment on any particular schedule and instead simply solicits donations of the newest models within its price range. For the most recent treadmill review, this happened to be $800-$1,500 because “Prevention readers are looking for a good value without spending a fortune,” according to fitness editor Michele Stanten.
Testing protocol seems more undefined – even downright secretive. Prevention told SNEWS that 13 staff members (mostly walkers and some novice treadmill users) tested seven treadmills by “putting themselves in their readers’ shoes with consumer-type reviews.” Although we probed for more specifics, curiously we got no details. Because the brief article only listed a handful of qualitative remarks for the four recommended models (two from Smooth and two Icon models) under headers such as “Why We Loved It” and “Drawbacks,” we really can’t tell what – if anything – was actually measured.
And the reactions from manufacturers?
So what do fitness equipment manufacturers make of all this? Naturally, winners are thrilled.
“Reviews like this are amazing in terms of what they mean for marketing, and they have a huge impact on sales,” said Todd Nickodym, director of marketing for True Fitness, which has held the top spot in Runners World’s past two treadmill reviews. And you can be sure that True takes advantage of these kudos with its dealers through hangtags, counter cards and article reprints. Heck, even its web site has a link to Runners World.
Those earning less-than-stellar marks are a bit more guarded, but what else was expected? Just sour grapes, you say? Not so fast. Runners World ranked Precor’s $3,500 M9.33 third out of 12, but dinged the unit for its sparse display readouts. Precor’s PR manager Jim Zahniser says hold on now: He was frustrated by the ambiguity surrounding what model to submit.
“We could have come in with our highest price point and gotten a much better review, but the magazine directed us to stay close to $3,000,” Zahniser said, which meant fewer display features. “When the review ultimately came out, Runners World tested higher price points anyway. It seems like the process changed mid-stream.”
Smooth Fitness president Joe Alter agreed that Runners World’s system seemed to be initially devised with good intentions but became a bit of a hodgepodge.
“We got great comments about our acceleration, and displays, but we ended up low overall,” said Alter of Smooth’s 9.3HR model. “It didn’t seem like some of these positives played into the final score.”
Also, although Runners World seems to imply exhaustive testing, we were told Precor’s unit had only 30 miles on it when it was returned. Sellers admitted that the process for any one tester might have been relatively brief. Reviewers were not babysat or required to perform structured intervals or even long runs, although SNEWS noted the story at www.runnersworld.com states that reviewers “pounded out the miles on them.” The story also said, “We gave these treadmills the kind of road test it would take you months to dish out.” Not sure that adds up to only 30 miles.
Sellers added that the testers were required to go through at least one predetermined test sequence. “They were free to complete it in as much or as little time as they needed to be accurate,” he said. (We heard the magazine had procrastinated on getting the review done, leaving little time for a good look-see, and left grumpy testers who were called in after-hours on a muggy afternoon.)
“Testers may have simply spent less time on some units than others – because it was easier to make a relatively quick judgment in some cases,” Sellers added.
Price point disparity remains a nagging thorn in the side of manufacturers, who understandably prefer that apples be compared to apples. Consumer Reports scrutinized everything from a $500 ProForm unit up to a $3,000 Precor model; the magazine claims that it selects units based on the general population of treads as well as its readers’ demographics. But Consumer Reports is upfront that “the more treadmills cost, the better the materials and construction, as a rule.” Galeotafiore thinks that readers are smart enough to know that a more expensive treadmill is going to be better quality than a cheaper one. Runners World also clearly states that you get what you pay for, and Prevention says, “We’re happy to report that you can get a great treadmill for less than $1,000.”
SNEWS didn’t hear much griping overall about Consumer Reports or Prevention reviews – perhaps because manufacturers are reluctant to disparage the intimidating Consumer Reports for fear of being ignored or slammed in the future. And Prevention doesn’t yet share the same spotlight; maybe because only four treads were mentioned with relatively few negatives, it might be a bit ho-hum.
The bottom line? Ultimately, manufacturers prefer that magazines publish – and stick to – specific criteria for product reviews. “Manufacturers need to know how they are being gauged,” says Zahniser. “Tell us what consumers value and drill down deeper – what exactly constitutes an excellent versus a good rating?”
Alter recognizes that opinions count, but agrees that metrics are critical. “You need some science behind this, and every magazine has to set and follow its own criteria.”
SNEWS View: First and foremost, hats off to the magazines for reviewing fitness equipment, as these articles give greater visibility to the industry overall and potentially can raise the bar for everyone. In fact, years ago, after Consumer Reports indicated concerns about some treads with AC motors turning on at full speed, manufacturers responded. And Consumer Reports seems to have matured from the days of only reviewing the really inexpensive stuff to a nicer mix of prices.
But we agree that manufacturers and readers alike would benefit from clearer criteria and more detailed protocols — and then, of course, from following them exactly through the entire process and with all companies involved. Maybe magazines don’t want to tip their hand here, but being somewhat elusive helps nobody, least of all their readers.
Reviewing fitness equipment may always be a bit of an inaccurate science that by nature must be partly subjective, given differing opinions of the actual experience on a particular piece. Sure, there are lots of things you can measure quantitatively, but essentially isn’t it the feel and user-friendliness of the machine that matters most to exercisers? And doesn’t the “right” feel mean something different to everyone? Numbers are nice but perhaps not necessary in all cases, as most of us can tell which treadmill sounds louder, or which elliptical stride feels better. And a lot of feature attributes really come down to preference, which can be hard to definitively rate for everyone.
Runners World even says, “We are always struck by the degree to which different runners like and dislike different treadmills…so use the ratings as a general guide, but don’t stop there. Check them out to determine what features you like, and buy accordingly.” That, we dare say, is well said.