When you start shopping for a major purchase, how often do you check out Consumer Reports (or have someone suggest you do)? How much weight do you put on their rankings?
Something I've noticed is that for goods purchased by hobbyists/enthusiasts (cars, for example), review sites geared toward them tend to pay less attention to CR's rankings, but claim it's a good source for everything else (appliances being the common example).
I've noticed this phenomenon with consumer electronics, automobiles and now baby stuff.
For consumer electronics (home theater), CR tests a limited number of models, often out-of-date ones, and its methodology is questionable. The usual sticking point is the large gap in opinion between audiophiles and CR's reviews. Yet on those same audiophile sites, the people dismissing CR for A/V have no issue with CR in other areas.
There's also a lot of criticism of CR's methodology for ranking auto reliability. Problems include the sampling technique (a self-selected pool of CR readers), what constitutes a "problem," and a lack of transparency in the survey results. An interesting nugget I came across while researching the Pilot was that CR lumped problems together without any accounting for their seriousness. On top of that, the reliability recommendations tend to really split hairs. Yes, Vehicle A may be 40% more likely to have a problem (as reported by CR readers), but is there a big difference between 2 "problems" and 2.8 "problems" per 100 vehicles when you have no information on the severity of the reported problems, little idea of the statistical significance (is the reporting so low that it's more anecdotal?), and no information about the reporters? The rough sketch below illustrates the significance point. Of course, none of this stopped us from getting two Hondas, but other factors were in play there. And yet the car enthusiasts don't have issues with CR in other categories.
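To make that significance point concrete, here's a back-of-the-envelope sketch. The survey size and problem counts are made up for illustration (CR doesn't publish them, and this plain normal-approximation confidence interval is surely not CR's actual method):

# Hypothetical numbers, not CR's actual survey data: how noisy a
# "problems per 100 vehicles" estimate is at a plausible sample size.
import math

def rate_ci(problems, surveyed, z=1.96):
    """Approximate 95% confidence interval for a problem rate, scaled to per-100 vehicles."""
    p = problems / surveyed
    se = math.sqrt(p * (1 - p) / surveyed)  # normal-approximation standard error
    return (100 * (p - z * se), 100 * (p + z * se))

# Vehicle A: 20 problems among 1,000 owners surveyed -> 2.0 per 100
# Vehicle B: 28 problems among 1,000 owners surveyed -> 2.8 per 100
print(rate_ci(20, 1000))  # roughly (1.1, 2.9) per 100
print(rate_ci(28, 1000))  # roughly (1.8, 3.8) per 100

The two intervals overlap almost entirely, so at sample sizes like these a 40% relative gap could easily be noise. Which is exactly the hair-splitting problem.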
So now the baby issue. Back in January, CR released a report that of 12 tested infant car seats, only 2 passed their crash tests: frontal at 38 mph and side at 35 mph (the government standard is 30 mph). This was, as you might imagine, very alarming. Two weeks later, CR recalled the report. Why? The methodology was way, way off. The researchers attached car seats to a sled and simulated a crash at the aforementioned speeds. The only problem is that in a car crash at 35 mph, a lot of the force is absorbed by the car before being transferred to the car seat. CR's 35/38 mph sled tests were roughly equivalent to collisions at over 70 mph. That's a fairly hefty difference, and a rather damning indictment of the testing methodology. In the aftermath, there's still some support for CR's original testing (kudos to the Graco Snugride and BabyTrend FlexLoc for passing the tougher standard) along the lines of "we go over 70 on the freeway," which overlooks that very few collisions happen at that impact speed, because, you know, people have brakes, and there's the relative speed of cars going in the same direction. CR, with less fanfare, updated its report at the end of August.
So this isn't meant to diminish CR, but as a brand name, it's granted a lot of authority because of its noble goals, lack of advertisements, etc. Does it always deserve it? Well, maybe when you're looking for a fridge...
Thursday, September 27, 2007