Last week saw the release of a new report on Canadian broadband that found things to be, well, not as bad as everyone thinks they are. The report, produced by Montreal-based telecommunications consultancy Lemay-Yates Associates, found that “Canadians enjoy among the fastest, most widely available, and least expensive broadband internet in the world,” according to the press release that accompanied it.
The release came from the report’s sponsor, Rogers Communications, which is the largest cable company and one of the biggest broadband providers in the country. Many people wrote the report off given its origin and the fact that it was at odds with virtually every other study produced by neutral third parties. The dismissive attitude was understandable – and warranted.
Perhaps the biggest issue with the Lemay-Yates study is its pricing analysis. It found that the average cost of Canadian broadband is $30.79 U.S., or second cheapest in the G7. Speeds above 10 megabits per second with Rogers, meanwhile, average out at about $34.27, which is 10% cheaper than the Canadian average and enough to score “an impressive first in G7 countries and third place among OECD countries.”
Of course, anyone who believes those numbers has probably also invested in swampland in Florida. A quick glance at Rogers’ website shows that even the slowest three-megabit connection can’t be had for that price. Proper broadband services above 10 megabits, meanwhile, start at $46.99 (or $47.25 U.S., for anyone keeping track). And don’t forget that modem rental fee, plus any overages you might incur for exceeding your usage cap.
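The gap between the study's claimed average and the advertised price is easy to put a number on. A minimal sketch, using the roughly at-par exchange rate implied by the article's own figures ($46.99 CAD ≈ $47.25 U.S.) — the rate itself is an inference, not a figure from the report:

```python
# Implied exchange rate from the article's own conversion
# (46.99 CAD ~= 47.25 USD); an assumption, not a quoted rate.
USD_PER_CAD = 47.25 / 46.99

advertised_cad = 46.99   # cheapest Rogers plan above 10 Mbps, per the article
advertised_usd = advertised_cad * USD_PER_CAD

claimed_avg_usd = 34.27  # the study's claimed Rogers average above 10 Mbps

gap = advertised_usd - claimed_avg_usd
print(f"Advertised: ${advertised_usd:.2f} US vs. claimed ${claimed_avg_usd:.2f} US")
print(f"Gap: ${gap:.2f} US ({gap / claimed_avg_usd:.0%} above the claimed average)")
```

Even before modem rental and overage fees, the cheapest advertised plan comes out roughly 38 per cent above the figure the study presents as the average.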
The pricing discrepancies are explained through footnotes and a subsequent commentary (PDF). In a nutshell, the numbers have been massaged by a number of factors, including the application of foreign exchange differences and, bewilderingly, discounts and promotions. The inclusion of those last two is surprising because, while such data may be relatively easy to gather in Canada, it is difficult if not impossible to accurately measure in other countries. It’s highly unlikely, after all, that the authors scanned bus shelter ads in Turkey and billboards in Switzerland for local promotions.
Similar tinkering is also evident with speed estimates. While many reports accept speed test results compiled by Ookla – which ranks Canada 33rd globally in downloads and 65th in uploads – the Lemay-Yates study weights speeds by the market share of internet providers in coming to its conclusion. The logic is that other countries may score faster speeds, but do they really matter if they’re coming from smaller ISPs that don’t service many customers?
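To see how market-share weighting changes the picture, here is a minimal sketch with entirely hypothetical ISP figures (neither the study's data nor real market shares):

```python
# Hypothetical market: (download speed in Mbps, national market share).
# These numbers are invented for illustration only.
isps = [
    (25.0, 0.40),  # large incumbent
    (18.0, 0.35),  # large cable provider
    (50.0, 0.05),  # small, fast independent ISP
    (12.0, 0.20),  # rural and other providers
]

# A plain average treats every ISP equally, regardless of size.
simple_avg = sum(speed for speed, _ in isps) / len(isps)

# A market-share-weighted average discounts small ISPs, however fast.
weighted_avg = sum(speed * share for speed, share in isps)

print(f"Simple average:   {simple_avg:.2f} Mbps")
print(f"Weighted average: {weighted_avg:.2f} Mbps")
```

In this toy market the fast independent ISP lifts the simple average but barely moves the weighted one, which is exactly why the choice of weighting scheme can swing a country's ranking.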
In fact, there are problems with judging speeds by market share, the most obvious being that not all countries are served by just a handful of large providers, as Canada is. The UK, for example, is flush with ISPs – and many that receive top marks from customers are nowhere near the biggest.
The authors also rationalized their findings by pointing out that they’re similar to what the CRTC had in its Communications Monitoring Report last summer. As I wrote at the time, the regulator’s report was compiled using information supplied by the large service providers themselves, and was therefore prone to the same sort of methodological massaging. (Amazingly, the words “upstream” and “upload” are also absent from the Lemay-Yates report.)
Telecom studies are, for various reasons, more art than science. Any scientist will attest that bias always plays a role in experiments and studies; the challenge is to put it aside and let the data speak for itself. Some experiments can be performed in only one manner, in which case it’s difficult to produce results that can be interpreted in different ways. It’s a nice built-in bias filter.
However, that’s not the case when it comes to studying things such as broadband and wireless performance across countries. Because there are so many variables and so many different methodologies that can be used, any number of different results can be arrived at. When the numbers are that malleable, bias takes on much greater weight.
Another strong check on scientific bias is peer review. Scientists publish their work in neutral periodicals such as Science or Nature for their peers to critique and verify. There are no similar publications for this sort of telecom report that I’m aware of and indeed, supposedly competing consultancies are more likely to agree with or defend such studies than they are to critique them – unless of course someone hires them to.
With this particular study, it looks like Lemay-Yates set out on a specific mission – to disprove numbers presented by the Organization for Economic Co-operation and Development last spring, which found Canada to be further sinking in many broadband measures. As the old saying goes, when you go looking for trouble, you will probably find it.
For those who aren’t familiar with it, the OECD is a Paris-based think tank that is funded by the governments of its 34 member countries. For the past 50 years, the organization’s mission has been to “promote policies that will improve the economic and social well-being of people around the world.” It does this by studying a wide array of topics, from birth rates to pension funding to telecommunications.
Industry lobbyists and public relations people routinely dispute OECD studies, because they often find Canada to be wanting in telecom. The organization is attacked not just for its methodologies, but also because it supposedly has a Euro-centric view or bias that doesn’t properly take into account North American realities (U.S. lobbyists typically hate the OECD too).
The reality is, while its methodologies and numbers may sometimes be flawed, the OECD’s neutrality can’t be disputed – unless it has been secretly infiltrated by some sort of Estonian conspiracy – and its findings can at least be believed to be altruistic. That’s why most regular people trust the organization’s studies over those of telecom consultancies.
In the end, studying numbers like these is more like looking at a painting than conducting a scientific experiment. It really all depends on what’s in the eye of the beholder.