If we are going to survive this century, we need to move away from decision-based evidence-making and truly make evidence-based decisions. Public access to publicly funded science would help ensure that the government relies on the facts, not on ideology. Science is too important to democracy to be kept in a government vault. -Elizabeth May
In this election, the most important issue facing Canada is not the economy, it is not deficits, and it is not conservative vs. progressive values.
This election, the most important issue facing Canadian voters is data. A government must govern the country as it is, not the country as it wishes it were. In other words, governments need to create policy based on the real world and not on ideology.
Without good data, all policies inevitably become based on ideology instead of reality, and that is a recipe for disaster. Policies not based on reality are doomed to fail, and, depressingly, without good data we might be unable to see and understand those failures until they become undeniable and smack us in the face.
It must be tempting as Prime Minister to ignore or quash inconvenient data that contradict your world view. But as Richard Feynman said: "Science is what we do to keep us from lying to ourselves." Similarly, a government that doesn't respect and demand good data is lying to itself and to us.
Unfortunately, the Harper Conservatives, who have been in power since 2006, have neither recognized nor respected the vital role data play in our democracy. In this regard, the Harper Conservatives have been lying to us and are leading us towards an inevitable disaster.
The Harper Conservatives' disrespect for data has been so troublesome that the world's leading scientific journal, Nature, has written several editorials on the subject:
Since Prime Minister Stephen Harper’s Conservative Party won power in 2006, there has been a gradual tightening of media protocols for federal scientists and other government workers. Researchers who once would have felt comfortable responding freely and promptly to journalists are now required to direct inquiries to a media-relations office, which demands written questions in advance, and might not permit scientists to speak. Canadian journalists have documented several instances in which prominent researchers have been prevented from discussing published, peer-reviewed literature. Policy directives and e-mails obtained from the government through freedom of information reveal a confused and Byzantine approach to the press, prioritizing message control and showing little understanding of the importance of the free flow of scientific knowledge.
Since that editorial in Nature, things have only gotten worse. We have gone from muzzling scientists, thus preventing them from sharing their data with the public and responding to misunderstandings and misrepresentations of their work, to preventing data collection, and even to destroying data that have already been collected.
World-renowned research stations, such as the Experimental Lakes Area in northwestern Ontario and the Polar Environment Atmospheric Research Laboratory (PEARL) in Eureka, Nunavut, are being shut down. And while PEARL and the Experimental Lakes Area might be the most famous closures, they are far from alone:
Tax-funded environmental monitoring, conservation and protection has been debilitated with the closure of 200 scientific research institutions, many of which monitored food safety and environmental contaminants. Some were internationally famous. The Polar Environment Atmospheric Research Laboratory in Nunavut, which played a key role in discovering a huge hole in the ozone layer over the Arctic, closed in 2012. Also shuttered was a brand-new climate-controlled facility at the St. Andrews Biological Station in New Brunswick. The original station provided writer Rachel Carson with documentation of DDT killing salmon in local rivers reported in her 1960 book Silent Spring, credited with giving rise to environmentalism.
The data gap naturally affects policy. “How can Environment Canada know how pollution from the oil sands has changed over the last 30 years, if they don’t have access to baseline reports?” asks [Physicist Raymond] Hoff, who reports that former Environment Canada colleagues call him for reports they can no longer access internally. Fisheries scientist Jeffrey Hutchings, a professor at Dalhousie University, says he can’t find studies on cod stocks dating to the 19th century that he referenced two decades ago at the now-closed St. John’s library, which had profound implications for cod management. “The work I was able to do then couldn’t be done now.”
The National Research Council (NRC) is no longer doing basic science and has instead become little more than a consulting firm for Canadian business, a move that drew sharp criticism from scientists around the world. The move was so problematic that even Andrew Coyne, editor at the right-wing National Post, criticized it:
Basic science, the kind of blue-sky research with no immediate commercial application, is an example of something the market cannot do, or not at a level that is optimal for society. Not only is there little obvious incentive for a private firm to spend money on research that does not pay off in new products or better processes, but so far as such research can be adapted to commercial uses it could as well benefit its competitors as itself: so the sharing of research that is a critical part of scientific progress is discouraged.
Hence it is well-established economic principle that basic research is the sort of thing governments should fund. By the same token, however, government should not be in the business of funding applied research, that is research directed to commercial uses. Not only is this unnecessary — business can perfectly well fund this sort of thing on its own — but it inevitably tilts the pitch in favour of certain activities over others: some technologies, innovations, products, firms and industries will be funded, at the expense of the rest.
Even basic data that would normally be gathered by StatsCan in the census are no more. When the Conservatives abolished the mandatory long-form census, many communities were rendered statistical ghost towns, as Anne Kingston detailed in her excellent article in Maclean's last month:
When told that his small Prairie town had, in profound ways, fallen off the statistical map of Canada, Walter Streelasky, mayor of Melville, Sask., is incredulous. Streelasky had no idea Melville had been rendered a “statistical ghost town” after the mandatory long-form census was cut in 2010, and fewer than 50 per cent of the one third of Melville’s 4,500 residents who got the voluntary National Household Survey that replaced it in 2011 completed the form. Melville still exists—but as a shadow. We know how many people live there, but nothing about them—where they work, their education levels, whether they’re married, single or divorced, how many are immigrants, how many are unemployed, how many live in poverty. Melville’s numbers, then, aren’t factored into Canadian employment numbers or divorce rates or poverty rates. According to Sask Trends Monitor, the high non-response rate in the province resulted in “no socioeconomic statistics about the populations in about one-half of Saskatchewan communities.” Nationally, we’re missing similar data on 20 per cent of StatsCan’s 4,556 “census subdivisions,” making a fifth of Canada’s recognized communities statistical dead zones.
…
How many Canadians live in poverty now, compared to 2011? We don’t know; changes in income-data collection has made it impossible to track. Austerity measures, ironically, have resulted in an inability to keep track of the changes: StatsCan used to provide detailed, comprehensive data on salaries and employment at all levels of government; now we can’t tell where, or how deep, the cuts have been.
…
Government, too, is operating in the dark, as evidenced last year when StatsCan was unable to provide auditor general Michael Ferguson with job data during the contentious debate over proposed reforms of the Temporary Foreign Worker Program. The Department of Finance was relying on data from the online classified service Kijiji to back its position.
…
The report also noted that much of Canada’s trade data on resources, especially energy, “is inferred, because it is not available on a timely basis.”
Lack of tracking capacity also blinds us to whether there have been improvements in the inadequate housing and overcrowded conditions in the North, exposed in the 2006 census, says Yalnizyan: “We’ll never know whether that is improved or not.”
But the census is far from the only issue; less discussed is the 2012 elimination of four key longitudinal studies, some dating to the 1970s, which tracked health, youth, income and employment. Economist Miles Corak, a professor at University of Ottawa who studies income inequality and poverty, calls this a major informational loss, as well as “money down the drain.” “Longitudinal studies are very expensive,” Corak says, “but their value increases exponentially with time.” He compares the loss to stopping watching a movie halfway: “Only after you follow a group of children for 12 or 15 years, and they’re on the cusp of entering the labour market, do you have the capacity to see how adult success is foreshadowed by their family origins.” Statistics tell a human story, Corak says: “We think of statistics as cold, but they are the real lives of people embedded in bits and bytes. They live and breathe.”
Cutting the Survey of Labour and Income Dynamics, a longitudinal study tracking economic well-being since the mid-’90s, left the country unable to measure changes in income over the longer term, says economist Stephen Gordon, a professor at Université Laval. It was replaced by the Canadian Income Survey, which uses a different methodology; now, old income data can’t be connected with new income data, he says. The upshot? Comprehensive Canadian income-data history currently begins in 2012.
Gordon expresses alarm that 20 years of data history between 1960 and 1980 vanished in 2012 due to changes in the way national accounts, GDP and other data were compiled: “It’s now impossible to have a clear picture of the Canadian economy since the Second World War,” he says. And that’s a huge problem for analysts who need to look at pressing concerns, such as the current oil price crash in context. “You want to look at data about oil prices rising in the ’70s, but you can’t.”
Yet it gets even worse. The Conservative government has started a large effort to digitize vast datasets and printed publications stored at various research libraries across the country. On the surface this sounds like a great idea. We could reduce costs and ensure that everyone, from researchers to students to interested citizens, can access data paid for by our tax dollars from their own computers. No longer would they have to travel to research libraries (which, given Canada's size, could be a considerable distance away).
However, this digitization has, either by design or by accident, resulted in vast swaths of data being destroyed before being digitized:
Physicist Raymond Hoff, who published more than 50 reports on air pollution in transport and toxic chemicals in the Great Lakes—including pioneering work on acid rain—at Environment Canada between 1975 and 1999, doesn’t seem to exist, either. “Nothing comes up when I type my name into the search engine on Environment Canada’s website,” says Hoff, now a professor emeritus at the University of Maryland. Also gone are internal reports on the oil sands experiments of the 1970s. “That research was paid for by the taxpayer. Now, the people who need to protect Canada’s environment can’t get access.”
Since 2012, 16 research libraries have closed, and with those closures countless taxpayer-funded datasets, papers, and reports have been lost forever.
It has gotten bad enough that scientists from around the country, typically more interested in obscure topics than in politics, rallied in Ottawa in 2012 in a mock funeral procession called the Death of Evidence.
After almost a decade in power the Harper Conservatives have decimated Canadian data. They have muzzled our scientists. They have governed without being burdened by data. And we are all worse off for it.
This is what the election is about. Forget the distractions of left vs. right, of the economy, and of deficits. It's not that these issues aren't important; they are. But without data, all we can do is shout our preferred ideologies at each other.
And that is a recipe for disaster. Good government is dependent on good data.
When voting on October 19th, ignore the distractions and vote for a candidate who respects and supports data.