A consultation summary is meaningless without recommendations. Especially if it’s the product of a poorly designed consultation.
Previous Master Innovation and Development Plan liveblog entries and relevant documents available here
A former Library of Parliament colleague once related to me the story of a university professor who, when discussing a federal parliamentary committee report, pointed to the fact that she had been quoted in the report (in the context of her appearing as a witness) as evidence that the government (or at least the committee) was endorsing her views.
As researchers tasked with writing such reports, however, my colleague and I realized she had misunderstood how these reports worked. To wit: A good report should provide an accurate record of what the committee heard. I would further argue that it’s good form to try to ensure that every person and group that takes the time to appear in front of a committee is quoted or mentioned at least once in the report.
However, if you’re interested in actual substance, in what the committee wants to happen, it’s the recommendations that matter. Ideally, these should be contextualized by the evidence heard by the committee, and including testimony both for and against a position allows the committee to legitimize its position. And of course, it’s important that parliamentarians both listen to and hear their constituents, and are seen to do so.
But in terms of policy, in terms of what a committee is actually endorsing, you’ve gotta focus on the recommendations.
All this is my way of explaining why I haven’t written up Waterfront Toronto’s report on the feedback that it received on its rushed and flawed three-week July consultations. Namely, absent anything of substance regarding what Waterfront Toronto is actually going to do, the report doesn’t really tell us much.
In terms of substance, Waterfront Toronto made its move back in June when it published its Note to Reader, laying out its fundamental problems with the MIDP. It then followed that up with its renegotiated deadlines.
Nothing in this latest report does anything to change our understanding of the Sidewalk Labs-Waterfront Toronto dynamics. We’re still watching two organizations working behind the scenes to figure out what they’re going to do.
Flawed process, flawed data
And of course there’s the basic issue that this consultation summary, despite its use of precise percentages and quotes, is the result of a fatally flawed consultation process. As I’ve noted previously, people were given three weeks, in the depths of summer, to comment on a 1,500-page report that was designed not to be read. A report that Waterfront Toronto itself admitted it did not at the time yet fully understand. A report for which the expert Digital Strategy Advisory Panel, which issued its own report a couple of weeks after public consultations closed, could only provide a preliminary analysis due to the MIDP’s complexity and layout.
(For the record, I submitted a brief to this process, in the spirit of civic engagement. I got it in just under the deadline, even though I wouldn’t finish reading the whole document for a few more days. In fairness, at that point I was pretty sure of my opinions about it.)
A flawed, rushed process can’t help but yield flawed data. Given that Waterfront Toronto is keen to get into the data-collection game, one would’ve hoped that it would’ve grasped this kinda important point.
Not that it matters much in this case. Once we have something substantive in front of us, then we can talk.