Canadian AI Policy & Ethics
Jan 18, 2019 ● Craig Daniels
Ethical AI: People Write Algorithms. People Can Fix Them

Humans need to rein in AI, code ethically, and ask hard questions about its purpose

Humans need to rein in artificial intelligence, ask hard questions about AI’s purpose, and motivate the companies and employees who create algorithms to code with non-harmful – read ethical – outcomes in mind, says the co-founder and co-CEO of a Waterloo- and Toronto-based startup that works in the AI space.

“Tech’s posture is to deny its own impact, [saying], ‘We just run a platform.’ But the effects are deep, real and troubling,” said Vubble’s Tessa Sproule on Wednesday, speaking to a lunchtime audience at the Communitech Data Hub during a discussion titled: It’s Time to Talk about Ethics in AI. The event was part of the Data Hub Sessions, a series of talks that probe trends and issues related to data.

“Just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there,” said Sproule.

“The most viral thing tends to be the most extreme. The things that really anger us travel fastest and furthest.”

Sproule, formerly the Director of Digital at CBC, grew increasingly frustrated in 2013 as the broadcaster, like most other legacy media companies, surrendered its products to Facebook and chased an audience with clickbait and ever more sensational presentation and content.

She left the CBC to start Vubble with co-founder Katie MacGuire, also formerly of the CBC. Vubble is a platform that helps broadcasters tag and organize their video archives; a machine learning layer pulls in data about their audience and matches content to viewers.

“I was increasingly worried how, in a post-broadcast media space, we were relying on algorithms, largely run by one company in Silicon Valley, to decide what people would see, what they would care about and what they would engage with and be informed about,” said Sproule.

Wednesday’s discussion explored themes – tech for good – that attendees of Communitech’s inaugural True North conference last May will be familiar with. The follow-up edition of True North, which will once again focus on tech for good, is slated for June 19-20, 2019.

Vubble, Sproule said, has drawn up a nine-point action plan that companies and leaders can use to help ensure AI initiatives generate a net positive ethical outcome. Companies, she says, should develop ethics codes to guide AI decision-making and specify how issues should be handled as they emerge. They should additionally consider hiring ethicists and give them the power to question AI use-cases.

And, she says, it’s time for governments, particularly those in Canada and the U.S., which have not been as strict as those in Europe, to beef up regulation and oversight of big tech players.

“I’m not saying the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. They are reaching farther into our private moments. They are watching us. We need to watch them.

“I’m for regulating specific things, like Internet access, and stronger protections and restrictions on data gathering, retention and use. The better we get at being out in front of the social consequences of AI, the better for all.

“Yes, it’s unusual for a company to ask for government regulation of its products, but at Vubble we believe thoughtful regulation contributes to a healthier ecosystem for consumers and producers alike. We advocate for a ‘technocracy’ approach – the production of technology that doesn’t just feed our business and that of our customers, but that does good and makes society a better place for us all.”


This article originally appeared in Communitech News 
