BC + AI backs Geoffrey Hinton’s call for “regulations with teeth.” We’ll keep building open‑source, community‑driven AI in BC and push governments to pass iron‑clad guardrails that protect people, planet and culture.


1. Why this moment matters

Dr. Geoffrey Hinton just told Canada—again—that Big Tech will fight any “regulations with teeth” and that only massive public pressure will deliver the AI guardrails we need. He’s worried about everything from deep‑fake disinformation to job wipe‑outs and, yes, an existential “oops.” (betakit.com, timescolonist.com)

BC public opinion lines up with him, and our own Dialogue‑on‑Tech Symposium flagged the same gaps: low trust, patchwork policy, and the need for public‑first, ecosystem‑wide action.


2. Draft Position Statement (for press release, blog, council testimony)

Headline:

“Build the Future, But Bolt It Down—BC + AI Calls for Tough, Transparent AI Regulation”

Body (≈220 words):

“British Columbia’s AI movement was born in coffee shops, classrooms and co‑ops—not boardrooms. We love the crazy power of generative tools, but power without guardrails is just another extraction engine.

We echo Dr. Geoffrey Hinton: Canada needs regulations with teeth—now. That means binding rules, open audits and real penalties, co‑designed with Indigenous knowledge‑keepers, workers, artists and technologists.

We ask Ottawa and Victoria to:

  1. Pass and strengthen the Artificial Intelligence and Data Act (AIDA) so every “high‑impact” AI system is safety‑tested and bias‑audited before launch.
  2. Mandate algorithmic transparency for any model used in hiring, housing, credit, health or policing.
  3. Create an AI Environmental Impact Registry—because data centres drink rivers.
  4. Fund community‑run safety labs in BC to stress‑test frontier models in the open.

BC + AI will keep mapping the mycelial network of creators building responsible tech here. But we refuse to carry water for companies that profit off opacity.

We’re rallying our 1,000‑plus members to show up at town halls, write to MPs and prototype open‑source safety tools. Join us—because culture eats algorithms for breakfast, and the next course is legislation.”


3. Key Talking Points for Media & Panels

Each sound‑bite comes with a one‑line expansion:

  1. “Open source doesn’t mean open season.” Support innovation and pre‑launch safety reviews.
  2. “Guardrails > guidelines.” Voluntary codes failed in climate and social media; we need law.
  3. “Nothing about us without us.” Include Indigenous, labour and creative sectors in rule‑making.
  4. “Transparency is the new tax.” If you profit from AI in BC, you publish model cards and energy stats.

4. Policy Asks in Detail

  1. Pre‑market licensing for models >10 B parameters or used in critical infrastructure.
  2. Legally binding bias & safety audits conducted by accredited, publicly listed testers.