And Why Most of Them Are Bullshit


The story of RAP: BC + AI's Responsible AI Professional Certification Program

I've watched companies spend six figures on AI infrastructure while their teams wing it on ethics.

No governance frameworks. No training. No one asking hard questions before deployment. Just vibes and velocity. "Move fast and break things" except what you're breaking is people's privacy, their careers, their trust in institutions that were already hanging by a thread.


And when it blows up, when the algorithm denies someone a loan because it learned historical racism, when the chatbot hallucinates a citation in a legal brief, when the facial recognition fails on darker-skinned faces at 34.7% error rates while hitting 0.8% on lighter-skinned men, they scramble to figure out what "responsible AI" even means.

By then it's too late.


I'm Complicit in This Mess

Before I go further, let me be clear about something.

I'm not standing outside this system throwing rocks. I'm inside it. I've got 1,800 of my photographs in the training datasets that power the tools I use every day. Images I shot for National Geographic, Getty, major publications over 18 years, absorbed into systems that generate competing work without my consent or compensation.

And I use those systems anyway.


Not because I've made peace with it. Because I've decided that keeping both hands full, critique in one, curiosity in the other, is the only honest position. Total resistance means irrelevance. Uncritical adoption means selling out everything I believe about technology's potential to uplift rather than extract.

So I'm building something while remaining pissed about the conditions I'm building in. That's the tension. I'm not pretending it isn't there.

Both Hands Full


The Ethics Training Industrial Complex