The Coalition Strategy
The Mental-Health AI Act requires a broad coalition, and stitching it together will take work. But the urgency of the moment — and the freight-train impact so many of these groups have already experienced from the AI industry — will work to our advantage.
Here’s the playbook that has worked for every major California tech-accountability law in the last decade:
Start with academia + civil society, get them to sign a joint framework letter.
Bring labor in early by framing this as worker-safety oversight—workers get blindsided first.
Secure medical authority next, because mental-health groups give the bill moral and scientific legitimacy.
Fold in parent and youth advocates, which provides emotional resonance and media coverage.
Let agencies know what power they gain, which turns them from neutral to invested.
Deliver a unified coalition packet to your bill author before week one of session.
Feed the press: frame this as California cleaning up the mess social-media companies made when they locked researchers out.
Keep tech companies divided, not unified. (Some will support transparency to differentiate themselves from competitors.)
Aim for early Senate momentum — if it clears Senate Judiciary with bipartisan support, Assembly becomes easier.
Coalition Members
1. Labor Unions
(The people who know what it’s like when opaque tech systems wreak havoc on workers.)
Who matters
SEIU California (healthcare, public-sector workers—frontline view of mental-health harms)
California Labor Federation (umbrella body)
Teamsters (now highly active in AI and algorithmic management debates)
California Teachers Association (CTA) (AI in classrooms, student mental health)
AFSCME (public workers, city/county uses of AI)
Communications Workers of America (CWA) (already litigating algorithmic harms in gig and call-center work)
What message wins them
“Opaque AI systems harm workers first.”
“Independent validation protects workers from psychologically manipulative automation.”
“External researchers = the first real leash on corporate black-box decision systems.”
Proven ways to build support
Run worker panels where gig, retail, and customer-service workers testify on how algorithmic systems affect mental health.
Offer a “worker impact review” amendment so unions feel baked into the oversight structure.
Give unions a formal role in choosing which civil-society orgs become “vetted researchers.”
Present the bill as the next generation of occupational safety rather than an AI bill.
This lands because labor has already been hit by the “algorithmic management” train with zero visibility into how those systems work. They know what happens when researchers can’t get inside the box.
2. Psychological & Medical Associations
(The people who testify with moral authority and data.)
Who matters
California Psychological Association (CPA)
American Psychological Association (APA) (particularly the Technology & Mental Health divisions)
California Psychiatric Association
National Alliance on Mental Illness–California (NAMI CA)
Stanford, UCSF, UCLA psychiatry and psych labs
American Academy of Pediatrics, California Chapter (children + AI is electric in hearings)
What message wins them
“AI will play therapist whether we regulate it or not.”
“We cannot evaluate or mitigate harm without legally guaranteed research access.”
“This law is the IRB for the AI era — a standard the medical field already understands.”
Proven ways to build support
Secure early testimony from clinicians who have treated patients citing AI hallucinations, AI dependency, or self-harm prompts.
Publish a short white paper led by two or three named researchers (UCSF, Stanford, UCLA)—legislators love local credibility.
Frame the bill as evidence-based medicine: these orgs respond to calls for empirical rigor, not outrage.
Emphasize how social-media secrecy has blocked years of mental-health research; clinicians already feel burned by that.
If medical groups sign on, their reputational weight becomes a shield in committee.
3. Civil-Society & Tech Accountability Groups
(The groups that can provide the scaffolding: model bill text, expert testimony, media hooks.)
Who matters
Electronic Frontier Foundation (EFF)
Center for Humane Technology (huge influence on youth mental-health debates)
AI Now Institute
Data & Society
EPIC (Electronic Privacy Information Center)
Common Sense Media (especially crucial for child/teen mental-health angles)
ACLU of California (if privacy and transparency protections are strong enough)
Fight for the Future (grassroots mobilization)
Partnership on AI (for cross-industry legitimacy)
What message wins them
“This law prevents AI from becoming the next Facebook—opaque, unaccountable, un-researchable.”
“External validation is the minimum standard for democratic oversight.”
“Article 40 of the DSA shows the global baseline; California risks falling behind.”
Proven ways to build support
Draft a joint letter signed by 10–12 civil-society orgs before the bill is introduced.
Offer amendments ensuring strong privacy protections for researcher access. EFF and ACLU care a lot about process and guardrails.
Give these orgs a role in the “vetted researcher” pipeline so they feel ownership.
Use them to generate early op-eds and “explainers” for the press—legislators read these.
These groups help define the narrative: California is fixing the research-access problem that broke social media accountability.
4. Parent Groups, Youth Mental-Health Advocates, and Schools
(They are the most emotionally compelling and politically unassailable allies.)
Who matters
California PTA
Common Sense Kids Action
Children Now
Mental Health America of California
County school boards + big district superintendents
What message wins them
“Kids are already using AI the way previous generations used social media—and we don’t want another lost decade of secrecy.”
“This law lets independent scientists detect harmful patterns before they spread.”
Proven ways to build support
Public listening sessions with parents and educators about student AI use.
Case studies of minors receiving disturbing or manipulative AI content.
Partnership with mental-health counselors in schools to show real-world relevance.
Parents create political cover for tougher oversight.
5. State Agencies with Jurisdictional Ambition
(They want to lead; your bill gives them turf.)
Who matters
California Privacy Protection Agency (CPPA)
Department of Managed Health Care (DMHC) and Department of Health Care Services (DHCS)
Department of Education
Attorney General’s Office
California Office of AI Safety (assuming it’s created or empowered)
What message wins them
“This law gives your agency real authority in the most urgent frontier of AI regulation.”
“You get to set the standards for vetted researchers and safety benchmarks.”
Proven ways to build support
Offer explicit rulemaking power.
Create opportunities for cross-agency cooperation (bureaucracies love this).
Promise clear jurisdiction rather than vague “consult” language.
Agencies like clear lines of control; this bill gives them that.
6. Elected Officials in Sacramento
(The people who carry the water. You need the right authors and the right committees.)
Committee Gatekeepers
Senate Judiciary
Assembly Privacy & Consumer Protection
Senate Health
Assembly Labor & Employment
Senate Appropriations (where bills die quietly)
Likely bill champions
Legislators already working on AI, privacy, tech accountability, or youth mental health.
Members from the Bay Area and LA tech corridors.
Lawmakers with education or healthcare backgrounds.
What message wins them
“SB 53 was about catastrophic risk; this bill is about everyday, voter-visible harm.”
“This is the DSA Article 40 moment for California.”
“You get to lead the nation on researcher access after 15 years of social-media obstruction.”
“Tech secrecy is politically toxic now; transparency is a safe, bipartisan stance.”
Proven ways to build support
Offer a clean narrative: The bill fixes the thing voters are angry about — the secrecy.
Secure early committee-level briefings with academic coauthors.
Provide district-specific data (student use, workforce vulnerability, mental-health incidents).
Build early bipartisan optics: mental-health risk is a cross-party issue.
Offer a fiscal note that is modest and easy to explain.