Meta’s Own VP Left $10M on the Table to Testify Against Zuckerberg
The guy who built the ad machine just told a jury exactly how it was designed to hook your kids
11 years inside Meta. $10M+ in unvested stock abandoned. 4 million kids under 13 on Instagram. One man’s testimony could cost Meta billions.
Brian Boland, former VP of partnerships at Meta, took the stand in a California courtroom this week. He spent over a decade building the system that prints Meta’s money. Now he’s telling a jury that same system was designed to be — his words — “absolutely relentless” in chasing engagement from teens. And Zuckerberg’s response when Boland flagged harmful algorithm data? “I hope there’s still things you’re proud of.”

🧩 Dumb Mode Dictionary
| Term | What It Actually Means |
|---|---|
| Bellwether Trial | The first test case. If Meta loses this one, ~1,000 similar lawsuits get a whole lot scarier for them |
| Engagement Metrics | How long you scroll. Meta set internal goals of 40-46 min/day per user. That’s the number they optimize everything around |
| Section 230 | The law that usually protects platforms from liability for user content. These suits dodge it by suing over product design, not content |
| Unvested Stock | Shares you earn over time by staying at a company. Boland walked away before his shares vested — roughly $10M+ left behind |
| Beauty Filters | Instagram AR filters that alter your face. Zuckerberg personally vetoed banning them in 2020 despite safety team objections |
📖 Backstory: The Man Who Built the Machine
Brian Boland joined Meta (then Facebook) in 2009. Over 11 years he held various advertising roles before becoming VP of partnerships.
- Started with what he calls “deep blind faith” in the company
- Gradually came to the “firm belief that competition and power and growth were the things that Mark Zuckerberg cared about most”
- Told Zuckerberg directly he’d seen concerning data showing “harmful outcomes” from algorithms
- Left the company in 2020, walking away from upwards of $10M in unvested stock
- Says he still finds it “nerve-wracking” to speak against Meta: “This is an incredibly powerful company”
He’s not a low-level leaker. He sat in rooms where decisions were made. That’s what makes this different from previous whistleblowers.
📊 The Numbers That Matter
| Stat | Detail |
|---|---|
| 4 million | Kids under 13 using Instagram in the U.S. (internal document shown in court) |
| 11-year-olds | 4x more likely to keep returning to Meta’s apps than older users (2020 internal doc) |
| 40-46 min/day | Instagram’s internal engagement time goals for users (2023-2026) |
| 16 hours | Longest single-day Instagram session by the plaintiff “Kaley” |
| 40+ states | Joined litigation against Meta — one of the largest coordinated legal actions against a tech company |
| ~1,000 | Similar personal injury cases waiting on the outcome of this trial |
| 17 strikes | The number of violations Instagram allowed before suspending sex-trafficking content accounts |
| 8 hours | Length of Zuckerberg’s testimony on the stand |
But here’s the thing nobody mentions: the plaintiff was also a heavy user of TikTok, Snap, and YouTube. TikTok and Snap already settled. Meta and YouTube are the ones fighting it out. The legal question isn’t whether social media is bad for kids — it’s whether the design choices constitute liability. Different question entirely.
⚙️ What Boland Actually Said on the Stand
The key testimony points, stripped of legal theater:
- Algorithms have “immense power” and are “absolutely relentless” in pursuing engagement goals
- “There’s not a moral algorithm, that’s not a thing”
- Meta’s culture of “move fast and break things” meant shipping products without considering potential harms
- When safety issues surfaced, the primary response was to “manage through the press cycle” rather than investigate deeply
- Zuckerberg made priorities clear in all-hands meetings: mobile-first products, staying ahead of competition
Meta’s counter: Zuckerberg said Boland “developed some strong political opinions” toward the end of his tenure. The implication is clear — they’re positioning him as a disgruntled ex-employee, not a credible witness.

🗣️ Zuckerberg's Defense (And Why It's Thin)
Zuckerberg’s 8 hours on the stand boiled down to a few key deflections:
- “If you do something that’s not good for people, maybe they’ll spend more time short term, but they’re not going to use it over time” — a sustainability argument
- Time-on-app metrics, he claimed, are just for “seeing how we’re stacking up in the industry,” not for driving more usage. Internal docs showing 46-minute daily engagement goals suggest otherwise
- On underage users: “some users lie about their age.” Sure. But internal data showed 4 million of them
- On scientific evidence of harm: “I don’t have a college degree in anything” — actual quote under oath
- On beauty filters he personally kept live despite safety team objections: they support “free expression”
The courtroom also had a moment: the judge threatened contempt of court for anyone wearing Meta Ray-Ban AI glasses during testimony. Members of Zuckerberg’s own security team were spotted wearing them walking into the building.
📰 The Bigger Picture: Big Tech's Tobacco Moment
Legal experts are calling this social media’s “Big Tobacco” reckoning. The comparison has limits — Jonathan Zittrain at Harvard notes the causal link between social media and mental health harm is more complex than smoking and cancer.
But the legal strategy is smart. Plaintiffs aren’t suing over content (which Section 230 would shield). They’re suing over product design — infinite scroll, push notifications, algorithmic content recommendations. These are deliberate engineering choices, not user-generated content.
- 40+ state attorneys general have joined the fight
- A separate New Mexico trial against Meta is running in parallel
- School district lawsuits modeled after 1990s tobacco litigation are queued up
- The trial is expected to run 6 weeks total
- Potential damages: billions, plus forced platform redesigns
If the plaintiff wins, settlement talks for hundreds of other suits begin immediately.
Cool. So an insider confirmed what we all suspected about Meta’s algorithms… Now What the Hell Do We Do? ( ͡ಠ ʖ̯ ͡ಠ)

🛡️ Hustle 1: Family Device Audit Service
Schools and parent groups are desperate for someone who actually understands how these platforms work under the hood. Offer a paid 1-hour audit: review a family’s devices, set up DNS-level content filtering (NextDNS/Pi-hole), configure parental controls properly, and explain what each app actually does with their kid’s attention.
Charge $75-150/session. Parents will pay because they don’t understand the settings and don’t trust the platforms to protect their kids.
Example: IT consultant in Manchester, UK set up a “Digital Family Check-Up” service through local Facebook parent groups. Charges £80/session, does 3-4 per week remotely via Zoom. Pulls in ~£1,200/month on the side.
Timeline: 1-2 weeks to build a booking page and write a service checklist. First clients from local parent Facebook groups and Nextdoor.
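The DNS-level filtering you’d set up in that audit (NextDNS, Pi-hole) boils down to one idea: before a device can load a domain, the resolver checks it against a blocklist, and a listed parent domain blocks every subdomain under it. A minimal sketch of that lookup — the domains below are illustrative, not a recommended blocklist:

```python
# Sketch of resolver-level blocking as done conceptually by Pi-hole/NextDNS:
# a domain is blocked if it, or any parent domain, appears on the blocklist.
# Example domains are hypothetical placeholders.

BLOCKLIST = {"ads.example.net", "tracker.example.com"}

def is_blocked(domain: str, blocklist: set[str]) -> bool:
    """Return True if the domain or any parent domain is blocklisted."""
    labels = domain.lower().rstrip(".").split(".")
    # Check "a.b.c", then "b.c", then "c"
    for i in range(len(labels)):
        if ".".join(labels[i:]) in blocklist:
            return True
    return False

print(is_blocked("video.ads.example.net", BLOCKLIST))  # True — parent is listed
print(is_blocked("docs.example.org", BLOCKLIST))       # False — not listed
```

Being able to explain this in one sentence to a non-technical parent is half the value of the audit.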
💰 Hustle 2: Screen Time Analytics Dashboard (MicroSaaS)
Parents want data, not guilt. Build a simple dashboard that aggregates Screen Time (iOS) / Digital Wellbeing (Android) data exports and presents weekly reports with trends, comparisons, and gentle nudges. Think “Mint for screen time.”
Bark charges $14/month. Qustodio charges $55/year. There’s a gap for a lighter, privacy-first tool that doesn’t require installing invasive monitoring software — just reads the data the phone already collects.
Example: Solo dev in Lisbon, Portugal built a minimal Expo/React Native app that reads iOS Screen Time API data and generates PDF weekly reports. Launched on Product Hunt, hit 400 users in 3 months, charging $4.99/month. ~$2K MRR and growing.
Timeline: 4-6 weeks for MVP if you know React Native. Validate on r/Parenting and r/SideProject first.
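The core of the dashboard is unglamorous: roll daily per-app usage up into weekly totals so trends jump out. A minimal sketch of that aggregation step, assuming a hypothetical CSV export of `date,app,minutes` — real Screen Time / Digital Wellbeing exports use different formats, so only the rollup logic carries over:

```python
import csv
import io
from collections import defaultdict
from datetime import date

# Hypothetical export format; real iOS/Android exports differ.
SAMPLE = """\
date,app,minutes
2024-03-04,Instagram,95
2024-03-05,Instagram,110
2024-03-04,YouTube,40
2024-03-11,Instagram,60
"""

def weekly_totals(csv_text: str) -> dict[tuple[str, str], int]:
    """Sum minutes per (ISO week, app) so week-over-week trends are visible."""
    totals: dict[tuple[str, str], int] = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        year, week, _ = date.fromisoformat(row["date"]).isocalendar()
        totals[(f"{year}-W{week:02d}", row["app"])] += int(row["minutes"])
    return dict(totals)

for (week, app), minutes in sorted(weekly_totals(SAMPLE).items()):
    print(week, app, minutes)
```

Everything else — charts, PDF reports, gentle nudges — is presentation on top of this one table.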
📝 Hustle 3: Social Media Litigation Research Freelancing
With 1,000+ lawsuits pending, law firms need researchers who understand platform mechanics. If you can read API docs, explain algorithmic feeds, or analyze engagement patterns — you’re qualified for legal research subcontracting.
Law firms are hiring technical consultants at $50-100/hr to help attorneys understand how recommendation algorithms work, what “engagement optimization” actually means in code, and how to interpret internal Meta documents.
Example: Former social media marketer in Toronto, Canada pivoted to expert witness prep work for a mid-size litigation firm handling youth safety cases. Found the gig on LinkedIn. Bills 15 hrs/week at $85 CAD/hr. ~$5,100 CAD/month from one client.
Timeline: 2-3 weeks to build a portfolio of technical explainers. Cold email litigation firms handling social media cases (public docket search on PACER).
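If you need to show an attorney what “engagement optimization” actually means in code, a toy ranker makes the point in ten lines: candidate posts get a predicted-engagement score and the feed is sorted by that score, not by time. The weights and signals below are invented for illustration; real systems use learned models, but the score-and-sort shape is the same:

```python
# Toy illustration of engagement-optimized feed ranking.
# Signals and weights are hypothetical, chosen only to show the mechanism.

def engagement_score(post: dict) -> float:
    """Crude stand-in for a learned 'predicted time spent' model."""
    return (2.0 * post["past_watch_minutes"]   # user's history with this creator
            + 1.5 * post["comment_rate"]       # comments signal strong engagement
            - 0.5 * post["recency_hours"])     # older content decays a little

def rank_feed(posts: list[dict]) -> list[str]:
    """Order candidates by predicted engagement, not chronologically."""
    return [p["id"] for p in sorted(posts, key=engagement_score, reverse=True)]

feed = rank_feed([
    {"id": "new_friend_post", "past_watch_minutes": 1,  "comment_rate": 0.1, "recency_hours": 1},
    {"id": "old_viral_clip",  "past_watch_minutes": 30, "comment_rate": 4.0, "recency_hours": 48},
])
print(feed)  # the high-watch-time clip outranks the fresher friend post
```

Note what falls out of the arithmetic: a two-day-old viral clip beats a friend’s fresh post, because nothing in the objective asks whether the ranking is good for the user — which is exactly Boland’s “there’s not a moral algorithm” point.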
🔧 Hustle 4: School District Digital Safety Workshop
40+ states are suing Meta. School boards are scrambling for CYA documentation that they’re “doing something” about student social media use. Package a 90-minute workshop: how algorithms target teens, what parents can actually control, and hands-on device configuration.
Charge $500-1,500 per session. School districts have PD budgets specifically allocated for digital citizenship. One contract can lead to an entire district’s schools.
Example: Cybersecurity trainer in Nairobi, Kenya adapted their corporate training materials for a secondary school digital safety program. Partnered with an NGO, ran workshops across 12 schools in 6 months. Earned ~$8,000 in fees plus ongoing consulting retainer.
Timeline: 2 weeks to build slide deck and handout materials. Email your local school district’s technology coordinator directly.
💼 Hustle 5: Algorithm-Free Content Curation Newsletter
People want curated content without the algorithmic manipulation. Start a niche newsletter (tech news, creative inspiration, learning resources) that’s explicitly human-curated, no algorithms, no engagement tricks.
The anti-algorithm positioning is the marketing hook. Charge $5-8/month or run it free with sponsorships once you hit 2K+ subscribers.
Example: Freelance journalist in Berlin, Germany started a weekly “No Algorithm” tech digest on Substack. Positioned it as “what matters this week, chosen by a human.” Hit 3,500 free subscribers in 4 months, converted 8% to paid at €6/month. ~€1,680/month.
Timeline: 1 week to set up on Substack or Beehiiv. Consistency matters more than perfection. Post the first issue before you’re “ready.”
🛠️ Follow-Up Actions
| Step | Action | Tool/Resource |
|---|---|---|
| 1 | Read the actual trial filings | PACER (Public Access to Court Electronic Records) |
| 2 | Study Bark and Qustodio feature sets for gaps | App Store reviews, G2 reviews |
| 3 | Join r/Parenting, r/digitalminimalism for market research | |
| 4 | Set up a Google Alert for “social media addiction trial” | Google Alerts |
| 5 | Review your local school district’s digital safety policies | District website, FOIA if needed |
| 6 | Build a simple landing page for any service above | Carrd ($19/yr) or Framer (free tier) |
Quick Hits
| Want To… | Do This |
|---|---|
| Lock down a device tonight | Set up NextDNS + configure iOS Screen Time restrictions. Takes 20 minutes |
| Start the easiest hustle | Family device audit service — lowest barrier to entry, highest immediate demand |
| Track the trial | Follow @LaurenFeiner on X / The Verge’s policy section for courtroom updates |
| Understand the legal angle | Search “product design liability Section 230” — the key is they’re suing over engineering, not content |
| Hear it from the source | Read “Algorithmic Harm, Explained” by Brian Boland himself on Substack (Overturned) |
The man who built the ad machine told you exactly how it works. The question isn’t whether you believe him — it’s what you’re going to build before the next billion-dollar settlement drops.