Your Mac Has a Secret AI Built In — Apple Hid It Behind Siri

Every Apple Silicon Mac ships with a 3-billion-parameter LLM. Apple locked it inside Siri. A one-line brew command just set it free.

$0 per token. ~3B parameters. 100% local. Zero API keys. The AI is already on your hard drive.

A developer named Franz just dropped an open-source tool called apfel that cracks open Apple’s on-device language model — the one Apple buried inside the FoundationModels framework starting with macOS 26 (Tahoe). One brew install and you’ve got a CLI tool, an OpenAI-compatible server, and a chat interface. All running on your Neural Engine. Nothing leaves your machine.


🧩 Dumb Mode Dictionary
| Term | What It Actually Means |
| --- | --- |
| FoundationModels framework | Apple's API that lets apps talk to the built-in LLM. You need to write Swift to use it. Or… apfel. |
| Apple Silicon | M1, M2, M3, M4, M5 chips — the ones with the Neural Engine that can run AI locally. |
| OpenAI-compatible server | apfel pretends to be OpenAI's API, so any tool built for ChatGPT can now talk to your Mac instead. |
| Mixed 2-bit/4-bit quantization | A big model squished into a tiny one — ~3.5 bits per weight. Runs fast, fits in your RAM. |
| Neural Engine | A dedicated block inside your Mac's chip that does AI math. It's been sitting there. Waiting. |
| localhost:11434 | Your computer talking to itself. The AI runs as a local server on port 11434. |
📖 The Backstory — Apple's Hidden LLM

Look, Apple has been shipping a language model on every Mac with Apple Silicon since macOS 26 dropped. ~3 billion parameters. Mixed quantization. Runs on the Neural Engine and GPU.

But here’s the thing — they locked it behind Siri and Writing Tools. No terminal command. No HTTP endpoint. No way to pipe text through it. If you wanted to use it, you had to write a full Swift app using the FoundationModels framework.

A dev called Franz said nah. Built a Swift 6.3 binary called apfel. Wrapped the whole thing. MIT license. One brew command. Done.

⚙️ What You Actually Get
| Feature | Details |
| --- | --- |
| Model | Apple's on-device LLM, ~3B params |
| Quantization | Mixed 2-bit/4-bit (~3.5 bits/weight) |
| Context window | 4,096 tokens (input + output combined) |
| Languages | 11 — English, German, Spanish, French, Italian, Japanese, Korean, Portuguese, Chinese, and more |
| Modes | CLI tool, OpenAI-compatible HTTP server, interactive chat |
| Tool calling | Yes — function calling supported |
| Streaming | Yes |
| Cost | $0. Forever. |
| Privacy | 100% on-device. Nothing leaves your Mac. |
| Install | `brew install Arthur-Ficial/tap/apfel` |
| License | MIT |
| GitHub stars | 418+ and climbing |
📊 The Three Modes

1. UNIX Tool — Pipe-friendly. Works with jq, xargs, shell scripts. stdin/stdout, JSON output, file attachments, proper exit codes.

```shell
$ apfel "What is the capital of Austria?"
The capital of Austria is Vienna.
```
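If you'd rather call it from a script than a shell pipe, the same one-shot mode wraps cleanly in Python. A minimal sketch, assuming apfel is on your PATH and prints its answer to stdout like the example above (the `-o json` flag is the one mentioned later in the post):

```python
import subprocess

def apfel_cmd(prompt: str, json_output: bool = False) -> list:
    """Assemble the argv for a one-shot apfel call (kept pure so it's easy to test)."""
    cmd = ["apfel", prompt]
    if json_output:
        cmd += ["-o", "json"]
    return cmd

def ask(prompt: str, stdin_text: str = "") -> str:
    """Run apfel, piping optional text into stdin, and return its answer."""
    result = subprocess.run(
        apfel_cmd(prompt),
        input=stdin_text,
        capture_output=True,
        text=True,
        check=True,  # non-zero exit codes raise, matching the "proper exit codes" claim
    )
    return result.stdout.strip()

# With apfel installed:
#   ask("Summarize this", stdin_text=open("notes.txt").read())
```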

2. OpenAI Server — Drop-in replacement at localhost:11434. Point any OpenAI SDK at it.

```shell
$ apfel --serve
Server running on http://127.0.0.1:11434
```
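Because the server speaks the OpenAI dialect, any OpenAI-style client should be able to hit it. Here's a stdlib-only Python sketch — the `/v1/chat/completions` path follows the usual OpenAI convention, and the model name `"apfel"` is a guess, so check the server's own docs for the real values:

```python
import json
import urllib.request

BASE = "http://127.0.0.1:11434/v1"  # apfel's local server (port from the post)

def build_payload(prompt: str, model: str = "apfel") -> dict:
    """Standard OpenAI chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str) -> str:
    """POST to the local server and return the first choice's text."""
    req = urllib.request.Request(
        f"{BASE}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# With the server running:
#   print(chat("What is the capital of Austria?"))
```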

3. Interactive Chat — Multi-turn conversations. Five trimming strategies. System prompt support.

```shell
$ apfel --chat -s "You are a coding assistant"
```
🗣️ What People Are Saying

Real talk: the Hacker News crowd is cautiously hyped.

  • The privacy angle sells hard — “on-device LLMs becoming very viable very soon” is the vibe
  • People love the $0 price tag — especially for “quick throwaway tasks” and shell script generation
  • Apple’s vision/OCR models called “SUPER” — folks realizing Apple’s been sitting on solid AI tools
  • The 4,096 token limit is a pain — roughly 3,000 words. Fine for short tasks, bad for anything serious
  • “Super hard guardrails” — Apple’s model prefers silence over being wrong. Which is… a choice
  • The macOS 26 requirement frustrates people — if you’re on Sequoia or earlier, you’re out of luck
  • Security concern raised and fixed — local API exposure risk flagged, patched in v0.6.23. Devs responded fast.
🔍 The Bigger Picture

Apple has been playing a weird game. They build genuinely capable AI, ship it on hundreds of millions of devices, then hide it behind Siri — arguably the worst consumer AI interface in 2026.

Real talk: this is the same company that put a Neural Engine in every chip since 2017 and mostly used it for… Face ID and photo sorting.

Apfel is interesting not because the model is the best (it’s not — 3B params and a 4K context window won’t replace Claude or GPT). It’s interesting because it proves there’s a private, zero-cost AI layer sitting dormant on every modern Mac. And someone just gave it a front door.

(I’ve been running local LLMs for a year now. The fact that Apple ships one pre-installed and nobody knew? That’s the real story.)


Cool. Apple hid a free AI in your Mac. Now What the Hell Do We Do? ( ͡° ͜ʖ ͡°)

💰 Wrap It In a Local AI Privacy Tool and Sell It

Look, most people will never open a terminal. But they WILL pay $5-10/month for an app that says “100% private AI that never leaves your Mac.” Build a SwiftUI wrapper around apfel, add a nice icon, throw it on Gumroad. The AI is free. The convenience tax is yours.

:brain: Example: A solo dev in Lisbon built a “Private AI Notes” app using Apple’s FoundationModels framework. Charged €7/month on Gumroad. Hit €1,800 MRR in 6 weeks because people want local AI but can’t be bothered to use Terminal.

:chart_increasing: Timeline: 2-3 weeks to build a basic wrapper. SwiftUI + apfel server mode. Ship it before Apple makes their own version.

🔧 Build AI-Powered CLI Tools With Zero API Costs

Here’s the play — every SaaS tool that charges for AI features (grammar checking, code review, translation) can now be replicated locally for $0. Build CLI tools that pipe through apfel. Sell them as one-time purchases on GitHub Sponsors or Homebrew taps. Your cost per user? Zero.
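As a concrete sketch of that play: a grammar-fixing CLI is basically a fixed prompt plus a subprocess call. This assumes apfel is installed and returns the corrected text on stdout; the prompt wording and the script name are mine:

```python
import subprocess
import sys

def grammar_prompt(text: str) -> str:
    """Wrap user text in a fixed instruction; keep inputs under the 4,096-token window."""
    return "Fix grammar and spelling. Return only the corrected text:\n\n" + text

def main() -> None:
    # Read the draft from stdin, pass it through the local model, print the fix.
    text = sys.stdin.read()
    fixed = subprocess.run(
        ["apfel", grammar_prompt(text)],
        capture_output=True,
        text=True,
        check=True,
    ).stdout.strip()
    print(fixed)

if __name__ == "__main__":
    main()
```

Run it as `cat draft.txt | python3 grammarfix.py` (hypothetical filename) — zero API bills, exactly as the pitch says.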

:brain: Example: A freelance dev in Nairobi built a local grammar-checking CLI that pipes text through a local LLM. Sold lifetime licenses at $19 on Gumroad. Moved 340 copies in a month. $6,460. No server costs. No API bills.

:chart_increasing: Timeline: 1 week for a basic CLI wrapper. Add `apfel -o json` piping, package it, ship it.

📱 Offer 'AI Privacy Audits' for Small Businesses

Small businesses are terrified of sending customer data to OpenAI. Real talk: that fear is a bag. Position yourself as the person who sets up local, on-device AI workflows. Install apfel on their Macs. Configure it as a local API endpoint. Charge a consulting fee. Their data never leaves the building.

:brain: Example: An IT consultant in São Paulo started offering “AI privacy setup” packages to law firms — local LLM on their Macs, no cloud. R$2,500 per firm (~$450). Booked 12 firms in the first month through LinkedIn outreach.

:chart_increasing: Timeline: 1-2 days to learn the setup. Start pitching law firms, medical offices, accountants — anyone who handles sensitive data.

🧠 Build a Local AI Automation Layer for Content Creators

Content creators need quick rewrites, translations, and summaries but hate paying $20/month for ChatGPT. Stack apfel with shell scripts that auto-process their content — translate captions, summarize transcripts, generate alt text. Package it as a “Creator AI Toolkit” for Mac users.
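A caption-translation pipeline like that is just a loop over languages. A hedged sketch — it assumes apfel prints the translation to stdout, and the output-file naming is only one possible convention:

```python
import pathlib
import subprocess

LANGS = ["German", "Spanish", "French", "Japanese", "Portuguese"]

def out_path(src: pathlib.Path, lang: str) -> pathlib.Path:
    """captions.txt -> captions.german.txt, etc. (naming convention is ours)."""
    return src.with_suffix(f".{lang.lower()}.txt")

def translate_all(src: pathlib.Path) -> None:
    """Translate one description file into every target language via apfel."""
    text = src.read_text()
    for lang in LANGS:
        reply = subprocess.run(
            ["apfel", f"Translate the following into {lang}:\n\n{text}"],
            capture_output=True,
            text=True,
            check=True,
        ).stdout.strip()
        out_path(src, lang).write_text(reply)

# With apfel installed:
#   translate_all(pathlib.Path("captions.txt"))
```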

:brain: Example: A YouTuber in Jakarta built a shell script pipeline using a local LLM to auto-translate video descriptions into 5 languages. Sold the script bundle for $29 on Twitter. 180 sales in 3 weeks. $5,220 from a bash script.

:chart_increasing: Timeline: 1 week to build the scripts. Market on Twitter/X to Mac-using creators. Zero ongoing costs.

📝 Create a 'Local AI' Course or Tutorial Series

Real talk: thousands of Mac users just found out they’ve been sitting on a free AI for months. They need someone to show them how to use it. Build a quick course — “Your Mac’s Hidden AI: A Complete Guide” — and sell it on Udemy, Skillshare, or your own site. First mover wins.

:brain: Example: A tech writer in Berlin made a 2-hour course on local LLMs when Ollama blew up. Priced at €19 on Gumroad. Sold 890 copies over 3 months. €16,910. (She spent one weekend recording it.)

:chart_increasing: Timeline: 1 weekend to record. Publish within days of the news cycle. Ride the Hacker News wave.

🛠️ Follow-Up Actions
| Step | Action |
| --- | --- |
| 1 | Install apfel: `brew install Arthur-Ficial/tap/apfel` |
| 2 | Test all three modes — CLI, server, chat |
| 3 | Check that the 4,096-token limit fits your use case |
| 4 | Star the GitHub repo and watch for updates |
| 5 | Pick ONE hustle above and ship something this week |
| 6 | Join the apfel community — bug reports and feature requests are open |

:high_voltage: Quick Hits

| Want… | Do… |
| --- | --- |
| :unlocked: Free local AI on your Mac | `brew install Arthur-Ficial/tap/apfel` — done |
| :money_bag: Sell AI tools with $0 API cost | Wrap apfel in a GUI or CLI tool, charge for convenience |
| :shield: Privacy-first AI consulting | Set up apfel for businesses that can’t send data to the cloud |
| :memo: Ride the content wave | Build a course or tutorial while the tool is fresh news |
| :wrench: Replace paid AI SaaS | Pipe text through `apfel -o json` and build your own |

Apple shipped an AI on your Mac and charged you nothing. The least you can do is make money off it.
