Apple's $599 MacBook Neo Beats a 16-Core Cloud Server — On Cold Queries


DuckDB ran 43 analytical queries on Apple’s cheapest laptop. The results are… complicated.

0.57 seconds median cold query. 59.73 seconds total. $599 laptop vs. $0.768/hr cloud box with 4x the RAM. But wait — the hot-run numbers tell a different story entirely.

DuckDB’s team bought Apple’s brand-new MacBook Neo on launch day and pitted it against AWS instances costing thousands per month. The headlines write themselves: “budget laptop destroys the cloud.” The data says something more nuanced.



🧩 Dumb Mode Dictionary

| Term | What It Actually Means |
| --- | --- |
| DuckDB | An in-process analytical database. Think SQLite, but for crunching data instead of storing app state |
| ClickBench | A benchmark with 43 queries that hammer aggregation and filtering on 100 million rows |
| TPC-DS | A harder benchmark — 24 tables, 99 queries, window functions, the works |
| Cold run | First query execution when nothing is cached. Disk speed dominates |
| Hot run | Repeat execution when the OS has cached data in memory. CPU dominates |
| NVMe SSD | Fast local storage soldered to the laptop. No network hop required |
| A18 Pro | Apple’s iPhone 16 Pro chip, now inside a $599 laptop. 2 fast cores + 4 slow ones |
| Spilling to disk | When data won’t fit in RAM, the database writes temporary files to storage |

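The "in-process" part is the whole trick: no server to stand up, no network hop, queries run straight against files on disk. A minimal sketch of what that looks like in DuckDB SQL, assuming a local `hits.parquet` file (the filename is illustrative; ClickBench's dataset does ship as Parquet with a `UserID` column):

```sql
-- Query a Parquet file directly: no import step, no running server.
SELECT COUNT(*) AS row_count
FROM 'hits.parquet';

-- Aggregation straight off the file, in the style of ClickBench queries.
SELECT UserID, COUNT(*) AS visits
FROM 'hits.parquet'
GROUP BY UserID
ORDER BY visits DESC
LIMIT 10;
```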
📖 What Apple Actually Shipped

The MacBook Neo dropped March 11, 2026. Here’s what $599 gets you:

  • Chip: Apple A18 Pro — 6-core CPU (2 performance, 4 efficiency), 5-core GPU
  • RAM: 8 GB unified memory. That’s it. No upgrade option
  • Storage: 256 GB base ($599) or 512 GB ($699). Also no upgrade
  • Display: 13-inch Liquid Retina, 2408×1506, 500 nits
  • Weight: 2.7 lbs. Two USB-C ports. No MagSafe
  • Colors: Silver, Indigo, Blush, Citrus
  • Battery: 16 hours claimed. EU box doesn’t include a charger

This is the first Mac powered by iPhone silicon. Apple claims 50% faster than Intel Core Ultra 5 laptops and 3x faster on “AI workloads.” Education pricing drops to $499.

📊 The Benchmark That Started the Argument

DuckDB v1.5.0 ran ClickBench (43 queries, 100M rows, ~14 GB Parquet) on three machines:

| Machine | Cores | RAM | Cold Median | Cold Total | Hot Median | Hot Total |
| --- | --- | --- | --- | --- | --- | --- |
| MacBook Neo | 6 (2P+4E) | 8 GB | 0.57s | 59.73s | 0.41s | 54.27s |
| AWS c6a.4xlarge | 16 vCPU | 32 GB | 1.34s | 145.08s | 0.50s | 47.86s |
| AWS c8g.metal-48xl | 192 vCPU | 384 GB | 1.54s | 169.67s | 0.05s | 4.35s |

Cold runs: The Neo crushed both cloud instances. Sub-second median. Under a minute total. The AWS boxes with their network-attached storage couldn’t keep up on first reads.

Hot runs: Different story. The 192-core Graviton monster finished in 4.35 seconds — 12.5x faster than the Neo. Even the mid-tier 16-core box edged ahead on total time.

But here’s the thing nobody mentions: the Neo was only 13% slower on hot total runtime than a machine with 10 more CPU threads and 4x the RAM. Dollar for dollar, that’s absurd.
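A quick back-of-envelope on the "dollar for dollar" point, using only the prices quoted above. This ignores electricity, depreciation, and reserved-instance discounts, so treat it as a rough break-even, not an accounting exercise:

```python
# Break-even: hours of c6a.4xlarge on-demand time that equal the laptop's price.
laptop_price = 599.00  # MacBook Neo, USD
cloud_rate = 0.768     # AWS c6a.4xlarge, USD per hour (figure from the article)

breakeven_hours = laptop_price / cloud_rate
print(f"Laptop pays for itself after ~{breakeven_hours:.0f} cloud hours")
print(f"...about {breakeven_hours / 24:.0f} days of an always-on instance")
```

Roughly 780 on-demand hours, or about a month of an always-on instance, buys the laptop outright.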

🔍 The Harder Benchmark (TPC-DS)

DuckDB also ran TPC-DS, which is closer to real analytical workloads with joins across 24 tables and 99 queries:

Scale Factor 100 (6 GB memory limit):

  • Median query: 1.63 seconds
  • Total runtime: 15.5 minutes

Scale Factor 300 (6 GB memory limit):

  • Median query: 6.90 seconds
  • Query 67 alone: 51 minutes (spilling ~80 GB to disk)
  • Total runtime: 79 minutes

At SF300, you’re asking a $599 laptop with 8 GB RAM to chew through a dataset that requires 80 GB of temporary disk writes. It finishes. It just takes a while.

DuckDB’s team set the memory limit to 6 GB (not the full 8) to avoid fighting macOS for swap space. That’s a common trick — let the database manage its own memory rather than trusting the OS page cache.
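In DuckDB that pattern is two settings. A sketch of the idea described above — the spill directory path is an example, not taken from the benchmark setup:

```sql
-- Cap DuckDB's own memory so it doesn't fight the OS for swap space...
SET memory_limit = '6GB';

-- ...and point spill files at fast local storage (path is illustrative).
SET temp_directory = '/tmp/duckdb_spill';
```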

🗣️ What People Are Actually Saying

The HN thread split into three camps:

The Believers: One dev replaced 400 lines of C# with 10 lines of DuckDB in an AWS Lambda. Another ported a legacy Python app to Polars/DuckDB and got a 40-80x speedup in two days.

The Skeptics: “Slack and a browser will bring it to its knees” — 8 GB is tight when you’re also running dev tools. Docker containers on 8 GB configs cause noticeable performance hits.

The Realists: “The important question is whether the Neo displaces $350 Windows machines or $1,300 MacBook Airs.” The benchmark story is neat, but the market story matters more.

Multiple commenters also questioned whether 14-75 GB datasets qualify as “big data” in 2026. Fair point. But that’s also the range where most actual analyst work happens.

⚙️ Why Cold Runs Matter More Than You Think

The Neo’s cold-run dominance isn’t a fluke — it reveals a real architectural gap.

Cloud VMs use network-attached EBS volumes. First-read latency on a cold EBS gp3 volume runs 1-3ms per 4KB block. The Neo’s local NVMe? Under 0.1ms. That’s 10-30x faster on random reads.
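The 10-30x figure falls straight out of the latencies quoted above; a back-of-envelope, not a measured benchmark:

```python
# First-read latency per 4 KB block, from the figures above (in seconds).
ebs_cold_low, ebs_cold_high = 1e-3, 3e-3  # cold EBS gp3: 1-3 ms
nvme = 0.1e-3                             # local NVMe: ~0.1 ms

low_ratio = ebs_cold_low / nvme
high_ratio = ebs_cold_high / nvme
print(f"Local NVMe: roughly {low_ratio:.0f}-{high_ratio:.0f}x faster on cold random reads")
```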

For analysts who spin up a query, get results, and move on — cold performance IS the performance. You’re not running the same 43 queries in a loop. You’re running one query, reading the answer, then writing another.

The hot-run case (where the 192-core monster dominates) assumes the OS has cached everything in RAM. That requires 384 GB of memory and a stable workload. Most real analytical sessions don’t look like that.


Cool, a $599 Laptop Runs Analytical SQL. Now What the Hell Do We Do? ( ͡° ͜ʖ ͡°)

💰 Sell 'Local-First Analytics' Consulting to SMBs

Small businesses pay $200-800/month for cloud data warehouses they barely use. Most of their datasets are under 50 GB.

Show them DuckDB on a cheap MacBook (or any laptop) can run the same queries faster on cold reads. Set up automated Parquet exports from their SaaS tools. Charge $2-5K for the migration plus $500/month for maintenance.
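The "automated Parquet exports" step is one COPY statement per table in DuckDB. A sketch of the pipeline, with table, column, and file names invented for illustration:

```sql
-- Export a table from the client's warehouse dump into Parquet
-- (table and file names are hypothetical).
COPY (SELECT * FROM orders) TO 'orders.parquet' (FORMAT PARQUET);

-- Later, any laptop with DuckDB queries the export directly.
SELECT date_trunc('month', order_date) AS month, SUM(total) AS revenue
FROM 'orders.parquet'
GROUP BY month
ORDER BY month;
```

Schedule the export nightly from their SaaS tools and the client's "data warehouse" becomes a folder of Parquet files.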

:brain: Example: A freelance data consultant in Portugal migrated 3 small e-commerce clients from BigQuery to DuckDB running on $800 Mac Minis. Saved each client ~$400/month in cloud costs. Charges $1,500/month across all three for ongoing support.

:chart_increasing: Timeline: 2-3 weeks to build a template pipeline. First client within a month if you’re already doing data work.

🔧 Build a 'Portable Data Lab' Product for Field Researchers

Scientists, field biologists, and NGO workers often collect data in places with unreliable internet. They need to run analysis locally.

Package a pre-configured DuckDB environment with Jupyter notebooks on a cheap MacBook Neo. Bundle domain-specific query templates. Sell it as a turnkey “field analytics station” to research grants and NGOs.

:brain: Example: A marine biology PhD student in Indonesia built a portable water quality analysis kit using DuckDB + Python on a refurbished MacBook. Sold the setup guide and template notebooks to 4 other research groups at $800 each. Now consulting for an ocean conservation NGO at $3K/month.

:chart_increasing: Timeline: 1-2 months to build the first kit. Target grant-funded research groups who already budget for equipment.

📊 Create a 'Benchmark-as-Content' Newsletter for Data Engineers

DuckDB’s post generated massive engagement because it had real numbers comparing real hardware. Data engineers are starved for honest benchmarks.

Run monthly benchmarks across different hardware/database combos. Publish a free newsletter. Monetize with sponsored deep-dives from database vendors and cloud providers who want their products tested.

:brain: Example: A data engineer in Germany started a Substack benchmarking DuckDB vs. ClickHouse vs. Polars on consumer hardware. Hit 4,000 subscribers in 6 months. Now earns $2,800/month from paid tiers and $1,500/month from two vendor sponsorships.

:chart_increasing: Timeline: First issue in a week. Monetization at 1,000+ subscribers, typically 3-4 months.

💼 Flip Cheap MacBooks Into 'Data Analyst Workstations'

The $599 MacBook Neo with DuckDB pre-installed is genuinely more capable for analytical work than most $1,500 Windows laptops running Excel. Most people don’t know this.

Buy MacBook Neos at education pricing ($499). Pre-install DuckDB, Jupyter, common data connectors. Sell them as “analyst-ready workstations” to bootcamp graduates and career switchers for $899-999.

:brain: Example: A tech reseller in Nigeria buys refurbished MacBooks, pre-installs a data science stack (DuckDB, Python, VS Code, sample datasets), and sells them to data analytics bootcamp graduates for a 40% markup. Moves 8-12 units monthly.

:chart_increasing: Timeline: Immediate if you already have a resale channel. First batch within 2 weeks.

🛠️ Follow-Up Actions

| Step | Action | Resource |
| --- | --- | --- |
| 1 | Install DuckDB and run ClickBench locally | DuckDB docs |
| 2 | Study the actual benchmark scripts | ClickBench DuckDB implementation |
| 3 | Compare your current cloud spend vs. local DuckDB | DuckDB performance guide |
| 4 | Build a sample Parquet pipeline for a real dataset | DuckDB Parquet import guide |
| 5 | Join the DuckDB Discord for community benchmarks | DuckDB community |

:high_voltage: Quick Hits

| Want to… | Do this |
| --- | --- |
| :magnifying_glass_tilted_left: Test DuckDB yourself | `brew install duckdb` and run ClickBench locally — takes 20 minutes |
| :money_bag: Cut cloud analytics costs | Export your warehouse to Parquet, query it with DuckDB on any laptop |
| :bar_chart: Benchmark your own hardware | Fork the ClickBench repo and run the DuckDB implementation |
| :wrench: Handle datasets > RAM | Set `memory_limit` to 60-70% of your RAM and let DuckDB spill to disk |
| :mobile_phone: Run on even less hardware | DuckDB ran TPC-H SF100 on an iPhone 16 Pro in under 10 minutes |

A $599 laptop with a phone chip just humiliated a cloud server on first read. The cloud’s not dead — but the bill might be optional.
