I tested 5 cold email inbox providers. Here are my honest deliverability results
inbox_tester · 2026-04-05 · 3,120 views
I run an outbound agency and got tired of guessing which inbox provider actually delivers the best results, so I ran a proper controlled test. I bought 10 Google Workspace inboxes from each of 5 providers. Same campaigns, same copy, same prospect lists, same sending platform. The only variable was the inbox provider. I measured everything through GlockApps inbox placement tests over 45 days.
The providers I tested:
- PuzzleInbox (pre-warmed Google Workspace)
- Maildoso (Google Workspace)
- Cheapinboxes (Google Workspace)
- Inframail (Microsoft 365 only, so this is not a perfect apples-to-apples comparison but I included it because people ask about it constantly)
- Mailstand (Google Workspace)
Test methodology: 10 inboxes per provider. Each inbox sent 15 cold emails per day through Instantly. Same 4-step sequence. Same prospect list segments (randomized across providers to eliminate list quality as a variable). GlockApps inbox placement test run weekly on 3 random inboxes per provider. Test period: 45 days starting from the day inboxes were delivered.
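The randomization step is the part most DIY tests skip, so here is a minimal sketch of how the prospect assignment could work. This is illustrative, not my actual tooling; the provider names are from the test, everything else (counts, seed, helper name) is made up:

```python
import random

def assign_prospects(prospects, providers, seed=42):
    """Randomly split a prospect list across providers so that list
    quality is, on average, identical for every provider."""
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    shuffled = prospects[:]
    rng.shuffle(shuffled)
    # round-robin over the shuffled list keeps group sizes within 1 of each other
    groups = {p: [] for p in providers}
    for i, prospect in enumerate(shuffled):
        groups[providers[i % len(providers)]].append(prospect)
    return groups

providers = ["PuzzleInbox", "Maildoso", "Cheapinboxes", "Inframail", "Mailstand"]
groups = assign_prospects([f"prospect{i}@example.com" for i in range(1000)], providers)
```

The fixed seed matters: if you ever rerun the test, each provider gets the same segment, so week-over-week placement numbers stay comparable.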
Results (average inbox placement over 45 days):
- PuzzleInbox: 87% inbox placement. The pre-warmed inboxes hit 85% on the first GlockApps test (day 3). Climbed to 89% by week 3 and held steady. Reply rate across campaigns: 4.1%. Zero inbox suspensions during the test.
- Mailstand: 72% inbox placement. Started at 41% (fresh inboxes, no warmup included). After 3 weeks of Instantly warmup they climbed to 72% and held there. Reply rate: 2.8%. One inbox suspended on day 31 for unclear reasons.
- Maildoso: 68% inbox placement. Similar trajectory to Mailstand. Started low, climbed with warmup, plateaued around 68%. Reply rate: 2.4%. The shared infrastructure concern is real. I noticed day-to-day variance of 10-15 percentage points on placement tests, which suggests other users on the same shared pool are affecting my deliverability.
- Cheapinboxes: 61% inbox placement. Two inboxes arrived with broken DMARC records that I had to fix manually. After fixing DNS and running warmup for 3 weeks, placement reached 61%. Reply rate: 1.9%. Two inboxes suspended during the test.
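Broken DMARC records like the ones Cheapinboxes shipped are easy to catch before you send a single email. Here is a simplified sanity check for a DMARC TXT record string; real validation is defined in RFC 7489, and this only covers the two tags that most often break:

```python
def check_dmarc(record: str):
    """Return a list of problems with a DMARC TXT record string.
    Simplified sketch: full DMARC syntax is specified in RFC 7489."""
    problems = []
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if not part:
            continue
        if "=" not in part:
            problems.append(f"malformed tag: {part!r}")
            continue
        key, value = part.split("=", 1)
        tags[key.strip().lower()] = value.strip()
    if tags.get("v") != "DMARC1":
        problems.append("missing or wrong v=DMARC1 tag")
    if tags.get("p") not in ("none", "quarantine", "reject"):
        problems.append("missing or invalid p= policy tag")
    return problems

check_dmarc("v=DMARC1 p=none")  # missing semicolon: both tags flagged
check_dmarc("v=DMARC1; p=reject; rua=mailto:reports@example.com")  # clean: []
```

Pull the record with `dig TXT _dmarc.yourdomain.com` (or your DNS host's console) and run it through a check like this on day one, not after your first bad placement test.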
- Inframail: 55% inbox placement. Microsoft-only, so deliverability to Gmail recipients was notably worse than the Google Workspace providers. Placement to Microsoft recipients was actually decent at 74%. But the overall blended average was 55% because a significant portion of my prospect list uses Gmail. Reply rate: 1.5%.
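The Inframail number is a weighted average, which is why your mileage will vary with your list's Gmail share. A quick sketch of the math, with an illustrative mix (I did not publish my exact Gmail share or Gmail-only placement, so the 60/40 split and 42% figure below are assumptions chosen to land near the observed 55%):

```python
def blended_placement(segments):
    """Weighted average inbox placement across recipient mailbox providers.
    segments: list of (share_of_list, placement_pct) tuples; shares sum to 1."""
    assert abs(sum(share for share, _ in segments) - 1.0) < 1e-9
    return sum(share * pct for share, pct in segments)

# Hypothetical mix: 60% Gmail recipients at ~42% placement,
# 40% Microsoft recipients at the measured 74% placement.
blended = blended_placement([(0.6, 42.0), (0.4, 74.0)])
# 0.6*42 + 0.4*74 = 25.2 + 29.6 = 54.8, close to the observed 55%
```

The takeaway: if your list is mostly Microsoft recipients, a Microsoft-only provider looks much better than this blended number suggests; if it is mostly Gmail, it looks worse.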
What surprised me: The gap between the best and worst provider is 32 percentage points on inbox placement. That translates directly to reply rates. PuzzleInbox's 4.1% reply rate vs Cheapinboxes' 1.9% means I book roughly twice as many meetings per dollar spent on infrastructure. At scale, that difference is thousands of dollars in pipeline every month.
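The "twice as many meetings per dollar" claim falls straight out of the reply rates, since infrastructure cost per reply scales inversely with reply rate. A sketch of that math, using a hypothetical $4/inbox/month price (placeholder, not either provider's actual pricing) and the test's real volumes and reply rates:

```python
def cost_per_reply(monthly_inbox_cost, inboxes, sends_per_inbox_per_day,
                   reply_rate, days=30):
    """Infrastructure cost per reply for one provider.
    monthly_inbox_cost is a placeholder price, not real provider pricing."""
    emails = inboxes * sends_per_inbox_per_day * days   # total sends per month
    replies = emails * reply_rate
    return (monthly_inbox_cost * inboxes) / replies

# 10 inboxes, 15 sends/day, hypothetical $4/inbox/month for both providers:
best = cost_per_reply(4, 10, 15, 0.041)   # 4.1% reply rate
worst = cost_per_reply(4, 10, 15, 0.019)  # 1.9% reply rate
# worst / best == 0.041 / 0.019 ≈ 2.16: roughly twice the cost per reply
```

Note the per-inbox price cancels out of the ratio when both providers cost the same, so the ~2.2x gap holds regardless of the exact price; it only shifts if the cheaper-placement provider is also meaningfully cheaper per inbox.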
What did not surprise me: Pre-warmed inboxes outperformed from day 1. I did not have to wait 3 weeks, did not have to pay for separate warmup tools, and did not lose any inboxes to the warmup period. The providers that required manual warmup all had a dead period of 14-21 days where those inboxes generated zero value.
My recommendation after this test: PuzzleInbox is my primary provider going forward. The pre-warmed deliverability advantage is real and measurable. If budget is extremely tight, Mailstand is the second-best option, but you need to budget for 3 weeks of warmup time and accept lower placement. I would not recommend Cheapinboxes or Inframail for anyone who depends on cold email for revenue.
I am sharing this because I wish someone had done this test before I wasted money on bad providers earlier in my career. Run your own tests if you do not trust mine. But buy at least 5 inboxes from any new provider and run GlockApps before committing to a large order. The $50 you spend on testing saves you from the $5,000 mistake of buying 100 bad inboxes.