Is Adobe’s Marketo truly “the worst” at getting emails delivered?

For 60 days, I monitored the top 7 Email Automation vendors to learn how well they manage the mailer reputation of their systems.

Come along as I spend $340 of my own money to monitor 3,735 IP addresses and rank the vendors. Discover Pardot’s and MailChimp’s tricks for staying off email spam lists, Eloqua’s surprising spot in the rankings (not at the top!), and learn whether Adobe’s Marketo is actually The Worst™ at avoiding blocklists. Oh, and HubSpot, Act-On and Campaign Monitor are also here.

Read on for the full story, listen to it on Mike MacFarlane’s podcast (archive link) or go straight to the vendor rankings!

Backstory

I am a Marketing Operations Manager who’s been in Marketing Ops since 2013. In late November 2022, I was having a normal day at Auvik when an urgent message popped up on my screen.

We were getting a spike in website traffic – 80,000 extra visits in a day. A quick investigation showed that the traffic was coming from clicks on that day’s newsletter, sent from our enterprise-grade Marketo email automation system. 4,800 recipients were clicking on every single link in the email (and there were lots of links…). As a result, our web conversion figures for the month got muddled, and hundreds of records were mistakenly passed to Sales for follow-up because of this link-click activity.

I looked deeper into potential issues with our email sending servers. Bounce error messages showed that our emails were blocked because Marketo’s email sending systems were on the SORBS and SpamCop blocklists (usually reserved for spammers).

This was a promising thread for investigation. We were on Marketo’s shared IP range, meaning that the “blocked” email sending systems were ones that we shared with other customers. Any spamming activity from those sharing “neighbours” could get us in trouble.

Data from SpamCop showed that many of Marketo’s IPs were spending 1 out of every 3 days on the blocklist. And data from SORBS showed that the event that triggered the blocklistings wasn’t our fault. We didn’t send out any mailings on the day the trouble began. It must’ve been our neighbours!

I emailed the Marketo team with all this data. Sat back in my chair. And waited for a “sorry we let things get so bad” letter from Marketo.

Marketo’s team responded with a big shrug.

Instead of admitting that they goofed, they reminded me that we could always pay them $6,000/year to rent a dedicated IP address. Eventually, we did get the dedicated IP and our email reputation became pristine. But I resented paying Marketo to solve a problem that their negligence created.

“Marketo must have the worst deliverability team. With the worst reputation monitoring out of all the vendors!!!” I was so mad.

Hey, wait a minute… I actually had the skills to check if this was true.

With a little creativity, clunky programming skills and some anger-money, I got to test this hypothesis.

The adventure begins

In late December 2022 I set out to monitor the spammer blocklistings for every shared IP address that the big Marketing Automation vendors use to send email.

I gathered the list of IPs using vendor documentation and the SPF Surveyor tool. With the help of the Talos Intelligence tool, I picked out only the active email-sending addresses that were used in the “shared sending pool” for customers.

BlackListMaster would check the blocklisting status for my monitored IPs every 4 hours. It would then send the status to a custom script I developed. My script would log the data in an SQLite database for later analysis.
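The logging side of that pipeline was only a few lines. Here is a minimal sketch of the idea – the table schema, function names and sample values are illustrative, not the actual script:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS blocklist_log (
    ip TEXT NOT NULL,
    blocklist TEXT NOT NULL,
    listed INTEGER NOT NULL,      -- 1 = on the blocklist, 0 = clean
    checked_at TEXT NOT NULL      -- UTC timestamp of the 4-hourly check
)
"""

def open_db(path=":memory:"):
    """Open (or create) the SQLite log database."""
    conn = sqlite3.connect(path)
    conn.execute(SCHEMA)
    return conn

def log_status(conn, ip, blocklist, listed, checked_at):
    """Append one blocklist-check result to the log."""
    conn.execute(
        "INSERT INTO blocklist_log (ip, blocklist, listed, checked_at) "
        "VALUES (?, ?, ?, ?)",
        (ip, blocklist, int(listed), checked_at),
    )
    conn.commit()

# Example: record one 4-hourly check result
conn = open_db()
log_status(conn, "192.0.2.10", "bl.spamcop.net", True, "2023-01-21 05:28:03")
print(conn.execute(
    "SELECT COUNT(*) FROM blocklist_log WHERE listed = 1").fetchone()[0])  # → 1
```

With 3,735 IPs checked every 4 hours, this produces a log small enough that plain SQLite handles the whole 60-day run comfortably.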

For 60 days – from Jan 19, 2023 to March 20, 2023 – I kept track of the following top 7 vendors:

  • Act-On
  • Campaign Monitor
  • Eloqua
  • HubSpot
  • MailChimp
  • Marketo
  • Pardot (Marketing Cloud Account Engagement)

It was 3,735 email sending IP addresses in total.

I met up with the BlackListMaster himself in a dark alley. I handed over $340 sweet plastic Canadian Dollars into his grime-streaked hand.

And the game was afoot!

Which blocklists matter?

Ok, ok, let’s slow down a little.

You should know that every blocklist-monitoring service will boast of the astronomical number of blocklists it monitors. BlackListMaster monitors 100 of them. But most of them don’t matter.

I decided to keep things simple by focusing only on the three that Marketo’s own documentation prioritizes: SpamHaus (tier 1), SpamCop (tier 2) and SORBS (tier 3).

SpamHaus matters the most. Getting on that list has serious email delivery consequences and it is difficult to get de-listed. However, you have to do something heinous, like sending phishing emails, to get on the SBL list in the first place.

SpamCop is the second most important blocklist. Getting on SpamCop has an impact on your deliverability. It isn’t too easy to get on that list, and it isn’t too hard to get off it. It is the “goldilocks list” that a lot of my analysis focused on. Remember that, originally, the problems that hit my employer’s emails were mixed with bounce messages citing a SpamCop blocklisting. You get on SpamCop’s radar by emailing spamtraps, when angry recipients flag your emails, when you use open mail relays + proxies, and through automated spam reports. Recipients manually report 587k messages per week to SpamCop.

SORBS is the least important list of the 3. It is very easy to get on it, easy to get off, and there are few consequences to landing on this list. Discussion in the emailgeeks community generally indicates that a SORBS listing doesn’t impact deliverability. Validity’s docs support this view. People view it as a useful leading indicator of list hygiene problems.

SORBS is helpful as a troubleshooting tool because it shows when an active listing period first began – this lets you narrow down which of your emails triggered a listing. This is how I knew that our neighbours got our shared IP range blocklisted: on the day the IP was listed, we didn’t send out anything notable.

Example of a listing on SORBS. Note the helpful dates.

SORBS’ listing criteria include using open relays, sending to spam traps, or sending from IP ranges that are meant for household internet usage. There is no way for a person to manually complain to SORBS – remember this for later.

Results

How often did vendor IPs land on a blocklist?

Let’s look at the shared IPs for each ESP (Email Service Provider), and the cumulative percentage of IPs that were listed on tier 1, tiers 1 & 2, and finally tiers 1, 2 & 3 during the period.

| Vendor | Total Shared IPs Monitored | # of IPs on SpamHaus | % | # of IPs on SpamHaus or SpamCop | % | # of IPs on SpamHaus, SpamCop or SORBS | % |
|---|---|---|---|---|---|---|---|
| Act-On | 252 | 0 | 0.0% | 19 | 7.5% | 28 | 11.1% |
| Campaign Monitor | 344 | 17 | 4.9% | 18 | 5.2% | 74 | 21.5% |
| Eloqua | 81 | 0 | 0.0% | 16 | 19.8% | 21 | 25.9% |
| HubSpot | 1,113 | 2 | 0.2% | 169 | 15.2% | 550 | 49.4% |
| MailChimp | 1,488 | 0 | 0.0% | 133 | 8.9% | 133 | 8.9% |
| Marketo | 262 | 0 | 0.0% | 32 | 12.2% | 34 | 13.0% |
| Pardot | 195 | 0 | 0.0% | 0 | 0.0% | 2 | 1.0% |
| Total | 3,735 | 19 | 0.5% | 387 | 10.4% | 842 | 22.5% |


The most worrisome is Campaign Monitor. SpamHaus is Serious Business™ and has the most serious impact on mailers. 17 of CM’s IPs – almost 5% of their shared pool – landed on this important blocklist. This means their customers had a 5% chance of getting their emails blocked during the listing period. CM tends to stay off SpamCop and SORBS, but that fact doesn’t make up for their time on SpamHaus.

Right behind Campaign Monitor are Eloqua and HubSpot. For them, the fact that 20% and 15% (respectively) of their monitored IPs landed on SpamCop puts them at the bottom of email reputation management. They also have the highest percentage of IPs listed on any of the 3 lists once you factor SORBS into the mix.

(Note: HubSpot has 2 IPs on SpamHaus, but because they have >1,000 shared IPs the impact is small and I don’t hold it against them.)

The stars are Pardot, with just 2 IPs in total landing on ANY of the 3 significant blocklists; Act-On, with just 7.5% of their IPs landing on SpamCop; and MailChimp, with a low 8.9% appearing on SpamCop and none of their 1,488 IPs landing on SORBS.

Marketo, the target of my ire, is just… middling. Not in the “best” group and not in the “worst” group. Only 32 of their IPs experienced a tier 2 block during the period but, from firsthand experience, the “blast radius” of these listings is high because a single IP in their pool usually sends out many different customers’ emails at once. So 1 IP getting listed could still impact 40 customers.

Curious finding: how does Mailchimp stay off SORBS while HubSpot lands on it?

Let’s look at listing data for just SORBS to explore a mystery:

| Vendor | Total Shared IPs Monitored | IPs listed on SORBS in period | % of vendor IPs that landed on SORBS |
|---|---|---|---|
| Act-On | 252 | 10 | 4.0% |
| Campaign Monitor | 344 | 72 | 20.9% |
| Eloqua | 81 | 21 | 25.9% |
| HubSpot | 1,113 | 486 | 43.7% |
| MailChimp | 1,488 | 0 | 0.0% |
| Marketo | 262 | 5 | 1.9% |
| Pardot | 195 | 2 | 1.0% |
| Total | 3,735 | 596 | 16.0% |

Note that MailChimp and HubSpot both manage big sending pools – each has over 1,000 IPs. But their performance with SORBS is very different: a full 43% of HubSpot’s IPs were trapped by SORBS, while none of MailChimp’s got in trouble during the 60 days. What can we learn about the 2 vendors from this difference?

When looking at their listings on SpamCop, HubSpot and MailChimp have more comparable blocklisting rates (15% and 9%).

So the emails that both of these ESPs send out land them in hot water with SpamCop equally. But MailChimp stays off SORBS completely. Can we figure out what is different between SORBS’ and SpamCop’s listing criteria?

The answer is that SORBS is fully automated, while SpamCop uses people’s manual spam complaints to get senders in trouble. Neither ESP can stop the manual action of angry recipients who flag annoying emails – that’s why both get caught by SpamCop. But MailChimp has been avoiding the automated ways in which a blocklist catches spammers (which, for our purposes, means avoiding spamtraps).

Hot take: I believe Mailchimp buys 3rd party email lists and web-scrapes addresses to pre-emptively build a database of risky emails. If a customer tries to communicate with one of these addresses, the customer is blocked from doing so. I believe that this is what MailChimp’s Omnivore system does under the guise of “Magical AI 🧙‍♂️”.

What we’re seeing is data-driven proof that Omnivore works.

MTBF and MTTR

MTBF (Mean Time Between Failures) is a standard metric that explains “how long does this email server go between blocklistings?”. A larger number is better.

For this analysis, MTBF was calculated as an average figure, and was calculated only for IPs that were blocklisted in the period. This isn’t exactly the way you’d calculate this figure for a group of machines in a factory, but does a good job of comparing how well ESPs keep their servers out of trouble.

MTTR (Mean Time to Recovery) is a measure of a vendor’s responsiveness once an IP lands on a blocklist. It answers the question: “Once an email server is blocklisted, how long does it take the ESP to get it off the list?”
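To make the definitions concrete, here is a simplified sketch of how the per-IP figures can be computed. The stint data below is invented for illustration, and hours count from the start of monitoring; my actual queries over the SQLite log differ in the details:

```python
from statistics import mean

# Each IP's listing history: a list of (start_hour, end_hour) stints
# on a blocklist, measured in hours since monitoring began.
histories = {
    "192.0.2.10": [(100, 124), (400, 420)],   # two stints
    "192.0.2.11": [(50, 98)],                 # one stint
}
TOTAL_HOURS = 1404  # monitoring window for this hypothetical vendor

def mtbf(stints, total_hours):
    """Average 'clean' time per stint, for an IP listed at least once."""
    listed = sum(end - start for start, end in stints)
    return (total_hours - listed) / len(stints)

def mttr(stints):
    """Average duration of a single blocklist stint."""
    return mean(end - start for start, end in stints)

# Vendor-level figures: average the per-IP numbers across listed IPs only
avg_mtbf = mean(mtbf(s, TOTAL_HOURS) for s in histories.values())
avg_mttr = mean(mttr(s) for s in histories.values())
print(round(avg_mtbf), round(avg_mttr))  # → 1018 35
```

Note that IPs which never got listed are excluded entirely, which is why these MTBF numbers describe the "troubled" portion of a pool rather than the pool as a whole.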

Tier 1 – SpamHaus

| Vendor | Unique IPs that landed on SpamHaus | # of stints | Avg. MTBF (hours) | Avg. MTTR (hours) |
|---|---|---|---|---|
| Campaign Monitor | 17 | 22 | 1,318 | 8 |
| HubSpot | 2 | 2 | 1,362 | 42 |
| Grand Total | 19 | 24 | 1,323 | 12 |

Only Campaign Monitor and HubSpot got on the SpamHaus list during the monitoring period. For SpamHaus, the consequences of any time on this blocklist are severe. Campaign Monitor has a long average time between incidents, but this is still serious. I’m less concerned about HubSpot’s 2 IPs landing on the list because HubSpot’s huge pool of 1,113 IPs reduces the impact of these two listings.

Tier 2 – SpamCop

| Vendor | Unique IPs that landed on SpamCop | # of stints | Avg. MTBF (hours) | Avg. MTTR (hours) |
|---|---|---|---|---|
| Act-On | 19 | 44 | 1,155 | 19 |
| Campaign Monitor | 1 | 1 | 1,412 | 24 |
| Eloqua | 16 | 244 | 62 | 34 |
| HubSpot | 168 | 470 | 788 | 17 |
| MailChimp | 133 | 198 | 1,124 | 23 |
| Marketo | 32 | 284 | 315 | 23 |
| Pardot | 0 | 0 | NA | NA |
| Total | 369 | 1,241 | 857 | 21 |

Pardot has zero listings! It is the winner in terms of proactively staying off this important blocklist.

As for the rest:

Act-On and MailChimp are great at keeping their IPs off the list, and ensuring that they stay off the list (with their high MTBF).

Campaign Monitor does a great job of staying off SpamCop too – but their stints on SpamHaus mean that this vendor doesn’t get a pat on the back.

Note that Marketo is middle of the road: 32 IPs on SpamCop (12% of their total), but the impacted IPs landed on the list fairly frequently – once every 315 hours on average (that’s every 13 days).
The fact that a small number of their IPs repeatedly get in trouble has me speculating that their shared IP pool is segmented into several parts with different risk levels. Marketo does offer a “Trusted Shared IP” pool for smaller senders. That segmentation might be how we end up with a “Better Senders” segment and a “Worse Senders” segment.

HubSpot is similar to Marketo in terms of the number of IPs on SpamCop (15% of their pool). But they do stay off the list for longer – 33 days on average (vs. Marketo’s 13) and the IPs get delisted quicker (17 hours vs. Marketo’s 23).

Eloqua is the worst here. 16 IPs – one in five of their IPs – made it onto SpamCop. And these IPs are getting blocklisted once every 2.5 days (62 hours) on average. Given how few IPs Eloqua has in rotation, they are definitely neglecting the reputation of their shared servers.

Tier 3 – SORBS

| Vendor | Unique IPs that landed on SORBS | # of stints | Avg. MTBF (hours) | Avg. MTTR (hours) |
|---|---|---|---|---|
| Act-On | 10 | 12 | 1,244 | 48 |
| Campaign Monitor | 72 | 420 | 544 | 48 |
| Eloqua | 21 | 209 | 245 | 49 |
| HubSpot | 486 | 1,318 | 805 | 47 |
| MailChimp | 0 | 0 | NA | NA |
| Marketo | 5 | 6 | 1,222 | 42 |
| Pardot | 2 | 2 | 1,354 | 50 |
| Total | 596 | 1,967 | 766 | 48 |

SORBS’ rules indicate that an IP is de-listed 48 hours after the last spam message was received from that address. That’s why the MTTR for all vendors is so close to 48 hours. (Note: the 48-hour window is supposedly a “short” duration reserved for first-time offenders, but the data shows it actually applies to repeat offenders, too.)

Mailchimp is not present on SORBS at all, because of their Omnivore system. Pardot is also excellent, with their 2 listed IPs only being on the list for one stretch each and getting off the list in 50 hours.

HubSpot had 44% of their IPs landing on SORBS, going 30 days between stints on average. Campaign Monitor and Eloqua also have a notable number of their IPs on the list (over 20%), going 22 and 10 days on average between each IP getting listed. This only matters as a “forward-looking” indicator of how frequently their clients send mail to scraped/purchased lists.

Vendor Rankings

Pardot ← BEST

The Pardot team are ⭐STARS⭐ for the way they manage shared IP reputation!

Pardot is a case study in running a small but well-managed pool of IPs. Their IPs stayed off SpamCop and SpamHaus completely, with just 2 IPs listed on SORBS (the list that’s easiest to fall into).

What is Pardot’s secret for staying off the blocklists?

I’ve administered a Pardot system for 5 years. In that time, we used a shared IP and had no deliverability problems. From personal experience, this is how their team maintains good mailer reputation:

  1. By default, you are blocked from emailing a set of generic addresses – such as “admin@XYZ.com”, “office@…”, “accounting@ …” and so on. These tend to be shared mailboxes that are disengaged and are a high risk for spam complaints.
  2. Every time that you upload a list into Pardot, they make you acknowledge their Permission-Based Marketing Policy. The policy clearly blocks activities that are known to destroy your mailer reputation:
    Our customers certify that they will not use rented, traded, or purchased lists, email append lists, or any list that contains email addresses captured in any method other than express, customer-specific opt-in when using our system to send emails.
  3. Pardot’s policies force you to get a dedicated IP if you send out more than 250,000 emails a month. So, larger mailers who have a bigger blast-radius in case they’re flagged as spammers, are kept away from the shared IP pool.
Screenshot of Pardot’s list upload screen. Notice the mandatory “I shall not scrape” compliance checkbox at the bottom.

Pardot and Marketo have a similar-sized address pool of 195 and 262 respectively. It is reasonable to expect Marketo to do an equally good job of keeping their mailer reputation clean. Marketo fails on that front.

Act-On

Act-On’s team stood out as solid stewards of their shared IP range.

They’re not one of the “heavyweight” marketing automation vendors, but they are at the top of their industry for maintaining a clean email reputation. Which means better delivery rates for their customers.

With a similar number of IPs to Pardot and Marketo, only 8% of Act-On’s servers landed on SpamCop in the 60 days. But when an IP did get listed, their team sprang into action and got it off SpamCop in 19 hours on average (second only to HubSpot’s 17 hours, the fastest of all vendors).

MailChimp

MailChimp is one of the stars.

They have a massive IP pool – 1,488 servers. Although many of their servers landed on SpamCop, they made up only a small percentage (9%). This is a classic example of a “get a large pool of IPs and rotate them quickly” playbook. MailChimp and HubSpot are playing the same game, with MailChimp doing it more successfully.

MailChimp is remarkably proactive when it comes to managing their clients’ email IP reputation. As noted above, I believe that their Omnivore program involves buying/scraping email addresses and then blocking customers from loading these “known to be risky” contacts into the database.

HubSpot

HubSpot is a middle performer when it comes to email reputation on their shared IP range.

They’re not good at preventing their IPs from getting blocklisted (15% got on SpamCop, 44% on SORBS). However, they are fastest to get their IPs de-listed (17 hours on SpamCop). This is a solid strategy.

An interesting finding: it appears that HubSpot directs customers to allow all of SendGrid’s servers to send email on HubSpot customers’ behalf (through SPF policy). That’s a potential security problem: it means that HubSpot customers are doubly vulnerable to bugs in both the HubSpot and SendGrid systems. (While monitoring email blocklistings, I made sure to exclude any SendGrid IPs that did not belong exclusively to HubSpot.)

Marketo

Ahhh… my old frenemy Marketo.

I started off on this research project thinking that “Marketo must be the worst“. But it turns out that Marketo is an average to below-average player when it comes to the health of their shared IP addresses. Nothing exceptionally bad.

12% of Marketo’s shared email servers landed on SpamCop during my monitoring. That’s better than Eloqua and HubSpot. Marketo mostly stayed off SORBS.

What makes blocklistings particularly irritating on Marketo is that a single customer’s mailing is split across a large number of individual servers. During blocklistings, usually 4-7 of those IPs would be blocked at the same time. That means one customer’s bad behaviour impacts about 4 to 7 servers, which then impact a large number of other customers. So the “blast radius” from a blocklisting is pretty wide. I’m not sure if this is standard behaviour. Perhaps Marketo could improve customer deliverability by distributing each customer’s mailings over a smaller number of machines.

If you are a Marketo customer who’s looking to improve deliverability, the best thing to do is to shell out the money for a dedicated IP address. Show them this article and ask them how they can justify charging you full price for fixing a problem that their negligence created.

The second-best thing to do is to ask your Marketo Account Manager about the “Trusted Shared IP range”. This is a good boys and girls section of their shared IP pool that’s reserved for smaller mailers with a good reputation. There are several conditions for qualifying. The main condition is that your email volume has to be 100,000 emails a month or less.

Eloqua ← WORST

Eloqua’s approach is the worst of all worlds – they have a small pool of shared IPs, and they’re sloppy with them.

20% of Eloqua’s IPs landed on SpamCop during the monitoring period. That’s the highest proportion of any vendor. These IPs were landing on that blocklist every 2.5 days (62 hours) on average. It took the Eloqua team a day and a half to get each IP de-listed. These are the worst MTBF and MTTR figures out of all vendors.

Why is Eloqua so bad at staying off SpamCop? My guess is that Eloqua’s parent company, Oracle, focuses on large enterprise customers. Those customers are normally going to have a dedicated IP. Any customers who use the shared IP range are probably “too small to matter”. Perhaps the Eloqua team tries to spend the minimum effort on maintaining the IP pool’s reputation.

Interestingly, it appears that Eloqua uses the servers in the shared IP range to send out both customer mailings and their own application’s emails (like alerts, reports and list-export notifications). So spam listings that are caused by customer mailings would impact delivery of Eloqua’s product-generated notifications to Eloqua’s own customers.

Eloqua’s documentation lists the shared sending IPs for their clients. It states that all app-generated reports also go out from this same IP range.

Campaign Monitor ← Dishonourable Mention

Campaign Monitor is a vendor that I’ve struggled to rank. It is in the “bad kids corner” but there are grounds for disagreement.

The positive thing about Campaign Monitor is that only 1 of their IPs (out of 344) made it onto the SpamCop blocklist in the monitored period. That’s fantastic.

The negative thing is that 17 of Campaign Monitor’s shared IPs (5% of the total) landed on the SpamHaus blocklist. That’s the one with the most serious consequences. 14 of their servers landed on that list in an incident on January 30th and were quickly removed from the list. But several of their IPs returned to the blocklist in 8 separate incidents. In those subsequent incidents in March, they stayed on the list for longer. The impact on customers mailing out of those IPs would’ve been severe.

To me, Campaign Monitor’s stints on SpamHaus are a big red flag.

Campaign Monitor has many sister brands – like Emma, Liveclicker and Sailthru – all under the “Marigold” umbrella. In my monitoring, I grouped all those shared IPs together. It is possible that it is Campaign Monitor’s sister brands that were getting in trouble with SpamHaus. But I doubt it.

Conclusion: Vendor actions matter

A Marketing Automation vendor’s actions around deliverability make a difference.

The data shows that there are simple policies and tools that will reliably boost the reputation of a vendor’s email servers.

If you work at an ESP or a Marketing Automation software vendor, I urge you to apply these 4 proven techniques:

  1. By default, block customers from importing/mailing to generic email addresses that are known to be a “bad fit” for email marketing. Mailboxes with names like “office”, “help”, “info”, “accounting”, “ar”, “ap” etc. ← from Pardot
  2. Set rules against web scraping and list-buying in the customer contract. Remind users of these rules as part of every list upload into your system. No – this isn’t going to deter everyone from web-scraping. But it does give powerful ammunition to Marketing Ops Managers like me to push back the next time the VP of Marketing has the brilliant idea to start web scraping. ← from Pardot
  3. Pre-purchase the biggest, dirtiest email lists you can find. Go to every scuzzy forum to find them. Grab the latest web-crawl data from Common Crawl and extract every email address you see. Then block your customers from mailing those addresses. Because if your customer is uploading them, your customer got them by either buying lists or scraping websites. ← from MailChimp
  4. If you run a pool of shared mailing IPs, break it apart into at least 2 segments so that you can manage them separately. This limits the fallout of a serious blocklisting incident, and allows you to set different policies for different types of customers. The implementation can be something like Marketo’s “Trusted Shared IP” range for small mailers, or something as extreme as Pardot’s “Ya gotta get a dedicated IP, Buster” policy once email volume hits 250,000 a month. ← from Pardot and Marketo
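Technique #1 is cheap to implement at upload time. Here is a sketch of what such a role-address filter could look like – the specific set of blocked mailbox names is my guess, not Pardot’s actual list:

```python
import re

# Role-based local parts that tend to be shared, disengaged mailboxes.
# This particular set is illustrative, not any vendor's real blocklist.
GENERIC_LOCAL_PARTS = {
    "admin", "office", "accounting", "info", "help", "ar", "ap",
    "sales", "support", "billing", "postmaster", "abuse", "noreply",
}

def is_risky_address(email):
    """True if the address uses a generic, role-based mailbox name."""
    match = re.match(r"^([^@]+)@", email.strip().lower())
    return bool(match) and match.group(1) in GENERIC_LOCAL_PARTS

# Filter an uploaded list before it ever reaches the shared IP pool
uploads = ["jane.doe@example.com", "office@example.com", "AP@example.com"]
print([e for e in uploads if is_risky_address(e)])
# → ['office@example.com', 'AP@example.com']
```

A vendor would run something like this on every import and either reject the risky rows or quarantine them for review.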

And one more thing, vendors:

If I can monitor EVERY vendor IP, every 4 hours, with my homebrew web app… then there is no excuse if you aren’t monitoring your servers this way, too.


Raw data

You are welcome to perform your own analysis of the underlying data. Below is the Excel .xlsx file that I used, tidied up with comments and explanations. Start at the “Readme” tab.

If you are a glutton for punishment, here is the raw SQLite file containing log data like “192.168.0.1 got onto Blocklist X at time Y”.

Notes about methodology

  • To download a MAP vendor’s entire range of sending IPs, I looked at the default SPF “include” entry that they instruct customers to set up during onboarding. Then, I used https://dmarcian.com/spf-survey/ to do a recursive deep-dive into those records and bring up a complete list of IPs and IP ranges.
  • I used https://talosintelligence.com/reputation_center in order to get basic email sending volume and hostname information on every IP address within every IP range. I got a large volume of data by using repeated manual lookups and DeveloperTools to capture the responses for each range.
  • For ongoing blocklist monitoring, I used the excellent paid tool https://www.blacklistmaster.com/
  • ExactTarget (Salesforce Marketing Cloud) was not one of the vendors monitored as part of this research for cost reasons. Their base of 2,500 sending IPs would have upped the costs of monitoring to a higher tier. Perhaps you, dear reader, would like to monitor their reputation and write up your findings? (You can use my tools gratis to make analysis faster.)
  • For Campaign Monitor, I monitored the entire CM range, which included Sailthru and CreateSend addresses. This might intermingle the email reputation scores of different sister brands (assuming their mailouts aren’t already intermingled).
  • I didn’t monitor every single IP address in an ESP’s range. I chose ones that had non-zero email send volume. So, if the ESP added or dropped an IP from their sender pool, I would not have detected this. It is possible that some ESPs take out an IP from the sending pool once it hits a blocklist and “tag in” a new IP. I wouldn’t have been able to see this.
  • The minimum monitoring interval used was 4 hours.
    That means that a blocklisting that lasted for 2 hours may not show up at all, or show up as 4 hours. It also means that an outage that lasted for 5 hours would show up artificially long – as 8 hours.
  • If an IP got on a blocklist and never came off of it during our monitoring period, then I used an “end date” of “2023-03-20 17:38:03” for the listing. (Meaning that the actual stint on the blocklist would be longer than my assumed duration.)
  • The blocklist monitoring for different vendors did not all start at the same time. This would impact some average figures. Earliest loads for each vendor (UTC):
    • Act-On: 2023-01-19 23:28:03
    • Campaign Monitor: 2023-01-20 00:18:03
    • Eloqua: 2023-01-21 04:38:02
    • Pardot: 2023-01-21 05:00:04
    • HubSpot: 2023-01-21 05:28:03
    • Mailchimp: 2023-01-21 05:28:05
  • The “end date” for all monitoring was 2023-03-20 17:38:03 UTC
  • The MTBF and MTTR “total time” calculation used different total times based on the different monitoring durations for each vendor (noted above). The monitoring times were rounded up to the nearest 4-hour block. These are the times used:
    • Act-On: 1436
    • Campaign Monitor: 1436
    • Eloqua: 1404
    • Pardot: 1404
    • HubSpot: 1404
    • Mailchimp: 1404
    • Marketo: 1404
  • The way in which I calculated “MTBF” is not the same methodology as you’d use in a factory to calculate MTBF across a “pool of machines” (“Example 2 – Industrial Machinery” on https://nextservicesoftware.com/news/mean-time-between-failures-mtbf). I calculated MTBF on an IP-by-IP basis for IPs that made it onto a blocklist. This approach treats each IP as an individual, as opposed to looking at the health of the entire shared pool. I believe that this simpler approach still teaches us valuable lessons.
  • When I mention the “SpamHaus blocklist”, I am specifically referring to sbl.spamhaus.org. For SpamCop it is bl.spamcop.net and for SORBS it is dnsbl.sorbs.net.
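The recursive SPF expansion from the first bullet above can be sketched in a few lines. This is an illustration of the idea rather than SPF Surveyor’s actual code; the ESP domains below are made up, and a real run would swap the canned dictionary for a live DNS lookup (e.g. via dnspython):

```python
def parse_spf(record):
    """Split a 'v=spf1 ...' record into IP ranges and include: domains."""
    ips, includes = [], []
    for term in record.split():
        if term.startswith(("ip4:", "ip6:")):
            ips.append(term.split(":", 1)[1])
        elif term.startswith("include:"):
            includes.append(term.split(":", 1)[1])
    return ips, includes

def collect_ips(domain, lookup, seen=None):
    """Recursively follow include: chains, gathering all declared IP ranges."""
    seen = seen if seen is not None else set()
    if domain in seen:          # guard against include loops
        return []
    seen.add(domain)
    record = lookup(domain)
    if not record:
        return []
    ips, includes = parse_spf(record)
    for inc in includes:
        ips.extend(collect_ips(inc, lookup, seen))
    return ips

# Real use passes a DNS-backed lookup callable; here, a canned example:
fake_dns = {
    "example-esp.com": "v=spf1 include:pool.example-esp.com ~all",
    "pool.example-esp.com": "v=spf1 ip4:192.0.2.0/24 ip4:198.51.100.0/24 ~all",
}
print(collect_ips("example-esp.com", fake_dns.get))
# → ['192.0.2.0/24', '198.51.100.0/24']
```

Running this against a vendor’s default SPF “include” domain yields the CIDR ranges that feed the monitoring list, which is exactly what the recursive deep-dive produced.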

Other resources

The team at emailtooltester has great reports on deliverability from multiple ESPs. Their data focuses on actual mail delivered, broken down by receiving email provider (whereas mine focused only on blocklistings).


Copyright © 2024 Jacob Filipp
