
On Journal Acceptance Ratios, What They Mean, and Why They Matter

One of the cool features of publisher databases like Duotrope, The Grinder, and Chill Subs is that they give real-world data about journals, like the average response time and acceptance percentage. This info doesn’t come from the publishers, but from submitters who use the submission trackers on these sites.

The response time part of this is usually pretty helpful. Even if there are only a few reported responses, they give you a sense of how much the journal’s response time varies and a rough timeframe: at least whether you’ll be waiting a few days, a few weeks, or a few months.

The acceptance ratio is a stickier wicket, however. With this kind of data, a small sample size can dramatically skew the results, and who is doing the reporting makes a difference, too.

So how much can submitters trust the acceptance percentages on these platforms, which platform is the most accurate—and what does that figure really tell you, anyway? Here are my thoughts on the matter.

What the acceptance percentage tells you

The acceptance percentage shown on a listing is calculated from that site’s submission tracker data: the number of reported acceptances is divided by the total number of reported submissions, producing a percentage.
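If it helps to see that concretely, here’s the math in a few lines of Python (the listing numbers below are invented purely for illustration):

```python
# Acceptance percentage the way tracker sites compute it:
# reported acceptances / total reported submissions, as a percent.
def acceptance_percentage(acceptances: int, submissions: int) -> float:
    if submissions == 0:
        return 0.0  # no reports yet, nothing to compute
    return 100 * acceptances / submissions

# Hypothetical listing: 12 reported acceptances out of 400 reported submissions.
print(f"{acceptance_percentage(12, 400):.2f}%")  # -> 3.00%
```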

Looking at this data gives you a few potentially useful pieces of information:

Those can be useful things to know before you submit, especially if you’re trying to build an audience or career as a writer. That said, there are some things writers infer from acceptance percentages that they shouldn’t, like:

How accurate are the acceptance percentages on Duotrope, The Grinder, and Chill Subs?

This depends heavily on both the site and the market. Across the board, though, this number is not going to be completely accurate. Most of the time, the journal’s actual acceptance ratio is lower than reported. There are also some markets with a 0% acceptance rate across platforms whose actual acceptance percentage is higher than reported (though likely not by much).
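To see why under-reporting usually pushes the number up, here’s a toy simulation of my own (the reporting probabilities are assumptions, not data from any of these sites): if submitters log acceptances more reliably than rejections, the tracker’s ratio lands above the journal’s true one.

```python
import random

random.seed(42)

TRUE_RATE = 0.03      # the journal's actual acceptance rate (assumed)
P_LOG_ACCEPT = 0.95   # chance a submitter logs an acceptance (assumed)
P_LOG_REJECT = 0.60   # chance a submitter logs a rejection (assumed)

logged, logged_accepts = 0, 0
for _ in range(100_000):
    accepted = random.random() < TRUE_RATE
    # Acceptances get logged more reliably than rejections.
    if random.random() < (P_LOG_ACCEPT if accepted else P_LOG_REJECT):
        logged += 1
        logged_accepts += accepted

print(f"true rate:    {TRUE_RATE:.2%}")               # 3.00%
print(f"tracker rate: {logged_accepts / logged:.2%}") # ~4.7%, inflated
```

The size of the gap depends entirely on those assumed probabilities, but the direction of the skew doesn’t.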

There are a few reasons for this skew:

As for which site is more accurate, the answer is whichever has the most reports for that market. Usually this will be The Grinder for speculative journals and Duotrope for literary ones.

As a general rule, The Grinder seems to have more active users. Some markets there have upwards of 1,000 reports, while the most popular markets on Duotrope have around 600-700. In either case, that’s a good chunk, and the data for those listings is likely pretty close to the real statistics. Only a handful of places have that many reports, though. The vast majority on all 3 sites have 100 reports or fewer, and those percentages need to be viewed a bit more skeptically.

I’m lucky enough to have access to the backend side of a literary journal (After Happy Hour), so for one market I can actually say with certainty how accurate the reported percentages are.

First, the behind-the-scenes stats: in the last year, After Happy Hour has received 1,640 submissions and accepted 51, for an overall ratio of 3.11%.

This is skewed upwards a bit by our annual contest, which gets fewer entries than a standard submission period (fewer entries competing for a similar number of slots means a higher acceptance rate). If I look just at our last free online reading period, we accepted 21 of 850 submissions, or 2.47%.

Now, that’s based on overall submissions. We consider packets of up to 3 poems or flash fiction pieces and rarely accept more than one piece at a time, so some of those acceptances were also 1-2 “rejections” in a certain sense, but for the sake of comparison that stat is close enough.
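For the curious, here’s that arithmetic in one place. The contest split is my own back-of-the-envelope inference, assuming everything outside the open reading period came through the contest:

```python
# After Happy Hour's backend numbers from above.
total_subs, total_accepts = 1640, 51
open_subs, open_accepts = 850, 21

# Inferred contest split (assumes all other submissions were contest entries).
contest_subs = total_subs - open_subs           # 790
contest_accepts = total_accepts - open_accepts  # 30

print(f"overall:             {100 * total_accepts / total_subs:.2f}%")      # 3.11%
print(f"open reading period: {100 * open_accepts / open_subs:.2f}%")        # 2.47%
print(f"contest (inferred):  {100 * contest_accepts / contest_subs:.2f}%")  # 3.80%
```

The inferred contest rate sitting above the open-period rate is exactly the upward skew I described.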

For After Happy Hour specifically, Duotrope is currently the most accurate. It reports a 3.41% acceptance ratio for works, or 4.11% for submitters: a smidge high, but definitely in the ballpark.

Something else to note: there are still 2 pending responses on Duotrope, but I can state with confidence that all responses have been sent for the last reading period and the journal is currently closed. Those are likely a couple of the lingering unreported rejections I mentioned earlier, the kind that skew the ratio higher (not to judge; I’m guilty of this myself on occasion).

On the other side, Chill Subs is the least accurate. It has only slightly fewer reported submissions than Duotrope, but its reported acceptance rate is more than 3 times our backend number.

And Submission Grinder is inaccurate in the other direction. They split fiction and poetry listings, but there’s not a single reported acceptance across the two—as far as it’s concerned, After Happy Hour’s acceptance ratio is 0%.

Submission Grinder also has the fewest reports (37 total), which makes sense. The Grinder is better-known among genre writers, and while After Happy Hour does publish a lot of genre fiction (we did a whole issue of it last year), it’s not known as a “genre” journal.

Obviously, this exact trend isn’t going to be the same for every market. It all depends on where (or whether) the submitters a journal accepts report their submissions. But it gives you a sense of how variable this stat can be. Even though Duotrope is fairly accurate for us, that’s still based on just 88 reports, roughly 5% of the submissions the journal actually received.

So should I look at acceptance percentages?

You can look at them—but don’t put too much stock in them. These stats give you a vague sense of a journal’s place in the broader literary ecosystem. Comparing the stats between sites can be interesting, too, giving you a sense of which subsection of the community is the most interested in that publication. Ultimately, though, they shouldn’t be too much of a factor when you’re deciding where to send your work.
