On Journal Acceptance Ratios, What They Mean, and Why They Matter
One of the cool features of publisher databases like Duotrope, The Grinder, and Chill Subs is that they give real-world data about journals, like the average response time and acceptance percentage. This info doesn’t come from the publishers, but from submitters who use the submission trackers on these sites.
The response time part of this is usually pretty helpful. Even if there are only a few reported responses, you can get a sense from them of how much the journal’s response time varies, and a rough time-frame—at least whether you’ll be waiting a few days, a few weeks, or a few months.
The acceptance ratio can be a trickier wicket, however. When you’re looking at this kind of data, having a small sample size can dramatically skew the results. Who you’re sampling to collect that data makes a difference, too.
So how much can submitters trust the acceptance percentages on these platforms, which platform is the most accurate—and what does that figure really tell you, anyway? Here are my thoughts on the matter.
What the acceptance percentage tells you
The acceptance percentage shown on a listing is calculated from that site’s submission tracker data: the number of reported acceptances divided by the total number of reported submissions.
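As a quick illustration, here’s that math in a few lines of Python. The figures are hypothetical, not any site’s actual data or code:

```python
# A minimal sketch of the tracker math (hypothetical figures, not any site's code).
def acceptance_percentage(reported_acceptances: int, reported_submissions: int) -> float:
    """Reported acceptances divided by total reported submissions, as a percent."""
    if reported_submissions == 0:
        return 0.0  # nothing reported yet, so nothing to display
    return 100 * reported_acceptances / reported_submissions

# e.g. 3 reported acceptances out of 120 reported submissions:
print(f"{acceptance_percentage(3, 120):.2f}%")  # 2.50%
```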
Looking at this data gives you a few potentially useful pieces of information:
- Reach and audience. Generally, journals with more reported submissions have a broader reach and are better-known, at least within the literary community. This can mean more readers for your work if you’re published with them.
- Which genres they get the most of. Both The Grinder and Duotrope let you filter the responses to just fiction or poetry (as well as just art or CNF on Duotrope). You can compare these to see which markets might be looking for more of what you write, or which are popular with writers in your genre.
- Rough idea of their exclusivity. Every journal publishes only the best of what it receives, but that’s a narrower band of total submissions for some journals than others. Journals considered “top-tier” typically accept fewer than 1% of the submissions they receive, while most established journals fall in the 1-5% range. Journals with an acceptance rate higher than this are often newer, or have a small or niche audience.
Those can be useful things to know before you submit, especially if you’re trying to build an audience or career as a writer. That said, there are some things writers infer from acceptance percentages that they shouldn’t, like:
- The acceptance percentage isn’t a direct measure of the journal’s quality. There is a correlation here, admittedly: a more popular market can be pickier, publishing only the most polished of the many strong works they get for each issue. They’re also usually popular for a reason, like paying well or having an established readership, both of which often indicate a higher-quality publication. But this doesn’t mean a journal is “better” than another just because its acceptance ratio is lower. Things like publication frequency and the number of pieces per issue are factors, too, so it’s not always a 1-to-1 comparison.
- It’s not your odds of getting accepted. Acceptance percentages give you a rough idea of the journal’s exclusivity, but in the end it still comes down to your piece. Work that is polished, fits the journal’s aesthetic, and excites the editors will get accepted no matter what the percentage says. On the flip side, something with poor writing and sloppy editing isn’t going to get published by even the most forgiving market.
How accurate are the acceptance percentages on Duotrope, The Grinder, and Chill Subs?
This depends heavily on both the site and market. Across the board, though, this number is not going to be completely accurate. Most of the time, the journal’s actual acceptance ratio is lower than reported. There are also some markets with a 0% acceptance across platforms, whose actual acceptance percentage is higher than reported (though likely not by much).
There are a few reasons for this skew:
- The sample size is self-selecting. Think about the kind of writer who’s going to use a submission tracker. They’re at least semi-serious about their craft, enough to be aware of these websites and to submit often enough that it’s worth tracking. These are also the submitters who are most likely to be accepted: they’re more likely to have done their research to find the right market, and to have put care into the writing and editing process.
- People are more likely to report acceptances. Getting work published is exciting. You want to tell everyone, and that includes the folks on your sub tracker of choice. You might not be as eager to go in and mark your rejections, so rejections are more likely than acceptances to linger as “pending responses” or to never get reported at all.
- Only a portion of total submissions gets reported on each platform. A large percentage of submitters don’t use an online submission tracker; they have their own system, or no system at all. Folks who do use a tracker probably only use one of the three, so each platform shows a different sub-segment of the submitting population. This means each one usually doesn’t have a big enough sample size to produce an accurate percentage. And because acceptances are over-reported, each reported acceptance carries outsized weight in a small sample, which tends to skew the results higher, as the simulation sketch after this list illustrates.
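To get a feel for how these factors compound, here’s a small simulation sketch. Every parameter is an assumption for illustration: a journal with a true 3% acceptance rate whose submitters report 90% of their acceptances but only half of their rejections.

```python
import random

TRUE_RATE = 0.03        # assumed true acceptance rate
P_REPORT_ACCEPT = 0.90  # assumed chance an acceptance gets reported
P_REPORT_REJECT = 0.50  # assumed chance a rejection gets reported

def simulated_tracker_rate(total_submissions: int, rng: random.Random) -> float:
    """Acceptance percentage a tracker would display after biased reporting."""
    accepted_reports = rejected_reports = 0
    for _ in range(total_submissions):
        accepted = rng.random() < TRUE_RATE
        p_report = P_REPORT_ACCEPT if accepted else P_REPORT_REJECT
        if rng.random() < p_report:
            if accepted:
                accepted_reports += 1
            else:
                rejected_reports += 1
    reported = accepted_reports + rejected_reports
    return 100 * accepted_reports / reported if reported else 0.0

rng = random.Random(42)
rates = [simulated_tracker_rate(100, rng) for _ in range(1000)]
print(f"average displayed rate: {sum(rates) / len(rates):.1f}%")
```

Under these assumptions, the displayed rate averages a bit over 5%, well above the true 3%, and individual 100-submission samples can swing anywhere from 0% to over 10%.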
As far as which is more accurate, the answer is whichever has the most reports for that market. Usually this will be The Grinder for speculative journals and Duotrope for literary ones.
As a general rule, The Grinder seems to have more active users. Some markets there have upwards of 1,000 reports, while the most popular markets on Duotrope have around 600-700. In either case, that’s a good chunk, and the data for those listings is likely pretty close to the real statistics. Only a handful of places have that many, though. The vast majority on all 3 sites have 100 reports or fewer, and those percentages need to be viewed a bit more skeptically.
I’m lucky enough to have access to the backend of a literary journal (After Happy Hour), so for one market I can say with certainty how accurate the reported percentages are.
First, the behind the scenes stats: in the last year, After Happy Hour has received 1,640 submissions and accepted 51, for an overall ratio of 3.11%.
This is skewed upwards a bit by our annual contest, which gets fewer entries than a standard submission period (and so has a higher acceptance ratio). If I look just at our last free online reading period, we accepted 21 of 850 submissions, or 2.47%.
Now, that’s based on overall submissions. We consider packets of up to 3 poems or flash fiction pieces and rarely accept more than one piece at a time, so some of those acceptances also carried 1-2 “rejections” in a certain sense. For the sake of comparison, though, that stat is close enough.
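If you want to check those figures yourself, it’s the same acceptances-over-submissions math from earlier:

```python
# After Happy Hour's backend figures, as quoted above:
print(f"{100 * 51 / 1640:.2f}%")  # 3.11% overall for the last year
print(f"{100 * 21 / 850:.2f}%")   # 2.47% for the last free online reading period
```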
For After Happy Hour specifically, at this moment in time, Duotrope is the most accurate. It reports a 3.41% acceptance ratio for works, or 4.11% for submitters—a smidge high, but definitely in the ballpark.
Something else to note: there are still 2 pending responses on Duotrope, but I can state with confidence that all responses have been sent for the last reading period and the journal is currently closed. Those are likely a couple of those lingering rejections I mentioned earlier that skew the ratio higher (not to judge—I’m guilty of this myself on occasion).
On the other side, Chill Subs is the least accurate. It has only slightly fewer reported submissions than Duotrope, but the acceptance rate it shows is more than 3 times our backend stats.
And Submission Grinder is inaccurate in the other direction. It splits fiction and poetry into separate listings, but there’s not a single reported acceptance across the two; as far as it’s concerned, After Happy Hour’s acceptance ratio is 0%.
Submission Grinder also has the fewest reports (37 total), which makes sense. The Grinder is better-known among genre writers, and while After Happy Hour does publish a lot of genre fiction (we did a whole issue of it last year), it’s not known as a “genre” journal.
Obviously, this exact trend isn’t going to be the same for every market. It all depends on where (or if) the submitters they happen to accept report their submissions. But it gives you a sense of how variable this stat can be. Even though Duotrope is fairly accurate, that’s still based on just 88 reports, roughly 5% of the submissions the journal actually received.
So should I look at acceptance percentages?
You can look at them—but don’t put too much stock in them. These stats give you a vague sense of a journal’s place in the broader literary ecosystem. Comparing the stats between sites can be interesting, too, giving you a sense of which subsection of the community is the most interested in that publication. Ultimately, though, they shouldn’t be too much of a factor when you’re deciding where to send your work.