How The Bandit Submissions Process Works

Photo by Yannik Mika on Unsplash

Greg Forrester
Managing Director, Bandit Fiction

How does a publisher decide which submissions are good and which aren’t? What makes something ‘publishable’? And, perhaps more importantly, what makes something ‘unpublishable’?

The answers to these questions aren’t always things writers want to hear; typically, they’re a little smoke and mirrors, using words like form, character, and technique. I’m not here to tell you that we’re not like those other publishers, that, here you go, here’s our checklist, enjoy yourself. What I am here to tell you is exactly how our submissions process works, from inbox to outbox, and as I go, I’ll try my best to explain and expand on some of the qualities we look for, and what they mean to us.

Step One: Allocating

The vast, vast majority of our submissions come in via a Google Form*. Writers are asked to attach up to three works to the form itself and tell us some of the key things we need to know (like their email address and the name/s of the work/s). Once a writer submits their form, all attachments are dropped into a Google Drive folder which our team of readers can access.

*I’ll address now one of the questions we get asked the most: why don’t you use Submittable? The answer is very simple: it’s too expensive. The last time we enquired about it, over a year ago at this point, we were quoted a yearly price roughly three quarters of our annual turnover.

Next, one of our Submission Managers, Lucy or Letitia (or occasionally me), will go through the drive and add submissions to a spreadsheet. The spreadsheet we use looks a little like this:

As you can see, once submissions are added to the spreadsheet they’re randomly assigned to two readers (this usually happens on a Sunday), and each reader is asked to score each submission out of 10.
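That random allocation step could be sketched like this (a minimal illustration only; the reader names and data structures here are hypothetical, not Bandit's actual tooling):

```python
import random

# Hypothetical pool of readers
READERS = ["Reader A", "Reader B", "Reader C", "Reader D"]

def allocate(submissions, readers=READERS):
    """Randomly assign two distinct readers to each submission."""
    return {sub: random.sample(readers, 2) for sub in submissions}

assignments = allocate(["Submission 1", "Submission 2"])
for sub, pair in assignments.items():
    print(sub, "->", pair)
```

Using `random.sample` (rather than two separate picks) guarantees each submission gets two *different* readers.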

Step Two: Scoring

We’ll now set aside the admin aspect of things for a little while and I’ll talk about how decisions are made, because it’s much less clinical than a score of 1–10 might suggest.

First and foremost, readers are simply asked to think about how much they liked the submission. After all, we’re all readers ourselves, and liking a story, even if we can’t always quantify why, is key; and similarly, we want the works we publish to connect with readers, and no casual, reading-for-fun reader arrives at a story or a poem with a list of criteria to be ticked off before the enjoyment can begin. So that’s point of consideration one: how much do you like the submission? After that, we move on to some of the more technical aspects.

With that sense of enjoyment still fresh, readers are asked to consider the following:

  • How original is it?
  • How’s it formatted?
  • Has it been edited properly? Or do spelling and grammatical errors abound?
  • Is there a sense of craft coming through in the sentence structure, the syntax, the word choice?

Typically, there’s no wrong answer to any of these questions, but there are answers that are more right than others. Let’s look at the question of originality, for example. As a rule, stories and poems which do something brand new stay with the reader the longest; perhaps they tell a story in a style you’ve never read before, express emotion in a way that you’ve always understood but never been able to verbalise, place characters and countries before you in a way that’s immediately engaging. All of this counts towards originality, but there’s also something to be said for the old faithful, for the familiar tale told so exquisitely that it doesn’t matter if you’ve heard it a hundred times before. That’s why all those questions above are considerations rather than rules: because there isn’t any one, single correct answer to them.

Step Two and a Half: My Way of Thinking

I’ve been scoring stories with Bandit since day one and during this time I’ve developed my own, over-simplified way of seeing the scoring process, which goes a little something like this:

  • Anything scoring 4 or less – these stories weren’t ready for submission. There could be some decent, redeemable things in there, but as is, they’re a distance from being of publishable standard.
  • Between 5 and 6 – these stories generally fall into one of two categories: either they’re strong works with one very clear weakness, or they’re stories I’ve forgotten about 5 minutes after I’ve finished reading.
  • 7 and 8 – these are the toss-of-a-coin submissions: good, well-written and engaging stories that don’t necessarily have anything wrong with them, but might be missing that little something that would make them great. It’s these submissions which generally benefit (or lose out) because of personal preferences in style, genre, and the like.
  • 9 and 10 – I’m telling someone about these stories, usually before I’ve finished reading them, because there’s just something in them that’s grabbed me. I’d be highly surprised if I couldn’t remember every story I’ve ever scored a 9 or 10.
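
My over-simplified bands above could be written as a simple lookup (purely illustrative; the band labels are my shorthand, not an official rubric):

```python
def score_band(score):
    """Map a 1-10 score to the rough bands described above (illustrative only)."""
    if score <= 4:
        return "not ready for submission"
    elif score <= 6:
        return "strong with one clear weakness, or forgettable"
    elif score <= 8:
        return "toss of a coin"
    else:
        return "telling someone about it"

print(score_band(7))  # -> "toss of a coin"
```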

Step Three: Deciding

Once readers have thought about all those aspects of the submission and have decided on a score, those scores are added to our spreadsheet. Let’s revisit the spreadsheet to talk about what happens next.

As you can see, each submission has two scores between 1 and 10 assigned to it, one per reader. You might also have noticed the far-right column, which holds the average of the two readers’ scores. This is the key number for us, as far as deciding which subs are accepted is concerned, because submissions which score 7.5 or higher are accepted.

Using the screenshot above as an example, Submissions 3 (with an average score of 8) and 9 (with an average score of 7.5) would be accepted. The others, with the exception of Submission 10, would be rejected.

If you look closely at Submission 10, the final one on the spreadsheet, you’ll notice it received scores of 5 and 9. It’s not unheard-of for submissions to receive vastly different scores, and where one of those scores is a 9 or 10 but the piece isn’t automatically accepted, we have something of a failsafe: an extra step in the process. Even though this sub didn’t make the 7.5 grade, we ask for a third opinion from our Deputy Editor-in-Chief, Tom. The reason we do this is, hopefully, quite simple: for a reader to score anything a 9 or 10, they must have loved it, so there’s every chance there’s an audience out there who’ll love this submission just as much.
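
The whole decision rule, including the third-opinion failsafe, boils down to a few lines (a sketch of the logic as described above, not our actual spreadsheet formulas):

```python
ACCEPT_THRESHOLD = 7.5

def decide(score_a, score_b):
    """Apply the 7.5-average rule, escalating to a third opinion
    when one reader scored the piece a 9 or 10."""
    average = (score_a + score_b) / 2
    if average >= ACCEPT_THRESHOLD:
        return "accept"
    if max(score_a, score_b) >= 9:
        return "third opinion"  # goes to the Deputy Editor-in-Chief
    return "reject"

print(decide(8, 8))  # -> "accept"
print(decide(5, 9))  # -> "third opinion" (average of 7, but one reader loved it)
print(decide(5, 6))  # -> "reject"
```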

Step Four: Informing

Once a decision is made on a submission, it falls to our Submission Managers, Lucy and Letitia, to email out either acceptances or rejections. Acceptance emails are lovely to send out; rejection emails are genuinely unpleasant. I’ve sent out hundreds of rejection emails, and that feeling, especially as a writer myself, never gets easier.

To make the process manageable (sending acceptance/rejection emails takes a couple of hours every week, which, given that we’re all volunteers, is a huge chunk of work), standard templates are used for both acceptances and rejections. What this means is writers never find out what scores their submissions received; works which were so close to being accepted but ultimately fell short receive the same email as those which, to be blunt, weren’t yet ready to see the light of day.*

*We’re currently looking at this, and investigating whether it’s feasible (and helpful) for us to point out where a submission very nearly made the grade.

So that’s how our submissions work, from start to finish. If you’ve got any questions about any stage of the process, feel free to leave a comment below and I’ll try to answer as many questions as I can.


About the Contributor

Greg (he/him) is Managing Director and a Co-Founder of Bandit Fiction. He has BA and MA degrees in Creative Writing from the University of Sunderland and Newcastle University respectively, and is currently working towards a PhD in Creative Writing. He writes mostly magical realism, is interested in folklore and explorations of otherness, and has been published by Fairlight Books and TL;DR Press.
