I think my vision of a good journal is a very close relationship between readers and writers -- readers becoming writers, writers staying on as readers even as they stop publishing. I think the landscape now is a huge variety of mostly interchangeable journals that writers are mass submitting to. "Read the journals to see if your work is a good fit," sure, but the distinctions are not always easy to describe or see.
Which is to say, if new tech is causing trouble it's because our monocultural world was already headed in this direction. My hope would be that journals can continue to thrive by more clearly defining themselves and thereby cultivating a closer relationship between readers and writers, drawing writers from an enthusiastic readership.
I agree with you, but I'd also say there have always been a lot of writers who just spam out submissions to magazines they don't read. When I was in college, in the pre-online-submission days, our university's tiny no-circulation lit mag got tons of mailed submissions from all over just because, I guess, we were listed in some directory of lit mags. Online submissions just made spamming them out easier. Most of this stuff is easy to weed out... but it still takes time if there are a lot of weeds to pull.
I hadn't even thought of the onslaught of AI submissions to lit mags. I know Upwork is already swarming with lazy people trying to get writers to revise ChatGPT stories and romance novels, people offering thirty dollars for developmental editing of 30,000 words. It’s ridiculous. Becky Tuch’s recent essay, which you mentioned, was another eye-opener for me.
It will only get worse from here.
I suspect many of the people behind this flood of submissions might not be "wannabe writers" so much as content farms, whether already established or aspiring, following the well-worn spam model of just getting so much low-effort material out there that even with a very low return rate, you can get something out of it. This might not be very realistic on their part—unlike with a phishing email or link to clickbait, an AI-written short story that somehow made it through to an initial acceptance would probably still be found out under increased scrutiny from the editor—but they may just be seeing the word "paid" on the website and assuming, without any awareness of the actual state of lit mag publishing, that this is some sort of untapped gold mine.
I'm trying to work out the appropriate level of panic editors should have. ChatGPT and its ilk, and the torrent of mediocrity they produce, already have copyeditors and proofreaders fretting that they'll be supplanted by flawless copy with minimum viable substance. They're not wrong. Developmental editors are safe for now, but I'm not too confident that will last.
So... should I stop writing? I don’t have any established connections or recommendations, so it seems like I’m wasting my time if publishing closes its doors even tighter. PS: of course, no one stops me from writing for myself, but what’s the point?
Netflix is full of content and still sometimes we just surf through.
You are the content we are looking for
The echoes we hear and pay attention to
That's crazy! Big Four trade publishers will be next. Never thought the submission slush pile could become infinite.
I'm not familiar with the technology, but it doesn't seem too far-fetched that a kind of receipt/tag/token could correspond to each output. Instead of email, all submissions would have to go through a Submittable-style platform that auto-filters any submission matching a generative AI receipt. Maybe, finally, blockchain found its purpose.
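A minimal sketch of how that receipt idea could look, assuming (purely hypothetically) that generators published a fingerprint of every output to some shared registry; the names here (RECEIPT_REGISTRY, publish_receipt, auto_filter) are illustrative, not a real API:

```python
# Hypothetical sketch only: assumes AI providers publish a "receipt"
# (here, just a SHA-256 hash) for every generated text to a shared registry.
import hashlib

RECEIPT_REGISTRY: set[str] = set()  # stand-in for a shared/public ledger

def publish_receipt(generated_text: str) -> str:
    """Generator side: register a fingerprint of each output it produces."""
    receipt = hashlib.sha256(generated_text.encode("utf-8")).hexdigest()
    RECEIPT_REGISTRY.add(receipt)
    return receipt

def auto_filter(submission_text: str) -> bool:
    """Submission-platform side: flag any submission matching a known receipt."""
    fingerprint = hashlib.sha256(submission_text.encode("utf-8")).hexdigest()
    return fingerprint in RECEIPT_REGISTRY
```

Exact hashing would be defeated by even a light rewrite, so anything practical would need watermarking or fuzzy matching rather than literal fingerprints; this only illustrates the shape of the auto-filter the comment imagines.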
This is interesting. I tried a fiction experiment with ChatGPT and found it competent only in a very basic way. As ChatGPT stands now, a reader for a lit mag can likely tell within the first couple of paragraphs if a story is AI-generated and simply hit the reject button. That may soon change, though. I think AI crap is probably already gunking up the vastly overcrowded landscape of self-published books.
Without straying into panic mode, this looks for all the world like the end of the centralized gatekeeper. (Or another big step in that direction.)
Agents, editorial teams, whole publications and publishing houses have finite resources for filtering an effectively infinite supply of content.
What can be done when the cost of filtering and curation is much higher than a near-zero cost of production?
Your suggestions about closing the open pipeline are about the only thing that makes sense.
I expect that on the other side, from the artists and producers of work, we're going to see more of a need to seriously rethink the business model. It may well be that the future is becoming a 'trusted human source', building communities through networking, and otherwise publishing, distributing, and reaching fans far away from any of the big public curators -- whether that means a magazine, a traditional publisher, or (may the lord help us) the Kindle store.
Interesting times ahead.
Oof... :/
I wonder if this is why magazines are taking so long to get back to me. I have two short stories out on submission, and one is four times past the usual response time. I sent an email query, but I haven’t received a response yet.
Sending out a hopeful pitch or submission to a magazine often feels like sending out a message in a bottle. Like you say, this will ultimately lead to more personal and elitist interactions between editors and writers. Not sure how to counter it, but is there a way to anonymize such interactions while maintaining a personal touch?
This is a sign for whoever has been waiting to start writing: go for it.
If people are really reading randomized words, then you have nothing to worry about with your first work.
If people are tired of all this generic writing, then you are a welcome read.
Thank you for the great article!
This dumpster fire is getting bigger and bigger. I wrote an article about AI on campus. Tom Gauld was kind enough to give me permission to use one of his cartoons in my post. It pretty much sums up this new development:
"But it's just churning out derivative drivel? ... How do I make it stop?"
"It never stops! You're welcome!"
https://nancyscuri.substack.com/p/i-think-you-know-what-the-problem
Def think AI is enabling savvy indie authors to self-pub a book a week...
I find ChatGPT so useful for research, though. The other day I was like, tell me every book and peer-reviewed study regarding LSD and ADHD, and it shot out a useful bibliography in five seconds. Would have taken me way longer to do it myself.
I'm definitely not trying to argue these LLMs have no use, just that this is one bad use I'm worried about.
I know that Clarkesworld ruled it out, but I don't see why paper-only submissions wouldn't mostly solve this problem.