This year’s Blaugust event features well over a hundred participating blogs with fifty-nine completely new participants, which is incredibly rad, but with a certain technology on the rise, there have been concerns about AI-written content and the event. Belghast provided some great insight about his thoughts over here on his blog (highly recommend giving that a read since it inspired this post) and as such, I wanted to also share some thoughts.
This post will just share some thoughts on AI-generated content in the context of Blaugust. I don’t hold any power here and obviously cannot prevent people from doing anything of the sort, and I don’t want to tell people not to use this thing or that thing… but what I can do is share my perspective and express my discomfort, so that’s essentially what this post is about.
AI Art in Blog Posts
For starters, I don’t like AI-generated images, but I don’t think they’re an issue if people use them simply to break up walls of text. For what it’s worth, I split up my text with stock images hosted on Pexels, typically of cute animals, because they’re pleasing to look at. Most of the time, they have literally nothing to do with the post in question. They’re literally just eye candy.
I also use screenshots when I talk about games, or bad Paint drawings that I literally create within minutes. I think that adds a bit of individuality to my blog, and it might even make someone chuckle a tad. Most of these Paint drawings (or doodles) are years old at this point, drawn on a laptop’s touchpad… and you can tell.
With that in mind, I’m sure someone might think that AI art isn’t that different from me intentionally drawing something very poorly in Paint… I mean, both take barely any effort and look ugly. But I think the core difference here is that one of these requires you to actually do something creative, while the other requires you to boot up a program that generates questionable art based on stolen artwork. One of these requires you to be good at it, or at least practice until you are… and the other requires you to literally search for “cat holding an umbrella” so that the program can vomit some amalgamation of various artists’ hard work and efforts onto your screen for you to use in whatever capacity you wanna use it in.
I think it’s clear which one is AI.
But as long as people don’t sell AI art or submit it to fucking art contests, as some have done, I think it’s fine. Suppose you want actual art done for you. In that case, I think commissioning it from artists and supporting the scene is incredibly important at the moment. For what it’s worth, AI-generated art can be pretty good for reference images.
In the context of blogging, using it to break apart walls of text is perfectly fine. I won’t do it myself, though, and I might feel a bit uncomfortable with people using it because of the ethical concerns around it. That’s all it is, basically; I won’t tell people off… at least I currently don’t think that’s the way to go about it.

AI Blog Posts
The Blaugust event, at least to me, is about a few things:
- Bring the Community together
- Give newbies some help and give blogging veterans a reason to get back into it
- Have a fun little challenge to participate in
To do this, we’ve got a Discord where people have conversations and discussions, we comment on each other’s blog posts or riff off of them, link to other folks, and generally… try to have fun by sharing the same hobby. For the challenge aspect of it, there are even awards, and there certainly are many people out there who participate in Blaugust especially because of the awards. Heck, the Rainbow Award is pretty darn amazing. It looks quite stellar, doesn’t it? Why not go for 31 posts this year as well?
So, to me at least, AI blog posts don’t really fit into that because you’re not participating in anything community-related, and if anything, I’d imagine that using AI for this might even cause other people discomfort or draw their ire. After all, AI is being trained on stolen data. For AI “art”, the programs just regurgitate stolen assets until the result looks good. LLMs do the same, but with articles, posts, reviews, etc. that other people have put a lot of effort into, and then they just replicate all that.
For what it’s worth, I don’t think AI-written content on its own is the future. Rather, it’s a punch in the face for anyone who actually puts in the effort of writing a post. It’s the same as literally copy-pasting blog posts onto your blog and then pretending that you’re the originator.
Remember the whole section at the bottom of my posts? If not, you’ll see it later. It’s there precisely because content scrapers are a huge pain in the arse. They won’t have the same domain authority as my blog and hence won’t outperform it, but I’ll get spammed with their pingbacks, and they might still somehow generate ad revenue through that shit, so I have to go through the effort of removing their stuff via DMCA.
Of course, you end up with completely different posts if you use AI because it meshes together a lot of data from a lot of websites and sources… but it lacks heart and passion and I don’t really get why one would do that.

Transformative Content
Now, I do think there is a way for AI/LLMs to be used ethically… and that is transformative content.
If you’re around on YouTube, you might remember that word being used when it comes to “react content”. If not, it essentially boils down to whether or not a creator may incorporate another creator’s content into their own.
So, a video might include snippets of another creator’s video if it criticizes it and doesn’t go beyond certain parameters. If you have some insights to provide on a matter, you might show clips for context. What’s important is that your “reaction”/“insights”/etc. amount to more than what the video could provide on its own.
In that way, I do think that using tracing, references and tutorials is perfectly fine when you do art and wanna practice, as long as you don’t simply copy someone else’s work and then pass it off as your own. A lot of people frown upon tracing but I think it’s a good way to get better, specifically at stuff like proportions, anatomy, line art, etc.
Some folks have deemed, for copyright and contests and stuff, that AI art can be used in an artwork as long as the originally generated image is no longer recognizable.
So, in Art, if you use AI to generate an umbrella, for instance, and make it appear at the bottom of the screen, you might draw a cat on top of that umbrella and transform the image to have much more content than the original AI prompt had. Who knows, maybe you’ll draw over the umbrella or simply use its shape to figure out how the tail is gonna look, or maybe you’ll use AI to generate a colour palette or some abstract shapes that you then create an image out of.
Taking inspiration from AI Art or using AI Art as a reference image is perfectly valid and in the same vein, I don’t think it’s bad if AI is being used to generate prompts or create inspiration for blog posts.
The issue, however, is when the whole thing or a large portion of the post in question is generated by AI.
Ethics of AI-generated Blog Posts
So, when it comes to ethics, I think there are a few considerations that one should make, namely:
- Human Involvement
- Creativity
- Transparency
- Legal and Copyright Issues
I think these are the four main things that one needs to consider when it comes to ethics here. I think that this is the same as with “react content” on YouTube (which I’m not a fan of, btw, but that’s a topic for another time, perhaps).

Look at me being all cool and hip and super rad and stuffs
I’ll go into more detail about my thoughts on these but I figured it would also be funny to ask ChatGPT about it all and provide the machine’s answer at the end of each section. I hope this appeases the AI stans.
Oh, btw, I’ll say “ChatGPT said/says” in these instances but ChatGPT is not a sentient being. I know that. It’s also not AI, it’s an LLM, but people seem to use those words interchangeably these days, so I’m just going with those words. :P
(1) Human Involvement
The first point is about how much work a person has put into the content past the generation of a prompt. So, if a human plays an active role in the creation of a post, it can be considered written by a human. In that way, there are articles on BuzzFeed, for instance, which are generated by AI but edited by a human… and those are posts that I would not consider “written by a human”. They’re just edited to appease the algorithm and corrected to fix issues or adjust the language.
If you simply ask a Large Language Model for a prompt and then write the post according to that prompt, that’s totally fine by me, and I personally would deem that totally okay. It’s still written by you, after all.
On this matter, ChatGPT had to say…
“If a human actively directs the model, selects the prompts, and then reviews, edits, or curates the output, the content can be considered as “written by a human” to some extent. In this case, the human acts as the creative director or editor, using the AI as a tool.”

(2) Creativity
This one is about the creative process, essentially, or the “heart” that you pour into this hobby. I think, especially when it comes to events like Blaugust, this one is an important factor as you’re participating alongside other humans and in a way perhaps even competing with them.
Without creativity and without some form of passion, I think generating content through LLMs just spits in the face of everyone else participating.
Now, it’s important to note here that, in my opinion, it’s difficult to have a creative process in the first place if a machine writes out the whole thing, just like how it’s difficult not to be influenced by other reviews you’ve read on a game, book or movie that you’re reviewing.
Just by having an outline or even blocks of text generated by LLMs, your creative process is already heavily tampered with. It becomes a lot harder to move away from what the machine said and create something yourself.
Anyway, ChatGPT said on this matter…
“The core creative process—deciding on the topic, structuring the content, and infusing it with unique insights—typically defines authorship. If a person plays a significant role in these aspects, even if an AI drafts the content, the person could be considered the author.”
So, when I asked “What are your thoughts on the creative process involved when using AI for blogging? Please keep it short”, it started talking about authorship and how a person could be considered the author as long as their structure, unique insights and the general core creative process stems from the author and not the machine itself.

(3) Transparency
When it comes to ethics, transparency is something that one simply has to talk about. AI usage, for instance, should be disclosed in the post, in my opinion, just like how a video game’s store page should declare that the game contains AI art, and just like how food packaging should tell you when there’s something in the food that you might wanna know about.
Being transparent is very important in content creation in general. If I get a key for free from developers, I’ll declare that to my viewers and readers so they know that I got something for free and didn’t pay the price for the game, for example.
Your perception of my review of a game will change based on me having bought the game or me having gotten the key for free. If I got paid for an opportunity, I’d also have to tell you that, after all.
By being transparent with this stuff, you build up trust and also prepare readers and viewers for what is to come. A lack of transparency, however, damages trust and could even result in people shying away from all future content you create. Hence, it’s better to declare it right from the get-go rather than to deceive others. In the case of reviews, for instance, if nothing is mentioned, people might think you bought the game at full price. In the case of blog posts, people will often assume they’re written by a human… and if something feels odd, they might check whether it’s written by AI or even accuse you of such.
If a food item contained pork, for instance, I would not buy it since I don’t eat pork and I’d be upset once I found out that it wasn’t declared on the packaging. If a raving review was sponsored, I’d be incredibly sceptical of it, especially if it wasn’t declared as such at the beginning of the post.
One’s feelings about and perception of anything a creator does are heavily based on that level of transparency, and ethical usage of AI in written content requires an extra amount of it, in my opinion.
Anyhow, ChatGPT says:
“Ethical considerations come into play regarding how the content is attributed. If the post is fully or partially AI-generated, it’s considered best practice to disclose this to maintain transparency. This is especially important in contexts where originality and authorship are highly valued.”
…which feels a bit short, so I asked why transparency is important and ChatGPT said:
“Transparency is important because it builds trust, ensures honesty, and helps readers understand the true source of the content. It also maintains the integrity of the work and respects the audience’s right to know how the content was created.”
…which I think is valid.

(4) Legal and Copyright Stuff
Last but not least, this one is fairly simple: Copyright belongs to humans. AI isn’t human. As such, AI does not have a copyright to anything it generates.
There is no copyright on AI output, and as such, it belongs to nobody, no matter whether it’s art or text or whatever. Just because you entered a prompt does not mean that you own the generated text or art.
But this is where the biggest ethical concerns come up: What about the art or text that was used to train these LLMs and AIs?
Well, that data (art, writing, voice, etc.) belongs to the original author. If they gave their consent for the usage of their stuff in AI training, that’s ethical… but that’s not what’s happening.
A while back, a list of artists whose art was used to train Midjourney got leaked, and there’s this funny clip over here from a podcast where RubberRoss (pretty cool guy) talks about being on that list.
Anyhow, that was just a side note, but essentially, stolen data is being used to train the AI so that it can then generate content… and that content is not owned by anyone. If it suddenly looks exactly like RubberRoss’ art and animation, for instance, it is not owned by him. That’s a problem.
But also, you don’t own shit when you generate art like that, so if someone steals from you, that’s totally okay because you can’t be robbed of something you don’t own. Like, you can’t steal money from me because I’m broke. You can’t steal my car because I don’t own a car. Get what I mean?
Anyhow, ChatGPT generated an answer for this issue as well and, as I mentioned, I don’t own that answer or writing there… anyway…
“From a legal perspective, the question of authorship can be more complicated. Current copyright laws in many jurisdictions don’t recognize AI as an author, so any content generated by AI under the direction of a person might still legally belong to that person. However, this area is evolving and may vary depending on specific cases and jurisdictions.”
or so it says, but in reality, I’m using the free version of ChatGPT, so its data is a bit old on that front, I think. Officially, there is no ownership over AI text since AI can’t be an author, as it isn’t human. You, as the person who entered the prompt, also don’t own it because you didn’t create it.
There are, however, as ChatGPT said, laws in the works. This area is evolving, and discussions are being had over and over again about this stuff.

Damn, this post was long
Overall, to make AI usage in an event like Blaugust “ethical”, it would take a lot of effort and control and I think it’d be much better if people just didn’t participate if all of their content is written by AI.
More importantly, though, I think it’s just unfair and kinda shameless for people to pass off AI-generated text as blog posts written by a human. So, I’ll judge people who use AI extensively to generate posts if they aren’t transparent about their AI usage. >:O
If you have any thoughts on any of this stuff, let me know. Also, how do you like my drawings?
This post was originally written by Dan Dicere from Indiecator.
If you see this article anywhere other than Indiecator.org then this article has been scraped. Please let me know about this via E-Mail.

I wouldn’t use it to write entirely; proofreading is one thing, though. As for images: as long as people don’t use them for profit and it’s just for personal use, it should be fine. One blogger I follow uses them a lot because not everyone is an artist, and they want to create their own images; this is just another way to do it, along with programs like Canva.
I found a CNN article about someone who won an art contest with an AI image and said they did more than just type something and did some tweaking. It did make me learn not to be so hostile towards AI art, but going forward it seems unfair for that to be in a contest for actual art that was painted or drawn manually.
AI creations should be separate in those things. If people want to use them for their own personal use like their website then that’s fine.
A good read that caused me to do a bit of additional thinking about the application of AI.
I don’t have a ‘kneejerk’ “Kill the AI!” reaction. I think there is a place for AI in human creativity as an aid: acting as a proofreader as Emily says, providing writing prompts for inspiration, maybe a few images to break up the walls of text.
But any creation that is wholly AI generated should be designated as such. And I personally don’t see a place for such content in human content events.
I think of AI as a starting point, not the end of creation; the human must complete the final work themselves. We all get some ideas from random sources, and it is up to us to decide the final result.
Yeah, if AI is used as a tool to aid, I think that’s fine, but when it’s replacing the writing process altogether, that’s bad.
Sure, you are right. There is always a need for the human mind.