OP's example is pretty convincing, but every time I see a Newbie or Jr. Member either creating threads or just making posts in impeccable English, all the alarm bells go off in my head because I know such a phenomenon was almost unheard of prior to idiots gaining access to AI tools. In other words, most new members' first language isn't English (and it's probably not even in the top five) and hasn't been for years.
That's the basic way to detect that a post has been generated by an AI model. You can't expect someone whose earlier posts, presumably written by themselves, were in broken English to suddenly become a flawless English writer in their later posts; that's when you realize something is wrong. The AI uses perfect punctuation, the sentence starters I mentioned in the OP, and most importantly, perfect grammar. Even a native speaker makes mistakes sometimes, but AI doesn't, and that makes it easy to detect, unless someone edits the text after generating it. But those who can't write a few sentences themselves would barely have enough brain to do that.
The reason I think the punishment should be so harsh is that if this shit isn't nipped in the bud, bitcointalk is going to turn into a forum of bots talking nonsense to each other, flooded with threads on moronic topics in which the OP doesn't really say much of anything.
That's why I suggested there should be an official rule about it, and if there isn't and AI content falls under plagiarism, the culprits should get the same punishment as plagiarizers.
Actually, has anyone checked out the Economics section lately? Armageddon might have already arrived.
I do, every day, but you know what? The discussion boards are even worse. Bitcoin Discussion is still holding up, but Altcoin Discussion is so full of spam and spammers that for every post you report, two more get posted.