Across Saint Lucia and the wider world, a growing sameness is creeping into writing and speech, especially in professional and public-facing settings. Whether it’s a corporate email, a ministerial statement, a government report, or an Instagram caption, some have noticed language becoming more predictable and polished, yet less personal.
To some, the culprit seems clear: the overuse, or misuse, of artificial intelligence (AI) tools such as ChatGPT and other text generators.
“You may have noticed that responses you’re getting are starting to look or sound alike,” says Jim Joseph, managing director of MAP IT Solutions Ltd.
“So just like when you type on your phone, you have what’s called predictive text. It predicts the next letter or word. AI is doing that, but on a much more advanced level. It’s predicting the next phrase or sentence based on the context of what you typed.”
Joseph, a tech expert who also conducts AI literacy workshops, says this is not necessarily the fault of the technology but of how people use it. He explains that many users prompt AI systems with vague instructions and accept the first draft they get back, resulting in content that sounds polished but lacks depth or originality.
“People often take the first or second draft it gives them without understanding they’re supposed to prompt it further to refine the result,” he says.
“It’s not just asking ChatGPT to spit something out for you; it’s a dialogue. The human has to be the final authority.”
“You have to see [AI] as a researcher, as a writer, an editor, a proofreader, a critic. It’s a companion, not a know-it-all. You must guide it.”
While Joseph approaches AI as a powerful tool to be used with intention, others — especially creatives — worry about what’s being lost in this growing dependence on machine-generated language.
Amanie Mathurin, a Saint Lucian writer, has noticed the shift in tone and voice not just in government statements and corporate memos, but in everyday writing too.
“There has certainly been a noticeable shift in how people express themselves,” Mathurin says.
“It almost sounds like everyone is striving to come across extremely knowledgeable and professional but paying little attention to sounding relatable, authentic, or unique.”
She describes the language as increasingly “mechanical”, even when the sentences are technically well written. For her, it’s not just about sounding the same; it’s about feeling disconnected.
“While the sentences appear well-constructed on the surface, they often lack any truly engaging element. Other signs include sentences that feel extremely wordy and use very cliché phrases. In a Saint Lucian context, some of these phrases can sound awkward because they may not typically be used by us in that way — something which AI may not yet be able to recognise.”
Mathurin has deliberately chosen not to use AI in her creative process.
“Writing for me is a very personal process. It is not simply about stringing the right words together but about stirring emotion, making people feel seen,” she explains.
“While AI can generate well-crafted sentences, it cannot capture some of the most beautiful intricacies of human emotion and experience.”
In that sense, for now, AI can’t replicate culture. It can’t mimic the rhythm of Kwéyòl slipping into a conversation or the way Saint Lucians use irony, humour, and story to build a point.
“For us in Saint Lucia and the Caribbean in particular, we’re also compromising on our distinct ways of expressing ideas and making sense of the world that are largely shaped by our heritage and history as a people,” Mathurin says.
“We are sacrificing that to sound just like everybody else.”
That sameness, she argues, is especially concerning in spaces that demand authenticity, like speeches and storytelling.
“Many speeches and other pieces of writing sound correct but not compelling. That’s because the words are there, the ideas are there… but there’s little to nothing of the writer in there, and often less of the audience,” she adds.
“In the end, it becomes a generic piece of writing that uses buzzwords and clichés — not material shaped by the human experience or crafted to represent it.”
The solution, according to both Joseph and Mathurin, isn’t to reject AI altogether but to be intentional in its use. To remember that AI is a tool, not a voice. That writing, especially in a small island society like Saint Lucia, should still reflect real people, places, and experiences.
Joseph emphasises that prompting AI is a skill, and one that must be learnt if we are to use these tools without losing what makes us distinct.
Mathurin, meanwhile, urges writers to return to the why of their work.
“Whether it be essays, speeches, creative writing — the question should always be, ‘Why are you writing? What are you hoping to achieve?’” she says.
“And the follow-up question is, ‘What does AI add to that process that you, as a writer, do not feel capable of bringing yourself?’”
As AI becomes more embedded in daily life, from schools to boardrooms to government ministries, the challenge now is not just using it well but ensuring it doesn’t drown out the very voices it’s supposed to help express.
Because in the end, no matter how “perfect” a sentence may sound, it must still say something real.