Rest assured, I shall return to my normal scholarly tone in August. But for now – I’m on holiday, and this has been quite a month in more ways than I can explain. So, humour me. I’m distracting myself!
Today, a friend’s Facebook post alerted me to the fact that sentimental stories on Facebook are often concocted by AI and have no truth in them at all.
Now, I may already have mentioned that I loathe, with a fierce and burning passion, the horrible meme of mean and entitled ‘Karens’ – inevitably white women, of a certain age. Similarly, I hate all the vile YouTube videos about a nasty ‘Karen’, usually a white American living in an HOA (Homeowners’ Association). I’ve heard from various sources that they, too, are made up by AI, or acted from an AI script. When I overhear an endless series of videos narrated in the same voice, same tone, same cadences (‘Karen sneered’, ‘Karen snapped’, ‘Karen glared’), I feel an almost visceral response. How DARE they turn my name into a mean, misogynistic slur? I don’t find them interesting or amusing – they’re pathetic and offensive. I hate my name!
But if I complain – then I’m an entitled middle-class white woman called Karen just proving their point for them by complaining. As ‘Karens’ do. (Or am I? My understanding is that the meme originated with people of the global majority who quite rightly objected to condescending treatment by a certain type of white person. I have seriously taken steps to educate myself about racism, and I don’t believe I am racist. Nor do I think I’m ‘entitled’. If anything, I apologise too often for being in the way, or daring to ask for something. This Karen is not a ‘Karen’.)
However, to test the theory that these things are written by AI, I decided to ask ChatGPT to write me a story, to see what the result would look like. How easily can one prompt a very predictable story to fit a predetermined template? And how would I rate the ‘writing’? (This is a one-off, I hasten to add, because I realise I’m wasting resources, especially water, by playing around with a robot. Moreover, the results are frankly boring! I won’t do it again.) My AI story is about a bigoted man called Bert. Here goes:-
“Could you please write me a realistic story about an American called Bert who lived in a housing complex and was mean and intolerant to anyone who looked different. The story must end with Bert learning his lesson the hard way.”
And ChatGPT replied, ‘Certainly — here’s a realistic short story about Bert, with a strong narrative arc and a moral conclusion.’ (ChatGPT didn’t warn me how dull it would be!):-
Title: “The View from Apartment 3B”
Bert lived in Apartment 3B of the Chestnut Grove housing complex on the east side of Cleveland. He was 67, retired from thirty years at the steel plant, and proud of never needing help from anyone — especially not the government, not the church, and certainly not “those people.”
To Bert, “those people” included anyone who didn’t look, speak, dress, or worship like he did. He hated loud music in foreign languages. He hated headscarves and saggy pants, long beards and bright turbans, rainbow flags and prayer rugs. He especially hated when people didn’t speak English around him. He never said “hate” out loud, of course — he just called himself “old-fashioned.”
He’d sit on the bench outside the complex with his scuffed thermos of black coffee, offering grunted nods to the other older white tenants who tolerated his gruffness. But if anyone else came by — the young Nigerian nurse from 2A, the Filipino single dad from 4C, the Mexican-American delivery driver from the next building — Bert’s face hardened like a storm cloud.
He muttered under his breath, sometimes louder than he thought.
One evening, someone spray-painted “Go home” on the wall near the mailboxes. It wasn’t signed, but everyone suspected who might’ve said something to encourage it. Bert just shrugged. “Well,” he said to no one in particular, “maybe they should.”
Then winter came.
In early January, Cleveland got hit with one of the worst ice storms in a decade. Power lines went down, roads froze over, and the Chestnut Grove complex lost heat and electricity for four days. The management company was slow to respond — they’d been cutting corners for years.
Bert, stubborn and reclusive, didn’t reach out for help. His old space heater didn’t work without power, and he refused to go to the warming shelter set up at the local community center. “Not gonna be packed in like sardines with a bunch of strangers,” he grumbled.
By day two, the temperature inside his apartment had dropped to the low 40s. Bert bundled up in layers of flannel and tried to wait it out, but his breathing became shallow, his chest tight. On the third night, he passed out on his kitchen floor, alone.
He came to in a different kitchen — warm, full of the smell of ginger tea and something simmering on the stove. A woman with dark brown skin and a soft voice was leaning over him, holding a warm compress to his forehead.
“Don’t try to sit up yet,” she said. “You fainted from the cold. My husband helped carry you over here.”
It was Amina from 2A — the nurse he always avoided in the hallway.
He blinked, confused. “Why…?”
“You live alone,” she said simply. “And we saw the light out in your window. We were worried.”
Beside her stood her husband, Yusuf, who offered him a bowl of soup. Bert’s hands trembled as he accepted it. In the corner, their children whispered, glancing at him with worried eyes.
He stayed the night in their warm apartment, too weak to argue.
When the power came back two days later, Bert returned to 3B. He barely spoke for a week. Then, one morning, he walked down to the common area and quietly scrubbed the wall where the hateful graffiti had been. No one asked him to. He didn’t say a word.
Later that month, he surprised everyone by showing up at the potluck in the rec room. He brought cornbread — a little dry, but appreciated — and sat with the Filipino dad from 4C. They talked about old cars. Bert even laughed once.
He never gave a grand apology. He never became warm and fuzzy. But he nodded more often. He started saying “hello” without suspicion. And one rainy evening, he left a small thank-you card outside 2A with a packet of tea tucked inside.
Bert didn’t change all at once. But he learned this much the hard way:
It’s the people you push away who might just be the ones who save your life.
Now, here’s where it gets interesting. ChatGPT offered to make any changes I liked. So, let’s say that ‘Bert’ is now a New Englander called Joe, chairman of his local residents’ association, and the story takes place in a parking lot. (I don’t want to offend any Berts with a series of ‘Bert’ stories, so I’ve rung the changes.)

ChatGPT obliged. ‘Absolutely! Here’s a revised version of the story with your requested details’:
Title: “Lines in the Lot”
Joe Whitman was the chairman of the residents’ association at Maple Court Condominiums, a tidy, if slightly worn, complex in a leafy corner of suburban Massachusetts. A retired school administrator, Joe spoke in crisp, measured tones and had a fondness for pressed chinos, classic novels, and rules — especially those printed in the Residents’ Handbook.
He considered himself a man of principle. Decorum, order, and “shared standards” were his watchwords. In practice, this meant keeping the hedges trimmed, the recycling correctly sorted, and — most importantly — the parking lot free of chaos.
The parking lot was Joe’s domain.
He had lobbied for years to have the lines repainted and finally succeeded last spring. Now, every car was expected to park precisely within the marked bays. Double parking, backing in, or — heaven forbid — using visitor spaces without prior notice, were cardinal sins.
Joe’s frustration grew when residents who “didn’t understand the culture of the place” — his euphemism of choice — failed to follow the unspoken rules. A young Haitian couple had moved in recently. A Syrian family rented two units nearby. And there was the quiet Korean graduate student who sometimes worked late and came home after midnight, parking hastily across two lines.
Joe didn’t shout or sneer. He simply left laminated notices on windshields, written in the same font used by the association newsletter. “Please respect community standards. Repeat infractions will be reported.” He believed he was being fair.
One cold November afternoon, Joe spotted the Haitian couple — Daniel and Mireille — unloading bags of groceries. Their toddler was crying. The minivan’s front bumper was clearly over the line, intruding into the next bay.
Joe approached, arms crossed, ready to correct.
“Afternoon,” he said stiffly.
“Afternoon,” Daniel replied, adjusting his son’s hat.
“I see your vehicle is once again outside the boundary. The lines are there for a reason.”
Daniel looked at him for a long moment. “We’ll move it in just a moment. My wife needs to carry the baby upstairs.”
Joe gave a tight smile. “The rules apply to everyone, I’m afraid.”
That evening, Mireille posted in the community WhatsApp group — politely, but firmly — asking for a bit more compassion from the association when families had their hands full. Joe didn’t respond. He didn’t like being called out, especially not in public.
The storm hit two nights later.
Freezing rain coated the lot in slick black ice. Residents stayed indoors, waiting for the salt truck. But it never came.
Around 10:30 p.m., Joe went out to inspect the situation. It was his duty, he reasoned, to ensure no one parked irresponsibly overnight. He didn’t see the ice patch beside the Honda CR-V.
He slipped.
There was a sharp crack — his hip — and then nothing but pain and darkness.
He woke up to voices and headlights.
Daniel and the Korean student, Ji-Hoon, had found him. They’d been helping another neighbor move her car when they saw the shape on the ground.
They called an ambulance, covered him with their coats, and waited until help arrived. Daniel even rode along to the hospital.
Joe spent three nights recovering. The doctor said it could’ve been much worse — if he’d been alone for another hour, in that cold, things might’ve turned fatal.
Back at Maple Court, Joe took a long pause from association duties. He returned slowly — first attending meetings quietly, then helping organize a community clean-up day in spring.
He made no dramatic speeches. But a few changes spoke volumes.
The association adopted a “compassion clause” for parking enforcement. Laminated notices were quietly retired. Joe began greeting Daniel and Mireille with genuine warmth. He even asked Ji-Hoon about his dissertation over coffee one Saturday.
Joe never stopped valuing rules. But he learned that people mattered more —
especially the ones you once only saw as problems.
Since I can’t bear my own name becoming a meme, we won’t let Bert or Joe become memes. But I couldn’t resist one more attempt! (To think I once published thirty-odd short stories and a serial in The People’s Friend, but now a robot can write sentimental stuff at the drop of a hat!) Could you tell the stories in this blog post were written by AI? They lack any sparkle that might make a reader want to read them again!
I asked:- “Could you please write me a very short sentimental story about a Devonian man called Derek, who had a much younger wife. A bit old-fashioned in his ways, he would gently grumble (sorry, ‘comment’) about anything that displeased him, and his favourite insult was to tell poor Daisy that she was becoming more difficult with age. The story must end with Derek learning a salutary lesson.”

Derek, a retired railway clerk from Devon, lived in a neatly kept cottage just outside Tavistock, with his much younger wife, Daisy. He was a man of habits — porridge at seven, gardening by nine, and a cup of milky tea at precisely four. Derek didn’t shout or rage; no, he commented. On the weather, on the state of the world, and most especially, on Daisy.
“Bit extravagant with the basil again, love,” he’d murmur as she cooked.
“Back in my day, women didn’t wear jeans to the pub.”
And, all too often: “You’re becoming more difficult with age, Daisy.”
Daisy, ever patient, would smile — though sometimes it flickered.
One Tuesday, Derek caught a chill. Nothing serious, but enough to keep him in bed for a few days. Daisy brought him soup, fluffed his pillows, and even read aloud from The Daily Telegraph, skipping the bits she knew would make him “comment.”
By Friday, Derek’s spirits lifted, but Daisy looked pale. “Just a bit tired,” she said, brushing it off. On Saturday, she fainted in the kitchen.
At the hospital, while waiting for news, Derek sat with hands trembling and heart thudding like an old train engine. A nurse appeared. “She’s all right,” she said gently. “Exhaustion. You’ll need to let her rest. And perhaps—help out a little more.”
Derek nodded mutely. That night, back in the quiet cottage, he made tea — weak, a bit tepid — and placed it by her bedside. Daisy opened her eyes, surprised.
“I made it myself,” he said. “Don’t say it’s awful. I’ve decided I’m the difficult one after all.”
She smiled, tired but amused. “Took you long enough, Derek.”
And for once, he didn’t comment. He just held her hand.
As I said, I won’t be doing this again. But I do now agree that a robot is perfectly capable of churning out sentimental, sarcastic or scurrilous material ad infinitum. The stories are boring: the second has just the same ‘feel’ as the first, and neither makes you want to read on. The tone is so bland that, even in the days when doctors’ waiting rooms had magazines lying around, you’d have had to be pretty fed up to want to read these particular stories to the end! However, the writing is grammatical, and the language is acceptable – no inappropriate academic turns of phrase.
But why would I bother? It’s not nearly as satisfying as writing about something more interesting oneself!
