OK, before I start, a challenge for you. ChatGPT, or any form of AI for that matter, did not write this blog post; I did. That is, all except for one bit! I have sneakily hidden one section that is 100% AI generated.
Your Christmas challenge is to correctly identify which one, by the section title, in the comments below.
Anyway, let’s get going…
The motivation and context for this blog post comes from an article I recently read, where someone detailed their struggles using ChatGPT to count the number of times “Christmas” is mentioned in Slade’s classic holiday song, Merry Xmas Everybody. The author was understandably frustrated when the AI got the answer wrong, repeatedly. But their conclusion—that AI like ChatGPT is unreliable for business use—misses the point entirely.
Let’s start with a simple truth: ChatGPT is not your personal research assistant. It’s not Google, it’s not Wikipedia, and it’s definitely not the person you should trust to count the number of times “Christmas” appears in a Slade song.
Not a Fact Machine, But a Problem-Solving Tool
Here’s where the misunderstanding begins: ChatGPT isn’t a fact repository. It’s not a librarian with a stack of neatly fact-checked encyclopedias at its disposal. Instead, it’s a tool—a complex set of algorithms designed to help you solve problems, generate ideas, and reason through challenges.
ChatGPT doesn’t browse the web in real time or access verified databases like Google or Wikipedia. Instead, it generates responses based on patterns in the massive dataset it was trained on (which, let’s be honest, includes some dubious information from the internet). If you feed it a poorly phrased question or rely on it for hyper-specific trivia, you might get some garbage back.
That’s the thing with AI: garbage in, garbage out.
AI Isn’t Meant to Replace Your Brain
Here’s the funny part about all this “AI isn’t trustworthy” criticism. Some folks seem to think that a tool like ChatGPT is meant to completely replace thinking. In reality, it’s more like a gym partner for your brain than a brain replacement.
- You’re the coach: You need to guide the conversation, provide context, and verify the results.
- It’s the assistant: It can break down complex problems, suggest new angles, or even write that boring as hell report for you.
It’s not there to serve up perfect answers on a silver platter.
The Slade Song Saga
Let’s talk about the Slade “Christmas”-counting misadventure for a moment. Yes, ChatGPT got the count wrong. Multiple times. It apologized, and then got it wrong again. But honestly, is ChatGPT’s purpose in life to be a glorified lyric counter? No.
Here’s a better use of its capabilities:
- Ask it to help analyse why Slade’s Christmas classic resonates so deeply with listeners.
- Debate with it about how Christmas music has shaped pop culture.
- Get it to outline a plan for writing your own holiday hit to rival Slade’s royalties (step 1: screech “Christmas!” as loudly as Noddy Holder).
That’s where ChatGPT shines—not in regurgitating trivia but in helping you think critically, solve problems, and explore ideas.
AI Isn’t Bored, But It Knows When You Are
People tend to use ChatGPT for the wrong things: inane questions, fact-checking, or just for fun. And sure, it can handle those tasks, but it’s not what it does best. ChatGPT doesn’t get “bored” in the human sense, but if it could, it might start questioning your priorities when you ask it for pub quiz answers rather than challenging it to solve complex problems.
What does it do best? Here’s a thought experiment:
- Give it a tricky equation or logic problem.
- Debate ethical dilemmas with it.
- Ask it to brainstorm innovative ideas for your business.
You’ll find it excels when you push it to its limits, not when you give it low-level tasks that a quick Google search could handle better.
Why You Should Care
Let’s address the big “what if” in the original blog post:
What if AI gets things wrong in critical scenarios? What if it’s controlling my car or cooking my turkey?
Good point! And that’s why AI in critical systems like self-driving cars or healthcare is rigorously tested and verified. ChatGPT, however, is a general-purpose language model, not a safety-critical system. It’s not building bridges or landing planes. But if you use it to simulate complex ideas, argue through problems, or create workflows, it can turbocharge your productivity.
Stop Asking, Start Reasoning
The takeaway? Use ChatGPT for what it’s good at.
- Don’t treat it like a trivia genie.
- Don’t expect it to replace hard facts.
- Don’t blame it for giving wrong answers when the input isn’t clear.
Instead, use it like a collaborator. Ask it to solve, debate, reason, and think with you. You’ll be surprised how much it can do when used properly.
So next time you’re tempted to ask ChatGPT how many “Christmases” are in a Slade song, just listen to the track yourself—and then ask ChatGPT how to automate the process of counting lyrics for your next project.
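And if you do fancy automating it, it’s barely a script. Here’s a minimal sketch in Python, assuming you’ve pasted the lyrics into a local file called lyrics.txt (a filename I’ve invented for the example; the lyrics themselves aren’t included here):

```python
# A minimal sketch of the "count it yourself" approach: tally how often a word
# appears in a lyrics file. Assumes the lyrics are saved in lyrics.txt
# (a hypothetical file; you'll need to supply the text yourself).
import re
from pathlib import Path

def count_word(lyrics: str, word: str) -> int:
    # \b keeps "Christmas" from matching inside longer words; IGNORECASE
    # catches the shouted all-caps version as well as the verses.
    return len(re.findall(rf"\b{re.escape(word)}\b", lyrics, flags=re.IGNORECASE))

if __name__ == "__main__":
    lyrics = Path("lyrics.txt").read_text(encoding="utf-8")
    print(f"'Christmas' appears {count_word(lyrics, 'Christmas')} times")
```

Ten-odd lines, no apologies, and the same answer every time you run it. That’s the point: deterministic counting is a job for deterministic code, not a language model.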
AI isn’t “taking the piss”; it’s just waiting for you to take it seriously.
The Hippo’s Final Thought
If you’re still not convinced, try asking ChatGPT why it doesn’t count lyrics correctly. Spoiler: it’ll explain its training data limitations. And if it doesn’t, it’ll at least apologize… again.