Are You Being Forced To Use AI Tools?
Author: AnswersWanted
Date: April 23, 2025
We help people market and sell healing foods, supplements and herbs online, and our market tends to love tradition. We love the age-old way of doing things. We believe the ancients weren’t dumb and that nature has something to offer us. We’re well aware that too much chemical, mechanical, or modern involvement in our health isn’t always helpful. Don’t get me wrong, our tribe isn’t anti-technology… and most of us will use some types of modern medicine when absolutely necessary.
The point is… most of us would rather avoid the use of AI.
As with medicine and other materials that heal, we prefer the age-old, time-honored ways. That means human-written language (when done well) is just packed with flavour, experience and the richness of emotion! However, the way a capitalist society operates, we also have to take advantage of streamlined manufacturing sometimes. By the same token, the new web tools can be handy too. Just like the bottling and packaging machines, though, we have to check the output and avoid putting a product (or content) out there that doesn’t meet our standards.
AI has some powerful uses, like combing through content and finding answers. It’s great at poring over research or simplifying complex topics. It can even help generate ideas, or help you find the words when you’re facing writer’s block. It just doesn’t have the experience, knowledge and feeling you might want to convey.
Do we need to use the tools?
The tools can’t really be ignored anymore. At this point it’s simply not wise to avoid them, because they can genuinely speed up certain tasks. That said, they have to be used carefully. Don’t lean too hard on them, and don’t use one tool (or a group of them) as your sole source of ideas, information or content. They’re inherently flawed in a number of ways.
Just recently, researchers discovered that an AI had invented a new (fake) scientific term, and that term showed up in 22 different research articles!
AI systems can “feed off themselves” when they’re trained on their own generated content, creating what tech people call a recursive feedback loop. That’s not the only issue: there are plenty of examples where a user asks for something and the resulting content isn’t exactly what they asked for… but what if the user doesn’t know the difference?
I could write a whole list of recent examples, from brainstorming fresh ideas to mashing two types of content together to simplifying complex studies on functional nutrition, where the output wasn’t really usable for publishing. The thing is, I knew the results were unusable.
How do you wisely use the tool?
As a content strategist, if I need a short writeup on the benefits of Vitamin C, an AI-generated paragraph can be handy. I can have it written at a 6th-grade reading level, in the voice of a news broadcaster, or as a fairytale! Those short snippets make getting through the boring tasks so much easier. However, if you ask it to create a content strategy, it’s going to give you textbook answers!
What’s wrong with that?
I remember when I worked for one of the first online marketing companies in existence. At the top of their game they were making over $60m per year selling nothing but ‘how to make money on the internet’ kind of content. However, being young guys, they felt they should get some formal training in the art, so they sent one of their executives to school. In one class session, a professor laid out how a marketing plan for an online business should work, while the untrained but successful marketer squirmed in his chair before shouting, “That doesn’t work, we tried it!”
The professor said, “Well, if you know so much, come show us how you’d do it.” So he did. He grabbed the dry-erase marker and drew out a full-scale marketing plan, complete with all the steps: finding the audience with paid ads, the webpage, the opt-in offer, the back-end follow-up marketing and the affiliate partner plan.
The typical, textbook way of doing things pales in comparison to experience.
There’s still plenty of proprietary industry knowledge you won’t find online or in a book. It’s the stuff that comes from actually doing something for a living: the raw mistakes made in the field that sometimes crop up in bits and pieces in a forum chat, a Reddit thread or a Facebook group of industry professionals. A language model isn’t necessarily going to be able to decipher who’s blowing smoke and who’s speaking from lived experience, though.
Let’s also not forget that the internet is riddled with ‘highlight reels’ rather than hard-luck experiences. Most people don’t like to share when they screw up, only when they succeed, so the balance of content leans toward the good results. Some of the most powerful data you won’t find in the raw material online: which ads worked, which page converted, which content resonated (not in views, but in conversions). It’s heavily protected data that, when you have those case studies, makes a world of difference.
Work with people that use the tools, but don’t rely on them.
It’s not worth hiring or working with anyone whose head is in the sand and who won’t use modern technology… but people who use ONLY AI-generated output for all their work are already cropping up everywhere. You can smell it a mile away, and it’s really frustrating. Machines are meant to enhance our lives, not replace us. We’re about to see various firms lean on machines at a massive scale in an attempt to save a dollar, but as they’ll soon find out, we’re not really replaceable.
A content strategist (like us) can craft engaging emails, articles and social posts as part of an overall strategy designed to meet a specific set of needs while providing real, tangible value to customers with emotional impact. We guide the tools, but we don’t take their unedited output and declare it our work. That’s not only unethical, it’s ineffective.