It Doesn't Work.

A quiet lesson in why most AI rollouts fall flat — and what to do instead.

The problem started small.

A few confused emails from the support team.
A Slack message in the product channel:

"I tried using ChatGPT to write the release notes. It just sounded weird."

The sales team had already given up.
Marketing was experimenting, but mostly just toying with ideas late at night.
The CTO still believed in it — mostly.

They had introduced AI into the company a few months earlier with big hopes: smarter support replies, faster internal docs, less repetitive writing.

The idea made sense. The tools were there.
And yet, by month three, it was barely being used.
When asked, most people said the same thing:

"It just doesn't do what I want."

It's a phrase that comes up in every company that experiments with AI.
And it's usually not true.

What's actually happening is simpler:
the tool is fine — the prompt is broken.


Learning how to talk again

Language comes naturally when we speak to each other.
You say something vague, and the other person fills in the blanks.

But AI doesn't do that. Not really.
It doesn't know your priorities, your tone, or your intent. It can't read the room. It doesn't even know your role.

If you say,

"Make this better,"
you'll get back something that sounds like it was written by an intern on a deadline.

But if you say,

"You are a product marketer. Rewrite this release note for our B2B users. Make it clear, concise, and focused on speed improvements. Include a headline suggestion."
you're now giving the system something to work with.

The difference is huge — and yet, most teams never cross that line.


One meeting changed everything

When Emplex got involved, it didn't start with a big transformation.
Just a short session with a handful of people.

Someone asked the AI to summarize a meeting. It spat out a bland list of bullet points.
Then we reworked the prompt:

"You're a project manager preparing a weekly update. Summarize the transcript into:
- What was decided
- What needs follow-up
- What blockers still exist
Write in a clear, direct tone. Avoid filler."

The output?
Actually useful. Actionable. Something you could send without editing.

People leaned in.
They started to ask:
- "Can it also rewrite proposals like this?"
- "What if we show it our style first?"
- "Can it remember how we talk to clients?"

That moment — when it clicks — is the real beginning of using AI well.


Prompting is not magic. It's communication.

It's a skill, not a hack.
And like any skill, it gets better with a little structure and practice.

Some techniques are simple:
- Start with a role ("you are a senior copywriter...")
- Break the task into steps
- Give examples
- Ask it to reflect on its own output

Others take time:
- Building reusable prompt templates (see the sketch after this list)
- Training team members to think in iterations
- Creating context-aware AI bots that actually reflect your business
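
What does a reusable template actually look like? Here's a minimal sketch in Python. The names are ours, purely for illustration; the resulting text can be pasted into whichever AI tool your team already uses. It rebuilds the meeting-summary prompt from earlier as a template anyone can reuse:

# A tiny, reusable prompt template: role + task + structure + tone.
# Illustrative only; send the resulting string to whatever AI tool you use.

def build_prompt(role, task, points, tone="clear, direct"):
    """Assemble a structured prompt from a role, a task, and the points to cover."""
    bullet_lines = "\n".join(f"- {p}" for p in points)
    return (
        f"You are {role}.\n"
        f"{task}:\n"
        f"{bullet_lines}\n"
        f"Write in a {tone} tone. Avoid filler."
    )

# The weekly-update prompt from the session above, rebuilt from the template:
weekly_update = build_prompt(
    role="a project manager preparing a weekly update",
    task="Summarize the transcript into",
    points=["What was decided", "What needs follow-up", "What blockers still exist"],
)
print(weekly_update)

The point isn't the code. It's that the role, the structure, and the tone live in one place, so every summary starts from the same solid prompt instead of whatever someone typed that day.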

But the outcome is worth it.

Not just faster writing or better summaries.
The real value is clarity. Focus. Less noise. Fewer delays.

It's the feeling of handing off work to something that understands the assignment — because you told it clearly enough.


That same company?
They didn't overhaul anything massive. They just got better at the conversation.

Within weeks, the support team had a full prompt library.
Marketing used AI to generate first drafts faster.
Sales started prototyping emails based on call transcripts.
Even leadership used AI to prep memos and digest meetings.

No fancy integrations. No expensive tools.
Just better prompts — and a team that knew how to use them.


It turns out the problem was never the AI.
It was how we spoke to it.
And like most things in business, once you learn the language, everything gets easier.

👉 Contact us to learn more.