Refined Draft

AI and Technical Writing (#3)

What's Still a Human Job?

Kevin A. Cornelius and Gabriel Laskey
Apr 15, 2025


So GitHub Copilot cranked out 1,000 lines of code for your engineers yesterday.

Amazing.

The engineering team's sprint velocity is up 20%. Engineering resources are being "optimized." Executives are thrilled.

And yet your users still can't figure out how to actually use the thing the team built.

That's because AI doesn't eliminate complexity. It just moves it. Products still need to be explained, understood, and trusted. The need for clear communication hasn’t changed, but how we’re involved has.


AI-Written Documentation Is Not Enough

AI can generate passable documentation – the same way it can generate passable code, marketing copy, and dating profiles. But passable work doesn't build trust or create real understanding.

Consider what AI-generated content consistently gets wrong.

Fabricated Information

Ever had ChatGPT invent a citation? Or explain a feature that doesn't exist? AI systems hallucinate with remarkable confidence. A Stanford study found legal-focused models fabricate information in one out of six queries.

This isn't theoretical. In 2023, a lawyer was sanctioned after submitting a legal brief containing six completely fictional cases that ChatGPT had confidently invented. When questioned by the judge, the lawyer admitted he had no idea the tool could make up cases that sounded real but didn't exist.

The difference between "sounds right" and "is right" is the difference between a system that helps users and one that actively misleads them.

Human Frustration

AI has never stood in a kitchen staring at a pot of burned rice, wondering what went wrong. It has never muttered, "This should have worked!" while following vague instructions. It doesn’t understand the emotional journey of users.

The most useful documentation anticipates where users will get stuck.

# Typical AI-generated instruction:

1. Add rice and water to the pot.
2. Bring to a boil.
3. Reduce heat and simmer until done.

# Human documentation:

1. Rinse the rice thoroughly until the water runs clear to remove excess starch.
2. In a pot, add 1.5 cups of water for each cup of rice.
3. With the pot uncovered, bring the water to a boil.
4. Reduce the heat to Low and cover with a tight-fitting lid.

IMPORTANT: DO NOT STIR during cooking. Stirring makes the rice gummy.

5. After 15 minutes, turn off the heat. Leave the lid on for another 10 minutes to steam.
6. Remove the lid.
7. Fluff the rice with a fork before serving.

Additional Tips:

- If your rice burns or sticks, check your heat settings. If your stove's lowest setting still runs hot, the rice will scorch.

- If your rice turns out mushy, you may have used too much water for the type of rice you are cooking.

AI excels at listing steps that work under ideal conditions. It struggles to address what humans actually need: help when things inevitably go wrong.

Personalized Echo Chambers

The best documentation meets users where they are — guiding beginners, offering depth for experts, and providing clarity for decision-makers. But AI-generated content often fails at this because it doesn’t challenge users — it panders to them.

Recent research highlights a growing issue: modern LLMs tend to mirror user preferences and biases instead of delivering truly objective information. This leads to AI-generated echo chambers, where interactive documentation adapts to what the user expects rather than what they actually need.

The result?

- Beginners get oversimplified explanations that hide critical nuance.

- Experts see jargon-heavy content that lacks real insight.

- Decision-makers receive vague, noncommittal answers.

AI tailors content to reinforce each user’s preconceptions — which means every audience gets something slightly wrong.

This isn't documentation that serves everyone. It’s documentation that subtly misguides each audience in different ways.


Human Value in an AI World

If you create products or explain complex ideas, you're already a technical writer – whether or not that's in your job title. And if you want to remain valuable, focus on the areas where AI stumbles.
