2023-10-10, 2023-10-25, 2023-11-22, 2024-01-29
technical writing, strategy, content, AI, risk, quality, blog
Learn why human technical writers matter, and why they are a more sustainable choice than AI for your software business.
Is AI right for the task? Do you know the future costs and risks for your business? Will you spot errors, or will your customers?
In this article, I’ll go against industry trends and help you make more informed choices when considering AI services for your content needs.
When I describe important attributes for technical content, I say: clear, correct, concise. That, along with ease of discovery, is what keeps customers productive with your software and adding to your future revenue.
But if AI cannot interpret software interfaces, then it cannot produce correct content. A UX research exercise put this to the test, finding that “Testing ChatGPT-4 for ‘UX Audits’ shows an 80% Error Rate & 14-26% discoverability Rate”. Perhaps that article was written to serve a marketing purpose, and the AI model was not customised for a content workflow, but the finding should help set expectations for content strategists and managers seeking HR savings through AI solutions.
I propose that ‘AI content’ (a marketing term for ‘machine learning assisted content’) fails to deliver on two of the essential attributes for content: correct and concise.
AI content looks plausible and well-written to a non-expert. However, it might waste the reader’s time if it’s too wordy, or is irrelevant to their needs.
Worse, plausible content is a lucky outcome, because AI (LLM) output is based on patterns found in its training data. These patterns are hollow, not grounded in mental models of the systems you are trying to document.
If it’s not factually correct, it will mislead your consumers, and you’ll fail to meet their expectations, or harm your credibility and business relationships. To counter this risk, you’ll need subject matter experts to verify AI output, put it right, and shape it to best help your consumers.
With current technology, and without exactly the right model training, AI writing is a high risk to your business: the words can be wrong, and the errors hard to detect. In journalism, we’ve seen the consequences of inadequate checks on AI-generated content. More so than usual, your experts will need to be especially competent with both the product and what users need to know.
If your product is proprietary, AI will struggle to draw upon suitable material and adapt to your case, because there is no relevant training source data. Even if AI gives you something to work with, errors are more likely in this situation, because your product innovations will be exceptional, rather than conforming to patterns in generic training data.
Regardless of whether you use AI or not, your content still needs peer reviewers to ensure excellent quality.
Done properly, the costs of generating and checking AI output are likely as high as the cost of hiring human authors, and you’ll likely pay a lot for the technology besides. With a leaner organisation powered by AI tools, you’ll have fewer skilled writers for the occasions where AI doesn’t cut it; your writers might not spot errors, and, stretched thinner, they’ll make more errors themselves.
At the moment, you can’t replace good writers with AI. You need people who are present from the start of your product life cycle, who know the user’s needs, can pick up on the essential problems during development, and write content that aligns with how your company pitches the product to customers.
To release documentation on time for your new features, your writers develop their content well before release, when not all product information is available. During this time, you need people who can ask the right questions of others in your organisation and adapt to create a minimum viable product. These collaborative skills significantly boost the quality of work and lower delivery risks, and are unlikely to be replaceable by AI. You’ll need people who can write for your personas, using language tailored to your users’ knowledge, skills, and needs.
Felton lists technical writers at 90th place (of 774) in a list of professions likely to be exposed to AI, alongside chief executives, economists, and psychologists. This doesn’t mean that technical writers in the software industry are being replaced by AI; it means that facets of the work can involve AI tooling, which makes that work more efficient. As a consequence, the remaining work of writers could become more skilled and highly paid.
There are some scenarios to avoid. As decision makers, if we force AI into products and services without plugging the capability gaps left by redundant humans, we risk hollowing out the ‘strength in depth’ that we aim to build for a strong and capable company.
It’s possible to go all-in on AI while stripping out creators, with conviction that AI can replace human resourcing expense now. This looks like a cutting-edge business decision that attracts investment, because fixed staffing costs are reduced in favour of scalable operational expenditure.
Such AI transformations are unsustainable façades, because they hollow out the experience and strength of the capability, and you can’t currently replace the human skills of technical writers with AI tooling. Further, you’re exposing yourself to the risks of price increases for AI services, technology lock-in, and poor value scaling with the needs of your business.
Any gains of a complete AI transformation of the writing function are temporary, because AI output feeds from existing expertise. When that expertise is missing, the business capability will decay. Quality degrades when AI consumes AI content, especially without adequate traceability of training data to identify that risk. If systems weaken without the expertise to contribute source data and correct the output, then the systems eventually fail to provide meaningful service for your customers.
If your business joins the AI revolution as a consumer, and you pursue it aggressively and without care, the benefits are only temporary, and the resulting capability can easily become unsustainable.
An ecosystem of such companies will try to lean on each other for on-demand services, but will not build the strength in depth they need for resilience in challenging times, and to properly fulfil demand.
All that said, it’s possible to run an efficient content organisation without going all-in on AI. If you’re looking for cost savings or better efficiency in technical writing, there are alternatives worth considering.
My view is that AI can’t yet replace most of a technical writer’s role, and AI business transformation carries risks that entrepreneurs may accept to pursue a good market valuation for their companies. There is a temptation for business leaders to present these cost savings to others as broader benefits.
Doctorow says, “We’re nowhere near the point where an AI can do your job, but we’re well past the point where your boss can be suckered into firing you and replacing you with a bot that fails at doing your job.”
Be sure you know the total cost of AI.
Assuming AI can perform the tasks you need of it, you have to weigh up resourcing savings and the kudos that AI brings against the operational costs of outsourced AI technology. The market valuation of core AI tech companies is based on a future where businesses like yours are estimated to pay trillions (USD equivalent) in total.
Also weigh up the loss of core expertise, and the many costs of unscreened errors. Don’t underestimate how a customer’s frustrating experience can erode the relationship with your business and impact future sales.
If you’re in business to provide a service and make money from it, then human writers will continue to deliver self-serve support content. If you’re transforming businesses for short-term shareholder profit, then AI delivers a more compelling balance sheet, but it is not a sustainable prospect for long-term customers, or for employees passionate about genuine service.
In the context of AI, I foresee a future polarisation of the workforce: ‘enablers’ who follow the money, seeking opportunities to execute AI transformations for top-level executives, and a ‘resistance’ who care about what the business does for its customers.