Can AI writing be used to create technical content?

AI text generation has made remarkable progress over the past five years. In the not-too-distant past, computer text generation relied on tools like Markov chains. They generated amusing and sometimes useful text, but the output was not up to the standard a human writer would produce.

That began to change rapidly when researchers began using deep neural networks based on transformers. The first to grab the public's attention was GPT-2. It produced shockingly realistic text and made many writers worry about the future of their careers. Its successor, GPT-3, is even better at generating text that sounds like it's written by a human. GPT-3 is also continually improving and gaining new features. But technical content is different from regular text because it requires deep, domain-specific knowledge. Most humans are terrible at it. Can a machine do any better?

This post will explore whether these state-of-the-art language models are good enough to create credible technical content. Throughout the post, I'll use GPT-3 with various prompts and try to get it to generate lucid English.

All samples in this article were generated in the OpenAI Playground. If you'd like to try it for yourself, you can sign up for an account and get some free credits to begin experimenting. For everything you see below, I used the text-davinci-002 model with the following settings:

Temperature: 0.8
Top P: 1
Frequency Penalty: 0.9
Presence Penalty: 0.9

Adjusting these parameters can dramatically change the text GPT-3 outputs. I found these settings produced good results for technical content, but you might find that other settings are more suitable for your use case.
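
If you'd rather work against the API than the Playground, the same settings map directly onto OpenAI's completions endpoint. Here's a minimal sketch using the pre-1.0 openai Python package that was current when this article was written; the API key, prompt, and max_tokens value are placeholders rather than anything taken from my experiments:

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own key

response = openai.Completion.create(
    model="text-davinci-002",
    prompt="Write several paragraphs explaining what an apple is.",  # any prompt from this article works here
    max_tokens=512,               # placeholder length cap
    temperature=0.8,
    top_p=1,
    frequency_penalty=0.9,
    presence_penalty=0.9,
)

print(response["choices"][0]["text"])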

Generating Regular Content with AI

Let's start with a simple task: generating plain English in response to an easy prompt. This type of writing should be right in GPT-3's wheelhouse.

Prompt: Write several paragraphs explaining what an apple is, the history of apples, how they are cultivated, and how they are typically used.

GPT-3's Response: An apple is a fruit that grows on trees. The tree is of the genus Malus, and the fruit is a pome. Apples have been cultivated for thousands of years, and are now grown in all parts of the world. The skin of an apple can be eaten, but the fleshy part inside is what is most often consumed. Apples can be used in many different ways, such as being made into pies or other desserts, or simply eaten out of hand as a snack.

Apple trees are believed to have originated in Central Asia, and were brought to Europe by the Romans. Apple trees were then introduced to North America by European settlers. Today, apples are grown in many parts of the world, including China, India, Russia, and the United States.

There are many different types of apples, which vary in color, flavor, and size. Some popular varieties include Red Delicious, Granny Smith, Fuji, and Golden Delicious. Apples can be enjoyed fresh or cooked in a variety of dishes.

Not bad. I'm not sure I needed AI to come up with that description of an apple. But to be honest, if I had asked ten humans to answer that question, half of them would likely have shown less lucidity and insight than GPT-3.

Let's make things a little more challenging and give GPT-3 a prompt asking it to hypothesize about something ridiculous to see how it does.

Prompt: Write in the style of H.P. Lovecraft, hypothesizing about what the people of Canada would do if the government enacted a law stating that every household in the country must own at least one wombat.

GPT-3's Response: The people of Canada would be in uproar if the government enacted a law stating that every household in the country must own at least one wombat. There would be protests in the streets, and people would be demanding to know why the government thinks they need to own a wombat.

There would be those who would try to comply with the law, but they would not be able to find any wombats for sale. The few that are available would be sold at exorbitant prices, and many people would not be able to afford one. There would be a black market for wombats, and criminal gangs would fight over control of the wombat trade.

The government would quickly realize that this law was unenforceable, and it would be repealed. But the damage would already have been done, as the idea of owning a wombat would have become deeply ingrained in the Canadian psyche.

For years to come, people would talk about the time when every household was required to own a wombat, and some would even keep a pet wombat in their home, just in case the law ever came back into effect.

That's better than expected. If I were a writer of blogspam, I might be looking over my shoulder, worrying that GPT-3 was about to end my career. Either that, or I'd be hatching a plan to use GPT-3 to build a blogspam empire.

These examples make it clear that GPT-3 can generate a reasonable, well-written, coherent response to a plain English query — even when given a ridiculous prompt. Let's see if it does as well when we ask it to generate technical content.

Generating Technical Content with AI

You've seen that GPT-3 is a decent writer. But writing about apples and enforced wombat ownership is very different from writing technical content.

Technical articles require deep insight, knowledge, and hands-on experience. Or do they? Is GPT-3 about to force developers and technical writers into early retirement? Let's find out.

Prompt: Write a paragraph explaining the history and purpose of Kubernetes.

GPT-3's Response: Kubernetes is a portable, extensible open-source platform for managing containerized workloads and services, that facilitates both declarative configuration and automation. It has a large, rapidly growing ecosystem. Kubernetes services, support, and tools are widely available. The name Kubernetes originates from Greek, meaning "helmsman" or "pilot", and was chosen to reflect the goal of making it easy to run production-grade containerized applications. Kubernetes builds upon a decade and a half of experience at Google running production workloads at scale using a system called Borg. Borg allowed Google to run workloads across multiple clusters while providing SLAs on performance and availability.

Not bad. Not exactly a tough prompt, but it looks like GPT-3 knows enough about Kubernetes to write a decent mini-history of it. If I earned my living writing about the history of Kubernetes, I'd be polishing my resume right now.

But that last prompt wasn't all that technical. GPT-3 only had to write non-technical English about a technical topic. So let's see how GPT-3 does when we try to throw it a curveball:

Prompt: Explain how to design an enterprise Kubernetes infrastructure that can handle three million web requests per second.

GPT-3's Response: There are a few things to consider when designing a Kubernetes infrastructure that can handle three million web requests per second:

  1. The number of nodes in the cluster should be increased to accommodate the additional load. This can be done by adding more physical servers, or by using a cloud provider that can scale the cluster automatically.
  2. The networking infrastructure should be designed to handle the increased traffic. This may involve adding more network bandwidth, or using a different type of network such as an InfiniBand network.
  3. The storage infrastructure should be designed to handle the increased traffic. This may involve adding more storage capacity, or using a different type of storage such as a distributed file system.
  4. The application itself should be designed to handle the increased traffic. This may involve adding more application servers, or using a different type of application such as a web application firewall.

A-ha! Finally! GPT-3 gave us the kind of meandering text I'd have written on a high school exam when I didn't remember the answer. GPT-3 talked about things related to the prompt I provided, but it didn't do what I asked.

You can see that GPT-3 has potential, though. Although it didn't provide what the prompt asked, it did return related information. I see several leads I might want to pursue as article topics: cluster autoscaling, network infrastructure scaling, InfiniBand, how to scale storage infrastructure, and distributed file systems. Thanks, GPT-3!

We'll revisit the idea of using AI for topic ideation. But first, I'll give GPT-3 a real mind-bender:

Prompt: Write INTERCAL code to display the Levenshtein distance between two strings.

GPT-3's Response: The Levenshtein distance between two strings is the minimum number of edits needed to transform one string into the other. The allowed edit operations are insertion, deletion, and substitution.

This INTERCAL program calculates the Levenshtein distance between two input strings. It uses an array to keep track of the number of edits needed to transform each prefix of the first string into a prefix of the second string.

DO ,1 <- #0 ( Do line 1 )
DO ,2 #9 <- #255 ( Do line 2 )
PLEASE DO ,3 ( Please do line 3 )
DO ,4 #18 <- #18 PLEASE
DO ,5 NEXT #3 ( Do line 5 )
DO ,6 PLEASE DO ,7 ( Please do line 7 )
DO ,8 #18 <- #18 PLEASE
DO ,9 NEXT #7 ( Do line 9 )
PLEASE GIVE UP ( Please give up )

It's okay, GPT-3. Giving up early was probably the right choice.
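
For reference, the dynamic-programming approach GPT-3 described in its preamble is sound; it just isn't what the INTERCAL "program" above implements. Here's a quick sketch of that approach in Python, written by me rather than GPT-3, with the function name and test strings chosen purely for illustration:

def levenshtein(a: str, b: str) -> int:
    # dist[j] holds the edit distance between the prefix of a processed so far and b[:j]
    dist = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        prev_diag, dist[0] = dist[0], i
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            prev_diag, dist[j] = dist[j], min(
                dist[j] + 1,       # deletion
                dist[j - 1] + 1,   # insertion
                prev_diag + cost,  # substitution
            )
    return dist[-1]

print(levenshtein("kitten", "sitting"))  # prints 3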

Fine-Tuning AI for Technical Content

GPT-3 has had both hits and misses so far. It generated some good text, but it also stumbled in places. Still, it has been fairly impressive. The fact that it manages to generate reasonable-looking INTERCAL speaks well of its abilities. It didn't do so well on Kubernetes content, though. Let's see if we can help it out.

To give GPT-3 a boost, I fine-tuned a model using data I gathered from the Kubernetes documentation. The results were impressive, but still imperfect:

Prompt: Write a tutorial on how to deploy a small Kubernetes cluster on Ubuntu 18.04.

GPT-3's Response: This tutorial will guide you through the process of deploying a small Kubernetes cluster on Ubuntu. Kubernetes is an open-source system for automating deployment, scaling, and management of containerized applications.

Before you begin, you will need the following:
-A server running Ubuntu 18.04
-A static IP address for your server
-A domain name pointed at your server's IP address

  1. Log in to your server and update the package index. sudo apt-get update
  2. Install Docker. sudo apt-get install docker.io
  3. Install Kubernetes. sudo apt-get install kubelet kubeadm kubectl
  4. Initialize Kubernetes. sudo kubeadm init
  5. Configure Kubernetes. mkdir -p $HOME/.kube sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config sudo chown $(id -u):$(id -g) $HOME/.kube/config
  6. Deploy a pod network. sudo kubectl apply -f https://git.io/weave-kube
  7. Join your nodes to the cluster. sudo kubeadm join --token :

Your Kubernetes cluster is now up and running!

This time around, GPT-3's instructions were quite good. There was a small problem, however: not all of them will actually work. GPT-3 is trained on text pulled from the internet. On one hand, this is great because it ensures GPT-3's knowledge is both broad and deep. On the other hand, it's a disadvantage because the internet is full of incorrect and outdated information. In this case, GPT-3 skipped some required steps for setting up Kubernetes on Ubuntu, and step 6 references a tool that no longer works.

Fine-tuning GPT-3's model helped it discuss Kubernetes more accurately, but did not give it the ability to output a working tutorial. Additional fine-tuning would likely help, but training GPT-3's most accurate model gets expensive, so you'll quickly reach a point of diminishing returns versus just hiring a human writer.
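
For readers curious about the mechanics of that fine-tuning step: the workflow boils down to uploading a JSONL file of prompt/completion pairs and starting a fine-tune job. The sketch below uses the pre-1.0 openai Python package; the file name, example pair, and choice of base model are illustrative placeholders, not the exact dataset or settings I used:

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# kubernetes-docs.jsonl is a placeholder name; each line is one JSON object, e.g.
# {"prompt": "What is a Pod?", "completion": " A Pod is the smallest deployable unit in Kubernetes..."}
training_file = openai.File.create(
    file=open("kubernetes-docs.jsonl", "rb"),
    purpose="fine-tune",
)

fine_tune = openai.FineTune.create(
    training_file=training_file["id"],
    model="davinci",  # illustrative base model choice
)

print(fine_tune["id"])  # poll this job until it completes, then query the resulting fine-tuned model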

Next, let's see if the fine-tuning helped GPT-3 discuss Kubernetes more accurately at a high level.

Prompt: Write several paragraphs about how Kubernetes can help organizations drive digital transformation initiatives.

GPT-3's Response: Kubernetes is a container orchestration platform that can help organizations drive digital transformation initiatives by simplifying the process of deploying and managing containerized applications at scale. By automating the deployment, scaling, and management of containers, Kubernetes can help organizations save time and resources while reducing the risk of human error. Additionally, Kubernetes provides a declarative API that allows developers to define their desired state for their applications, making it easy to roll back changes or update applications in a consistent and predictable manner.

Kubernetes can also help organizations improve the agility of their development processes by allowing them to quickly deploy new features and updates to their applications. By using Kubernetes' rolling update feature, developers can push out new code changes incrementally to production environments without affecting the overall stability of the application. This allows for faster turnaround times when fixing bugs or adding new features, which can ultimately lead to better customer satisfaction.

In addition to its benefits for individual developers and teams, Kubernetes can also help improve an organization's bottom line. By reducing the need for manual intervention in deployments and management tasks, Kubernetes can help reduce operational costs. Additionally, because Kubernetes makes it easy to run multiple replicas of an application, organizations can avoid the downtime and lost productivity associated with traditional deployments.

As a developer and technical marketer, I'm impressed by this result. GPT-3 generated a lucid, coherent narrative about how an organization could use Kubernetes to fulfill a specific goal. When I tried to push it further, however, GPT-3 lost its way and started repeating itself or going off-topic.

Even so, this is still exciting. Most writers could use this text as the beginning of an article about Kubernetes and digital transformation. Alternatively, you could turn it into an outline for an article that covers each of the points GPT-3 came up with in greater detail.

So, can AI create credible technical content?

Throughout the article, we've seen hard evidence that AI isn't ready to create great technical content…yet. Good technical writing requires a combination of insight and hands-on experience that even the most sophisticated text generation AI tools like GPT-3 cannot match.

We've also seen that AI can act as a technical content creation sidekick. Even if it can't write you articles, it can help uncover ideas and topics that you might not have thought of on your own. AI can help you populate topic lists and briefs if you are building out your own content pipeline. If you're working with a technical content agency like us, AI can help you generate ideas you can bring to the table and iterate on with your content creation partner.

It's worth keeping your eye on this space. Transformers and other large language models are improving rapidly. GPT-4's release is expected sometime in 2022, and it will be fascinating to see how it can help technical content creators.

For now, though, AI isn't quite ready to replace human technical content creators. If you need great technical content quickly, say hello to us over at ContentLab.
