I had recently picked up ChatGPT prompt engineering and wanted to make something practical, so I decided to build something like GraphQL, but with natural language. I opened up the OpenAI Playground and started writing a prompt in the format I usually use. It took a long while to get it to do what I wanted, but I eventually got it working more than half the time.
How the prompt works
The prompt takes in two things: the natural language query and your API schema (which should also be written in natural language). I took advantage of the system message by appending the schema to it, right under a section labeled “Schema”.
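Roughly, the system message gets assembled like this. This is just a minimal sketch: the function name, instruction wording, and example schema below are illustrative, not my exact prompt.

// Sketch of assembling the system message with the schema appended under a "Schema" section.
// The instruction text and `buildSystemMessage` name are illustrative, not the real prompt.
function buildSystemMessage(apiSchema: string): string {
  const instructions =
    "You translate natural language queries into API requests. " +
    "Respond with only the JSON format described below.";
  return `${instructions}\n\nSchema:\n${apiSchema}`;
}

// Example schema, written in plain English:
const schema = [
  "GET /todos - list all todos",
  "POST /todos - create a todo, body: { title: string }",
].join("\n");

const systemMessage = buildSystemMessage(schema);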
I instructed the AI to respond with this JSON format:
{
  "headers": { node-fetch-style headers here },
  "body (optional)": { body here },
  "url": "[api endpoint]",
  "method": "POST",
  "pipe (OPTIONAL)": {
    "url": "url to pipe to",
    "method": "request method",
    "headers": { node-fetch-style headers here },
    "pipeTo": "pipe items from body to body, here is an example of syntax: 'message=>username;status=>password'"
  }
}
With the exception of the pipe key, this lets me plug the values straight into a fetch request. I also added the ability to pipe the response of one request into another request, but it’s quite buggy.
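For reference, the plugging-in step looks roughly like this. It’s a sketch that assumes the model’s reply has already been parsed into an object, and the pipe handling shown is a simplified take on the syntax above; the type and function names are just for illustration.

// Rough sketch of executing the generated request with node-fetch.
// Assumes the model's JSON reply was already parsed; names are illustrative.
import fetch from "node-fetch";

interface GeneratedRequest {
  url: string;
  method: string;
  headers?: Record<string, string>;
  body?: unknown;
  pipe?: {
    url: string;
    method: string;
    headers?: Record<string, string>;
    pipeTo: string; // e.g. "message=>username;status=>password"
  };
}

async function runRequest(req: GeneratedRequest) {
  const response = await fetch(req.url, {
    method: req.method,
    headers: req.headers,
    body: req.body ? JSON.stringify(req.body) : undefined,
  });
  const data = (await response.json()) as Record<string, unknown>;

  // Optional piping: copy fields from the first response into the body of a second request.
  if (req.pipe) {
    const pipedBody: Record<string, unknown> = {};
    for (const mapping of req.pipe.pipeTo.split(";")) {
      const [from, to] = mapping.split("=>");
      pipedBody[to] = data[from];
    }
    await fetch(req.pipe.url, {
      method: req.pipe.method,
      headers: req.pipe.headers,
      body: JSON.stringify(pipedBody),
    });
  }

  return data;
}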
Finally, to coax the AI into responding with just JSON, I added a small example conversation I had with the AI and sent that with the system message.
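In code, that ends up as a small few-shot exchange sitting between the system message and the real query. Here’s a sketch using the OpenAI Node library; the example exchange and model choice are made up for illustration.

// Sketch of the few-shot setup: one example turn shows the model what "JSON only" means.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function queryToRequest(systemMessage: string, userQuery: string) {
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: systemMessage },
      // Example exchange (made up) demonstrating the expected output:
      { role: "user", content: "Create a todo called 'buy milk'" },
      {
        role: "assistant",
        content:
          '{"headers":{"Content-Type":"application/json"},"body":{"title":"buy milk"},"url":"/todos","method":"POST"}',
      },
      // The real query:
      { role: "user", content: userQuery },
    ],
  });

  return JSON.parse(completion.choices[0].message.content ?? "{}");
}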
The UI
I made a simple UI in Figma, but I didn’t like it, so I asked my friend @bddy to improve it. He did a really awesome job with the UI, and you should check him out! I also hooked up the UI to the backend for him so he wouldn’t have to do that lol.