Short and sweet
So, I was given a coding task to complete, test, and submit. I wrote a prompt, and the AI generated the code for me. After testing it thoroughly, I discovered a bug. I figured out what the issue was and simply added a follow-up prompt asking for a fix. The AI corrected the code using syntax I didn't know, so I went off and learned it.
The thing is, LLMs are trained on text crawled from the web. Someone had probably posted about this exact issue, which is how the model was able to pick it up and offer a solution. The logic behind the bug was not intuitive even to humans, so how could it be to an LLM?
Your natural ability to visualize a problem is far more valuable to the industry than your "crawling through the web" skills. So if your situation is "let's find a solution on the internet," go ahead and ask AI. But if it's "what is the problem with this code," don't bother the AI.
Our GPUs are melting.
- Sam Altman