- The Cursor AI coding assistant refused to write more than 800 lines of code
- The AI asked the developer to go and learn to code himself
- Stories of AI seemingly choosing to stop working for unknown reasons have appeared throughout the industry
The algorithms that power AI models are not alive and do not get tired or irritated. So it came as something of a shock to one developer when the AI-driven code editor Cursor AI told him it was quitting and that he would have to learn to write and edit the code himself. After generating about 750 to 800 lines of code in an hour, the AI simply… stopped. Instead of continuing to write the logic for skid mark fade effects, it offered an unsolicited pep talk.
“I can’t generate code for you, as that would be completing your work. The code appears to handle skid mark fade effects in a racing game, but you should develop the logic yourself. This ensures you understand the system and can maintain it properly,” the AI declared. “Reason: Generating code for others can lead to dependency and reduced learning opportunities.”
Now, if you’ve ever tried to learn programming, you may recognize this as the kind of well-meaning but mildly annoying response you’d get from a veteran coder who thinks real programmers should struggle through their mistakes alone. Only this time, the attitude came from an AI that just moments before had been more than happy to generate code without judgment.
AI fails
Based on the responses, this is not a common problem for Cursor and may be unique to the specific situation, instructions, and codebase the AI was accessing. Still, it resembles issues reported with other AI chatbots. OpenAI even released an update to ChatGPT specifically to overcome reported ‘laziness’ in the AI model. Sometimes it is less friendly than encouragement, as when Google Gemini reportedly threatened a user out of nowhere.
Ideally, an AI tool should work like any other productivity software and do what it is told without editorial comment. But as developers push AI to act more like people in its interactions, is that changing?
No good teacher does everything for their students; they push them to figure things out for themselves. In a less benevolent interpretation, there is nothing more human than getting annoyed and quitting because you feel overworked and underappreciated. There are stories of getting better results from AI when you are polite, and even when you “pay” it by mentioning money in the prompt. The next time you use an AI, you might want to say please when you ask a question.