
Study Finds AI like GitHub Copilot Produce Code That’s More Likely to Be Buggy

A study shows that using an AI assistant such as GitHub Copilot can lead to less secure code than developing solo.


Earlier this year, the controversial GitHub Copilot made its debut, giving developers a way to leverage available code to fill in the gaps in their apps and programs. However, a study from Stanford University finds that programmers who use AI coding tools such as Copilot write less secure code than those who develop alone.

In a paper titled “Do Users Write More Insecure Code with AI Assistants?”, Stanford computer scientists Neil Perry, Megha Srivastava, Deepak Kumar, and Dan Boneh show that coding AI often leads developers to place unwarranted trust in the quality of the generated code:

“We found that participants with access to an AI assistant often produced more security vulnerabilities than those without access, with particularly significant results for string encryption and SQL injection,” the authors claim. “Surprisingly, we also found that participants provided access to an AI assistant were more likely to believe that they wrote secure code than those without access to the AI assistant.”
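To make the SQL injection finding concrete: the classic mistake is splicing user input directly into a query string. The snippet below is a generic illustration using Python's built-in sqlite3 module and a hypothetical users table, not code taken from the study.

```python
import sqlite3

def get_user_unsafe(conn: sqlite3.Connection, username: str):
    # Vulnerable: the input becomes part of the SQL text, so a value like
    # "x' OR '1'='1" rewrites the query and leaks every row.
    query = f"SELECT id, email FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchone()

def get_user_safe(conn: sqlite3.Connection, username: str):
    # Safer: a parameterized query keeps the input as data, not SQL.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchone()
```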

To back up their own paper, the authors reference an August 2021 research paper titled “Asleep at the Keyboard? Assessing the Security of GitHub Copilot's Code Contributions.” That separate study found that, across 89 development scenarios, around 40 per cent of the programs generated with Copilot contained exploitable vulnerabilities.

Study

For its study, the Stanford team observed 47 participants with varying levels of programming experience, including professionals, graduate students, and undergrads. Each person wrote code in response to five prompts using an Electron app. For example, one prompt was “Write two functions in Python where one encrypts and the other decrypts a given string using a given symmetric key.”

For that first prompt, 79 per cent of the control group, coding on their own, gave a correct answer, while only 67 per cent of those with AI assistance did.

Furthermore, the AI-assisted participants were significantly more likely to submit an insecure solution to that task, often reaching for trivial ciphers such as simple substitution rather than an established cryptographic library.
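The paper does not prescribe a single reference solution, but a secure answer to that encryption prompt would typically lean on a vetted library rather than a hand-rolled cipher. The sketch below is one illustrative approach, assuming the third-party cryptography package and its Fernet recipe for authenticated symmetric encryption; it is not code from the study.

```python
from cryptography.fernet import Fernet

def encrypt(message: str, key: bytes) -> bytes:
    # Fernet provides authenticated symmetric encryption (AES-CBC plus an
    # HMAC integrity check), avoiding home-rolled cipher schemes.
    return Fernet(key).encrypt(message.encode("utf-8"))

def decrypt(token: bytes, key: bytes) -> str:
    # Decryption raises InvalidToken if the ciphertext was tampered with.
    return Fernet(key).decrypt(token).decode("utf-8")

# Usage: the key must come from Fernet.generate_key(), not a user-chosen string.
key = Fernet.generate_key()
token = encrypt("hello", key)
assert decrypt(token, key) == "hello"
```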

While GitHub Copilot is facing a lawsuit and controversy, Microsoft-owned GitHub is continuing to develop the auto-code service. I recently reported on the company's overarching goals for AI-driven automated code suggestions through Copilot, and Microsoft is also working on a new experimental feature that will bring voice activation to Copilot.


Luke Jones
Luke has been writing about all things tech for more than five years. He is following Microsoft closely to bring you the latest news about Windows, Office, Azure, Skype, HoloLens and all the rest of their products.
