GitLab’s New AI-Based Security Feature Explains Codebase Vulnerabilities

The new security feature uses large language models to provide detailed information on potential vulnerabilities.

April 25, 2023

  • GitLab has launched an AI-based security feature, which will explain vulnerabilities in code to developers through LLMs.
  • Earlier this month, the company released an experimental tool capable of explaining code to a user.

Leading developer platform GitLab announced the launch of a new security feature for its platform that will use artificial intelligence (AI) to explain coding vulnerabilities to developers. The company aims to automate the troubleshooting of vulnerabilities with large language models (LLMs) in the future.

Since late 2022, GitLab has launched several new features, including a code completion tool for Premium and Ultimate subscribers, a machine learning-powered suggested-reviewers feature, a beta feature that summarizes issue comments, and, most recently, a tool that can explain any piece of code to a user.

The new vulnerability-explanation feature helps development teams quickly identify and fix vulnerabilities in a specific codebase by combining basic vulnerability data with recommendations tailored to the user's own code. The feature also logs dismissed vulnerabilities to support compliance tracking and audits.
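The approach described above can be illustrated with a minimal sketch: scanner findings and the affected code are combined into a single prompt for a large language model, which then produces a context-specific explanation. The function and field names below are hypothetical illustrations, not GitLab's actual API.

```python
# Hypothetical sketch of LLM-assisted vulnerability explanation:
# combine basic vulnerability data from a scanner with the user's
# affected code, then ask a language model for a tailored explanation.
# All names here are illustrative assumptions, not GitLab's API.

def build_explanation_prompt(vuln_name: str, description: str, snippet: str) -> str:
    """Assemble an LLM prompt from scanner findings and the user's code."""
    return (
        f"Vulnerability: {vuln_name}\n"
        f"Scanner description: {description}\n"
        f"Affected code:\n{snippet}\n"
        "Explain the risk in this specific context and suggest a fix."
    )

prompt = build_explanation_prompt(
    "SQL Injection",
    "User input is concatenated directly into a SQL query string.",
    "query = \"SELECT * FROM users WHERE name = '\" + name + \"'\"",
)
print(prompt)
```

The resulting prompt would then be sent to an LLM; pairing generic vulnerability descriptions with the developer's actual code is what allows the model's explanation to be specific rather than boilerplate.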


GitLab Aims for 10x Efficiency With IP Privacy Safeguards

The company has emphasized data privacy for its AI tools, stating that it will not use customer data to train its language models. This commitment is crucial because many GitLab customers are strictly regulated entities unlikely to take chances with data-leak risks.

According to the company blog, GitLab also launched a new scanner for license approval and compliance, which can verify and identify more than 500 types of licenses, minimizing the risk of users missing out on compliance parameters.

Other features that GitLab seeks to incorporate in further versions include group and subgroup-level dependency lists, continuous container and dependency scanning, management tools for compliance frameworks, and software bill of materials (SBOM) ingestion.

Do you think artificial intelligence will play a greater role in development requirements in the future? Let us know on LinkedIn, Twitter, or Facebook. We would love to hear from you!



Anuj Mudaliar
Anuj Mudaliar is a content development professional with a keen interest in emerging technologies, particularly advances in AI. As a tech editor for Spiceworks, Anuj covers many topics, including cloud, cybersecurity, emerging tech innovation, AI, and hardware. When not at work, he spends his time outdoors - trekking, camping, and stargazing. He is also interested in cooking and experiencing cuisine from around the world.