Today, Microsoft announced that the official launch date of Copilot for Security will be April 1, 2024. This proves that Microsoft won’t let a late-night snowstorm stop its pursuit of security revenue or its search for new applications for generative AI. Approximately one year after announcing the project, Copilot for Security will be available to security leaders and their teams. Below, we share our takeaways and thoughts to prepare security leaders for the technology.

Get Comfortable With Consumption-Based Pricing

Microsoft prices security compute units (SCUs) at $4 per hour, with no mention of discounts for prepurchasing or for consumption volume.
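To put the consumption model in concrete terms, here is a rough back-of-the-envelope sketch of what provisioned SCUs could cost. The $4-per-hour rate is the published figure; the provisioning patterns and the `monthly_scu_cost` helper are our own illustrative assumptions, not Microsoft’s guidance:

```python
# Back-of-the-envelope SCU spend estimate. Assumes the published $4/hour
# rate; actual bills depend on how many SCUs you provision and for how long.

SCU_RATE_USD_PER_HOUR = 4  # published launch pricing

def monthly_scu_cost(provisioned_scus: int, hours: float = 730) -> float:
    """Estimate monthly spend for a fixed number of provisioned SCUs."""
    return provisioned_scus * SCU_RATE_USD_PER_HOUR * hours

# One SCU provisioned around the clock (~730 hours/month):
print(f"1 SCU, 24x7: ${monthly_scu_cost(1):,.0f}/month")                  # ~$2,920
# Three SCUs for business hours only (~176 hours/month):
print(f"3 SCUs, business hours: ${monthly_scu_cost(3, 176):,.0f}/month")  # ~$2,112
```

Even the smallest always-on footprint lands near $3,000 per month before any training or enablement costs, which is why the budgeting advice later in this post matters.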

We think that Microsoft chose consumption-based pricing for at least two reasons:

  1. Not bundling this with the alphabet soup of its existing licensing scheme probably makes it easier for Microsoft (and possibly for its customers) to launch.
  2. There is a significant degree of unpredictability in the cost and pricing models for generative AI across vendors. Striking the balance of what to charge — and how to charge — for generative AI is still a struggle.

Long term, we expect generative AI features to become a baseline element of most products and services, and few vendors will be able to charge a premium for the capabilities. Microsoft is one of those few.

Making Security More Accessible For All

Copilot for Security will launch with support for prompting in eight languages, and the product interface will be available in 25 languages. This is, according to Microsoft, “reflective of the diversity of attackers.” Microsoft also expects Copilot to reduce the barrier to entry for those with different backgrounds and to attract more (and more diverse) talent into cybersecurity.

Though large language models and generative AI may level the playing field and allow for accelerated security talent development, no amount of out-of-the-box prompt books and guided response steps replaces fundamental security knowledge, skills, and experience. Expect continued investment in cybersecurity skills and training, mentoring, and job shadowing to remain a vital part of your talent management strategy. Also expect a fair amount of change management and training for even your most seasoned practitioners to take full advantage of Copilot (see below).

Plan For Training, And Think About How To Encourage Behavioral Change

We attended one of Microsoft’s pre-launch events and received interesting details from several of its private preview and beta users. One of the most interesting nuggets: Expect it to take around 40 hours of training to get security practitioners comfortable with using Copilot for Security. That seems like quite a bit of time, but remember that this is a new way to work.

In addition, we heard that it takes four or more weeks — with many stops and starts — for practitioners to get comfortable with the technology. The moment the technology doesn’t do what a security operations center (SOC) analyst expects, they revert to the “old way” of doing things. Getting them to fully adopt it requires more than just training: It also requires them to develop new behaviors.

Microsoft Needs Partners To Solve Scale And Training Problems

Microsoft acknowledges the challenges mentioned above, and one of our key takeaways is that scaling training for the analysts who need it will become a partner problem. Deployment and implementation are pretty fast, given that this is largely a cloud solution; more integrations are on the roadmap.

As of today, Microsoft is offering a series of training webinars and documentation related to Copilot skills for Defender XDR. When it comes to helping actual end users, however, the onus is on its partner ecosystem to encourage adoption and facilitate use of this technology.

Enormous Productivity Gains Exist, Mostly For Experienced Practitioners

At our 2023 Security & Risk Forum keynote, Allie Mellen and Jeff Pollard discussed how the practitioners getting the most value out of generative AI capabilities were not new, inexperienced personnel but the more experienced staff, who saw the largest productivity gains. Microsoft’s panel of clients confirmed this.

Experienced personnel saw major gains in productivity by eliminating drudgery such as writing queries. Less experienced personnel were helped by summarization and report creation.

These findings align with our research in How Security Tools Will Leverage Generative AI, where we identified that the top three use cases all center on content creation.

We asked a SOC leader who had been in the beta whether Copilot for Security had uncovered threats that his investigations would have missed. He said no: he would have found them all, but it would have taken longer. He added, “But it helps the junior analyst, because they would have gotten stuck much earlier and would have had to escalate to us.”

In addition, offloading some of the incident writing and reporting requirements helped personnel focus on more proactive tasks, such as planning improvements to overall security posture and automating workflows.

What Microsoft Early Access Clients Loved

Microsoft’s panel of clients agreed that they loved the following things about Copilot for Security:

  • Making script analysis easier. They particularly noted its ability to deobfuscate scripts on the fly and explain their contents (illustrated in the sketch after this list).
  • Accelerating threat hunting by helping write queries based on adversary methodologies.
  • Creating complex queries. Speeding up and simplifying how analysts write complex KQL queries or PowerShell scripts was a huge value-add (also sketched below).
  • Analyzing phishing submissions. This includes verifying true positives and providing details on whether or not an individual opened the email and which inboxes it landed in.
  • Improving analyst experience. Copilot for Security reduced a practitioner’s need to swap between various tools and interfaces, leading to big productivity gains.
  • Creating report summaries. One beta customer said his favorite outcome was that the incident report could be generated as a template, with information that executives needed to sign off on. This eliminated the occasional back-and-forth email thread that occurred when analysts forgot to include this component.
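To make the script-analysis and query-writing bullets above concrete, here is a toy sketch (ours, not Microsoft’s) of the two chores in question: decoding an obfuscated, base64-encoded PowerShell command, and drafting a hunting query from a plain-English prompt. The KQL targets Defender’s real DeviceProcessEvents advanced-hunting table; the prompt wording and the Python scaffolding are hypothetical:

```python
import base64

# 1. "Deobfuscating on the fly": PowerShell's -EncodedCommand payloads are
#    base64 over UTF-16LE, so decoding one is mechanical but tedious by hand.
encoded = "dwBoAG8AYQBtAGkA"
print(base64.b64decode(encoded).decode("utf-16-le"))  # -> whoami

# 2. Drafting a complex KQL hunting query from a natural-language prompt.
prompt = "Find PowerShell launched with encoded commands in the last 7 days"
kql_draft = """
DeviceProcessEvents
| where Timestamp > ago(7d)
| where FileName =~ "powershell.exe"
| where ProcessCommandLine has_any ("-enc", "-EncodedCommand")
| project Timestamp, DeviceName, AccountName, ProcessCommandLine
| order by Timestamp desc
"""
print(kql_draft)
```

Nothing here is hard for a senior analyst, which is exactly the point from the productivity section above: the value is in removing drudgery, not in replacing expertise.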

What Are The Downsides?

For now, given that it’s a minimum viable product (MVP), Copilot for Security will require multiple instances for companies that want to silo data between business units, operating companies, or geographies, and those instances do not roll up into a single interface at launch. Not only is that problematic for multinationals or complex corporations, but it’s also a challenge for service provider partners offering managed detection and response (MDR), SOC-as-a-service (SOCaaS), or managed SIEM services on Copilot for Security.

Integrations are limited — for now. Copilot can call Power Automate, and Power Automate can call Copilot, but in neither case is the call bidirectional: results don’t flow back to the caller. Copilot cannot auto-quarantine an infected host today, but Microsoft has plans to add more integrations in the future, and customer adoption will accelerate this.

Should Security Leaders Invest?

For security teams already leaning on Microsoft Sentinel, Defender, Entra, Priva, Intune, and Purview, this is a no-brainer add-on that will make those teams more productive.

Make sure that your budgets can absorb some unknowns, given the “pay as you go” (aka consumption-based) pricing model, and set aside some dollars and time for training, too. As we stated in Top Recommendations For Your Security Program, 2024, you’ll need to balance the usefulness of assistants such as Copilot with the potential for skyrocketing costs.

For teams that depend on other technologies, it may not be worth sinking much investment into security compute units (for now); instead, turn to whatever your current portfolio vendor calls its generative AI solution.

Connect With Us

Forrester clients can schedule an inquiry or guidance session with us to discuss how Microsoft Copilot for Security can impact your security program.