The Race for AI-Powered Security Platforms Heats Up

When a major vulnerability shakes up the cybersecurity world, such as the recent XZ backdoor or the Log4j2 flaws of 2021, the first question most companies ask is, “Are we affected?” In the absence of well-written playbooks, that simple question can require a great deal of effort to answer.
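
Without a playbook, answering that question often comes down to ad hoc scripting against whatever inventory data a team has on hand. The sketch below is a hypothetical example of such a one-off check; the host names and package inventory are invented for illustration, and only the affected xz version numbers reflect the real advisory.

```python
# Hypothetical one-off check: which hosts run a backdoored xz release?
# The inventory below is illustrative stand-in data, not real telemetry;
# 5.6.0 and 5.6.1 are the xz versions tied to the XZ backdoor (CVE-2024-3094).
AFFECTED_XZ_VERSIONS = {"5.6.0", "5.6.1"}

inventory = {
    "web-01": {"xz": "5.4.6"},
    "build-07": {"xz": "5.6.1"},
    "ci-runner-03": {"xz": "5.6.0"},
}


def affected_hosts(inv: dict) -> list:
    """Return hosts whose installed xz version is on the affected list."""
    return [
        host
        for host, packages in inv.items()
        if packages.get("xz") in AFFECTED_XZ_VERSIONS
    ]


if __name__ == "__main__":
    for host in affected_hosts(inventory):
        print(f"{host}: affected xz version {inventory[host]['xz']}")
```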

Microsoft and Google are investing heavily in generative artificial intelligence (GenAI) systems that can turn broad security questions into concrete actions, assist security operations, and, increasingly, take automated actions. Microsoft provides overworked security operations centers with Security Copilot, a GenAI-based service that can identify breaches, connect threat signals, and analyze data. And Google’s Gemini in Security is a collection of security capabilities powered by the company’s Gemini GenAI.

Startup Simbian is joining the race with its new GenAI-based platform for helping companies tackle their security operations. Simbian’s system combines large language models (LLMs) for summarizing data and understanding natural language, other machine learning models to connect disparate data points, and a software-based expert system built on security information culled from the Internet.
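
That description suggests a three-stage division of labor. The sketch below is a conceptual illustration of that kind of pipeline, not Simbian's actual code; every function name, input, and output in it is a hypothetical placeholder.

```python
# Conceptual three-stage pipeline mirroring the components described above.
# All function names, inputs, and outputs are hypothetical placeholders.

def summarize_with_llm(raw_alerts: list) -> str:
    """Stage 1: an LLM condenses raw alert text into a readable summary."""
    return f"{len(raw_alerts)} alerts reviewed"  # stand-in for a model call


def correlate_signals(telemetry: list) -> list:
    """Stage 2: other ML models link related events across data sources."""
    return [event for event in telemetry if event.get("related")]


def recommend_actions(correlated: list) -> list:
    """Stage 3: an expert system maps correlated findings to next steps."""
    return [f"Investigate host {event['host']}" for event in correlated]


def answer_security_question(raw_alerts: list, telemetry: list) -> tuple:
    summary = summarize_with_llm(raw_alerts)
    actions = recommend_actions(correlate_signals(telemetry))
    return summary, actions
```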

Where configuring a security information and event management system (SIEM) or a security orchestration, automation, and response (SOAR) system could take weeks or months, using AI cuts the time to — in some cases — seconds, says Ambuj Kumar, co-founder and CEO of Simbian.

“With Simbian, literally, these things are done in seconds,” he says. “You ask a question, you express your goal in natural language, we break [it] into steps [of] code execution, and this is all done automatically. It’s self-sufficient.”

Helping overworked security analysts and incident responders streamline their jobs is a perfect application for the more powerful capabilities of GenAI, says Eric Doerr, vice president of engineering at Google Cloud.

“The opportunity in security is particularly acute given the elevated threat landscape, the well-publicized talent gap in cybersecurity professionals, and the toil that is the status quo in most security teams,” Doerr says. “Accelerating productivity and driving down mean time to detect, respond, and contain [or] mitigate threats through the use of GenAI will enable security teams to catch up and defend their organizations more successfully.”

Different Starting Points, Different ‘Advantages’

Google’s advantages in the market are evident. The IT and Internet giant has the budget to stay the course, the technical expertise in machine learning and AI from its DeepMind projects to innovate, and access to a lot of training data — a critical consideration for creating LLMs.

“We have a tremendous amount of proprietary data that we’ve used to train a custom security LLM — SecLM — which is part of Gemini for Security,” Doerr says. “This is the superset of 20 years of Mandiant intelligence, VirusTotal, and more, and we’re the only platform that has an open API — part of Gemini for Security — that allows partners and enterprise customers to extend our security solutions and have a single AI that can operate with all the context of the enterprise.”

Like Simbian’s guidance, Gemini in Security Operations — one capability under the Gemini in Security umbrella — will assist in investigations starting at the end of April, guiding the security analyst and recommending actions from within Chronicle Enterprise.

Simbian uses natural language queries to generate results, so asking, “Are we affected by the XZ vulnerability?” will produce a table of IP addresses of vulnerable applications. The system also uses curated security knowledge gathered from the Internet to create guidebooks for security analysts, showing them a script of prompts to give the system to accomplish a specific task.
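
To make that interaction concrete, here is a minimal sketch of the kind of natural-language query flow described; the client function, field names, and results are assumptions for illustration and do not reflect Simbian's actual API.

```python
# Illustrative flow: a natural-language question in, a table of affected
# IP addresses out. The function and its results are invented placeholders.
from dataclasses import dataclass


@dataclass
class Finding:
    ip_address: str
    application: str
    version: str


def ask(question: str) -> list:
    """Hypothetical client call; a real platform would query live telemetry."""
    if "xz" in question.lower():
        return [
            Finding("10.0.4.17", "build-agent", "xz 5.6.1"),
            Finding("10.0.8.92", "ssh-bastion", "xz 5.6.0"),
        ]
    return []


if __name__ == "__main__":
    for f in ask("Are we affected by the XZ vulnerability?"):
        print(f"{f.ip_address:<12} {f.application:<14} {f.version}")
```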

“The guidebook is a way of personalizing or creating trusted content,” says Simbian’s Kumar. “Right now, we are creating the guidebooks, but once … people just start to use it, then they can create their own.”

Strong ROI Claims for LLMs

The returns on investment will grow as companies move from a manual process to an assisted process to autonomous activity. Most GenAI-based systems have advanced only to the stage of an assistant or copilot, which suggests actions or takes only a limited series of actions after gaining the user’s permission.

The real return on investment will come later, Kumar says.

“What we are excited about building is autonomous — autonomous is making decisions on your behalf that are within the scope of guidance you have given it,” he says.

Google’s Gemini also seems to straddle the gap between an AI assistant and an automated engine. Financial services firm Fiserv is using Gemini in Security Operations for creating detections and playbooks faster and with less effort, and for helping security analysts quickly find answers using natural language search, boosting the productivity of security teams, Doerr says.

Yet trust is still an issue and a hurdle for increased automation, he says. To bolster trust in the system and solutions, Google remains focused on creating explainable AI systems that are transparent in how they come to a decision.

“When you use a natural language input to create a new detection, we show you the detection language syntax and you choose to run that,” he says. “This is part of the process of building confidence and context with Gemini for Security.”
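
That review-then-run step amounts to a simple human-in-the-loop flow: generate the rule, display its syntax, and deploy only after an analyst approves. The example below is illustrative only; the rule text and both functions are hypothetical placeholders, not actual Gemini output or Chronicle detection syntax.

```python
# Human-in-the-loop flow: show the generated detection before it runs.
# The rule text and both functions are hypothetical placeholders.

def generate_detection(request: str) -> str:
    """Stand-in for an LLM call that turns a request into detection syntax."""
    return (
        "rule repeated_failed_logins {\n"
        '  events: $e.event_type = "USER_LOGIN" and $e.outcome = "FAIL"\n'
        "  condition: #e > 10\n"
        "}"
    )


def deploy_detection(rule_text: str) -> None:
    """Stand-in for pushing an approved rule to the detection engine."""
    print("Deployed rule:\n" + rule_text)


if __name__ == "__main__":
    rule = generate_detection("Alert when a user fails to log in more than 10 times")
    print("Proposed detection:\n" + rule)  # the analyst reviews the syntax first
    if input("Run this detection? [y/N] ").strip().lower() == "y":
        deploy_detection(rule)  # deployed only after explicit approval
```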
