Marco Rodrigues
Industry
May 29, 2025

3 points missing from agentic AI conversations at RSAC

Agentic AI tools for security operations centers promise to enhance—not replace—human analysts, but their true value lies in thoughtful integration, deep context, and rigorous proof-of-concept testing, not hype-driven adoption.


This article originally appeared in SC Magazine.

For those who attended RSAC 2025, chances are agentic AI came up in conversation. Vendors pushed dozens of agentic AI products, many tailored to security operations center (SOC) use cases, and marketers dove in head-first to position their companies at the forefront of innovation.

However, thoughtful dialogue about the practical application and true value of agentic AI in the SOC got lost. Here’s what many of the sales pitches missed:

Agentic SOC platforms are a force multiplier, not a replacement.

One of the biggest misconceptions about agentic SOC solutions we heard is that they will put security professionals out of work and replace some of the tools they’re most familiar with, such as security information and event management (SIEM) tools. That’s not accurate; in fact, humans, SIEMs, and agentic SOC solutions work better when used in tandem.

Security professionals benefit from effective agentic SOC tools. These products minimize tedious workloads, substantially reduce the time spent triaging alerts and running investigations, and free analysts to uplevel their skills and focus on high-tier investigations and response tasks.

SIEMs have been around for decades and aren’t going anywhere. They collect large amounts of historical data and context that agentic SOC solutions rely on to produce recommendations and responses. Agentic SOC tools add reasoning and action on top of those data points, but they need access to the context stored in the SIEM to remain effective.
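To make that dependency concrete, here is a minimal sketch of how an agentic triage step might lean on SIEM context. The function names and fields below are illustrative assumptions, not any particular vendor’s API.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    entity: str      # user or host the alert fired on
    rule: str        # detection rule that triggered
    severity: str

def fetch_siem_context(alert: Alert) -> dict:
    """Hypothetical SIEM lookup: pull historical activity for the entity
    so the agent reasons over context rather than a lone data point."""
    # In practice this would query the SIEM's API for things like
    # prior alerts, asset criticality, and typical login locations.
    return {
        "prior_alerts_30d": 2,
        "asset_criticality": "high",
        "typical_login_geo": ["US"],
    }

def triage(alert: Alert) -> dict:
    context = fetch_siem_context(alert)
    # The recommendation is only as good as this context: without it,
    # the same alert on a crown-jewel server and a test VM look identical.
    recommendation = "escalate" if context["asset_criticality"] == "high" else "monitor"
    return {"alert": alert.rule, "recommendation": recommendation, "evidence": context}

print(triage(Alert(entity="svc-backup", rule="impossible-travel", severity="medium")))
```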

Context too often gets overlooked.

An aspect of agentic AI that gets lost in conversations about minimizing workloads is how deeply it must work in tandem with third-party systems. These third-party tools and data sources have nuanced interfaces, data schemas, and operations that agents can misinterpret without deep contextual knowledge of how each tool works. AI agents need deep integration: sufficient access to data, visibility into workflows, strong feedback mechanisms, and environmental context.

If that enabling deep context gets overlooked, agentic AI tooling can add tasks to the to-do list rather than remove them. For example, when a solution triages an alert and offers a recommendation, is there transparency into how the supporting data was gathered? Does the team have to go through another system to get it? Does that add work for the team? The level of context required, and the importance of automating fine-tuning after deployment, are still being overlooked.
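One way to picture the transparency question is a triage verdict that carries its evidence and sources inline, so an analyst can verify every claim without opening another console. The structure below is purely illustrative, not a specific product’s output format.

```python
# Illustrative only: a triage verdict that cites where each piece of
# evidence came from, so verification doesn't require another console.
verdict = {
    "alert": "impossible-travel",
    "recommendation": "escalate",
    "evidence": [
        {"fact": "login from new country", "source": "idp_audit_log", "query": "logins, last 24h"},
        {"fact": "asset tagged critical",   "source": "cmdb",          "query": "asset lookup"},
    ],
}

# An analyst (or an auditor) can check every claim from the ticket itself.
for item in verdict["evidence"]:
    print(f'{item["fact"]} <- {item["source"]} ({item["query"]})')
```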

The vendors don’t offer PoCs that can prove a product’s real value.

Crowded booths and flashy banners were everywhere, but booth demos are optimized to tease the best functionality the vendor has to offer; they can’t deliver the insights that come from deploying the product in the user’s own environment.

Vendor claims for agentic AI SOC tools ranged from saving time and money to agents making decisions and executing on them autonomously. A proof of concept (PoC) can help verify whether those claims hold up under the conditions of the company’s own SOC. Can the tool handle the company’s specific data volumes and alert types? Can it integrate with the tools in the tech stack that are crucial to the organization’s business operations?

Many may think: “PoCs are nothing new; we know there’s value.” True, but the misconception that AI agents will replace security professionals, combined with the current economic climate, raises concerns that a hands-on PoC can quell far better than a paper evaluation. Giving analysts the opportunity to test the product and see that it’s there to help them, not replace them, goes a long way toward building trust between the user and the product, and between employees and the investment decision-makers.

Running a PoC, and resisting the urge to make a heavy investment immediately for the sake of quick innovation, lets a team fine-tune the tool’s logic, policies, and thresholds to match the SOC’s risk appetite and operational nuances.
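What that tuning might look like in practice: a small, hypothetical policy configuration whose thresholds and permitted autonomous actions are adjusted as PoC feedback comes in. The field names here are assumptions for illustration, not a specific product’s schema.

```python
# Illustrative PoC tuning loop: start conservative, then adjust based on
# how the agent's verdicts compare to side-by-side analyst review.
policy = {
    "auto_close_confidence": 0.95,   # only auto-close when the agent is very sure
    "auto_escalate_severity": "high",
    "allowed_autonomous_actions": ["enrich", "annotate"],  # no containment yet
}

def review_poc_week(false_closes: int, missed_escalations: int) -> None:
    """Tighten or relax policy after a week of analyst review."""
    if false_closes > 0:
        # Alerts were closed that analysts would have kept open: raise the bar.
        policy["auto_close_confidence"] = min(0.99, policy["auto_close_confidence"] + 0.02)
    if false_closes == 0 and missed_escalations == 0:
        # Earned trust: allow one more autonomous action next week.
        policy["allowed_autonomous_actions"].append("isolate_host")

review_poc_week(false_closes=1, missed_escalations=0)
print(policy)
```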

As with any new technology, we’re bound to have a hype cycle that spins up fluff. To find the true value of a new product, take it for a test drive and hold it to a high standard to deliver on its promises: make sure the outcomes are accurate, the sources transparent, the data immediately accessible, and that it complements the operations of the teams and tools that are crucial to the organization’s success.
