Real-World Performance of Cybersecurity Products

It’s easy to feel confident when your security tools shine in lab tests. But what happens when they face the stress of the real world? It is likely that your organization has invested thousands of dollars into cutting-edge security solutions that achieve 100% protection and detection in MITRE Engenuity ATT&CK® Evaluations. Yet, when these same tools are deployed in your complex production environment, their effectiveness may not be the same—sometimes not even reaching half of their lab-tested performance. 

If you're curious about the factors behind this stark difference, as revealed by extensive research in the Blue Report 2024, keep reading. We’ll break down the key issues at play and share strategies to close the gap between lab-tested results and real-world performance.

Understanding the Picus Blue Report 2024

Before we dive into the key findings drawn from these attack simulations, it's important to understand the significance of the Blue Report 2024 itself.

So, What Exactly Is the Blue Report 2024?

The Blue Report is an annual study conducted by Picus Labs, analyzing over 136 million attack simulations performed by Picus Security customers from January to June 2024. These simulations, run on the Picus Security Validation Platform, offer insights into the real-world performance of leading security products.

Why Focus on Findings from Security Control Validation (SCV)?

This blog focuses on the findings from our Security Control Validation (SCV) product because it reveals how security controls, when deployed across diverse organizations and sectors, often fail to perform as expected in real-world scenarios. While controlled environment tests can offer promising results, they frequently overlook the complexities of actual operational settings. SCV, powered by Picus' Breach and Attack Simulation (BAS) technology, provides a critical, real-world evaluation of security solutions, highlighting the gaps that might otherwise go unnoticed.

What Real-world Data Shows Us: Key Findings

As noted in the introduction, security products are exhaustively tested against simulated attacks to evaluate their effectiveness. The MITRE Engenuity ATT&CK® Evaluations are among the most respected benchmarks, with many solutions achieving perfect scores (100%) in prevention and detection under controlled conditions.

However, real-world performance data paints a different picture. When these solutions are deployed in diverse production environments, their effectiveness varies significantly. The graphs below illustrate the prevention and detection scores of various security solutions as tested by our platform’s SCV capability.

Prevention Score: The Good, the Bad, and the 'Yikes'

The prevention score, drawn from extensive simulations across a broad range of environments, reveals a wide distribution.

While some implementations of the same devices manage to perform quite well, with scores clustering around 70% to 80%, there’s a noticeable spread across the spectrum. Many devices dip into the 40% to 60% range, indicating a middle ground of effectiveness that may not meet the high expectations set by controlled testing. A few outliers even struggle below 30%, highlighting the challenges faced in less-than-ideal conditions.

Figure 1. Real-World Prevention Scores of Security Controls Evaluated by Picus SCV Platform
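For intuition, a prevention score can be read as the share of simulated attacks that a control actually blocks. The minimal sketch below illustrates that idea; it is an assumption for illustration, not the platform's exact scoring formula.

```python
# Illustrative only: this treats a prevention score as the percentage of
# simulated attacks that a control blocked. The Blue Report's exact
# scoring methodology is not reproduced here.

def prevention_score(blocked: list[bool]) -> float:
    """blocked[i] is True if simulated attack i was prevented."""
    if not blocked:
        return 0.0
    return 100.0 * sum(blocked) / len(blocked)

# Example: 7 of 10 simulated attacks blocked -> 70.0
print(prevention_score([True] * 7 + [False] * 3))
```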

This data underscores a critical insight: real-world environments are far more complex and unpredictable than controlled settings. The major factors behind this distribution are examined in a later section; first, let's look at the real-world performance of detection-layer solutions.

Detection Score: A Mixed Bag of Success and Shortfalls

Like the prevention score, the detection score, derived from extensive simulations across various environments, exhibits a broad and uneven distribution.

While a few solutions show moderate effectiveness, with scores clustering around 40% to 50%, the majority fall below expectations, with a significant number of scores dipping between 20% and 30%. This wide spread indicates a mixed bag of results, where some controls struggle to consistently identify threats. The lack of high-performing outliers above 75% further underscores the challenges security solutions face in accurately detecting threats in real-world conditions.

Figure 2. Real-World Detection Scores of Security Controls Evaluated by Picus SCV Platform
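Detection scoring is typically more nuanced than prevention scoring, since an attack may be silently logged without ever raising an alert. The sketch below illustrates one plausible weighting; the "logged" and "alerted" categories and their weights are assumptions for illustration, not the report's methodology.

```python
# Illustrative only: detection analytics often distinguish between an
# attack that was merely logged and one that raised an alert. The 0.5
# weight for "logged" is an assumption, not the Blue Report's formula.

def detection_score(outcomes: list[str]) -> float:
    """Each outcome is 'alerted', 'logged', or 'missed'."""
    weights = {"alerted": 1.0, "logged": 0.5, "missed": 0.0}
    if not outcomes:
        return 0.0
    return 100.0 * sum(weights[o] for o in outcomes) / len(outcomes)

# Example: one alert, two logs, one miss -> 50.0
print(detection_score(["alerted", "logged", "logged", "missed"]))
```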

Three Factors Affecting Prevention and Detection Effectiveness: Key Takeaways

Real-world data shows that even best-of-breed products that score 100% in controlled settings can exhibit a wide range of prevention and detection effectiveness once deployed. 

We attribute this variability to several factors:

1. Environmental and Configurational Differences 

Every organization's network architecture, regulatory and compliance needs, threat landscape, and user behavior are unique. These differing environments can significantly impact the performance of security products, leading to variations in effectiveness across different organizational deployments.

2. Context and Deployment Nuances

Where and how a cybersecurity solution is implemented, including its integration with other security tools, policies, and specific configuration settings, plays a vital role in determining its real-world effectiveness. The same product might perform exceptionally well in one setup but face unexpected limitations in another.

3. Dynamic Nature of Threats

The cyber threat landscape is always evolving, with new tactics, techniques, and procedures (TTPs) emerging regularly. Security products need to be continuously validated against the latest global threats to ensure they remain effective. This requires that companies regularly update and fine-tune their cybersecurity solutions to maintain the strongest, most effective posture.

Two Critical Recommendations for Improved Prevention & Detection Scores

Given the variability in the performance of both detection and prevention layer solutions, organizations should have realistic expectations when implementing security solutions, even those that perform exceptionally well in standard evaluations. We strongly recommend conducting comprehensive, context-specific evaluations, then continuously monitoring and tuning these tools to ensure they remain effective against the most current threats.

This approach leads us to offer two critical recommendations:

Recommendation 1: Continuous Validation

Organizations must regularly test and validate their security products against the latest threats to confirm that they provide the expected level of protection. Regular attack simulations can help identify potential gaps and areas for improvement.
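As a rough illustration of what "regular" can look like in practice, the sketch below shows a recurring validation check. The run_simulation() function is a hypothetical stand-in for whatever BAS platform API your organization uses, not an actual Picus endpoint, and the 70% threshold is an assumption.

```python
# Hypothetical sketch of a recurring validation check. run_simulation()
# stands in for your BAS platform's API; it is not a real Picus call,
# and the 70% threshold is an illustrative assumption.

THRESHOLD = 70.0  # assumed minimum acceptable prevention score, in percent

def run_simulation() -> float:
    """Placeholder: trigger an attack simulation suite and return its score."""
    return 62.5  # stubbed result so the example runs end to end

def validate_once() -> None:
    score = run_simulation()
    if score < THRESHOLD:
        # In practice: open a ticket, notify the SOC, or kick off tuning.
        print(f"ALERT: score {score:.1f}% is below the {THRESHOLD:.0f}% threshold")
    else:
        print(f"OK: score {score:.1f}%")

if __name__ == "__main__":
    validate_once()  # schedule via cron or CI to make validation continuous
```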

Recommendation 2: Ongoing Fine-Tuning

Security tools should not be considered set-and-forget. Continuous fine-tuning and updates are essential to adapt to changing threat landscapes and organizational needs. This includes adjusting configurations, updating threat intelligence feeds, and refining integrations with other security tools.
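As one concrete example of ongoing fine-tuning, the sketch below flags threat intelligence feeds that have gone stale. The feed names, dates, and seven-day freshness window are purely illustrative assumptions.

```python
# Hypothetical sketch: flag threat intelligence feeds that have gone
# stale, one small piece of ongoing fine-tuning. Feed names, dates, and
# the seven-day freshness window are illustrative assumptions.
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=7)  # assumed freshness requirement

feeds_last_updated = {  # hypothetical feed inventory
    "commercial-ti-feed": datetime(2024, 6, 28, tzinfo=timezone.utc),
    "open-source-iocs": datetime(2024, 5, 2, tzinfo=timezone.utc),
}

now = datetime(2024, 7, 1, tzinfo=timezone.utc)  # fixed for a reproducible demo
for feed, last_update in feeds_last_updated.items():
    age = now - last_update
    if age > MAX_AGE:
        print(f"{feed}: stale ({age.days} days since last update), refresh needed")
```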

These recommendations aim to strengthen your defenses and close the security gaps highlighted by the SCV findings. However, understanding the broader impact of these gaps and how to address them strategically requires a more in-depth approach. 

That’s where Picus Security Control Validation comes into play.

Stop Estimating Your Security Effectiveness: Picus SCV Can Help You

Picus SCV, powered by award-winning Breach and Attack Simulation (BAS) technology, offers a dynamic and proactive approach to measuring and enhancing your organization's cyber resilience. Rather than relying on static evaluations or assumptions, Picus SCV continuously and automatically tests the effectiveness of your security tools in real-world conditions.

With a threat library that is updated daily by a team of offensive security experts, Picus ensures that your defenses are always tested against the latest and most relevant attack techniques. This includes a 24-hour SLA for adding emerging threats to the library if a publicly available proof of concept (PoC) exists. This level of responsiveness ensures that your security posture is not just current, but resilient against the ever-evolving threat landscape.

By simulating real-world cyber threats, Picus SCV allows you to identify prevention and detection gaps in your security infrastructure. More importantly, it provides actionable mitigation recommendations (both vendor-specific & vendor-neutral), enabling you to address these gaps swiftly and effectively. This proactive approach ensures that your security tools are not just theoretically sound, but practically effective in the face of actual threats.

Conclusion

While achieving 100% protection and detection coverage in MITRE Engenuity ATT&CK® Evaluations highlights a product's potential, it doesn't guarantee absolute security in real-world deployments. Factors such as misconfigurations, human error, the complexity of real-world attacks, and the specific demands of each environment can significantly affect the effectiveness of even the best security solutions.

The takeaway is clear: Performance variability often arises not from the tools themselves but from how they are implemented and maintained. 

A robust security posture requires more than just deploying the right solutions—it needs continuous validation, proper configuration, and effective management.

Organizations must avoid overestimating their security readiness. The real test of security controls lies not in controlled, ideal conditions but in the unpredictable challenges of everyday operations. With Picus SCV, you gain a clear understanding of your true security posture, moving beyond assumptions to ensure your defenses can handle real-world threats. By leveraging continuous testing and proactive adjustments, you ensure that your security tools remain resilient against the dynamic threat landscape.