What we learned about API discovery from comparing runtime and edge views
As a CISO, my litmus test for API discovery is simple: does it find the endpoints that matter and turn them into security work we can act on? Does it give my team a clean list of testable items? To pressure-test the discovery features on the Invicti Platform and see how they stack up, we ran an informal benchmark within our AppSec team.
Specifically, we took the network-layer API discovery feature powered by Invicti’s DAST-integrated network traffic analyzer (NTA) and compared it to Cloudflare’s API Discovery tool that we use as part of the edge gateway setup across our production and corporate sites. Both tools were then run against one of Invicti’s own applications with no special preparation for benchmarking. The goal was a very practical check on coverage and actionability across two different vantage points.
“We wanted an honest read on whether our DAST-based discovery keeps up with what a network-perimeter product can see – and just as importantly, whether the results are ready for security work without extra cleanup,” said application security engineer Paul Good, who set up and ran the tests.
Two discovery approaches, two perspectives
NTA provides the innermost layer of Invicti’s multi-layered API discovery. It works inside the application architecture and performs API discovery while a DAST scan is running. It identifies endpoints based on live interactions and is constrained by pre-configured rules to avoid risky operations in production, such as delete operations or actions that could deauthenticate the tool mid-scan. The result is a curated, security-focused view of actively tested APIs.
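Conceptually (and purely as an illustration, not Invicti’s actual rule format), you can think of those safety constraints as a predicate the scanner applies before exercising each candidate request. The method list and path fragments below are hypothetical:

```python
# Hypothetical illustration of scan-safety rules; this is not Invicti's configuration format.
RISKY_METHODS = {"DELETE"}                            # destructive operations are skipped in production
RISKY_PATH_FRAGMENTS = ("/logout", "/deauthorize")    # calls that could end the scanner's session mid-scan

def is_safe_to_test(method: str, path: str) -> bool:
    """Return True if the scanner may exercise this endpoint under production safety rules."""
    if method.upper() in RISKY_METHODS:
        return False
    return not any(fragment in path.lower() for fragment in RISKY_PATH_FRAGMENTS)

# Example: is_safe_to_test("DELETE", "/api/v1/users/42") -> False
```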
The Cloudflare tool works at a different level: it passively inspects live traffic at the edge via its reverse proxy. This enables continuous detection of all APIs being accessed in real time, including shadow and legacy endpoints, whether or not they’re under active testing. This kind of perimeter inspection provides a broader and more persistent view across environments.
Both approaches are valuable in their own way: a DAST-centric list shows you what’s immediately testable, while an edge inspection list can uncover activity you may not be hitting during a scan. The question was how Invicti’s own product would perform and how results from the two tools would differ.
Evaluating the results
Our team compared what each tool surfaced for the same app and validated the discovered endpoints by sending requests to check the response statuses. Because scanning context, traffic patterns, and exclusion rules can influence any side-by-side, this was treated as a very rough benchmark rather than a strictly controlled bake-off.
“Both tools got the same target and the same window. We didn’t stage anything special, other than setting up NTA,” Paul noted. “We then normalized the results from both tools and validated what each list produced to see how many endpoints actually returned 200s and how much noise we’d have to sift out afterwards.”
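For readers who want a feel for what that validation pass involves, here’s a minimal sketch in Python. The base URL, auth header, and use of plain GET requests are assumptions for illustration, not the exact script our team ran:

```python
# Minimal sketch of the validation pass described above; the endpoint list, base URL, and
# auth header are placeholders, and real validation may need per-endpoint methods and bodies.
import requests

BASE_URL = "https://app.example.com"            # hypothetical target
HEADERS = {"Authorization": "Bearer <token>"}   # placeholder credentials

def normalize(endpoint: str) -> str:
    """Strip query strings and trailing slashes so both tools' outputs compare cleanly."""
    return endpoint.split("?", 1)[0].rstrip("/").lower()

def validate(endpoints):
    """Bucket each discovered endpoint by its HTTP response status."""
    buckets = {"valid_200": [], "false_positive_404": [], "investigate": []}
    for path in sorted({normalize(e) for e in endpoints}):
        status = requests.get(BASE_URL + path, headers=HEADERS, timeout=10).status_code
        if status == 200:
            buckets["valid_200"].append(path)
        elif status == 404:
            buckets["false_positive_404"].append(path)
        else:
            buckets["investigate"].append(path)
    return buckets
```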
Results at a glance
Across the test window, Invicti’s discovery with NTA produced a larger and cleaner set of endpoints that were ready for security testing. Here are the full results:
| | Invicti | Cloudflare |
|---|---|---|
| Validated endpoints (HTTP status 200) | 317 | 72 |
| Definite false positives (HTTP status 404) | 14 | 80 |
| For investigation (HTTP statuses other than 200 or 404) | 69 | 104 |
| Total endpoints detected | 400 | 256 |
Even though this wasn’t a rigorous test, two things were immediately clear from the numbers. First, Invicti’s NTA found over 50% more endpoints (400 vs. 256). Second, most of Invicti’s discovery results were valid and immediately usable while most of Cloudflare’s weren’t: over 79% of the endpoints discovered by Invicti NTA returned HTTP 200 OK, compared to only 28% of Cloudflare findings.
“The signal really stood out,” Paul said. “Invicti found more unique endpoints and far more that returned 200 OK during validation, with far fewer 404s. In practice, that means less cleanup for our team and faster time to actual testing.”
Again, this isn’t a winner/loser scenario because the two approaches are fundamentally different (and also because we were testing our own product). Crucially, the endpoint sets from both products weren’t identical. Cloudflare did discover a meaningful set of unique endpoints that Invicti didn’t hit during its test run, which is consistent with its passive, edge-first vantage point.
Edge-based API discovery fills in gaps
Cloudflare’s edge telemetry can see traffic that a DAST session might not access and test in a given run, especially if certain workflows weren’t triggered or if user-driven paths were quiet during the test window. That’s why our internal conclusion was to cross-review the Cloudflare-identified endpoints to maximize coverage and learn from any gaps while recognizing that a strict one-to-one metric match is unrealistic across different methods.
“Cloudflare’s view highlighted a few endpoints we weren’t hitting that day,” Paul said. “That’s exactly the kind of feedback loop we want: use edge hints to enrich the DAST target list, then validate and test.”
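In practice, that feedback loop can be as simple as a set difference between the two normalized endpoint lists. The sketch below (reusing the hypothetical normalize() helper from the validation example) is illustrative rather than a prescribed workflow:

```python
# Hedged sketch: endpoints seen only at the edge become candidates for the next DAST run.
def edge_only_endpoints(dast_endpoints, edge_endpoints):
    """Return endpoints the edge tool observed that the DAST run never exercised."""
    dast = {normalize(e) for e in dast_endpoints}   # normalize() as in the validation sketch
    edge = {normalize(e) for e in edge_endpoints}
    return sorted(edge - dast)

# These candidates are then validated and, if they respond, added to the scan scope.
```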
DAST-based API discovery drives action
Our informal experiment showed first-hand that Invicti’s NTA for API discovery works well and lets our own security team act on results more efficiently. More generally, DAST-integrated API discovery provides a high-value starting point for triage and testing. When discovery is part of DAST, you get endpoints your security scanner can exercise under authentication, governed by safety rules in production and immediately ready for vulnerability testing with minimal noise.
“Discovery on its own is just inventory. Discovery inside DAST becomes action,” Paul noted. “Because the endpoints we find with Invicti are the ones we can test right away, we can turn those lists into findings and then into fixes.”
Invicti’s whole platform is built around a DAST-first philosophy: focus on runtime realities and confirmed, exploitable risk, then use DAST as the verification layer for everything else. Combining DAST with discovery and AST inputs in a single view helps organizations secure what actually matters and do it efficiently.
From a coverage perspective, it’s important to note that the NTA we tested is only one part of the picture. Invicti provides multiple ways to build up an API inventory, with zero-config spec discovery, integrations to sync definitions, and traffic analysis with NTA to reconstruct API definitions from observed calls. This approach lets teams combine developer-provided specs with discovery and then test the whole set using the same high-accuracy checks.
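As a simplified illustration of that combination, merging a developer-provided OpenAPI definition with discovered endpoints into one testable inventory might look like the sketch below; the file handling and structure are assumptions, not a specific Invicti workflow:

```python
# Simplified sketch of combining a developer-provided OpenAPI spec with discovered endpoints;
# file names and structure are illustrative only.
import json

def spec_paths(spec_file: str) -> set:
    """Collect the paths declared in an OpenAPI (JSON) definition."""
    with open(spec_file) as f:
        spec = json.load(f)
    return set(spec.get("paths", {}))

def combined_inventory(spec_file: str, discovered: list) -> list:
    """Union of documented and observed endpoints, ready to feed into testing."""
    return sorted(spec_paths(spec_file) | {normalize(p) for p in discovered})
```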
Practical takeaways for AppSec leaders
What started as a simple “let’s see what happens” scenario for internal use helped us tighten up our own security. The broader practical takeaway is that if your priority is reducing risk quickly and measurably, Invicti’s DAST-first approach includes API discovery that flows directly into validated testing, not just a bigger spreadsheet to check later. Edge-level discovery using Cloudflare or a similar tool still provides a useful complementary signal to catch stray or legacy activity, but you should drive your remediation work from a list you can test under auth with minimal false positives.
“The practical win for us as a security team was simple,” Paul Good concluded. “DAST-based discovery produced a clean, testable API inventory we could act on immediately, without losing the ability to learn from additional edge signals.”
If you’d like to see how Invicti’s DAST-based API discovery and testing can streamline your AppSec program, schedule a working session with our technical team. We’ll show you how application and API discovery flows into vulnerability testing and reporting, and how to integrate all this into your CI/CD for production-safe scanning at the speed of development.