How We Test VPNs

Our independent, repeatable methodology across speed, privacy, streaming, gaming, and censorship.

Independent buys · No paid placements · Reproducible tests · Versioned methodology
Methodology v1.0

At Global VPN Access, we believe trust starts with transparency. That’s why we publish the exact methods we use to test every VPN provider. Our approach is independent, reproducible, and user-focused. We don’t accept paid placements, and we buy every subscription anonymously—just like you would.

Our Testing Principles

  • Independent: We purchase all subscriptions ourselves. No free accounts, no vendor edits.
  • Reproducible: Same scripts, hardware profiles, regions, and time windows each round.
  • User-first: Scores adjust by intent (streaming, privacy, gaming, value, etc.).
  • Transparent: Raw data and methodology versions are published openly with a changelog.

Lab Setup & Controls

  • Hardware: Windows, macOS (including Apple silicon), iOS/Android, Ubuntu LTS.
  • Networks: 1 Gbps fiber (NA/EU) + 300–500 Mbps (APAC). Control run (no VPN) every session.
  • Regions tested: US-East, US-West, Germany, Netherlands, UK, Singapore.
  • Timing: Peak vs. off-peak to capture congestion effects.
  • App versions: Locked and logged for each run (OS, app, protocol).
  • Account hygiene: New emails and a unique payment method per provider; tested on default app settings as well as advanced protocol profiles, for fair comparison.
  • Version locking: Toolchain checksums and run logs stored with each dataset.
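As a sketch of the version-locking step, a run manifest can pair OS details with a SHA-256 checksum of each tool binary so the dataset records exactly what produced it. This is an illustrative helper (the function name, fields, and paths are our own invention, not the production tooling):

```python
import hashlib
import json
import platform
from pathlib import Path

def run_manifest(tool_paths: list[str], extra: dict) -> dict:
    """Record the OS plus a SHA-256 per tool binary, stored alongside each dataset.

    `extra` carries run-specific fields (app version, protocol, region, etc.).
    """
    return {
        "os": platform.platform(),
        "tools": {p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
                  for p in tool_paths},
        **extra,
    }

# The manifest serializes cleanly to JSON for archival with the run logs:
# json.dumps(run_manifest(["/usr/local/bin/vpncli"], {"protocol": "WireGuard"}))
```

Checksumming the binaries (rather than trusting reported version strings) catches silent client updates between rounds.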

What We Measure (Test Matrix)

Performance

  • Throughput: Download/Upload averages + 95% confidence intervals across 5 locations × 3 times-of-day.
  • Latency & Jitter: Ping to regional CDNs + in-game RTT (where applicable).
  • Stability: 20-minute sustained transfers, packet loss %, reconnect behavior.
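To illustrate the throughput aggregation, replicate runs can be reduced to a mean plus a 95% confidence interval as below. This is a simplified sketch using a normal (1.96) approximation rather than a t table, and the sample values in the usage note are invented, not measured:

```python
import math
import statistics

def aggregate_throughput(samples_mbps: list[float]) -> tuple[float, tuple[float, float]]:
    """Mean and an approximate 95% confidence interval for replicate runs.

    Uses 1.96 * standard error; with the full 5 locations x 3 times-of-day
    grid (n=15), a t critical value would widen the interval slightly.
    """
    n = len(samples_mbps)
    mean = statistics.mean(samples_mbps)
    sd = statistics.stdev(samples_mbps)  # sample standard deviation
    half_width = 1.96 * sd / math.sqrt(n)
    return mean, (mean - half_width, mean + half_width)

# e.g. aggregate_throughput([480, 510, 495, 502, 488]) -> mean 495.0 Mbps
```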

Privacy & Security

  • IPv4/IPv6/DNS/WebRTC leak checks and kill-switch integrity (forced drop tests).
  • Default protocols & cryptography (PFS, key lengths), RAM-only claims, infra ownership.
  • No-logs claims, third-party audits (existence/date/scope), jurisdiction/ownership changes.
  • App permissions audit on mobile.

Streaming & Unblocking

  • Netflix libraries (US/UK/JP), Disney+, Prime Video, Hulu, and BBC iPlayer, each validated with two titles per service.
  • Playback reliability, buffering, CDN region detected, resolution achieved (1080p/4K).

Gaming

  • Median/95th percentile RTT to major regions; jitter thresholds.
  • DDoS mitigations when offered; NAT type, port-forwarding availability, CGNAT notes.
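The latency statistics above can be sketched as follows: median and 95th-percentile RTT from a ping series, with jitter taken as the mean absolute difference between consecutive pings (a simplification of RFC 3550's interarrival jitter; the sample values in the test are invented):

```python
import statistics

def rtt_summary(rtt_ms: list[float]) -> dict:
    """Median/p95 round-trip time plus a simple jitter estimate."""
    # quantiles(n=20) returns 19 cut points; index 18 is the 95th percentile.
    p95 = statistics.quantiles(rtt_ms, n=20, method="inclusive")[18]
    jitter = statistics.mean(abs(b - a) for a, b in zip(rtt_ms, rtt_ms[1:]))
    return {
        "median_ms": statistics.median(rtt_ms),
        "p95_ms": p95,
        "jitter_ms": jitter,
    }
```

Reporting the 95th percentile alongside the median matters for gaming: a low median with occasional 200 ms spikes feels worse than a steady, slightly higher RTT.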

Torrenting & P2P

  • P2P policy (allowed/limited/forbidden), port-forwarding support, swarm speeds.
  • IP visibility checks and legal-use reminder.

Censorship Resistance

  • Obfuscation availability (automatic/opt-in), protocol types (e.g., TLS camouflage).
  • Accessibility checks via partner vantage points where legally permissible.

Usability & Support

  • Features: split tunneling, multihop, dedicated IP, ad/malware filtering, auto-connect rules.
  • Onboarding/UI: install time, defaults clarity, error states, crash rate.
  • Support: live chat/email response times, KB quality, refund flow friction.

Value

  • Effective price (public price vs. durable deal), devices per plan, refund policy clarity.
  • Ownership & risk: M&A history, parent entity transparency.

Scoring Framework (Intent-Aware)

We publish a composite score plus intent-specific scores that re-weight categories for real use cases.

Preset                 Privacy  Performance  Usability  Value  Support  Streaming  Gaming  Torrenting
General Use            30%      25%          15%        15%    5%       5%         3%      2%
Streaming              15%      25%          10%        10%    5%       35%        —       —
Gaming                 15%      40%          10%        10%    —        —          25%     —
Privacy                50%      15%          10%        10%    5%       10%        —       —
Cheapest (Value-led)   15%      20%          15%        40%    10%      —          —       —

Each row sums to 100%; "—" means the category carries no weight in that preset.

We publish the exact weight table on this page and version it in the changelog.
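Concretely, an intent-specific score is just a weighted average of the 0–100 category scores. The sketch below uses the General Use weights from the table; the category scores in the test are invented for illustration:

```python
# Weights for the General Use preset (from the table above), as fractions.
GENERAL_USE_WEIGHTS = {
    "privacy": 0.30, "performance": 0.25, "usability": 0.15, "value": 0.15,
    "support": 0.05, "streaming": 0.05, "gaming": 0.03, "torrenting": 0.02,
}

def composite_score(category_scores: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted average of 0-100 category scores under one preset."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(category_scores[cat] * w for cat, w in weights.items())
```

Swapping in a different preset's weight row re-ranks providers without re-scoring any category, which is what keeps the intent-aware scores consistent with the composite.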

See how this methodology ranks our top providers: NordVPN, Surfshark, ExpressVPN and more.

Data Collection & Updates

  • Automation: Scripted runs (CLI where available) + GUI macros; results to JSON/CSV.
  • Replicates: 3× per scenario; documented outlier handling.
  • Cross-checks: Independent speed endpoints, DNS providers, IP intel databases.
  • Human verification gates: Streaming playback confirmed with title-level checks and screenshots (stored, not published by default).
  • Routine cadence: Monthly for top brands; quarterly for full catalog.
  • Hotfix triggers: Major app/protocol releases, audits, ownership changes, block/unblock events.
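As one example of documented outlier handling, a robust median/MAD rule can flag anomalous replicate runs without letting a single spike drag the mean. This is an illustrative sketch, assuming a cutoff of k = 3 robust deviations (the exact rule and cutoff used in a given round are recorded in that round's run logs):

```python
import statistics

def drop_outliers(samples: list[float], k: float = 3.0) -> list[float]:
    """Drop runs more than k robust deviations from the median (MAD rule).

    1.4826 scales the median absolute deviation to the standard deviation
    under a normality assumption; k=3.0 is the cutoff this sketch assumes.
    """
    med = statistics.median(samples)
    mad = statistics.median(abs(x - med) for x in samples)
    if mad == 0:
        return list(samples)  # no spread: nothing to flag
    return [x for x in samples if abs(x - med) <= k * 1.4826 * mad]
```

Dropped runs are documented rather than silently discarded, so a reader can recompute any published average from the raw CSV.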

Download Latest Results

We mirror our datasets across multiple hosts for reliability.

Datasets are available as CSV and JSON (served via GitHub Raw), with additional mirrors listed on this page.

Disclosures

  • We earn affiliate commissions from some providers if you buy through our links.
  • Affiliates never affect test order, scores, or placement. Providers cannot buy rankings or whitelisting.
  • Any unavoidable gifts/perks (e.g., event swag) are declined or disclosed in the changelog.

Changelog

  • 2025-08-26 (v1.0): Initial publication of methodology and scoring presets.