2026-05-04·14 min read·sota.io team

Most developers think of the Cyber Resilience Act as a "product security" regulation — something about CE marking, technical documentation, and vulnerability disclosure. They are right. But the CRA has a supply-chain dimension that is far more immediately actionable: every package you pull from npm, PyPI, Maven Central, or crates.io is a potential CRA liability under Art.9.

In March 2026, ENISA published its Secure Use of Package Managers Technical Advisory FINAL — a document that translates directly into CRA compliance obligations for software developers. This guide maps the ENISA recommendations to the specific CRA articles they address, and gives you the exact commands and configurations to implement them across the four major package manager ecosystems.

Why this matters right now: CRA Art.14 (vulnerability reporting) and Art.13 (security updates) enforcement begins September 11, 2026 — 130 days away. Your supply-chain posture is a direct input to both.


What the ENISA Advisory Actually Covers

The ENISA Secure Package Manager Advisory FINAL addresses a specific attack surface: the moment your build system resolves, downloads, and installs a dependency.

The advisory covers three categories of control:

| Control Category | ENISA Term | CRA Mapping |
| --- | --- | --- |
| Source verification | Registry integrity | Art.9(2) — component authenticity |
| Version locking | Reproducible builds | Art.9(1) — known vulnerability avoidance |
| Continuous monitoring | Vulnerability tracking | Art.13(1) — security update provision |

The CRA Articles That Package Managers Affect

CRA Art.9 — Due Diligence for Third-Party Components

Art.9 is the CRA's software supply-chain clause. It requires manufacturers (that includes you as a software product developer) to:

"exercise due diligence when integrating components from third parties, including open-source software"

What "due diligence" means in practice under the ENISA advisory:

  1. Known vulnerability check before integration — you must check whether a dependency has known CVEs before adding it to your project, not just during CI
  2. Source authenticity verification — you must be able to prove the package came from where you think it came from (registry integrity + package signature)
  3. License and provenance documentation — relevant for SBOM requirements under Art.9(3)

The ENISA advisory operationalizes this as: use lockfiles with integrity hashes, verify signatures where available, and maintain an SBOM that is updated on every build.
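The integrity hashes in a lockfile are Subresource Integrity (SRI) strings: an algorithm name plus a base64-encoded digest of the package tarball. A minimal sketch of the check a package manager performs on install (function names are illustrative, not npm internals):

```python
import base64
import hashlib

def sri_integrity(data: bytes, algorithm: str = "sha512") -> str:
    """Compute an SRI-style integrity string like those in package-lock.json."""
    digest = hashlib.new(algorithm, data).digest()
    return f"{algorithm}-{base64.b64encode(digest).decode()}"

def tarball_matches_lockfile(tarball: bytes, expected: str) -> bool:
    """Recompute the digest of a downloaded tarball and compare it to the lockfile entry."""
    algorithm = expected.split("-", 1)[0]
    return sri_integrity(tarball, algorithm) == expected
```

A mismatch means the artifact served by the registry is not the one originally resolved, which is exactly the tampering case Art.9(2) asks you to detect.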

CRA Art.13 — Vulnerability Handling and Security Updates

Art.13 requires you to handle vulnerabilities in your software throughout its supported lifecycle. Third-party dependencies are explicitly included — a CVE in a package you ship is a CVE in your product.

ENISA maps this to its continuous-monitoring controls: track vulnerabilities across the entire dependency tree, and ship security updates for affected components without delay.

CRA Annex I — Security Requirements

Annex I Part I, Requirement 1 states that products must be "delivered without any known exploitable vulnerabilities". Annex I Part II, Requirement 6 requires manufacturers to "address and remediate vulnerabilities without delay".

Both directly apply to package manager hygiene.


ENISA Recommendation → CRA Action Mapping

The advisory lists 23 concrete recommendations. Here are the highest-impact ones mapped to CRA obligations:

Recommendation 1: Use Lockfiles with Hash Verification

ENISA: "Always commit lockfiles that include cryptographic hashes of resolved packages. Treat unlocked dependency resolution as a security risk."

CRA Relevance: Art.9(2) — establishes authenticity of integrated components

npm:

# package-lock.json is automatically generated — never .gitignore it
# Verify integrity on install
npm ci  # Uses lockfile exactly, fails if lockfile is out of sync
# NOT: npm install (can silently update lockfile)

pip:

# Generate a pinned requirements file with hashes
pip-compile --generate-hashes --output-file requirements.txt requirements.in
# Install with hash verification
pip install --require-hashes -r requirements.txt

Maven:

<!-- Use the Reproducible Builds plugin for lockfile-equivalent pinning -->
<plugin>
  <groupId>io.github.zlika</groupId>
  <artifactId>reproducible-build-maven-plugin</artifactId>
  <version>0.16</version>
</plugin>
<!-- Pin exact versions in pom.xml — avoid version ranges like [1.0,2.0) -->

Cargo:

# Cargo.lock is auto-generated — commit it (it IS your lockfile)
# Verify the lockfile is up to date; fails if Cargo.lock would change
cargo check --locked
# Always commit Cargo.lock for binaries; for libraries it is optional but increasingly recommended

Recommendation 2: Verify Package Signatures Where Available

ENISA: "Enable and enforce package signature verification for registries that support it. Treat unsigned packages from critical dependencies as higher-risk."

CRA Relevance: Art.9(2) — source authenticity

# npm — verify signatures (npm 9+)
npm audit signatures

# pip — PyPI supports PEP 740 attestations (Sigstore-based, 2024+),
# but pip does not yet verify them at install time; enforce hashes instead
pip install --require-hashes -r requirements.txt

# Maven — verify PGP signatures of dependencies with the pgpverify plugin
mvn org.simplify4u.plugins:pgpverify-maven-plugin:check

Recommendation 3: Separate Public and Private Registry Traffic

ENISA: "Configure package managers to use scoped namespaces for internal packages and prevent fallback to public registries for private package names."

CRA Relevance: Prevents dependency confusion → Art.9(1) known vulnerability avoidance

npm — prevent dependency confusion:

# .npmrc — pin internal packages to private registry, block public fallback
@mycompany:registry=https://npm.mycompany.internal
always-auth=true

// package.json — explicitly scope all internal packages
// (a scoped name resolves only from the registry configured for its scope)
{
  "dependencies": {
    "@mycompany/auth-lib": "^2.1.0"
  }
}

pip — private index with fallback disabled:

# pip.conf
[global]
index-url = https://pypi.mycompany.internal/simple/
# DO NOT set extra-index-url if you have private packages — 
# pip will check both and the public one wins on name collision
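A configuration like this is easy to regress in a later edit, so it is worth linting automatically. A small sketch (paths and wording are our own) that flags the two dependency-confusion patterns in a pip.conf:

```python
import configparser

def audit_pip_conf(path: str) -> list:
    """Flag pip configuration patterns that enable dependency confusion."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    findings = []
    if cfg.has_option("global", "extra-index-url"):
        findings.append("extra-index-url is set: a public index can shadow private package names")
    if not cfg.has_option("global", "index-url"):
        findings.append("no explicit index-url: pip falls back to the public pypi.org index")
    return findings
```

Run it against every pip.conf checked into your repositories; an empty list means the registry isolation pattern above is intact.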

Maven — repository order matters:

<repositories>
  <repository>
    <id>internal</id>
    <url>https://nexus.mycompany.internal/repository/maven-public/</url>
    <!-- Set releases/snapshots to true, external public repos to false -->
  </repository>
</repositories>

Recommendation 4: Automate Vulnerability Scanning in CI

ENISA: "Integrate automated vulnerability scanning into every CI pipeline run. Block merges when critical CVEs are detected in the dependency tree."

CRA Relevance: Art.13(1) — proactive vulnerability identification

# GitHub Actions example — runs on every PR
name: Dependency Security Scan
on: [pull_request, push]

jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      
      # npm
      - name: npm audit
        run: npm audit --audit-level=high
      
      # pip
      - name: pip safety check
        run: pip install safety && safety check -r requirements.txt
      
      # Maven
      - name: OWASP Dependency Check
        run: mvn org.owasp:dependency-check-maven:check -DfailBuildOnCVSS=7
      
      # Cargo
      - name: cargo audit
        run: cargo install cargo-audit && cargo audit

For CRA Art.13 compliance, you need evidence that this scanning happens — CI logs with timestamps are sufficient documentation.
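One low-effort way to make that evidence durable is to write each scan result into a timestamped artifact attached to the release. A sketch with hypothetical directory and naming conventions:

```python
import datetime
import json
from pathlib import Path

def record_evidence(check_name: str, result: dict, outdir: str = "compliance-evidence") -> Path:
    """Persist a scan result as a timestamped JSON artifact for the CRA technical file."""
    directory = Path(outdir)
    directory.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = directory / f"{check_name}-{stamp}.json"
    path.write_text(json.dumps({"check": check_name, "captured_at": stamp, "result": result}, indent=2))
    return path
```

Calling `record_evidence("npm-audit", json.loads(audit_output))` after each CI scan builds the timestamped trail that Art.13 documentation asks for.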

Recommendation 5: Generate and Maintain an SBOM on Every Build

ENISA: "Generate a Software Bill of Materials (SBOM) in CycloneDX or SPDX format as part of your build process. Update it on every dependency change."

CRA Relevance: Art.9(3) references SBOM-equivalent documentation; Annex II requires technical documentation including component lists

# npm — CycloneDX SBOM
npx @cyclonedx/cyclonedx-npm --output-format json --output-file sbom.json

# pip — CycloneDX SBOM
pip install cyclonedx-bom
cyclonedx-py -e --format json --output sbom.json

# Maven — CycloneDX plugin
mvn org.cyclonedx:cyclonedx-maven-plugin:makeAggregateBom

# Cargo — cargo sbom
cargo install cargo-sbom
cargo sbom > sbom.cdx.json

Store SBOM artifacts alongside your release artifacts. The CRA technical documentation (Annex II) requires a list of all components — your SBOM is that list.
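Before filing an SBOM in the technical documentation, it is worth a sanity check that it actually works as a component list. A minimal sketch for CycloneDX JSON (field names follow the CycloneDX schema; the checks themselves are our own convention):

```python
import json

def check_cyclonedx(sbom_text: str) -> dict:
    """Sanity-check a CycloneDX JSON SBOM as an Annex II component list."""
    sbom = json.loads(sbom_text)
    components = sbom.get("components", [])
    return {
        "is_cyclonedx": sbom.get("bomFormat") == "CycloneDX",
        "component_count": len(components),
        "missing_versions": [c.get("name", "?") for c in components if not c.get("version")],
    }
```

A non-empty `missing_versions` list means the SBOM cannot be matched against CVE databases for those entries, which undermines its Art.13 value.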


The Dependency Confusion Attack in CRA Terms

ENISA dedicates a full section to dependency confusion — and it is the clearest example of an Art.9 failure mode.

How dependency confusion works:

  1. Your company has an internal package called mycompany-utils (not published to npm)
  2. An attacker publishes a package named mycompany-utils to npm with a higher version number
  3. Your CI resolves mycompany-utils from npm instead of your internal registry
  4. The attacker's package runs malicious code during postinstall

Under CRA Art.9, this is your liability. You failed to exercise due diligence in verifying the authenticity of an integrated component.

ENISA's mitigations (all CRA Art.9 compliant):

| Mitigation | Implementation | CRA Effect |
| --- | --- | --- |
| Namespace scoping | @mycompany/ prefix for all internal packages | Prevents name collision with public registry |
| Registry isolation | Private registry with no public fallback | Eliminates lookup-order attack vector |
| Allowlist-only policy | Block unlisted packages at registry level | Prevents opportunistic typosquatting |
| Build-time verification | Hash check on every install | Detects tampered packages post-publication |
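Namespace scoping only helps if your internal names are not also claimed (or claimable) on the public registry. A sketch that probes the public npm registry for collisions; the network call makes it illustrative rather than CI-ready, and all function names are our own:

```python
import urllib.error
import urllib.parse
import urllib.request

PUBLIC_REGISTRY = "https://registry.npmjs.org"

def registry_metadata_url(package_name: str) -> str:
    """Public-registry metadata URL for a package (scoped names keep their @scope/)."""
    return f"{PUBLIC_REGISTRY}/{urllib.parse.quote(package_name, safe='@/')}"

def exists_on_public_registry(package_name: str) -> bool:
    """True if the name already resolves publicly: a dependency-confusion candidate."""
    try:
        with urllib.request.urlopen(registry_metadata_url(package_name), timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        return err.code != 404  # 404 means unclaimed; fail closed on anything else

def confusion_candidates(internal_names: list) -> list:
    """Internal package names that collide with public ones."""
    return [name for name in internal_names if exists_on_public_registry(name)]
```

Any name returned by `confusion_candidates` is either already squatted or an internal package you accidentally published; both cases warrant an Art.9 review.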

Transitive Dependencies: The CRA Blind Spot

Most developers review their direct dependencies. CRA Art.9 requires due diligence on the entire dependency tree — including transitive dependencies you never chose.

The scale of the problem:

# See your full dependency tree depth
npm ls --all 2>/dev/null | wc -l
# A typical React app: 800-2000 packages in the transitive tree
# A typical Python data science project: 200-600 packages

ENISA's recommendation: You cannot manually review every transitive dependency. Automate the risk reduction:

  1. Prefer shallow dependency trees — choose libraries with few transitive dependencies when evaluating alternatives
  2. Pin at all levels — lockfiles capture transitive versions, preventing surprise updates
  3. Score dependencies by criticality: socket.io at the transport layer has a higher blast radius than a formatting utility
  4. Monitor for new transitive CVEs — tools like Dependabot, Renovate, and Snyk alert on transitive CVE introductions

# Python: check transitive dependency count before adding a package
# pipdeptree prints the full dependency tree
pip install pipdeptree
pipdeptree --packages requests
# requests 2.31.0 → certifi, charset-normalizer, idna, urllib3
# 4 transitive deps — low risk
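For packages already installed in an environment, the Python standard library can approximate the same tree without extra tooling. A sketch using importlib.metadata (the requirement-string parsing is deliberately simplified and skips optional extras):

```python
import re
from importlib import metadata

def dist_name(requirement: str) -> str:
    """Extract the bare distribution name from a Requires-Dist string."""
    return re.split(r"[\s;<>=!~\[\(]", requirement.strip(), maxsplit=1)[0]

def transitive_deps(package: str, seen=None) -> set:
    """Approximate the transitive dependency closure of an installed package."""
    seen = set() if seen is None else seen
    try:
        requirements = metadata.requires(package) or []
    except metadata.PackageNotFoundError:
        return seen  # not installed here; treat as a leaf
    for req in requirements:
        if "extra ==" in req:
            continue  # skip optional-extra dependencies
        name = dist_name(req)
        if name and name not in seen:
            seen.add(name)
            transitive_deps(name, seen)
    return seen
```

`len(transitive_deps("requests"))` gives a quick blast-radius number for the criticality scoring described above.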

CRA Art.14 Supply-Chain Incident Reporting

Starting September 11, 2026, CRA Art.14 requires you to report actively exploited vulnerabilities to ENISA within 24 hours, and provide a full vulnerability report within 72 hours.

Supply-chain incidents that trigger Art.14 include an actively exploited CVE in a dependency you ship and a compromised dependency version that reaches your product.

ENISA's guidance: the "actively exploited" threshold means you need a monitoring channel to know when your dependencies are being targeted. The CISA KEV catalog and the OSV.dev database are two such channels:

# Query OSV.dev, which aggregates multiple advisory databases (GHSA, PyPA, RustSec, and others)
# Maven packages are identified as groupId:artifactId
curl -s "https://api.osv.dev/v1/query" \
  -H "Content-Type: application/json" \
  -d '{"package": {"name": "org.apache.logging.log4j:log4j-core", "ecosystem": "Maven"}}' | \
  jq '.vulns[].id'
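The same API has a batch endpoint (`/v1/querybatch`), which fits a pinned requirements file naturally. A sketch that builds the batch payload from pinned lines (the parsing handles only simple `name==version` pins; helper names are our own):

```python
import re

OSV_BATCH_URL = "https://api.osv.dev/v1/querybatch"

def build_osv_batch(requirements_text: str, ecosystem: str = "PyPI") -> dict:
    """Build an OSV.dev batch query from name==version pins in a requirements file."""
    queries = []
    for raw_line in requirements_text.splitlines():
        line = raw_line.split("#", 1)[0].strip()  # drop comments
        match = re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==([0-9][A-Za-z0-9.!+-]*)", line)
        if match:
            queries.append({
                "package": {"name": match.group(1), "ecosystem": ecosystem},
                "version": match.group(2),
            })
    return {"queries": queries}
```

POST the JSON-encoded payload to `OSV_BATCH_URL`; each entry in the response's `results` list lines up positionally with your queries.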

Python CRASupplyChainScanner — Verify Your Posture

import subprocess
import json
import sys
from pathlib import Path

class CRASupplyChainScanner:
    """
    Checks your package manager setup for CRA Art.9 compliance.
    Run before your next release to document due diligence.
    """
    
    def check_npm(self, project_dir: str = ".") -> dict:
        results = {"ecosystem": "npm", "findings": [], "cra_score": 0}
        p = Path(project_dir)
        
        # Check 1: Lockfile exists and is committed
        lockfile = p / "package-lock.json"
        if lockfile.exists():
            results["findings"].append("✓ package-lock.json exists")
            results["cra_score"] += 20
        else:
            results["findings"].append("✗ No package-lock.json — CRA Art.9(2) risk")
        
        # Check 2: npm ci used (not npm install)
        # Check for CI config files
        ci_files = list(p.glob(".github/workflows/*.yml")) + list(p.glob(".gitlab-ci.yml"))
        npm_ci_used = any("npm ci" in f.read_text() for f in ci_files if f.exists())
        if npm_ci_used:
            results["findings"].append("✓ npm ci used in CI (lockfile-strict install)")
            results["cra_score"] += 20
        else:
            results["findings"].append("⚠ Consider using 'npm ci' instead of 'npm install' in CI")
        
        # Check 3: npm audit
        try:
            result = subprocess.run(
                ["npm", "audit", "--json"],
                capture_output=True, text=True, cwd=project_dir
            )
            audit_data = json.loads(result.stdout)
            critical = audit_data.get("metadata", {}).get("vulnerabilities", {}).get("critical", 0)
            high = audit_data.get("metadata", {}).get("vulnerabilities", {}).get("high", 0)
            if critical == 0 and high == 0:
                results["findings"].append("✓ No critical/high vulnerabilities in npm audit")
                results["cra_score"] += 30
            else:
                results["findings"].append(f"✗ npm audit: {critical} critical, {high} high CVEs — CRA Art.13 action required")
        except Exception as e:
            results["findings"].append(f"⚠ Could not run npm audit: {e}")
        
        # Check 4: .npmrc exists with registry config
        npmrc = p / ".npmrc"
        if npmrc.exists():
            results["findings"].append("✓ .npmrc found (registry configuration present)")
            results["cra_score"] += 15
        
        # Check 5: SBOM exists
        sbom_files = list(p.glob("sbom.json")) + list(p.glob("sbom.cdx.json")) + list(p.glob("*.cdx.json"))
        if sbom_files:
            results["findings"].append("✓ SBOM artifact found — CRA Annex II documentation ready")
            results["cra_score"] += 15
        else:
            results["findings"].append("⚠ No SBOM found — add CycloneDX generation to build pipeline")
        
        return results
    
    def check_pip(self, project_dir: str = ".") -> dict:
        results = {"ecosystem": "pip", "findings": [], "cra_score": 0}
        p = Path(project_dir)
        
        # Check pinned requirements with hashes
        req_file = p / "requirements.txt"
        if req_file.exists():
            content = req_file.read_text()
            if "--hash=sha256" in content:
                results["findings"].append("✓ Hash-pinned requirements.txt — CRA Art.9(2) compliant")
                results["cra_score"] += 35
            elif "==" in content:
                results["findings"].append("⚠ Version-pinned but no hashes — add pip-compile --generate-hashes")
                results["cra_score"] += 15
            else:
                results["findings"].append("✗ Unpinned requirements — CRA Art.9 risk")
        
        # Check for safety or pip-audit in CI
        ci_files = list(p.glob(".github/workflows/*.yml"))
        safety_used = any(
            "safety" in f.read_text() or "pip-audit" in f.read_text() 
            for f in ci_files if f.exists()
        )
        if safety_used:
            results["findings"].append("✓ Vulnerability scanning in CI (safety/pip-audit)")
            results["cra_score"] += 35
        else:
            results["findings"].append("⚠ Add 'safety check' or 'pip-audit' to CI pipeline")
        
        return results
    
    def generate_cra_report(self, project_dir: str = ".") -> str:
        npm_results = self.check_npm(project_dir)
        pip_results = self.check_pip(project_dir)
        
        report = ["# CRA Supply-Chain Posture Report", ""]
        report.append(f"**Generated:** {__import__('datetime').date.today()}")
        report.append(f"**CRA Relevance:** Art.9, Art.13, Art.14, Annex I, Annex II")
        report.append("")
        
        for results in [npm_results, pip_results]:
            score = results["cra_score"]
            grade = "COMPLIANT" if score >= 70 else "PARTIAL" if score >= 40 else "AT RISK"
            report.append(f"## {results['ecosystem'].upper()} — Score: {score}/100 ({grade})")
            for finding in results["findings"]:
                report.append(f"- {finding}")
            report.append("")
        
        return "\n".join(report)


if __name__ == "__main__":
    scanner = CRASupplyChainScanner()
    print(scanner.generate_cra_report(sys.argv[1] if len(sys.argv) > 1 else "."))

Ecosystem-Specific CRA Compliance Gaps

npm / Node.js

| Gap | Risk Level | Why It Matters |
| --- | --- | --- |
| npm install in CI instead of npm ci | Medium | Lockfile can be silently updated by npm install |
| ^ version prefixes in package.json | Medium | ^1.2.0 allows 1.99.x — unpredictable transitive updates |
| No npm audit in CI | High | CVEs enter undetected — Art.13 failure |
| postinstall scripts not reviewed | High | Malicious packages can execute code during install |
| Public fallback for scoped packages | Critical | Dependency confusion vulnerability |

Disable postinstall scripts for untrusted packages:

npm install --ignore-scripts package-name
# Or globally in .npmrc:
# ignore-scripts=true
# (breaks packages that need build steps — evaluate per-project)

Python / pip

| Gap | Risk Level | Why It Matters |
| --- | --- | --- |
| requirements.txt without hashes | High | Any registry-side tamper is invisible |
| pip install without --require-hashes | High | Hash in file is meaningless if not enforced |
| extra-index-url with private packages | Critical | pip checks both registries, public wins on version |
| No pip-audit in CI | High | CVEs in transitive deps undetected |

# Safe pattern: use only the internal registry index
# NEVER combine index-url + extra-index-url if you have private packages
pip install --index-url https://pypi.mycompany.internal/simple/ \
            --require-hashes \
            -r requirements.txt

Maven / Java

| Gap | Risk Level | Why It Matters |
| --- | --- | --- |
| Version ranges [1.0,2.0) in pom.xml | Medium | Non-reproducible builds |
| Maven Central without signature verification | Medium | Build-time dependency substitution possible |
| No OWASP Dependency-Check plugin | High | CVE tracking requires this or equivalent |
| Snapshot versions in production | High | SNAPSHOT resolves differently per build |

<!-- Enable OWASP Dependency Check in every Maven build -->
<plugin>
  <groupId>org.owasp</groupId>
  <artifactId>dependency-check-maven</artifactId>
  <version>9.0.10</version>
  <configuration>
    <failBuildOnCVSS>7</failBuildOnCVSS>
    <formats>
      <format>HTML</format>
      <format>JSON</format> <!-- Machine-readable for CRA documentation -->
    </formats>
  </configuration>
</plugin>

Rust / Cargo

| Gap | Risk Level | Why It Matters |
| --- | --- | --- |
| Not committing Cargo.lock for binaries | High | Reproducibility lost — applications should always commit it |
| No cargo audit in CI | High | cargo audit checks against the RustSec Advisory DB |
| Yanked crates not detected | Medium | cargo update silently skips yanked versions; an old Cargo.lock keeps them |

# Comprehensive Cargo security in CI
cargo audit                    # Check against RustSec DB
cargo deny check              # Enforce license + advisory policies
cargo update --dry-run        # Check for available updates without applying

The 30-Item CRA Supply-Chain Checklist

Group your package manager compliance into four CRA-mapped categories:

Category A — Art.9 Component Authenticity (10 items)

Category B — Art.13 Vulnerability Handling (8 items)

Category C — Art.14 Incident Readiness (6 items)

Category D — CRA Annex I / II Documentation (6 items)


Practical Timeline: CRA Supply-Chain Readiness

| Deadline | CRA Requirement | Package Manager Action |
| --- | --- | --- |
| Now | Art.9 best practice | Implement lockfiles + integrity checks |
| Now | Art.13 continuous | Add CVE scanning to CI pipeline |
| Before July 2026 | Annex II documentation | Generate SBOM, document dependency process |
| Before Sept 11, 2026 | Art.14 enforcement live | Test incident response runbook for supply-chain scenarios |
| Ongoing | Art.13(3) updates | Dependency update SLA operational |

The ENISA advisory frames these as technical hygiene recommendations. Under the CRA, they are legal compliance requirements.


What the ENISA Advisory Means for Open-Source Components

CRA Art.9(5) creates a specific carve-out for open-source stewards — but it does not exempt you as a user of open-source packages.

The distinction: using requests, spring-boot, tokio, or react in your commercial SaaS product means you bear the due diligence responsibility for those components. The package maintainers are not responsible for your use of their packages.

ENISA's guidance for open-source dependencies:

  1. Check the package's own security posture: Does it have a security policy (SECURITY.md)? Does it sign releases?
  2. Review its CVE history: How quickly has the maintainer responded to past vulnerabilities?
  3. Assess abandonment risk: When was the last commit? Is it still actively maintained?
  4. Check for known compromise history: Has this package ever been taken over or injected?

# npm — check for known takeover risk indicators
npm view package-name | grep -E "version|author|homepage|time.modified"

# Python — check PyPI security metadata
pip index versions package-name
curl https://pypi.org/pypi/package-name/json | jq '.info.author, .info.project_urls'

Next Steps

The ENISA Secure Package Manager Advisory FINAL is a 40-page document. The CRA compliance value is in the actionable controls, not the reading:

  1. This week: Run npm audit, pip-audit, or cargo audit against your current codebase. Fix all Critical/High findings.
  2. This month: Implement lockfile-enforced installs in CI (npm ci, pip install --require-hashes). Configure automated CVE scanning.
  3. Before July 2026: Generate your first SBOM. Add it to your CRA technical file.
  4. Before September 11, 2026: Run a supply-chain incident response drill. Who handles a compromised transitive dependency?

The ENISA advisory exists because these controls are not yet standard practice. The CRA makes them mandatory. Start now.

CRA enforcement calendar for supply-chain: September 11, 2026 — Art.14 vulnerability reporting active. Art.13 security update obligations active. Art.9 due diligence is already technically applicable as of August 2, 2026 for products placed on the market.


Further reading: CRA Art.9 Third-Party Component Due Diligence Guide · ENISA Security by Design Playbook for CRA Annex I · sota.io — Deploy Your Software on EU Infrastructure with CRA-Compliant CI/CD

EU-Native Hosting

Ready to move to EU-sovereign infrastructure?

sota.io is a German-hosted PaaS — no CLOUD Act exposure, no US jurisdiction, full GDPR compliance by design. Deploy your first app in minutes.