IAF Recognition for Software & Digital Tools
IAF recognition highlights software products, digital platforms and toolchains whose secure development, functional validation, data governance and field performance meet robust verification criteria across the mobility ecosystem. Listings are issued with a precise scope statement and sustained through periodic surveillance.
Who Can Apply?
Open to suppliers and in-house teams delivering software or digital services that influence vehicle functions, diagnostics, data flows or safety-relevant decisions. Recognition focuses on SDLC discipline, security, validation & simulation, and operational governance. Typical applicant types include:
Embedded ECU & Mechatronics Software
Powertrain, chassis, ADAS and body ECUs with safety or reliability impact.
- Eligibility: versioned releases, code reviews, coverage metrics, SBOM.
- Scope: module names, SW/HW baselines, calibration ranges, diagnostics.
OTA, Update & Key Management
Orchestration of secure updates, signatures, staged rollouts and rollback controls.
- Eligibility: signing policy, KMS, rollback tests, release notes discipline.
- Scope: supported ECUs, protocols, update topologies, audit trails.
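The signing and anti-rollback controls listed above can be sketched in a few lines. This is an illustrative stand-in, not the scheme's mandated mechanism: it uses a symmetric HMAC over a canonical manifest plus a naive version comparison as a downgrade guard, whereas production OTA systems would use asymmetric signatures (e.g. Ed25519) with keys held in a KMS.

```python
import hashlib
import hmac
import json

# Hypothetical shared signing key; real deployments keep asymmetric
# keys in a KMS, never a hard-coded secret.
SIGNING_KEY = b"demo-key"

def sign_manifest(manifest: dict) -> str:
    """Sign a canonical JSON encoding of the update manifest."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def verify_update(manifest: dict, signature: str, installed_version: str) -> bool:
    """Accept the update only if the signature checks out and the
    version moves forward (a crude anti-rollback guard)."""
    if not hmac.compare_digest(sign_manifest(manifest), signature):
        return False  # tampered or mis-signed payload
    new = tuple(int(p) for p in manifest["version"].split("."))
    old = tuple(int(p) for p in installed_version.split("."))
    return new > old  # reject downgrades

manifest = {"ecu": "BCM", "version": "2.1.0"}
sig = sign_manifest(manifest)
print(verify_update(manifest, sig, "2.0.3"))  # valid upgrade -> True
print(verify_update(manifest, sig, "2.2.0"))  # downgrade -> False
```

A real rollback test plan would also exercise interrupted installs and A/B partition fallback, which this sketch omits.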
Diagnostics, Telematics & Service Apps
Workshop tools, remote diagnostics, fleet telematics, DoIP/UDS-enabled services.
- Eligibility: protocol compliance, access control, data privacy rules.
- Scope: supported brands/models, services, connectivity footprints.
Simulation, HIL/SIL & Scenario Tools
Model-based development, scenario editors and benches validating functions pre-field.
- Eligibility: correlation studies, determinism/reproducibility, coverage.
- Scope: supported physics, interfaces, scenario libraries and limits.
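Determinism and reproducibility evidence often reduces to: rerun the same scenario with the same seed and show bit-identical traces. A toy sketch, where a seeded random walk stands in for a real simulation step:

```python
import hashlib
import random

def run_scenario(seed: int, steps: int = 100) -> str:
    """Toy stand-in for a scenario run: a seeded random walk whose
    trajectory is hashed so two runs can be compared byte-for-byte."""
    rng = random.Random(seed)  # local RNG: no hidden global state
    position = 0.0
    trace = []
    for _ in range(steps):
        position += rng.uniform(-1.0, 1.0)
        trace.append(f"{position:.9f}")  # fixed precision for stable hashing
    return hashlib.sha256("\n".join(trace).encode()).hexdigest()

# Determinism evidence: identical seeds must yield identical traces.
assert run_scenario(42) == run_scenario(42)
assert run_scenario(42) != run_scenario(43)
print("scenario rerun is reproducible")
```

Real benches add complications this sketch avoids: thread scheduling, floating-point mode flags and hardware-in-the-loop timing all need explicit control before reruns hash identically.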
Data Platforms & Analytics (Cloud/Edge)
Pipelines handling vehicle data, telemetry, logs or analytics used for decisions.
- Eligibility: data lineage, retention, DPIA/consent, redaction.
- Scope: data classes, regions, APIs, export/erasure capabilities.
AI/ML Models & Perception
Models influencing detection, estimation or decision support in mobility contexts.
- Eligibility: dataset versioning, labeling QA, drift/bias monitoring.
- Scope: model families, metrics, operating domains and limits.
What We Validate
SDLC & Change Control
Version control, protected branches, peer review, CI/CD and release governance with rollback readiness.
Security & SBOM
SBOM/SCA, threat modeling, static/dynamic analysis, fuzzing and penetration-test remediation.
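A minimal illustration of SBOM-driven composition analysis: cross-reference the listed components against an advisory feed. The SBOM shape loosely follows a CycloneDX-style `components` list; the advisory mapping is sample data, not a live feed.

```python
# Sample SBOM fragment (CycloneDX-like component list).
sbom = {
    "components": [
        {"name": "zlib", "version": "1.2.11"},
        {"name": "openssl", "version": "3.0.7"},
        {"name": "sqlite", "version": "3.39.0"},
    ]
}

# Illustrative advisory feed keyed by (name, version).
advisories = {
    ("zlib", "1.2.11"): ["CVE-2018-25032"],
    ("openssl", "1.1.1a"): ["CVE-2019-1543"],
}

def flag_vulnerable(sbom: dict, advisories: dict) -> list:
    """Return (component, CVE list) pairs for components with known advisories."""
    hits = []
    for comp in sbom["components"]:
        key = (comp["name"], comp["version"])
        if key in advisories:
            hits.append((comp["name"], advisories[key]))
    return hits

print(flag_vulnerable(sbom, advisories))
# -> [('zlib', ['CVE-2018-25032'])]
```

Production SCA tooling matches on version ranges and package URLs rather than exact pairs; this only shows the shape of the cross-check.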
Functional Validation
Unit/integration/system tests, coverage targets, HIL/SIL correlation and acceptance criteria.
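Coverage targets and acceptance criteria of this kind are typically enforced as CI gates. A hedged sketch with invented module names and thresholds:

```python
# Per-module coverage as (fraction covered, lines of code); numbers are
# illustrative, not scheme requirements.
coverage = {"parser": (0.92, 1200), "scheduler": (0.81, 800), "diag": (0.88, 500)}
targets = {"parser": 0.90, "scheduler": 0.80, "diag": 0.85}

def coverage_gate(coverage, targets, overall_target=0.85):
    """Pass only if every module meets its own target AND the
    line-weighted overall coverage clears the global threshold."""
    failures = [m for m, (pct, _) in coverage.items() if pct < targets[m]]
    total_lines = sum(n for _, n in coverage.values())
    overall = sum(pct * n for pct, n in coverage.values()) / total_lines
    return not failures and overall >= overall_target, round(overall, 4)

ok, overall = coverage_gate(coverage, targets)
print(ok, overall)  # True 0.8768
```

Weighting by module size keeps one small, well-tested module from masking a large, under-tested one.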
Integration & Compatibility
API contracts, backward compatibility rules, diagnostics/protocol support and dependency matrices.
Data Governance & Privacy
Data classification, consent/retention, redaction/anonymisation and regional residency controls.
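Redaction mechanisms of the kind reviewed here can be as simple as pattern-based masking before logs leave the vehicle or workshop. The patterns below (VIN, email) are deliberately simplified; a real pipeline would also cover GPS traces, plate numbers and free-text fields.

```python
import re

# VINs are 17 characters and exclude I, O and Q.
VIN_RE = re.compile(r"\b[A-HJ-NPR-Z0-9]{17}\b")
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

def redact(text: str) -> str:
    """Mask VINs and email addresses in a log line."""
    text = VIN_RE.sub("[VIN-REDACTED]", text)
    return EMAIL_RE.sub("[EMAIL-REDACTED]", text)

log = "Fault on 1HGCM82633A004352, owner contact jane.doe@example.com"
print(redact(log))
# -> Fault on [VIN-REDACTED], owner contact [EMAIL-REDACTED]
```

For anonymisation (as opposed to redaction), masked fields would instead be replaced with stable pseudonyms so analytics can still correlate events per vehicle.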
Field Performance & Incidents
Telemetry KPIs, crash/freeze logs, incident response and patch cadences with post-mortems.
AI/ML Evidence (if applicable)
Dataset sheets, model cards, metrics (precision/recall), ODD limits, bias/drift monitoring.
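Model-card metrics such as precision and recall reduce to simple counts over a labelled validation set; the detection counts below are illustrative:

```python
def precision_recall(tp: int, fp: int, fn: int):
    """Precision = TP/(TP+FP); recall = TP/(TP+FN)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# e.g. pedestrian detections over a labelled validation set (invented counts)
p, r = precision_recall(tp=882, fp=49, fn=118)
print(f"precision={p:.3f} recall={r:.3f}")  # precision=0.947 recall=0.882
```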
Documentation & Licensing
Release notes, user/admin guides, open-source licence compliance and third-party attributions.
Evidence Checklist
SDLC Pack
- Repo policies, branch protections, review gates and signed commits.
- CI/CD configs, artifact retention and reproducible builds.
- Test coverage reports and acceptance criteria per feature.
Security Pack
- SBOM/SCA outputs, CVE management and patch SLAs.
- Threat models, DAST/IAST/fuzz results and pentest summaries.
- Secrets management, code-signing and key-rotation policy.
Validation Pack
- Unit/integration/system test suites with coverage targets.
- HIL/SIL correlation and scenario libraries with reproducibility notes.
- Performance budgets and pass/fail thresholds.
Integration Pack
- API contracts (schema/version), compatibility matrix and deprecation policy.
- Diagnostics/protocol support (e.g., UDS/DoIP), interface test reports.
- Configuration/parameter management and migration/rollback plans.
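A compatibility matrix and deprecation policy are usually backed by an automated contract check. One possible sketch, treating a schema as a field-to-type map where additions are allowed but removals and type changes are breaking:

```python
# Illustrative telematics API contract as field -> type maps.
old_schema = {"vin": "string", "odometer_km": "int", "dtc_codes": "list"}
new_schema = {"vin": "string", "odometer_km": "int", "dtc_codes": "list",
              "battery_soc": "float"}  # added optional field: allowed

def breaking_changes(old: dict, new: dict) -> list:
    """List backward-incompatible changes between two schema versions."""
    problems = []
    for field, ftype in old.items():
        if field not in new:
            problems.append(f"removed field: {field}")
        elif new[field] != ftype:
            problems.append(f"type change: {field} {ftype} -> {new[field]}")
    return problems

print(breaking_changes(old_schema, new_schema))  # -> [] (compatible)
```

Running the check in both directions distinguishes additive releases from breaking ones, which is what a deprecation policy needs to gate on.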
Data & Privacy Pack
- Data inventory, lineage, residency and retention schedules.
- Consent/DPIA records, anonymisation/redaction mechanisms.
- Incident response and audit trails for access/edits/exports.
AI/ML Pack (if applicable)
- Dataset sheets, labeling QA and class balance notes.
- Model cards, metrics (ROC/PR), ODD limits and calibration curves.
- Drift/bias monitors, fallbacks and human-in-the-loop policy.
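One common drift monitor is the Population Stability Index (PSI) over binned input distributions. A minimal sketch; the usual thresholds (~0.1 watch, ~0.25 act) are industry rules of thumb, not scheme requirements, and the histograms are invented:

```python
import math

def psi(expected_counts, observed_counts, eps=1e-6):
    """Population Stability Index between a baseline histogram and a
    field histogram over the same bins; 0 means identical."""
    e_total = sum(expected_counts)
    o_total = sum(observed_counts)
    score = 0.0
    for e, o in zip(expected_counts, observed_counts):
        e_pct = max(e / e_total, eps)  # eps guards empty bins
        o_pct = max(o / o_total, eps)
        score += (o_pct - e_pct) * math.log(o_pct / e_pct)
    return score

baseline = [120, 300, 350, 180, 50]   # training-time histogram
in_field = [100, 280, 340, 200, 80]   # same bins, field telemetry
print(round(psi(baseline, in_field), 4))
```

Here the score lands well below the 0.1 watch threshold, so this input would be reported as stable; a monitor would compute it per feature on a rolling window.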
Recognition Workflow
1) Pre-check & Scope Draft
List modules, versions, interfaces, operating domains and deployment contexts (vehicle, workshop, cloud/edge).
2) Evidence Upload & Review
SDLC/security packs, validation/integration results and data-governance artefacts are reviewed by IAF evaluators.
3) Verification (Bench/Sim/Witness)
Reproducible builds, signed releases, scenario reruns and interface tests verify integrity and effectiveness.
4) Findings & Closure
Address findings with corrective actions; closure confirmed via independent peer review before decision.
5) Decision, Listing & Surveillance
Recognition granted; your software/tools appear in the public directory. Periodic surveillance sustains confidence.
Typical Lead-times
- Document Review: 2-3 w
- Bench/Sim Witness: 1-2 d
- Findings Closure: 2-4 w
- Typical Lead-time overall: ~8-12 w
Label Use & Public Directory
Recognised solutions may use the “IAF Recognised Software/Digital Tool” label per Brand Guidelines. Pair the label with the scope statement and a verification link/QR (e.g., verify.iaf.com/IAF-ID).
- Use only for the listed modules/versions and domains; avoid blanket claims beyond scope.
- Preserve colours/proportions; respect minimum sizes for readability.
- Remove the label if recognition is suspended or scope changes.
Ready to Begin Recognition?
Start with a quick pre-check and a concise scope draft. Our team will guide you through evidence preparation, validation and verification so your listing can go live smoothly.
Apply for IAF Recognition