Sprint Focus: UNTP Tier 2 Validation UI
Velocity: 🚀 Fast
Key Insight: Architecture decisions from Day 1-3 paid massive dividends
The Best Days in Software
There's a special feeling in software development when you start a feature and realize most of the hard work is already done. Your past self made good decisions. The data structures are right. The APIs exist. You just need to connect the dots.
Day 5 was one of those days.
I'd been working with Claude to build the Tier 2 validation UI - a critical feature that lets users verify their credentials pass UNTP interoperability tests before issuing them. Every time we scoped a new component, the conversation went something like this:
Me: "We need to show validation results for credentials."
Claude: "Good news - the API endpoint is already built (POST /api/validate/untp). This is just a frontend component."
Me: "Users should be able to validate by URL or file upload."
Claude: "Already have both endpoints (/verify-url and /untp). Just need the form."
Me: "The credential editor needs a 'Run Tests' button."
Claude: "The validation pipeline is there. The credential state is in Directus. This is a simple component integration."
This happened at least four times today. Every feature we scoped turned out to be roughly 70% built already.
Why This Matters for UNTP Adoption
Before diving into what we built, let's talk about why Tier 2 testing matters.
UNTP has three conformance tiers:
Tier 1: Basic credential structure (JSON-LD, W3C VCDM)
Tier 2: Schema compliance (product passports, facility records, etc. match UNTP data models)
Tier 3: Cross-platform interoperability testing
Tier 2 is the interoperability guarantee. It ensures that a Digital Product Passport issued by DPP Kit can be read by any UNTP-compliant system - a buyer's procurement platform, a verification app, a regulatory reporting tool, whatever.
Without Tier 2 compliance, you're not really doing UNTP. You're doing "JSON documents with supply chain data" - which is what we've all been doing for years. The whole point of a standard is that everyone speaks the same language.
DPP Kit builds Tier 2 validation directly into the credential creation workflow. You can't accidentally issue a non-conformant credential. If it passes our tests, it will work with the rest of the UNTP ecosystem.
What We Built
1. Standalone Validation Page
A dedicated /validate page where anyone (even without an account) can test credentials:
Paste a URL → We fetch it from an Identity Resolver and validate
Upload a file → Direct validation of local credentials
Instant results → Three-tier validation (W3C context, UNTP schema, crypto signatures)
This is our "UNTP Playground lite" - focused, fast, no account needed.
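The submit logic for that page can be sketched in a few lines. The endpoint paths follow the ones mentioned earlier in this post (`/api/validate/untp` and a `verify-url` sibling); the request shapes here are assumptions for illustration, not the actual DPP Kit API:

```typescript
// Sketch of the /validate page's submit logic. Endpoint paths follow the
// post above; the request bodies are illustrative assumptions.

type ValidationInput =
  | { kind: "url"; url: string }   // server fetches from an Identity Resolver
  | { kind: "file"; json: string } // raw credential JSON validated directly

// Decide which endpoint to call and what body to send.
export function buildRequest(input: ValidationInput): { endpoint: string; body: string } {
  if (input.kind === "url") {
    return { endpoint: "/api/validate/verify-url", body: JSON.stringify({ url: input.url }) };
  }
  return { endpoint: "/api/validate/untp", body: input.json };
}
```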
2. In-Editor Validation
Added a "Run Tier 2 Tests" button to the credential editor. Click it, get instant feedback on whether your credential will pass before you issue it.
Results show:
✅ Pass with details on each validation check
⚠️ Warnings for extra properties (forward-compatible - the standard evolves)
❌ Errors with JSON paths, specific schema violations, and human-readable explanations
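The editor's three-way result display boils down to a small bucketing function. This is a sketch under assumptions: the `Finding` shape mirrors the error format shown later in this post, and `summarize` is a hypothetical name, not the real component API:

```typescript
// Hypothetical shape of one validation finding, mirroring the error
// format shown later in this post.
interface Finding {
  path: string;
  message: string;
  severity: "error" | "warning";
}

// Bucket findings the way the editor renders them: pass when nothing is
// reported, warnings when only extra-property notices remain, errors otherwise.
export function summarize(findings: Finding[]): "pass" | "warnings" | "errors" {
  if (findings.some((f) => f.severity === "error")) return "errors";
  if (findings.length > 0) return "warnings";
  return "pass";
}
```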
3. Pre-Flight Validation
Credentials now run through Tier 2 checks before issuance. If the schema validation fails, the issuance pipeline blocks it. No more "oops, we issued a broken credential."
This happens server-side, in the Express API, using the same validation engine as the UI.
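A minimal sketch of that server-side guard, assuming the same finding shape as the error format below. Warnings pass through (extra properties are forward-compatible); errors abort issuance. The names `preFlight` and `Tier2Error` are illustrative, not the actual DPP Kit code:

```typescript
// Pre-flight guard sketch: block issuance on schema errors, let warnings
// through. Names here (preFlight, Tier2Error) are illustrative.

interface Finding {
  path: string;
  message: string;
  severity: "error" | "warning";
}

export class Tier2Error extends Error {
  constructor(public findings: Finding[]) {
    super(`Tier 2 validation failed with ${findings.length} error(s)`);
  }
}

// Throwing here lets an Express route translate the failure into a 4xx
// response before the issuance pipeline ever runs.
export function preFlight(findings: Finding[]): void {
  const errors = findings.filter((f) => f.severity === "error");
  if (errors.length > 0) throw new Tier2Error(errors);
}
```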
The Architecture Payoff
Here's why today moved so fast:
Day 1-2 decisions:
Built validation as API-first (not UI-dependent)
Used Directus for structured credential state management
Designed error formats for both humans and AI agents (more on that below)
Day 3-4 infrastructure:
Server-side validation endpoints already existed
Credential fetching from Identity Resolver already worked
JSON editor component was pluggable
Day 5 reality: Most of the work was React components calling existing APIs. No backend rewrites. No schema migrations. Just UI integration.
This is the compounding effect of good architecture. The first feature is slow because you're building foundations. The fifth feature is fast because the foundations hold.
The AI Angle (Yes, Again)
I keep mentioning that our validation errors are designed for AI consumption. Here's what that means:
Every error includes:
```json
{
  "path": "credentialSubject.product.description",
  "keyword": "minLength",
  "params": { "limit": 10 },
  "message": "must NOT have fewer than 10 characters",
  "severity": "error"
}
```

That's enough context for an AI agent to:
Identify the exact field (credentialSubject.product.description)
Understand the constraint (minLength: 10)
Suggest a fix (add more descriptive text)
We added a "Copy Errors for AI" button that formats validation results for pasting into Claude or ChatGPT. You can literally paste failed validation output and ask the AI to fix your credential JSON.
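The formatter behind that button is simple to sketch. The field names match the error format above; the prompt wording and the function name `formatForAI` are assumptions, not the shipped implementation:

```typescript
// Sketch of the "Copy Errors for AI" formatter: turn structured findings
// into a prompt-friendly block of text. Field names match the error
// format above; the wording is an illustrative assumption.

interface Finding {
  path: string;
  keyword: string;
  params: Record<string, unknown>;
  message: string;
  severity: "error" | "warning";
}

export function formatForAI(findings: Finding[]): string {
  const lines = findings.map(
    (f) => `- [${f.severity}] ${f.path}: ${f.message} (${f.keyword}: ${JSON.stringify(f.params)})`
  );
  return ["Fix this UNTP credential so these validation findings pass:", ...lines].join("\n");
}
```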
But more importantly: this error format is the interface for a future AI-assisted editing feature where the agent auto-suggests or auto-applies fixes directly in the credential editor.
Human-readable for manual fixes. Machine-parseable for AI agents. Same error format, two audiences.
What's Next
The validation UI is done. The pipeline integration is done. What remains:
AI-assisted credential editing (that error format isn't decorative)
Post-issuance validation (verify issued credentials still pass after schema updates)
Tier 3 testing integration (cross-platform interoperability tests)
But honestly? Today was satisfying. Fast iteration. Clean implementations. The kind of day that reminds you why good architecture matters.
Stats:
~10 files modified (API, frontend, docs)
3 validation surfaces (standalone page, editor button, pipeline check)
2 new React components
7 documentation pages updated
1 very happy developer
Building in public. Shipping fast. UNTP Tier 2 conformance, guaranteed.
Follow along: Weekly sprint updates at blog.dppkit.io
Issue free UNTP Credentials: dppkit.io