Introducing Jeikin: accessibility compliance for AI-assisted development
I've been building websites for years, and for years accessibility felt like a test I was always failing.
Not because I didn't care (I did), but because WCAG 2.2 has 86 success criteria across three conformance levels, and I don't remember them all. Nobody does. And every tool I tried would hand me a list of violations with no context about who was actually affected or whether my fix worked.
So when AI coding tools became part of our workflow, I saw an opportunity: what if the AI could handle the accessibility review, not just find issues, but track them, enforce fixes, and prove the work was done?
That's Jeikin.
What Jeikin does
Jeikin connects to your AI coding tool (Claude Code, Cursor, Windsurf, or any MCP-compatible tool) and guides it through a structured accessibility review.
Here's what happens:
- You ask your AI to review accessibility. Jeikin provides the review criteria, the evaluation steps, and the context your AI needs.
- Issues are tracked on a dashboard. Every finding is reported with severity, affected files, and the specific WCAG criteria it relates to, along with plain-language explanations of who's affected.
- Fixes are verified. When the AI fixes something, Jeikin requires it to verify the fix passes quality checks before marking it done. No shortcuts.
- Evidence is preserved. Your dashboard shows what was found, what was fixed, and what's still open. Shareable proof of compliance work.
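To make the workflow above concrete, here's a sketch of what a tracked finding could look like. The field names, the ID format, and the example content are all illustrative assumptions, not Jeikin's actual schema:

```typescript
// Hypothetical shape of a finding as tracked on the dashboard.
// Note: file references only, never source code.
interface Finding {
  id: string;
  severity: "critical" | "serious" | "moderate" | "minor";
  wcagCriterion: string;  // the specific success criterion, e.g. "1.1.1 Non-text Content"
  files: string[];        // affected file paths, not file contents
  whoIsAffected: string;  // plain-language impact description
  status: "open" | "fixed" | "verified";
}

// Example finding (entirely made up for illustration):
const example: Finding = {
  id: "JKN-42",
  severity: "serious",
  wcagCriterion: "1.1.1 Non-text Content",
  files: ["src/components/Hero.tsx"],
  whoIsAffected:
    "Screen reader users hear nothing for the hero image, so they miss whatever it conveys.",
  status: "open",
};
```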
Why this approach
Most accessibility tools work beside your workflow. You switch to a scanner, run a check, read a report, switch back to your editor, fix something, switch back to the scanner, re-check.
Jeikin works through your workflow. The AI tool you're already using becomes the reviewer. You never leave your editor. The results appear on a dashboard your team, your clients, or your compliance officer can see.
What makes it different
Enforcement, not suggestions. Other tools suggest fixes. Jeikin requires verification. The AI can't mark an issue as done until quality checks pass. This is the difference between "we looked at accessibility" and "we can prove every fix was verified."
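The enforcement idea can be sketched as a small state machine: an issue with a claimed fix only moves to done if a verification check passes, and reopens if it doesn't. The names and states here are my own illustration, not Jeikin's real API:

```typescript
// Illustrative sketch of "enforcement, not suggestions".
type Status = "open" | "fix-claimed" | "done";

function resolve(
  status: Status,
  verify: () => boolean // e.g. re-run the quality check that originally failed
): Status {
  if (status !== "fix-claimed") return status; // nothing to verify yet
  return verify() ? "done" : "open"; // a failed verification reopens the issue
}
```

The key property: there is no path from "fix-claimed" to "done" that skips `verify`.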
Your code stays private. Jeikin never sees your source code. The AI reads your code locally and sends only its findings: issue descriptions and file references. Your codebase never touches our servers.
Design-informed rules. Beyond WCAG, Jeikin enforces usability principles that matter to real users: Fitts' Law for target sizes, Gaver's affordances for interactive elements, APCA contrast for readability. Not just compliance, genuine usability.
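For a flavor of what a design-informed rule checks: WCAG 2.2's Target Size (Minimum) criterion requires interactive targets of at least 24×24 CSS pixels, and Fitts' Law quantifies why bigger is better, since small, distant targets take longer to acquire. A minimal sketch (Jeikin's actual thresholds and rule implementations are assumptions here):

```typescript
// WCAG 2.2 SC 2.5.8 (Target Size, Minimum): 24x24 CSS px floor.
const MIN_TARGET_PX = 24;

function meetsTargetSize(widthPx: number, heightPx: number): boolean {
  return widthPx >= MIN_TARGET_PX && heightPx >= MIN_TARGET_PX;
}

// Fitts' Law index of difficulty: ID = log2(distance / width + 1).
// Lower is easier to hit; widening a target lowers its difficulty.
function fittsIndexOfDifficulty(distancePx: number, widthPx: number): number {
  return Math.log2(distancePx / widthPx + 1);
}
```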
The virtuous loop. We review our own codebase with Jeikin. Every page you see on jeikin.com was built and verified through the same system our users use. If our own tool can't catch an issue on our own site, we have work to do.
Who it's for
- Developers who want accessible products but don't have time to learn all 86 WCAG criteria
- Agencies delivering accessible sites who need documentation for clients
- Engineering teams that need compliance evidence without slowing down feature velocity
- Product managers who want visibility into accessibility status without reading code
What it costs
Free during the preview period. Pricing tiers are coming soon; we'll share details on this blog when they're ready. The free tier will remain genuinely useful: it's how you discover the workflow, not a demo.
Try it
Run one command in your project folder:
npx jeikin init
It sets up the connection between your AI tool and Jeikin in about 30 seconds. Then ask your AI: "Review my code for accessibility."
The first review is often eye-opening. Not because your code is bad, but because accessibility issues are invisible until someone (or something) looks for them.
We're building Jeikin in public. Follow this blog for practical accessibility guides, product updates, and the story of building compliance infrastructure for the AI era.