AI hiring compliance is no longer optional. NYC Local Law 144, the Illinois AI Video Interview Act, the Colorado AI Act, the EU AI Act, and a growing wave of state and federal proposals are creating a patchwork of obligations for any employer using automated tools in hiring, screening, or promotion. This hub covers what the laws require, how enforcement is evolving, and how to build compliance into your operations rather than treating it as an afterthought.
Frequently Asked Questions
What laws regulate AI in hiring?
Key laws include NYC Local Law 144 (bias audits for AEDTs), Illinois AI Video Interview Act (consent + transparency for AI-analyzed interviews), Colorado AI Act (risk assessments for high-risk AI), and the EU AI Act (classifying employment AI as high-risk). Multiple other states have pending or enacted legislation.
What is a bias audit for AI hiring tools?
A bias audit is an independent assessment of an automated employment decision tool (AEDT) that calculates selection or scoring rates and impact ratios across sex, race/ethnicity, and intersectional categories. NYC Local Law 144 requires an audit conducted no more than one year before the tool is used, which in practice means an annual audit cadence.
Who is responsible for AI hiring compliance — the employer or the vendor?
The employer (or employment agency) bears primary compliance responsibility in most jurisdictions. Vendors can support compliance by providing audit data, documentation, and transparency reports, but the legal obligation sits with the organization deploying the tool.
How do I prepare for AI hiring regulations?
Build an AEDT inventory, map which tools produce simplified outputs used in hiring decisions, engage an independent auditor, operationalize notice delivery, and publish required disclosures. Our AI Bias Audit Checklist and Disclosure & Consent Templates provide ready-to-use frameworks.
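The inventory step above can start as a simple structured record per tool. The sketch below is one hypothetical shape for such a record; every field name and example value is an illustrative assumption, not a prescribed schema.

```python
# Hypothetical AEDT inventory sketch: one record per automated tool,
# tracking the fields compliance reviews most often ask about.
# All field names and example values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AEDTRecord:
    tool_name: str          # e.g. a resume screener or video scorer
    vendor: str
    simplified_output: str  # the score, rank, or tag used in decisions
    last_bias_audit: str    # ISO date of the most recent audit, "" if none
    notice_published: bool  # whether candidate notice has been delivered

inventory = [
    AEDTRecord("resume_ranker", "ExampleVendor", "1-10 fit score",
               "2024-03-01", True),
]

# Flag tools with a missing audit or unpublished notice.
gaps = [r.tool_name for r in inventory
        if not r.last_bias_audit or not r.notice_published]
```

Even a list this simple makes the later steps (engaging an auditor, operationalizing notice) auditable, because every tool either has a current audit date and published notice or shows up in the gap list.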