If you run a small business or childcare center in Southern California and you use any kind of AI tool, from scheduling software to customer chatbots, January 1, 2026 was a significant date. California's governor signed more than 20 AI-related bills into law, and many of them are now in effect. Most small business owners have no idea. This guide will tell you what actually applies to you, what you need to do about it, and what you can safely ignore.
The scale of California's AI legislation
California didn't pass one or two AI laws. It passed a wave of them covering employment, healthcare, education, social media, data privacy, and automated decision-making. Legal analysts estimate that California's combined privacy and cybersecurity compliance requirements could impose nearly $16,000 in annual costs on small businesses that are fully subject to these rules.
The good news: the most aggressive laws, such as SB 53 (the Transparency in Frontier Artificial Intelligence Act), apply only to large AI developers with annual revenues exceeding $500 million. If you're a small business in Los Angeles, Ventura, or Oxnard, you're not in that category.
The laws that do apply to small businesses are primarily centered on the California Consumer Privacy Act's new Automated Decision-Making Technology (ADMT) regulations. These have real teeth, and ignoring them carries real financial risk.
The three laws small businesses need to actually understand
1. CCPA Automated Decision-Making Technology (ADMT) Regulations
This is the most important one for small businesses that use AI in their customer or employee processes. California's Privacy Protection Agency finalized these rules, and they define ADMT as any technology that processes personal information and uses computation to replace or substantially replace human decision-making in areas that significantly affect consumers.
Practical examples of ADMT in a small business context include: an AI scheduling tool that determines which employees get shifts, an automated system that decides which customers receive promotional offers, a chatbot that routes or denies customer service requests without human review, or any software that scores, ranks, or categorizes people based on their personal data.
If your business is subject to CCPA (which generally means annual gross revenue over $26.6 million, data on 100,000 or more California consumers, or deriving more than 50% of revenue from selling personal data) and you use ADMT, you must now provide consumers with notice that ADMT is being used, explain its purpose and how it works in plain language, and offer consumers the right to opt out of ADMT for certain decisions.
The deadline for consumer rights compliance is January 1, 2027. Initial risk assessments covering ongoing processing activities are due by December 31, 2027. Small businesses under $50 million in revenue get additional time for formal audit submissions, until April 1, 2030. But that doesn't mean you can ignore the notice and opt-out requirements: those protections begin for consumers in 2027.
2. AB 2013: Training Data Transparency
This law, effective January 1, 2026, requires developers of generative AI systems to disclose information about what training data their AI was built on. This law primarily targets the companies that build AI tools, not the small businesses that use them. If you're using ChatGPT, Microsoft Copilot, or any other established AI platform, compliance with AB 2013 is the vendor's responsibility, not yours.
That said, if your business has built a custom AI model, even a simple one, using your own customer data, AB 2013 may apply to you as the developer. Custom AI projects are becoming more common in 2026; if EDCON or another partner has helped you build a proprietary AI tool, this is worth reviewing.
3. AB 853: California AI Transparency Act Amendments
AB 853 amended the AI Transparency Act to adjust timelines and expand requirements for large online platforms and device manufacturers. For most small businesses, this law has limited direct impact. Its primary effect is on platforms like social media networks and major tech companies that deploy generative AI in customer-facing features. The $5,000-per-day penalty structure is designed to pressure large platforms, not a 10-person company in Ventura.
How to know if these laws apply to your business
Run through this quick checklist:
- Do you meet CCPA thresholds? If your annual gross revenue is under $26.6 million, you don't collect data on 100,000+ California consumers annually, and you don't earn most of your revenue from selling personal data, CCPA's ADMT rules likely don't apply to your business directly. Many small businesses and most childcare centers are below these thresholds.
- Do you use any AI to make decisions about people? Think carefully. AI-driven employee scheduling, automated credit or payment decisions, chatbots that determine service outcomes, or algorithmic customer scoring all count. If a human always reviews the AI recommendation before acting, the analysis changes, but "human in the loop" needs to be meaningful, not a rubber stamp.
- Have you built any custom AI? If a vendor built you a custom AI model trained on your data, find out who is responsible for AB 2013 compliance: you or the vendor.
- Does your privacy policy mention AI or automated processing? Even if the ADMT regulations don't directly apply, California consumers increasingly expect transparency. Updating your privacy policy to describe how AI tools interact with customer data is good practice regardless of legal obligation.
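The checklist above boils down to a simple decision rule. As a minimal sketch, the threshold test can be written out explicitly; the dollar figure and consumer count below are the thresholds cited in this guide, and current values should be confirmed with counsel since they adjust over time.

```python
# Rough self-check against the three CCPA applicability thresholds
# described in the checklist above. Figures are those cited in this
# guide; confirm current values before relying on them.

CCPA_REVENUE_THRESHOLD = 26_600_000   # annual gross revenue (USD)
CCPA_CONSUMER_THRESHOLD = 100_000     # California consumers per year

def ccpa_likely_applies(annual_revenue: float,
                        ca_consumers: int,
                        majority_revenue_from_selling_data: bool) -> bool:
    """Return True if any one of the three CCPA thresholds is met."""
    return (annual_revenue >= CCPA_REVENUE_THRESHOLD
            or ca_consumers >= CCPA_CONSUMER_THRESHOLD
            or majority_revenue_from_selling_data)

# Example: a childcare center with $1.2M revenue and 60 enrolled families
print(ccpa_likely_applies(1_200_000, 60, False))  # False
```

Meeting any single threshold is enough to bring a business into scope, which is why the function uses `or` rather than requiring all three.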
What childcare centers specifically need to know
The overwhelming majority of childcare centers, preschools, and daycares in Southern California will not meet CCPA's revenue and data-volume thresholds. A center enrolling 60 families is simply not processing data at the scale that triggers these regulations. This means the ADMT consumer rights requirements likely don't apply to you directly.
However, three areas still deserve attention:
If you use a childcare management platform (like Brightwheel, HiMama, or similar) that incorporates AI features, those platforms must comply with California law. Confirm with your vendor that they are CCPA-compliant and ask specifically about how they handle any automated processing of parent or child data.
If you use any AI tool that touches parent contact data (even just an AI email assistant or automated billing system), your privacy policy should describe this. Many childcare centers haven't updated their privacy policies since the original CCPA passed in 2020. A lot has changed.
ADMT regulations also cover employment decisions. If you use any software that automates scheduling, performance evaluation, or hiring screening based on employee data, this is worth reviewing, especially as you grow and cross CCPA thresholds.
The five steps to take right now
- Build an AI tool inventory. List every piece of software your business uses that involves automation or machine learning. Include scheduling tools, email platforms with automated sequences, customer chatbots, accounting software with predictive features, and HR platforms. You cannot assess your compliance exposure without knowing what's in your stack.
- Identify which tools touch consumer or employee personal data. For each tool on your list, note what data it processes. Name, contact info, payment data, scheduling history, and performance records all count as personal information under California law.
- Confirm CCPA thresholds. Have your accountant or financial advisor confirm whether your business meets any of the CCPA revenue or data-volume thresholds. If you're significantly below the thresholds, document that. If you're close or growing toward them, plan ahead.
- Update your privacy policy. Your privacy policy should describe all AI and automated tools that process customer or employee data. It should be in plain English, not legalese. If you're subject to CCPA, it must include a specific section on your data practices and consumer rights.
- Review contracts with AI vendors. Make sure your agreements with software vendors include representations about their own CCPA compliance. If a vendor's platform is non-compliant and processes your customers' data on your behalf, you share exposure.
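Steps 1 and 2 above amount to building a small structured dataset and filtering it. A minimal sketch of what that inventory might look like follows; the tool names and data categories are illustrative examples, not recommendations.

```python
# A minimal AI tool inventory, assuming the fields suggested by
# steps 1-2 above. Entries are hypothetical examples.

inventory = [
    {"tool": "Scheduling app",
     "personal_data": ["employee names", "shift history"],
     "influences_decisions": True},
    {"tool": "Email platform",
     "personal_data": ["customer emails"],
     "influences_decisions": False},
    {"tool": "Customer chatbot",
     "personal_data": ["names", "service requests"],
     "influences_decisions": True},
]

# Tools that both process personal data and influence decisions about
# people are the ones to prioritize for ADMT review.
review_first = [t["tool"] for t in inventory
                if t["personal_data"] and t["influences_decisions"]]
print(review_first)  # ['Scheduling app', 'Customer chatbot']
```

A spreadsheet with the same three columns works just as well; the point is that the inventory is structured enough to filter and hand to an advisor.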
The broader picture: AI governance is becoming a business standard
California tends to set the national baseline on privacy and technology regulation. CCPA became a model for privacy laws in a dozen other states, and businesses that got ahead of it avoided a lot of pain. The current wave of AI legislation is likely to follow the same pattern โ states that don't have their own AI laws often default to California standards in practice, especially for businesses that operate across state lines or serve customers nationally.
For small businesses in Southern California, the practical implication is this: the way you use AI tools is increasingly part of your legal and reputational risk profile, not just your operational one. Businesses that document their AI use, keep their privacy policies current, and vet their vendors carefully will be in a much stronger position as regulation matures.
This doesn't mean you need a dedicated compliance team or a $50,000 legal engagement. For most small businesses, the right foundation is a documented AI inventory, an updated privacy policy, a clear internal policy on what data employees can and can't feed into AI tools, and a technology partner who stays current on these requirements.
Frequently asked questions
Do California's new AI laws apply to small businesses?
It depends on how you use AI. Laws targeting frontier AI developers only apply to companies with $500 million or more in revenue; most small businesses are exempt. The CCPA's Automated Decision-Making Technology regulations, however, apply to any business subject to CCPA that uses AI to make significant decisions about California consumers. If you meet CCPA thresholds and use AI in decision-making, you need to take action. If you're below those thresholds, the obligations are limited but vendor vetting and privacy policy updates are still good practice.
What counts as "automated decision-making" under California law?
California's CPPA defines ADMT as any technology that processes personal information and uses computation to replace or substantially replace human judgment on decisions that significantly affect consumers. Practical examples: AI-powered shift scheduling, automated loan or credit decisions, chatbots that determine service outcomes, and algorithmic performance scoring. If a real human reviews AI recommendations before acting on them, you may fall outside the strictest requirements, but the "human review" must be substantive, not just a rubber stamp.
What are the fines for violating California's AI laws?
Under CCPA, fines run up to $2,500 per unintentional violation and $7,500 per intentional violation, each applied per consumer affected. AB 853 carries $5,000 per day of non-compliance, with each day counting separately. These amounts scale fast when many customers are affected by a single non-compliant practice.
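To see how quickly per-consumer fines compound, here is a back-of-envelope calculation using the figures above; the consumer count is a hypothetical example, and actual penalties are determined by regulators case by case.

```python
# Illustrative exposure under the per-consumer CCPA fine structure
# described above. The 500-consumer figure is hypothetical.

UNINTENTIONAL_FINE = 2_500  # USD per violation
INTENTIONAL_FINE = 7_500    # USD per violation

consumers_affected = 500

print(consumers_affected * UNINTENTIONAL_FINE)  # 1250000
print(consumers_affected * INTENTIONAL_FINE)    # 3750000
```

A single non-compliant chatbot or scoring tool touching a few hundred customers can therefore generate seven-figure theoretical exposure.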
Do childcare centers need to worry about these AI laws?
Most small childcare centers fall below CCPA revenue thresholds, so the ADMT consumer rights requirements probably don't apply directly. However, childcare centers should still confirm that any childcare management software they use is CCPA-compliant, update their privacy policy to mention any AI tools that touch parent or billing data, and review how employee-related software handles staff data, especially as the center grows.
What should a small business do first to assess AI compliance?
Start with a simple AI tool inventory: list every tool your business uses that involves any form of automation or AI, such as chatbots, scheduling software, email automation, and HR platforms. For each tool, identify what consumer or employee data it processes and whether it influences decisions. Confirm whether you meet CCPA thresholds. From there, update your privacy policy and review vendor contracts. EDCON helps Southern California businesses do exactly this audit as part of our AI implementation and compliance services.
Not sure where your business stands?
EDCON helps small businesses and childcare centers in Southern California inventory their AI tools, review their compliance exposure, update privacy policies, and vet vendors, without turning it into a six-figure legal project. Book a free consultation and we'll start with a straightforward assessment of your current risk.
Book a Free Consultation