With individuals set to gain new rights over how businesses use automated decision-making, businesses will have to ensure they're compliant with data privacy and AI regulations.

Nathan Eddy, Freelance Writer

February 6, 2023


A growing number of data privacy laws in the United States and the European Union (EU) means businesses must ensure they comply with regulations governing employees' personal data and that they offer clarity and consent options around the use of AI-based decision-making.

Despite enforcement delays, New York City's Local Law 144 will regulate the way organizations use automated employment decision tools, while in California, the California Consumer Privacy Act (CCPA), recently amended by the California Privacy Rights Act (CPRA), expands data privacy law.

The CCPA now extends protection to job applicants and current employees, as well as independent contractors and business-to-business contacts.

“I strongly urge organizations to look beyond compliance,” says Bart Willemsen, VP analyst at Gartner. “There are many requirements popping up worldwide, and if you want to prevent having to ad hoc respond to all these things in detail, try to elevate your game to an ethically responsible one. Don't look at compliance. Look at risk.”

He explains the CPRA explicitly includes profiling in its language, which guards against the unauthorized use of AI in employment screening tools, for example.

“Applicants must be notified of the use of technology not only during the video interview, but also in the case of intended use of AI to analyze the video interview afterwards,” he says. “When you deploy or intend to deploy, always offer full transparency of both intent and technology use.”

Willemsen also recommends organizations continuously monitor and manage AI risks in the development stage, training stage, and in production.

“The key items for businesses to be aware of include transparency, choice and monitoring,” he says. “You can only ask an individual to make a decision after you give clarity, transparency and the right not to be subjected to automated decision making.”
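Willemsen's three items translate into a concrete control: no automated screening should run until the individual has been notified, told the purpose, and given a real choice. Below is a minimal sketch of such a gate in Python; the names (ConsentRecord, may_run_automated_screening) and fields are illustrative assumptions, not taken from any specific law or product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Hypothetical record of the notice given and the choice made."""
    notified_of_ai_use: bool       # clarity: candidate told AI will be used
    purpose_disclosed: str         # transparency: what the tool analyzes
    consented: bool                # choice: candidate agreed
    human_review_requested: bool   # right not to be subject to automation
    timestamp: datetime

def may_run_automated_screening(consent: ConsentRecord) -> bool:
    """Gate an AI screening tool on notice, purpose, and consent."""
    if not consent.notified_of_ai_use or not consent.purpose_disclosed:
        return False  # no clarity/transparency: do not proceed
    if consent.human_review_requested:
        return False  # route to a human reviewer instead
    return consent.consented

consent = ConsentRecord(
    notified_of_ai_use=True,
    purpose_disclosed="AI analysis of recorded video interview",
    consented=True,
    human_review_requested=False,
    timestamp=datetime.now(timezone.utc),
)
print(may_run_automated_screening(consent))  # True only if every check passes
```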

Brian Platz, co-CEO and co-founder of Fluree, says the laws underscore the need for companies to have clean and organized data that is accessible upon employee request.

“It will also be important for organizations to be aware of that data’s lifetime to ensure they are providing employees with complete, comprehensive records in the event data was copied or duplicated for various purposes,” he explains.
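One way to make that tractable is to register every copy of an employee record at the moment it is made, so an access request can return all of them. Here is a minimal sketch under that assumption; the in-memory CopyRegistry and the location names are illustrative only.

```python
from collections import defaultdict

class CopyRegistry:
    """Track where each employee's data has been copied or duplicated."""
    def __init__(self):
        self._locations = defaultdict(set)

    def record_copy(self, employee_id: str, location: str) -> None:
        """Log a new copy, e.g. when data is exported to another system."""
        self._locations[employee_id].add(location)

    def locations_for(self, employee_id: str) -> set[str]:
        """Everything that must be covered by an access-request response."""
        return set(self._locations[employee_id])

registry = CopyRegistry()
registry.record_copy("emp-1042", "hr-core-db")
registry.record_copy("emp-1042", "payroll-export-2023-01")
registry.record_copy("emp-1042", "analytics-warehouse")

# An employee access request must account for every registered copy:
print(registry.locations_for("emp-1042"))
```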

Laws Complicate Leveraging Data for AI Models

From the perspective of Muddu Sudhakar, CEO at Aisera, these laws “certainly” make it tougher to leverage valuable data for AI models.

“AI generally needs massive data sets to get effective results. Next, there is the problem that the data may have gaps,” he explains. “This could lead to skewed models. There may even be potential issues with bias because the data may not be representative of the population.”

He points out that another issue is that the California law is still subject to rulemaking, which means it is not yet clear what the final compliance requirements will be.

“This can add to the difficulties with building models as well as the costs,” he says. “There are likely smaller organizations -- who do not have strong compliance programs -- that may not be aware of the new laws. There is a lack of awareness in general.” 

Sudhakar adds the California law applies to workers and will make privacy much more complicated for employers, raising questions as to what employee information can be deleted on request.

“However, gig companies may have the biggest challenges -- especially the larger ones,” he says. “They will have to manage privacy requirements across many contractors, who may not stay with the company very long.”  
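Sudhakar's deletion question matters because a deletion request is rarely a blanket erase: some record categories, such as payroll or tax documents, typically must be retained for a legally mandated period. The sketch below shows that triage; the categories and retention rules are illustrative assumptions, not legal advice.

```python
# Illustrative retention rules; real ones come from counsel, not code.
RETENTION_EXEMPT = {"payroll", "tax", "workplace_safety"}

def triage_deletion_request(
    records: dict[str, list[str]],
) -> tuple[list[str], list[str]]:
    """Split a worker's records into deletable and must-retain sets."""
    deletable, retained = [], []
    for category, items in records.items():
        if category in RETENTION_EXEMPT:
            retained.extend(items)   # keep, with a documented legal basis
        else:
            deletable.extend(items)  # safe to erase on request
    return deletable, retained

worker_records = {
    "payroll": ["2022-W2"],
    "marketing_profile": ["segment-tags"],
    "performance_notes": ["q3-review-draft"],
}
deletable, retained = triage_deletion_request(worker_records)
print("delete:", deletable)
print("retain:", retained)
```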

Shira Shamban, CEO at Solvo, points out that proof of compliance is not a new need.

“The interesting thing about the new regulations is that if up until now many of the frameworks we needed to comply with had to do with specific verticals, like HIPAA for healthcare or PCI-DSS for payments, the new regulations are talking about the individual person’s privacy,” she says.

As with GDPR before them, other states are now looking to protect their residents' data. There isn't a single path to compliance; what's important is to have privacy in mind.

That means security and GRC engineers should examine both their existing security practices and mechanisms and the data their organization is storing, and make sure the two align.

“There are a few products out there in the market today that could help organizations to identify their private data, and from there it’s the security team’s job to make sure they’re doing the best they can in protecting it,” Shamban says.
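Tools in that category generally work by scanning stored records for patterns that look like personal data. Here is a toy illustration of the idea; the two regex patterns are deliberately simplified assumptions, and real products use far broader pattern sets plus ML classifiers.

```python
import re

# Simplified patterns; production scanners use many more, plus classifiers.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(text: str) -> dict[str, list[str]]:
    """Return any substrings that match a known personal-data pattern."""
    hits = {}
    for label, pattern in PII_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            hits[label] = matches
    return hits

sample = "Contact jane.doe@example.com, SSN 123-45-6789, re: onboarding."
print(scan_for_pii(sample))
# {'email': ['jane.doe@example.com'], 'us_ssn': ['123-45-6789']}
```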

Getting Ready for Regulatory Compliance

Even though enforcement of the California and New York data privacy laws has been slightly delayed, and the California regulations implementing the new AI provisions are not yet fully baked, businesses should be engaging expert consultants now to be ready when enforcement begins.

Platz notes that in the working world -- especially one that is often largely remote, with employees around the country and the world -- these new privacy laws will affect employees beyond the enacting states when those employees live and work in different locations.

“With flexibility to work from virtually anywhere, this legislation will have wide reaching impact across states and sectors and will only highlight the need for employers to look closely at their path to compliance across a significant amount of data,” Platz says.

Bryan Cunningham, advisory council member at data security provider Theon Technology, explains that California often leads the way on US privacy laws, which in turn are often inspired by those in the European Union. New laws and regulations around the use of artificial intelligence to process personal data are the most recent examples.

“As almost always happens, many other jurisdictions will follow suit, as New York City already has,” he says. “So, businesses should be preparing to deal not just with these two new laws but, ultimately, with similar ones in most or all states and perhaps other cities.”

He adds that even now, businesses can take little solace in not having a California office or California-resident employees, because the new law purports to protect any Californian about whom a business collects or processes data, including employees, independent contractors, and others.

“New York claims a similar reach, and also requires a not-fully-defined ‘bias audit’ for the use of AI in employment decision-making,” Cunningham notes. “In addition, similar EU laws and regulations may well impact US-based businesses if they process data of EU citizens.”

Under such laws, individuals gain new rights over how businesses use automated decision-making, including notification, transparency, opt-out, and correction rights.

“In conjunction with expert lawyers and consultants, businesses should first identify, catalog, and map personal data they hold and any automated or AI-based decision-making tools they use,” he says. “Then they should determine which of the new and emerging laws apply to them. And they cannot begin too soon.”
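Cunningham's "identify, catalog, and map" step can start as something as simple as a structured inventory linking each data asset and each decision tool to the jurisdictions whose laws may apply. Below is a minimal sketch of that idea; the field names and jurisdiction tags are illustrative assumptions, and the real determination belongs to counsel.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str
    contains_personal_data: bool
    subject_locations: list[str]  # e.g. ["CA", "NY", "EU"]

@dataclass
class DecisionTool:
    name: str
    automated: bool
    used_for_employment: bool
    inputs: list[DataAsset] = field(default_factory=list)

def applicable_jurisdictions(tool: DecisionTool) -> set[str]:
    """Map a tool to the jurisdictions whose rules may cover its use."""
    if not (tool.automated and tool.used_for_employment):
        return set()
    return {loc
            for asset in tool.inputs if asset.contains_personal_data
            for loc in asset.subject_locations}

resumes = DataAsset("resume-store", True, ["CA", "NY", "EU"])
screener = DecisionTool("video-interview-scorer", True, True, [resumes])
print(applicable_jurisdictions(screener))  # {'CA', 'NY', 'EU'}
```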


About the Author(s)

Nathan Eddy

Freelance Writer

Nathan Eddy is a freelance writer for InformationWeek. He has written for Popular Mechanics, Sales & Marketing Management Magazine, FierceMarkets, and CRN, among others. In 2012 he made his first documentary film, The Absent Column. He currently lives in Berlin.
