Companies are building out new teams to define products, processes and procedures based on regulation as they manage emerging technologies -- not just to follow state privacy laws around artificial intelligence (AI), but also to conform with the Telephone Consumer Protection Act (TCPA).
AI technology and regulations across the United States will become the biggest challenge for the advertising industry in 2025, says Christine Frohlich, vice president of product and data governance at Verisk Marketing Solutions (VMS). The company recently added product oversight to Frohlich's responsibilities to help ensure compliance.
There has been a lot of activity at the state level, she said. The insurance industry, which VMS works with frequently, has already gone through “rounds” of discussions. Now she has begun to see it showing up in federal legislation.
Teams are looking into how to source the data, and how to train the models.
As companies go through the R&D process, it will become important to consider how a product will integrate AI as well as biometrics. Consumers, in turn, need to weigh the benefits and risks by looking at how companies use that information.
“Even the original California legislation from 2020, when it talks about information and how it can and cannot be used, it calls out biometrics data,” she said. “Biometrics data is not just information, but personal information that requires a certain level of consent.”
Frohlich said VMS is not pursuing biometric data because of the regulatory challenges the company sees.
Regulations for biometric and AI data historically have not kept pace with the growth of the technology. That will change for companies in 2025, with or without action from U.S. state or federal governments.
This past week, President-elect Donald Trump and SoftBank CEO Masayoshi Son announced a $100 billion investment in the United States, intended to add jobs and help ensure the country leads in the development of AI and other emerging technologies without heavy-handed regulation.
Companies like VMS want regulation put in place to support growth, but also to safeguard against hallucinations and bias. It has become clear that companies do not always have access to the massive amounts of data a model was originally trained on, let alone the ability to fix it. Measurement and benchmarks no doubt are on the way, and some regulation has already begun to take hold.
“There’s vast confusion across the U.S. around the laws and what they apply to,” she said. “This is why we have chosen to regulate nationwide.”
Frohlich doesn’t see a lot of change coming at the federal level, although a new administration taking office could alter that.
“I don’t see comprehensive federal legislation at least for the next two years,” she said. “But that could shift next week.”
Frohlich does expect certain states and federal agencies to ban the use of TikTok by their employees, and possibly in the private sector, because of the risk that data on U.S. residents could end up in the People's Republic of China.
Businesses operating in China are required to share data with the PRC government, particularly when it comes to sensitive information around national security, and must comply with the country's cybersecurity, data security, and national intelligence laws.
That has been a concern of the U.S. government with regard to TikTok because it is owned by ByteDance, a Chinese company.