Last month, I had the opportunity to speak to entrepreneurs at Launch Dayton’s Startup Week regarding the positive effects that strong privacy and data governance practices have on business.

As regulations increase and complexity rises, many businesses remain hesitant to view privacy and security obligations as anything other than impediments to innovation. In practice, embedding privacy by design and developing strategic approaches to cybersecurity and artificial intelligence laws serve as valuable drivers for growth.

Navigating the Regulatory Landscape
The environment surrounding privacy and security law is dynamic. American companies must contend with a complex framework that includes numerous state privacy laws (Indiana, Kentucky, and Rhode Island will introduce new statutes in 2026), federal regulations such as HIPAA and GLBA, industry self-regulatory standards including PCI-DSS, and evolving contractual requirements specific to artificial intelligence. International standards, most notably the GDPR, introduce heightened expectations for data transfers, extraterritorial reach, and significant penalties; Switzerland's data protection law even provides for criminal liability.

Understanding Key Risks
Organizations face considerable risks due to compromised consumer data. Data breaches commonly diminish trust, and individuals may reconsider or end relationships with the affected business. Significant breaches often trigger declines in brand value and stock price, as well as substantial financial costs resulting from recovery activities, regulatory penalties, and legal disputes. These consequences extend beyond financial metrics, affecting reputation and attracting increased regulatory attention.

Security incidents can arise from multiple sources, such as brute force attacks, business email compromise, social engineering, and poor website data management. Website accessibility noncompliance also represents a substantial risk: regulators closely monitor these requirements, and neglecting web accessibility can result in costly litigation and settlements.

Artificial Intelligence and Emerging Risks
Artificial intelligence further complicates the landscape. Organizations are increasingly implementing natural language processors, machine learning tools, and generative AI solutions, all of which create distinctive legal exposures. These include intellectual property risks, discrimination, bias, inadvertent data disclosures, and exposure of regulated information. A policy addressing artificial intelligence is necessary to facilitate effective risk management. Judicial bodies address new cases regularly, and the regulatory environment continues to develop at the federal and state levels.

Establishing a Proactive Plan
Acting promptly helps mitigate these risks. Organizations of all sizes can strengthen privacy and security by focusing on several fundamental areas:

  • Leadership commitment: Executive leadership must prioritize privacy; boards are expected to understand obligations and risks. Designating responsibility to a Chief Privacy Officer or department lead is advisable.
  • Data classification: Personal data should be assessed and defined according to both external regulations and internal risk criteria.
  • Data mapping: Understanding the location of data across physical servers, cloud environments, offices, third-party vendors, and artificial intelligence providers is essential for security.
  • Risk assessments: Ongoing risk evaluations help maintain compliance with HIPAA, GLBA, NY DFS, insurance requirements, and government contracts, and should lead to prioritized risk mitigation.
  • Governance and controls: Develop administrative, technical, and physical safeguards including policies, procedures, employee training, privacy statements, and formal agreements to create a multi-layered security structure.
  • Privacy impact assessments: Evaluate risks associated with new products, system updates, or organizational changes prior to launch to promote purposeful progress.

Leveraging Trust for Business Growth
Organizations that protect consumer information consistently benefit from increased loyalty and trust. Many individuals prefer and are willing to support brands that prioritize privacy. Transparent and accessible privacy policies strengthen trust and improve brand perception. Privacy and cybersecurity serve as opportunities for organizations to drive positive momentum. Those that implement privacy in their strategy and invest in governance realize advantages that extend to both market performance and regulatory compliance. Taking immediate steps toward a privacy-first approach remains the most effective path forward.

On July 1, 2025, the Virginia Consumer Data Protection Act (VCDPA) amendments took effect, implementing several changes to the existing privacy law, including new protections for consumers’ reproductive and sexual health information. While other consumer health data laws exist, such as Washington’s My Health My Data Act (MHMDA), which generally protects a broad category of “consumer health data,” the VCDPA amendments take a narrower approach, focusing only on reproductive and sexual health information. Here is what you need to know.

Continue Reading Virginia is for Lovers (of Privacy): VCDPA Amendments Merge Components of Consumer Data Health Laws to Better Protect Reproductive and Sexual Health Information

On September 1, 2025, Texas Senate Bill 140 officially amended the state’s well-known “mini-TCPA” so that certain chapters now apply to sellers and salespersons who send marketing texts to consumers. This is a big change, particularly in two ways:

  1. Texting included. Previously the law only applied to traditional phone calls, and thus text marketers could arguably avoid the law’s painstaking registration and disclosure requirements.
  2. Private right of action. The amendments also include a private right of action through the state’s Deceptive Trade Practices Act, which subjects violators to steep penalties and gives Chapters 302, 304, and 305 of Texas’ Business and Commerce Code some additional teeth.
Continue Reading New Amendments to Texas’ Telemarketing Law Have Gone into Effect—Sellers Should Carefully Consider the Exemptions

Colorado legislators have approved a five-month delay for the implementation of the Colorado Artificial Intelligence Act (the Act), moving the start date from Feb. 1, 2026, to June 30, 2026.

The decision follows a special legislative session called because of concerns stemming from compliance costs, industry lobbying, and fiscal impacts on businesses and the state. Colorado Budget Director Mark Ferrandino indicated that the law could cost the state alone between $2.5 million and $5 million annually to implement, and Colorado Governor Jared Polis indicated that the amount could be as much as $6 million per year. The Act, originally designed to address risks of algorithmic discrimination in sectors like employment, housing, and lending, will now give both lawmakers and businesses more time to clarify provisions and prepare compliance programs.

Continue Reading Colorado Gives Businesses Breathing Room Before AI Act Takes Effect

On July 24, 2025, the California Privacy Protection Agency (CPPA) approved a sweeping set of amendments to the California Consumer Privacy Act (CCPA) regulations. These updates introduce new compliance obligations for businesses around automated decision-making, cybersecurity audits, risk assessments, and more.

Below, we discuss some of these new requirements.

Continue Reading California Finalizes Major CCPA Amendments

On June 19, 2025, the United Kingdom Parliament enacted the Data Use and Access Act 2025 (DUAA). The DUAA amends, but does not replace, the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA), and the Privacy and Electronic Communications Regulations (PECR). While the DUAA imposes new requirements on organizations subject to UK privacy legislation, it also clarifies several provisions, making privacy compliance in the United Kingdom more manageable.

The changes under the DUAA began taking effect in June 2025 and will be phased in through June 2026.

Continue Reading Are You Ready for the UK’s Data Use and Access Act 2025 (DUAA)?

On July 1, 2025, the California Attorney General, Rob Bonta, announced that the California Privacy Protection Agency (CPPA) entered into a settlement with Healthline Media LLC (Healthline), which included a fine of $1,550,000, the largest fine by the CPPA to date, for various alleged violations of the California Consumer Privacy Act (CCPA). This settlement and fine follow the CPPA’s $632,500 fine against American Honda Motor Co. in March of this year. These actions continue to show California’s increased focus on CCPA enforcement.

Per the announcement, Healthline.com is a health and wellness information website that is one of the top 40 most visited websites in the world and generates revenue by showing advertisements on the website.

Continue Reading California Privacy Enforcement Continues: CPPA’s Largest Fine To Date

Special thanks to Taft Summer Associate Richard Roediger for his significant contributions to this post.

On May 20, 2025, Ohio Rep. Adam Mathews (District 56) and Ohio Rep. Haraz N. Ghanbari (District 75) introduced Ohio House Bill 283 (the Act), legislation that requires political subdivisions within the state to enact cybersecurity programs. In Ohio, a “political subdivision” is a county, township, municipal corporation, or other body corporate and politic responsible for governmental activities in a geographic area smaller than the whole state.

The Act’s language was incorporated in its entirety into Ohio’s state budget bill passed on June 30, 2025.

Continue Reading Ohio Budget Bill Requires Counties, Townships, and Cities to Enact Cybersecurity Program by September 29

A recent decision from the Northern District of Texas has upended the Department of Health and Human Services’ 2024 amendments to the HIPAA Privacy Rule (the 2024 Rule), which were intended to bolster privacy protections for reproductive health care information.

The court’s ruling in Purl v. HHS vacates almost all of these amendments, finding that HHS overstepped its statutory authority and improperly interfered with state law.

Continue Reading HIPAA’s Reproductive Health Shake-Up: What the Purl Ruling Means for Health Plans and Covered Entities

Early on July 1, the U.S. Senate voted to halt an effort to impose a 10-year moratorium on state regulation of artificial intelligence. The vote, 99-1, removed the AI provision from President Trump’s “Big, Beautiful Bill.” The provision had evolved from a full moratorium on state AI regulation for the next decade into a version that conditioned federal broadband funding on states adopting the ban for the next five years.

Yesterday, Sen. Marsha Blackburn of Tennessee and Sen. Ted Cruz of Texas attempted to revise the AI ban to account for existing regulations. According to media reporting, efforts toward banning state AI regulation broke down amid concerns that the language was overly broad and could adversely impact existing laws concerning privacy, consumer protection, and child safety.

Continue Reading US States Can (And Will) Continue To Regulate Artificial Intelligence … for Now