The Wonderful (and Regulated) World of Wearable Tech Part I: Tech Developers
Humble Brag: I ran the New York City Marathon last Sunday (and it only took me four days to recover)!
In celebration, I thought it would be fun to write about a marathon favorite: wearable tech.
This two-part series helps clarify the complex, evolving regulatory framework relating to wearable tech. Part I covers what data privacy and security considerations developers should be mindful of, and Part II covers considerations for employers who wish to implement health initiatives based on wearable tech use.
Wearable Tech
Wearable tech devices (smartwatches, rings, fitness bands, and smart clothing) collect and share a steady stream of personal data, allowing wearers to track steps, sleep cycles, heart rate, and a host of other biometric indicators. This data collection is awesome for tracking training, performance, and personal health. But the data is also sensitive and can reveal insights into personal health that may be problematic if inadvertently disclosed. Accordingly, wearable companies should be aware of the host of federal and state laws, and data privacy best practices, that may apply to their tech.
HIPAA Need Not Apply
This may be surprising to some, but: When used for personal information only, wearable devices do not fall within the purview of the Health Insurance Portability and Accountability Act (“HIPAA”).
Fundamentally, HIPAA is a U.S. federal privacy law that gives patients a say over how their Protected Health Information (PHI) is used and disclosed. It also contains requirements for how health providers must collect and store data, and what they must do in the event of an inadvertent disclosure of such information.
HIPAA only applies to “covered entities” (healthcare providers and health plans), and the business associates that process data on behalf of those covered entities.
Contrary to popular belief, HIPAA only applies to health information collected for or used by a covered entity. And only covered entities (and their business associates) can be liable for disclosing health information in violation of HIPAA. That means you cannot use HIPAA to sue your coworker for telling everyone you’re coughing at your desk, or to sue a stadium for asking about vaccination status (a common misunderstanding during the COVID-19 pandemic). Because companies that sell consumer-targeted wearable devices do not collect data in the provision of health services, those companies are not covered entities and are not subject to HIPAA.
HIPAA would apply, however, to tech used to collect PHI for a covered entity. For example, HIPAA would apply if a cardiologist partnered with Fitbit to collect and track heart rate information and activity levels to design a patient health plan. In that case, the doctor’s office would be the covered entity, the heart rate and activity data would be PHI, and Fitbit would be the business associate (processing data on behalf of the doctor’s office/covered entity).
In short, wearable data becomes HIPAA-regulated when collected, stored or used on behalf of a healthcare provider or health plan for treatment, billing or operations, but NOT when collected by a consumer app for personal use.
What Rules Do Apply?
While HIPAA is unlikely to apply to consumer-focused wearable tech independently marketed and sold to consumers, wearable companies must still comply with a patchwork of other emerging federal and state privacy laws. These include:
· The Federal Trade Commission Act (FTC Act) (governing unfair and/or deceptive practices): The FTC enforces against deceptive privacy promises and unreasonable security practices. If your company’s privacy policy promises one thing, but your product shares user data in another manner, you risk FTC action.
· FTC Health Breach Notification Rule (16 C.F.R. Part 318): Obligates non-HIPAA entities that collect individually identifiable health information (e.g., fitness or biometric metrics tied to an individual’s identity) to provide notice when there has been a breach of unsecured personally identifiable health information. A breach includes unauthorized access, disclosure, or sharing (for instance, selling health data to advertisers without user consent). In the event of a breach, the Rule requires notice to affected individuals “without unreasonable delay” and notice to the FTC; if 500+ residents of a state are affected, companies must also notify prominent media in that locale. Companies may face FTC enforcement and penalties for non-compliance.
· The California Privacy Rights Act (CPRA) (Cal. Civ. Code § 1798.140): The CPRA treats many wearable-derived metrics (including neural data and, broadly, “information collected and analyzed concerning a consumer’s health”) as sensitive personal information, which carries heightened consumer rights, such as the right to opt out of the “sale” or sharing of data. The CPRA requires businesses to perform data protection assessments for certain high-risk processing activities and gives consumers the right to limit the use of sensitive personal information (including health and precise geolocation). Even non-California companies may have CPRA obligations if they sell products in California and/or have California users.
· Washington’s My Health My Data Act (MHMDA) (Wash. Rev. Code Ch. 19.373): MHMDA is one of the broadest health privacy laws in the U.S. regulating consumer health data and geofencing. It covers any company that collects “consumer health data,” including data that infers health status (such as heart rate, fertility tracking, sleep data, step counts, menstrual cycle data, etc.). Key requirements for wearable tech companies include: obtaining clear, affirmative consent before collecting or sharing relevant health data; providing consumers with a health data privacy policy; honoring consumer rights to access, delete, and withdraw consent; not selling health data without a signed authorization; complying with strict breach notification requirements for unauthorized disclosure of health data; and not using geolocation/geofencing to track or target people around health facilities. Specifically, MHMDA makes it unlawful to implement a geofence around an in-person health services location for the purpose of identifying/tracking consumers, collecting consumer health data, or sending targeted ads or messaging related to health data or services.
· Texas’s Data Privacy and Security Act (TDPSA) (Tex. Bus. & Com. Code Ch. 541): This law applies broadly to anyone doing business in Texas and processing personal data (e.g., precise location, health information, biometric identifiers). The law contains a small business carveout, exempting small businesses from most TDPSA obligations, except that they may not sell sensitive personal data without first obtaining a consumer’s prior consent. Larger businesses must additionally implement a privacy program, conduct data protection impact assessments, provide a full set of consumer privacy rights and processes, and maintain detailed privacy documentation.
· Virginia Consumer Data Protection Act (VCDPA): This law applies only to companies that either: (1) control or process data of more than 100,000 Virginia residents; or (2) process personal data of more than 25,000 residents and derive more than 50% of revenue from selling personal data. If the VCDPA applies, companies must: obtain opt-in consent before processing sensitive data (e.g., health, biometric, location); provide specific privacy notices; conduct data protection assessments for high-risk uses (e.g., profiling or selling personal data); implement data minimization and reasonable security protocols; and offer consumer rights to access, delete, correct, and transfer their data.
· Colorado Privacy Act (CPA): Colorado is considered one of the most restrictive state privacy regimes for wearables. Similar to Virginia, this law applies only to companies that: (1) control or process data of at least 100,000 Colorado residents; or (2) derive revenue from selling personal data and process data of more than 25,000 consumers (regardless of what percentage of revenue is derived from the sale of data). In addition to many of the requirements stated above, Colorado allows consumers to opt out of targeted ads, the sale of personal data, and profiling that produces significant decisions (e.g., insurance, employment). It also requires companies to publish a clear privacy notice describing what data is collected, the purpose of collection, and the categories of third parties that receive data.
· Florida’s Digital Bill of Rights (DBR): This law applies only to very large businesses ($1 billion or more in global revenue) that also operate search engines, social platforms, or targeted ad networks. Therefore, most wearable companies are out of scope. However, it is worth noting because the DBR contains many of the same requirements stated above (requiring data minimization; obtaining consent for sensitive data; honoring rights to access, delete, and correct personal data), reinforcing the emerging best-practices framework.
Best Practices for Innovators
Even if no state privacy laws currently apply to your wearable tech, it is still best practice to follow common principles designed to protect consumer health information (many of which are discussed above). These include:
· Avoiding the use of vague privacy promises. If you share data with advertisers or analytics vendors, say which vendors and for what purpose data is shared to avoid an FTC investigation into misleading statements;
· Obtaining explicit user consent to collect and process data;
· Conducting Data Protection Impact Assessments (DPIA) to identify and mitigate privacy risks;
· Providing consumers with basic rights to their data, including the right to access, delete, correct, and opt out of data sale or secondary use, and creating a system for implementing those rights consistently;
· Conducting vendor due diligence (cloud provider, analytics partners) to ensure partners employ responsible data security measures;
· Implementing data minimization and purpose limitation principles to ensure data is only collected for legitimate and stated purposes;
· Securing high-risk/sensitive data through encryption, authenticated APIs, and least-privilege access.
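For developers, a few of these principles translate directly into code. Here is a minimal sketch of data minimization and pseudonymization before sharing wearable data with an analytics vendor; the field names, the `ALLOWED_FIELDS` allowlist, and the placeholder key are all hypothetical:

```python
import hashlib
import hmac

# Hypothetical: the only fields the vendor's stated purpose actually requires.
ALLOWED_FIELDS = {"step_count", "session_length_min"}

# Placeholder only -- in practice, load a managed secret from a key vault.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def minimize(record: dict) -> dict:
    """Share only the allowlisted fields; drop everything else by default."""
    shared = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    shared["user_ref"] = pseudonymize(record["user_id"])
    return shared

raw = {"user_id": "u-1234", "step_count": 9021, "heart_rate": 74,
       "sleep_stage": "REM", "session_length_min": 42}
shared = minimize(raw)  # heart_rate and sleep_stage never reach the vendor
```

The allowlist (rather than a blocklist) is the design point: a newly added sensor field stays private unless someone deliberately adds it to `ALLOWED_FIELDS`.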
Developer Considerations
For startups and smaller businesses in wearable tech, the key is embedding privacy and security from the design phase (“privacy by design”) and incorporating as many of the best practices above as possible. By mapping data flows (knowing when and from where you collect data) and classifying data, companies can easily ascertain where additional security measures should be implemented. Treating health information as sensitive by default can help reduce downstream risk.
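One lightweight way to put “sensitive by default” into practice is to classify fields at the point of collection and fail closed on anything unrecognized. A sketch, with illustrative field names and categories:

```python
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1
    PERSONAL = 2
    SENSITIVE = 3  # health metrics, precise location -- and anything unclassified

# Illustrative data map: where each field is collected and how it is classified.
DATA_MAP = {
    "device_model": {"source": "device", "class": Sensitivity.PUBLIC},
    "email":        {"source": "signup", "class": Sensitivity.PERSONAL},
    "heart_rate":   {"source": "sensor", "class": Sensitivity.SENSITIVE},
    "geolocation":  {"source": "phone",  "class": Sensitivity.SENSITIVE},
}

def classify(field: str) -> Sensitivity:
    """Unknown fields are treated as sensitive by default."""
    entry = DATA_MAP.get(field)
    return entry["class"] if entry else Sensitivity.SENSITIVE

def needs_extra_protection(field: str) -> bool:
    """Gate encryption-at-rest and least-privilege access on classification."""
    return classify(field) is Sensitivity.SENSITIVE
```

Because the map doubles as a data inventory (field, source, classification), it also gives you a head start on the data protection assessments several of the state laws above require.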
Data privacy laws are evolving quickly, and it is expected that more states will implement data privacy and security laws similar to those already in place. Establishing clear, accessible user consent mechanisms and regular security audits can help maintain compliance and consumer trust from an early stage.
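A consent mechanism is also something you can prototype simply. This hypothetical sketch records grants and withdrawals per purpose, so an app can check consent before each collection or share (not just at signup) and demonstrate it later if asked:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                       # e.g. "share_with_analytics_vendor"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

class ConsentLog:
    """Append-only log of grants and withdrawals, kept as evidence of consent."""

    def __init__(self) -> None:
        self._records: list = []

    def grant(self, user_id: str, purpose: str) -> None:
        self._records.append(
            ConsentRecord(user_id, purpose, datetime.now(timezone.utc)))

    def withdraw(self, user_id: str, purpose: str) -> None:
        # Mark active grants withdrawn rather than deleting them.
        for rec in self._records:
            if (rec.user_id == user_id and rec.purpose == purpose
                    and rec.withdrawn_at is None):
                rec.withdrawn_at = datetime.now(timezone.utc)

    def has_consent(self, user_id: str, purpose: str) -> bool:
        return any(rec.user_id == user_id and rec.purpose == purpose
                   and rec.withdrawn_at is None for rec in self._records)
```

Keeping withdrawn records (instead of deleting them) matters: laws like MHMDA require honoring withdrawal, and the timestamped trail shows when consent started and stopped.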
Consulting an attorney early can help identify which laws apply, and how to comply with them.