Could Biometric Tracking Harm Workers?


Expert urges regulators to restrict how companies can access and use employee health data.
What if your boss used your heart rate to infer your stress level and decided not to promote you?
According to a recent article by Elizabeth A. Brown, a professor at Bentley University, that kind of decision should be against the law. But no current regulation prevents employers from using health data in this way. Brown argues that employees need greater legal protections, particularly as employers collect increasing amounts of biometric information, such as DNA, fingerprints, eye scans, and facial images.
One way that employers collect this biometric information is through wearable devices, such as smart watches and fitness trackers, which are often part of employers’ workplace wellness programs. Brown explains that workplace wellness is now an $8 billion industry in which employers can encourage workers to stop smoking, exercise more, or meditate. As more employers offer workplace wellness programs to their employees, they may also collect more biometric information through wearables and associated apps.
Employers may collect employee data to provide counseling and resources to workers who experience high levels of stress. They may also believe that healthy workers are more productive and will have lower health insurance costs in the long term.
Collecting biometric data, however, carries the risk of mismeasuring workers: wearables may malfunction and produce inaccurate data.
Furthermore, third-party data collectors use algorithms to create “risk scores” based on biometric data that purport to predict an individual’s likely illnesses and behaviors, and these predictions may not be reliable. While acknowledging that health care providers can use predictive risk scores to assess possible treatments for their patients, Brown emphasizes that companies generating these scores should explain their algorithms or demonstrate that they are not biased.
Employees who use apps that track reproductive health are especially vulnerable, Brown argues, because employers could gain access to the data on fitness bracelets, smart watches, and apps that track fertility, predict ovulation, and monitor fetuses. This tracking could lead employers to discriminate based on gender, pregnancy, or predicted pregnancy. Although the United States has anti-discrimination laws, such as the Civil Rights Act of 1964 and the Pregnancy Discrimination Act of 1978, these laws fail to protect against misuse of health data.
Having biometric and health tracking data available increases the potential for employers to rely on it when making adverse employment decisions. Brown cites a study finding that most global business leaders are not “very confident” in their ability to collect and analyze employee data responsibly.
Despite employers’ self-reported lack of confidence in handling employee health data responsibly, Brown predicts that employees will not object to employers collecting their data because biometric and health tracking is already ubiquitous in the United States. Apple phones use facial recognition to unlock themselves, and one in two American adults is already in a law enforcement facial recognition database.
Employees’ perceptions of data’s usefulness suggest that they may not resist employers’ use of data in worker evaluations. Brown cites a survey showing that nearly 80 percent of workers would welcome data-based feedback to optimize their time, and 82 percent agreed that “pay, promotions, and appraisal decisions” based on data would be less biased and more accurate.
Employees’ desire for privacy increases, however, when it comes to employers’ use of health data. Brown cites a study showing that 93 percent of surveyed workers wanted to keep their smoking habits private from employers, and 81 percent wanted to keep their alcohol use out of employment decisions.
In addition, Brown raises the issue of the power imbalance between employers and employees. If employers penalize workers who refuse to allow access to their health data as part of workplace wellness programs, then workers may have no meaningful choice between participating in wellness programs and protecting their biometric data privacy.
Brown argues that as companies deploy increasingly sophisticated methods of collecting biometric and health data, regulators should balance the value of data collection with the potential harms of using that data to make adverse employment decisions. For example, employers bear the cost of health insurance, so employers that need to reduce their workforce may choose to terminate employees they predict as expensive to insure regardless of whether those predictions are accurate.
Brown points out that the Health Insurance Portability and Accountability Act (HIPAA) does not protect employees from third-party companies that collect data in workplace wellness programs. Health-related apps may sell users’ health information without violating HIPAA because the law covers only health care providers and other covered entities, not the makers of consumer health apps.
Given this inadequate legislative protection, Brown calls for clear guidance explaining how companies may use biometric and health data.
Brown suggests strengthening existing health privacy protections to cover biometric and health data. Scholars have proposed amending HIPAA to regulate biometric monitoring devices by including wearable technologies in the definition of “regulated medical device” rather than treating them as “consumer electronics devices.” Brown also suggests expanding HIPAA’s definition of “personally identifiable” information to include all health data, as well as its definition of “business associates” to include wearables manufacturers.
Brown also suggests amending the Affordable Care Act to clarify that neither employers nor their business associates can collect biometric and health-related data as part of voluntary workplace wellness programs.
In light of the expanding market for wearable technologies and the potential for employers to misuse sensitive data, Brown urges lawmakers to protect biometric data and the employees whose privacy is at stake.