A Global Perspective on the Future of Wearable Technology

 

Wearables: where to start? With the smartwatch languishing in my junk drawer, the step tracker, the heart rate monitor, the security pass, two smartphones (private and business), smart clothing, or the Santa wish list topped by a virtual reality headset?

It is hardly surprising that ABI Research predicts 500 million wearable devices will have been sold by 2021, and that figure does not count the apps. If our personal time is tracked by wearables, what happens on the job? Organizations are embracing wearable technology to engage with their employees, monitor movement and productivity, assess wellbeing, and even market and cross-sell new products.

Using technology in this way is not without risk, but it is here to stay. Data privacy risks, ethical considerations (such as the right to a private life), and continued liability for employers all need to be considered at the outset of implementing programs that use wearables for work. One of the biggest questions is what the data is used for. An obvious use is measuring how long staff spend working at their designated site, but do employees know and understand the extent to which that data is monitored, what analytics are applied to it, and what other decisions are being made with it that they never see? In operating wearables for work (and in collecting, using and storing the data from them), organizations need to comply with relevant legislation and guidelines, remain mindful of their duty of care to staff, be transparent with employees, and consider the wider ethical issues.

In the health and wellbeing space, workplace schemes encouraging staff movement and activity, via apps or subsidized sports bands for example, are not new and are an established part of corporate culture at big businesses. They are good for health, wellbeing and engagement in the employee population, and insurance companies recognize these benefits and adjust premiums accordingly. But do employees know what the data is used for? Employers may use some of this data to review attendance, and it could form the basis of disciplinary action; is that made clear at the outset? If an individual were to suffer an accident or overexert themselves during a fitness drive, it is not a stretch to assume that, even if staff have signed waivers, liability (including vicarious liability) will still be placed firmly at the employer's door.

For senior executives, often an organization's biggest asset, wearable technology can monitor sleep patterns, stress levels and other more invasive biometrics to help an organization manage its people risks. Even with consent to such a program, would that remove liability where a senior employee suffers a heart attack or stress-related illness as a result of their role? Unlikely: the normal legal tests would still apply. For staff in 'high risk' roles, pilots for example, wouldn't a heart rate monitor or another device that can track extreme or abnormal changes in biometrics be important? Could it help predict or even prevent human-error disasters?

In terms of movement and productivity assessment, wearable technology can be a huge asset to organizations with thousands of staff, keeping track of where people are. For example, monitoring 'hot spots' in office infrastructure can help plan effective working spaces. However, many countries have significant privacy laws and a legal right to a private life. If these technologies continue monitoring and tracking out of hours (including breaks), can the benefits to safety justify the impact on privacy and private life? On productivity, the world's biggest e-commerce companies now track their employees' efficiency via wristbands embedded with microchips, against metrics often set relative to the speed at which robots can operate. To what extent do employers have a responsibility to maintain human oversight of their staff when such wearables are in use, and to what extent are they reviewing potentially high rates of fatigue or burnout?

Much has already been made of the discrimination that may be inherently written into algorithms. To what extent will employees in the future be able to challenge material decisions made by, or data collected by, devices, programs and technology on the basis of underlying discrimination? Examples include selection for poor-performance discussions, selection for redundancy, or other serious impacts on an employee's career. If the underlying data or the collection methodology is allegedly compromised, it potentially follows that an organization could be held to account for relying on it. Human oversight should always form part of any decision-making process.

In terms of identifying fraud, the ability to track staff movements is extremely useful. Local laws usually provide that employers can process and use data to identify crimes or serious misconduct and dishonesty offences, but it will always be important to make staff aware of this type of monitoring before implementation and to ensure compliance with local law and codes of practice.

Finally, let's look at France and its decision to give employees the right to time off from technology: discouraging emails out of hours, over the weekend or on annual leave unless business critical. In a connected world where staff often carry smartphones issued by their employer, 'bring your own device' programs push work emails onto personal phones, and smart watches now deliver instant messages, what obligation do companies have not to overburden their staff? Many employees do not want colleagues messaging them on WhatsApp on a Saturday when it could wait until Monday.

For many multinational employers, and particularly those working across time zones, the French approach goes too far; much also depends on the sector, cultural norms in each jurisdiction, and expectations of workers in terms of productivity. For most of us, the view is that legislation is not needed in this area; good corporate culture and management leadership are. Unions and collective labour organizations are taking up this mantle in some jurisdictions and negotiating with companies for workers' rights in this area.

However your organization chooses to adapt and adopt new technologies, be clear about the drivers for change and the expectations on your staff, and be balanced about business needs. It is increasingly important to carry out risk assessments on a rolling basis, to check whether your technology programs still meet the aims they were introduced to address and remain fit for purpose, or need to be adapted to reflect changing laws and social and cultural norms.

Julia Gorham is a partner in our Hong Kong office and works closely with leading employers assisting them with their cross border matters, particularly in the Asia Pacific region.

 

Cassie Peterson