June 08, 2020

Work is Watching

“Companies are essentially using algorithms and other forms of new technology to gather and use personal data from their current employees’ lives. They now want to know about workers’ hobbies, their political beliefs, their consumer preferences, even their exercise routines.” Leora Eisenstadt, Fox Department of Legal Studies

How do you feel about your employer tracking your time running personal errands and then using that data as part of company-wide research? Associate Professor of Legal Studies Leora Eisenstadt takes a deep dive into the world of employee information tracking. What methods are companies using to research employees’ lives, how can we maintain our privacy and are there any legal repercussions for businesses that do this?

It’s a harrowing thought that our personal information could be out there for our employer to use. When you are connected to a company’s internet network, everything can be tracked. In the last few months, so many of us have been working from home, so just logging on might allow your company to track your data.

When an employer has this data, what conclusions can they draw? They can gauge how willing you are to take risks, how good a multitasker you are, and your current and future health issues. This can ultimately affect how many hours they give you, as well as your odds of getting a promotion or being terminated.

In this episode of Catalyst, the podcast from the Fox School of Business, we sit down with subject matter expert Eisenstadt to discuss the future of employee privacy and understand how the line between our professional and personal lives continues to blur.

What could your company already know about you? How has COVID-19 affected employee privacy? These and more questions are answered by Eisenstadt in this episode of Catalyst.

Catalyst is a podcast from Temple University’s Fox School of Business about the pivotal moments that shape business and the global economy. We interview experts and dig deep into today’s most pressing questions, such as: What is the future of work? Will the robots really take our jobs? And how is my company using my data? We explore these questions so you can spark change in your work. Episodes are timely, provocative and designed to help you solve today’s biggest challenges. Subscribe today.

Podcast Transcript

Host: Welcome to Catalyst, the podcast at Temple University’s Fox School of Business. I’m your host, Tiffany Sumner. Your company knows a lot about you: your age, how many children you have, where you live and your vacation habits. But what if your company also knows that you just bought a new mountain bike or a pregnancy test? That you scroll through Twitter during your lunch break or just scheduled an appointment [00:01:00] with your dermatologist? 

On this episode of Catalyst, we are talking to Leora Eisenstadt, associate professor of legal studies at the Fox School of Business. Today, we’ll learn about how employers can use data from our personal lives to inform business decisions. Leora studies how the divide between our personal and professional lives is blurring and what protections we should put in place. Let’s understand what happens when work is watching. Hi Leora, thank you for joining me today. 

Leora Eisenstadt: My pleasure. 

Host: What has your research shown about companies tracking employees’ data?

Leora Eisenstadt: First, let me just say, it’s always been the case that your employer could find out personal details about your life. Maybe you were friends with your manager and you had conversations about personal things, right? Or maybe you were Facebook friends with your co-workers or your manager and they could see your pictures and your posts. There were always ways to find out about your personal life, [00:02:00] but what’s changed is that companies began to recognize that data analytics can actually use what were previously seemingly unrelated data points to make predictions about workers’ behavior and on-the-job success. So once that happened, they started to gather tons more information—way more than a manager would ever gather about an employee in an informal, casual conversation. 

Host: It’s a little scary to think that work is watching. How and why did this start?

Leora Eisenstadt: I think it started with the recognition that algorithms can now take thousands of variables, thousands of data points, and make meaning out of this random data. So, I’m going to give you a couple of examples. An algorithm could conclude, based on data provided by the employer to the algorithm, that workers who drink Sam Adams beer, read the Wall Street Journal, [00:03:00] do crossword puzzles and prefer hiking over running make the best leaders for their organization. They can then search social media profiles of these employees for this data and use it to make recommendations about workplace opportunities for the employees. Here’s another example: an algorithm could take thousands of unrelated data points and predict that you’re concerned about developing diabetes, or attempting to get pregnant, or thinking about starting a family. 

And there are really dozens of possible uses for this kind of information. It could impact hiring decisions, termination decisions, promotion decisions, all of those. There’s another one that I recently learned of, which is facial recognition software. Some researchers in India are starting to argue that facial recognition software could be used to identify your emotional state with a very quick scan of your face. So, they are recommending that employers use this instead of, you know, [00:04:00] swiping in with a card to get into your workplace. Then maybe it’ll just be a quick facial scan, and that will determine your emotional state, which will then be used to predict how effective a worker you’re going to be that day. And without asking you a single question, your employer might then decide what you’re working on that day, who you’re sitting next to, what your opportunities are for professional development. 
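To make the Sam Adams example concrete, here is a minimal sketch of the kind of correlation-driven scoring Eisenstadt describes. Everything in it—the features, the data, the model choice—is hypothetical; real vendor systems are proprietary and far more elaborate.

```python
# Hypothetical sketch: scoring "leadership" from unrelated lifestyle
# features. All names and data here are invented for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Off-duty data points an employer might scrape or buy (fabricated).
employees = pd.DataFrame({
    "drinks_sam_adams": [1, 0, 1, 0, 1, 0],
    "reads_wsj":        [1, 1, 0, 0, 1, 0],
    "does_crosswords":  [1, 0, 1, 0, 1, 1],
    "prefers_hiking":   [1, 0, 0, 1, 1, 0],
})
# Past "good leader" ratings supplied by the employer.
rated_good_leader = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(employees, rated_good_leader)

# Score a current employee's scraped profile. The output reflects
# correlation in the training data only -- nothing here is causal.
candidate = pd.DataFrame([{
    "drinks_sam_adams": 1, "reads_wsj": 1,
    "does_crosswords": 0, "prefers_hiking": 1,
}])
print(f"predicted leadership score: {model.predict_proba(candidate)[0, 1]:.2f}")
```

The model happily assigns a “leadership” score from beer preference and crossword habits because those features correlate with the labels it was trained on, not because they cause anything—the point Eisenstadt returns to below.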

Host: That’s terrifying. Although if you’re having a bad day, does this mean that you get to go home? 

Leora Eisenstadt: I suspect not. Although maybe it means you get to go home permanently, which is not the best solution. 

Host: Yeah, it’s so funny, you know—it’s 2020 and the world is sounding more and more like a sci-fi novel, isn’t it?

Leora Eisenstadt: Indeed. 

Host: It’s sort of interesting, though, when you really think about the kind of data we’re talking about and what your company knows about you—that they might even consider the type of beer that you drink, or the food that you eat [00:05:00] and your diet. What other types of data do you think they might consider that would surprise employees or consumers?

Leora Eisenstadt: It could really be anything, because the point here is that, on the surface, it’s totally unrelated to what they’re actually looking for, right? These algorithms aren’t showing causation—this is an important point to remember. They’re not showing that because you drink Sam Adams beer, you will make a good leader. They’re showing correlation. So they’re given a profile of what a good leader is or what a good worker is, and then they’re going through thousands, maybe even hundreds of thousands, of variables to find ones that match that profile of a good leader. Companies don’t actually care about causation, right? If the algorithm works well enough, if it’s accurate enough, if the correlation is strong enough, then they don’t even care about causation. So, it could literally be anything. It could be your political association, [00:06:00] it could be what kind of flowers you grow in your backyard, it could be “I like chocolate versus gingerbread cookies,” right? It could be anything. 
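As a toy illustration of that screening step, the hypothetical sketch below generates thousands of random yes/no “lifestyle” variables and ranks them by how strongly they correlate with an arbitrary “good leader” label. With enough variables, some correlate purely by chance—which is exactly how causally irrelevant signals like beer preference can surface.

```python
# Hypothetical sketch: with thousands of variables, a few will correlate
# with any target label by chance alone. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_employees, n_variables = 200, 5000

# 5,000 meaningless yes/no lifestyle variables per employee.
lifestyle = rng.integers(0, 2, size=(n_employees, n_variables))
# An arbitrary "good leader" label, unrelated to every variable.
good_leader = rng.integers(0, 2, size=n_employees)

# Pearson-correlate every variable with the label and rank them.
centered = lifestyle - lifestyle.mean(axis=0)
target = good_leader - good_leader.mean()
corr = centered.T @ target / (
    np.linalg.norm(centered, axis=0) * np.linalg.norm(target)
)
for i in np.argsort(-np.abs(corr))[:5]:
    print(f"variable {i}: correlation {corr[i]:+.2f}")
# Spurious "predictors" emerge even though every variable is pure
# noise: correlation, not causation.
```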

Host: I like all cookies! But I think it’s so fascinating—and I like the idea of a profile of a leader for a specific organization. I would assume that they would develop multiple profiles based on a number of data sets.

Leora Eisenstadt: Yeah, and this is not going to be the same for every organization, right? Every organization is going to have a different profile. Or, you know, this could be used to create teams—who the company thinks works best together, which kinds of people should pair up. So it’s not, “Actually, we’re looking for an aggressive type that goes along with a passive type.” We’re looking at two profiles of people that they think mesh, and now at thousands of unrelated variables that suggest that you fall into one [00:07:00] of those profiles. So it’s something that you can’t even predict beforehand if you’re the employee, because it’s so unrelated. Unless, of course, eventually you start to see a pattern, in which case you might start to say, “Well, if I see that everybody who drinks Sam Adams beer is getting promoted, I’m going to start drinking Sam Adams beer,” and you might change your actual preferences in your daily personal life. 

Host: I guess that’s good if you like Sam Adams beer. 

Leora Eisenstadt: Good for Sam Adams. 

Host: [laughs] They are not sponsoring this podcast. You said that it’s about correlation, but companies are using this data to predict who they should hire. Is that correct?

Leora Eisenstadt: Well, we’re actually not talking about hiring. There are currently companies using data analytics and the same kinds of tools—facial scans are being used, data analytics are being used—to figure out who to hire, [00:08:00] but that’s sort of different—I put that in a different category. You know, I’m coming from the legal angle, and when you’re hiring someone, it comes with different legal concerns than when you are applying these technologies to your existing workforce. So, in hiring, you sort of expect to be evaluated on any number of variables, right? I expect that if I am up for a job, the employer is going to search my Facebook profile and look at whatever’s public. They’re going to look at whatever I posted online, because the employer has a level of liability, right? They want to make sure they are not bringing someone into their workplace who is going to sexually harass other people, or who is going to be a danger to other people, or who is going to embarrass the company because they say all kinds of vile things online. So, you sort of have this expectation that employers are doing that. 

Existing employees don’t quite have that same expectation, right? They think, “I’m being evaluated on the work that I do and whether it’s good or not, [00:09:00] and I’m going to be up for promotions or get more professional development opportunities based on the work I do.” Most people are not thinking, “I’m going to be offered some leadership position because I ate pancakes versus oatmeal this morning.”

Host: So what types of companies are doing or considering doing this data collection, and what methods are they using? 

Leora Eisenstadt: I suspect that large and small companies alike will soon be doing this. So, for example, when we’re talking about health data and health predictions, there are a number of companies that offer a third-party platform known as a health navigation platform. They offer this to their employees as a benefit and say, “Look, this can help you manage your prescriptions and your doctor visits and your overall healthcare experiences.” It’s an interface between you and your insurance, between you and your doctors, right? It makes it all easier. But when employees opt [00:10:00] in to using this third-party platform, they often give that health navigation platform permission to share the data it gathers with the employer, because they’re not thinking about it. You click a box, and that data includes prescription purchases, doctor searches, medicine information searches—all kinds of information that is not in your health record (that’s protected, that’s private). These external health-related data points, it turns out, can be used to predict what you’re thinking about in terms of your health: what concerns you have, what family planning thoughts you might be having. The kinds of companies using this include Kraft, Adobe, Heinz, Liberty Mutual and Viacom—huge companies with hundreds of thousands of employees involved. 

Host: Wow, [00:11:00] so how does or will this affect me?

Leora Eisenstadt: Look, I think that it can affect everyone in big and small ways, because as soon as companies start to do this on a more regular basis, everybody else is going to jump on the bandwagon. So, companies could be using this to put together teams based on who the algorithm predicts will work well together. They can use it to determine whom to promote based on health concerns people are having. If someone is contemplating getting pregnant, an employer might think, “Oh, she might not be as effective an employee, so maybe I’ll give this opportunity to someone else.” As the algorithms become increasingly good at making accurate predictions, the possibilities really are endless. 

Host: You mentioned that employees who might be dealing with health issues or trying to start a family might be passed over for promotions. How does that interact with anti-discrimination laws?

Leora Eisenstadt: It is [00:12:00] clearly unlawful to discriminate against someone who is pregnant. It’s a grey area in the law whether it is also illegal to discriminate against someone who is thinking about getting pregnant, because they are not actually in the state of pregnancy that is covered. It’s possible that a court would say thinking about pregnancy is related enough, but I don’t think we’ve had a case yet that tests it. 

With that being said, it’s really hard to prove that that’s what an employer is doing. It may be happening, but they’re going to have all different kinds of reasons in your file for why they are taking the actions they are taking. So, I think this is a concern employees have—should have—because it’s going to be very hard to take that employment decision and tie it so closely to that health care concern. And it’s not just pregnancy; pregnancy is only one issue. Take any disability, right? Disability [00:13:00] discrimination law would prevent an employer from acting against you because you have diabetes or you’re developing diabetes, but it’s going to be very hard for that employee to draw a line between the action the employer took and the health concern that this algorithm predicted. 

Host: In light of the COVID-19 pandemic and stay-at-home orders, how has this issue been accelerated? 

Leora Eisenstadt: So, my research focuses on all the ways this off-duty data gathering is eroding the divide between work spheres and non-work spheres, right? And this blending of work and personal life was a process that was clearly happening already. So, you’ve got technology that allows you to do work from anywhere, anytime. All of our laptops and smartphones and everything. We’ve got millennial worker attitudes which focus [00:14:00] on this desire to be flexible and be able to work from home when they want to or to work from a coffee shop—when we can return to coffee shops—and you’ve got this growth of the sharing economy which includes all of these jobs that demand flexibility. So, we’ve seen a real erosion of the work, non-work division for a number of years already. But I think that this data analytics and data gathering is really hastening the erosion of that divide. 

Now you’ve got the COVID-19 pandemic and millions of people working from home. And I think the process has been further accelerated with the rise of Zoom and employers demanding that all of their conference calls be on Zoom—which is kind of unnecessary, but it’s happening, right? A lot of employers are now seeing inside your home, wherever you’re sitting, or maybe you’re moving around the house while you’re on the phone or on the Zoom call. [00:15:00] Maybe they’re even seeing the other people in your home; they know how many cats you have. They are now seeing that on a regular basis, right? 

And the second thing is, with everyone working from home there is increased concern about efficiency—that we’re not all being as efficient as we were before. And by the way, those of us who have children have kids coming in and out of the office every ten seconds, so that’s a real concern, I get it. But employers are turning to more monitoring technologies because of that concern about efficiency, so they can track your every digital move. There’s a New York Times article where a reporter actually downloaded monitoring software and gave his boss the rights to view him. He felt totally creeped out by it, but he did it anyway, and the boss could see everything the reporter was doing. So, before, maybe you had an office computer [00:16:00] and a home laptop. Now, your home computer is your computer, and your employer has installed digital tracking software on it, so they can see not only how long you spent on work projects, but how long you spent on every other website you went to, and which other websites those were, and what’s in your FreshDirect or Instacart cart, right? Or what your message to your son’s teacher says, or what your messages to your friends say. So, it’s just exploding with possibility, and the line between what is work and what’s not work is being torn apart. 
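To give a sense of what even crude tracking software can infer, here is a hypothetical sketch that takes a log of timestamped page visits—the kind of record such tools collect—and totals the time spent per site. The log, the sites and the format are all invented for illustration.

```python
# Hypothetical sketch: summarizing time per site from a visit log, the
# kind of inference basic monitoring software makes. Data is invented.
from collections import defaultdict
from datetime import datetime
from urllib.parse import urlparse

# (timestamp, url) events as a tracker might record them; each visit
# is assumed to last until the next one begins (the last is skipped).
visits = [
    ("2020-06-08 09:00", "https://mail.example-corp.com/inbox"),
    ("2020-06-08 09:40", "https://www.instacart.com/store"),
    ("2020-06-08 09:55", "https://docs.example-corp.com/report"),
    ("2020-06-08 11:30", "https://twitter.com/home"),
    ("2020-06-08 11:45", "https://docs.example-corp.com/report"),
]

time_per_site = defaultdict(float)
for (start, url), (end, _) in zip(visits, visits[1:]):
    minutes = (datetime.fromisoformat(end)
               - datetime.fromisoformat(start)).total_seconds() / 60
    time_per_site[urlparse(url).netloc] += minutes

# An employer sees, at a glance, work sites next to grocery shopping
# and social media.
for site, minutes in sorted(time_per_site.items(), key=lambda kv: -kv[1]):
    print(f"{site:30s} {minutes:5.0f} min")
```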

Host: Wow, it’s mind-blowing and really eye-opening. So is it possible to know if an employer has installed tracking software on our computers?

Leora Eisenstadt: Well, for sure they are going to make you sign something that allows them to do it—that’s a given. They are cognizant of privacy concerns. Now, that being said, how many employees notice [00:17:00] that they’re signing away their personal data, that they are giving permission to their employer to track keystrokes? I suspect that many employees would just check that box without really reading the fine print. 

And second, as more and more companies start to do this, in the coming job market—and maybe we are already in it—the ability to refuse to sign or to check that box is getting slimmer and slimmer, because if you know that the next job you’re going to apply to is going to have the same requirements, then you are going to say, “Forget it, I’m just going to sign this away.” 

Host: And so that might be presented to the employee as, “We have a new privacy policy,” or, you know, a new data policy—“We are handling data in a different way and we need you all to opt into this”—or something along those lines? How are companies presenting this policy to their employees? 

Leora Eisenstadt: I suspect that it’s all over the place. So [00:18:00] some employees are signing this as part of the package of documents that they sign when they come on, right? You sign your insurance documents, give them your Social Security number, provide your bank account for direct deposit, and you’re also signing some type of tech privacy policy. So, it’s going to be in with a bunch of other things and you’re not going to pay much attention to it. If they are giving it to existing employees and they are changing things, maybe they are doing an update to their employee handbook, and this is going to be a new page that gets inserted. I guarantee most employees have not read their employee handbooks cover to cover with the kind of care that they should. You know, it’s that same kind of provision that gets thrown in. They are going to add an arbitration clause to their employee handbook, they are going to add a tech policy, they are going to add a privacy policy—and it’s going to have the title “Privacy Policy” to make you feel like, “Oh, this is how they are protecting my privacy,” and it probably doesn’t. [00:19:00] 

Host: So what steps should we be taking to ensure our privacy?

Leora Eisenstadt: Read everything before you sign it. That recommendation from lawyers worldwide usually falls on deaf ears, right? We all say that we do this. In my classes, I teach my students about arbitration clauses, and I ask them every year, “How many of you have signed an arbitration clause?” Virtually nobody raises their hand. Then I say, “How many of you have cellphones?” And they all raise their hands. Then I tell them that they have all signed arbitration clauses, because virtually every cellphone contract has an arbitration clause, right? But how many of us have read our cellphone contracts cover to cover? No one! So read everything. But I don’t know how often people actually take that advice. That is certainly a way to think about this—the privacy angle. It’s not really the angle [00:20:00] I approach this from, because I don’t think there is much hope of changing where we’re going from the privacy angle. 

Host: Are employers at all worried that this type of collection and analysis will lead to a lack of trust among their employees?

Leora Eisenstadt: That is certainly a concern of mine. I haven’t seen it be a concern of employers yet, and that’s one of the alarm bells that I am raising, right? In my research, I am actually talking to employers and saying, “Look, you should be concerned about the impact this erosion of the work, non-work divide is going to have on the employer-employee relationship. You shouldn’t be casual about this, and you shouldn’t be doing it just because you can.” I think there are a bunch of possible negative implications of employers using this kind of data. 

Host: Can you talk to us a little bit about those implications?

Leora Eisenstadt: Sure, I think they fall into four basic categories [00:21:00]. 

The first potential consequence of the erosion of the work, non-work divide will be an impact, I think, on companies’ legal liability for the actions of their workers. There are doctrines in the law that protect employers from liability for everything their employees do in their personal lives. One of these is the scope of employment doctrine, which basically says that employers are only liable for the things their employees do when they are working—when they are in the scope of employment. Well, once you create a grey area, once you erode the divide between what is work and what is not work, that scope of employment grows. And it’s possible that employers could be liable for many more things their employees are doing in their “personal time.” So, if you are now using personal data from employees about everything [00:22:00] they do, are employers then going to be held liable for all of those actions that can potentially harm other people? If employers are using facial scans to determine employees’ emotional states when they come to the office, and someone looks angry and aggressive and potentially violent from their facial scan in the morning, is the employer going to be held liable if that employee then has some kind of physical altercation with one of his coworkers and hurts someone, right? They knew—should they have done something about it? So, the more employers know, the more data they gather, the more they may be on the hook later for the negative consequences of their employees’ actions. 

The second is, I think there will likely be a decrease in workers’ creativity and their efficiency long term when they lose the sense of personal autonomy. When your employer sees everything you do, eat, think, believe and read, [00:23:00] all of a sudden it does seem Big Brother-ish, right? And it also feels like, “Well, everything I do is judged by how I will succeed at work, so maybe I should alter my personal behavior so that I get ahead at work.” Long term, that is going to lead to a sense of lost autonomy, and studies have shown that workers who don’t have that sense of autonomy lose creativity. They lose inspiration. 

Third, burnout and stress levels. This is going to impact work-life balance, or any ability to have a separate life away from work and all of this monitoring. The studies that look at physical monitoring of employees, and even keystroke monitoring, show that employees develop much higher stress levels, which leads obviously to emotional [00:24:00] issues but also physical ones—pain in the upper neck and shoulder area, pain in their hands—that have real financial costs for employers, who are paying for insurance for all of these employees. So they really should care from that perspective. 

And last, the impact on employee retention and loyalty. If you feel like your employer is watching your every move, you’re going to start to think, “I don’t really want to be here.” As long as not everyone is doing this, you’re going to look for places to work that are not doing it. And that will invariably cost companies money when people start to leave. 

Host: It’s interesting that you say that last point, because I started to wonder, as you were explaining the implications, what is the ROI on using this data? I mean, it’s sort of the dream, right? Let’s just be data-driven; everything we do, we can automate or base on informed decisions drawn from even seemingly unconnected [00:25:00] data points. But at what cost, right? At the cost of talent, at the cost of reputation, at the cost of ruining your company’s culture. I think perhaps one of the big takeaways from our conversation today is that companies should think very strategically about how to use this, so they don’t end up with a small benefit that has a much larger negative impact on their workforce, which could affect their bottom line.

Leora Eisenstadt: I totally agree. It’s this shiny object they want to grab, right? Because it looks so attractive, and there are clearly short-term gains to be had from this. But if you don’t think about the long-term implications, then you are missing a huge part of the picture. 

Host: That’s great. So, any closing advice that you want to give to employers who are considering [00:26:00] using data in this way? 

Leora Eisenstadt: I think there are maybe three takeaways. First, for employees: look, employees have to be aware of what they are being asked to give up in terms of their personal data, and they have to be more aware of how it’s being used. Once you give up the rights to your personal data, you don’t know how employers might use it two years down the line, four years down the line. As this technology improves, as these algorithms improve, that data can be used in multiple different ways that they are not telling you about ahead of time, right? Because they don’t even know yet. Employees have to be more aware. 

Second, employers have to be aware of the negative consequences of using this information: the possible legal consequences, the cultural consequences in terms of your workplace culture, and eventually the financial consequences that stem from both those legal and cultural implications. [00:27:00] 

And really the third takeaway is, I’m just urging a slowdown. I don’t think we can say stop altogether, because technology is moving forward and people are excited about it. But we have to slow down and consider what this is doing to our society and to our workplaces. Don’t just adopt every new option because it exists. Think through the long-term implications. Maybe we want an opt-in approach or an opt-out approach, or maybe we want a mandate that employers tell you in big bold letters what they are doing, as opposed to burying it in the fine print. Maybe we want committees that have to evaluate how the data is used years after it was initially gathered. There are all kinds of possibilities, but we have to slow down and think about what we are doing rather than just act on the short-term gain. [00:28:00]

Host: Thank you, Leora, for joining me. Your research on blurring the line between personal and professional is fascinating and honestly, it is a little bit scary. But the important thing here is knowledge. Now we know to look closely at what kinds of information we’re willing to give our employers and what they will be able to do with it. And employers know to be cautious of what they are doing with our data. It might seem attractive to track everything but companies should weigh the risks carefully. Are you willing to collect personal data at the risk of losing the trust of your employees? 

Catalyst is a podcast from Temple University’s Fox School of Business. Visit us on the web at fox.temple.edu/catalyst. [00:29:00] We are produced by Eva Terra, Megan Alt, Anna Batt and Stephen Orbanek, with help from Karen Naylor. Special thanks to Joe Williams at Temple University’s Tech Center.