Other parts of this series:
- Workforce Data: gaining value by building trust
- Giving to Get: building trust by incentivizing FS workplace data collection
- Sharing the Value: co-owning FS workplace data with employees
- Share Responsibility: creating a system of checks and balances for FS workplace data
- Elevate People: Using FS workplace data to get the most from employees
With the right use of technology, companies have the opportunity to unlock valuable information that can benefit both employees and the bottom line. But how a company uses the data it collects is as important as what it collects. Those with access to the data must ensure that, once gathered, it is used to elevate, not punish, workers.
By reviewing data on people’s capabilities, such as the ability to learn, analyze or collaborate with team members, leaders can create far more accurate predictors of performance, identify hidden skills and match people to jobs they never imagined they could do.
However, while analyzing data to identify new opportunities for individual workers can be beneficial, such scrutiny has left many employees (59 percent) concerned that the same information could be used against them. In fact, 75 percent of executives share that fear.
Data-based decision making can easily overlook uniquely human factors, including individual attributes that may benefit a company. So, rather than adding undue stress for workers who fear being penalized on the basis of a data set, leaders should establish that the information is being shared to elevate employees and to help the company understand the challenges within its own systems. Employees may also want to share feedback on their work to help identify opportunities for better outcomes.
Our research has shown that companies that use new workplace data to help employees learn, grow or make their jobs easier can outperform those that use the data primarily to monitor and penalize individual employees.
The algorithms that process workplace data are written by people, and those people can be inadvertently biased. While much of the data collected is meant to be scrubbed of gender and race identifiers, biases can still persist. As Kriti Sharma wrote in the Harvard Business Review article "Can we keep our biases from creeping into our AI?", "not addressing the issues of bias and diversity could lead to a different kind of weaponized AI."
The good news: with the right design, technology provides ample opportunity to reduce bias. To do so, the teams designing the algorithms should be diverse, drawing on individuals of different backgrounds, ages, and genders.
One company taking a unique approach to reducing bias is Pymetrics, an AI startup. The Pymetrics team developed a tool called Audit AI, originally designed to root out biases in its own algorithms. It has since open-sourced Audit AI so that others, whatever their background, can audit the output of virtually any machine learning technique and identify potential biases.
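To make the idea of auditing a model's output concrete, here is a minimal sketch of one common check such tools perform: comparing selection rates across demographic groups against the "four-fifths rule" threshold. This is not Pymetrics' Audit AI itself; the group labels, decisions, and function names below are hypothetical, for illustration only.

```python
from collections import defaultdict

def selection_rates(groups, decisions):
    """Fraction of positive decisions (e.g. 'advance to interview') per group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, decision in zip(groups, decisions):
        totals[group] += 1
        positives[group] += int(decision)
    return {g: positives[g] / totals[g] for g in totals}

def four_fifths_ratio(groups, decisions):
    """Ratio of the lowest to the highest group selection rate.
    Values below 0.8 are a common red flag for adverse impact."""
    rates = selection_rates(groups, decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical audit: group labels and a model's yes/no decisions
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
decisions = [ 1,   1,   1,   0,   1,   0,   0,   0 ]

ratio = four_fifths_ratio(groups, decisions)
print(f"selection-rate ratio: {ratio:.2f}")  # 0.33, well below the 0.8 threshold
```

A real audit would add statistical significance tests and run across many protected attributes, but the core move is the same: measure the model's outputs by group and flag disparities before they harm anyone.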
Tools like this, along with meaningful approaches to building trust and consent, can help companies unlock the untold value of workplace data. It takes strong leadership, the right people at the table, and a commitment to privacy and security, but once established, trust can help elevate both employees and the organization itself.
If you enjoyed reading this series and would like to know more about unlocking the value of organizational DNA, I suggest downloading the full report, "Decoding Organizational DNA: Trust, Data and Unlocking Value in the Digital Workplace."