Solution Brief

How to talk about Gem's diversity recruiting insights with your legal team

Your organization cares deeply about diversity, which is the reason you’re reading this... so we’re off to a great start. But regardless of what your org’s diversity goals look like, you’re only going to hire underrepresented talent if you can first get them into the top of the funnel—the stage of the hiring process in which you have virtually no demographic information about them. So how do you know if you have the representation you want at the very top of the funnel, where it matters most?

Gem offers Diversity Recruiting Insights that allow talent teams to track how their work affects the gender and race/ethnicity makeup of their hiring funnels, from first outreach through hire. With Outreach Stats, teams can see how successful their messaging efforts are for different demographics: what percentage of passive talent in a particular group opened, replied, indicated interest, and converted to process based on their outreach? Once candidates are in process, we analyze conversion rates through the funnel. Are certain demographic groups disproportionately dropping out of process at certain stages? Where might unconscious bias be impacting your ability to hire underrepresented talent? These are the questions Gem’s diversity data can help you answer… but your legal team is likely to have questions of their own. Here are some answers you can give them:

How does Gem get its demographic data?

We use an algorithm to predict gender and race/ethnicity based on three data points: first name, last name, and location. The model we use is trained and tested on large sets of publicly self-identified data (from the U.S. Census Bureau as well as a number of global datasets). That self-disclosure is precisely why we use this model, as opposed to a machine-learning model in which Gem interprets the data itself: self-disclosed data will always be the most accurate and reliable demographic data. And because ours is not a continuous-learning model, no additional bias is introduced into the categorization process: we work with historical datasets in which individuals have already self-disclosed.
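To make the approach concrete, here is a minimal sketch of how a lookup-based prediction of this kind might work. The table contents, probabilities, and function names below are illustrative assumptions, not Gem’s actual model or training data:

```python
# A minimal sketch of name-and-location-based prediction, assuming simple
# frequency tables derived from publicly self-identified data (e.g., U.S.
# Census Bureau surname tables). All values here are hypothetical.

SURNAME_TABLE = {
    # surname -> {race/ethnicity category: probability}
    "garcia": {"Hispanic/Latinx": 0.92, "White": 0.05, "undetermined": 0.03},
}
FIRST_NAME_TABLE = {
    # first name -> {gender category: probability}
    "maria": {"women": 0.96, "men": 0.02, "unknown": 0.02},
}

def predict_demographics(first_name: str, last_name: str, location: str) -> dict:
    """Return the most likely gender and race/ethnicity categories.

    Names missing from the reference tables fall back to "unknown" /
    "undetermined" rather than guessing. Location could be used to
    re-weight the priors (regional name distributions); that step is
    omitted here for brevity.
    """
    gender = FIRST_NAME_TABLE.get(first_name.lower(), {"unknown": 1.0})
    ethnicity = SURNAME_TABLE.get(last_name.lower(), {"undetermined": 1.0})
    return {
        "gender": max(gender, key=gender.get),
        "race_ethnicity": max(ethnicity, key=ethnicity.get),
    }
```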

That said, teams can import self-ID data directly from their ATS into Gem through our integration. This means that once a candidate self-identifies through an EEO form, you can overwrite Gem’s prediction, making your pipeline data even more accurate.
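Conceptually, the merge might look like the following sketch, where any self-identified field from the ATS takes precedence over the prediction (the function and field names are hypothetical):

```python
def resolve_demographics(predicted: dict, self_id: dict | None) -> dict:
    """Prefer self-identified EEO data over the model's prediction.

    `predicted` is the algorithm's output; `self_id` is whatever the
    candidate self-reported via the ATS integration (None if nothing has
    been reported yet). Any self-reported field overwrites the prediction.
    """
    if not self_id:
        return predicted
    return {**predicted, **{k: v for k, v in self_id.items() if v}}

# Example: the EEO form answer replaces the prediction for that field only.
merged = resolve_demographics(
    {"gender": "women", "race_ethnicity": "undetermined"},
    {"race_ethnicity": "Hispanic/Latinx"},
)
# -> {"gender": "women", "race_ethnicity": "Hispanic/Latinx"}
```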


How accurate is Gem’s demographic data?

You’re never going to get perfect accuracy when it comes to demographic data, but we come pretty close. Our gender data is 90%+ accurate, and our race/ethnicity data is 75-95% accurate. Combined, you can expect 80-90% accuracy, which is enough to inform your overall DEI strategy.

What demographic groups does Gem predict?

Gender diversity in Gem is broken down into men, women, non-binary, and unknown. Race/ethnicity diversity is broken down into White, Black, Hispanic/Latinx, Asian, and undetermined. We also offer custom fields so teams can create their own categories, depending on what their diversity initiatives look like.

Why is Gem’s race/ethnicity data only offered in aggregate?

Gem doesn’t provide race/ethnicity predictions for individuals: we’re not telling you “here are the Asian candidates in your pipeline” or “here’s the Black talent that is receiving your outreach.” Access to individual data doesn’t necessarily invite bias, but it’s critical that teams handle such data sensitively and only for legitimate business purposes.

So for now, we only show you race/ethnicity in aggregate, within groups of 5 or more (if the sample size is too small, you won’t have access to data for that group). Only showing aggregate data accounts for variance—but more importantly, it fulfills our objective: to give teams a sense of the “directional correctness” of their efforts.
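One way this kind of small-group suppression might be implemented is sketched below; the threshold of 5 comes from the brief, while the record shape and function name are illustrative assumptions:

```python
from collections import Counter

MIN_GROUP_SIZE = 5  # per the brief: groups of fewer than 5 are not shown

def aggregate_race_ethnicity(candidates: list[dict]) -> dict[str, int]:
    """Report race/ethnicity only in aggregate, suppressing small groups.

    `candidates` is a list of records with a "race_ethnicity" field; any
    group below MIN_GROUP_SIZE is dropped from the report entirely, so no
    individual-level inference is possible from the output.
    """
    counts = Counter(c["race_ethnicity"] for c in candidates)
    return {group: n for group, n in counts.items() if n >= MIN_GROUP_SIZE}
```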

By “directional correctness,” we mean this: if your organization’s goal is gender parity but Gem shows that only about 20% of your outreach is going to women, it’s time to course-correct at the top of the funnel. If only about 5% of Black talent is passing your phone screen while about 25% of White talent is passing, that’s an indicator that there’s bias either in your screening questions or in your screener. Think of Diversity Recruiting Insights as providing the likely or general makeup of the recipients of your outreach sequences and your subsequent pipeline. How well are you doing at prioritizing diversity overall? How likely (or unlikely) is it that your hiring process contains some bias that, if remedied, would diversify your team?
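As a worked example of that directional signal, the sketch below computes per-group pass rates for a single stage using the hypothetical phone-screen numbers from the paragraph above; a large gap between groups is the cue to investigate:

```python
def stage_pass_rates(stage: dict[str, dict[str, int]]) -> dict[str, float]:
    """Compute per-group pass rates for one funnel stage.

    `stage` maps group -> {"entered": n, "passed": m}. Groups below the
    aggregate-reporting threshold are skipped, mirroring the suppression
    rule described above. All numbers here are hypothetical.
    """
    return {
        group: counts["passed"] / counts["entered"]
        for group, counts in stage.items()
        if counts["entered"] >= 5
    }

# The phone-screen example from the text: a 5% vs. 25% pass-rate gap is
# the directional signal that the screen deserves a closer look.
rates = stage_pass_rates({
    "Black": {"entered": 40, "passed": 2},    # 0.05
    "White": {"entered": 120, "passed": 30},  # 0.25
})
```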

Is this kind of reporting even legal?

According to the U.S. government, yes. It’s worth noting that demographic data collection isn’t a bad thing. (Think of affirmative action programs, which encourage companies to target diverse candidates.) In fact, many employers are legally required by the Department of Labor (DOL) and its Office of Federal Contract Compliance Programs (OFCCP) to collect demographic data on 100% of their employees, as well as on applicants where possible. The OFCCP language is unambiguous: contractors must identify “the gender, race, and ethnicity of each employee” and “where possible, the gender, race, and ethnicity of each applicant” (41 CFR 60-1.12(c)). This is the case even if candidates and employees will not volunteer their own data.

You see where the dilemma lies: employers must account for the race and gender of both their workforce and their potential workforce, but they can’t require talent to provide that data themselves. Neither the DOL nor the OFCCP has mandated a specific method of demographic data collection, only that it be done. The preferred approach, of course, is to allow candidates and employees to self-identify; but the government wants that information even if they decline to. The DOL states that “if self-identification is not feasible, post-employment records or visual observation may be used to obtain this information.” But guessing a candidate’s or employee’s race or gender (“visual observation”) is not only unreliable; it could easily become the basis for a discrimination lawsuit, especially if you guess incorrectly. So employers and contractors are better off reviewing other records and data sources to provide more reliable information to the government.

This is where Gem can be invaluable: when an employer reviews additional documentary information, the result qualifies as documentation rather than a guess. Our algorithm, which compiles various documented sets of self-identified data, helps recruiting teams meet both DOL and OFCCP requirements.

Can we control who on our team has access to this data?

Absolutely. Gem provides controlled, customizable access to EEO stats, so the data is only available to approved custodians. Allowing the appropriate individuals in your org to access diversity data facilitates compliant hiring goal-setting, while these active controls mitigate risk. They also let hiring organizations structure conversations about the data in a way that protects attorney-client privilege, and they give legal teams proactive visibility into how the data is being used and who is using it.
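In spirit, the control is a permission gate like the simplified sketch below; in the product this is admin configuration rather than code, and the custodian list here is hypothetical:

```python
# Hypothetical custodian list; in Gem this is configured by admins, not coded.
APPROVED_CUSTODIANS = {"dei-lead@yourco.com", "counsel@yourco.com"}

def can_view_eeo_stats(user_email: str) -> bool:
    """Gate EEO stats behind an approved-custodian list, so diversity data
    is visible only to the users your org has explicitly designated."""
    return user_email in APPROVED_CUSTODIANS
```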


To learn more about how Gem can help your team, contact us here.


Get started today

See how Gem can help you hire with remarkable speed and efficiency