Former US Department of Labor Investigator Shares Résumé Tips
This as-told-to essay is based on a conversation with Rod Samra, a former US Department of Labor investigator of more than two decades, who lives in Florida. His identity has been verified. This story has been edited for length and clarity.
Many employers use AI-powered applicant-tracking systems to sort through résumés and identify job candidates. I've audited hundreds of these systems over the course of my career. There is often no human intervention, and that's a problem.
AI is a double-edged sword. It can reduce bias by standardizing the résumé-review process, but it can also amplify biases if algorithms are poorly designed or tested. Someone needs to step in and look at the data to make sure protected groups aren't experiencing an adverse impact. But that doesn't always happen.
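To see what looking at the data can mean in practice, here is a minimal sketch in Python of the EEOC's "four-fifths rule," a common adverse-impact test; the group names and counts are hypothetical, not data from any real system.

```python
# Minimal sketch of the EEOC's "four-fifths rule": each group's
# selection rate should be at least 80% of the highest group's rate.
# Group names and counts below are hypothetical.

def selection_rate(selected, applicants):
    return selected / applicants if applicants else 0.0

# Hypothetical screening outcomes from an applicant-tracking system.
outcomes = {
    "group_a": {"applicants": 200, "selected": 60},
    "group_b": {"applicants": 180, "selected": 27},
}

rates = {group: selection_rate(o["selected"], o["applicants"])
         for group, o in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    flag = "potential adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")
```

Here group_b's selection rate is half of group_a's, well under the 80% threshold, which is exactly the kind of pattern nobody catches when no human reviews the outcomes.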
These biases are often subtle. For instance, a system may favor male applicants over female ones. Even when résumés don't state gender directly, the system can infer it from details such as membership in a fraternity or sorority.
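As an illustration of how that inference can happen, here is a hypothetical sketch; the cue words and labels are assumptions for the example, not any vendor's actual logic.

```python
# Minimal sketch of proxy inference: even without a gender field,
# simple keyword cues can leak gender into a screening feature.
# The cue list and labels are illustrative assumptions.

PROXY_CUES = {
    "sorority": "likely_female",
    "fraternity": "likely_male",
}

def inferred_gender(resume_text: str) -> str:
    text = resume_text.lower()
    for cue, label in PROXY_CUES.items():
        if cue in text:
            return label
    return "unknown"

print(inferred_gender("Treasurer, Alpha Phi sorority, 2019-2021"))
# -> likely_female: the résumé never states gender, but a model
# built on this text would still encode it.
```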
The high-profile, ongoing legal case Mobley v. Workday alleges this kind of bias.
Overly specific language and filtering
Another problem is that applicant-tracking systems can look for language that is overly specific. A job ad may say "leadership skills" are required, and the system may be set to find those exact words only, excluding candidates whose résumés say things like, "I've led teams" or "I've held many leadership positions." If you don't have the right terminology, the system can weed you out.
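A minimal sketch of that kind of rigid matching, using hypothetical résumé snippets; real systems vary in how they match, but the failure mode is the same.

```python
# Minimal sketch of rigid exact-phrase keyword matching.
# A filter that only accepts the literal phrase "leadership skills"
# rejects résumés that express the same experience differently.

REQUIRED_PHRASE = "leadership skills"  # taken verbatim from the job ad

resumes = {
    "candidate_1": "Proven leadership skills across three product teams.",
    "candidate_2": "I've led teams and held many leadership positions.",
}

for name, text in resumes.items():
    matched = REQUIRED_PHRASE in text.lower()
    print(f"{name}: {'passes' if matched else 'rejected'}")
# candidate_2 is rejected despite clearly having led teams.
```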
Exclusionary filters, which reject applicants based on information such as zip codes and graduation years, can disproportionately impact certain groups. Other filters penalize applicants for having nontraditional career paths and credentials.
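Here is a hypothetical sketch of how a graduation-year cutoff can quietly act as an age filter; the cutoff value is an assumption chosen for illustration.

```python
# Minimal sketch of an exclusionary filter: a hypothetical cutoff on
# graduation year effectively screens by age, a protected trait.

from datetime import date

MAX_YEARS_SINCE_GRADUATION = 15  # hypothetical cutoff, acts as an age proxy

def passes_filter(grad_year: int) -> bool:
    return date.today().year - grad_year <= MAX_YEARS_SINCE_GRADUATION

for year in (2015, 1998):
    print(year, "->", "passes" if passes_filter(year) else "rejected")
# The 1998 graduate is rejected regardless of qualifications,
# which disproportionately affects older applicants.
```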
Employers aren't necessarily aware this is happening due to the lack of human intervention. It's like having a security camera that records what's going on, but nobody's looking at the footage.
A quick rejection
Many job seekers also don't realize they're getting rejected by machines. But there are some signs you can look out for that signal a tracking system is biased or has rigid keyword matching.
One is that you receive an automated rejection within minutes or hours of applying, even when your qualifications clearly match the job description. Another is when you're told your résumé "couldn't be sorted" or "didn't meet minimum criteria" without an explanation.
Screening questions can also serve as proxies for protected traits when they're about unnecessary personal details, such as an applicant's exact birth date or graduation year. This allows bias to creep in under the guise of "fit" or "eligibility."
Vague feedback
The same goes for video- or game-based assessments with no transparency. You're asked to complete AI-scored tests, but the employer won't explain what's being measured or how scores are calculated. Research shows these tactics can lead to bias through facial recognition, speech analysis, or cultural references, which can disadvantage candidates with disabilities, neurodivergence, or nonnative accents.
A lack of feedback can also be an indicator of automation bias. When you ask why you were rejected, you get vague or generic responses like "You were not the right fit," with no specifics. Ethical AI hiring practices require at least some transparency about evaluation criteria.
Getting your résumé past the ATS
To increase the odds of getting your résumé past an applicant-tracking system and into a hiring manager's hands, mirror the job description. Use the employer's exact words and phrasing.
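If you want to check your own résumé against an ad's wording before applying, here is a minimal sketch; the phrase list and résumé text are hypothetical, and real systems may weight or match terms differently.

```python
# Minimal sketch of checking a résumé against a job ad's exact
# phrasing. The key phrases below are hypothetical examples.

KEY_PHRASES = ["leadership skills", "project management", "sql"]

resume = """Led cross-functional teams; strong project management
background with daily SQL reporting."""

text = resume.lower()
for phrase in KEY_PHRASES:
    status = "found" if phrase in text else "MISSING -- consider adding"
    print(f"{phrase!r}: {status}")
# "leadership skills" comes up missing even though the résumé
# describes leading teams, so mirroring the ad's wording helps.
```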
Meanwhile, know your rights and keep records of your applications, rejections, and any demographic patterns you notice. If you believe you've been discriminated against, you can file a complaint with the EEOC or with a state that requires employers to disclose the use of AI-based tools, such as Illinois and New York.