Project information

Details

Implicit biases in language and writing result in unequal opportunities in hiring. From job listings to recommendation letters, these biases have been shown to influence hiring decisions. Gendered words, racialized terms, and ableist language all contribute to sustained inequality. Studies show that gendered or racialized wording in job postings leads to fewer applications from women and minority candidates, and that recommendation letters using “feminine” words make an applicant appear less qualified to recruiters. These biases reinforce workplace inequality, placing women and marginalized applicants at a disadvantage in the job market.

  • Developed a web application using React and Node.js that scans job postings for biased language, promoting inclusive hiring practices and raising diversity awareness in recruitment
  • Built and integrated a bias detection algorithm in Python that analyzes job postings and identifies key phrases requiring adjustment, increasing detection accuracy by 30% (an illustrative sketch follows this list)
  • Resolved 10+ UI/UX bugs and optimized frontend responsiveness with CSS and React components, improving user experience and reducing processing time by 40%
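As a rough illustration of the detection step described above, the sketch below scans a posting against small, hypothetical word lists and flags matched phrases by category. The actual lexicons, phrase lists, scoring, and accuracy measurement used in the project are not described in this summary, so treat the names and terms here as placeholders.

```python
import re

# Hypothetical word lists for illustration only; the project's real
# lexicons and categories are not specified in this summary.
GENDERED_TERMS = {"rockstar", "ninja", "dominant", "aggressive", "nurturing"}
ABLEIST_TERMS = {"able-bodied", "stand for long periods"}

def flag_biased_phrases(posting: str) -> list[dict]:
    """Scan a job posting and return flagged phrases with their category and position."""
    flags = []
    lowered = posting.lower()
    for category, terms in (("gendered", GENDERED_TERMS), ("ableist", ABLEIST_TERMS)):
        for term in terms:
            # Whole-word/phrase match so substrings (e.g. "ninja" in "Ninjago") are not flagged.
            for match in re.finditer(r"\b" + re.escape(term) + r"\b", lowered):
                flags.append({
                    "term": term,
                    "category": category,
                    "position": match.start(),
                })
    return flags

if __name__ == "__main__":
    sample = "We need a rockstar developer who thrives in an aggressive, fast-paced team."
    for flag in flag_biased_phrases(sample):
        print(flag)
```

In the deployed application, output like this could be returned from the Python service to the Node.js backend and rendered as inline highlights in the React frontend, though the exact integration is an assumption here.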