GitHub pull requests affected by gender bias, new study suggests

Evidence that gender bias exists in the field of computer science has emerged in the form of a new study examining acceptance rates of contributions from men and women in an open-source software community. The study's findings indicated that women's contributions were rejected more often, but only when their gender was identifiable. When it was unclear whether a contributor was a man or a woman, women's contributions tended to be accepted at a higher rate than men's.

“There are a number of questions and concerns related to gender bias in computer programming, but this project was focused on one specific research question: To what extent does gender bias exist when pull requests are judged on GitHub?” Emerson Murphy-Hill, corresponding author of a paper on the study and an associate professor of computer science at North Carolina State University, told phys.org.

To conduct the research, Murphy-Hill and his colleagues analyzed more than three million pull requests (proposed changes to a project's code, submitted for review) from around 330,000 GitHub users, about 21,000 of whom were women. Within this group, 78.7 percent of women's pull requests were accepted, compared with 74.6 percent of men's.
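For readers curious about the mechanics, here is a minimal sketch, in Python, of how an acceptance rate like this can be computed from pull request records. The field names ("gender", "accepted") are hypothetical, chosen for illustration; the study's actual dataset and methods are described in the PeerJ paper.

```python
from collections import defaultdict

def acceptance_rates(pull_requests):
    """Compute per-group pull request acceptance rates.

    pull_requests: iterable of dicts like
        {"gender": "woman", "accepted": True}
    Returns a dict mapping each group to the fraction of its PRs accepted.
    """
    accepted = defaultdict(int)
    total = defaultdict(int)
    for pr in pull_requests:
        total[pr["gender"]] += 1
        if pr["accepted"]:
            accepted[pr["gender"]] += 1
    return {group: accepted[group] / total[group] for group in total}

# Toy example (not the study's data):
prs = [
    {"gender": "woman", "accepted": True},
    {"gender": "woman", "accepted": True},
    {"gender": "man", "accepted": True},
    {"gender": "man", "accepted": False},
]
print(acceptance_rates(prs))  # {'woman': 1.0, 'man': 0.5}
```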

The picture grew more complicated when the researchers narrowed their focus to pull requests from individuals who were not considered "insiders" on a project, where whether a contributor's gender was identifiable appeared to play a role in acceptance. Computer scientists who were easily identifiable as women (from their name or profile picture) had a lower pull request acceptance rate (58 percent) than identifiably male users (61 percent). Curiously, female programmers with gender-neutral profiles had the highest acceptance rate of all in this group (70 percent), higher even than men with gender-neutral profiles (65 percent).
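A sketch of that outsider breakdown, continuing the hypothetical schema above with added "insider" and "gender_identifiable" fields (again illustrative, not the study's actual code):

```python
def outsider_acceptance_rates(pull_requests):
    """Acceptance rates for non-insider PRs, split by gender and by
    whether the contributor's gender is identifiable from their profile."""
    rates = {}
    outsiders = [pr for pr in pull_requests if not pr["insider"]]
    for gender in ("woman", "man"):
        for identifiable in (True, False):
            group = [
                pr for pr in outsiders
                if pr["gender"] == gender
                and pr["gender_identifiable"] == identifiable
            ]
            if group:
                label = "identifiable" if identifiable else "neutral"
                rates[(gender, label)] = (
                    sum(pr["accepted"] for pr in group) / len(group)
                )
    return rates
```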

“Our results indicate that gender bias does exist in open-source programming,” Murphy-Hill said. “The study also tells us that, in general, women on GitHub are strong programmers. We don’t think that’s because gender affects one’s programming skills, but likely stems from strong self-selection among women who submit pull requests on the site.”

You can check out the full results of the study in the open-access journal PeerJ Computer Science, where the research is published under the title, “Gender Differences and Bias in Open Source: Pull Request Acceptance of Women Versus Men.”

Lulu Chang
Former Digital Trends Contributor