Happy National Cybersecurity Awareness Month! Now in its 17th year, National Cybersecurity Awareness Month (NCSAM) continues to raise awareness about the importance of cybersecurity across the nation. In an age where we are notified of breaches of digital information almost every week, NCSAM offers an opportunity to keep educating Americans and corporations about the importance of their cybersecurity teams, their software, and securing their customers' information online. This year's NCSAM theme is "Do Your Part. #BeCyberSmart," and in support of that theme, Black Girls Hack is doing our part to highlight the impact of the lack of diversity in cybersecurity.
While cybersecurity has many diversity problems, none are more glaring than the underrepresentation of women and African Americans. In 2019, the Bureau of Labor Statistics surveyed employed persons by occupation, gender, race, and ethnicity. In that survey, African Americans held 7.6% of Information Security Analyst positions, and women held 17.1% of those roles. Similar statistics exist across the Professional and Related Occupations, including systems analysts, programmers, software developers, and network and system administrators, to name a few. The lack of diversity in Science, Technology, Engineering, and Mathematics (STEM) roles is a direct reflection of the lack of diversity in STEM undergraduate and graduate programs, which in turn reflects the STEM programs in high school, middle school, and elementary school. *Insert infinity mirror*
Beyond the loss of representation and role models, the lack of diversity in cybersecurity has unintended side effects, such as introducing bias into artificial intelligence, signature analysis and definition, and the systems themselves. Malicious actors are creative and diverse in their ways of thinking, and to stay ahead of the game, cybersecurity professionals must reflect that same range of thinking, and society as a whole.
Organizations are using artificial intelligence to do everything from deciding what we watch next, to driving cars, to interviewing candidates and picking the best one, to informing criminal justice decisions. Analysis has shown that the over-representation of men in the design of artificial intelligence leads to both cultural and gender bias in the resulting systems. Machine learning, which is how systems gain their "intelligence," is built from the data it is provided; if that data, or the design and development of the algorithms, is biased, the resulting application of the technology will perpetuate that bias (Leavy, 2018).
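To make that feedback loop concrete, here is a minimal, self-contained sketch in Python. It is not from any of the cited studies; the data, group labels, and "hiring" scenario are entirely hypothetical. It shows how a classifier trained on biased historical labels reproduces that bias even when the underlying skill is identically distributed across groups:

```python
# Synthetic sketch (hypothetical data, not from the cited studies): a model
# trained on biased historical hiring labels learns to reproduce that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute: 0 = majority group, 1 = minority group (illustrative).
group = rng.integers(0, 2, size=n)

# A genuinely job-relevant skill score, identically distributed in both groups.
skill = rng.normal(0.0, 1.0, size=n)

# Biased historical labels: past hiring favored the majority group
# independently of skill (the +1.0 term encodes that historical bias).
logits = 1.5 * skill + 1.0 * (group == 0)
hired = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

# Train on the biased labels, with group membership available as a feature.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# At the *same* skill level, the model recommends majority-group candidates
# more often: the bias in the training data is perpetuated, not corrected.
for g in (0, 1):
    p = model.predict_proba([[0.0, g]])[0, 1]
    print(f"group={g}: P(recommend | skill=0) = {p:.2f}")
```

Nothing in the training step is malicious; the model simply learns the historical preference encoded in its labels, which is exactly the mechanism described above.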
More advanced intrusion detection systems (IDS), for example, are built on artificial neural networks to help detect attacks. These neural-network-based IDS analyze large volumes of data, use that data to help predict attacks, and learn from their mistakes (Garzia, Lombardi, & Ramalingam, 2017). A recent study of facial analysis software found a 0.8% error rate for light-skinned men and a 34.7% error rate for dark-skinned women (Hardesty, 2018). The three commercially released facial analysis programs reviewed, all from major technology companies, showed both skin-color/skin-type and gender-related biases. What that means for us, as consumers of these systems, is that systems which have learned how to respond from the data they were given will have difficulty identifying the way women make decisions, differentiating Black faces in video footage, and determining whether a Black woman is a good fit for a job when they cannot accurately interpret her facial expressions. Some companies are already replacing first-round interviews with AI-assisted technology: applicants use a webcam to record video responses to interview questions, and employers then use AI to "review" the interviews, evaluating whether the candidate matches in demeanor, enthusiasm, facial expressions, or word choice (Burke, 2019). Based on that evaluation, the candidate is recommended (or not) for the next round of interviews. When AI cannot properly analyze darker skin or gender-based differences, and is built from data and by developers with inherent biases, it serves to both eliminate diverse applicants from the hiring process and reduce the number of diverse employees within companies.
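Headline numbers like 0.8% versus 34.7% come from breaking the error rate out by group instead of reporting a single average. A short sketch of that kind of audit, using made-up predictions (the data below is illustrative, not the study's):

```python
# Illustrative audit sketch: the numbers below are made up, not the study's data.
import numpy as np

def error_rate_by_group(y_true, y_pred, groups):
    """Return the misclassification rate separately for each group label."""
    return {
        g: float(np.mean(y_true[groups == g] != y_pred[groups == g]))
        for g in np.unique(groups)
    }

# Hypothetical predictions from some classifier, with each example's group.
y_true = np.array([1, 1, 0, 0, 1, 1, 0, 0])
y_pred = np.array([1, 1, 0, 0, 0, 1, 1, 0])  # two mistakes, both in group B
groups = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

print(error_rate_by_group(y_true, y_pred, groups))
# {'A': 0.0, 'B': 0.5} -- a 75% overall accuracy hides the disparity.
```

The design point is that a single aggregate accuracy score can look excellent while one group bears nearly all of the errors, which is why disaggregated evaluation matters.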
So why isn't this being shouted from the mountaintops? Because research has shown that the people who address gender and racial bias in artificial intelligence and software are most often those affected by the bias (Leavy, 2018). Susan Leavy, in her paper on gender bias in artificial intelligence, argues that because women recognize the bias, they are more likely to understand its impact and attempt to resolve it (Leavy, 2018). The problem? While women represent 47% of the occupational workforce, they represent only 27% of Chief Executives, 28% of Computer and Information Systems Managers, 20% of computer programmers, 18% of software developers, and 17% of information security analysts. African Americans fare far worse, representing 4% of Chief Executives, 9.6% of Computer and Information Systems Managers, 8.5% of computer programmers, 5.8% of software developers, and 16.6% of information security analysts (BLS.gov, 2020).
Cybersecurity has a diversity problem, and until the gender and minority discrepancies in hiring, education, and access to resources are resolved, America and its citizens will be worse off in every aspect of the industry.
BLS.gov. (2020, January). Labor Force Statistics from the Current Population Survey. Retrieved from BLS.gov: https://www.bls.gov/cps/cpsaat11.htm
Burke, L. (2019, November 4). Your Interview With AI. Retrieved from insidehighered.com: https://www.insidehighered.com/news/2019/11/04/ai-assessed-job-interviewing-grows-colleges-try-prepare-students
CISA.gov. (2020, October). National Cybersecurity Awareness Month. Retrieved from CISA.gov: https://www.cisa.gov/national-cyber-security-awareness-month
Garzia, F., Lombardi, M., & Ramalingam, S. (2017). An integrated internet of everything — Genetic algorithms controller — Artificial neural networks framework for security/safety systems management and support. International Carnahan Conference on Security Technology (ICCST).
Hardesty, L. (2018, February 11). Study finds gender and skin-type bias in commercial artificial-intelligence systems. Retrieved from MIT News: https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212
Leavy, S. (2018, May 28). Gender Bias in Artificial Intelligence: The Need for Diversity and Gender Theory in Machine Learning. Retrieved from https://ame-association.fr/wp-content/uploads/2018/11/17.188_gender_bias_in_artifical_intelligence_the_need_for_diversity_and_gender_theory_in_machine_learning.pdf