After talking to women in the IT industry, it's clear that the reason they leave their previous companies is that they do not like the culture at work and do not feel they have enough support to change it.
Unfortunately, by enabling a male-centered culture, we are missing out on nearly half of the population's potential coding talent.
Many companies know that they should try to hire diverse employees, but the benefits of a diverse workforce might not be evident at first blush.
1. Diversity brings innovation
...and different perspectives on how to approach work.
Working in a cross-functional team with diverse backgrounds means working in a creative setup. Every person has a different idea of life. Every person has a different story to tell.
Diverse teams are more productive, more creative, and more innovative than teams whose members share a similar demographic background and gender. Promoting diversity not only across the organization as a whole but within departments and teams will boost the company’s productivity.
2. Provide global opportunities
For many companies, it’s not enough to conquer only the home marketplace; it’s about becoming a world leader in their industry. They might already do business with vendors or clients in numerous countries, but diversity in the workplace often means uniquely diverse skillsets that can be translated to a global stage.
Do employees speak other languages? Understand different cultures? Have friends in other countries? Know local marketplace conditions? These are exactly the advantages a diverse team brings.
3. Improve the company’s brand
Creating a diverse workforce can improve customers’ opinions of the company. Nowadays, we are not just employees but consumers, and we look for the same qualities in the companies we purchase from as in the companies we might work for. Diverse companies show consumers that they are inclusive, global-minded, innovative, and looking toward the future.
All in all, diversity inhibits groupthink. Diverse teams perform better. They make better decisions, they are more creative, and they produce better business results. And this extends beyond race and gender to include a diversity of education and work experience.
If you would like to know more about how gender diversity can improve a company's performance, you can learn about it in my newsletter (+ bonus professional networking cheat sheet).
Thank you for reading! 🙏
Don't hesitate to approach me on Twitter if you have any questions regarding this topic, or leave your comment below to discuss it further!
Cheers,
ilonacodes
Photo by rawpixel.com from Pexels
Top comments (22)
Wait... I've read about this issue before... And I still don't fully understand it... I mean, maybe it's because I'm a male (shame on me), but aren't companies supposed to focus on people's skills and qualifications instead of what they have between their legs or where they come from?? I mean, come on... It's the 21st freaking century already... We are about to have people on Mars... Or maybe I'm missing the point on why they are not diverse?...
I studied in a class of mostly males, and the few women I've known did choose technical paths. Is it because the dev culture is female-averse? (Damn it, who thought it was a good idea to only have guys around??!!)...
I'd love to know more about this issue. I still have questions.
What resources could you point me towards to get a solid, number-based (you know, studies and charts and percentages) understanding of this matter?
Thanks.
From what I gather, most of these inclusive policies improve the company's brand. Nowadays anybody can easily start smearing a company just because they think the workforce isn't diverse enough. I believe every person is unique and has his/her own way of doing things irrespective of their skin color or gender. Saying that only a group of people who look different can think differently is pretty absurd.
It's not all about talent. You might have the most talented people on your team and still miss key use cases that will have a huge impact on your project. In general, we all tend to think that users are going to be like us, but that's not always the case, and the results can be really negative: websites that are not accessible to people with disabilities, face-detection software that works great for white men but not so great for women or people of color, text that uses slang/terms with a negative connotation in some cultures, variables in units that are not the expected ones, etc. Could a more diverse team miss them too? Of course, but once you have experienced many of those scenarios first hand, you pay more attention to those "details."
Wait...
But wouldn't these details be covered in the requirements and problem assessment, even before the solution engineering starts?
Wouldn't these things be assessed before the algorithms and coding even begin? I mean...
If you're gonna develop a face-recognition algorithm for "people" (a generalist approach), you need to assess first who those people would be, and indeed you would have to test your solution against those people.
For example, not only on white males, but white females, black females, black males, etc. (all the people that correspond to your target population), and if the algorithm is wrong, you would have to fine-tune it or retrain your models.
I mean... You wouldn't develop an app without accessibility options if your target audience includes a lot of blind or deaf people.
But if the workforce doesn't include your target audience, or people representative of your target audience, you will have issues. Is that what you mean??
What I meant was that I would like to have more resources pointing me toward the conclusion that an inclusive workforce is directly related to inclusive solutions, and that inclusive design and development can only be produced by a diverse workforce.
I mean... Is the deciding factor an inclusive workforce, or an inclusive approach to the solution, which would not necessarily depend on a more diverse team but on a wider, more sensible analysis of the solution and the target market?
Maybe it's the methods, not the people. That's why I would love to know more, backed by some studies or so...
Yes, this! We like to believe we can cover every scenario, but in reality, it's harder to remember to cover scenarios we don't personally relate to or have experience with. In theory, these things /should/ be assessed before the solution starts, but in reality, they often aren't. Take the study I'm linking below, for example: it looked at tech shipped by really large companies like Microsoft and Amazon. You'd think that, as tech giants, they would be careful about their process before deploying things to market, but they still failed to a certain extent.
news.mit.edu/2018/study-finds-gend...
insurancejournal.com/news/national...
Coding is just one step in the SDLC. Diversity and inclusion need to be taken into account at every step of the way, because every contributor on a project can (and should) bring concerns up.
100% agree with this... but it still doesn't happen. And there are many examples available: face-recognition software that is 99% effective with white men but 65% effective with black women. Ask whether Target and Domino's have blind/deaf people as customers and how it went for them accessibility-wise. POs, BAs, ScrumMasters, designers, developers, testers... none of them realized something as basic as what you are saying. Again, because developers tend to think that users are like them. Even for the test data.
No, that's not what I said or meant. I said that developers tend to think users are like them. It's not bad, it's a normal bias. If you have a uniform team, chances are that some issues are raised later than they would have been in a more diverse group. Different points of view and experiences enrich a team in that sense.
As for studies, you can see this Medium post about diversity statistics with links to different sources and studies. (Which doesn't mean I endorse any of them in particular; I actually disagree with the approach of some I read in the past. This was just the result of a quick search on Google.)
Thanks for your patience!! This clarifies a lot of the issues. With this starting point, I can pick up searching and learning more.
I wish people would stop using AI examples for this, as they make a very poor argument.
When training an ML model, or doing any kind of statistics, you must ensure your test set is representative of the population you are going to make a statement about. Sex and skin color are blatantly obvious cases in vision systems. There are biases that are far harder to detect but have the same result on an individual. Adding team members of different sexes or skin colors might fix this particular symptom, but the problem is that your data-gathering is inadequate.
For instance, a little while ago there was a post about a soap dispenser using AI/computer vision to recognize whether a hand was beneath it, which only worked for light skin colors. The argument was made that a more diverse team would have spotted this problem, completely missing the point that a cheap sensor would have been a more robust solution and would work for different skin colors, missing fingers, tattoos...
There's a further problem. Often, there just aren't enough willing participants to get a representative data-set. This is a well-known problem in academia. Many people just have better things to do than subject themselves to some tests they do not understand for a few bucks. While we should be wary of unrepresentative data-sets, often the only alternative is doing nothing at all.
There are good arguments (beyond public relations or social injustice) for at least male/female diversity, and there are excellent arguments for tearing down some of the 'soft barriers' keeping mostly women out of STEM. This just isn't one of them.
I understand the AI example may not be the best, but it's a sign of something bigger, and it is not limited to poor test data. As I said in a different comment, development is not only coding; it involves all the steps in the SDLC, and data gathering too.
The data-gathering is definitely inadequate, but it's not an excuse either. Training data doesn't show up out of thin air; it is created and gathered by people (or by algorithms created by people), which can affect how well it represents the population and how neutral it is.
Even if the data is wrong and the training is wrong, nobody realizing that the accuracy was so lopsided is a sign that they were oblivious to a sex/skin-color issue. No one thought, "hey, we have 99% accuracy for white men, but 65% for black women"? And if they did, nobody did anything? That's not a data-gathering issue.
I agree that it is not always possible to get a good representation of the population. But in this day and age, with many free sources available for images and portraits, having bad data for a vision system is a poor excuse.
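To make that check concrete, here is a minimal sketch of the kind of per-group accuracy report that would surface such a gap before release (all names here, the records, the model, the group labels, are hypothetical; this is not anyone's actual pipeline):

```python
# Minimal sketch (hypothetical names): report a model's accuracy per demographic
# group instead of one aggregate number, so a 99% vs. 65% gap can't hide.
from collections import defaultdict

def accuracy_by_group(records, model):
    """records: iterable of dicts with 'features', 'label', and 'group' keys."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        correct[r["group"]] += int(model.predict(r["features"]) == r["label"])
    return {group: correct[group] / total[group] for group in total}

# A result like {"white men": 0.99, "black women": 0.65} is a release blocker,
# no matter how good the overall accuracy looks.
```

The point is not the code itself; it's that someone on the team has to decide to slice the metric this way in the first place.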
I can't really see the point you're trying to make here. Nevertheless, I think there are a few problems with what you're saying.
Yes, it is a product of a divided society. The reasoning "biased AI → we need diversity in tech" does not hold though.
If you know a good example of how diversity in the development team can benefit the company, use that rather than AI. Let's not dilute good arguments with bad ones.
You also appear to assume the entire team is responsible for the whole process, which is often not true. Essentially this issue only matters for QA.
I think you've missed my point here. There are an uncountable number of biases your dataset might have. A good data-gathering process ensures samples are representative of the final use case. Skin color issues are an indicator that the data-gathering process is poor and produces bad results. That is a problem in and of itself. Adding a black woman to the team might solve this particular issue, but the team is still going to produce dangerously biased models, with biases that are far less obvious to notice.
This is unlikely to be the case. ML will just match the data, whatever that is. Beyond having a model that is too simple, which will result in low accuracy, a model's bias after training is a reflection of the bias in the input data.
This would cause exactly the bias problems I was talking about. Data gathering is hard. You can't just download some pictures and expect it to be an unbiased dataset.
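Concretely, checking whether the gathered data even resembles the population you intend to serve is a data-engineering step, not a hiring one; a toy sketch (group names and target shares are made up for illustration):

```python
# Toy sketch (made-up groups and numbers): compare a dataset's demographic mix
# against the shares you expect in the target population before training on it.
from collections import Counter

def composition_report(sample_groups, target_shares):
    counts = Counter(sample_groups)
    n = len(sample_groups)
    for group, expected in target_shares.items():
        actual = counts.get(group, 0) / n
        print(f"{group}: {actual:.1%} of dataset vs. {expected:.1%} expected")

composition_report(
    ["light skin"] * 900 + ["dark skin"] * 100,
    {"light skin": 0.5, "dark skin": 0.5},
)
```

And this only catches the biases you thought to enumerate, which is exactly the harder problem I was pointing at.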
I'd like to reiterate: I'm not making an argument against diversity. I've had rather good experiences pair-programming with women; men and women have different ways of tackling problems and there's definitely a "stronger together" effect. I would, however, like to see the argument of biased AI go away.
If you add bad arguments to good ones, the good arguments lose credibility.
I like diversity, but I don't like "diversity for the sake of diversity". I will never hire a white man, an Asian woman, or a black guy just to tick some diversity checkboxes.
That's why I appreciate the selection process at my company. When I get sent a CV to review, I don't see the personal details, just the technical skills and past experience. I'm not hiring someone because of who they are, but because of what they can do. The human part is handled later, and by HR staff (who are much more skilled in human relations than I am).
I partially agree with your thought process behind hiring an optimal candidate. I, too, value high technical skills in anyone I meet. However, we humans are much more than simple, code-pumping machines. We are more in that we have emotions.
These emotions allow us to constantly better ourselves or to put others down. These emotions, when organized, make up our behaviors. These behaviors, from my point of view, make up more of who we are than our external physical traits. If they make up more of who we are, then they should be the basis of how we categorize people.
My point is this: if you wish to get the best candidate for a job, technical skills matter, but if you truly "like diversity", please consider how stable, creative, and motivated a candidate shows themselves to be, even if that same candidate appears less qualified on paper.
Yes, yes, absolutely. I omitted this part because I didn't think it was relevant, but the CVs I get sent are for people I have to assess. I usually put candidates in technical situations they are not super comfortable with (for example, I've had many candidates who had never done Test Driven Development, and with those, we try implementing some simple feature using this methodology) to see how they work out something new.
I, of course, don't stop at the CV and meet them personally during the assessment. The HR staff is also in charge of evaluating their human side. But the fact that I don't have unnecessary clutter to dig through when reading the CV means I have a lot less bias during the first screening.
Well, I would hope you have bias toward at least the appearance of competence in the field in which the candidate would be working.
Unfortunately, with this method you are only assessing the candidate's ability to write an enterprise-conformant CV.
To bring a different point of view into play:
It has been proven that women improve the climate and productivity in the workplace.
Psychologically, men become less aggressive and the whole team works better together. In the Asian region, studies have even shown that the women did not even have to contribute professionally to the success; their presence on the team was usually completely sufficient.
The care and consideration women bring contribute enormously to a healthy working climate and to the success of projects. Women pursue their own interests less than men and prioritise the well-being of the project.
I can't recall the source where I read it at the moment, but when it comes to mind, I will edit this.
So what you are saying is that the benefits a female colleague brings into the work environment are indistinguishable from those of a potted plant.
What? How is he saying that?
I hope
dev.to != reddit.com
One issue I've seen with diversity in the workplace is that each company seems to have its own interpretation of what diversity should be (many times tied to the corporation's own demons). I worked in a company where diversity was a synonym for Hispanic; in another one it was LGBTQ, in another it was people of color, in another it was women... In some of them, being part of the diverse group was great; but being outside of that idea of "corporate diversity", even when being part of what would be considered a diversity group, could mean fewer opportunities and ostracism.
What would you do in those cases? How would you approach these diverse/not-really-diverse environments?
Silicon Valley has diversity: a large foreign born male workforce (71% Spring 2018) complemented by American males. Given the huge number of male H1-Bs (primarily from India) that the government lets the companies import, I was surprised to see a group complaining about a lack of women and minority workers (presumably American) in Silicon Valley - without saying one word about the massive importation of workers.
When building something for humans, it's good to have as many different human viewpoints as possible.
Anyone can code. Put some effort into finding the ones that bring something new to your team. Not just what's on their outsides, but their ideas and what they can tell you about your applications and the assumptions you've made.
For example, does your application assume that everyone uses email? That everyone reads English fluently? That first (or last!) names never change? Or even that someone might not want to be addressed by the name scraped from their credit card? (I am still sour that Panera does this.)
We all make a ton of invisible assumptions every day; it's a completely normal thing for brains to do just to process the immense amount of data we have to sift through. The problem is when we don't have anyone to tell us when these assumptions aren't valid for what we're trying to accomplish, and that's why we need diversity in software. We have a bunch of applications that just don't meet the needs of the users they're trying to serve, and the problem could have been solved for less time and money if their teams had been more inclusive to start with.
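A toy example of how one of those assumptions quietly becomes code (a hypothetical validator, not from any real product):

```python
import re

# Hypothetical sign-up validator baking in the assumption that everyone has
# exactly a first and a last name, written in unaccented ASCII letters.
def is_valid_name(name: str) -> bool:
    return re.fullmatch(r"[A-Za-z]+ [A-Za-z]+", name) is not None

print(is_valid_name("Ada Lovelace"))     # True
print(is_valid_name("Teller"))           # False: mononyms rejected
print(is_valid_name("José García"))      # False: accented characters rejected
print(is_valid_name("María-José Cruz"))  # False: hyphenated names rejected
```

Nobody on the team set out to exclude anyone; the assumption just never got questioned.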
This seems like great advice for Product Managers.