In recent years, there have been increasing concerns that LinkedIn, the popular professional networking platform owned by Microsoft, may be engaging in discriminatory practices against certain groups of users. Critics argue that LinkedIn’s algorithms and policies may make it more difficult for women, minorities, older workers, and other protected groups to fully utilize the platform and benefit from its vast networking potential.
What is LinkedIn Discrimination?
LinkedIn discrimination refers to the idea that LinkedIn may intentionally or unintentionally limit opportunities and visibility for users belonging to legally protected groups. This could happen through:
- Algorithms that show some users’ profiles more prominently than others
- Policies that restrict how users can present themselves on their profiles
- Allowing users to post wage and age data that could enable discrimination
- Not acting on reports of harassment or discriminatory content
Critics argue that even unintentional algorithmic bias or policies that seem neutral on their face may have a disparate negative impact on marginalized groups. This may reinforce real-world biases and limit professional advancement for women, minorities, LGBTQ individuals, disabled individuals, and older workers.
Examples of Alleged LinkedIn Discrimination
Here are some examples of practices that critics allege may constitute LinkedIn discrimination:
Profile Rankings
LinkedIn’s algorithms determine the prominence and visibility of user profiles in search results and feeds. Some argue these algorithms favor traditionally privileged groups, making it easier for majority groups to network and get discovered by recruiters.
Name Presentation
LinkedIn restricts which symbols, accent marks, and non-Latin characters can appear in name fields. Critics say this can force some users to misrepresent their names in a way that hinders networking.
Gender Options
Until 2021, LinkedIn only allowed binary male/female gender selections. The lack of non-binary options may have excluded transgender and non-binary users.
Age and Wage Data
LinkedIn allows users to post their age and wage history. Critics argue this may enable subtle discrimination by letting recruiters screen candidates based on protected characteristics.
Harassment Issues
Some users say LinkedIn does not do enough to curb harassment and discriminatory content targeted at women, minorities, and other marginalized groups.
What Does LinkedIn Say?
In response to recent allegations of discrimination, LinkedIn states that it does not tolerate bias and is working to enhance diversity, equity and inclusion across its platforms. Actions it highlights include:
- Implementing new algorithms designed to boost diversity among suggested connections
- Expanding name presentation options for international users
- Adding non-binary gender options
- Launching channels to report harassment and offensive conduct
LinkedIn says it relies on user reporting to identify and remove policy-violating content. It maintains that, while harassment affects only a small fraction of its user base, any amount is too much.
What Does the Law Say About LinkedIn Discrimination?
Under federal and state laws, online platforms generally enjoy broad immunity from liability for user-generated content. However, platforms can still face liability for their own actions enabling discriminatory conduct. Relevant laws include:
Title VII
Title VII of the Civil Rights Act of 1964 bars employment discrimination based on race, color, religion, sex, and national origin. LinkedIn could face claims for enabling discriminatory recruiting and hiring.
Age Discrimination in Employment Act
Prohibits age-based discrimination against job applicants and employees aged 40 and older. Posting age data may raise liability risks.
Americans with Disabilities Act
Requires equal opportunity for disabled individuals. LinkedIn could face scrutiny for any algorithmic or policy barriers.
Communications Decency Act
Section 230 shields online platforms from liability for third-party content. But platforms can still be liable for their own actions and algorithms that enable discrimination.
LinkedIn Algorithmic Discrimination Lawsuits
While no major suits have been filed yet, algorithmic discrimination is an emerging area of liability risk for platforms like LinkedIn. Possible claims could include:
Disparate Impact Discrimination
Even facially neutral algorithms can produce outcomes biased against protected groups, and LinkedIn could face liability if its algorithms disproportionately restrict those groups' opportunities.
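To make disparate impact concrete, one common yardstick is the EEOC's "four-fifths rule": if the rate at which one group receives a favorable outcome falls below 80% of the rate for the most-favored group, that gap is treated as evidence of adverse impact. The Python sketch below runs this check on invented numbers for how often profiles from two groups surface prominently in recruiter searches; the function names, group labels, and rates are hypothetical and do not reflect any real LinkedIn data.

```python
# A minimal sketch of a disparate impact check using the EEOC "four-fifths rule".
# All numbers and group labels are hypothetical, for illustration only.

def selection_rate(selected: int, total: int) -> float:
    """Fraction of a group that received the favorable outcome
    (e.g., appearing on the first page of recruiter search results)."""
    return selected / total

def disparate_impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of one group's rate to the reference group's rate.
    Values below 0.8 are commonly treated as evidence of adverse impact."""
    return group_rate / reference_rate

# Hypothetical audit data: of the profiles that matched a search query,
# how many from each group appeared prominently in the results.
reference_rate = selection_rate(selected=450, total=1000)  # 45%
group_rate = selection_rate(selected=300, total=1000)      # 30%

ratio = disparate_impact_ratio(group_rate, reference_rate)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.67

if ratio < 0.8:
    print("Below the four-fifths threshold: potential adverse impact worth investigating.")
```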
Disparate Treatment Discrimination
Algorithms that intentionally treat some users differently based on protected traits could also trigger liability for intentional discrimination. Disparate treatment, however, requires proof of intent and is harder to establish.
Discrimination Per Se
Algorithms that explicitly consider protected traits like gender or age as input factors could be deemed discriminatory on their face.
Steps LinkedIn Is Taking
To mitigate discrimination risks from algorithms, LinkedIn says it is:
- Conducting impact assessments to identify algorithmic biases
- Using machine learning to proactively detect policy-violating content
- Designing algorithms to boost diversity in search rankings and suggestions (a simplified sketch of this idea follows after this list)
- Regularly adjusting its ranking formulas to remove unintended biased outcomes
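As an illustration of what a diversity-boosting adjustment to search rankings might look like, the Python sketch below re-ranks a scored result list so that no group pulls too far ahead of a target share of the top positions. This is a generic, hypothetical example of representation-aware re-ranking, not a description of LinkedIn's actual ranking system; the Candidate class, scores, group labels, and target shares are all invented.

```python
# A hypothetical sketch of representation-aware re-ranking: keep results in
# relevance order, but cap how far any one group can get ahead of its target
# share of the list. Not LinkedIn's actual algorithm; all data is invented.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    group: str     # e.g., a self-reported demographic category
    score: float   # relevance score from an upstream ranking model

def rerank_with_caps(candidates: list[Candidate],
                     target_share: dict[str, float]) -> list[Candidate]:
    """Greedily build a ranking: at each position, pick the highest-scoring
    remaining candidate whose group has not yet exceeded its target share."""
    remaining = sorted(candidates, key=lambda c: c.score, reverse=True)
    ranked: list[Candidate] = []
    counts: dict[str, int] = {g: 0 for g in target_share}
    while remaining:
        placed = False
        for i, cand in enumerate(remaining):
            cap = target_share[cand.group] * (len(ranked) + 1)
            if counts[cand.group] < cap:
                ranked.append(remaining.pop(i))
                counts[cand.group] += 1
                placed = True
                break
        if not placed:  # every group is at its cap; relax and take the best remaining
            cand = remaining.pop(0)
            ranked.append(cand)
            counts[cand.group] += 1
    return ranked

# Hypothetical usage: two groups with a 50/50 target representation.
pool = [Candidate("A", "group_1", 0.95), Candidate("B", "group_1", 0.93),
        Candidate("C", "group_1", 0.91), Candidate("D", "group_2", 0.88),
        Candidate("E", "group_2", 0.85)]
for c in rerank_with_caps(pool, {"group_1": 0.5, "group_2": 0.5}):
    print(c.name, c.group, c.score)
```

On this invented data, the re-ranked list alternates between the two groups while preserving relevance order within each group, which is the basic trade-off such adjustments make.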
However, critics argue LinkedIn needs to do more to evaluate and address the root causes of bias encoded in its systems.
What Users Can Do
Users concerned about potential LinkedIn discrimination have a few options:
- File reports of harassment and discriminatory conduct
- Notify LinkedIn of problems with name display or gender options
- Avoid posting sensitive data like age or wage history
- Use profile customization tools to boost visibility
- Consider raising concerns publicly or consulting an attorney
However, some experts argue the burden should be on LinkedIn to take proactive steps to test for and eliminate biases from its algorithms and policies.
The Bottom Line
Concerns about algorithmic and policy discrimination on major platforms like LinkedIn are likely to persist. But the platform also has strong incentives to minimize bias and create more inclusive networking opportunities.
By taking a data-driven approach, continuously evaluating its systems, and engaging with users, LinkedIn may be able to evolve its practices in a more equitable direction. However, critics argue far more transparency and accountability are needed to address discrimination issues in social media and tech.