Roughly 50% of transgender and/or non-binary people are software developers and roughly 50% are furry artists, so it makes sense we would be more wary of AI.
I use arch, btw.
Trans nonbinary software dev who dated a furry artist; my disdain for AI knows no limits.
I use Nobara, btw. (Is Arch good? I’ve never looked into it.)
It can be a tiny bit involved to install, but if you know your way around Linux already it’s perfectly doable. The Arch Wiki is a great reference for MANY things, and it has a dedicated page with installation instructions.
I like that it’s lightweight because it comes with the bare minimum for a working Linux install and everything on top of that must be explicitly installed by you. I also love pacman (the package manager). It’s never borked anything for me and I’ve yet to be dropped into a dependency hell in 6+ years of using it.
I got in a dependency loop one time. It was my own damn fault 😂
Only corporate executives benefit from AI.
Everyone else is harmed, both directly and indirectly, and it makes the customer experience far worse because workers are replaced by chatbots that are incapable of understanding.
From what I’ve seen, the only people who have a positive view of AI are those who see themselves as the masters of others. The trans, nonbinary, and disabled people in this study are very unlikely to fit that mold.
Knowledge based fields were historically a “safe space” for queer and disabled people. If you are just super fucking smart and could be a wizard in a programming language, or were a genius physicist, you could get to the point where you were too valuable to fire for being trans or disabled. I may be trans and an unperson in the place I live, but I can do calculus, and there’s no way they can take that away from me.
There’s an attack on knowledge itself going on right now. A desire by the rich to control information. They want to force us into an unreality where skill and knowledge are meaningless. This hurts people who are socially marginalized, because it takes away one of our few paths for economic survival.
It goes with the attacks on DEI. What they want is a tool that can replace the need for talent, so that they can select who gets to have jobs. They want all jobs to be Graeber’s “bullshit jobs,” so that skill is meaningless and they can allot them to the people they think “deserve” them.
That’s interesting. I feel like a lone voice in my university, trying to explain to people that using LLMs for research tasks isn’t a good idea for several reasons, but I’d never have imagined that being disabled would put me into a group more likely to think this way. If I had to guess, I’d say there’s a strong network effect being exploited in our social environment to pull people into the AI hype, and we, the ones who live less connected to “standard” social norms, tend to be less vulnerable to it.
It may also be that disabled, transgender and nonbinary people are more aware of:
- The use of AI to reduce people’s employment opportunities, which are already tough enough for people in these groups.
- The tendency of AI to reproduce the prejudices present in its training materials. If everyone’s relying on AI then historical prejudices are going to be perpetuated just because LLMs are regurgitation machines.
As an autistic bastard I just think it’s shit, though I will say that I do partake in the guilty pleasure of Two Scuffed and DougDoug. But I wouldn’t feel particularly bad if every bit of generative AI spontaneously corrupted and could never be replicated; it generally feels like a money sink for dipshit corporations at this point.
Hi. Haven’t read the article. Straight middle-aged white guy here. I too view AI negatively.
If trans, nonbinary, or disabled people view AI negatively, it’s not because they’re trans, nonbinary, or disabled. It’s because AI is terrible, and it threatens to make all of our lives terrible (and is already proving it will) for the sole sake of giving billionaires a few extra pennies.
Though I will say, if trans, nonbinary and disabled people have any extra issues with AI making their life specifically worse, that’s not caused by AI itself. It’s caused by the wealthy CHOOSING to use AI to make their lives worse.
This doesn’t need to happen. None of this needs to happen. Google doesn’t need entire campuses dedicated to AI with special power requirements. This is all bullshit.
AI is the new crypto for the CEOs and C-suites. Sorry, but there’s no market for it among a regular customer base, and they’ve admitted it’s costing them a lot more money to use AI than they’re actually saving or profiting from it. It’s actually no wonder the people who fall for AI/crypto are mostly conservatives.
It does have a market with a regular customer base. But they try to shove it into everything just to see what sticks, and most of those things are useless at best or actively make the product worse.