IRCv3 has reactions and threads, along with some other features, like persistent convos.
Voice and screen sharing can be implemented via external services.
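Reactions, for example, are just client-only tags on a TAGMSG. A minimal sketch in Python (the server, nick, channel, and msgid here are placeholders, and it assumes the draft reply/react client tags):

```python
import socket

# Sketch: sending an IRCv3 reaction over a raw socket.
# Assumes the server supports the message-tags capability and the
# draft reply/react client-only tags; host, nick, channel and msgid
# are all placeholders.

HOST, PORT = "irc.example.net", 6667

def send(sock, line):
    sock.sendall((line + "\r\n").encode("utf-8"))

sock = socket.create_connection((HOST, PORT))
send(sock, "CAP REQ :message-tags")   # request message-tags support
send(sock, "NICK demo")
send(sock, "USER demo 0 * :demo")
send(sock, "CAP END")
send(sock, "JOIN #example")

# React with a thumbs-up to the message whose msgid we saw earlier;
# TAGMSG carries only tags, no message body.
send(sock, "@+draft/reply=abc123;+draft/react=👍 TAGMSG #example")
```

Threads are built on that same reply tag pointing at a parent msgid, and persistent convos lean on the chathistory extension, as far as I understand the drafts.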
Edited condo to convo… IRCv3 does not have condos :)
Different styles of conversation.
At the owner's discretion. See: on other people's property.
For a training set. Natural and familiar conversations.
Depends.
On your property, using your voice? Sure.
At 2:30am, you yelling it 20ft from your neighbor's house? Probably not, mainly because it disturbs other people's normal use of their home.
On someone else’s property? Depends on if they allow it.
In public, in the commons? Sure, as long as people are allowed to ignore you.
Which bots use Palemoon as their UA string?
It's literally used anywhere that uses a Latin-derived alphabet… and a ton of Cyrillic cultures use it…
Not sure about other alphabets.
Kicked dog hollers.
More specifically, it’s capitalism that is the problem, not tech.
I refuse to debate ideas on how to make ethical CSAM with you.
Go find a pedo to figure out how you want to try to make CSAM, and you can well-akshully all you want.
Yeah, but like, legally, is this even a valid argument?
Personally, "legal" is just whatever the law allows the wealthy to do, while providing punishments for the working class.
Morally, that's what you're doing when you use AI to generate CSAM. It's the same reason we ban all previously created CSAM as well: you are victimizing the person every single time.
I don't see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim.
It makes them a victim.
But I don't know of any laws that prevent you from doing that, unless it's explicitly to do with something like blackmail, extortion, or harassment.
The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.
Does a facial structure recognition model use the likeness of other people?
Yes.
Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to break down at what point that person's likeness begins and at what point it ends. It's simply an impossible task.
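For context on how these models work: they don't store anyone's picture, they map each face to an embedding vector and compare distances. A rough sketch (embed() here is a dummy stand-in for a real embedding network, and the threshold is made up for illustration):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def embed(face: np.ndarray) -> np.ndarray:
    # Stand-in for a real embedding network: just flatten and normalize
    # the pixels so the sketch runs. A real model maps an aligned face
    # crop to a fixed-size vector using weights shaped by the whole
    # training set at once.
    v = face.astype(np.float32).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def same_person(img_a: np.ndarray, img_b: np.ndarray, threshold: float = 0.6) -> bool:
    # The model never retrieves a stored likeness; "same person" is only
    # a distance check between two embeddings.
    return cosine_similarity(embed(img_a), embed(img_b)) >= threshold

if __name__ == "__main__":
    a = np.random.rand(112, 112, 3)  # pretend these are aligned face crops
    b = np.random.rand(112, 112, 3)
    print(same_person(a, b))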
Exactly. So, without consent, it shouldn’t be used. Periodt.
Netanyahu scammed. Dunno about getting scammed.
No, the problem is a lack of consent of the person being used.
And now, being used to generate depictions of rape and CSAM.
With this logic, any output of any pic gen AI is abuse
Yes?
I have no problem with AI porn assuming it's not based on any real identities.
With any model currently in use, that is impossible to meet. All models are trained on real images.
That said, sexuality and attraction are complicated.
There’s nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
I believe that in the US it is protected by the First Amendment.
CSAM, artificial or not, is illegal in the United States.
China is bad, because it too is a colonialist and imperialist nation.
Just like the US and Russia.