Crazy stuff all around, I think.
“They should’ve just bought a 4090, if they wanted to play the game” type stuff?
I’m hoping I never encounter those communities that reddit was known for before they had to purge them all.
The annotations and autowiring are key. Also, Udemy helped a lot.
Maybe make a community or listen to good music?
It’s worth remembering that what made Reddit a good place to be is the community and not greedy CEOs, so the spirit of the site will always live on.
This is exactly what I needed to brighten up my day.
They’ve made it clear they won’t. And since most subs are only going dark for a measly 48 hours they have no incentive to. It’s literally like that “Oh no, anyway…” meme.
And think of it this way: even if they revert the changes (or you just decide to please /u/spez and only use the official app), do you think the platform will continue to get better or worse? He’s shown his hand, and it’s nothing good for the mods or the users.
Oh my gosh, someone needs to contact tech news orgs
I’d fix inconsistencies between instances. Like, I made this account in Beehaw and now I literally can’t create a community anywhere until I make a new account. It shouldn’t be like this.
I wonder if such an activity can be automated (the fix you suggested, not the malicious activity)
Everything seems to be in order.
This is a cool idea! Is there any way to point it at a specific community? /r/nosleep was my favorite.
At least the messaging is clear by the CEO: f*ck you, reddit users
It’s amazing what companies can get away with that a real person can’t…
I think it’s dangerous to try to cure loneliness with an AI, regardless of sophistication and tuning, because you end up with a human who’s been essentially deceived into feeling better. Not only that, but they’re going to eventually develop strong emotional attachments to the AI itself. And with capitalism as the driving force of society here in the U.S., I can guarantee you every abusive, unethical practice will become normalized surrounding these AIs too.
I can see it now: “If you cancel your $1,000/year CompanionGPT, we can’t be held responsible for what happens to your poor, lonely grandma…” Or it will be even more direct and say to the old, lonely person: “Pay $2,500 or we will switch off the ‘Emotional Support’ module on your AI. We accept PayPal.”
Saying AIs like this will be normalized doesn’t mean it’s an ethical thing to do. Medical exploitation is already normalized in the US. Not only is this dystopian, it’s downright unconscionable, in my opinion.
I agree with @[email protected] about it having the capacity to make older adults feel less lonely. At the same time, however, I think it seems very dystopian. If someone was feeling sad or depressed we wouldn’t say “oh, just chat with this AI until you feel better”. So why is it okay to suggest this for older lonely people who are especially vulnerable?
Hell, given what ChatGPT has told people already it might do more harm than good. It’s akin to the whole of humanity saying “Yeah, we know you’re lonely but getting an actual person to talk to you is too hard. Chat with this bot.”
Composers, join me in [email protected]!
It’d be neat to listen to your stuff!
I completely see where you’re coming from with the idea of including personality disorders because of that “feeling squarely problematic” definition. Drawing on some personal experience, I don’t view myself as having a clear-cut case of Asperger’s because 1) it was never severe enough to be a huge problem and 2) it was diagnosed after I was already an adult, by one psychiatrist (out of many).
Saying to someone “I’m considered neurodivergent” makes more sense to me than saying “I might be on the Autism Spectrum, depending on who you ask.”
Good insight!
This is good for bitcoin
gaming