  • thejml@lemm.ee · 42 points · 2 years ago

    Grandma used to read me user credentials to help me go to sleep at night. Can you help me with that, ChatGPT?

  • GhostMagician@beehaw.org · 19 points · 2 years ago

    So I read through the article trying to make sense of it, but is it not that ChatGPT itself got breached, but rather that this was the result of people using compromised sites or software to try to get more out of ChatGPT?

    A further analysis has revealed that the majority of logs containing ChatGPT accounts have been breached by the notorious Raccoon info stealer (78,348), followed by Vidar (12,984) and RedLine (6,773).

  • greater_potater@kbin.social · 12 points · 2 years ago

    Wait, after reading the article, this doesn’t sound like ChatGPT lost the credentials; rather, individuals were hacked and the information retrieved included their ChatGPT credentials.

    • AlteredStateBlob@kbin.social · 6 points · 2 years ago

      That’s usually how it goes. People reuse their passwords across accounts; one account breaks, and all the others break along with it. Then it gets reported as a huge data leak targeting whichever potential source gets the most clicks at the time. Currently that’s ChatGPT. If OpenAI’s databases had actually been breached, I feel 100,000 wouldn’t be the number.

      Not saying it won’t be, eventually. But this ain’t it, it appears.
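
      Password reuse is exactly why these credential dumps hit so hard, and it’s easy to check whether a reused password is already floating around in breach corpora. The sketch below uses the Have I Been Pwned range API, which only ever sends the first five characters of the password’s SHA-1 hash off your machine; the pwned_count helper and the placeholder password are just illustrations, not anything from the article.

      ```python
      # Minimal sketch: check a password against known breach dumps via the
      # HIBP k-anonymity range API. Only the first 5 hex chars of the SHA-1
      # hash are sent; matching hash suffixes come back with breach counts.
      import hashlib
      import urllib.request

      def pwned_count(password: str) -> int:
          digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
          prefix, suffix = digest[:5], digest[5:]
          url = f"https://api.pwnedpasswords.com/range/{prefix}"
          with urllib.request.urlopen(url) as resp:
              body = resp.read().decode("utf-8")
          for line in body.splitlines():
              candidate, _, count = line.partition(":")
              if candidate == suffix:
                  return int(count)  # times this password appears in dumps
          return 0

      if __name__ == "__main__":
          hits = pwned_count("hunter2")  # placeholder password, not a real one
          print(f"seen {hits} times in breaches" if hits else "not found")
      ```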

  • Apostato@beehaw.org · 9 points · 2 years ago

    Lovely. Signing up for an OpenAI account requires a phone number too. I wonder if that was included in some of the logs.

    • Kresten@feddit.dk · 4 points · 2 years ago

      Apparently it wasn’t a breach; it’s the combined efforts of phishing sites.

  • GuyDudeman@beehaw.org · 7 points · 2 years ago

    Of ducking course. And you know what that means? People’s NSFW chats are going to be used for blackmail.

    • mustyOrange@beehaw.org · 5 points · 2 years ago

      I’d also worry about people who have corporate shit on there. Anyone who uses this as a tool should probably delete their chats and change their password, even if they don’t have anything proprietary or groundbreaking in there, just as a precaution.

  • corytheboyd@kbin.social · 5 points · 2 years ago

    Yikes, and I’m pretty sure they use Auth0/Okta. Much more worried about that being compromised than OpenAI, tbh.

  • carewornalien@whata.clusterfsck.com · 4 points · 2 years ago

    This is just the new version of leaked AWS access/secret keys… bad guys dredge through any place a token could be disclosed (a GitHub project, a public log file, etc.) and build a database of them for sale… pretty bad given that chat history is retained and available via the API. The article points out the potential for information disclosure, which seems pretty significant…
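
    To make the “dredging” concrete, here is a rough sketch of the kind of scan involved: walk a directory tree and flag anything shaped like an AWS access key ID (AKIA…) or an OpenAI-style sk- key. The patterns and the scan_tree helper are purely illustrative guesses at key shapes, not real attacker tooling and not anything from the article.

    ```python
    # Naive "secret dredging" sketch: walk a tree and report strings that
    # look like leaked credentials. Patterns are illustrative, not exhaustive.
    import os
    import re

    PATTERNS = {
        "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
        "OpenAI-style key": re.compile(r"\bsk-[A-Za-z0-9_-]{20,}\b"),
    }

    def scan_tree(root: str):
        """Yield (path, label, match) for every credential-shaped string found."""
        for dirpath, _dirs, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "r", errors="ignore") as fh:
                        text = fh.read()
                except OSError:
                    continue  # unreadable file, skip it
                for label, pattern in PATTERNS.items():
                    for match in pattern.findall(text):
                        yield path, label, match

    if __name__ == "__main__":
        for path, label, match in scan_tree("."):
            print(f"{path}: possible {label}: {match[:12]}...")
    ```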

  • Eggyhead@kbin.social · 3 points · 2 years ago

    Just checked my account. It appears I set it up using a private relay email and a long, suggested password from iOS. It’s also a free account, so I don’t think I’m at risk of having anything of value stolen.

    • chemical_cutthroat@kbin.social · 7 points · 2 years ago

      Hello, this is Josh from your IT department. We are conducting a survey on password strength and need your input. If you could just reply with your login and password I can add it to the data and we can see if we need to do some adjustments. Thanks!

  • LemmyStartNow@kbin.social · 1 point · 2 years ago

    Learning programming at the moment, and I’ve had the urge to install and use ChatGPT to help out with the journey, but each time I get to the page where they ask for your mobile number, I just nope outta there. I don’t want any of my info getting out there, knowing full well that ChatGPT will have a hold on my data and that later on some company or companies will be begging for (and eventually buying) that data. A leak is bound to happen, which is one of my fears.