AI firms propose ‘personhood credentials’ to combat online deception, offering a cryptographically authenticated way to verify real people without sacrificing privacy—though critics warn it may empower governments to control who speaks online.

  • shortwavesurfer · 3 months ago

    Use a proof-of-work system. The more work required, the fewer bots will actually take the time to do it. You could easily build a system that asks, in effect: has this person done at least 24 hours’ worth of computational work? If not, they can’t do whatever it is they’re trying to do; if so, they can. There’s a very low chance a bot would actually do 24 hours’ worth of work, and even if it did, it sure as hell wouldn’t be generating millions of accounts that way.
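
    As a rough illustration (my own sketch, not anything specified above), a hashcash-style puzzle could back this: the prover hunts for a nonce whose SHA-256 hash of the challenge clears a difficulty threshold, and the “24 hours of work” target would come from raising `difficulty_bits` until the expected search time on commodity hardware is about a day. The demo difficulty below is kept tiny so it finishes instantly.

    ```python
    # Hashcash-style proof-of-work sketch (assumed scheme, not from the comment above).
    # The prover searches for a nonce such that SHA-256(challenge || nonce)
    # has at least `difficulty_bits` leading zero bits.

    import hashlib
    import os


    def leading_zero_bits(digest: bytes) -> int:
        """Count leading zero bits in a hash digest."""
        bits = 0
        for byte in digest:
            if byte == 0:
                bits += 8
                continue
            # Add the zero bits at the top of the first nonzero byte, then stop.
            bits += 8 - byte.bit_length()
            break
        return bits


    def solve_pow(challenge: bytes, difficulty_bits: int) -> int:
        """Brute-force a nonce whose hash clears the difficulty threshold."""
        nonce = 0
        while True:
            digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
            if leading_zero_bits(digest) >= difficulty_bits:
                return nonce
            nonce += 1


    if __name__ == "__main__":
        # A random challenge stands in for whatever the site would issue.
        # A real deployment would calibrate difficulty_bits to roughly 24 hours.
        challenge = os.urandom(16)
        nonce = solve_pow(challenge, difficulty_bits=16)
        print("challenge:", challenge.hex(), "nonce:", nonce)
    ```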

    The way I see it, you require some sort of proof of work that takes 24 hours to complete, and then you can submit that proof to each individual website you want to use so it can validate that you’ve actually done the work you claim.
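
    The validation side is the cheap part: a site receiving a (challenge, nonce) pair only has to recompute one hash and check the difficulty, even though producing the proof was expensive. A minimal sketch, again assuming the hashcash-style puzzle above (the `verify_pow` helper and the example values are hypothetical):

    ```python
    # Site-side verification sketch for the hashcash-style proof above (assumed scheme).
    # One SHA-256 call is enough to accept or reject a submitted proof.

    import hashlib


    def leading_zero_bits(digest: bytes) -> int:
        """Count leading zero bits in a hash digest."""
        bits = 0
        for byte in digest:
            if byte == 0:
                bits += 8
                continue
            bits += 8 - byte.bit_length()
            break
        return bits


    def verify_pow(challenge: bytes, nonce: int, difficulty_bits: int) -> bool:
        """Recompute the hash once and check it meets the required difficulty."""
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        return leading_zero_bits(digest) >= difficulty_bits


    if __name__ == "__main__":
        # Accept only proofs at or above the site's minimum difficulty.
        challenge = bytes.fromhex("00112233445566778899aabbccddeeff")
        print(verify_pow(challenge, nonce=12345, difficulty_bits=16))  # almost certainly False
    ```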