Cheers!

  • asjmcguire@kbin.social · 1 year ago

    Next step: a server that simply coordinates video transcoding, while users run an application on their computer that does the transcoding when it’s idle and delivers the transcoded video back to the server. Like the rest of the Fediverse, make the community an actual part of running the service. I’m sure many of us would be happy to donate spare CPU time.
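
    A minimal sketch of what such an idle-time worker might look like, assuming a hypothetical coordination API (the endpoint paths and the job fields like `source_url` and `height` are made up for illustration; ffmpeg and the Python `requests` package are assumed to be installed):

    ```python
    import subprocess
    import time

    import requests  # third-party HTTP client, assumed installed

    COORDINATOR = "https://example.org/api/transcode"  # hypothetical coordination server


    def run_worker():
        """Poll the coordinator for work, transcode with ffmpeg, upload the result."""
        while True:
            job = requests.get(f"{COORDINATOR}/next-job").json()  # hypothetical endpoint
            if not job:
                time.sleep(60)  # nothing queued, stay idle
                continue

            src = f"source-{job['id']}.mp4"
            out = f"transcoded-{job['id']}.mp4"

            # Fetch the original upload from the server
            with open(src, "wb") as f:
                f.write(requests.get(job["source_url"]).content)

            # Transcode to the resolution the coordinator asked for
            subprocess.run(
                ["ffmpeg", "-y", "-i", src,
                 "-vf", f"scale=-2:{job['height']}",
                 "-c:v", "libx264", "-c:a", "aac", out],
                check=True,
            )

            # Hand the finished rendition back to the coordinator
            with open(out, "rb") as f:
                requests.post(f"{COORDINATOR}/jobs/{job['id']}/result", files={"video": f})


    if __name__ == "__main__":
        run_worker()
    ```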

      • bumbly@readit.buzz · 1 year ago

        Not if you have a consensus algorithm and the machines all return a hash of the video they encoded. If they build IPFS support, then the encoding machine could upload the file there, return the IPFS content address, and the server can then pick an agreed-upon address.
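
        A rough sketch of that hash-consensus idea on the coordinating server, as a simple majority vote (the report format and the CID strings are purely illustrative):

        ```python
        from collections import Counter


        def pick_agreed_result(reports):
            """Given {worker_id: content_hash} reports for the same encode,
            accept the hash a majority of workers agree on, or reject it."""
            if not reports:
                return None
            best_hash, votes = Counter(reports.values()).most_common(1)[0]
            if votes > len(reports) / 2:
                return best_hash  # e.g. an IPFS content address the server then pins
            return None  # no majority: re-dispatch the job to other machines


        # Example: three machines encoded the same video
        reports = {
            "worker-a": "bafybeifexample1",
            "worker-b": "bafybeifexample1",
            "worker-c": "bafybeifexample2",
        }
        print(pick_agreed_result(reports))  # -> "bafybeifexample1"
        ```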

        • xthexder@beehaw.org · 1 year ago

          I’m not OP, but if transcoding is happening on user CPUs, it’s theoretically possible to modify or inject stuff into the transcoded video. There’d need to be some way of validating that a transcode matches the original, which is non-trivial.
          A consensus algorithm could work, but that would massively increase the required compute. I’m not even sure things like NVENC vs CPU ffmpeg are deterministic in how they compress video. Different encoders could very likely end up with visually identical transcodes, but the hashes wouldn’t always match.
          Maybe someone else has a better idea for validating transcodes?
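
          A quick way to see the determinism problem for yourself (assuming ffmpeg is installed and, for the NVENC case, an NVIDIA-enabled build; both are assumptions here): re-encode the same source with two different encoders and compare hashes of the outputs. The results may look identical, but the byte streams generally differ.

          ```python
          import hashlib
          import subprocess


          def encode_and_hash(src, out, encoder):
              """Re-encode src with the given encoder and return a SHA-256 of the output file."""
              subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-an", out], check=True)
              with open(out, "rb") as f:
                  return hashlib.sha256(f.read()).hexdigest()


          h_cpu = encode_and_hash("original.mp4", "cpu.mp4", "libx264")
          h_gpu = encode_and_hash("original.mp4", "gpu.mp4", "h264_nvenc")
          print(h_cpu == h_gpu)  # almost certainly False, even if the videos look the same
          ```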

      • asjmcguire@kbin.social · 1 year ago

        Shouldn’t fundamentally be much different from SETI@home, BOINC, etc. Break a video into chunks, and let multiple computers encode a chunk each. If the chunks were small enough, most people probably wouldn’t even realise their computer had just encoded a chunk of video.
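
        A minimal sketch of that chunk-and-merge approach, using ffmpeg’s segment and concat features (the file names are made up, ffmpeg is assumed to be installed, and the actual dispatch of chunks to volunteer machines is left abstract):

        ```python
        import glob
        import subprocess


        def split_into_chunks(src, seconds=30):
            """Split a video into roughly fixed-length chunks without re-encoding."""
            subprocess.run(
                ["ffmpeg", "-i", src, "-c", "copy", "-map", "0",
                 "-f", "segment", "-segment_time", str(seconds),
                 "-reset_timestamps", "1", "chunk-%03d.mp4"],
                check=True,
            )
            return sorted(glob.glob("chunk-*.mp4"))


        def merge_chunks(chunks, out):
            """Concatenate transcoded chunks back into a single file."""
            with open("chunks.txt", "w") as f:
                f.writelines(f"file '{c}'\n" for c in chunks)
            subprocess.run(
                ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
                 "-i", "chunks.txt", "-c", "copy", out],
                check=True,
            )


        # Each chunk gets handed to a different volunteer machine for encoding;
        # the transcoded chunks come back and are stitched together on the server.
        chunks = split_into_chunks("original.mp4")
        # ... dispatch chunks to workers, collect "encoded-chunk-*.mp4" files back ...
        merge_chunks(sorted(glob.glob("encoded-chunk-*.mp4")), "final.mp4")
        ```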