• stoy · 8 months ago

    That changes nothing; you had the book inspected and got the data.

      • stoy · 8 months ago

        No, you are hung up on trying to read the book without actually reading it.

        That breaks the puzzle: the device would not be able to analyze the inside of an item of food from a picture of the outside, and can only use highly generic data based on what it can assume from the outward appearance.

        • MrScottyTay@sh.itjust.works · 8 months ago

          Re-read the first one I sent.

          You can get a pretty good generalisation if you know what the food is. How do you think current apps for tracking nutrition work? All this will do is try to figure out what the food is from the picture rather than having the user type it in. For most foods you can tell what they are without “looking inside”. I’m pretty sure there are apps that do that now; this isn’t something new and groundbreaking.

          And for nutrition you don’t need to be 100% exact when tracking it, because you can’t be 100% exact even if you know the exact ingredients and how much of each one. Everything always has a variance. This method doesn’t need to be perfect to meet the needs of most people who will use it.
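
          The workflow described above (identify the food, look up generic values, accept some variance) can be sketched roughly like this. Everything here is illustrative: the food labels, the per-100 g table, and the 20% variance figure are assumptions, not taken from any real tracking app.

```python
# Hypothetical sketch of a photo-based nutrition tracker's back end,
# assuming an image classifier has already returned a food label.
# Table values and the variance figure are illustrative only.

# Generic per-100 g calorie values, as a typical food database stores them.
CALORIES_PER_100G = {
    "banana": 89,
    "apple pie": 237,
    "fried rice": 163,
}

def estimate_calories(label: str, portion_grams: float, variance: float = 0.2):
    """Return a (low, high) calorie range for an identified food.

    A range is reported rather than a single number, reflecting the
    point above: even with exact ingredients, real servings vary.
    """
    per_100g = CALORIES_PER_100G[label]
    midpoint = per_100g * portion_grams / 100
    return (midpoint * (1 - variance), midpoint * (1 + variance))

low, high = estimate_calories("banana", 118)  # a medium banana
print(f"~{low:.0f}-{high:.0f} kcal")
```

          The interesting part is what it leaves out: no analysis of the food's interior at all, just a label and a lookup, which is why the estimate is "good enough" for a banana and much shakier for a pie.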

          • stoy · 8 months ago

            I agree that you can get a generic nutrition value from a photo of a simple fruit or vegetable, but a pie or cake contains so much stuff that looks identical to other stuff that any photographic analysis is useless.

            So yes, you can get some idea of the nutrition of some foods, but the accuracy is way too low to be useful.