• 0 Posts
  • 64 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • After several years of using Linux for work and school, I made the leap to daily driving Linux on my personal computer. I stuck with it for two years and sunk hundreds of hours into an endless stream of inane troubleshooting. Linux preys on my desire to fix stuff and my insane belief that just one more change, suggested by just one more obscure forum post, will fix the issue.




  • Something I often see missing from discussions of privacy is that it’s not always about you, the listener. Sometimes it’s about protecting the most vulnerable people around you. For example, someone escaping domestic violence might have a very different view on how their information should be protected. People struggle to see the value in privacy because it hasn’t been a big problem for them personally, or because they think it’s hopeless. An introduction to privacy, in my view, is all about teaching empathy, hope, and advocacy for others.

    Once they have that goal in mind, you can tie in how open source helps empower people to take back their privacy.




  • This has nothing to do with the Files app, nor with re-indexing of the Photos library. It has to do with fighting CSAM. Apple has started (in this or a previous update) to scan your device (including deleted files) for anything containing nudity (search for “brassiere”) and to add it to your Photos library in a way that is hidden. That way, anything the models detect as nudity is stored in your iCloud database permanently. Apple does this because it allows them to screen for unknown CSAM material: currently they can only recognize known fingerprints, but doing this allows them (and the other parties with access to your iCloud data) to analyze unknown media.

    The bug mentioned here accidentally made those assets visible to the user: the update changed them in a way that removed the invisibility flag, hence people noticing old nudes in their library that they cannot delete.
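    The “known fingerprints” matching mentioned above can be sketched with a toy perceptual hash. This is NOT Apple’s actual system (Apple’s NeuralHash is a learned perceptual hash, and all names below are my own); a simple average hash just illustrates the idea of matching media against a blocklist even after small edits like recompression.

    ```python
    # Toy sketch of perceptual-fingerprint matching (illustrative only).

    def average_hash(pixels):
        """Hash an 8x8 grayscale image: one bit per pixel, above/below the mean."""
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        return sum(1 << i for i, p in enumerate(flat) if p > mean)

    def hamming(a, b):
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    # A gradient image, a near-duplicate with one tweaked pixel, and a
    # very different image (the gradient reversed).
    original = [[c * 16 for c in range(8)] for _ in range(8)]
    near_dup = [row[:] for row in original]
    near_dup[0][0] = 5  # small edit, e.g. recompression noise
    different = [[112 - c * 16 for c in range(8)] for _ in range(8)]

    known_fingerprints = {average_hash(original)}  # the "database" of known media

    for name, img in [("near_dup", near_dup), ("different", different)]:
        h = average_hash(img)
        match = any(hamming(h, k) <= 4 for k in known_fingerprints)
        print(name, "matches known material:", match)
    ```

    The near-duplicate hashes identically despite the edit, while the unrelated image lands far away in Hamming distance; that robustness-to-small-changes is what distinguishes perceptual hashes from cryptographic ones.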



    And speaking of deleting things: things are never really deleted. The iPhone keeps a record of messages and media you delete inside the KnowledgeC database, which is often used for forensic purposes. Apple is migrating this to the Biome database, which has the benefit of being synchronized to iCloud. Among other things, it is used to feed Siri with information. Anything you type on your devices, and fingerprints of anything you view, are sent to Apple’s servers and saved. Spooky, if you ask me. But the only way we can have useful digital assistants is if they have access to everything; that’s just how it works.
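    For anyone curious, records like these are plain SQLite and can be queried from a backup. The sketch below uses an in-memory stand-in rather than a real knowledgeC.db; the `ZOBJECT` table and column names match what forensic write-ups commonly report, but treat them as assumptions. The one real gotcha it shows: Core Data timestamps count seconds from 2001-01-01 UTC, so they need a 978307200-second offset before the usual Unix-epoch conversion.

    ```python
    import sqlite3

    # Offset between the Unix epoch (1970) and the Core Data epoch (2001).
    COREDATA_EPOCH = 978307200

    # In-memory stand-in for knowledgeC.db so the query is runnable
    # without a device backup (schema names are assumptions).
    db = sqlite3.connect(":memory:")
    db.execute(
        "CREATE TABLE ZOBJECT (ZSTREAMNAME TEXT, ZVALUESTRING TEXT, ZSTARTDATE REAL)"
    )
    db.execute(
        "INSERT INTO ZOBJECT VALUES (?, ?, ?)",
        ("/app/usage", "com.apple.MobileSMS", 740000000.0),  # Core Data seconds
    )

    # Forensic-style query: list events with human-readable UTC timestamps.
    rows = db.execute(
        "SELECT ZSTREAMNAME, ZVALUESTRING, "
        "datetime(ZSTARTDATE + ?, 'unixepoch') FROM ZOBJECT",
        (COREDATA_EPOCH,),
    ).fetchall()
    for stream, value, when in rows:
        print(stream, value, when)
    ```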

    Nudes are meant to persist on iPhone. You’re just not meant to notice.