We need to have a serious chat about iPhone repairability. We judged the phones of yesteryear by how easy they were to take apart—screws, glues, how hard it was…
Because unless you pair the screen, the device has no way to know it's genuine. If it isn't, it could implement any number of attacks: keyloggers, screen scrapers, and so on.
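For anyone unsure what "pairing" actually checks: conceptually it's a challenge-response test against a secret provisioned into a genuine part at the factory, so a clone without the secret can't answer correctly. Below is a minimal Python sketch of that idea; the names and the plain HMAC construction are illustrative assumptions, not Apple's actual protocol.

```python
# Conceptual sketch of genuine-part verification via challenge-response.
# FACTORY_KEY, part_respond, host_verify are illustrative names only;
# this is NOT Apple's real pairing protocol.
import hashlib
import hmac
import os

FACTORY_KEY = os.urandom(32)  # stand-in for a secret provisioned into genuine parts

def part_respond(challenge: bytes, part_key: bytes) -> bytes:
    """What the part's controller computes when the phone challenges it."""
    return hmac.new(part_key, challenge, hashlib.sha256).digest()

def host_verify(challenge: bytes, response: bytes) -> bool:
    """What the phone does: recompute the expected answer and compare safely."""
    expected = hmac.new(FACTORY_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)
print(host_verify(challenge, part_respond(challenge, FACTORY_KEY)))    # True: genuine part
print(host_verify(challenge, part_respond(challenge, os.urandom(32)))) # False: clone without the key
```

Note that failing such a check only tells the phone the part is unverified; what the phone should then do about it is exactly what the rest of this thread argues over.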
Why shouldn't I? No one has given an argument that you can actually secure these peripherals without software locks. I bought my iPhone and MacBook because they offer security; even when I run Linux on my MacBook, it has far better boot security (the only thing Apple still has engineering control over in that use case) than any Intel machine I've used.
Also, lol at that article. You do know the difference between one incident and a pervasive effort to mine your privacy for profit?
Anything to defend the people who make your favorite magic rectangle amirite
No, give me the argument that you can secure these interfaces, some of which provide biometric security, without verifying vendor origin in software
You cannot, and that's OK. The problem here is that people have different levels of risk acceptance. If I were a government or corporate leader, I would probably prefer buying direct from Apple, but most end consumers, especially those who want to do these repairs, should have the choice to accept that risk on a device they own. The manufacturer shouldn't decide who I trust. The owner should.
Except it is the editorial agenda of iFixit to promote legislation that requires this lesser level of security, which makes it not OK. Outlawing verification in software would force every device to carry the same vulnerability at the interface; it would even affect users who want to buy OEM.
No one is saying it should be outlawed. What they are saying is that, for a device to be considered highly repairable by an end user, this type of check should be able to be turned off or left out.
Tell me you don’t know shit about tech without telling me you don’t know shit about tech.
But, my god, Steve Jobs would laugh at how easily his marketing techniques made dumb people feel smart.
You can have both, though. Just add a menu in Settings that turns bright red when a non-certified component is in use, so security can be easily verified, but don't needlessly lock people out and charge $500 to fix a $10-50 module on a $1,000 phone.
Edit: Adding on to this, iFixit isn't trying to outlaw verification; the red-warning example above is a clear way Apple could keep it.
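As a rough illustration of the "verify, then warn instead of disable" idea above, here is a small Python sketch. Everything in it (PartInfo, passed_pairing, the wording of the warning) is hypothetical; it only shows the shape of the logic, not any real Apple API.

```python
# Hypothetical sketch of "keep the verification, surface it, don't disable".
from dataclasses import dataclass

@dataclass
class PartInfo:
    name: str
    passed_pairing: bool  # outcome of the existing genuine-part check

def settings_status(part: PartInfo) -> str:
    """Return the line a Settings screen could show for this part."""
    if part.passed_pairing:
        return f"{part.name}: Genuine part"
    # Unverified part: warn loudly, but leave the feature working.
    return f"{part.name}: WARNING - unverified third-party part"

for part in (PartInfo("Display", True), PartInfo("Face ID module", False)):
    print(settings_status(part))
```

The design choice being argued for is that the check's result becomes information for the owner rather than a switch that disables the hardware.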
Um, how exactly do you think these "rogue devices" would exfiltrate that data? Do you think iOS provides internet access to the Face ID module or the display? Or do you think these parts somehow contain an entire Wi-Fi chipset to connect to the internet and exfiltrate your data, without anyone noticing an entire extra SoC soldered onto the part?
Please provide any argument as to why you think these components could exfiltrate data over these interfaces. Or do you think iOS's security is so poor that it gives any attached hardware device full network access? (Which I'm pretty sure isn't even physically possible in most cases, since those connectors can only carry the kind of data that particular sensor produces.)
To exfiltrate a login password from a keylogger in a MacBook keyboard, for example, you need software running on the CPU as well as on the keyboard itself. That makes it very difficult in practice: you have to compromise both devices, and without physical access your exploit has to work across the keyboard interface. Swapping in a random, potentially malicious keyboard introduces two problems: the keyboard itself may contain a keylogger, and it may be able to exploit some vulnerability in the host from the keyboard side. You therefore open two attack surfaces that were previously closed, which is highly significant.
If you think keyloggers require software running on your physical keyboard, you're in for a rude awakening.
Keyloggers almost always live at a pure software level and are conceptually simple to make. So simple, in fact, that it's basically the same thing as a regular application listening for background shortcuts; the only difference is that regular apps aren't saving or recording anything, they're just listening for you to press cmd+whatever (a benign sketch of that same pattern follows below).
It takes maybe 10-15 minutes to write a keylogger in Python that could run on any computer, Mac, Windows, or Linux. Maybe a little longer if you wanted to use a compiled language and properly hide it.
Sorry to burst your bubble.
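To make that comparison concrete without writing anything malicious, here is the benign version of the pattern: a global shortcut listener. It uses the third-party pynput library (an assumption, not mentioned above), reacts only to one specific key combination, and records nothing.

```python
# Benign global-shortcut listener: the same OS facility the comment compares
# keyloggers to, except it only fires on one combination and stores nothing.
# Requires the third-party package: pip install pynput
# (On macOS this also needs accessibility/input-monitoring permission.)
from pynput import keyboard

def on_activate():
    print("Global shortcut pressed")

# Listens system-wide for ctrl+alt+h; blocks until the process is stopped.
with keyboard.GlobalHotKeys({"<ctrl>+<alt>+h": on_activate}) as listener:
    listener.join()
```

The difference between this and a keylogger is only what the callback does with the events, which is the commenter's point: the capability lives in ordinary userspace software, not in the display hardware.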
And what does that have to do with the risk of a screen repair?
I can also install a keylogger on Linux, and I can freely swap the SSD for anything I buy on the internet.
And yet somehow people still use computers!? Madness.
I think we're on the same page? If an attacker wanted a keylogger, they wouldn't even need to go as far as the screen; there are plenty of other ways (like a third-party keyboard app) that would work just as well, if not better, on an iPhone.
Hell, while we're at it, using a phishing email to get you to enter a password on a fake site, or using social engineering to reset your passwords, is way more effective than reverse engineering and modding a camera or screen.
There's no reason Apple should get to keep exclusive rights over repairs just to profit more on parts. Third-party screens, cameras, Face ID modules, and so on aren't going to suddenly make your phone less secure.
Ok, agreed we are on the same page! My misunderstanding.
(I thought you were somehow arguing that a keylogger is a risk not worth taking with a screen replacement.)
Why do you think a keylogger needs hardware?
If someone wants to infect your phone, they would do so remotely through software. Easier, more effective, and, above all, invisible.
Why isn’t purchasing the part through Apple enough?
And also, is the consumer not allowed to assume the aftermarket-repair risk that you seem to be so concerned about?
This issue has always been about Apple trying to force older iPhones into obsolescence. They want the freedom to eventually say that no more parts exist for that device, so you'll have to upgrade. If repair shops can harvest broken phones to repair other phones, that extends the life of the device past Apple's plans.
Most people will continue using older phones as long as they can because they don’t need the latest phone.
How the hell do you expect a screen to keylog you? This is a stupid argument. Even if the screen did know when the on-screen keyboard was visible, how do you expect the logged data to go anywhere? Are you seriously worried that aftermarket iPhone screens include hidden LTE modems (and thus pay for illegitimate service) just to potentially log your keys? Do you realize how difficult and ridiculous that would be?
I bet someone could make that actually happen, but if they could do that they’d probably just find or buy a software vulnerability to attack you with.
As always, there is an XKCD for this.
https://xkcd.com/538/
Aside from the fact that a single component exfiltrating data without cooperation from many of the other components in the system is patently absurd, the honest truth is that anyone who wants to break your security isn't going to go to the extreme length of making sure your screen gets replaced with a covert unit that somehow reports everything you do. In most cases a pair of binoculars gets the same job done far more cheaply and with half the convolution, and a hit to the head with a $5 wrench gets your fingerprint much more easily than a replacement fingerprint scanner does. Most compromises of a user are carried out far more effectively in software than in hardware: software constantly has new bugs to exploit, while maneuvering a crooked piece of hardware into place is an absurdly unlikely operation that would require a cover-up on a scale beyond the reach of most entities in existence.
Pervasive effort? Any examples?
Do you have any evidence that there’s a pervasive effort from third party repair to mine your privacy for profit? I’d love to see it.
Also, fine, let’s assume they have no way of knowing it’s genuine. Why don’t they release the tool to pair the OEM screens publicly? It’d only work on the real ones, and they have such a tool, so if it’s actually about security, there’s no reason not to.