- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
- [email protected]
Signal’s president reveals the cost of running the privacy-preserving platform—not just to drum up donations, but to call out the for-profit surveillance business models it competes against.
The encrypted messaging and calling app Signal has become a one-of-a-kind phenomenon in the tech world: It has grown from the preferred encrypted messenger for the paranoid privacy elite into a legitimately mainstream service with hundreds of millions of installs worldwide. And it has done this entirely as a nonprofit effort, with no venture capital or monetization model, all while holding its own against the best-funded Silicon Valley competitors in the world, like WhatsApp, Facebook Messenger, Gmail, and iMessage.
Today, Signal is revealing something about what it takes to pull that off—and it’s not cheap. For the first time, the Signal Foundation that runs the app has published a full breakdown of Signal’s operating costs: around $40 million this year, projected to hit $50 million by 2025.
Signal’s president, Meredith Whittaker, says her decision to publish the detailed cost numbers in a blog post for the first time—going well beyond the IRS disclosures legally required of nonprofits—was more than just a frank appeal for year-end donations. By revealing the price of operating a modern communications service, she says, she wanted to call attention to how competitors pay these same expenses: either by profiting directly from monetizing users’ data or, she argues, by locking users into networks that very often operate with that same corporate surveillance business model.
“By being honest about these costs ourselves, we believe that helps provide a view of the engine of the tech industry, the surveillance business model, that is not always apparent to people,” Whittaker tells WIRED. Running a service like Signal—or WhatsApp or Gmail or Telegram—is, she says, “surprisingly expensive. You may not know that, and there’s a good reason you don’t know that, and it’s because it’s not something that companies who pay those expenses via surveillance want you to know.”
Signal pays $14 million a year in infrastructure costs, for instance, including the price of servers, bandwidth, and storage. It uses about 20 petabytes per year of bandwidth, or 20 million gigabytes, to enable voice and video calling alone, which comes to $1.7 million a year. The biggest chunk of those infrastructure costs, fully $6 million annually, goes to telecom firms to pay for the SMS text messages Signal uses to send registration codes to verify new Signal accounts’ phone numbers. That cost has gone up, Signal says, as telecom firms charge more for those text messages in an effort to offset the shrinking use of SMS in favor of cheaper services like Signal and WhatsApp worldwide.
Another $19 million a year or so out of Signal’s budget pays for its staff. Signal now employs about 50 people, a far larger team than a few years ago. In 2016, Signal had just three full-time employees working in a single room in a coworking space in San Francisco. “People didn’t take vacations,” Whittaker says. “People didn’t get on planes because they didn’t want to be offline if there was an outage or something.” While that skeleton-crew era is over—Whittaker says it wasn’t sustainable for those few overworked staffers—she argues that a team of 50 people is still a tiny number compared to services with similar-sized user bases, which often have thousands of employees.
read more: https://www.wired.com/story/signal-operating-costs/
archive link: https://archive.ph/O5rzD
The point is to protect your face data: the hash IS the password, but you don’t want people to be able to tell what you look like by sending raw images of your face over the net
That would do nothing to validate that the user is real; they can just submit any hash and claim it’s their face’s hash. At that point we might as well use regular passwords, but as I said that won’t solve the spam accounts issue.
You can make sure the user used the signed binary to generate the token. Each token has a nonce and a validity period. The binary requires use of the camera API, but also performs liveness analysis by making you move while authenticating. You can vary how the user is forced to move to make sure it’s not the same video feed connected to the camera
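The server-side check described above could be sketched roughly like this. This is only a toy illustration, not anything the commenter specified: the key, function names, and HMAC-based signature are all assumptions (a real scheme would rest on hardware attestation, not a shared secret), but it shows the nonce-plus-expiry structure of the token.

```python
import hashlib
import hmac
import json
import time

# ASSUMPTION: a key provisioned to the signed client binary. In reality this
# would be an attestation key, not a shared secret baked into the app.
SECRET = b"demo-key"

def issue_token(nonce: str, validity_s: int = 300) -> dict:
    """What the signed binary would emit after a successful liveness check."""
    payload = {"nonce": nonce, "expires": int(time.time()) + validity_s}
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_token(token: dict, expected_nonce: str) -> bool:
    """Server side: check the signature, the server-issued nonce, and expiry."""
    body = json.dumps(token["payload"], sort_keys=True).encode()
    good_sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(good_sig, token["sig"]):
        return False  # token was not produced by the signed binary
    if token["payload"]["nonce"] != expected_nonce:
        return False  # replayed token from an older challenge
    return token["payload"]["expires"] > time.time()
```

Because the nonce is issued per registration attempt and the token expires, a captured token can’t be replayed later, which is the point of the validity period.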
Could work, but it doesn’t stop actual people from creating spam accounts.
If one wants to put real effort into it, the camera/gyro sensors could be malicious or a robotic arm could be built. Maybe it would work with some fake background.
The camera and gyro sensors can be faked for sure, but the app can be updated to detect inconsistent lighting. These kinds of apps can use a fill light on the screen to make the face change colors.
So use teal when you nod, use purple when you turn to the right, etc. If the color is not detected, tell the user to turn up the screen brightness until it is. Of course, that makes it impossible to do in daylight, but most of the time you can temporarily step into the shade or go inside. There is also the possibility of support helping you if the scan just won’t work with your device, for example by verifying your government ID if you agree to that
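The fill-light check could look something like the following. Again a minimal sketch under assumptions: the frames are simplified to lists of RGB pixel tuples, and the margin threshold and the `reflects_color` helper are made up for illustration; a real app would work on camera frames and account for ambient light far more carefully.

```python
# Hypothetical fill-light liveness check: the app flashes a random challenge
# color on the screen and verifies the camera frames actually reflect it.

# Rough RGB masks for which channels each challenge color lights up.
CHALLENGES = {"teal": (0, 1, 1), "purple": (1, 0, 1)}

def mean_rgb(frame):
    """Average each RGB channel over a frame (a list of (r, g, b) pixels)."""
    n = len(frame)
    return tuple(sum(px[i] for px in frame) / n for i in range(3))

def reflects_color(frame, challenge, margin=20):
    """True if the channels lit by the challenge color clearly dominate."""
    mask = CHALLENGES[challenge]
    avg = mean_rgb(frame)
    lit = [avg[i] for i in range(3) if mask[i]]
    unlit = [avg[i] for i in range(3) if not mask[i]]
    return min(lit) - max(unlit) > margin

# Simulated frame tinted by a teal fill light (green/blue channels dominant).
frame = [(30, 120, 125)] * 64
print(reflects_color(frame, "teal"))  # True
```

Randomizing which color goes with which head movement is what makes a pre-recorded video feed hard to reuse: the attacker can’t know the teal/purple sequence in advance.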
In the end, no system is perfect and you are just trying to discourage the laziest spammers. Using phone numbers just means a real person can buy new numbers. I can get each number for a total cost of $0.99, far less effort than trying to catch up with each app update