Tadi Channel – Telegram
Random stuff I consider worthy of sharing. Mostly tech.
Forwarded from Giuseppe
https://practical_emv.gitlab.io/
The Apple Pay lock screen can be bypassed for any iPhone with a Visa card set up in transit mode. The contactless limit can also be bypassed allowing unlimited EMV contactless transactions from a locked iPhone.
This attack is made possible by a combination of flaws in both Apple Pay and Visa’s system. It does not, for instance, affect Mastercard on Apple Pay or Visa on Samsung Pay.
It supposedly works via deep links, so a fully rogue (interaction-less) installation may be possible if that's true.
The most notorious way of bypassing Google Play (outside of core GMS apps) is obviously the Facebook updater: plenty of stock ROMs ship with it, and it's quite privileged beyond just the ability to install things.
Forwarded from cOVIDIUS
A good measure of security and privacy education is understanding the implications of a design. Simply assuming that all publicly accessible (probeable) data will be scraped lets you avoid many future fuckups.
https://github.com/tejado/telegram-nearby-map
Forwarded from Nekogram
Nekogram has been removed by Google from the Play Store for not providing them a phone number to log in with and review.
Forwarded from Nekogram
I really want to know what Google reviewed before, without a phone number to log in with
https://www.theverge.com/22811740/qualcomm-snapdragon-8-gen-1-always-on-camera-privacy-security-concerns

No, it isn't; it serves a specific purpose, just like a fingerprint sensor. Both can be abused to read raw data, just like the camera stack or the Android framework can be abused to capture a photo without the user knowing. After all, a well-compromised device can use the camera at any time; this feature doesn't compromise it by itself. In the official words, it can be configured to work as a poor man's dead-man switch (poor man's, because Android sadly doesn't flush the encryption keys on screen lock, even the "lockdown" one). Features like Smart Lock should indeed work both ways: letting you nerf protections in a trusted environment (pun not intended), and lock down tightly in case of anything risky.
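A dead-man switch of that kind could be sketched like this. Purely hypothetical logic: no real Android or Qualcomm API is modeled, and `presence_events` is just a stand-in for whatever the always-on sensor reports.

```python
# Hypothetical dead-man switch driven by a presence signal (not a real API).
LOCKDOWN_AFTER = 3  # consecutive "owner absent" readings before locking

def deadman_switch(presence_events):
    """Return 'lockdown' once the owner has been absent long enough."""
    absent_streak = 0
    for present in presence_events:
        absent_streak = 0 if present else absent_streak + 1
        if absent_streak >= LOCKDOWN_AFTER:
            # ideally this would also flush the disk-encryption keys,
            # which stock Android currently doesn't do on screen lock
            return "lockdown"
    return "unlocked"

print(deadman_switch([True, True, False, False, False]))  # → lockdown
print(deadman_switch([True, False, True, False, True]))   # → unlocked
```

The point is only that the same always-on sensor could serve the user's threat model instead of working against it.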


The real privacy concern premiered with an earlier generation and is showcased again on the current one. Basically, it's the idea of an embedded, permanent, unit- or device-unique private key that is meant to be inaccessible to the user. By itself that's harmless, just a string of random bytes. The problem starts when companies trust only that key, never one made by you, no matter how well you're able to attest yourself. Such a tactic is dubious both for the security of the key and for privacy, since the key involuntarily confirms your device model and/or the exact unit, without ever letting you make it "clean", since its recognition is based on that authority in the first place. The only way to get rid of a key bound to you in that scenario is to change your device for another unit, that is, if the key is permanently unit-unique. And in the case of a key shared among many devices, the idea simply becomes ridiculous. Why? After just one leak or successful extraction, all trust in the key is lost, and the truth is you'll never know whether that hasn't already happened.
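The linkability problem can be shown with a toy model. All names and keys here are made up, and real attestation uses asymmetric signatures, which this sketch skips; the only thing it demonstrates is that a permanent unit-unique key is a join key across services.

```python
import hashlib

def device_public_key(unit_serial: str) -> str:
    # stand-in for the public half of the embedded attestation key;
    # a real device would expose an actual public key, not a hash
    return hashlib.sha256(f"attest-key:{unit_serial}".encode()).hexdigest()

# Two unrelated services each log the attestation key they verified.
service_a_log = [{"user": "alice_gaming", "key": device_public_key("unit-42")}]
service_b_log = [{"user": "whistleblower99", "key": device_public_key("unit-42")}]

# Linking the identities requires nothing but a join on the permanent key.
linked = [(a["user"], b["user"])
          for a in service_a_log for b in service_b_log
          if a["key"] == b["key"]]
print(linked)  # → [('alice_gaming', 'whistleblower99')]
```

Because the user can never rotate or discard the key, both accounts trace back to the same physical unit forever.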

As much as it's hard to briefly explain everything that can go wrong with using a corp-provisioned permanent key to verify authenticity, the single biggest concern is wide market adoption. The risk of fingerprinting a person becomes enormous. Unnecessary trust in such an architecture increases the incentive to paralyze a service that has gotten used to trusting the uncompromised state of a key. Devices containing revoked keys effectively become trash. You can go on, but let's simplify. Imagine that to register a social media account, you need to sign your account creation request with your digital-capable ID card. Both the government and the social media site may say it's safe: the government doesn't get to know you created a social media account (because your key can be verified offline to come from the government, by being signed), and the social media site only sees signed gibberish without your name. The problem starts when this data gets collected (as it has to be). Access to the government's data bound to your ID card's public key, plus the possibility of checking the public key collected at the social media site, gives absolutely direct de-anonymizing capabilities, no matter what they promised you. Very similar principles apply to more disposable metadata. If you can't easily get rid of your key pair, you're more vulnerable to having your identities linked by a malicious actor. There's no way around it.
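The ID-card scenario boils down to a database join. A minimal sketch, with hypothetical names and hashes standing in for real key material:

```python
import hashlib

def id_card_pubkey(citizen: str) -> str:
    # stand-in for the public half of the ID card's signing key
    return hashlib.sha256(f"id-card:{citizen}".encode()).hexdigest()

# What the government holds: key -> citizen identity.
gov_registry = {id_card_pubkey("J. Doe, ID 123456"): "J. Doe, ID 123456"}

# What the platform holds: the account plus the "gibberish" key
# that verified the signed signup request.
signup_log = [{"account": "@anon_critic",
               "key": id_card_pubkey("J. Doe, ID 123456")}]

# Anyone with access to both datasets de-anonymizes with a single lookup,
# regardless of what either party promised individually.
deanonymized = {e["account"]: gov_registry[e["key"]]
                for e in signup_log if e["key"] in gov_registry}
print(deanonymized)  # → {'@anon_critic': 'J. Doe, ID 123456'}
```

Each party's individual promise holds in isolation; it's the combination of the two datasets that breaks anonymity.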

Tl;dr: a corp-provisioned attestation system for multimedia has security and privacy risks similar to the hardware-backed SafetyNet implementation. Stuff like this enables fingerprinting, while making companies lazier due to trust in the legitimacy of received data. And the more bulletproof the implementation becomes, the more privacy-invasive it gets, still without guaranteeing that the goals are fully met, as every plaintext private key in existence can get compromised, resulting in a loss for the company that chose to trust such keys and a loss for the end user, whose hardware-bound key gets revoked.