Apple Has Begun Scanning Users' Files EVEN WITH iCloud TURNED OFF

188,762 views

Mental Outlaw

1 year ago

In this video I discuss how several news outlets have announced that Apple will no longer pursue scanning people's iCloud accounts for CSAM, and a blog post that appears to show that Apple is indeed scanning local filesystems on macOS without users' consent (iCloud and analytics turned off)
sneak.berlin/20230115/macos-s...
₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿
Monero
45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436
Bitcoin
3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV
Ethereum
0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079
Litecoin
MBfrxLJMuw26hbVi2MjCVDFkkExz8rYvUF
Dash
Xh9PXPEy5RoLJgFDGYCDjrbXdjshMaYerz
Zcash
t1aWtU5SBpxuUWBSwDKy4gTkT2T1ZwtFvrr
Chainlink
0x0f7f21D267d2C9dbae17fd8c20012eFEA3678F14
Bitcoin Cash
qz2st00dtu9e79zrq5wshsgaxsjw299n7c69th8ryp
Etherum Classic
0xeA641e59913960f578ad39A6B4d02051A5556BfC
USD Coin
0x0B045f743A693b225630862a3464B52fefE79FdB
Subscribe to my KZfaq channel goo.gl/9U10Wz
and be sure to click that notification bell so you know when new videos are released.

Comments: 2,200
@paull1248
@paull1248 Жыл бұрын
Every time I hear "Think about the children", it makes me sick to my stomach. I'm sure a trillion dollar company with ridiculous margins, which exploits people in 3rd world countries to make production costs even lower, truly cares deeply about children. The fact that they're trying to encroach on your privacy is bad enough; hiding behind children while doing so is disgusting.
@O1OO1O1
@O1OO1O1 Жыл бұрын
"that's for using children" - Trinity, Matrix Resurrections
@powerdude_dk
@powerdude_dk Жыл бұрын
Solid point there
@christiangonzalez6945
@christiangonzalez6945 Жыл бұрын
Another thing that makes me sick to my stomach: "are you hiding something?" Yeah, put 24/7 cameras in your bedroom, you aren't hiding anything.
@Mustachioed_Mollusk
@Mustachioed_Mollusk Жыл бұрын
We’re going to need you to stop posting opposing opinions because what if some children read what you said? Be responsible.
@ichigo_nyanko
@ichigo_nyanko Жыл бұрын
I can totally see this being adapted (quietly) to scan and detect copyrighted material and snitching to the police whenever it is found.
@jer1776
@jer1776 Жыл бұрын
Yep, as well as memes that promote "hate".
@RaaynML
@RaaynML Жыл бұрын
Anyone can have a different definition of freedom
@xXx_Regulus_xXx
@xXx_Regulus_xXx Жыл бұрын
that's definitely one of the primary goals, they just use the unobjectionable goal as the one to get their foot in the door. "we're gonna make sure there's no CP on your computer, which we know you don't have anon so don't worry about it. And while we're in there we're gonna check for pirated anime, maybe see if that netflix login in your keyring was borrowed from anybody. But you're not some TOS breaking freak, so you have nothing to worry about right bro?"
@zeppie_
@zeppie_ Жыл бұрын
I think it’s more likely that it straight up deletes those files from your hard drive and when you try to open it again you get some annoying as fucc notification like “we couldn’t find your files :(“
@youreyesarebleeding1368
@youreyesarebleeding1368 Жыл бұрын
@@jer1776 These heckin trolls on 4chan spreading H8 sp33ch memes are currently the BIGGEST THREAT TO OUR DeMoCrAcY!!!! We HAVE to scan your personal files, it's to protect Democracy! and the children! and minorities! and immigrants!!!
@kinomoto3633
@kinomoto3633 Жыл бұрын
one step closer to the day when having "offline storage" will be a suspicious, borderline criminal thing to have
@BlazeEst
@BlazeEst Жыл бұрын
Bruh 💀
@swish6143
@swish6143 Жыл бұрын
Same as cash.
@leandrogastonlovera8909
@leandrogastonlovera8909 Жыл бұрын
You can be jailed for failing to decrypt a hard-drive or file.
@SuperMassiveMax
@SuperMassiveMax Жыл бұрын
Time to switch to SD cards and USB drives with Veracrypt-encryption.
@NihongoWakannai
@NihongoWakannai Жыл бұрын
@@leandrogastonlovera8909 well yeah no shit, if they have a warrant then they have the right to demand you show it to them. The difference is that they need a court ordered warrant to do that, it's not a private company looking at your files whenever they want.
@mad_vegan
@mad_vegan Жыл бұрын
Soon we'll have "personalized ads" based on the types of files we have on our devices.
@Dave102693
@Dave102693 Жыл бұрын
Google and MS already does this
@riftsquid7659
@riftsquid7659 Жыл бұрын
You’re already getting ads based on every single thing that comes out of your mouth. 99% of your apps have baked in microphone access.
@Dave102693
@Dave102693 Жыл бұрын
@@riftsquid7659 exactly
@Hackanhacker
@Hackanhacker Жыл бұрын
@@Dave102693 They aren't on my device and it shows xD Mostly I have no ads anywhere, and when one or two slip through they have nothing related to me xD It's pretty hard to get rid of those systems though (phone)
@bremsberg
@bremsberg Жыл бұрын
Really miss the times when we had more hardware choices rather than between "don't be evil" trash bin and "think different" trash bin.
@nevabeensmart
@nevabeensmart Жыл бұрын
Monopoly is more than a kids game...
@Arimodu
@Arimodu Жыл бұрын
It's not really the hardware so much as the pre-packaged software. If someone took a recent Samsung, slapped a decent ROM on it and re-sold it, you would have your good choice, but that person would also have a BIG FAT LAWSUIT in their lap the next day.
@villagernumber77
@villagernumber77 Жыл бұрын
You forgot the where do you want to go today rubbish bin
@TheOfficalAndI
@TheOfficalAndI Жыл бұрын
And the first trashbin dropped the whole not being evil thing.
@vallisdaemonumofficial
@vallisdaemonumofficial Жыл бұрын
@@Arimodu this is why it should be as easy to build a phone as it is a PC. Seriously, fuck a lotta this hardware that's glued together.
@SergioLeonardoCornejo
@SergioLeonardoCornejo Жыл бұрын
Protecting children is always the excuse to impose authoritarian measures.
@ilearncode7365
@ilearncode7365 Жыл бұрын
The same people that think a bad sequence of pixel values is the worst thing ever because they care about children so much, are the same people that support a woman’s right to kill their own child while in the womb.
@HisCarlnessI
@HisCarlnessI Жыл бұрын
We both thinking about gun rights, and statistically crazy rare events?
@flamestoyershadowkill6400
@flamestoyershadowkill6400 Жыл бұрын
Or minorities
@UninspiredUsername40
@UninspiredUsername40 Жыл бұрын
Or terrorism
@Akab
@Akab Жыл бұрын
@@HisCarlnessI a bit of regulation like a background check should always happen(i mean it is already anyways in most states) but guns should definitely not be banned or even over regulated, yes. And yeah, "think of the children" is really such an old excuse, you would think people would've realized that farce by now but they still don't 🙄
@anthonyeid6534
@anthonyeid6534 Жыл бұрын
TLDR: mediaanalysisd is a process that has been around for years; its purpose is to send neural hashes to Apple and get back information on what those hashes mean, such as whether there's a cat in a photo, a painting, etc. It can be disabled by turning off Siri Suggestions under System Settings > Siri & Spotlight. Note that disabling mediaanalysisd will turn off Visual Look Up.

mediaanalysisd is part of Visual Look Up (VLU), a system Apple uses to recognize objects in photos (buildings, animals, paintings and more) and give users information about the photo. VLU works by computing a neural hash locally that is used to distinguish photos/videos by the objects within them. During VLU, Apple receives the neural hash, computes what that hash represents, and sends the result back to the user. I'm assuming the database used to compute the hash's meaning is too massive to be used locally and wouldn't be able to stay updated with new information.

I really didn't like the article this video is based on, because it makes claims without any real evidence or research. mediaanalysisd is a process that has been around for years and it's very easy to look up what it's used for and how to disable it. The author is close to a conspiracy theorist in my opinion. Anyway, a much more in-depth read on this topic can be found here: eclecticlight.co/2023/01/18/is-apple-checking-images-we-view-in-the-finder/
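For anyone who wants to check their own machine, here is a small sketch (Python shelling out to the standard macOS pgrep tool) that reports whether the daemon is currently running. Whether it is running says nothing by itself about what, if anything, it transmits:

```python
import subprocess

def daemon_running(name: str = "mediaanalysisd") -> bool:
    """Return True if a process with exactly this name is running (macOS)."""
    # pgrep -x matches the process name exactly; exit code 0 means at least one match
    result = subprocess.run(["pgrep", "-x", name], capture_output=True, text=True)
    return result.returncode == 0

if __name__ == "__main__":
    print("mediaanalysisd running:", daemon_running())
```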
@mapl3mage
@mapl3mage Жыл бұрын
literally the only person who bothered to look up and search what the program actually does. no one else bothered. to be fair, the program name is suspicious and enough to ring alarm bells for anyone who believes the government is out to get them, which is literally the target audience of this channel.
@bcj842
@bcj842 Жыл бұрын
So if I turn that off I won’t be able to grab people out of a photo and make a clipart of them?
@poweron3654
@poweron3654 Жыл бұрын
Should be much higher up.
@Me-eb3wv
@Me-eb3wv Жыл бұрын
Interesting
@js32096
@js32096 Жыл бұрын
Sad that I had to scroll this far, to see your comment. I recently subscribed to Mental Outlaw and if he doesn't address this misinformation, I'll stop trusting his channel. Takes minutes to debunk. Mental Outlaw should be doing this due diligence before disseminating to his audience.
@Hypnotically_Caucasian
@Hypnotically_Caucasian Жыл бұрын
company: “wOnT sOmEbOdY pLeAsE tHiNk oF tHe cHiLdRen?!?” Also company: * uses child labor in India to make $1,000 phones in horrific conditions *
@decoy3418
@decoy3418 Жыл бұрын
It didn't happen in America so it's not real.
@sorryi6685
@sorryi6685 Жыл бұрын
There is definitely no child labour in India making iPhones. Child labour exists in India, but in the unorganised sector and in the interior of the country, definitely not for iPhones. That would be a PR nightmare. Why take the risk when there is a huge adult population that will work for very low wages?
@avyam7509
@avyam7509 Жыл бұрын
It's china.
@justaweeb14688
@justaweeb14688 Жыл бұрын
It’s impossible to not have human rights violations in any of the manufacturing process. Fair phone tried and failed.
@Schopenhauer69
@Schopenhauer69 Жыл бұрын
China maybe. Child labor is a crime in India.
@turtleswithbombs
@turtleswithbombs Жыл бұрын
Brb, going to go break into people's houses to make sure they're not abusing children
@soda3185
@soda3185 Жыл бұрын
The hero we need but not the one we deserve /s
@users4007
@users4007 Жыл бұрын
@you will see it I suspect this to be rick roll
@whitelily2942
@whitelily2942 Жыл бұрын
@@users4007 no it’s spam
@idiotontheweb
@idiotontheweb Жыл бұрын
@TruthfulIy thx bro I needed it
@namesurname4666
@namesurname4666 Жыл бұрын
during your trip you want some cupcakes?
@Blood-PawWerewolf
@Blood-PawWerewolf Жыл бұрын
I called it the moment they "quietly" abandoned the CSAM BS last year. We all knew it would be snuck in without our knowledge.
@21N13
@21N13 Жыл бұрын
Do you have any evidence that this has been snuck in? The process that this KZfaqr who earns money off of this video mentions, without going into detail on while basing the entire video on it, has existed on macOS for over a decade. You can now select text in images, images are scanned for subjects and Visual Look Up fetches related information over the web. Both you and the KZfaqr earning money off of this clickbait video could've done 1 web search and discovered everything you could possibly want to know about how the process works. Instead you're spreading fake news and FUD here.
@timewave02012
@timewave02012 Жыл бұрын
Apple's original whitepaper described the scanning as host-side, using a form of homomorphic encryption to avoid revealing the perceptual hashes needed to craft preimage or collision attacks (think: attackers being able to craft false positive images). I don't support it in any way, but it should be no surprise, considering this is exactly how the system was originally designed to work.
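For readers unfamiliar with the term: a perceptual hash is designed so that visually similar images produce similar hash values, unlike a cryptographic checksum. The toy average-hash below only illustrates the idea; Apple's NeuralHash is a neural-network-based scheme and considerably more involved:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink, grayscale, then threshold each pixel against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit hash for an 8x8 grid

def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")

# Small edits (brightness tweaks, recompression) usually flip only a few bits, so two
# variants of the same picture stay within a small Hamming distance of each other,
# while unrelated images typically differ in around half of the 64 bits.
```

The preimage/collision concern mentioned above is exactly that someone who knows the hash function could craft an innocent-looking image whose hash lands close to a flagged one.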
@GareWorks
@GareWorks Жыл бұрын
Yeah, I think a lot of us had a pretty good idea of what they were really up to.
@RD-eh3tz
@RD-eh3tz Жыл бұрын
@@timewave02012 typical Apple being homomorphic, when will they learn to tolerate minorities.
@paulosullivan3472
@paulosullivan3472 Жыл бұрын
I think its safe to assume all tech companies are doing this with the full support of the governments.
@Micchi-
@Micchi- Жыл бұрын
Always has been *gunshot
@-_Somebody_
@-_Somebody_ Жыл бұрын
@@Micchi- I see what you did there
@warlockpaladin2261
@warlockpaladin2261 Жыл бұрын
Apple's relationship with China adds an interesting wrinkle to this fabric.
@benni1015
@benni1015 Жыл бұрын
If you look at what the EU commission does right now i feel like many governments want it to go much much further.
@Angelarius82
@Angelarius82 Жыл бұрын
One thing about this that no one seems to have considered is this: whatever this software is, it will only ever be able to make suggestions about what a photo "might" be. Before any police or authorities are involved, a real human will need to look at the suggestions the AI has given. If you have private photos of yourself on your computer, that might be enough to trigger a suggestion, and the next thing you know some stranger is looking at it because an AI said it "might" be something.
@Axman6
@Axman6 Жыл бұрын
The technology they were proposing to use was specifically not AI based, but Mental Outlaw just made assumptions because he’s too lazy to do any research at all. There’s not a single piece of evidence that anything here has anything to do with CSAM - go read the original blog post and see if you can find anything. IIRC, they were even going to require several hits against known CSAM before any action at all would be taken.
@nonormies
@nonormies Жыл бұрын
@@Axman6 do corporate boots actually taste like apples?
@Axman6
@Axman6 Жыл бұрын
@@Channel-gz9hm I'm not a fan of any of this, I'm glad Apple reversed the decision, but just making up facts without evidence and getting mad is pathetic and delusional. This is like QAnon levels of leaps of logic. Paid shill, I wish; I'm just interested in verifiable facts instead of sensationalism with zero evidence.
@anglepsycho
@anglepsycho Жыл бұрын
It probably will be both AI and human error in the way since, currently, AI has to mess up and blend in with brush tools to make a form of a match like DALLE.
@thesenamesaretaken
@thesenamesaretaken Жыл бұрын
@@Axman6 it's a Qanon level leap to logic to think tech companies that datamine customers and that collude with governments might do both at the same time, wew
@kevina.4036
@kevina.4036 Жыл бұрын
"We know what is best for you. Think of the children". Atrocities have been committed for less.
@wrongthinker843
@wrongthinker843 Жыл бұрын
"Think of the children" - a megacorporation that exploits child labor
@YouKnowMeDuh
@YouKnowMeDuh Жыл бұрын
Less? You can drop a lot of words: "We know what's best." Yep, it's happened.
@nigeltheoutlaw
@nigeltheoutlaw Жыл бұрын
I'm really grateful to guys like you and Upper Echelon calling attention to problems like this that the mainstream is in full, unhesitating support of. This is some truly frightening, dystopian crap, and it's eternally disappointing how unintelligent the average NPC is that they lack any and all pattern recognition to realize "protect the children" is never the goal.
@ProteinFeen
@ProteinFeen Жыл бұрын
Thanks for the shoutout to upper echelon bouta check him out❤
@gamtax
@gamtax Жыл бұрын
@@ProteinFeen That guy came from gaming channel to commentary channel. Worth to check it out.
@ProteinFeen
@ProteinFeen Жыл бұрын
@@gamtax yeah I just subscribed to him, I really like the cyber security info channels especially this one. Keeps things fresh.
@maxia8302
@maxia8302 Жыл бұрын
While I like UEG, he sometimes gets stuff wrong. Problem is, we don‘t track false predictions. Remember his discussion that Musk would never buy Twitter or that SBF would be Epsteined?
@RT-qd8yl
@RT-qd8yl Жыл бұрын
That's why we need to round up the NPCs and put them all in a prison camp.
@DarkMetaOFFICIAL
@DarkMetaOFFICIAL Жыл бұрын
Trusting Apple with your privacy is like having Epstein as the family babysitter
@anglepsycho
@anglepsycho Жыл бұрын
Or Lena Dunham as the homeschool teacher.
@DarkMetaOFFICIAL
@DarkMetaOFFICIAL Жыл бұрын
@@anglepsycho lmao
@diablo.the.cheater
@diablo.the.cheater Жыл бұрын
Or trusting me with running a government.
@JayRagon
@JayRagon Жыл бұрын
@@diablo.the.cheater that's honestly not bad enough to compare to trusting apple with privacy
@Yorkshire42069
@Yorkshire42069 Жыл бұрын
What do you do to get around it?
@4.0.4
@4.0.4 Жыл бұрын
Using children as human shields for surveillance technology is in line with using them in sweatshops to make the iPhones 👍
@janpapaj4373
@janpapaj4373 Жыл бұрын
HOW THE FUCK DID APPLE TRAIN THE DETECTION MODEL
@Bloom_HD
@Bloom_HD Жыл бұрын
3 letter agencies have huge databases of cp. That's where it comes from whenever one of their whistleblowers wakes up to find 18TB of it uploaded onto his HDD and the local police "anonymously" tipped.
@alifahran8033
@alifahran8033 Жыл бұрын
The elites' private collection. A lot of people were regulars on Epstein's island.
@o-hogameplay185
@o-hogameplay185 Жыл бұрын
it is actually not that difficult. they just hash every image and see if they match with hashes of known csam
@hihihihi3806
@hihihihi3806 Жыл бұрын
@@Bloom_HD the agencies r the real pedos
@Bloom_HD
@Bloom_HD Жыл бұрын
@@o-hogameplay185 take the image, slightly alter it by manipulating brightness by 0.1% or adding a single off-color pixel into the corner. And boom, new hash. That's not what they do.
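To make the point concrete: with an exact cryptographic checksum, changing a single bit produces a completely unrelated digest, which is why naive exact-match hashing is trivial to evade and why perceptual/fuzzy hashing schemes exist. A quick demonstration:

```python
import hashlib

original = b"some image bytes..."
altered = bytearray(original)
altered[0] ^= 1  # flip one bit in the first byte

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(altered)).hexdigest())
# The two digests share no meaningful structure despite a one-bit difference.
```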
@NewWarrior21st
@NewWarrior21st Жыл бұрын
I actually cracked up when you were talking about Apple using the children pretext and then it cut to EDP 😆
@Dave102693
@Dave102693 Жыл бұрын
Facts
@0xsupersane920
@0xsupersane920 Жыл бұрын
The worst part is that they advertise: "Privacy, that's iPhone" and then do shxt like this. They have the money to advertise to get the most reach possible, like for major sporting events, awards shows and all over the internet, which makes this problem even worse.
@beardalaxy
@beardalaxy Жыл бұрын
There was the story from i think a year or 2 ago where Google flagged a guy who had sent a text of a rash on his child's genitals to their doctor, and he got visited by the cops and everything. Had a really bad time with it. I can't imagine the stress from that happening with everyone thinking you actually had CSAM on you. That's messed up.
@janmaker227
@janmaker227 Жыл бұрын
This!!! Had a similar situation, and you know what's the craziest thing? As a parent you go crazy thinking about how many people get to look at it that are not you or your doctor!
@beardalaxy
@beardalaxy Жыл бұрын
@@xCDF-pt8kj right, it's impossible for an AI to know intent unless you very clearly spell it out to them on a case-by-case basis, and sometimes not even then.
@tangentfox4677
@tangentfox4677 Жыл бұрын
The true highlight of that particular story is how even after proving innocence and getting everything sorted out, Google upheld banning him from their platforms for CP. Which means this guy can never work at any company that uses Google products, can't access any of his data that Google held, can't use Android, can't access his email, can't ever use any tool built on a Google login. He is effectively locked out of a significant portion of society simply because an automated system doesn't understand context matters.
@evil_radfem9162
@evil_radfem9162 Жыл бұрын
so, the scanner worked
@beardalaxy
@beardalaxy Жыл бұрын
@@evil_radfem9162 it did, but it lacks context and thus fucks over someone
@cubusek5849
@cubusek5849 Жыл бұрын
Maybe they should also detect photos of you repairing your Apple device, so they can send the secret anti-right-to-repair police
@canardchronique3477
@canardchronique3477 Жыл бұрын
Are you under the impression that they aren't currently doing exactly that? On the remote chance they aren't yet, maybe you shouldn't give them any ideas...
@Hackanhacker
@Hackanhacker Жыл бұрын
At some point Apple is gonna make Transformers for real, damn it
@Network126
@Network126 Жыл бұрын
@@Hackanhacker Lol how old are you?
@KyzoVR
@KyzoVR 4 ай бұрын
@@Network126i am 4
@arghpee
@arghpee Жыл бұрын
>petite woman nudes false flagged by Apple Bot >sent to Apple HQ all part of the plan.
@appalachiabrauchfrau
@appalachiabrauchfrau Жыл бұрын
tfw 4ft11 womanlet with babyface realizing Apple has turned me into a weapon, gg.
@arghpee
@arghpee Жыл бұрын
@@appalachiabrauchfrau mfw 6'5 and own an Android. 🗿
@BlazeEst
@BlazeEst Жыл бұрын
Just deleted nsfw content of petite aged women because this video got me paranoid, in the future there’s gonna be laws from being attracted to youthful petite women
@synexiasaturnds727yearsago7
@synexiasaturnds727yearsago7 Жыл бұрын
@@arghpee damn, should've trolled big bro whatever, you're putting the "big bro" in the first sentence anyways
@anglepsycho
@anglepsycho Жыл бұрын
Lmao, I'm concerned for East Asian middle-aged women now, better get that surgery if you don't want anything bad done to your chest online.
@matthewsjardine
@matthewsjardine Жыл бұрын
Apple wasn't happy with the old adage: "The cloud is just someone else's computer...". They decided to take it one step further, 'your computer is just someone else's computer'.
@paracelsus407
@paracelsus407 Жыл бұрын
If Apple wanted to protect children, they'd build it into a Camera app, rather than search files on the device. All this does is make it easier to frame innocent people.
@brandonw1604
@brandonw1604 Жыл бұрын
They don’t search on device. This article is 100% wrong.
@xinfinity4756
@xinfinity4756 Жыл бұрын
@@brandonw1604 could you provide a reputable article demonstrating or at least explaining how/ why it doesn't?
@brandonw1604
@brandonw1604 Жыл бұрын
@@xXGeth270Xx but they’re not scanning anything so there’s that.
@Axman6
@Axman6 Жыл бұрын
Having it in the camera makes no sense, this was supposed to detect *known* CSAM material, which is derived from a database of image hashes of material collected by law enforcement agencies. It isn’t capable of detecting new material being generated, which is a massively more difficult problem and one significantly more prone to false positives.
@brandonw1604
@brandonw1604 Жыл бұрын
@@xinfinity4756 the other part is common sense. The scanning was done server side on iCloud. They’re not going to use a process calling home that you can block with an application firewall like Little Snitch.
@VBYTP
@VBYTP Жыл бұрын
One more reason I'm really glad I didn't buy an Apple device this Christmas. IMO if you want to protect children, start putting in real punishments for child abusers and leave good normal people alone.
@notafbihoneypot8487
@notafbihoneypot8487 Жыл бұрын
It's like getting rid of encryption, it would only hurt everyone. Even if it's for a good cause. I don't trust that it wouldn't be abused on a massive level
@pluto8404
@pluto8404 Жыл бұрын
That would be the logical thing to do. But as we have seen in the gun debate or war on drugs, it is about controlling the normal people, they dont care about the criminals.
@ryderostby
@ryderostby Жыл бұрын
Its always the safety excuse, always when it comes to taking away your privacy because ‘the less control you have the better’. Yeah I feel even better giving my data to the biggest data whores on this fucking planet
@m0-m0597
@m0-m0597 Жыл бұрын
come on guys, who really needs privacy? Let them just look through you. All will be fine. Also, isn't it just annoying being constantly concerned about stuff like that? Enjoy life and stop thinking :-)
@SergioLeonardoCornejo
@SergioLeonardoCornejo Жыл бұрын
Children are the excuse because they know that appeals to the feelings of people
@stephaneduhamel7706
@stephaneduhamel7706 Жыл бұрын
How would they even train an AI to recognize this kind of image? I can't imagine any ethical or legal way of getting training data.
@sprtwlf9314
@sprtwlf9314 Жыл бұрын
Ethical and legal aren't considerations for the global elite.
@sampletext9426
@sampletext9426 Жыл бұрын
@Happy Hippie Hose why is it ok for the government to have the largest collection of see pea, but not us?
@masterloquendo0
@masterloquendo0 Жыл бұрын
@@sampletext9426 nigga what?
@ali-1000
@ali-1000 Жыл бұрын
@@sampletext9426are you saying that you should have the right to store and view CSAM material 😟🤨
@sampletext9426
@sampletext9426 Жыл бұрын
@@ali-1000 absolutely not, but we both can agree that our leaders can store and keep all they want
@Reaya
@Reaya Жыл бұрын
The people who store this type of data aren't stupid enough to put it in the cloud, and I'm sure Apple is aware of this. The only reason this was even considered is that they want to collect even more information from their users under the guise of protecting the children.
@YouAreStillNotablaze
@YouAreStillNotablaze Жыл бұрын
They actually really are that stupid and the thing is so many do this out in the open it's astronomical and LEAs can't even keep up.
@MunyuShizumi
@MunyuShizumi Жыл бұрын
A moment of silence for all the conversations where our disapproval will immediately be rebuked with "So you're defending child predators?" followed by incoherent screeching noises.
@DsiakMondala
@DsiakMondala Жыл бұрын
reeeeEEEE
@BrandonCurington1
@BrandonCurington1 Жыл бұрын
People, this is what we have been talking about. The future. We need to stop it while we still can. (Remember; you will own nothing, and you will be happy) I do not want to live in a world like this.
@totallynotsarcastic7392
@totallynotsarcastic7392 Жыл бұрын
there is no political way to stop it
@appalachiabrauchfrau
@appalachiabrauchfrau Жыл бұрын
I'm just going back to disposable cameras. Got a roll of film developed and it felt like opening a present.
@MOTH_moff
@MOTH_moff Жыл бұрын
Print all your photos and keep them in a book. I'm only half joking.
@BrandonCurington1
@BrandonCurington1 Жыл бұрын
@@MOTH_moff yeah thanks im gonna print out my messages and mail them mf
@PsRohrbaugh
@PsRohrbaugh Жыл бұрын
There are thousands of people who had charges dropped when they proved that a virus or hacker downloaded the illegal images, not them. There are thousands in jail right now who claim to be innocent but were unable to prove it. That should terrify you.
@PsRohrbaugh
@PsRohrbaugh Жыл бұрын
@@bacon222 No. I can't post links here, but search things like "State Worker's Child Porn Charges Dropped; Virus Blamed" "Child porn downloaded by mistake" "Computer porn hacker is making our lives a misery" There used to be a lot more examples easily available through Google, but I've spent 20 minutes searching and can't find any. Hmm... 🤔
@fatalityin1
@fatalityin1 Жыл бұрын
@@bacon222 It is a common "hack" first introduced by 4chan during their Scientology raids, when they drove by with laptops, scanned the WLAN and then downloaded illegal material onto the Scientology routers' storage. Now that kind of software is a lot more sophisticated; some viruses even deliberately download illegal images from a government honeypot into a hidden folder in your system directory, making them impossible to find while you can expect an "FBI, open up" within a week. The shocking thing is: the more versatile you are with PCs, the more likely you are to get sentenced, because in theory you should have been able to prevent it. People working in IT in my country are already fighting this kind of legislative move; just because you work in IT doesn't mean you are omnipotent, and even the accusation is enough to make you unemployable forever.
@Dave102693
@Dave102693 Жыл бұрын
Idk why people don't think that this happens more often than not.
@fayenotfaye
@fayenotfaye Жыл бұрын
“There are thousands in jail right now who claim to be innocent but we’re unable to prove it” Source: it came to them in a dream
@Memelord18
@Memelord18 Жыл бұрын
source?
@mario7501
@mario7501 Жыл бұрын
There's a reason the government always calls laws that infringe on privacy something like "kid's online protection act". It's a very sinister game
@raj18x
@raj18x Ай бұрын
Agreed
@Tathanic
@Tathanic Жыл бұрын
say you don't like apple, "your icloud" suddenly finds 100TB of CP they found while scanning
@sampletext9426
@sampletext9426 Жыл бұрын
its scary that sea pe has become a weapon that can incriminate people. why aren't the citizens worried?
@25thDaveWalker
@25thDaveWalker Жыл бұрын
I love it that there's a reply under this comment but I can't view it, because youtube hid it from me. What a time to be alive
@audigamer8261
@audigamer8261 Жыл бұрын
@@25thDaveWalker same
@warlockpaladin2261
@warlockpaladin2261 Жыл бұрын
I too have noticed that the comment count is frequently off on YT.
@fayenotfaye
@fayenotfaye Жыл бұрын
@@25thDaveWalker it’s spam. There’s no proof as of yet that KZfaq shadow bans people from comments, they sometimes do it from the recommended feed but not from comments.
@ZucchiniCzar
@ZucchiniCzar Жыл бұрын
It's always in the name of "security".
@B1anticabal999
@B1anticabal999 Жыл бұрын
Or "officer safety "
@XXLuigiMario
@XXLuigiMario Жыл бұрын
Just not *your* security
@SimGunther
@SimGunther Жыл бұрын
More reason to use the BSDs, Linux, Haiku, and TempleOS
@nigeltheoutlaw
@nigeltheoutlaw Жыл бұрын
RIP Terry
@frog7362
@frog7362 Жыл бұрын
RIP brother terry
@hereticanthem5652
@hereticanthem5652 Жыл бұрын
BDSM as well
@steelfox1448
@steelfox1448 Жыл бұрын
RIP Terry
@RandyHanley
@RandyHanley Жыл бұрын
So true! These crooked companies are no different than crooked politicians, trying to sell it as something for the good of the people.
@gamergaminggmod
@gamergaminggmod Жыл бұрын
I love Apple - pay 3 times more for a laptop that's overpriced af, and have your privacy violated, because "Think of the children". I remember those classic Simpsons episodes where the Reverend's wife screams "Won't somebody, please, think of the children!"
@by9798
@by9798 Жыл бұрын
You're my favorite KZfaqr bro! Always alerting me of important tech news without stressing me out and making me rage. Easy listening even when the matter is serious or infringing.
@JamesWilson01
@JamesWilson01 Жыл бұрын
The sad thing is that most Apple users are so brainwashed that they either don't care or think this kind of thing is a great idea 😬
@forbidden-cyrillic-handle
@forbidden-cyrillic-handle Жыл бұрын
Good for them. They get exactly what they want. The last good Apple for me was Apple ][.
@nitebreak
@nitebreak Жыл бұрын
@@forbidden-cyrillic-handle I have an apple phone and I like it well enough because of some of the features but this makes me nervous. I have music on my iTunes that was not all purchased and I wonder if they are gonna start dmca on private files…
@localvoid69420
@localvoid69420 Жыл бұрын
@@nitebreak same, hope they don't dmca me cuz I don't wanna get in trouble just because I was listening to music that's not available on itunes
@ArdivKmen
@ArdivKmen Жыл бұрын
@@forbidden-cyrillic-handle Every single phone manufacturer copies Apple in one way or another, what makes you think it will only be Apple that does stuff like this? No matter how much we complain all phones will get this eventually. You see what kind of wiretap people install in their homes, all they have to say "Think of the children" and suddenly disabling that feature makes you a suspected child molester.
@Skullet
@Skullet Жыл бұрын
No, the sad thing here is making gross generalisations about people based on the products they buy.
@veirant5004
@veirant5004 Жыл бұрын
The joke is that you can be imprisoned for a decade just for clicking on a "download" button somewhere on the Internet. I don't even give a cr*p about the content of what is being downloaded. Jail for saving a picture from the 'net. It's the f*cking end. And yeah, hey to the freedomest state of America ❤️.
@gorofujita5767
@gorofujita5767 Жыл бұрын
This has less to do with scanning for copyrighted material, and more to do with spying on (or even falsely incriminating, when really needed) political or ideological opponents marked by the NSA. Also, in extreme situations, think of what they could possibly do to undesirable counter-establishment speakers. If they can scan your files, they can also make something "bad" mysteriously appear on your phone during police custody. There's a world of possibilities to this, and I wish more people thought about the implications seriously.
@bedazzledmisery6969
@bedazzledmisery6969 Жыл бұрын
My prediction is this is gonna really bring back old school film photography and developing in a dark room to prevent images uploaded to any types of clouds.
@captainsmirk9218
@captainsmirk9218 Жыл бұрын
A LOT of people put their intellectual property in icloud - ideas, business plans, unpublished book scripts, trade strategies, engineer designs, personal and bank information, etc. etc. Its only a matter of time before this is stolen by a bad actor given this level of access "for the children" (whether a hacker or contractor paid to look through files). I've been an iphone supporter since the iphone 2G - I am now getting rid of ALL my apple products. The access isn't just images and videos.
@Haise-san
@Haise-san Жыл бұрын
Me too, fuck that company
@bcj842
@bcj842 Жыл бұрын
Don’t throw your stuff out unless you plan on buying some next-level privacy shizz to replace it. Throwing out your Apple device over a headline just to go out and buy something from Xiaomi or Google is just trading one prison cell for another.
@MrTonyBarzini
@MrTonyBarzini Жыл бұрын
@@bcj842 is pinephone viable?
@bcj842
@bcj842 Жыл бұрын
@@MrTonyBarzini Sadly, this is about where my expertise ends… I know there’s more privacy-oriented devices out there, but what I don’t know is which ones are solid and which ones to avoid. I don’t have the technical background to make that call.
@raphaelcardoso7927
@raphaelcardoso7927 Жыл бұрын
@Kougami where in their user agreement?
@xE92vD
@xE92vD Жыл бұрын
I wouldn't be surprised if iPhone users still continued using their spyware filled devices after witnessing this.
@ygx6
@ygx6 Жыл бұрын
@Vetrus cuz they isleep when people state facts
@fanban2926
@fanban2926 Жыл бұрын
Same with Samsungs tho
@digi3218
@digi3218 Жыл бұрын
witnessing what? Sent on iPhone
@ygx6
@ygx6 Жыл бұрын
@@fanban2926 duh, google devices but apple just exposed themselves for scanning local images and forwarding them to their servers, google hasn't _yet_
@staidey5994
@staidey5994 Жыл бұрын
@@fanban2926 It's a completely different thing when the company doesn't actively advertise their privacy features. Afaik, neither Google nor Samsung nor any other Android device manufacturer has ever advertised their phones as being private; however, Apple had been touting the privacy of their phones until they recently got exposed for those phones not being as private as they've claimed this entire time.
@kotzpenner
@kotzpenner Жыл бұрын
Corporations and Governments try not to spy on their customers/citizens (IMPOSSIBLE, GONE SEXUAL)
@hmr1122
@hmr1122 Жыл бұрын
I wouldn't be surprised if windows already has some type of file scanning in the works.
@by9798
@by9798 Жыл бұрын
Onedrive randomly sends me notifications about how it made various photo albums for me every few months and it freaks me out a little bit. Even if it's just looking at timestamps in metadata it's like WTF I did not ask for this.
@anglepsycho
@anglepsycho Жыл бұрын
@@by9798 I forgot they do that-
@ra2enjoyer708
@ra2enjoyer708 Жыл бұрын
They did play with the idea of showing you ads right in the file manager, so it's not so far off. Especially how windows normalizes having superuser privileges at all times, therefore any program can send and receive arbitrary data from the internet.
@DsiakMondala
@DsiakMondala Жыл бұрын
@FLN 764 Trop kek. There is no such a thing as deleting something you uploaded to the internet new friend. It is there forever.
@DsiakMondala
@DsiakMondala Жыл бұрын
@FLN 764 You... think you get to choose what goes on your windows installation? A-are you t-that new? OwO'
@GSFigure
@GSFigure Жыл бұрын
Madness is about to flourish.
@notafbihoneypot8487
@notafbihoneypot8487 Жыл бұрын
Don't worry. Give me your Phone number and I'll send You secure link to protect your DATA. SPONSORED BY NORDVPN
@MaxDankOG
@MaxDankOG Жыл бұрын
@Vetrussuper based
@totallynotsarcastic7392
@totallynotsarcastic7392 Жыл бұрын
"about to"?
@big_red_machine3547
@big_red_machine3547 Жыл бұрын
And AI is about to magnify what’s already so wrong about this sick society
@Slugbunny
@Slugbunny Жыл бұрын
Can practically see the privacy being dusted out of that ecosystem.
@daverei1211
@daverei1211 Жыл бұрын
You got to wonder the potential misuse of this where bad actors use adware or drive by malware to drop “suspicious” images and then try to extort you by showing that there is a file, and post to this scanning, and that by paying them 1btc for them to protect you……
@o-hogameplay185
@o-hogameplay185 Жыл бұрын
Louis Rossmann made a video about a man who was asked by his son's doctor to send pictures of his son's injuries so he could get the correct medicine faster. Google scanned the photos and alerted the police because they mistook them for CSAM. The worst part is that, if I'm correct, before talking to the police a human went through those photos, CONFIRMING they were CSAM... so not only will AI scan your photos (or files), but some Karens will too
@geeshta
@geeshta Жыл бұрын
So if Google automatically makes collections out of my photos like "Animals", "Food", "Nature" etc. does it mean it scans my photos as well?
@PvtAnonymous
@PvtAnonymous Жыл бұрын
is this a rhetorical question?
@tmacman0418
@tmacman0418 Жыл бұрын
Yes, all your photos are fed into its AI, including the faces of everyone you took a picture with, so they can track you and send ads to you more effectively.
@totallynotsarcastic7392
@totallynotsarcastic7392 Жыл бұрын
do you really need to ask?
@DarthChrisB
@DarthChrisB Жыл бұрын
No, the computer just happens to know what's in the photo without looking at it...
@filiphabek271
@filiphabek271 Жыл бұрын
For me it doesn't happen. Which google product do you use?
@ProteinFeen
@ProteinFeen Жыл бұрын
Just found your account and man you are an awesome creator. I’m surprised I haven’t seen your videos before. I know nothing about half the stuff you talk about but you got a like and sun from me!❤
@ProteinFeen
@ProteinFeen Жыл бұрын
Sub*
@MentalOutlaw
@MentalOutlaw Жыл бұрын
Thanks, glad you enjoy the videos
@mattgamei5vods649
@mattgamei5vods649 Жыл бұрын
@@MentalOutlaw when do you launch your onlyfans?
@ProteinFeen
@ProteinFeen Жыл бұрын
@@MentalOutlaw great stuff watching this on my iPhone 14 kms😂😂
@ProteinFeen
@ProteinFeen Жыл бұрын
@TruthfulIy someone has to clean the toilets everyone can’t just play in the computer
@bradhaines3142
@bradhaines3142 Жыл бұрын
This is going to be a great way for Apple to destroy people's lives for nothing. And even worse, this could end up a boy-who-cried-wolf issue, with all the false positives making people ignore it when there's a legit positive.
@ozymandias_yt
@ozymandias_yt Жыл бұрын
Two things first:
1. The following criticism isn't supposed to defend Apple. It's just about getting a broader perspective.
2. The approach of analysing personal data is alarming, which makes the main point of this video absolutely valid.
BUT I think there are a few things wrong here.
1. "mediaanalysisd" is not new. It has been around for some time and is used for many different things. One of them is Spotlight. Everything typed into Spotlight is sent to Apple servers for analysis, because there is a web search feature in this program as well. Even without iCloud, I can imagine the intelligent file search is still turned on, which allows the user to search for stuff and macOS finds local pictures with the corresponding thing in them. I see no evidence that Jeffrey Paul's pictures are sent to Apple or that they are scanned for CSAM content.
2. Apple uses hashes for comparison with the NCMEC database. This means there is no AI analysing photos the way some people seem to imagine it. Theoretically it is even supposed to "check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures" (The Verge). The last statement is questionable, because the algorithm apparently cannot guarantee perfect avoidance of false positives. However, this entire concept can't be compared with full image recognition software, even though it is still scary.
3. Another thing so many people got wrong: Apple never said they will stop working on their CSAM scanning software. They just stated that they will push the release further into the future because further development is needed. This means we definitely still need to keep an eye on this.
4. There are at least a few ways of transferring photos from iOS devices to the Mac without the Photos app. The Finder, which is the file manager, is indeed one of them.
@nousquest
@nousquest Жыл бұрын
The fact that there are closed source media analysis tools, important part being that they have to contact apple's servers to work, running in the background, given their track record, makes the suspicions valid.
@MillywiggZ
@MillywiggZ Жыл бұрын
Adobe is doing the same with everyone’s Photoshop, Illustrator, etc. files. But that’s to train their A.I. art program.
@Spaghetti742
@Spaghetti742 Жыл бұрын
That's actually a decent idea, if it were consent based. I mean, think about all of the people who use Photoshop/Illustrator. But there needs to be a way to opt out.
@Memelord18
@Memelord18 Жыл бұрын
@Spaghetti8696 I think it should be opt in for copyright reasons
@users4007
@users4007 Жыл бұрын
damn, if I ever use photoshop I’m def gonna pirate an old version then
@Spaghetti742
@Spaghetti742 Жыл бұрын
@@Memelord18 True, didn't think about that
@nothing_
@nothing_ Жыл бұрын
Can't they just scrape the internet like everyone else?
@smolbirb4
@smolbirb4 Жыл бұрын
I love your content especially the more Linux/privacy focused content, and more so when you cover gentoo stuff hardly anyone talks about gentoo
@Wampa842
@Wampa842 Жыл бұрын
They can't talk until they're done compiling.
@notafbihoneypot8487
@notafbihoneypot8487 Жыл бұрын
Gentoo takes so long to compile.
@notafbihoneypot8487
@notafbihoneypot8487 Жыл бұрын
@@Wampa842 😂😂😂
@smolbirb4
@smolbirb4 Жыл бұрын
@@Wampa842 true
@Icee47
@Icee47 Жыл бұрын
Bro I found your channel yesterday, and I’m binging all your vids man. So good! I’ll start becoming more private as well…
@LokiScarletWasHere
@LokiScarletWasHere Жыл бұрын
Client-side scanning is exactly what they advertised they'd do, before claiming they're rolling it back. What's changed is they're doing it even with iCloud turned off.
@IAmAlpharius14
@IAmAlpharius14 Жыл бұрын
Whenever I use any cloud platform, I treat it as if all my files are publicly visible and just encrypt everything I upload to it using GPG.
@rejvaik00
@rejvaik00 Жыл бұрын
Can you teach me this?
@alexandervowles3518
@alexandervowles3518 Жыл бұрын
@@rejvaik00 you can just use 7zip to lock your files if you're lazy
@jamhamtime1878
@jamhamtime1878 Жыл бұрын
Yeah, but are there better methods that are more easily integrated into any OS? I'm currently just using gpg: simple on Linux, easy enough to install/use on Windows, but very annoying to use on my phone when I do need it. Another comment said 7zip; maybe that would be more convenient? I never knew 7zip could even lock archives with passwords, I'll definitely try it later. Are there any more convenient methods for Linux+Windows+Android? And honestly, gpg on Android is not THAT inconvenient.
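In the same spirit as the gpg/7zip suggestions above, here is a minimal sketch of encrypting a file locally before it ever reaches a cloud-synced folder, using Python's cryptography library as one alternative tool (not what the commenters themselves use):

```python
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_file(src: str, dst: str, key: bytes) -> None:
    """Encrypt src into dst; only the ciphertext should ever be synced or uploaded."""
    with open(src, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open(dst, "wb") as f:
        f.write(ciphertext)

def decrypt_file(src: str, dst: str, key: bytes) -> None:
    with open(src, "rb") as f:
        plaintext = Fernet(key).decrypt(f.read())
    with open(dst, "wb") as f:
        f.write(plaintext)

key = Fernet.generate_key()  # keep this key somewhere safe, NOT in the cloud
encrypt_file("photo.jpg", "photo.jpg.enc", key)
```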
@aguywithaytusername
@aguywithaytusername Жыл бұрын
wait a minute. if it's an ai, how did they train it?
@upcomingweeb136
@upcomingweeb136 Жыл бұрын
Good question
@Dratchev241
@Dratchev241 Жыл бұрын
by feeding it tons of images that would get us tossed in a nice hotel with grey bars and doors.
@Ryfinius
@Ryfinius Жыл бұрын
Watch the snl skit where Dwayne Johnson trained his evil robot.
@RogueA.I.
@RogueA.I. Жыл бұрын
You know how…
@IaMaPh1991
@IaMaPh1991 Жыл бұрын
The glowies have literal terabytes of material in their possession, which they very likely fap to when they aren't arresting and prosecuting innocent citizens for "evidence" they likely planted in the first place. Of course they are willing to share it with a powerful corporation. It's certainly not illegal when THEY posses and distribute it amongst one another...
@h.e.pennypacker4728
@h.e.pennypacker4728 Жыл бұрын
This could be the best youtube channel for any topic, definitely for anything tech related
@alexander1989x
@alexander1989x Жыл бұрын
It's a very thin line between "scanning for crime" and "scanning for criticism, dissent, journalism and negativity".
@warlockpaladin2261
@warlockpaladin2261 Жыл бұрын
Apple's relationship with China adds an interesting wrinkle to this fabric.
@c-LAW
@c-LAW Жыл бұрын
Just the attorney fees defending against an accusation is thousands of dollars if not tens of thousands.
@kevinmiller5467
@kevinmiller5467 Жыл бұрын
Don't worry investor, it doesn't cost Apple a dime.
@LarsLarsen77
@LarsLarsen77 Жыл бұрын
One of MANY reasons why I've never owned an apple product.
@DoublesC
@DoublesC Жыл бұрын
9:15 Apple is a honeypot for people who want privacy but don't understand technology
@benni1015
@benni1015 Жыл бұрын
Since barely anybody seems to know: in the EU, the EU Commission is right now trying to implement something similar to CSAM scanning, but instead of files in cloud storage it targets our private communication. With client-side scanning they want an AI to read through your messages, scan your photos, etc. before they are encrypted and sent to the recipient.
@Basieeee
@Basieeee Жыл бұрын
I bet they will just scan for known checksums of csam files, but it could obviously be extended to anything they want to track.
@Bloom_HD
@Bloom_HD Жыл бұрын
Yes. Give them the benefit of the doubt. Always. Apple is your friend. /s
@homuraakemi9556
@homuraakemi9556 Жыл бұрын
Scanning the checksum wouldn't work since any alteration to the photo would change the checksum and that wouldn't catch any new CSAM either
@pluto8404
@pluto8404 Жыл бұрын
I hear Tim Cook liked "partying" on a famous island in the US Virgin Islands. I am sure he truly cares about this cause.
@ygx6
@ygx6 Жыл бұрын
They check for fuzzy hashes, not 1:1 checksums
@ygx6
@ygx6 Жыл бұрын
@@homuraakemi9556 they check for fuzzy hashes, not the exact checksum so cropping and editing won't do much but yes, new csam images would slide by the check
@buddylee6203
@buddylee6203 Жыл бұрын
Im kinda tired of xmrig and lominer being reported as a virus
@GuideZer0
@GuideZer0 Жыл бұрын
I wish I could tell companies, "I don't care about the specificity of your advertising. The violation of privacy and commodification of personal information for the purpose of targeted advertising is creepy to me and I don't want you to try to sell me things on that basis!!"
@floppa9415
@floppa9415 Жыл бұрын
That explains why battery life is going down the shitter.
@c-LAW
@c-LAW Жыл бұрын
7:46 "Litttle Snitch" is a great little utilities. I've used it on Mac for many years.
@warlockpaladin2261
@warlockpaladin2261 Жыл бұрын
Explain?
@c-LAW
@c-LAW Жыл бұрын
@@warlockpaladin2261 It's an outbound app firewall
@Mustachioed_Mollusk
@Mustachioed_Mollusk Жыл бұрын
This HAS to be a major breach in privacy. What information is going to be stolen using the, “Think of the children” excuse?
@kxuydhj
@kxuydhj Жыл бұрын
when this controversy first showed up i thought "good, if there's anything i hate more than children it's child predators", but the road to hell is paved with good intentions and this really is a great example. me getting pissed off at targeted ads might also have had something to do with my switch in attitude, but whatevs.
@BastianInukChristensen
@BastianInukChristensen Жыл бұрын
Both Google and MS already do CSAM scanning, but for the time being I only know of it on their cloud services.
@spencer3752
@spencer3752 Жыл бұрын
Unfortunately, the author fails to establish that `mediaanalysisd` is actually scanning their files and transmitting any data about those files to Apple's APIs in a way that is inconsistent with their policies. In fact, quick research suggests this has already been investigated with conclusions to the contrary. The only thing the author proves is that a network connection was attempted, which can be for any purpose, including purposes for which the user has provided consent. For example, Apple will OCR images with text automatically, a capability which is powered by mediaanalysisd, and is available irrespective whether images are stored in iCloud storage or not...
@perrywood3839
@perrywood3839 Жыл бұрын
Would love a video on how to setup something like Little Snitch/Glasswire/open source alternatives for those of us still using OSX/Windows to try and cull stuff like this
@TywinLannister0
@TywinLannister0 Жыл бұрын
Little Snitch is a firewall application that monitors and controls outbound internet traffic (paid, proprietary, Mac).
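For people who want an open-source, scriptable approximation of what Little Snitch displays (which process owns which outbound connection), a rough cross-platform sketch using Python's psutil is below. Unlike Little Snitch it only lists connections, it cannot block them, and on macOS it typically needs elevated privileges:

```python
import psutil  # pip install psutil

# List established outbound connections and the process that owns each one.
for conn in psutil.net_connections(kind="inet"):
    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
        try:
            name = psutil.Process(conn.pid).name() if conn.pid else "?"
        except psutil.NoSuchProcess:
            continue
        print(f"{name:30s} -> {conn.raddr.ip}:{conn.raddr.port}")
```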
@andre-le-bone-aparte
@andre-le-bone-aparte Жыл бұрын
Question: Can you do a monthly news update of these kinds of topics? - This was helpful for us who use Unix / Linux but have to interact with MacOS / WinOS for work.
@11alekon
@11alekon Жыл бұрын
I wonder what would happen to weapon artists for games, because we have tons and tons of images/videos/documents of all sorts of weapons, even of ourselves holding the gun to understand how to use it. Apple is going to go mental over it in a strict gun-law country like the UK.
@ra2enjoyer708
@ra2enjoyer708 Жыл бұрын
Just get a friendly visit by a glowie every now and then. And also don't forget to pay an attorney each time.
@mimikyu_
@mimikyu_ Жыл бұрын
as an artist i didnt even think about this. ive been learning to draw weapons recently and i know my phone already scans the images and guesses what object is in them. like it made a folder for my cats full of all the cat photos in my phone without me doing anything. so it can easily see me having weapon images and think im some sort of criminal
@tp6335
@tp6335 Жыл бұрын
What interests me the most is, if Apple is training a closed source ai in house to look for csam, they are bound to have a training set in their possession or provided to them. Is the existence of such a training set not highly immoral? What if there is a bad actor somewhere involved and a leak occurs?
@nsfeliz7825
@nsfeliz7825 Жыл бұрын
You're right, someone somewhere is being paid to possess child porn for the purpose of training AI. The mere possession of CP is indeed illegal.
@pyromcr
@pyromcr Жыл бұрын
Some engineer at Apple is just looking for an excuse to jack off all day and came up with this project.
@MisterPancake778
@MisterPancake778 Жыл бұрын
Me keeping memes on my various cloud accounts to make the FBI agents look at thousands of memes while they work
@transposedmatrix
@transposedmatrix Жыл бұрын
Regarding the false positives, Apple has stated that 1) you don’t get flagged for a single positive, 2) the probability of false flagging is roughly 1 in 10^12, and 3) that all flagged images are reviewed manually. So the scenario you described in the beginning is very unlikely, if not just straight up incorrect. This is all stated in the CSAM paper Apple has released by the way, and probably one of of the first sources you’d check… That being said, everything else mentioned in the video is obviously a cause for concern.
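For what it's worth, a toy calculation shows why a multi-match threshold changes the picture so dramatically. The numbers below are invented for illustration and are not Apple's actual parameters: if each of n images independently false-matches with probability p, the chance of crossing a threshold of k matches is a binomial tail.

```python
from math import exp, lgamma, log

def log_binom_pmf(n: int, i: int, p: float) -> float:
    """Log of the Binomial(n, p) probability mass at i."""
    return (lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
            + i * log(p) + (n - i) * log(1 - p))

def prob_at_least_k(n: int, k: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p), summed in log space to avoid overflow."""
    return sum(exp(log_binom_pmf(n, i, p)) for i in range(k, n + 1))

# Illustrative numbers only, NOT Apple's actual parameters:
# 10,000 photos on an account, per-image false-match probability 1e-6.
print(prob_at_least_k(10_000, 1, 1e-6))   # ~1e-2: a single false match is plausible at scale
print(prob_at_least_k(10_000, 30, 1e-6))  # ~4e-93: effectively impossible with a 30-match threshold
```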
@sirpendelton5710
@sirpendelton5710 Жыл бұрын
Idc about false positives I don't want some jew looking at my phone
@transposedmatrix
@transposedmatrix Жыл бұрын
@@sirpendelton5710 Me too. I’m just saying that the first half of the video is basically completely wrong.
@Nightcaat
@Nightcaat Жыл бұрын
The other half’s wrong too. There’s no evidence that mediaanalysisd is actually checking for CSAM. The daemon has been present in macOS for years and is related to Spotlight for things like searching for text in photos
@transposedmatrix
@transposedmatrix Жыл бұрын
@@Nightcaat Yes, you’re right, I don’t think it’s conclusive evidence either. That being said, I don’t feel confident enough in my knowledge about that topic to claim it myself.
@kek207
@kek207 Жыл бұрын
Saying you don't have anything to hide is like telling people: you don't have anything to say ~Edward Snowden
@SuperTort0ise
@SuperTort0ise Жыл бұрын
9:24 "What happens on your iPhone, stays on your iPhone." Until your iPhone tells us what happened on it.
@pqsk
@pqsk Жыл бұрын
MS has been spying on Windows users for decades. I learned about this in a Windows administration class. Up to Windows XP there was a way to block it, but since Vista came out, if you block it then Windows no longer works.
@sirflimflam
@sirflimflam Жыл бұрын
mediaanalysisd is related to Visual Look Up. I have my own qualms with the inability to disable VLU, but it's not inherently related to the whole CSAM thing. It's also been around forever in one form or another.
@goofylookincat5028
@goofylookincat5028 Жыл бұрын
I can’t remember where I heard it, but if image scanning isn’t actually involved, Apple will at least try and scan the hashes of the files and see if it matches up with their database. Now this is still bad because when this technology gets to governments, they can scan and look for ANY kind of file such as a picture of Xi Jinping photoshopped onto Winnie the Pooh.
@ra2enjoyer708
@ra2enjoyer708 Жыл бұрын
> image scanning isn’t actually involved > scan the hashes of the files What do you think is the process of getting a hash of the file, lol?
@Fixer_Su3ana
@Fixer_Su3ana Жыл бұрын
What do you mean photoshopped? He already looks that way.
@moki5796
@moki5796 Жыл бұрын
You should post an update to the Jeffrey Paul story. This "issue" has been resolved now; it turns out it was a bug where the service would send empty requests whenever a media file was previewed. No information about the files was ever transmitted without consent.
@RegenerationOfficial
@RegenerationOfficial Жыл бұрын
While discussing founding a company that installs private home servers and rents out leftover space, we ran into the issue of distinguishing between legitimate childhood photos and abusive material. Like flirting, there is a fine line, up to denying it ever was what it seemed to be.
@Spumoon
@Spumoon Жыл бұрын
Cleaned out my temp files on Windows the other day and my bing search before and after yielded different results. Maybe I'm just a hecking n00b, but that came as quite a surprise to me.
@ussenterprise3156
@ussenterprise3156 Жыл бұрын
When Your family are apple fans but you are not
@jaytrip420
@jaytrip420 Жыл бұрын
Every day I see more and more reasons why my switch to android based devices was more than necessary
@yevoidstar
@yevoidstar Жыл бұрын
This video either needs to be cleared up or outright deleted, since it was never proven or shown that the network connections made by mediaanalysisd actually report results of scanning images for "CSAM". And it's coming out now that this was an outright bug... Nice video though.
@apIthletIcc
@apIthletIcc Жыл бұрын
Apple: "trust us we're not spying at all when we scan local files without permission" FBI: "trust us you need to use adblock" T-Mobile: "trust us not ALL your personal info was stolen 3x in the last two years"
@DYhalto250
@DYhalto250 Жыл бұрын
I mean, imagine being a parent and having baby pictures on your phone and getting arrested over that
@avyam7509
@avyam7509 Жыл бұрын
It already happened when Google reported a parent to the FBI for sharing pictures of her sick child with a doctor.
@DYhalto250
@DYhalto250 Жыл бұрын
@@avyam7509 that's insane. My children are already grown so less of a problem except for grandchildren.. it's spookie
@ilearncode7365
@ilearncode7365 Жыл бұрын
Incorrect/false reports should be a capital offense as the accusation itself is an attempt at destroying someone’s life. That would dry up wanton reporting in 1 day.
@something4922
@something4922 Жыл бұрын
@@avyam7509 And photos without sexual intent aren't supposed to be illegal (unless someone saves a copy with sexual intent), but Google decided to ban him and report him.
@vylbird8014
@vylbird8014 Жыл бұрын
@@something4922 And how do you prove no sexual intent?
@georgelsgomes9634
@georgelsgomes9634 Жыл бұрын
As far as I remember from Apple's NeuralHash white paper, it doesn't actually require parsing the image content; it works much like imagededupe. From a given file you compute a hash and then compare its value with the CSAM DB: if the distance between the hashes is within a threshold, it's flagged as CSAM, otherwise not. Sorry, my English 😅
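A rough sketch of that compare-against-a-database step, using Hamming distance as the similarity measure. The hash values and threshold below are made up for illustration; real systems use NeuralHash and a vetted database, neither of which is public:

```python
KNOWN_HASHES = {0x9F3A5C7E1B2D4680, 0x0123456789ABCDEF}  # hypothetical database entries
MATCH_THRESHOLD = 4  # maximum number of differing bits still treated as "the same image"

def hamming(a: int, b: int) -> int:
    """Number of bits in which two 64-bit hashes differ."""
    return bin(a ^ b).count("1")

def is_flagged(image_hash: int) -> bool:
    """Flag the image if its hash is within MATCH_THRESHOLD bits of any known hash."""
    return any(hamming(image_hash, known) <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```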
@Muzzino
@Muzzino Жыл бұрын
If somebody sends you an image over Discord (and other platforms), it gets cached to a hidden folder on your computer. Any image that comes up on your web browser gets cached as well, whether you're intentionally viewing it or not. Hard drives can even retain this data after it's been deleted.
@AnonymouslyBlack
@AnonymouslyBlack Жыл бұрын
Thank you for sharing this info. What a time to be alive, I swear. I'm not as well informed as you all are when it comes to secure systems, and will conduct my own research after posting this, but my question is "What is considered a secure mobile phone system these days?". Microsoft is out, Google is out, and now Apple. What are we left with? Or is it now up to the user to jailbreak their own system for some privacy?
@synexiasaturnds727yearsago7
@synexiasaturnds727yearsago7 Жыл бұрын
Google Pixel, then get rid of the OS and replace it with something else.
@ra2enjoyer708
@ra2enjoyer708 Жыл бұрын
There is no such thing as a secure mobile system, since the entire idea of smartphone is being a datamining device being constantly hooked to the internet.
@Zakkious
@Zakkious Жыл бұрын
Smartphones are inherently unsecure, unfortunately.
@vladusa
@vladusa Жыл бұрын
There was this idea at theopenpaper to design a virtual filesystem that uses steganography to alter images ever so slightly to make them useless. The filesys would encrypt the files when they enter the Mac and decrypt them on their way out, and the encrypt/decrypt logic is stored on the USB's microcontroller when it reads the bus between the Mac and the USB.
@ra2enjoyer708
@ra2enjoyer708 Жыл бұрын
And then your performance getting completely tanked, since adding encryption/decryption steps to something as basic as i/o pretty much kills caching.
@vladusa
@vladusa Жыл бұрын
@@ra2enjoyer708 But speed is the price you have to pay for security and ownership. We've seen this with Bitcoin. Your point is?
@fayenotfaye
@fayenotfaye Жыл бұрын
Just run Linux and encrypt your drive lmao
@aldrickfondracul9297
@aldrickfondracul9297 Жыл бұрын
It's times like this that I'm glad I've stuck with dumbphones for so long, even though I am pretty much fighting progress at this point. I seriously dread the day I finally get a smartphone.
@mcall9800
@mcall9800 Жыл бұрын
Apparently this was a bug in previous versions of macOS and the request was actually empty. Louis Rossmann made a video talking about the update, which I would recommend.
@alifahran8033
@alifahran8033 Жыл бұрын
Every single day this cursed world convinces me more and more that Uncle Ted is right. I am one step away from throwing my smartphone into the trash can and buying a "dumb" phone. The only thing stopping me as of today is the need for 2FA for my job.
@matthew8153
@matthew8153 Жыл бұрын
“Dumb” phones today aren’t actually dumb. They use android.
@alifahran8033
@alifahran8033 Жыл бұрын
No, I meant the likes of a Nokia from the late 2000s - early 2010s. The most basic phone that you can imagine. My favourite phone of all time was my Nokia 6300. Metal frame body, thin and cool design for it's time, SD slot, ability to run games (Real Soccer 2010 and Spiderman (don't remember the exact name) were my favourite)). It had zero spookiness and all the functionality.
@matthew8153
@matthew8153 Жыл бұрын
@@alifahran8033 Sadly none of those old phones work. They took down the 3G and older networks.
@alifahran8033
@alifahran8033 Жыл бұрын
The perks of not living in the USA. In our country (Bulgaria) the network providers still support 3G, but I am not 100% sure about 2G.
@JPX64Channel
@JPX64Channel Жыл бұрын
@@matthew8153 there are a lot of 3G phones from late 00s to early 10s, they still work.
@Krynos18
@Krynos18 Жыл бұрын
Hash based detection for known files. The real problem comes in with hash collisions. If your graduation photo happens to generate the same hash as a previously identified CSAM image, even if the photos are completely different, you are out of luck until a real human bothers to look at the file.