Welcome back to This Week in Apps, the weekly series that recaps the latest mobile OS news, mobile applications, and the overall app economy. The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spend in 2020. Consumers last year also spent 3.5 trillion minutes using apps on Android devices alone. And in the U.S., app usage surged ahead of the time spent watching live TV: the average American watches 3.7 hours of live TV daily but now spends four hours per day on their mobile devices.
Apps aren't just a way to pass idle hours; they're also a big business. In 2019, mobile-first companies had a combined valuation of $544 billion, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies, a figure up 27% year-over-year. This Week in Apps rounds up that news, along with suggestions for new apps and games to try, too. Do you want This Week in Apps in your inbox every Saturday? Sign up here: techcrunch.com/newsletters
Apple to scan for CSAM imagery.
Apple this week unveiled a significant initiative to scan devices for CSAM imagery. The company on Thursday announced a new set of features, arriving later this year, that will detect child sexual abuse material (CSAM) in its iCloud Photos service and report it to law enforcement. Companies like Dropbox, Google, and Microsoft already scan for CSAM in their cloud services, but Apple had allowed users to encrypt their data before it reached iCloud. Now, Apple's new technology, NeuralHash, will run on users' devices to detect when users upload known CSAM imagery, without decrypting the images. It can even detect the imagery if it's been cropped or edited in an attempt to avoid detection.
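Apple hasn't published NeuralHash's internals, but the general technique, matching a perceptual hash of each photo against a blocklist of hashes of known images, can be sketched with a much simpler stand-in. The `average_hash`, `hamming`, and `matches_blocklist` names below are illustrative inventions, not Apple's API, and a simple average hash is far less robust than a neural perceptual hash; this is only a minimal sketch of the matching idea:

```python
# Illustrative sketch only: NeuralHash is a proprietary neural perceptual
# hash. A simple "average hash" stands in here to show how on-device
# matching against a blocklist of known-image hashes can work.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid.
    Each bit is 1 if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(img_hash, blocklist, threshold=10):
    """Perceptual hashes of near-duplicate images differ in only a few
    bits, so a small Hamming-distance threshold tolerates crops, filters
    and other light edits."""
    return any(hamming(img_hash, h) <= threshold for h in blocklist)

# Demo: a lightly edited image still matches the original's hash.
img = [[r * 8 + c for c in range(8)] for r in range(8)]
edited = [row[:] for row in img]
edited[0][0] = 5  # small edit, akin to retouching one corner
```

Because only hashes are compared, the device can flag a match without the service ever decrypting or viewing the photo itself, which is the property Apple emphasizes.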
Meanwhile, on iPhone and iPad, the company will roll out protections in Messages to filter images and alert children and parents if sexually explicit photos are sent to or from a child's account. Children will not be shown the photos, but will see a grayed-out image instead. If they try to view the image through the link, they'll be shown interruptive screens explaining why the material may be harmful and warning that their parents would be notified.
Some privacy advocates balked at such a system, believing it could expand to end-to-end encrypted photos, lead to false positives, or set the stage for more on-device government surveillance. But many cryptology experts believe the system provides a good balance between privacy and utility, and have endorsed the technology. In addition, Apple said reports are manually reviewed before being sent to the National Center for Missing and Exploited Children (NCMEC). The changes may also benefit iOS developers who deal in user photos and uploads, as predators will no longer store CSAM imagery on iOS devices in the first place, given the new risk of detection.