Apple Inc. is gearing up to launch the latest version of its iPhone and iPad operating system, which will add nudity detection to the Messages app on devices used by children, according to journalist Mark Gurman.

What Happened: Gurman said in the latest edition of his newsletter that the coming changes to Messages in iOS 15.2 were “significant.” Apple will deliver nudity detection in Messages, along with new options in the Siri voice assistant for learning how to report child abuse. There is no word on the technology for detecting child sexual abuse material, or CSAM, in iCloud Photos.

Why It Matters: Child-owned iPhones, iPads, and Macs will analyze incoming and outgoing images passing through Messages to detect nudity. If nudity is found, the picture will appear blurred and the child will receive a warning before viewing it. Similarly, if the child attempts to send a nude image, they …