Apple recently announced plans to scan images on its users' iPhones for known child sexual abuse material using a tool called NeuralHash, which aims to detect matches without giving Apple access to the contents of users' photos or messages. While the goal is largely uncontroversial, the potential for misuse has made some in the information security and data privacy world wary. Karim Hijazi, a cybersecurity expert and CEO of Prevailion, joined Cheddar to discuss the ramifications of Apple's approach to combating child exploitation. "It's a very big step for Apple, certainly not without its challenges," Hijazi said.
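
In broad strokes, systems like this compute a perceptual "fingerprint" of each photo on the device and check it against a database of fingerprints of known abuse images, rather than inspecting the photos directly. The sketch below illustrates that matching idea with a simple average hash; it is not Apple's NeuralHash, which derives its hash with a neural network and wraps the matching in cryptographic protections, and the hash values and function names here are purely illustrative.

```python
# Rough conceptual sketch of perceptual-hash matching, NOT Apple's actual
# NeuralHash. All names and hash values are illustrative assumptions.

def average_hash(gray_pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the mean.

    `gray_pixels` is a flat list of grayscale values for a small,
    pre-resized image (e.g. 8x8 = 64 values).
    """
    mean = sum(gray_pixels) / len(gray_pixels)
    bits = 0
    for value in gray_pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Database of fingerprints of known abuse images (made-up values).
KNOWN_HASHES = {0x8F3C_A1B2_00FF_13D7}

def matches_known_image(gray_pixels, threshold=4):
    """Flag an image whose hash is within `threshold` bits of a known hash."""
    h = average_hash(gray_pixels)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

Because the comparison happens on hashes rather than the images themselves, the device never has to reveal photo contents to perform the check; the controversy centers on who controls the database of known hashes and how the matching could be repurposed.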