Apple will soon scan iPhones in the U.S. for images of child sexual abuse. The company says it will use a tool called "neuralMatch" to detect known sexually explicit content involving children without decrypting users' messages. Security researchers are raising concerns over how the new feature could be misused. Prevailion CEO Karim Hijazi joined Cheddar News to discuss.