Apple has announced that it is suspending its Siri Quality Assurance program worldwide.
What happened
Sometimes Siri starts recording when no one has addressed it: for example, triggered by the sound of a zipper or by a raised wrist wearing an Apple Watch. As a result, contractors often hear couples having sex, confidential medical details, and even conversations about drug deals.
Apple says contractors receive less than 1% of daily user requests in order to analyze whether Siri is working correctly. Requests typically last only a few seconds, and the audio recordings “were supplied anonymously, randomly and were not tied to user names or IDs.”
However, the employees themselves say that they receive information about the user’s location, so finding out a name is not difficult. Contractor turnover is also high, so, in theory, users’ personal conversations could fall into the wrong hands.
What now
Apple responded to this information and suspended the program involving contractors. The program will now be revised: the company plans to release an update that will allow users to choose whether to submit their recordings for review.
“We strive to provide the best possible Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending the Siri performance assessment program on a global scale. Additionally, with a future software update, users will have the opportunity to opt out of participating in the rating program,” Apple said in a statement.