On April 27th the New York Times published an article describing the struggle that parental control apps, including Kidslox, have had with Apple over the last year or so. It also revealed that Kidslox, along with another parental control app provider, has lodged an official complaint against Apple with the EU’s antitrust authority. The story was picked up by major news outlets around the world (BBC, CNN, Wired, TechCrunch, The Verge, Mirror, Sky News, Independent, The Times, MIT Technology Review and many others).
The attention rose to such a level that, after Apple executive Phil Schiller defended the company’s position in a private email, Apple felt they needed to respond to the article publicly too. In their public statement they claimed that they had “removed several parental control apps from the App Store” because “they put users’ privacy and security at risk” through their use of Mobile Device Management (MDM).
This isn’t the time or place to dig into the details of our complaint to the European Authorities, but it is interesting to us that Apple has chosen to focus on the issue of user privacy and data security in their response.
It’s surprising because, in the course of the nine months that we’ve been having these difficulties with them, Apple didn’t mention the issue of user privacy at all for the first six months. Nor had they ever raised privacy concerns during the previous five years that the Kidslox app spent in the store, approved by the Apple review board more than 50 times. Even after the issue was finally raised, in the seventh month of our back-and-forth with Apple, it was never the focus of their instructions to us.
Now, suddenly, user privacy and security are centre stage in their reasoning: a simple, easy-to-understand explanation for their behaviour. Unfortunately, the real situation is more complex than their statement allows for.
Having said that, we applaud Apple’s emphasis on the importance of user privacy and security. We agree wholeheartedly that these are vital elements of their ecosystem, which need protecting.
Kidslox has always prioritised user privacy and data security in our approach to implementing parental controls. The introduction of the GDPR last year led us to make a few small changes to our policies and practices, but for the most part the internal review it prompted highlighted the many ways in which Kidslox already values and protects the privacy and data security of our users.
In fact, protecting our users’ children is the core purpose of Kidslox as a system: protecting them from a whole range of physical, social, mental and cyber problems associated with excessive, unsafe and inappropriate device use.
Public awareness of the risks posed to children by unlimited device access grew greater than ever last week, when the World Health Organisation released recommendations that infants aged 0–2 should spend no sedentary time in front of a screen at all, and that 3–4 year olds should have a maximum of one hour a day. For comparison, in 2015 Common Sense Media found that American teens average almost nine hours of screen time a day.
But just like most parents who’ve given their child a device, we don’t need to be told about its dangers. We’ve witnessed first-hand the disruption devices can cause to family communication, sleep patterns and school grades, and we’ve heard plenty of horror stories from friends, family and Kidslox users about even more serious problems.
Apple obviously recognise the need to address issues like this; that’s why they’ve released features to help families place boundaries on screen time. Their supposed concern rings hollow, though, when they simultaneously make it impossible for other solutions to provide effective parental control measures.
Citing the supposed dangers of MDM as the reason for shutting down parental control apps is misleading for several reasons. First of all, the blocking of Kidslox store updates in mid-2018 was not initially related to our use of MDM. Secondly, while MDM as a technology does have some potential for abuse, we have had an MDM-based app in the store for almost five years now and have a proven record of responsible use, which includes explaining to our users during set-up what Kidslox uses MDM for (blocking access to apps when kids use up their parent-set time limits) and requesting their permission to do so. Thirdly, and perhaps most significantly, Apple offers developers no effective alternative to MDM for building parental control features, despite using such an alternative themselves in their own “Screen Time” features.
Ultimately, making the “Screen Time” APIs public is the solution that would truly resolve this issue and prove Apple’s commitment to the safety and welfare of children. It would allow third-party developers like us to create effective products that give users genuine choices, while also complying with Apple’s own standards.
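To make the ask a little more concrete, here is a purely hypothetical sketch, in Swift, of the kind of capability we have in mind. None of these types or methods exist in Apple’s SDK; the names (AppUsagePolicy, ScreenTimePolicyStore and so on) are our own illustrations of what a public, system-enforced time-limit interface could look like for a third-party app.

```swift
import Foundation

// Hypothetical illustration only: none of these types or methods exist in
// Apple's SDK. They sketch the kind of interface a public Screen Time API
// could give third-party parental control apps.

/// A parent-set daily allowance for a group of apps.
struct AppUsagePolicy {
    let bundleIdentifiers: [String]   // e.g. ["com.example.game"]
    let dailyLimit: TimeInterval      // seconds of allowed use per day
    let blockWhenExhausted: Bool      // block the apps once the limit is used up
}

/// The system-level hook we would like Apple to expose publicly.
/// Today, only Apple's own Screen Time feature can enforce limits like this.
protocol ScreenTimePolicyStore {
    func apply(_ policy: AppUsagePolicy) throws
    func remainingTime(for bundleIdentifier: String) -> TimeInterval?
}

/// A stand-in implementation so the sketch is self-contained; a real version
/// would hand the policy to the operating system rather than keep it in memory.
final class InMemoryPolicyStore: ScreenTimePolicyStore {
    private var policies: [String: AppUsagePolicy] = [:]

    func apply(_ policy: AppUsagePolicy) throws {
        for bundleID in policy.bundleIdentifiers {
            policies[bundleID] = policy
        }
    }

    func remainingTime(for bundleIdentifier: String) -> TimeInterval? {
        policies[bundleIdentifier]?.dailyLimit
    }
}

// Example: a parent allows one hour of a game per day, blocked once used up.
let store: ScreenTimePolicyStore = InMemoryPolicyStore()
do {
    try store.apply(AppUsagePolicy(
        bundleIdentifiers: ["com.example.game"],
        dailyLimit: 60 * 60,
        blockWhenExhausted: true
    ))
    print(store.remainingTime(for: "com.example.game") ?? 0)
} catch {
    print("Could not apply policy: \(error)")
}
```

Whatever shape the real interface eventually takes, the point is the same: parental control apps need a sanctioned, system-enforced way to apply limits, rather than repurposing MDM.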
That’s why we’re adding our voice to those of so many developers, parents and even former Apple executives like “father of the iPod” Tony Fadell, in calling on Apple to make those APIs public and #GiveParentsControl to choose the tools best suited to their family’s needs.