With its latest iPhone, Apple introduces Face ID, which lets users unlock their phones just by looking at them. Microsoft has had a similar feature, Windows Hello, for a few years, so Apple is playing catch-up here. But Apple has been borrowing others' ideas for a while now, so this is nothing new.
Now that Apple has this capability, the question is how it plans to use it in the future. Microsoft has pretty much limited it to unlocking devices. But Apple wants to be an innovator, and it is hard to believe that is as far as it will take the technology.
There are already algorithms that can tell your mood from your picture. If you look sad, will Apple automatically open your phone to a happy news story or iTunes song? Or automatically add an emoji to your texts when you are angry? Talk about autocorrect fails – sending an angry-face emoji by mistake could really hurt!
On the other hand, if the phone detects that you are angry, Siri could pop up a warning telling you to cool down before pressing send on that email or text.
The algorithms can also make a pretty good guess of your gender, age, weight, etc. Maybe Apple will use this to ‘help’ you shop. Or find a date.
The worst would be Siri automatically sending a text to your parents if it detected something amiss. Who wouldn’t want their mother calling all the time asking what’s wrong?
What if Apple starts making your face, or some data gleaned from it, available to others, like it does with location? It could all get very creepy very quickly, appropriate for a phone with a Halloween release date.