I’ve heard the term “Identity based computing” thrown around a bit recently, but maybe we should be thinking Dynamic Privacy. “Identity based” doesn’t seem quite right, given that UX has been aware of the user for many years: notifications just for you; your weather; your stocks; more recently, the computer being unlocked by the proximity of a biometrically authenticated device (Apple Watch).
But in watching a preview video of the iPhone X experience, one little detail struck me: the phone recognizes your face, and then does a full reveal of the previously private contents of your notifications. That seemed to take information disclosure to another level. Subtle, but check it out (the video starts at the 2:15 mark):
A bit of a leap, but it reminded me of the Minority Report / Blade Runner ads seeming to address you personally as you walk by.
Tricky stuff. There’s a fine line between convenience and embarrassment. If the iPhone is lying on the conference room table in a meeting, and the contents of my texts are intentionally hidden lest they reveal something personal (“Honey, did you like what I packed in your lunch box today?”, or “I can’t believe Harry is droning on again”), maybe it’s actually better that way.
Of course, this is where AI — in this case Siri — will come into play, and further entrench the OS-based agents. Siri should know that I’m in a meeting based on my calendar, and despite FaceID correctly identifying me, iOS would know to be discreet. Perhaps FaceID will end up gauging how much attention you’re paying: you need to be able to see that you have notifications, but maybe content is revealed progressively as the phone is brought to a proximity or viewing angle deemed private enough.
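To make that progressive-disclosure idea concrete, here’s a rough sketch in Python. Everything in it is hypothetical: the inputs (face distance, viewing angle, a calendar “in meeting” flag), the thresholds, and the disclosure tiers are my own illustrative assumptions, not anything Apple has documented.

```python
# Hypothetical sketch of progressive notification disclosure.
# All inputs, tiers, and thresholds are illustrative assumptions.

from enum import Enum

class Disclosure(Enum):
    BADGE_ONLY = 1   # just show that notifications exist
    SENDER_ONLY = 2  # show app and sender, hide the message body
    FULL = 3         # reveal full notification content

def disclosure_level(face_distance_cm: float,
                     viewing_angle_deg: float,
                     in_meeting: bool) -> Disclosure:
    """Pick how much notification content to reveal.

    face_distance_cm: how far the recognized face is from the screen
    viewing_angle_deg: angle off the screen's normal (0 = looking straight on)
    in_meeting: the calendar says the owner is in a meeting
    """
    # Calendar context wins: stay discreet in meetings even if the face matches.
    if in_meeting:
        return Disclosure.BADGE_ONLY
    # Phone held close and viewed head-on: deemed private enough.
    if face_distance_cm < 40 and viewing_angle_deg < 15:
        return Disclosure.FULL
    # Recognized, but at a distance or off-angle: partial reveal.
    if face_distance_cm < 80 and viewing_angle_deg < 30:
        return Disclosure.SENDER_ONLY
    return Disclosure.BADGE_ONLY

# Example: phone lying flat on a conference table, owner glancing sideways at it
print(disclosure_level(60, 45, in_meeting=False).name)  # off-angle -> BADGE_ONLY
```

The point of the sketch is just that identity (FaceID) becomes one input among several, with calendar and attention signals able to override it.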
Being geo- and calendar-aware is relatively straightforward (oh how the bar has been raised!). Going the next step, being aware of who you’re actually with, might be tough because you’re straddling OS ecosystems (Siri having to be aware of, say, your Google apps).