In the future, this should enable an ear-based sensor to analyze your form (say, detecting a stride imbalance while you run) or respond to gesture controls (nod your head to answer a call, or shake it to send the caller to voicemail). At the moment, no ear-based sensor has these capabilities, but Hviid says they're a possibility for upcoming models.