I went to an interesting meeting of the Churchill Club the other night called “Wearable Technology: The Next Frontier”. The panel included a number of experts: Yves Behar from Jawbone, Mike Bell from Intel, Justin Butler from Misfit, and Monisha Perkash from Lumo BodyTech. Although the moderator asked great questions, as did the audience at the end of the session, no one talked about collaboration. They all talked about the data from whatever device one was wearing, from a Fitbit to sensor-laden clothing.
A recent report has Xiaomi (a Chinese company with a $15 fitness tracker) as the second-biggest seller of wearables, behind only Fitbit. Fitbit has 34% of the current market, with Xiaomi next at almost 25%, and Garmin, Samsung, and Jawbone the next three biggest players. This does not count the recently released Apple Watch, which will probably sneak into the top three the next time the report is done.
But the question here is not whether these devices will help aggregate health data, share data with each other, or even who owns the data; it is whether these devices will stop being “walled gardens” and allow people to interact and coordinate with each other. I know a number of these apps have a social component. I use MyFitnessPal on my iPhone, and it has a social component (not that I have taken advantage of it; most of my social interaction around weight and health happens through the boot camp I belong to).
Xelflex motion sensors as part of clothing, using fiber-optic thread
Our research shows that the device currently most used for collaboration is the laptop (70%); we also looked at smartphones (12%) and tablets (12%), but not wearables. However, if wearables are such a big trend and will end up being our interface to the Internet of Things (IoT), how will we collaborate with each other around all of this data and content? I am still waiting to hear about this trend from those who are experts on wearables. Michael Sampson, an analyst in New Zealand whom I respect a great deal, also wrote an article on this topic, but focused on the new Apple Watch as more of a collaboration tool: you can not only send and receive phone calls, but also control a PowerPoint presentation, view and share stored photos (OneDrive), deal with tweets, work with OneNote or Evernote, and get on-screen texts, calendar entries, and meeting reminders.
Michael makes a good point of looking at wearables as a point of presence or availability: by sharing location data and working through Microsoft Lync (now Skype for Business), a wearable can let you know if a colleague is nearby or has checked out a document you posted, share files (OneDrive, Box, Dropbox, etc.), let you vote on ideas (in IBM Connections), or show meeting room availability and directions. However, I can already do all of these things on my iPhone (which is why I probably keep wearing my old analog/digital dive watch, in case I end up in water… which would not be good for the iPhone). Yes, the 6+ is big and clunky, but it has become an invaluable interface to almost everything.
From the Churchill Club panelists: “wearables is the most difficult to design, you have to put more technology in a smaller space, and make it cool and also personal (or people will not buy or use it).” I agree with them about the hardware and software challenges as well as the UI issues with wearables. However, more and more clothes now have sensors in them, and since most of us wear clothes (as a habit), that is an easy behavior to take advantage of, rather than requiring a new habit of strapping something onto your wrist, head, neck, etc. So I see today’s wearable devices as very early prototypes for what will become the common wearable: smart clothing. So then the question is, can my shirt talk to your underwear… or vice versa?
Chemical-sensing electrodes in the elastic waistband of underwear