As an app developer, you can retrieve sensor data from SmartBand 2 using standard sets of APIs for Android and iOS platforms. Depending on the platform you are developing for, you can use Sony’s Lifelog API, Google Fit APIs or Apple’s HealthKit. You can then analyse and process the sensor data to provide useful insights and visualizations that can help your users live a healthier lifestyle.
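Once you have sensor data back from one of these APIs, the processing step is usually plain aggregation over a JSON response. The sketch below is a minimal, hypothetical example of that step: the payload shape, field names (`result`, `type`, `value`) and the `total_activity` helper are illustrative assumptions, not the actual Lifelog, Google Fit or HealthKit schema.

```python
import json

# Illustrative payload loosely modelled on an activity-tracker API response.
# The field names here are assumptions, not the exact Lifelog API schema.
SAMPLE_RESPONSE = """
{
  "result": [
    {"type": "physical", "subtype": "walk", "value": 1200},
    {"type": "physical", "subtype": "run",  "value": 850},
    {"type": "sleep",    "subtype": "deep", "value": 340}
  ]
}
"""

def total_activity(payload: str, activity_type: str = "physical") -> int:
    """Sum the 'value' field of all activities of the given type."""
    data = json.loads(payload)
    return sum(item["value"]
               for item in data["result"]
               if item["type"] == activity_type)

if __name__ == "__main__":
    print(total_activity(SAMPLE_RESPONSE))  # steps logged as physical activity
```

From a summary like this you could drive a daily-progress chart or a goal notification in your app.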
Sony and VerbaVoice are collaborating on a project to provide live audio transcriptions and translations on SmartEyeglass. VerbaVoice, a company providing accessibility services through technology, has created an app, LiveCap, which converts speech into text and also provides language translation. These captions are superimposed on the SmartEyeglass display. If you wear SmartEyeglass at a speaker event, you could receive the speech as text on your SmartEyeglass, or even have it translated into a language you understand.
Today we have released a brand new Theme Portal web page where users can find a wide range of themes for their Xperia™ devices. And if you’re a theme designer, you can get your theme added to the site to make it easier to find. Read on to learn about the Theme Portal and how to submit your theme!
Drone Control, the new drone controlling app from Sony, was released as open source today for Xperia™ devices and SmartEyeglass. The app works with Parrot drones, superimposing real-time flight data, such as battery life, on SmartEyeglass for an augmented reality (AR) experience.
Today, we’re introducing the next generation SmartBand 2, which features an advanced heart rate sensor, and provides notifications through vibration and color LEDs. SmartBand 2 is compatible with any device running Android 4.4 (KitKat) onwards and iOS 8.2 onwards. Get the full specs for SmartBand 2 after the jump.
Today the flagship devices Xperia Z3+, Xperia Z4 Tablet, and Xperia Z4 Tablet WiFi have been added to our Open Device project. With this addition, it is now possible for developers to build AOSP versions for the first Xperia devices based on the 64-bit Qualcomm® Snapdragon 810 processor.
Could you benefit from knowing your Lifelog users’ favourite locations and predicting their movement between those locations? In this post, Sony’s Master Engineer, Håkan Jonsson, explains how to apply machine learning to location data retrieved from the Sony Lifelog API. Find out about machine learning, clustering, how to access Sony’s Lifelog API and check out Håkan’s own notebook.
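The clustering idea behind finding favourite locations can be sketched in a few lines of plain k-means over (latitude, longitude) pairs. This is a minimal illustration only: the coordinates, cluster count and `kmeans` helper below are invented for the example and are not Håkan’s actual data or notebook code.

```python
from math import dist  # Euclidean distance, Python 3.8+

def kmeans(points, centroids, iterations=10):
    """Repeatedly assign points to the nearest centroid, then recompute centroids."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: dist(p, centroids[i]))
            clusters[nearest].append(p)
        # New centroid = mean of the assigned points (keep old one if empty).
        centroids = [
            tuple(sum(c) / len(c) for c in zip(*members)) if members else centroid
            for members, centroid in zip(clusters, centroids)
        ]
    return centroids, clusters

# Toy visit history: one group of points around "home", one around "work".
visits = [(59.33, 18.06), (59.34, 18.07), (59.33, 18.05),
          (55.60, 13.00), (55.61, 13.02), (55.59, 13.01)]
centroids, clusters = kmeans(visits, centroids=[visits[0], visits[3]])
```

Each resulting centroid is a candidate “favourite location”; transitions between cluster assignments over time are what you would feed into a movement-prediction model.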
The Developer World team has selected the next hero open source developer, the contributor who has made the most accepted commits to our projects on the SonyXperiaDev GitHub. During the months of May and June, Brazilian developer Humberto Borba contributed the most to our projects. Read on for more details about Humberto, and learn how you could become the next hero open source developer.
If This Then That, or IFTTT, is a web-based service that offers users the ability to automate digital activities by connecting services and apps. Working together with Sony, IFTTT have integrated the Lifelog API, enabling the use of data capture and API endpoints from Sony’s activity tracker, giving users even greater creative control.
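The “if this, then that” model is simply a trigger paired with an action. The toy sketch below shows that pattern applied to activity data; the rule, threshold and field names are invented examples, not real IFTTT recipes or Lifelog endpoints.

```python
# Minimal sketch of the IFTTT trigger/action model applied to activity data.
def make_rule(trigger, action):
    """Return a callable that runs `action` whenever `trigger` matches an event."""
    def rule(event):
        if trigger(event):
            return action(event)
        return None
    return rule

# "If my daily step count passes 10,000, then log a congratulation."
rule = make_rule(
    trigger=lambda e: e.get("steps", 0) >= 10_000,
    action=lambda e: f"Goal reached: {e['steps']} steps!",
)

print(rule({"steps": 12_345}))  # trigger fires, action runs
print(rule({"steps": 4_000}))   # no match, rule returns None
```

In the real service, the trigger would be a Lifelog data event and the action any of IFTTT’s connected services.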
Update – In response to queries regarding the technicalities of the post, we’d like to clarify what we’ve released here. We provided build instructions based on the AOSP project under the branch android-m-preview, showing the Android M platform changes so far. That build gives custom ROM developers an early preview of Android M to play with. Be aware that at this point the API levels are still at Lollipop MR1; what we show is the Android M Developer Preview purely from a platform perspective.
Note that the Android M Developer Preview for Nexus devices from Google supports the latest API level. Xperia devices flashed using the build instructions linked from this post do not support those APIs.