The Xperia™ XZ Premium has now been added to Sony’s Open Devices program. This means you can now build and flash your own version of Android N on Xperia XZ Premium devices. This software is intended for developers and, at this early stage, may be unstable.
We’ve just released new build guides on how to build the mainline Linux kernel for Xperia devices and how to build a minimal version of Linux for Xperia devices. With these assets, you can experiment with IoT prototyping, or join the development of support for Xperia devices in the Linux kernel. Learn about this, and find out how the Open Devices program got its start after the jump.
We’re excited to let you know about our upcoming public SDK, allowing the creation of AR effects for use with Sony’s own AR effect app on Xperia devices. The SDK should be available by the end of this month.
Developer World recently caught up with Martin Fodor, a founding member of AndroPixel Studios (APS), to discuss the beginnings of the company, what success means to them, and their methods for creation and promotion of Xperia themes. Check out the interview below.
The Opera Company of San Sebastian, Opus Lirica, has just announced that it will be the first to use SmartEyeglass technology with Opera Touch SL to create a more immersive opera experience for its audiences. The service connects SmartEyeglass to a server, creating a new cultural space in which the Opera Touch system delivers subtitles, translations, artist information, details of the current action, and even live scoring directly into the wearer’s line of sight.
Sony and VerbaVoice are collaborating on a project to provide live audio transcriptions and translations on SmartEyeglass. VerbaVoice, a company providing accessibility services through technology, has created an app, LiveCap, which converts speech into text and also provides language translations. These captions are superimposed on the SmartEyeglass display. If you wear SmartEyeglass at a speaker event, you could receive the speech as text on the display, or even have it translated into a language you understand.
Could you benefit from knowing your Lifelog users’ favourite locations and predicting their movement between those locations? In this post, Sony’s Master Engineer, Håkan Jonsson, explains how to apply machine learning to location data retrieved from the Sony Lifelog API. Learn about machine learning and clustering, find out how to access Sony’s Lifelog API, and check out Håkan’s own notebook.
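To give a flavour of the clustering idea Håkan describes, here is a minimal, self-contained sketch. It is not his method or the Lifelog API itself: the coordinates are made up, and the greedy distance-threshold clustering is a deliberately simple stand-in for the algorithms discussed in the post. It groups GPS fixes into “favourite locations” by merging any fix that lands within a fixed radius of an existing cluster centroid.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def cluster_locations(points, radius_km=0.2):
    """Greedy clustering: assign each fix to the first cluster whose
    centroid lies within radius_km, otherwise start a new cluster.
    Returns (labels, centroids)."""
    clusters = []  # each entry: [lat_sum, lon_sum, count]
    labels = []
    for p in points:
        for i, c in enumerate(clusters):
            centroid = (c[0] / c[2], c[1] / c[2])
            if haversine_km(p, centroid) <= radius_km:
                c[0] += p[0]; c[1] += p[1]; c[2] += 1
                labels.append(i)
                break
        else:
            clusters.append([p[0], p[1], 1])
            labels.append(len(clusters) - 1)
    centroids = [(c[0] / c[2], c[1] / c[2]) for c in clusters]
    return labels, centroids

# Synthetic fixes: three near one spot ("home"), two near another ("work")
fixes = [
    (59.3293, 18.0686), (59.3294, 18.0687), (59.3292, 18.0685),
    (59.3500, 18.1000), (59.3501, 18.1001),
]
labels, centroids = cluster_locations(fixes)
print(labels)          # [0, 0, 0, 1, 1]
print(len(centroids))  # 2
```

In practice you would feed in real location records fetched from the Lifelog API, and a density-based algorithm such as DBSCAN handles noise and irregular cluster shapes far better than this greedy pass; predicting movement between the resulting clusters is then a separate modelling step.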
If This Then That, or IFTTT, is a web-based service that lets users automate digital activities by connecting services and apps. Working together with Sony, IFTTT has integrated the Lifelog API, enabling access to the data captured by Sony’s activity tracker through its API endpoints and giving users even greater creative control.