Are you active on Stack Overflow? This Q&A site for professional and enthusiast programmers is where you’ll find Support Engineer Robert McCain and the rest of our Support Engineering team here at Sony, actively monitoring the site for new questions. Stack Overflow hosts a wealth of technical questions and answers regarding Sony’s APIs, SDKs and related resources. And if you can’t find the answer you need, posting and tagging your question there is the best way to get it answered.
SmartWatch 2 is one of Sony’s most popular devices among developers, as it enables a lot of innovative use cases related to remote control, notifications and much more. If you haven’t developed your own SmartWatch 2 app yet, or if you’re looking to tweak your existing SmartWatch 2 app, we’ve gathered our top 5 tips for developers in this post.
Did you know that SmartEyeglass can be used as a Bluetooth headset for the host smartphone or tablet? In our new SmartEyeglass audio I/O guide, published today, you can learn how your app can send and receive audio data with the SmartEyeglass built-in microphone and speaker. This is actually done by using the Android Bluetooth Hands-Free Profile API (Android BT HFP API), much as you would for any standard Bluetooth headset.
Sony’s SmartWatch 3 is now on sale in selected markets, with other markets to follow during the coming weeks. You can start developing apps right away for SmartWatch 3, using the Android SDK together with the extended set of APIs from the Android Support Library and Google Play Services. SmartWatch 3 apps can display notifications, handle voice actions, access innovative sensor technology, and much more.
You can now find instructions on how to use device configurations from Sony to build AOSP KitKat and flash it on an unlocked Xperia device, in a new guide we’ve created. This guide provides step-by-step instructions that take you from preparing your environment, through the tools you should download and install, to configuring the code. Finally, we’ll explain how to build an AOSP image and flash it to your device. Please note that you should be familiar with Android development to use the instructions, and that the software created is not intended for daily usage and comes with important limitations. Head over to the How to build AOSP KitKat and flash it on an unlocked Xperia device guide to get started!
Recently, I had the opportunity to talk at XDA DevCon in Manchester about Sony Mobile’s approach to AOSP. We’ve seen a lot of engagement on the subject – lots of comments, lots of questions – so we wanted to share more details and clarify a few points about the work we do to provide binaries and source code to community developers. The binaries and source code are then used as a base when community developers are compiling their own custom ROMs.
Did you know that you can develop apps for many of Sony’s TVs and home entertainment systems, such as Blu-ray players, home theater systems and media players? Many of Sony’s latest devices come with HTML5-capable browsers, and from 2015, many TVs will support Android TV and be Google Cast-ready. At developer.sony.com, you can get an overview of all the developer opportunities available for Sony TV and home entertainment platforms.
Did you know that Sony has a number of different types of open source projects available on the SonyXperiaDev GitHub? These projects range from our AOSP (Android Open Source Project) support for Xperia devices, to open-sourced developer tools such as ChkBugReport and research projects such as EvolutionUI. Read more about some of these projects in detail after the jump.
The Sony Developer Program is excited to be part of this year’s xda:devcon in Manchester, UK, from the 26th to 28th of September. xda:devcon, run by XDA Developers, is all about taking this collaborative forum for mobile enthusiasts, hackers and developers from a virtual setting to a live experience.
This is the second article in our touchscreen technology series. In our previous touch article, we explained the components of a touchscreen system and how these parts work together to translate a touch input into graphical user feedback. In this article, we’ll continue with the topic of touch responsiveness, and explain the input lag experienced when using touch. Read on for more details.