Did you know that SmartEyeglass can display advanced 3D graphics, overlaid on the user’s field of vision? In this tutorial Sony Software Engineer Ahmet Yildirim describes how this can be done using Open Graphics Library (OpenGL). Ahmet has also created a 3D model viewer sample app that we are making available as open source.
With the Wikitude Android SDK and Sony’s Camera Add-on API, you can easily create stunning augmented reality experiences for the Xperia™ Z1 and Xperia™ Z1 Compact. It’s entirely up to you whether you want to augment your magazine, implement a game, or display geo-content in the user’s vicinity, as Wikitude Places does. Learn how to start using augmented reality in your own apps, after the jump!
Although our Xperia™ devices have become much more powerful over the last couple of years, there might still be cases when your application needs more processing power. With OpenCL™, you can use the power of the GPU to handle resource-intensive tasks in your app. This article is a short introduction to OpenCL, and how to get it up and running on your Sony Xperia device.
With an increasing number of Android™ devices coming out with on-screen buttons, such as the Sony Xperia Z1 and Xperia Z Ultra, it’s important that your app can offer a more immersive viewing experience by hiding non-essential UI elements. It’s really not that complicated – we’ll show you how you can provide your app users with maximum screen real estate on devices running Android 4.3 or earlier versions.
We recently published some developer tutorials on how to add SmartWatch 2 (SW2) support to extensions for the first SmartWatch (MN2) and how to create a SmartWatch 2 app extension. Now, we’d like to bring you additional information on best practices for your SmartWatch 2 app development, covering topics ranging from how much data your app extension sends, to recommended resolutions and display considerations. Keeping these five important tips in mind during development can help you minimise potential bugs, make your app extension more compatible with the new SmartWatch 2 features, and ensure that your SmartWatch 2 app works smoothly.
With a SmartWatch 2 extension, your users can utilise the features of your app without even taking their smartphone or tablet out of their bag or pocket. At the same time, you can get a lot of extra visibility for your app, since it will be exposed to all users of the SmartWatch 2. So are you ready to develop a new app extension, or extend your existing app to support the Sony SmartWatch 2? We’ve put together a quick tutorial that shows you, from start to finish, which programs you need, which files to use, which tools are available, and the steps involved in creating your own app extension for Sony SmartWatch 2. Read more after the jump. (Updated 12 Nov, 2013)
If you’re both a developer of camera apps and a fan of the recently announced, camera-centric Sony Xperia Z1, check out this Camera Add-on API tutorial, which will show you how to integrate your camera app with the Xperia™ Z1 using the Camera Add-on API. Read on to discover how to easily take advantage of the Camera Add-on API, making it easier for your users to quickly launch and utilise your camera app.
Now that the Sony SmartWatch 2 is available, you might be wondering how to make your existing SmartWatch app compatible with the new SmartWatch 2. App extensions developed for the first SmartWatch that use the Notification API or Control API included in the Smart Extension APIs need some minor updates in order to be optimised for SmartWatch 2. Read on for the important details on making your app extension usable with both SmartWatches.
On recent Xperia™ devices, you may have noticed our new and eye-catching Lockscreen, which transforms into horizontal blinds when your finger touches the screen. Now we’ll show you how you can add a similar graphical effect to your own app using a custom Android™ ViewGroup. The BlindsView tutorial will provide you with powerful tools for creating some very eye-catching graphical effects and transitions. Read on for the full BlindsView tutorial by Johan Henricson, software project manager at Sony.
As our smartphones become more powerful, we can do more advanced things that previously required a high-end PC. One way to make use of the robust processing power of your smartphone is Computer Vision – the ability of a device to acquire, process, analyse and understand images in the same way they are perceived by human eyes. Basically, we can use the powerful CPU in modern smartphones to interpret the images captured through the camera. Examples of use cases are face detection and recognition, or simple post-processing of photographs. The best approach to using Computer Vision on Android is through a library called OpenCV. Read on as Erik Hellman, research engineer at Sony, explains more.