The Sony Developer Program is excited to be part of this year’s xda:devcon in Manchester, UK, from the 26th to the 28th of September. Run by XDA Developers, xda:devcon takes this collaborative mobile forum for enthusiasts, hackers and developers from a virtual setting to a live experience.
This is the second article in our touchscreen technology series. In our previous touch article, we explained the components of a touchscreen system and how these parts work to translate a touch input to graphical user feedback. In this article, we’ll continue with the topic of touch responsiveness, and explain the input lag that is experienced when using touch. Read on for more details.
With the recent update to the SmartWatch 2 software, users are able to customise the watch face of SmartWatch 2. Better yet, the latest version of the Sony Add-on SDK gives you, as a developer, access to the Widget API, which allows you to develop clocks and widgets for the SmartWatch 2 watch face. To further inspire your wearables development, check out our interview with Alexander “Azya” Zakharyan who has increased downloads on Google Play by specialising in watch-related SmartWatch 2 apps.
This past weekend at Maker Faire Bay Area in San Mateo, California, Sony unveiled “MESH”, an exciting new development concept. MESH (derived from the concept of Make, Experience, SHare) is a platform of hardware blocks that connect to each other through wireless technology such as Bluetooth Smart. Each block contains software that can be programmed via a simple Graphical User Interface (GUI) to define a function. MESH makes it easy, fun and convenient for anyone without engineering or coding skills to build their own inventions.
Starting today, we’re introducing a series of articles that explore the inner workings and evolution of the touchscreen system on smartphones. Now that touch has become a fundamental part of smartphone design, one rarely thinks about the ability to simply touch a screen and interact directly with what is displayed. To start off this series, Sony engineers Magnus Johansson and Alexander Hunt explain how smartphone touchscreen systems work on a general level. Read more after the jump.
Game developers, did you know that Sony Computer Entertainment America LLC (SCEA) recently released its powerful Authoring Tools Framework (ATF) for free? ATF has long been used by most Sony first-party game studios to build custom game development tools, and is now available as open source under the Apache 2.0 license.
Would you like to know more about Sony’s unique software features on Xperia Z2 running Android 4.4.2? Then you should check out our two latest videos, where you’ll discover Xperia Themes, the quick access camera, and a true full-screen video viewing experience. Read on for even more features, and don’t forget to watch the videos.
Did you know that you can control a Sony camera wirelessly from an app on another device? Sony’s Camera Remote API beta SDK includes all the tools that you need to create innovative remote control apps for many Sony cameras, including Sony action cameras, interchangeable lens cameras and lens style cameras. Why not create a remote viewfinder app to get professional-looking group shots, even when you’re not behind the camera yourself?
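Under the hood, the Camera Remote API is driven by JSON-RPC calls sent over HTTP to the camera. As a minimal sketch of what a remote control app might do (the endpoint URL below is a placeholder, since the real service URL is discovered per camera via SSDP; consult the beta SDK documentation for the supported methods on your model):

```python
import json
import urllib.request

# Placeholder: the actual endpoint is discovered via SSDP and
# varies per camera model (see the Camera Remote API beta SDK docs).
CAMERA_URL = "http://10.0.0.1:10000/sony/camera"

def build_request(method, params=None, request_id=1):
    """Build a JSON-RPC request body in the format the camera expects."""
    return {
        "method": method,
        "params": params or [],
        "id": request_id,
        "version": "1.0",
    }

def call_camera(method, params=None):
    """POST a JSON-RPC call to the camera and return the parsed response."""
    body = json.dumps(build_request(method, params)).encode("utf-8")
    req = urllib.request.Request(
        CAMERA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With a connected camera, something like `call_camera("actTakePicture")` would trigger the shutter on models that expose that method; method availability differs between action cameras, interchangeable lens cameras and lens style cameras, which is why the SDK provides a way to query the available API list first.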
Will you be in or near southern Sweden in late April? Why not come and develop games at the Sony office during the Ludum Dare weekend? Since 2002, Ludum Dare (Latin for “to give a game”) has been a game development competition where participants create games from scratch in a single weekend, either online or at a hosted location. This time, you can meet and collaborate with other developers by joining the on-site hackathon at the Sony office in Lund, Sweden. Follow the instructions in the post to sign up.
Several of the latest Xperia devices, including Xperia Z2, feature a self-contained sensor co-processor that can continuously collect sensor data in the background without waking the main application processor. Compared to a system without one, this enables up to ten times lower power consumption for always-on sensor features. Learn how to use this in your app!