Today we released the Camera Remote API beta SDK v2.10, adding support for Sony’s current flagship interchangeable-lens digital camera, the A7R II. There is also support for the latest Cyber-shot RX series: the compact RX100 IV and the high-performance, high-zoom RX10 II, as well as the high-zoom compact cameras HX90 and WX500.
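The Camera Remote API is documented as JSON-RPC over HTTP, so a remote-control app boils down to POSTing small JSON payloads to the camera. The sketch below builds and sends such a request; the endpoint path and the `actTakePicture` method name follow the published API style, but the camera IP address is a placeholder (in practice the camera is discovered via SSDP), so treat the details as illustrative.

```python
import json
import urllib.request


def build_request(method, params=None, request_id=1):
    """Build a Camera Remote API style JSON-RPC payload as a string."""
    return json.dumps({
        "method": method,
        "params": params or [],
        "id": request_id,
        "version": "1.0",
    })


def call_camera(endpoint, method, params=None):
    """POST one JSON-RPC call to the camera and return the decoded reply."""
    payload = build_request(method, params).encode("utf-8")
    req = urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Example (placeholder IP -- the real endpoint comes from device discovery):
# call_camera("http://192.168.122.1:8080/sony/camera", "actTakePicture")
```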
Have you created a theme using the new Theme Creator tool? Before publishing your theme, it’s important to verify that it works on as many devices as possible. With the Remote Device Lab (beta), a free web service, you can test your theme on real Xperia devices and verify it across a variety of screen sizes, densities, and resolutions.
In October last year we launched Sony’s Remote Device Lab (beta), a free web service that allows you to test and verify your app on real Xperia devices. After each session the devices are factory reset, which means that you can safely try out your app on a variety of devices with different specifications. If you haven’t done so already, head over to Remote Device Lab (beta) to try it out. Read on to learn more about this service.
Today we’re updating the Lifelog API with two new guides, which can help you turn data into valuable insight for your users. With these new guides, you can now learn how to calculate daily totals from the values returned from the API, and how to turn energy expenditure data into calories burned.
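To illustrate the kind of arithmetic the guides cover, here is a hedged sketch: it groups activity records by calendar day to produce daily step totals, and converts an energy value from kilojoules to dietary calories (1 kcal = 4.184 kJ). The field names `startTime` and `steps` are illustrative placeholders, not the exact Lifelog API schema — see the guides for the real response format.

```python
from collections import defaultdict
from datetime import datetime


def daily_step_totals(records):
    """Sum step counts per calendar day.

    `records` is a list of dicts with illustrative fields:
    'startTime' (ISO 8601 string) and 'steps' (int).
    """
    totals = defaultdict(int)
    for rec in records:
        day = datetime.fromisoformat(rec["startTime"]).date().isoformat()
        totals[day] += rec["steps"]
    return dict(totals)


def kilojoules_to_kcal(kj):
    """Convert energy expenditure from kilojoules to kilocalories."""
    return kj / 4.184


records = [
    {"startTime": "2015-02-10T08:00:00", "steps": 1200},
    {"startTime": "2015-02-10T17:30:00", "steps": 3400},
    {"startTime": "2015-02-11T09:15:00", "steps": 2500},
]
print(daily_step_totals(records))
# {'2015-02-10': 4600, '2015-02-11': 2500}
```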
Today we are announcing the commercialisation of SmartEyeglass Developer Edition SED-E1, a transparent lens eyewear that connects with compatible smartphones to superimpose information onto the user’s field of view. SmartEyeglass Developer Edition SED-E1 will be available for sale in ten countries as of March 2015, with pre-orders starting today in the UK and Germany. As of today, we are also making the official version of the SmartEyeglass SDK available, enabling developers to create unique hands-free use cases.
With true augmented reality available in SmartEyeglass, this wearable is well suited to professional markets, as well as niche consumer segments, where it can help users complete specific tasks. At CES in Las Vegas a couple of weeks ago, some examples of this were shown in a collaborative demo by APX Labs and Sony. Check out the video above to see how SmartEyeglass could be used in business verticals to support specific work duties.
Today we are happy to release the Lifelog API, which gives you as a developer secure access to the lifestyle data of your users. This way, you can provide your users with insight and inspire smarter choices based on their physical activities, app usage and location. Of course, the user has full control and must approve sharing their data with your app.
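Because access is user-approved, the first step for an app is sending the user to an authorization page where they grant (or deny) consent, in the usual OAuth 2.0 authorization-code style. The sketch below builds such an authorization URL; the endpoint host and scope name are assumptions for illustration, not the exact Lifelog API values — the real ones come from the API documentation and your registered app credentials.

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- substitute the real value from the API docs.
AUTH_ENDPOINT = "https://platform.lifelog.example.com/oauth/2/authorize"


def build_authorization_url(client_id, callback_url, scopes):
    """Build the OAuth 2.0 authorization-code URL the user is sent to.

    The user approves (or denies) sharing their data on this page;
    the app only receives an access token after approval.
    """
    query = urlencode({
        "client_id": client_id,
        "redirect_uri": callback_url,
        "response_type": "code",
        "scope": " ".join(scopes),
    })
    return AUTH_ENDPOINT + "?" + query
```

After the user approves, the authorization code delivered to `callback_url` is exchanged for an access token, which then accompanies each data request.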
Today we have released a new version of the SmartEyeglass SDK (Developer Preview) that includes a new AR rendering API. That means you can now render text and graphics that stay overlaid on fixed real-world positions seen through SmartEyeglass. If the user turns to look in another direction, the text and graphics will stay overlaid on that fixed position as long as the object is still in view.
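The idea behind keeping graphics pinned to a real-world position is simple geometry: as the head rotates, the overlay shifts across the display by the opposite amount, and it is only drawn while the world bearing still falls inside the display’s field of view. A minimal sketch of that check is below — the SDK performs this for you, and the field-of-view figure here is an arbitrary assumption, not the SmartEyeglass specification.

```python
def overlay_offset_deg(object_bearing, head_bearing, fov_deg=20.0):
    """Horizontal offset (degrees from screen centre) at which to draw an
    overlay pinned to `object_bearing`, or None when the object is
    outside the display's field of view.

    Bearings are compass degrees; `fov_deg` is an illustrative value.
    """
    # Signed angular difference, normalised to the range (-180, 180].
    diff = (object_bearing - head_bearing + 180.0) % 360.0 - 180.0
    if abs(diff) > fov_deg / 2.0:
        return None  # object not in view: hide the overlay
    return diff
```

For example, with the user facing north (0°), an object at bearing 10° is drawn 10° right of centre, one at 350° is drawn 10° left of centre, and one at 90° is hidden until the user turns toward it.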