With Windows Machine Learning, developers will be able to use pre-trained machine learning models to run real-time analysis of data at lower cost.
Project Kinect for Azure combines the depth sensor with Azure AI services to help developers build more precise devices that use less power. Microsoft's Azure IoT Edge service, meanwhile, uses AI, Azure services, and custom logic to deliver cloud intelligence locally on cross-platform IoT devices.
Microsoft says it has used some of the advances made through HoloLens (which, in turn, was informed by the original Kinect) to enhance Project Kinect's spatial understanding, object recognition, skeletal tracking, and more. The edge devices, in this case, are on-premises servers that can act as Azure IoT Edge devices.
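To give a sense of how Azure IoT Edge combines cloud services with custom logic on a device, here is a minimal sketch of an IoT Edge deployment manifest. The module name (`customLogic`), container image, and registry are illustrative assumptions, not from any announced product; the overall shape follows the manifest format Microsoft documents for IoT Edge.

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "schemaVersion": "1.0",
        "runtime": {
          "type": "docker",
          "settings": { "minDockerVersion": "v1.25" }
        },
        "systemModules": {
          "edgeAgent": {
            "type": "docker",
            "settings": { "image": "mcr.microsoft.com/azureiotedge-agent:1.0" }
          },
          "edgeHub": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": { "image": "mcr.microsoft.com/azureiotedge-hub:1.0" }
          }
        },
        "modules": {
          "customLogic": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": { "image": "myregistry.azurecr.io/custom-logic:latest" }
          }
        }
      }
    },
    "$edgeHub": {
      "properties.desired": {
        "schemaVersion": "1.0",
        "routes": {
          "toCloud": "FROM /messages/modules/customLogic/outputs/* INTO $upstream"
        }
      }
    }
  }
}
```

The `$edgeAgent` section tells the on-device runtime which containers to run (including the hypothetical custom-logic module), while the `$edgeHub` route forwards that module's output messages up to Azure IoT Hub when connectivity allows.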
Microsoft is also making a limited preview of Project Brainwave available "on the edge," officials said today.
Azure, Microsoft's cloud computing service, has always been tangentially related to the company's burgeoning AI ambitions, but that all changes today. Mashable's Raymond Wong will be sharing live updates before and during the event.

Project Kinect for Azure includes Microsoft's "most advanced" time-of-flight sensor, used for 3D imaging, along with several other new sensors.

In addition, Microsoft said that the Visual Studio App Center has now been integrated with GitHub to allow developers to more easily automate their DevOps processes. While primarily focused on its own platforms and services, Microsoft did announce a few tidbits for those in the Apple ecosystem. Activity from the Edge app on your phone is already logged in the timeline on Windows 10 PCs, but coming soon is the ability to see that same timeline directly on your phone for resuming web browsing activities and more.
Microsoft is enabling any developer to reach and engage new audiences with Sets, an easier way to organize your information and get back to what you were doing.