Hey Google, Is That You On The Phone? Also: Cortana buddies up with Alexa

Michael Heiss ⋅ May 9, 2018

Much like the flowers and the unpredictable weather, settling into spring also marks the onset of developers' conferences in the tech industry. While these events, staged annually by Facebook, Microsoft, Apple and Google, rarely include hardware news or product introductions, observers are afforded insight into what consumers will be able to do with smartphones, smart speakers, streaming devices and other IoT technology.

For example, news from Microsoft Build revealed that the company's forthcoming Your Phone app will allow both iOS and Android devices to sync content with Windows 10. Perhaps more importantly, Amazon Alexa and Cortana are about to enhance their interoperability. While an availability date was not announced, users will eventually be able to say, "Cortana, open Alexa," and then command an Alexa device. Working in the other direction, the wake words will be "Alexa, open Cortana."

[Alexa, Google Assistant On A Collision Course]

Google kicked off its own I/O conference this week at the Shoreline Amphitheatre in Mountain View, Calif. CEO Sundar Pichai expounded on Google's core mission, noting that he sees the industry at "an important inflection point in computing." The triad running through Pichai's presentations centered on making "information more accessible, more useful and beneficial to society." Woven throughout was the power that machine learning and AI bring to the table, particularly with regard to health and life balance. (One example was a retinal scan used to look inside an eye and compare a condition to a known set of variables, with the ability to predict such things as a cardiac incident up to 48 hours early so that treatment may be expedited.)
A more everyday application of AI, to be embodied in app and OS updates, is the ability not only to recognize and "tag" people in images, but also to ask the user if he or she wants to email or text the photo to that person.

[Amazon-Google Feud Reignites]

As Google, Apple and Amazon (and, to a lesser degree, Cortana and Samsung's Bixby) battle for supremacy, the Google Assistant received a great deal of attention. Google stated that the Assistant is installed on over 500 million devices and controls over 5,000 products, the latter figure particularly aimed at establishing a universe of devices to rival Alexa's.

Key among the new features is an expansion of Continued Conversation, so that the wake word need not be constantly repeated during an extended conversation, along with the ability to perform multiple actions in a single command. A Pretty Please function, aimed at improving children's behavior in device use, will make it necessary to say "please" somewhere in the "Hey Google" command.

One of the few hardware-related announcements was the formal introduction of the Smart Display, designed to add a compact display screen to the Assistant-equipped smart speaker. Lenovo's version was previewed at CES, and when these units ship starting in July, models from JBL and LG will also be on the docket. Of course, much of this, whether embodied in an app, a smart speaker or an Android TV device, depends on the Google Assistant.

Additional features will include interactivity with Google Maps, giving the user live visual guidance alongside graphical maps. The Google Duplex feature will take advantage of deep learning and voice recognition to place phone calls in response to a command. For example, a user can ask a device to secure a restaurant reservation, and the device not only makes the call but carries on a natural-language conversation with the person on the other end.
It can respond to available dates and times, choose a suitable time, thank the person on the phone, add the event to the user's Google Calendar, and send a text or email confirmation. Similarly, improvements to Google Photos and Google Lens will also be OS-independent, enabling advanced recognition and related actions.

Unsurprisingly, much attention at I/O was paid to the features of the next Android OS, code-named "Android P." To continue the growth of Android-based phones, the new OS will be centered on "intelligence, simplicity and well-being." AI will be used to predict and measure app usage to extend battery life, while new Do Not Disturb and Shush modes will silence the phone when appropriate. A Wind Down feature will shift the screen from color to black and white to ease the user into bedtime.

Finally, Android P will also improve the capabilities of Android TV. Google reported 2x year-over-year growth for Android TV, with over 100 device partners encompassing not only smart TVs but set-top boxes and carrier/cable devices as well. Android TV will get a major refresh with easier setup, auto-install, carry-over of sign-in information for apps already on an Android phone, and password autofill.

Among the first of the new Android TV devices will be the JBL Link Bar, a full-featured soundbar to be delivered later this year. In addition to the full Android TV and Google Assistant feature set, it will include three HDMI inputs, a remote that complements the ability to use voice commands, Bluetooth connectivity and a subwoofer.

While Chromecast was not addressed during the keynote, a subsequent developers' session did address some recent speculation about its future. Recent sightings of a Federal Communications Commission application by Google for a device similar in size and form factor to the current Chromecast models had raised expectations of a potential Chromecast replacement.
Squashing that notion, the device in question was revealed to be the ADT-2. Rather than a consumer product, it is an Android TV dongle available only to developers. Presumably that means we will see more Android TV options, whether built into TVs or soundbars, or perhaps as updated versions of Android TV devices such as the Nvidia Shield, Xiaomi's Mi Box or Sling's AirTV.

On a lighter note, Google addressed two high-profile Android bugs. Pichai made it a point to note that the cheeseburger emoji now has the cheese correctly placed above the burger, rather than below. ("After all, I'm a vegetarian," he admitted.) Likewise, the beer stein emoji now shows the foam inside the mug, sitting on top of the liquid rather than floating in air above it. Whew.