The simple reason for this argument is that the display makes such devices much more useful. Sure, you could have Alexa or Google Assistant tell you there’s a Starbucks 1.5 miles away from you. But wouldn’t it be nice to actually see where it is on a map? Or if you wanted to know the time, you could just, you know, look at the screen. Or if you wanted to know who’s singing the current track without interrupting it, you could do the same. That extra visual layer is really useful, especially for quick, glanceable information.
Of course, you could’ve made this same argument months ago when the Echo Show debuted. But these new Google Assistant displays are so much better in almost every way. For example, when you make a search query, the display won’t just spit out a short generic answer with a transcript on-screen; the information actually appears in a format that makes sense. So if you search for “cornbread recipe,” the display will offer an array of recipes to choose from. Tap on one and you’ll be presented with a lovely step-by-step recipe guide, all without having to install any additional skill or action.
Or if you ask a Google Assistant smart display to play relaxing music, it won’t pick out a random playlist and start playing a song you don’t want (something that happens quite frequently with the Echo). Instead, it’ll offer a visual selection of playlists, which you can then scroll through to pick the one you want. Perhaps my favorite feature comes when you ask for directions. It will not only show you the map on the screen but also send those same…