The 2021 edition of The Game Awards, the annual Oscars-style award show for video games, will take place on December 9. According to The Verge, the show will be held at the Microsoft Theater in Los Angeles, and unlike last year there will be an in-person audience, though attendance will be invite-only. Details about specific health and safety protocols for the in-person event will be shared “in the coming weeks,” according to a press release.

The awards, hosted by creator and executive producer Geoff Keighley, will also be live-streamed on more than 40 platforms, so people can watch from home if they prefer. The organisers have also promised “free playable game content” as part of an “immersive digital experience.” As in previous years, The Game Awards will feature “first-look world premieres and new game announcements.” There was some big news in 2020, including the announcements of Master Chief in ‘Fortnite’ and Sephiroth in ‘Super Smash Bros. Ultimate’, and Microsoft brought a huge surprise to the 2019 show with its reveal of the Xbox Series X. As per The Verge, there are no details yet about what announcements to expect this year, but if past years are any indication, there could be some fun surprises in store.
Google Lens takes visual search to a new level
With a new, more advanced AI system, Google Lens can now understand and answer questions about an image. According to Mashable, Google has unveiled what seems like a genuinely useful and almost scarily advanced new way to search with images.

Google Lens already lets you search based on an image. For example, if you take a picture of an elephant, you’ll probably get Google Lens search results for “elephant”. But now you can tap a picture you’ve taken, or one saved in your library, and ask a question about it. Take the elephant: tap the photo, select “Add Questions”, and a text box will pop up where you can ask Google for more information about that specific image, such as “What kind of elephant is this?” or “How many of these elephants are left in the world?”

That involves so many layers of AI processing it’s hard to comprehend. The system has to understand what’s in the picture, understand your question, and understand how the question relates to the picture. And, of course, it (ostensibly) gives you the answers you’re looking for. Making it all possible is a new, more advanced AI system called Multitask Unified Model (MUM), announced in May, which is beginning to power Search.