A recent report from the New York Times suggests that Google has teamed up with the DeepMind research lab and the Brain AI team to develop an AI-powered life coach that will be able to respond to questions with a degree of emotional intelligence.
Say goodbye to those flashy life coaches and their Instagram advertisements: Google is building a brand-new life coach for you. This one may not be as flashy, but it promises some genuinely useful life advice.
Google’s large language model ‘Bard’ will soon gain more functionality, as the company plans to put the AI to work in 21 different products and tasks, one of which is an AI-powered life coach that offers people helpful life advice.
In its report on the topic, the New York Times says that Google has teamed up with the DeepMind research lab and the Brain AI team to develop this AI-powered life coach to the point where it can respond even to critical questions with a degree of emotional intelligence, making its advice sound more human-like.
Both Google’s Bard and OpenAI’s ChatGPT can produce human-like responses, but both share a common problem: ‘AI hallucinations’, in which the models give wrong answers or state facts that are simply not true.
Other Bard AI use cases currently being explored by Google include a tool capable of producing scientific and creative content, and even a feature that will help journalists write eye-catching and effective headlines.
The tools currently being explored have many potential use cases and may be well received upon release; however, they will inevitably affect workers as generative AI continues its push towards automation.
Google, which is said to be behind Microsoft in the generative AI race, is clearly catching up, working on a range of real-world AI applications and trying to release them as early as possible. Microsoft, however, is not slowing down either: it is bullish on integrating AI across all its products and is even planning to add an image generator to MS Paint!