Google is implementing new artificial intelligence models to monitor Google Search queries, aiming to identify “personal crisis” situations in which a search suggests the user is facing an urgent problem. The goal is to surface more useful results for those users, while hiding content that could do more harm than good.
For example, a person who has been sexually assaulted and searches Google using keywords related to that experience (e.g. rape, beatings) in an attempt to find help may instead be shown links to explicit content depicting violence, compounding the victim’s psychological trauma. The AI system, dubbed MUM (Multitask Unified Model), is meant to detect these situations and adjust Google Search listings accordingly: highlighting links to sites where victims of abuse can find help, while hiding genuinely harmful content.
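Google has not published how this adjustment works internally, but the general idea of re-ranking results once a query is flagged as a crisis search can be illustrated with a minimal sketch. Everything below is hypothetical: the result fields, the boost value, and the filtering rule are invented for the example and do not reflect Google’s implementation.

```python
# Hypothetical sketch: re-ranking search results for a crisis-flagged query.
# Categories and weights are invented for illustration only.

from dataclasses import dataclass


@dataclass
class Result:
    url: str
    relevance: float        # base relevance score from the ranker
    is_help_resource: bool  # e.g. a vetted hotline or support organization
    is_explicit: bool       # flagged by a separate content classifier


def rerank_for_crisis(results: list[Result]) -> list[Result]:
    """Hide explicit content and boost help resources for crisis queries."""
    safe = [r for r in results if not r.is_explicit]
    return sorted(
        safe,
        key=lambda r: r.relevance + (1.0 if r.is_help_resource else 0.0),
        reverse=True,
    )


if __name__ == "__main__":
    results = [
        Result("https://example.org/hotline", 0.55, True, False),
        Result("https://example.com/news-article", 0.70, False, False),
        Result("https://example.net/explicit-page", 0.90, False, True),
    ]
    for r in rerank_for_crisis(results):
        print(r.url)  # the help resource now ranks first
```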
Google says the system, set to roll out in the coming weeks, will be able to detect a wide variety of searches associated with personal crisis situations, such as suicidal thoughts, substance abuse, sexual abuse, and domestic violence. Rather than simply matching keywords, the AI engine is designed to interpret the context of a search, directing users to the information they need as quickly as possible.
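To make the contrast with keyword matching concrete, the sketch below infers the intent behind a query using an off-the-shelf zero-shot classifier from the Hugging Face transformers library. This is only an analogy for what a context-aware model does; it has no connection to MUM itself, and the label set and threshold are invented for the example.

```python
# Illustrative only: intent detection via zero-shot classification, as a
# stand-in for context-aware crisis detection. Not Google's MUM; the
# labels and threshold below are assumptions made for this sketch.

from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",
)

LABELS = ["personal crisis", "medical information", "news", "entertainment"]


def looks_like_crisis(query: str, threshold: float = 0.5) -> bool:
    """Return True if the top inferred intent is 'personal crisis'."""
    out = classifier(query, candidate_labels=LABELS)
    # Results come back sorted by score, highest first.
    return out["labels"][0] == "personal crisis" and out["scores"][0] >= threshold


# A naive keyword filter would likely miss this query (no obvious trigger
# word), while an intent classifier can still pick up the distress signal.
print(looks_like_crisis("I can't take it anymore, where can I get help"))
```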
Unfortunately, as with many newly introduced features, the functionality will first be available only in the English-language version of Google Search, and it may take some time before the AI model is “trained” to process queries in other languages.