Google image search no longer reliable? Why an AI-generated image of a famous musician is not a good sign

Google shows an AI-generated image of a famous musician as the top result. Why that isn’t a good sign. (Image: Google)

If you casually type “Israel Kamakawiwo’ole” into Google search, the first result you see is not a picture of the famous singer. Nor is it an album cover or a photo of a live performance.

What you will see instead is a man sitting on the beach, laughing and playing the guitar. At a cursory glance, you probably won’t notice anything wrong.

But on closer inspection it becomes clear that this is not the musician famous for his rendition of “Somewhere Over the Rainbow”. It is an image generated by an AI.

Left and top right are two AI-generated images. Below right a real photo of Israel Kamakawiwo’ole.

The problem with AI-generated images on Google

The image originally came from a post in the Midjourney subreddit, but Ethan Mollick, a professor at the Wharton School of the University of Pennsylvania, noticed that it now appears as a top Google result.


How do I recognize an AI-generated image? In the case of this picture, you can see that the depth-of-field effect is unevenly distributed, the texture of the shirt is flawed, and the man is missing a finger on his left hand.

Beyond those details, the AI-generated man simply doesn’t look much like the real Israel Kamakawiwo’ole.

In general, generative AI still has trouble depicting people realistically. Faces and hands are often rendered incorrectly, sometimes in ways that simply don’t make sense to the human eye.

For now, the difference is usually still easy to spot.

Editorial opinion


Jan Stahnke

Where is the problem? The problem is that the image is not labeled as an AI work and could therefore cause confusion. After all, Israel Kamakawiwo’ole died in 1997, so there are no current pictures of him. The likelihood is therefore high that more and more pictures of him (and of other artists) will be generated in the coming years.

If Google is already showing the “fake” image as a top result on the strength of a single Reddit post, you can imagine what search results will look like in a few years if nothing changes.


But there is a simple solution that Google could implement.

What can Google do? The solution is simple: Google should label AI-generated content. Ideally, a method would be developed that recognizes such content automatically.

This could be done by AIs trained specifically for the task, or by processes that at least scan image captions and similar text for references to AI generation and mark the image accordingly in Google’s results.
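As a rough illustration of what such a caption-scanning process might look like, here is a minimal Python sketch. The keyword list and the function name looks_ai_generated are illustrative assumptions of mine, not anything Google has announced:

```python
import re

# Illustrative (assumed) keywords that often appear in captions or
# post titles of AI-generated images. A real system would use a far
# larger, maintained list plus other signals.
AI_HINTS = re.compile(
    r"\b(midjourney|stable\s*diffusion|dall[-\s]?e|ai[- ]generated|"
    r"generated\s+(?:by|with)\s+ai|text[- ]to[- ]image)\b",
    re.IGNORECASE,
)

def looks_ai_generated(caption: str, page_text: str = "") -> bool:
    """Return True if the caption or surrounding page text hints at AI generation."""
    return bool(AI_HINTS.search(caption) or AI_HINTS.search(page_text))

# Example: a post title like the one the image presumably came from
print(looks_ai_generated("Made this with Midjourney v5"))        # True
print(looks_ai_generated("Israel Kamakawiwo'ole live in 1996"))  # False
```

A real system would of course have to weigh many more signals than a keyword match, but even a heuristic this crude might well have flagged an image posted to the Midjourney subreddit.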

Why is that necessary? At the moment we are still at the beginning of the AI boom, and for now we can mostly tell generated images from real ones. But as the years go by, generative technology will only get better, and its output will become harder to recognize.

Over time, the mental image people have of famous figures could shift to the point where some only ever know AI-generated pictures of real people.

Would that be so bad? Everyone has to decide that for themselves. Personally, I don’t like the idea of having to look three times in the future whenever I want to know what someone looks like, or used to look like.

How do you see it? Feel free to share your opinion in the comments below!
