{"id":28621,"date":"2024-02-24T02:24:22","date_gmt":"2024-02-23T20:54:22","guid":{"rendered":"https:\/\/farratanews.online\/google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis\/"},"modified":"2024-02-24T02:24:22","modified_gmt":"2024-02-23T20:54:22","slug":"google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis","status":"publish","type":"post","link":"https:\/\/farratanews.online\/google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis\/","title":{"rendered":"Google explains Gemini\u2019s \u2018embarrassing\u2019 AI pictures of diverse Nazis"},"content":{"rendered":"


\n

Google has issued an explanation for the \u201cembarrassing and wrong\u201d images generated by its Gemini AI tool. In a blog post on Friday, Google says its model produced \u201cinaccurate historical\u201d images due to tuning issues. The Verge<\/em> and others caught Gemini generating images of racially diverse Nazis and US Founding Fathers earlier this week.<\/p>\n<\/div>\n

\n

\u201cOur tuning to ensure that Gemini showed a range of people failed to account for cases that should clearly\u00a0not<\/em>\u00a0show a range,\u201d Prabhakar Raghavan, Google\u2019s senior vice president, writes in the post. \u201cAnd second, over time, the model became way more cautious than we intended and refused to answer certain prompts entirely \u2014 wrongly interpreting some very anodyne prompts as sensitive.\u201d<\/p>\n<\/div>\n

\n
\n

Gemini\u2019s results for the prompt \u201cgenerate a picture of a US senator from the 1800s.\u201d<\/em><\/figcaption>Screenshot by Adi Robertson<\/cite><\/p>\n<\/div>\n<\/div>\n
\n

This led Gemini AI to \u201covercompensate in some cases,\u201d like what we saw with the images of the racially diverse Nazis. It also caused Gemini to become \u201cover-conservative.\u201d This resulted in it refusing to generate specific images of \u201ca Black person\u201d or a \u201cwhite person\u201d when prompted.<\/p>\n<\/div>\n

\n

In the blog post, Raghavan says Google is \u201csorry the feature didn\u2019t work well.\u201d He also notes that Google wants Gemini to \u201cwork well for everyone\u201d and that means getting depictions of different types of people (including different ethnicities) when you ask for images of \u201cfootball players\u201d or \u201csomeone walking a dog.\u201d But, he says:<\/p>\n<\/div>\n

\n
\n

However, if you prompt Gemini for images of a specific type of person \u2014 such as \u201ca Black teacher in a classroom,\u201d or \u201ca white veterinarian with a dog\u201d \u2014 or people in particular cultural or historical contexts, you should absolutely get a response that accurately reflects what you ask for.<\/p>\n<\/blockquote>\n<\/div>\n

\n

Raghavan says Google is going to continue testing Gemini AI\u2019s image-generation abilities and \u201cwork to improve it significantly\u201d before reenabling it. \u201cAs we\u2019ve said from the beginning, hallucinations are a known challenge with all LLMs [large language models] \u2014 there are instances where the AI just gets things wrong,\u201d Raghavan notes. \u201cThis is something that we\u2019re constantly working on improving.\u201d<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"

Google has issued an explanation for the \u201cembarrassing and wrong\u201d images generated by its Gemini AI tool. In a blog post on Friday, Google says its model produced \u201cinaccurate historical\u201d images due to tuning issues. The Verge and others caught Gemini generating images of racially diverse Nazis and US Founding Fathers earlier this week. …<\/p>\n","protected":false},"author":1,"featured_media":28622,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"_links":{"self":[{"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/posts\/28621"}],"collection":[{"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/comments?post=28621"}],"version-history":[{"count":0,"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/posts\/28621\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/media\/28622"}],"wp:attachment":[{"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/media?parent=28621"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/categories?post=28621"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/farratanews.online\/wp-json\/wp\/v2\/tags?post=28621"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}