Google to Release ‘Improved’ Gemini AI Image Generator

Editor · February 27, 2024 (Updated)

Google has announced plans to launch an improved version of its Gemini AI image generator. The announcement comes in the wake of criticism and accusations of bias, particularly over the tool's depiction of white people in generated images. The controversy highlighted the challenges AI systems face in ensuring fairness and accuracy in their outputs.

“We have taken the feature offline while we fix that,” Demis Hassabis, CEO of Google DeepMind, said Monday during a panel at the Mobile World Congress conference in Barcelona. “We are hoping to have that back online very shortly in the next couple of weeks, a few weeks.” He added that the product was not “working the way we intended.”

Google initially responded to the backlash by disabling Gemini's ability to generate images of people, acknowledging the tool's shortcomings in producing accurate historical — or any other — imagery without bias.

In an official statement on their blog, Google expressed, “We did not want Gemini to refuse to create images of any particular group. And we did not want it to create inaccurate historical — or any other — images. So we turned the image generation of people off and will work to improve it significantly before turning it back on. This process will include extensive testing.”

Despite Google's repeated official statements in its defense, many users remain unforgiving.

This decision underscores Google's commitment to addressing the concerns raised by users and critics alike. Hassabis confirmed that the company took the feature offline to rectify the identified problems and said the improved Gemini image generator is expected to relaunch in the coming weeks.


The initial controversy stemmed from user-shared results in which historical scenes that involved exclusively white individuals were re-imagined with diverse casts. Critics accused Google of embedding a bias against white people into Gemini, labeling the tool “woke” and politically motivated.

Many users, meanwhile, remain skeptical that the tool can be fixed at all.

Google's investigation into the matter revealed two key issues: the tuning meant to ensure Gemini showed a range of people failed to account for cases where that distorted historical accuracy, and the model became overly cautious, refusing to respond to certain prompts and producing embarrassing, incorrect outputs.

The upcoming improved Gemini AI aims to correct these issues by undergoing extensive testing to ensure that the AI can generate images that are diverse, accurate, and sensitive to the nuances of historical and cultural representation.

Google’s proactive steps to refine and enhance Gemini’s capabilities reflect its dedication to developing technology that is both inclusive and respectful of all communities.

Google's senior vice president addressed the controversy in a blog post, writing:

“I can’t promise that Gemini won’t occasionally generate embarrassing, inaccurate, or offensive results — but I can promise that we will continue to take action whenever we identify an issue.”

As the tech giant works towards reintroducing an enhanced Gemini AI image generator, the tech community and users are eager to see how Google's efforts will contribute to setting new standards for AI ethics and accuracy in digital representations.

For the latest AI-related news, visit AllAboutAI.com.


Dave Andre

Editor

Digital marketing enthusiast by day, nature wanderer by dusk. Dave Andre blends two decades of AI and SaaS expertise into impactful strategies for SMEs. His weekends? Lost in books on tech trends and rejuvenating on scenic trails.
