Google has updated its AI Overviews feature, formerly known as Search Generative Experience, after users reported numerous unusual and incorrect responses. Built on Google’s Gemini model, AI Overviews provides natural-sounding answers to search queries. However, following its expansion after Google’s I/O event, users shared screenshots of some peculiar suggestions, including adding non-toxic glue to pizza sauce and a response suggesting a jump off the Golden Gate Bridge to someone searching about depression.
To combat these issues, Google implemented over a dozen "technical improvements" to enhance response accuracy, including a better detection system for nonsensical queries and limitations on using user-generated content.
During the Computex event in Taiwan, Nvidia CEO Jensen Huang highlighted the transformative potential of generative AI and accelerated computing. He emphasized that these technologies are reshaping industries and unlocking new opportunities for innovation. Nvidia, a leader in AI hardware, continues to drive advancements with its GPUs, which are crucial for powering generative AI applications and services.
“The future of computing is accelerated,” Huang said, pointing to Nvidia's role in pushing technological boundaries.
A PGIM report warns that data centers' share of global electricity consumption could rise from 2% to more than 20% by 2030. The power that data centers use to run and cool their servers is projected to more than double by 2026, roughly matching Japan's current electricity consumption. The report attributes this increase to the intensive workloads required for training large language models and urges data center operators to balance computational demands with sustainable power sourcing.
At Computex 2024, AMD introduced new AI-focused hardware to challenge Nvidia's dominance. The company announced the Instinct MI325X accelerators, targeting AI workloads in data centers, and previewed its upcoming EPYC server processors, codenamed Turin, set for release in late 2024. AMD's accelerated release schedule aims to match Nvidia's pace in the AI hardware market.
Hitachi and Microsoft have entered a multiyear, multibillion-dollar partnership to integrate Microsoft’s AI and cloud services into Hitachi’s Lumada digital solutions. The collaboration will incorporate Azure OpenAI Service, GitHub Copilot, and productivity tools to enhance Lumada’s offerings. Hitachi projects that the Lumada business will generate $18.9 billion in revenue in fiscal 2024.