Enterprises are increasingly exploring the potential of generative AI models, yet many struggle to move beyond the proof-of-concept stage and operationalize their learnings at scale. The AI Summit New York 2023 shed light on the key obstacles companies face and strategies to overcome them.
1. Data Challenges: Panelists emphasized the critical role of data in training generative AI models. Sesh Iyer, Managing Director at BCG, highlighted the need for curated data with proper metadata, and Gaurav Dhama from Mastercard stressed the importance of organizing knowledge bases to feed large language models effectively.
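To make that data point concrete, here is a minimal sketch (not from the panel) of the kind of curation Iyer and Dhama describe: documents split into chunks with provenance metadata attached, so a retrieval layer can feed the right context to a large language model. The `KnowledgeChunk` class and `curate_document` function are illustrative names, not any vendor's API.

```python
# Illustrative only: chunk a document and attach metadata so answers can be
# traced back to their source when the chunks are fed to an LLM.
from dataclasses import dataclass, field

@dataclass
class KnowledgeChunk:
    text: str
    metadata: dict = field(default_factory=dict)

def curate_document(doc_id: str, text: str, source: str, chunk_size: int = 500) -> list[KnowledgeChunk]:
    """Split a document into chunks, each tagged with provenance metadata."""
    chunks = []
    for offset in range(0, len(text), chunk_size):
        chunks.append(KnowledgeChunk(
            text=text[offset:offset + chunk_size],
            metadata={"doc_id": doc_id, "source": source, "offset": offset},
        ))
    return chunks

# A curated chunk carries enough metadata to trace an answer back to its source.
chunks = curate_document("policy-001", "Employees may expense travel up to ...", source="HR handbook")
print(chunks[0].metadata)
```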
2. Governance and Confidence: Establishing the right governance structure is crucial to managing risks associated with generative AI. Lucinda Linde, Senior Data Scientist at Ironside, noted a confidence problem among top leaders due to concerns about security, copyright issues, and hallucinations associated with large language models.
3. Talent Shortage and Business Focus: Other obstacles include a shortage of skilled professionals, difficulty tying projects to business value and return on investment (ROI), and the fluctuating costs of generative AI, which introduce uncertainty. Because generative AI is still relatively new, the journey is akin to navigating an unpaved road.
4. Copilot Phase and Security Concerns: Panelists, including Vik Scoggins from Coinbase, acknowledged that because of these risks, generative AI might remain in the copilot phase for a while, especially in heavily regulated industries. Security vulnerabilities, particularly in generated code, demand careful review and stronger skills from programmers.
5. Internal Deployment and Productivity Gains: Linde recommended deploying generative AI internally first to enhance employee productivity and efficiency, emphasizing the importance of building confidence within the organization before using it in customer-facing applications. Despite the learning curve, the productivity gains can range from 10% to an impressive 80-90%, according to Iyer.
6. Diversification and OpenAI's Drama: The panel considered using multiple models a best practice, citing the recent events surrounding OpenAI CEO Sam Altman's departure and return as a cautionary tale. Diversification ensures resilience and lets companies choose models based on their specific strengths.
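As a rough illustration of that diversification, the sketch below sends a prompt to whichever configured provider responds, so an outage or governance shake-up at a single vendor is not a single point of failure. The provider functions are stand-ins for real API calls, not actual client libraries.

```python
# Hypothetical routing layer across multiple model providers.
from typing import Callable

def openai_backend(prompt: str) -> str:      # placeholder for a real API call
    raise RuntimeError("provider unavailable")

def anthropic_backend(prompt: str) -> str:   # placeholder for a real API call
    return f"[answer to: {prompt}]"

def generate(prompt: str, providers: list[Callable[[str], str]]) -> str:
    """Try each configured provider in order, falling back on failure."""
    last_error = None
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as err:
            last_error = err
    raise RuntimeError("all providers failed") from last_error

print(generate("Summarize Q3 results.", [openai_backend, anthropic_backend]))
```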
7. Designing a Generative AI Stack: When designing a generative AI stack, Iyer said, factors such as accuracy, latency, and cost should be carefully weighed. Companies need a cascade of systems to implement generative AI effectively.
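A hypothetical sketch of such a cascade: route each request to a cheap, fast model first and escalate to a slower, more accurate model only when a confidence check fails. The models and the confidence heuristic below are placeholders, not a specific product's API.

```python
# Cheap-first cascade that trades off accuracy, latency, and cost.
def small_model(prompt: str) -> tuple[str, float]:
    # Stand-in for a small, fast, inexpensive model; returns (answer, confidence).
    return "draft answer", 0.62

def large_model(prompt: str) -> str:
    # Stand-in for a larger, slower, more expensive model.
    return "carefully reasoned answer"

def cascade(prompt: str, confidence_threshold: float = 0.8) -> str:
    """Escalate to the large model only when the small model is unsure."""
    answer, confidence = small_model(prompt)
    if confidence >= confidence_threshold:
        return answer            # fast, cheap path
    return large_model(prompt)   # accurate but costly fallback

print(cascade("Classify this support ticket."))
```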
8. Competitive Edge: Panelists agreed that the competitive edge lies not just in the model but in the data used to power generative AI. Dhama emphasized the importance of combining business insights with operations to derive full value from generative AI.
Conclusion: Despite the challenges, the consensus at the AI Summit NY 2023 was that generative AI is worth the investment. By addressing data challenges, establishing robust governance, and focusing on internal deployment, enterprises can navigate the complexities and unlock the substantial benefits of generative AI.