Talk

GenAI - Chances and Risks | DSC Adria '24

Explore Generative AI's potential and challenges, from enterprise use cases to risks like prompt injection, with insights on best practices for AI governance.


Kristian Kamber


DATE

Jul 5, 2024


TIME & LENGTH

25 min


STATUS

Available on demand


LANGUAGE

English


In this talk at DSC Adria '24, Kristian Kamber, Co-Founder and CEO of SplxAI, explores the transformative potential of generative AI, from automating tasks to advanced communication and decision-making. Key topics include current enterprise applications, such as enhancing customer service and automating administrative tasks, alongside security challenges like prompt injections, data leaks, and production readiness. Kristian shares insights on securing AI systems through penetration testing, input/output validation, and collaborative team efforts. The talk also highlights the rapid adoption of GenAI in emerging markets, the need for continuous security updates, and the pressing demand for responsible AI governance to unlock its full potential.

Securing Generative AI: Strategies to Combat Prompt Injection and Data Leaks

Brand Reputation Risks: AI vulnerabilities can lead to incidents like chatbots recommending competitors, undermining trust and damaging brand image.

Proactive Security Measures: Penetration testing, input/output validation, and third-party guardrails are vital for protecting AI systems in production.

Emerging Threats: Invisible characters, multimodal attacks, and governance gaps highlight the ongoing need for vigilance and continuous improvement.


Supercharged security for your AI systems

Don’t wait for an incident to happen. Make sure your AI apps are safe and trustworthy.


For a future of safe and trustworthy AI.

Subscribe to our newsletter

By clicking "Subscribe" you agree to our privacy policy.
