ITSPmagazine Podcasts

Hallucinations - Fear and Loathing in Silicon Valley | Cyber Cognition Podcast with Hutch and Len Noe

Episode Summary

Hallucinations are a relatively new problem unique to Large Language Models (LLMs), one that introduces a whole new set of obstacles and challenges.

Episode Notes

Hosts: 

Hutch

On ITSPmagazine 👉 https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/hutch

Len Noe, Technical Evangelist / Whitehat Hacker at CyberArk [@CyberArk]

On Twitter | https://twitter.com/hacker_213

On LinkedIn | https://www.linkedin.com/in/len-noe/

______________________

Episode Sponsors

Are you interested in sponsoring an ITSPmagazine Channel?

👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

______________________

Episode Introduction

Hallucinations are a relatively new problem unique to Large Language Models (LLMs), one that introduces a whole new set of obstacles and challenges.

News

1. Rabbit consumer AI agent - https://www.theverge.com/2024/1/10/24033498/rabbit-r1-sold-out-ces-ai

2. Research paper released on self-driving - https://arxiv.org/pdf/2401.01624v1.pdf

3. OpenAI removes prohibition on uses for "military and warfare" - https://theintercept.com/2024/01/12/open-ai-military-ban-chatgpt/

Deep Dive

1. ChatGPT generates false legal citations - https://arstechnica.com/tech-policy/2023/06/lawyers-have-real-bad-day-in-court-after-citing-fake-cases-made-up-by-chatgpt/

2. Public Bard hallucination in a Google ad - https://www.reuters.com/technology/google-ai-chatbot-bard-offers-inaccurate-information-company-ad-2023-02-08/

3. Bug bounties saturated with garbage - https://www.malwarebytes.com/blog/news/2024/01/how-ai-hallucinations-are-making-bug-hunting-harder

______________________


For more podcast stories from Cyber Cognition Podcast with Hutch, visit: https://www.itspmagazine.com/cyber-cognition-podcast

Watch the video podcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllS12r9wDntQNB-ykHQ1UC9U