- 8:47 · AI Model Penetration: Testing LLMs for Prompt Injection & Jailbreaks · 21.6K views · 8 months ago · YouTube · IBM Technology
- 2:10 · Poetic Prompts Can Jailbreak AI? New Study Shows 62% of Chatbot… · 431 views · 4 months ago · YouTube · WION
- 6:41 · AI Jailbreaking Demo: How Prompt Engineering Bypasses LLM Securi… · 3.1K views · Sep 26, 2024 · YouTube · Packt
- 7:51 · What Is Prompt Injection Attack | Hacking LLMs With Prompt Injecti… · 5.3K views · Jun 20, 2024 · YouTube · Simplilearn (chapter at 03:26: What is Prompt Injection?)
- 15:47 · How AI jailbreaks work and what stops them. (GPT, DeepSeek, Lla… · 11.1K views · Oct 21, 2024 · YouTube · Microsoft Mechanics (chapter at 04:54: Indirect prompt injection attack)
- Safeguarding AI against 'jailbreaks' and other prompt attacks · 5 months ago · microsoft.com
- 2:49 · French hackers show how easy it is to 'jailbreak' Musk's Grok 3 • FRA… · 87.5K views · Feb 20, 2025 · YouTube · FRANCE 24 English
- 10:05 · We tried to jailbreak our AI (and Model Armor stopped it) · 4.4K views · 5 months ago · YouTube · Google Cloud Tech
- 1:00 · What is Jailbreaking an AI? | What is by Digit EP7 | #jailbreakai #genera… · 2K views · Aug 18, 2024 · YouTube · Digit
- 0:53 · How to mitigate GenAI security threats with Azure AI Content Safe… · 1.4K views · Aug 15, 2024 · YouTube · Microsoft Azure (chapter at 00:18: What is Prompt Injection Attacks?)
- Study reveals poetic prompts could jailbreak AI · 4 months ago · Mashable · Christianna Silva
- This AI Chatbot is Trained to Jailbreak Other Chatbots · Jan 3, 2024 · vice.com
- AI Jailbreak | IBM · Nov 12, 2024 · ibm.com
- Microsoft: 'Skeleton Key' Jailbreak Can Trick Major Chatbots Into Beh… · Jun 26, 2024 · pcmag.com
- 3:01 · [2025] How to use ChatGPT dan prompt - Unlock ChatGPT ( ChatG… · 33.8K views · 11 months ago · YouTube · TenorshareOfficial
- 10:30 · How Jailbreakers Try to “Free” AI · 258.9K views · Sep 28, 2024 · YouTube · Sabine Hossenfelder (chapter at 01:35: Jailbreaking Explained)
- 30:14 · [2025] AI & Cybersecurity for Hacker & Developer: Love & Hate Story… · 3.3K views · 3 weeks ago · YouTube · Jose Maria Alonso
- 9:03 · Private & Uncensored Local LLMs in 5 minutes (DeepSeek and Dolphin) · 612.2K views · Feb 5, 2025 · YouTube · David Bombal
- 5:28 · Notion AI with ChatGPT Hack! · 55.1K views · Dec 26, 2024 · YouTube · Tool Finder
- 1:52 · What is a prompt and how do I write one? · 8.4K views · 7 months ago · YouTube · Microsoft 365
- 6:28 · Episode 4: Indirect Prompt Injection Explained | AI Red Teaming 101 · 2.7K views · 9 months ago · YouTube · Microsoft Developer
- Jailbreak AI | IBM · Mar 11, 2025 · ibm.com
- 2:29 · Azure AI Content Safety Prompt Shields · 2.4K views · Jun 4, 2024 · YouTube · Microsoft Developer
- It's Surprisingly Easy to Jailbreak LLM-Driven Robots · Nov 11, 2024 · ieee.org
- 29:47 · AI + Metasploit = Terrifyingly Easy Hacking is here (demo) · 133.9K views · 7 months ago · YouTube · David Bombal
- 41:28 · How Microsoft Approaches AI Red Teaming | BRK223 · 10K views · May 27, 2024 · YouTube · Microsoft Developer (chapter at 15:00: Social Engineering and Jailbreaks)
- 0:47 · These AI Prompts Will 10x Your Coding Journey | Learn Faster & C… · 32.9K views · Mar 10, 2025 · YouTube · GeeksforGeeks
- The 2026 Guide to Prompt Engineering | IBM · 9 months ago · ibm.com
- AI-powered Bing Chat spills its secrets via prompt injection attac… · Feb 10, 2023 · arstechnica.com
- ChatGPT: how to activate DAN mode to jailbreak and use the… · May 24, 2024 · xataka.com