AI Is Now Choosing Bombing Targets and It's Wrong 10% of the Time
An AI is choosing who gets bombed in real wars right now — and it has a confirmed ten percent error rate. Israel's military AI system Lavender makes a targeting decision in 20 seconds. A human analyst takes days. The Israeli military knew about the error rate before deployment. They used it anyway.
Before AI, the Israeli Air Force literally ran out of bombing targets and struck the same buildings twice. Then The Gospel AI changed everything — generating 100 bombing targets every single day where a team of 20 human officers could only identify 50 in an entire year. In the first 35 days of the Gaza war alone, Israel struck over 15,000 targets. That operational pace was only possible because a machine was doing the choosing — not a human. In February 2026, a strike hit a girls' school in southern Iran. Iran says 168 people were killed — most of them children. The school was on the US military target list. The military facility was next door. The AI was processing intelligence data that was four years out of date. Nobody stopped it. One Israeli officer who operated Lavender described his entire job as — twenty seconds per target, dozens per day — quote: "I had zero value as a human. I was just a stamp of approval." The CEO of Palantir, one of the most powerful AI weapons companies on the planet, stated publicly that these AI targeting systems are now the equivalent of tactical nuclear weapons against an enemy with only conventional ones. The age of a human deciding who gets bombed is ending. A machine is taking over. And it is wrong ten percent of the time.
🔔 Subscribe to Science Unseen for more mind-blowing science facts every day.
#shorts #science #facts #AItargeting #AIwarfare #Lavender #TheGospel #Palantir #militaryAI #weaponstech #scienceunseen #mindblowingfacts #hiddenfacts #AIweapons #modernwarfare #shockingscience #wariniran #Gazawar #artificialintelligence #bombingtargets
Video "AI Is Now Choosing Bombing Targets and It's Wrong 10% of the Time" from the channel Science Unseen
AI choosing bombing targets AI error rate in war AI targeting system AI vs human targeting Gaza AI bombing Gospel AI military Lavender AI Israel ai facts ai vs human can AI decide who gets bombed future of warfare future war how AI targets people mind blow mind blowing facts modern warfare technology science facts science unseen shocking facts shocking science shorts unseen science war tech
No comments
Video information
April 28, 2026, 16:25:15
00:02:06