I'm really hoping you don't need me to tell you that using machine learning ("AI") systems to identify potential targets and then suggest striking them while they are at home with their family members, including children, is one of the most abhorrent, unethical, inhumane things I've ever seen. There is absolutely no excuse for developing these systems. Technology is never neutral. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes