The surveillance tech industry is gaining traction despite ethical concerns: startups like Conntour have raised $7 million to refine AI-powered search engines for security cameras. Co-founder Matan Goldner says the company is selective about which clients it takes on, to ensure its technology is used legally and ethically.
Conntour’s platform uses vision-language models to let users query video feeds in natural language, a Google-like search experience tailored to security footage. The system can also monitor feeds continuously and automatically raise alerts based on preset rules, making it adaptable and scalable.
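The article doesn’t disclose how Conntour’s search actually works, but the general technique behind natural-language video search is to embed queries and frame descriptions into a shared space and rank by similarity, with preset rules expressed as standing queries. A minimal sketch of that idea, using a toy bag-of-words similarity in place of a real vision-language model (all names here are hypothetical):

```python
from dataclasses import dataclass

def embed(text: str) -> dict:
    # Toy "embedding": word-count vector. A real system would use a
    # vision-language model to map frames and queries into a shared space.
    vec: dict = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

@dataclass
class Frame:
    camera_id: str
    timestamp: float
    caption: str  # stand-in for what a vision model would see in the frame

def search(frames, query, top_k=3):
    """Rank frames by similarity between the query and each frame's description."""
    q = embed(query)
    scored = [(cosine(q, embed(f.caption)), f) for f in frames]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [f for score, f in scored[:top_k] if score > 0]

def check_alerts(frames, rules):
    """Preset rules are just standing natural-language queries: fire an
    alert whenever a rule matches incoming footage."""
    return [(rule, f) for rule in rules for f in search(frames, rule, top_k=1)]
```

A query like `search(frames, "person at the loading dock")` would surface matching cameras first, and the same machinery doubles as the rule-based alerting the article describes.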
The company claims its system can handle thousands of camera feeds with minimal computing power by combining multiple models and logic systems to optimize efficiency. It can be deployed on-premises or in the cloud, and works either alongside existing security systems or as a standalone platform.
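The article doesn’t specify how those “multiple models and logic systems” fit together, but a common way to stretch limited compute across thousands of feeds is a model cascade: a cheap filter discards uninteresting frames so an expensive model only sees the remainder. A minimal sketch under that assumption (function names are illustrative, not Conntour’s):

```python
def cheap_motion_filter(frame) -> bool:
    # Stand-in for a lightweight first-stage check (e.g. frame differencing)
    # that discards static scenes before any heavy model runs.
    return frame["motion_score"] > 0.2

def expensive_vlm(frame) -> str:
    # Stand-in for a costly vision-language model call; in a cascade it only
    # sees the small fraction of frames the cheap filter lets through.
    return f"description of {frame['camera_id']}"

def process_feeds(frames):
    """Cascade: run the expensive model only on frames that pass the cheap filter."""
    described, skipped = [], 0
    for frame in frames:
        if cheap_motion_filter(frame):
            described.append((frame["camera_id"], expensive_vlm(frame)))
        else:
            skipped += 1
    return described, skipped
```

If most cameras show static scenes most of the time, the expensive model runs on only a small slice of the traffic, which is how a cascade keeps per-feed compute low.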
However, the quality of surveillance footage remains a challenge, which led Conntour to attach confidence scores to its results. The biggest technical hurdle the company faces is integrating large-language-model capabilities while maintaining efficiency across thousands of feeds.
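The article doesn’t describe Conntour’s scoring scheme; the general idea behind confidence scores on noisy footage is to label results by the model’s certainty rather than present every hit as a definite match. A minimal sketch with hypothetical thresholds:

```python
def with_confidence(hits):
    """Label raw search hits by model certainty. Each hit carries a score in
    [0, 1]; low-quality footage tends to yield low scores, so results are
    labeled rather than silently presented as certain matches."""
    labeled = []
    for hit in sorted(hits, key=lambda h: h["score"], reverse=True):
        # Thresholds are illustrative, not Conntour's actual values.
        label = "high" if hit["score"] >= 0.8 else (
                "medium" if hit["score"] >= 0.5 else "low")
        labeled.append({**hit, "confidence": label})
    return labeled
```

A UI built on this can surface high-confidence matches prominently and flag low-confidence ones for human review, which is the usual mitigation when footage quality limits model accuracy.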
As AI’s role in our daily lives keeps growing, who gets to decide which ethical standards apply? Just another twist in humanity’s complex relationship with technology.