As a concept, autonomous behavior analysis makes facial and body recognition seem quaint. Think about it: with recognition, the machine merely identifies you as you, or at least as some database version of you, however accurate. When the computer analyzes your actions and movements, it makes a deeper and arguably more dangerous determination. In ascribing behavior to you, the program is not merely recalling information but creating it. The software is adding to the profile. This is the difference between profiling and profile-creation, and it opens an entirely new world of fiction.
Autonomous behavior analysis is a real thing, of course. The Center for Engineering and Industrial Development (CIDESI) just unveiled a recently installed prototype in central Mexico. The CIDESI test system consists of four cameras installed around a laboratory, all connected to a central server. Together, those cameras track an individual as they enter the facility and move within it, making determinations as to whether that person is behaving "unusually." If the software decides they are, the event is logged and a notification is sent out.
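To make the pipeline concrete, here is a minimal sketch of the log-and-notify logic described above: detections from the cameras form a movement track, and a step whose speed is a statistical outlier gets flagged. Everything here, from the `Detection` record to the z-score threshold, is an illustrative assumption, not CIDESI's actual implementation.

```python
import statistics
from dataclasses import dataclass

@dataclass
class Detection:
    """One camera observation of a tracked person (hypothetical schema)."""
    person_id: str
    x: float
    y: float
    t: float  # timestamp in seconds

def speeds(track):
    """Per-step speeds from a time-ordered sequence of detections."""
    out = []
    for a, b in zip(track, track[1:]):
        dt = b.t - a.t
        if dt > 0:
            out.append(((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5 / dt)
    return out

def unusual_events(track, z_threshold=2.5):
    """Return indices of steps whose speed is a z-score outlier.

    The threshold is an arbitrary stand-in for whatever model the
    real system uses to define "unusual" behavior.
    """
    s = speeds(track)
    if len(s) < 3:
        return []  # not enough history to establish a baseline
    mean, stdev = statistics.mean(s), statistics.pstdev(s)
    if stdev == 0:
        return []  # perfectly uniform movement, nothing stands out
    return [i for i, v in enumerate(s) if abs(v - mean) / stdev > z_threshold]

# Usage: a person walks steadily across the room, then abruptly sprints.
track = [Detection("p1", x=float(i), y=0.0, t=float(i)) for i in range(10)]
track.append(Detection("p1", x=50.0, y=0.0, t=10.0))  # sudden jump in speed
for i in unusual_events(track):
    print(f"ALERT: person {track[i].person_id} unusual movement at step {i}")
```

The design choice worth noticing is that the system's output is not an observation but a judgment: the "ALERT" line is new information the software manufactured from a threshold someone chose.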
The CIDESI press release is modest about applications, suggesting that such a system might be used at traffic intersections to identify accidents and the like. But the questions are endless: could AI-sourced behavior analysis be used as evidence in court? Are people going to start getting picked off in airports because some software determines they started walking funny in the security line? Just imagine how deep a behavior database might get, each additional month or year enabling a deeper, longer-term look.