Kepler Vision Technologies today announced an upgrade to its Kepler Night Nurse (KNN) solution – automatically blurring the faces of anyone detected by its smart sensor system. This upgrade will provide an additional level of guaranteed privacy for both patients and care staff.
Through the use of deep learning and computer vision, KNN’s video monitoring system is able to detect when elderly patients fall, when they are in physical distress, and when patients suffering from dementia wander into areas where they are not supposed to be – automatically alerting staff when these patients need assistance. Replacing older sensor systems such as bed mats, motion sensors and wearables like necklaces and bracelets, the software allows staff to respond to patients immediately, and eliminates 99% of false alarms.
Kepler Vision Technologies notes that, unlike traditional camera systems, which require staff attention to monitor, KNN’s live video feeds are analysed by the software without being seen by a human being until a person needs to confirm the cause of a fall or physical discomfort. To better enable fall prevention practices, care staff are able to view a photo taken 30 seconds before a patient falls down, in order to find out the reason for the fall. With this new addition, all staff and patients in shot have their faces blurred to guarantee privacy, while still allowing staff to interpret the care situation in the room.
The addition of automatic face blurring to the KNN solution adds an extra level of privacy for anyone in these environments, similar to the automatic anonymisation solutions used by Google Maps or Amazon Rekognition.
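The underlying idea – detect face regions in a frame, then blur only those regions before anyone views the image – can be sketched in a few lines. The following is a minimal, self-contained illustration, not Kepler's implementation: it assumes face bounding boxes have already been produced by a detector (real systems would use a deep-learning face detector), and applies a simple box blur to those regions of a grayscale frame represented as a list of pixel rows.

```python
def box_blur_region(image, box, radius=1):
    """Return a copy of `image` (list of pixel rows) with the
    rectangular region `box` = (x, y, w, h) blurred.

    Each pixel inside the box is replaced by the mean of its
    (2*radius+1)^2 neighbourhood, clipped at the frame edges.
    Pixels outside the box are left untouched, so the rest of
    the scene stays interpretable.
    """
    h_img, w_img = len(image), len(image[0])
    x, y, w, h = box
    out = [row[:] for row in image]  # work on a copy
    for j in range(y, min(y + h, h_img)):
        for i in range(x, min(x + w, w_img)):
            vals = []
            for dj in range(-radius, radius + 1):
                for di in range(-radius, radius + 1):
                    jj, ii = j + dj, i + di
                    if 0 <= jj < h_img and 0 <= ii < w_img:
                        vals.append(image[jj][ii])
            out[j][i] = sum(vals) // len(vals)
    return out

# Toy 4x4 frame with one bright pixel standing in for a "face";
# the detector (assumed, not shown) reports the 2x2 box around it.
frame = [[0, 255, 0, 0],
         [0,   0, 0, 0],
         [0,   0, 0, 0],
         [0,   0, 0, 0]]
blurred = box_blur_region(frame, (0, 0, 2, 2))
```

After blurring, the bright "face" pixel is averaged away inside the box while every pixel outside the box is unchanged – the same trade-off the article describes: anonymity for the person, context for the care staff.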
Dr. Harro Stokman, CEO of Kepler Vision Technologies, said: “As machine learning tools continue to proliferate throughout the healthcare sector, ensuring these systems provide benefits without compromising patients’ privacy and dignity is of paramount importance. While our Kepler Night Nurse system already provides privacy because the video feeds it monitors are only ever ‘seen’ by the algorithm, automatically blurring the faces in this closed system will provide an extra level of privacy both for patients and for staff who are concerned that KNN could be used to spy on them. We look forward to announcing further upgrades to KNN that will improve its monitoring capabilities and ease of use, and ensure continued transparency.”
To further cement its position as a leader in the ethical application of AI, Kepler Vision recently worked with AI bias tester Code4Thought to make certain its machine learning solution stood up to the highest level of independent scrutiny. Based on the ISO-29119-11 standard – guidelines designed for testing black-box AI-based systems to ensure accuracy and precision – Code4Thought’s analysis showed that KNN’s algorithm makes correct predictions regardless of a patient’s gender. It also set out guidelines to protect against AI bias in all future updates.