NSW Education Department Unaware of Microsoft Teams Biometric Data Collection
In March 2025, the New South Wales Department of Education discovered that Microsoft Teams had begun collecting students’ voice and facial biometric data without the department’s prior knowledge. This occurred after Microsoft enabled a Teams feature called ‘voice and face enrollment’ by default, which creates biometric profiles to improve meeting audio, speaker identification, and transcriptions via its Copilot AI tool.

The NSW department learned of the data collection roughly a month after it began, then disabled the feature and deleted the collected data within 24 hours. However, the department did not disclose how many individuals were affected or whether they were notified. Although Microsoft’s policy is to retain the data only while a user remains enrolled and to delete it within 90 days of account deletion, privacy experts have raised serious concerns. Rys Farthing of Reset Tech Australia criticized the unnecessary collection of children’s data, warning of long-term risks and calling for stronger protections.

The incident has sparked a broader debate about biometric data collection in educational settings. As AI technologies become more deeply integrated into classrooms, the ethical and legal implications of such collection are growing more pressing. Privacy advocates are urging stricter regulation to ensure students’ personal data is protected, particularly sensitive biometric information.

While Microsoft has stated that the data is deleted within 90 days of account deletion, critics argue that enabling such features by default poses a significant risk to user privacy. The NSW Department of Education’s response, while swift, has drawn criticism for its lack of transparency and communication with affected students. The incident highlights the need for greater oversight and accountability in educational technology, particularly where biometric data is collected from minors.

The situation also raises broader questions about data privacy across the education sector. As schools and universities adopt more digital tools to enhance learning, safeguarding student data becomes ever more critical. The incident is a reminder of the risks that accompany AI-driven platforms and of the need for clear guidelines to protect students’ rights and privacy.