By 2023, the number of users of digital voice assistants such as Amazon's Alexa, Apple's Siri, Microsoft's Cortana, and Google Assistant is expected to reach eight billion. Smart speakers carrying these assistants have already found their way into workplace environments, not only for simple tasks like adding meetings to calendars, but also for more advanced business applications and customer support. Wider use of these devices in the workplace, to the point of ubiquity in individual offices and in common areas like break rooms and conference rooms, brings increased security concerns.
Real-life news stories have already shown how easily a smart speaker can record a private conversation without the participants' consent or awareness, and even go a step further, accidentally sending that recording to an unrelated person in the owner's contact list. What would happen if a hacker gained access to these "smart" digital assistants? Researchers have built proof-of-concept apps for these devices to show how bad actors could run eavesdropping or phishing tasks in the background, without the user's awareness, while the app performs seemingly innocent tasks. These research apps could even make it appear that the device had stopped recording, or phish for passwords that unlock a user's confidential files.
Widely used apps already collect users' health statistics or track speakers' vocal tones and sentiment while "listening" to conversations; smart speakers can monitor all of this information about employees, forecast a person's future actions, and even verify those predictions. In a business environment where confidentiality is paramount, smart assistant technology can lead to catastrophic breaches or hacking incidents, with users never realizing they triggered the process.
With GDPR codifying privacy, transparency, and disclosure requirements in the European Union, the U.S. legal system is seeing increased calls for similar laws, such as the California Consumer Privacy Act (CCPA). Under rules like these, users must understand and agree to what a device is doing before it collects any private data. Such requirements have pushed Amazon, Google, and other smart assistant providers to let users log in and review their data files. These legal disclosure obligations also shed light on how device data may be collected for investigations or legal matters.
As awareness grows about the importance of these background data files, digital assistant data now plays a bigger role in the courtroom: voice data and recordings have been considered in murder cases, and health device data has shown up in insurance fraud and other criminal cases. It is a natural progression that corporate data gathered by smart speakers, even unintentional recordings, could be analyzed in litigation and produced in response to document requests. Break room conversations, office complaints, or private business information could all be at play in discovery.
Plan ahead for privacy.
It’s not just documents—it’s anything that your smart device might know. Data privacy compliance affects every industry operating in our modern era. Every corporation and law firm needs clear, comprehensive information governance consulting and solutions.
BIA helps clients design and implement privacy management for corporate security architecture as well as for litigation. In addition to legal experts who can consult on GDPR, CCPA, and other state and industry regulations, we have the IT and cyber expertise to support all your needs.
Whether you need a top-to-bottom risk management assessment of your data or simply want to keep your smart speakers from spilling the beans, BIA's advisors will help you avoid costly penalties and reduce eDiscovery burdens before any litigious leaks occur.
Speak with BIA’s data privacy experts.
Read more about how you can use BIA to help manage your data and prepare a comprehensive compliance plan for your organization.