Who's in the Room? Siri, Alexa, and Confidentiality

Curt and Katie chat about how therapists can maintain confidentiality in a world of AI assistants and smart devices. What duty do clinicians have to inform clients? How can we balance confidentiality with the reality of how commonly these devices are involved in therapy? Can telehealth therapy be completely confidential and data secure? We discuss our shift in clinical responsibility, best practices, and how we can minimize exposure of clinical data to ensure the confidentiality our clients expect and deserve.

Digital assistants like Amazon Alexa and Google Assistant are designed to learn more about you as they listen, and part of doing so is to record the conversations you've had with them so they can learn your tone of voice, prompts, and requests.

While this is supposed to help the assistants learn to give you better answers, this feature-not-a-bug has landed Amazon in a string of bizarre headlines. In 2018, users reported that their Echo speakers began spontaneously laughing, while a family in Portland said their device recorded and sent conversations to a colleague without their knowledge. For these instances, Amazon claims that the devices were likely triggered by false positive commands. Still, it's not uncommon for smart speakers to pick up a random part of your everyday conversations and misunderstand it as a wake word (especially if you've changed the Alexa trigger to a more common word, like "Computer").

If you're curious about what Alexa has been hearing and recording in your household, here's a quick way to check.

On the app

First, open the Alexa app on your smart device. Tap the hamburger icon on the top left side of the screen to open the menu options. Click on the Settings menu, then tap on "Alexa Privacy." Here, you'll be able to browse all of the commands you've ever asked of Alexa, from timers to music requests to general internet queries. You can filter the results by date (by day, week, or month, or using a custom range) with a drop-down menu. This also enables you to delete all of the recordings for that day, week, month, etc. by tapping the "Delete All Recordings for…" link below the drop-down.

There are a variety of ways to delete your recordings. Directly below the date range drop-down menu is a link that lets you filter by device so that you can delete just what was recorded by a specific device. You can also check off several of the recordings listed and delete just those selected recordings. You can even toggle on a feature that lets you delete recordings by giving Alexa a verbal instruction; however, it comes with a warning that anybody with access to your Alexa device can give that order.

On the web

If you prefer to do this on a desktop, you can also manage your Alexa history by going to Amazon's dedicated Alexa Privacy page. Here, you can view, listen to, and clear your Alexa voice prompts as needed by clicking on the "Privacy Settings" tab and then on the "Review voice recordings" link in the "View, hear, and delete your voice recordings" section. This will bring you to the "Review Voice History" page, where you can perform the same actions that you could on the app. For example, to wipe out your entire Alexa history, you can set the date range to "All History" and then click on the "Delete All Recordings for All History" link. The company, of course, cautions that doing so "may degrade your Alexa experience."

As noted above, Amazon keeps these recordings to personalize the Alexa experience to your household, and it uses them to create an acoustic model of your voice. While it does automatically create a voice profile for each new user it recognizes (or ones you've manually added), the company says it deletes acoustic models if it has not recognized any particular user for three years. Finally, if you click on the link for Amazon's Manage Your Alexa Data on the left-hand menu, you can set the app to automatically delete voice recordings after three or 18 months.

For heavy Alexa users, going through all of these commands to find egregious conversations to delete might be too much work. But if you're nervous about what the Echo has been listening to you say, it may be worth browsing to make sure it hasn't recorded something you don't want transmitted elsewhere.

Update November 21st, 2019, 11:20AM ET: This article was originally published on May 28th, 2018. It has been updated to include the latest Alexa privacy settings on the Alexa app and on the web. Vox Media may earn commissions for products purchased via affiliate links; these do not influence editorial content. For more information, see our ethics policy.