You’re not being paranoid if everyone is out to get you or if someone is eavesdropping on all your private conversations.
In fact, during the past year it has come to light that those smart voice assistants with friendly names like Alexa and Siri have been spying on their human owners.
So do you have something to worry about? More important, what can you do to protect yourself?
If you’re using Amazon’s Alexa, Apple’s Siri or Google’s Assistant, the services are always listening to you and your family. The voice assistants have to listen in order to respond to a trigger word — “Alexa!” — when you’re looking for answers or want something done.
And, yes, they do record what you say and usually keep those recordings.
Ostensibly, these smart speakers with built-in smart microphones do not have nefarious intentions.
They are listening only so they can get better at doing your bidding, and the past recordings of your dulcet tones help improve their accuracy in understanding what you’re saying. The recordings can be used to train the systems’ voice recognition to better anticipate context, and even to tell the difference between your questions and your partner’s commands.
At least that’s what the companies claim.
The problem is that those recordings are stored in the cloud, along with information — what you listen to, where you go, what news you tune in — that might be shared with other companies. Furthermore, deeply personal audio recordings of people’s private habits have been shared not with impersonal computer algorithms but with live human operators.
The human listeners were hired to make transcripts of the recordings to assess and help improve the programs’ accuracy. Some of those human listeners are the ones who spilled the beans on Alexa and Google Assistant.
Once the revelation became public, Apple and Google said they would temporarily suspend their human listening programs. Amazon decided to let users opt out of the human reviews.
The issue is definitely serious. It’s not just about targeted advertising.
Companies use such information, accurate or not, to determine how much you pay for insurance, what rate you’ll get on a home mortgage or whether you’ll get a job. So protecting your digital privacy is reasonable.
Here’s how to do it without giving up the convenience of a smart speaker.
Warning: Some of the steps take patience.
A quick solution is simply to turn off the microphone on these smart speakers.
You can also delete Google Assistant’s recordings individually or in groups. Better still, you can set it to delete such files regularly by going to the My Activity page and selecting Choose to delete automatically.
All of the companies continually update their programs, so some of these settings may change in the future. In each case, the companies will try to dissuade you from deleting recordings or opting out of the monitoring. They claim that disabling recording and clearing out your recordings on a regular basis will reduce the accuracy of the voice assistants in understanding what you are saying.
However, with continual testing I’ve found it doesn’t make much of a difference. The programs are not that sophisticated yet, so don’t be afraid to delete recordings.
Still, manually deleting recordings, like clearing the cache in your computer’s web browser, can be an onerous task. Of course, you can always ask your smart speaker, “Alexa, remind me to clear my recordings every Wednesday at noon.” Problem solved.
This article was written by John R. Quain for AARP.