Amazon Collects and Stores Everything You Say to Alexa — Here’s How to Delete

Source: The Mind Unleashed

If you’re one of the millions of people who own a device with Amazon Alexa, the company’s helpful personal assistant (or mobile spying device, depending on whom you ask), whatever you say may be recorded – especially if someone in the house is named Alexis, Alex, or Lexi.

According to Amazon’s Alexa terms of use, the company collects and stores most of what you say to Alexa – including the product’s geolocation along with your voice instructions, reports CNBC’s Todd Haselton.

Your messages, communication requests (e.g., “Alexa, call Mom”), and related instructions are “Alexa interactions,” as described in the Alexa Terms of Use. Amazon processes and retains your Alexa Interactions and related information in the cloud in order to respond to your requests (e.g., “Send a message to Mom”), to provide additional functionality (e.g., speech to text transcription and vice versa), and to improve our services. – Amazon Alexa Terms of Use

Does Alexa record everything? Not according to Amazon, which says that devices such as the Echo only begin “listening” when they hear the wake word, “Alexa.” Could Alexa be remotely switched on or hacked to surveil a target? Well – we know these devices have been hacked to eavesdrop, and we know the government has been using personal cell phones as “roving bugs” for years – so it stands to reason that an Amazon listening device could be used against its owner.

Wrong, Brian (@leonidasmoderus):

Alexa, are you spying on me?

Alexa: *coughs* No, of course not.

Two weeks ago, for example, a New Hampshire judge ordered Amazon to turn over two days’ worth of Amazon Echo recordings related to the January 2017 murder of two women, in the hope that they may yield useful evidence in the case. The search warrant, obtained by TechCrunch, says that there is “probable cause to believe” that the Echo picked up “audio recordings capturing the attack.”

And in other spooky news, in May of this year an Amazon Echo recorded a conversation between a husband and wife, then sent it to one of the husband’s phone contacts. Amazon claims that during the conversation someone used a word that sounded like “Alexa,” which caused the device to begin recording.

“Echo woke up due to a word in background conversation sounding like ‘Alexa,’” said Amazon in a statement. “Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’. As unlikely as this string of events is, we are evaluating options to make this case even less likely.”

The wife, Danielle, told KIRO7, however, that the Echo never requested her permission to send the audio. “At first, my husband was like, ‘No, you didn’t,’” Danielle told KIRO7. “And he’s like, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did!’”

StockCats (@StockCats):

“Ok Google, ask Alexa if Siri is listening in on my conversations”

Cortana – “if you aren’t doing anything wrong there’s nothing to worry about”


How to delete what Amazon has recorded:

According to CNBC’s Haselton, you can delete Alexa conversations maintained on Amazon servers (which we’re sure are totally gone forever):

You can listen to and delete this information through the Amazon Alexa app on your iPhone or Android phone. Here’s how:

  • Open the Alexa app on your phone.
  • Tap the menu icon on the top-left corner of the app.
  • Tap “Settings” at the bottom.
  • Tap “Alexa Account” at the top of the page.
  • Select “History.”

You’ll see a list of all of your interactions with Alexa. You can tap each one to listen to the recording, or to delete the recording from Amazon’s cloud.

Unfortunately, if you use Alexa as frequently as I do, the list of recordings is far too long to scroll through and delete each voice request one by one. So, if you want to delete everything at once, do this:

  • Visit Amazon’s Device page.
  • Select the menu button to the left of the Echo device you’d like to manage. (The menu button looks like three little dots stacked on top of one another.)
  • Tap “Manage Voice Recordings.” A confirmation prompt will appear.
  • Tap “Delete.”

This lets you delete all of the recordings sent to Amazon by a specific Echo. –CNBC

You may or may not get a visit from a Bezos robot dog if you do this, so proceed at your own risk.


By Tyler Durden / Republished with permission / ZeroHedge.com

 

HALF Of Americans’ Photos Are Now Stored In Facial Recognition Databases

Source: Activist Post

By John Vibes

According to a new report by Georgetown Law’s Center on Privacy and Technology, half of Americans now have photos of themselves stored in facial recognition databases. The vast majority of these citizens are not suspects in crimes, nor do they have criminal records.

The report indicated that photos of over 117 million adults are stored in facial recognition databases, and any of these photos can be used at any time in a “virtual lineup,” where the people pictured can be picked out by law enforcement as potential suspects.

According to the American Civil Liberties Union (ACLU), many police departments use photos from Facebook, photos from protests, and even videos of average people walking down the street, taken from cameras mounted around urban centers. The report also indicated that driver’s license photos are used to populate these databases, meaning that almost anyone could be a potential suspect in one of these lineups.

The report’s findings, along with revelations from the ACLU on police monitoring in Baltimore, suggest that the technology may be violating the rights of millions of Americans and is disproportionately affecting communities of color, advocates said.

Alvaro Bedoya, executive director of Georgetown’s Center on Privacy and Technology, says that this new technology presents a massive privacy risk.

“Face recognition, when it’s used most aggressively, can change the nature of public spaces. It can change the basic freedom we have to go about our lives without people identifying us from afar and in secret,” he said.

The investigation pored through a year’s worth of police reports to determine how widespread facial recognition software has become.

Neema Singh Guliani, ACLU’s legislative counsel, pointed out that government agencies have free rein to do whatever they want with this technology, and that they offer absolutely no transparency.

“In the case of face recognition, there appear to be very few controls or safeguards to ensure it’s not used in situations in which people are engaged in first amendment activity,” Guliani said.

One of the most disturbing aspects of the recent report is the fact that these facial recognition databases have become “overwhelmingly made up of non-criminal entries.”

This can become extremely dangerous because the technology is far from perfect; in fact, mistakes are made all the time.

Even the FBI admits that one out of every seven searches of its facial recognition database is incorrect, meaning that innocent people are singled out on a regular basis. However, an independent investigation revealed that the number was far higher, with close to 90% of those identified by facial recognition technology being innocent people.
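
To see how a system can sound fairly accurate and still flag mostly innocent people, here is a minimal back-of-the-envelope sketch of the base-rate effect. Every number in it (crowd size, number of genuine suspects, accuracy rates) is a hypothetical illustration chosen for the sake of the arithmetic, not a figure from the FBI or the Georgetown report.

    # Minimal base-rate sketch – all numbers below are hypothetical, for illustration only.
    # Even a fairly accurate face-recognition system flags mostly innocent people
    # when almost everyone it scans is not actually a suspect.

    crowd_size = 10_000          # hypothetical: people scanned by street cameras in a day
    actual_suspects = 10         # hypothetical: people in that crowd who are genuinely wanted
    true_positive_rate = 0.99    # hypothetical: chance a real suspect is correctly flagged
    false_positive_rate = 0.01   # hypothetical: chance an innocent person is wrongly flagged

    true_alerts = actual_suspects * true_positive_rate
    false_alerts = (crowd_size - actual_suspects) * false_positive_rate

    total_alerts = true_alerts + false_alerts
    innocent_share = false_alerts / total_alerts

    print(f"People flagged:       {total_alerts:.0f}")
    print(f"Flagged but innocent: {false_alerts:.0f}")
    print(f"Share of flagged people who are innocent: {innocent_share:.0%}")

Under these made-up numbers, roughly nine out of every ten people flagged would be innocent – the same order of magnitude as the 90% figure cited above – simply because innocent people vastly outnumber real suspects in the pool being scanned.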

Considering the fact that this is fairly new technology, it is obvious that there will be plenty of bugs and imperfections. When dealing with a personal computer or video game system this is acceptable, but when dealing with people’s lives the risk is far too great.

John Vibes is an author and researcher who organizes a number of large events, including the Free Your Mind Conference. He also runs a publishing company that offers a censorship-free platform for both fiction and non-fiction writers. You can contact him and stay connected to his work at his Facebook page. John is currently battling cancer naturally, without any chemo or radiation, and will be working to help others through his experience. If you wish to contribute to his treatments, please donate here.

This article may be freely shared in part or in full with author attribution and source link.
