Giving Compass' Take:

• Colin Horgan at Mashable discusses how Google, Amazon, and even Apple want to use recordings from their devices to collect data from inside the home. This can be useful to society in some ways, such as helping to solve crimes, but it raises serious questions about invasions of privacy. 

• What happens when useful technology becomes too personal? How do we balance privacy with artificial intelligence, and when does "collecting data" become too invasive? 

• Here's an example of AI voice assistants trained to improve health care. 


A few days before Christmas 2018, a German man downloaded his personal information from Amazon. In response, Amazon sent him 1,700 Alexa voice recordings. The only problem was they belonged to someone else.

The customer shared the recordings with German tech magazine C’t, which chose not to name him. “Suddenly, we found ourselves in the intimate sphere of strangers without their knowledge,” the magazine’s staff wrote. According to NPR, the magazine claimed that on the recordings “a man could be heard in various parts of his home, even in the shower,” and that “there were alarm clock and music commands, weather questions and also comments related to work or living habits.”

When Reuters contacted Amazon, the company said the mistake was “the result of human error” and “an isolated single case.” This is likely true: Amazon wouldn’t have much interest in sharing Alexa recordings between customers. But in the very near future, Amazon, as well as other tech giants investing in computer assistants (Google, Apple, and Facebook among them), will be analyzing those recordings more closely for their own purposes. And while they’ll still be interested in what we say to Alexa, Assistant, or Siri, they’ll be just as interested in what we’re not saying — the background noise, the “intimate sphere” of our homes.

That’s where the next privacy battle will be waged.

Read the full article on how the next privacy war will be fought in our homes by Colin Horgan at Mashable.