So apparently Amazon's Alexa doesn't spy on you, but she (he? Them? It?) will record your conversations and send them to people in your contact list (no Mom, I SWEAR I was talking about my friend's overly intrusive and controlling mother, not you!). A couple in Portland learned this the hard way when one of the several IoT devices in their home recorded their conversation and transmitted it to one of their employees in Seattle.
Reports indicate that once they were made aware of the issue, the couple immediately unplugged all of their devices and requested a refund from Amazon for each one, a battle they have not yet won. Amazon, while apologetic, has acknowledged that the device was programmed to inform the user before packaging and sending the recording, but it has given no indication that it will process the requested refund, nor admitted any wrongdoing on the speaker's end.
The statement released by Amazon, originally published by ZDNet, reads: "Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right'. As unlikely as this string of events is, we are evaluating options to make this case even less likely."
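Amazon's account is essentially a chain of state transitions: wake word, intent, contact slot, confirmation. To see why four individually unlikely mishearings can still end in a sent message, here is a minimal sketch of that flow. Everything in it (the state names, the substring matching, the `process_utterances` function itself) is a hypothetical illustration, not Amazon's actual implementation:

```python
def process_utterances(utterances, contacts):
    """Walk a simplified wake-word -> intent -> slot -> confirm pipeline.

    Each step only needs ONE misheard phrase to match before the pipeline
    advances, so the chain never has to be re-confirmed as a whole --
    which is how a string of low-probability errors can still send audio.
    """
    state = "IDLE"
    contact = None
    for heard in utterances:
        heard = heard.lower()
        if state == "IDLE" and "alexa" in heard:           # wake-word match
            state = "AWAKE"
        elif state == "AWAKE" and "send message" in heard:  # intent match
            state = "NEED_CONTACT"
        elif state == "NEED_CONTACT":
            # background speech interpreted as a name in the contact list
            match = next((c for c in contacts if c in heard), None)
            if match:
                contact = match
                state = "CONFIRM"                           # "<name>, right?"
        elif state == "CONFIRM" and "right" in heard:       # misheard "right"
            return f"message sent to {contact}"
    return "no message sent"
```

Feed it innocent background chatter that happens to hit all four triggers, e.g. `process_utterances(["...alexa...", "we should send message boards...", "ask john about it", "yeah, right"], ["john"])`, and the final state is a sent message, even though no step was intended as a command.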
With several other contributing factors in play (was the Echo plugged into an external speaker? Is this an issue with the Alexa speaker product line, this specific model, or this specific unit? Does the distance between endpoints matter, as the original KIRO story suggests?), we may never know exactly how the error occurred. What is clear is that the technology is flawed. Given several other reported incidents of the speakers acting erratically, owners may want to make sure they are out of earshot of their devices before discussing any sensitive information.
Viva La Technology, right!?!