The new Amazon mobile app for the iPhone is excellent – well-designed, a great mix of iPhone and Amazon visuals, and easy to use. Despite the lack of Gold Box integration, I’ll use it all the time.
Amazon Remembers – a place to take photos and have them stored online – is a neat idea too, though it obviously overlaps with dozens of other photo-storage services. It also offers product identification: take a picture of something and have people tell you what it is.
Remembers, though, breaks the social contract: it takes pictures that reasonable users might assume will be private, and makes them public.
As has been documented online, when you take a picture and upload it to Amazon Remembers, it’s sent out to Mechanical Turk workers for product identification. That makes sense – product identification is hard, and Mechanical Turk is great for these kinds of problems – except that:
- People might use Amazon Remembers for non-product images. The marketing materials talk about how you _can_ use it for product images, but never say that’s the only use of the service. Here’s the first page of the Remembers tab in the app:
And here’s the pre-picture screen:
Nothing on either screen says that the photo must be of a product, or that anyone besides you will see it.
A user who does click “What happens to my photos?” gets only partial information:
This does say that people look at the photo, but it’s not clear who those people are.
- Every picture is then visible to strangers via Mechanical Turk. There’s no obvious pre-screening of images to make sure they’re acceptable. Keep in mind that Mechanical Turk workers are not Amazon employees; they simply can’t be expected to keep private images private. To test this, I took an image of a young-looking, not-immediately-recognizable Dakota Fanning, added a fake name and address to it, took a photo of that image, and submitted it to Amazon Remembers. Here’s the image:
And here’s the Mechanical Turk HIT that appeared less than 15 seconds after the image was uploaded:
So to review: I posted a picture of a minor with a name and an address, and that picture showed up on Mechanical Turk, in front of strangers, immediately. (To their credit, the Turkers who saw this image did recognize it as Dakota Fanning, and sent me to this head shot.)
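For the curious, the pipeline this experiment reveals can be sketched in a few lines. Everything here is an illustrative assumption – the bucket URL, the HIT parameters, and the helper name are hypothetical, not Amazon’s actual implementation – but it shows the essential point: the raw photo URL goes straight into a publicly visible Mechanical Turk question, with nothing in the path where a pre-screening step would sit.

```python
# Hypothetical sketch of how an uploaded Remembers photo might be routed
# to Mechanical Turk. The URL and all HIT parameters below are invented
# for illustration; only the ExternalQuestion XML format is real MTurk API.
from xml.sax.saxutils import escape


def build_identification_question(image_url: str) -> str:
    """Build the ExternalQuestion XML payload for a HIT that asks
    workers to identify whatever is shown at image_url."""
    return (
        '<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">'
        f'<ExternalURL>{escape(image_url)}</ExternalURL>'
        '<FrameHeight>600</FrameHeight>'
        '</ExternalQuestion>'
    )


# Hypothetical upload URL -- note the photo flows into the HIT unscreened.
question_xml = build_identification_question(
    "https://example.com/remembers/uploads/photo123.jpg"
)

# With live AWS credentials, the HIT itself would then be created, e.g.
# via boto3 (parameters are illustrative):
#
#   import boto3
#   mturk = boto3.client("mturk")
#   mturk.create_hit(Title="Identify this product",
#                    Description="Tell us what product is in this photo",
#                    Reward="0.05", MaxAssignments=1,
#                    AssignmentDurationInSeconds=300,
#                    LifetimeInSeconds=86400,
#                    Question=question_xml)
print(question_xml)
```

The design point to notice is that between the upload and the `create_hit` call there’s no review step of any kind – which is consistent with my photo appearing in front of strangers within 15 seconds.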
The violation is that people will assume their memories are private. This isn’t the same as Amazon’s Customer Images, for example, where customers know, when they use the feature, that their images will appear on the product page.
The principle is pretty simple: if you’re going to take something that people might assume to be private and make it public, you have to be explicit about it. This approach is either poorly thought through or sneaky; either way, it’s a violation of customers’ trust in Amazon. The problem could be seriously reduced with much clearer messaging, or eliminated with review by Amazon employees (which almost certainly won’t happen).
I don’t know how large this problem is. Of the ~30 submitted images I’ve looked at, only one was an obviously personal image, and it had no (obvious) identifying info – though there could be metadata embedded in the image. But the people picking up the app now are likely early adopters, who are perhaps more likely to be well-informed or to read the fine print.
(Disclosure: I worked at Amazon for four years, working on customer-facing features that dealt with similar issues.)