Fandango and Facebook Just Violated My Privacy

I just bought tickets from Fandango to see a movie with my girlfriend later this week. On the order confirmation screen, I noticed a Facebook-looking message pop up and then quickly disappear. I whipped out one of my clever hacking tools and made it appear again:

Fandango Publishing Facebook Story

Yes, I see the “No Thanks” link, but the whole dialog was visible for no more than two seconds. Definitely not enough time for me to read it, process what was going on, and act appropriately.

Instead of defaulting to publishing a story to my Facebook profile, Fandango should default to asking me if it can do such a thing and wait for an explicit answer either way. Once I confirmed, maybe it could play this trick again the next time I buy a ticket without me wondering what just happened.

I’m afraid there are going to be more grievous misuses to come.

Update: this new Facebook feature has been getting a lot of press and backlash. So much so that Facebook has now changed the design from opt-out to opt-in…sort of. See my Google Reader Shared Items (via the right-hand navigation) to follow the story.

Closed-Ended Feedback

One of eBay’s strengths is its Feedback Forum, where users are able to give open and honest feedback to each other after transactions are completed. The feedback results are open to all who want to view them. Users can discern whether or not they should buy from or sell to others based on prior feedback from the community. It is recorded using a score metric, which can be set to “positive,” “neutral,” or “negative.” Feedback also includes an open-ended text entry so that users can articulate the specifics behind their score selections.

eBay assumes that a user is inherently neutral–that is, he is not good or evil. His lifetime Feedback Score starts off at zero (0). Whenever he receives a positive feedback post, his score increases by one; when he receives a neutral feedback post, it remains unchanged; and when he receives a negative feedback post, it decreases by one. All the while, a Positive Feedback percentage is calculated–much like a test grade in school.
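The scoring rule above is simple enough to sketch. Here is a minimal illustration in Python; the function names are mine, and the percentage is computed over all posts, per the test-grade analogy (eBay’s exact formula may differ):

```python
# Illustrative sketch of the lifetime Feedback Score described above.
# Function names and the percentage formula are assumptions, not eBay's code.

def feedback_score(posts):
    """+1 per positive post, 0 per neutral, -1 per negative."""
    delta = {"positive": 1, "neutral": 0, "negative": -1}
    return sum(delta[p] for p in posts)

def positive_percentage(posts):
    """Positive posts as a share of all posts, like a test grade."""
    if not posts:
        return 0.0
    return 100.0 * posts.count("positive") / len(posts)
```

A brand-new user starts at a score of zero, and each post nudges the score by at most one in either direction, exactly as described above.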

eBay Feedback Profile

The beauty behind the eBay feedback model is its ability to convey whether or not a user can be trusted and also to what degree the entire community agrees in that trust. All things equal, if given a choice to purchase from a user with a Feedback Score of 2 or another of 157, most will choose to buy from the latter. The only problem with the Feedback Score display is the star graphic, which is somehow tied to it. I have been using eBay for almost a decade and its variations still mean nothing to me. It only adds noise.

The more serious flaw is the open-ended text feedback. Users have to manually skim textual entries to get a feel for why someone has been given a particular score. Many entries add little or no value to the scores they describe. When every seller and buyer on eBay has “A+++++++++++++!!!!” entries, the playing field is leveled inappropriately. Good textual feedback typically falls into one of four categories: customer service, promptness of delivery, quality of good sold, and whether the purchaser would buy from the seller again.

eBay Positive Comments

If the textual responses were closed-ended instead, the feedback system could provide a clearer picture of why a user is getting the Feedback Score he is getting by calculating totals in each category. For example, this particular user had a history of sending imitation products. Most users still gave positive feedback because everything else was stellar, including situations where products were returned. If the quality of good sold category had a low score, those only interested in genuine products would steer away from this seller. Feedback would be specifically aggregated and useful.
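As a sketch, a closed-ended system could record each post as a (category, score) pair and tally them per category; the category names below paraphrase the four listed earlier and are only illustrative:

```python
from collections import Counter

# Hypothetical closed-ended feedback: each post is a (category, score) pair,
# with the same positive/neutral/negative choices as the overall score.
CATEGORIES = ("customer service", "delivery", "quality of good", "would buy again")

def aggregate(posts):
    """Tally positive/neutral/negative selections within each category."""
    totals = {c: Counter() for c in CATEGORIES}
    for category, score in posts:
        totals[category][score] += 1
    return totals
```

A buyer interested only in genuine products could then look straight at the “quality of good” tally instead of skimming hundreds of “A+++++” comments.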

Another benefit to closed-ended feedback is the prevention of flame wars, where users participate in mutual verbal attacks on character. Flame wars are subjectively blind and often heated by emotion rather than reason.

eBay Negative Comment

They divide communities and make them unappealing to outsiders. Closed-ended feedback options avoid flame wars by keeping discussions objective.

Good metrics are devoid of emotion, and good metrics result in better decisions.

Analogies of a Parking Violation, Part Two: Governing Communities

Community governance was the second nerd thought that came to mind as I was soaking and scraping the parking violation sticker off my vehicle. Rules and guidelines exist within any reasonable community. The fun is in how strict they are and how they are enforced. My community’s homeowners association enforces rules centrally–it is the one that calls the shots and levies punishments. The community doesn’t have much say in individual cases. Online communities, however, can handle individual cases as a community, resulting in better monitoring, better decision making, and better enforcement.

Many sites have terms of use. Sites assume that typical uses are valid but provide a way for users to report misuses. Facebook, for example, has a “Report This Photo” link whenever you view an album image. If an image is reported, it is inspected by a Facebook team member who makes a final decision on whether the photo stays or goes (source: http://www.facebook.com/help.php?tab=safety#ansj7).

Facebook Report This Photo

This technique was first popularized by the dating site HOTorNOT around 2000. One of the site’s founders, James Hong, originally hired his parents to screen flagged photos so he could continue coding. James quickly realized that the enforcement model had two problems. First, it didn’t scale well as the number of photos on the site increased exponentially–he needed to hire more people. Second, his parents were looking at inappropriate pictures eight hours a day.

Over the past seven years, the site has slowly matured from a centralized moderation system to a decentralized one consisting of volunteers. There is a nice explanation on Wikipedia of the site’s implementation of the principles found in The Wisdom of Crowds–a book that discusses how decentralized decision making results in better decisions. Although effective, the system requires volunteers who are willing to subject themselves to potentially vile images. In addition, as addressed in The Wisdom of Crowds, judgments rendered by appointed individuals do not accurately reflect the values of the community. We need a solution that relies on the community itself to make judgments.

Digg is a popular news and media aggregation site that thrives on democracy. Readers vote for or against published content. Higher-ranked content gets more exposure, while other content gets buried. One of its weaknesses, though, is its susceptibility to the mob effect.

A community-based solution to the “Report This Photo” feature would be a voting mechanism that kicks in once an image has been flagged as a violation of the terms of use. When community members stumbled upon a flagged image, they would be given the option to vote for or against it. Once a certain threshold had been met (set relatively low), the image could be flagged as appropriate or inappropriate permanently. Inappropriate images would be blurred beyond human recognition or removed completely. Over time, those who voted in line with the community’s final decisions could be given weighted votes to expedite future judgment calls. Such weighted voters would have more influence on future cases not because they are considered experts on morality but because they make judgments that best reflect the entire community.
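A rough sketch of that judgment flow might look like the following; the class, the threshold of 10, and the tie-breaking rule are all my own assumptions for illustration:

```python
# Sketch of the proposed community judgment on a flagged image.
# THRESHOLD and the tie-breaking rule are assumptions, not a real system.
THRESHOLD = 10.0

class FlaggedImage:
    def __init__(self):
        self.keep = 0.0
        self.remove = 0.0
        self.decision = None  # stays None until the threshold is met

    def vote(self, keep, weight=1.0):
        """Record one weighted vote; a trusted voter would pass weight > 1."""
        if self.decision is None:
            if keep:
                self.keep += weight
            else:
                self.remove += weight
            if self.keep + self.remove >= THRESHOLD:
                self.decision = ("appropriate" if self.keep >= self.remove
                                 else "inappropriate")
        return self.decision
```

Because weighted voters contribute more than 1.0 per vote, a flagged image judged by a few proven-reliable members reaches the threshold in fewer votes, which is exactly the expedited judgment described above.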

In order to provide a truly decentralized judgment system and avoid the mob effect, the vote tally would need to remain hidden. Taking this idea further, “Report This Photo” could simply be a facade for the voting system so that flagging and voting are truly blind. Viewing a photo without clicking the link would be a vote for the photo to remain on the site, and viewing it with a “Report This Photo” click would be a vote to remove it. Obviously, views would be tracked once per user. (Maybe Facebook is doing all of this already but just hasn’t updated its help documentation.)
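The facade could keep its tally with something like the sketch below (the structure and names are mine): a plain view counts once as an implicit keep vote, and a report overrides that viewer’s implicit vote.

```python
# Sketch of the blind facade: a view is an implicit "keep" vote,
# a "Report This Photo" click is a "remove" vote, one vote per user.
class BlindTally:
    def __init__(self):
        self.votes = {}  # user_id -> True (keep) or False (remove)

    def viewed(self, user_id):
        self.votes.setdefault(user_id, True)  # only the first view counts

    def reported(self, user_id):
        self.votes[user_id] = False  # a report overrides the implicit vote

    def tally(self):
        """Hidden running totals as (keep, remove)."""
        keep = sum(1 for v in self.votes.values() if v)
        return keep, len(self.votes) - keep
```

Since the tally never surfaces in the interface, no voter can see which way the crowd is leaning, which is what keeps the judgment blind.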

If my real-life housing community were self-governed (like my parents’ neighborhood), monitoring and reporting would be handled by the community. If what I was doing were truly an inconvenience to the community, it would act appropriately. Punishment would still need to be levied by the association, but it would be more in line with what the community deems appropriate.