
OKC experimented with the matching process -- the express purpose of their site. Facebook experimented with mood manipulation -- something they have no permission to do -- regardless of the fine print. Facebook is a very bad landlord, but people don't want to move.


But Facebook already displays posts based on an algorithm; it doesn't do it chronologically, and it's actually almost impossible for me to find a specific post in my timeline. Especially on the mobile site, it's very common that I see something in the moment between opening the app and my timeline updating, and then I'm never able to find that post again.

Facebook is already making the decision as to whether to show you a post or not, and they make that decision based on what they think you will want to see and what you will engage with, all of which is fuzzy and subjective. They already manipulate what you see to encourage engagement; for example, if their algorithm detects that your engagement is dropping and you're about to leave, they'll show other people your profile and ask 'do you know this person?', because someone adding you as a friend boosts your engagement.

So if you think that showing you posts which are happy or sad is manipulative, be aware that Facebook is already filtering your potential-friends list and showing you people solely to boost engagement (either yours or theirs). Looked at another way, this means that it's possible their algorithm isn't currently showing you people you might know because it's not as beneficial for them to make that connection for you yet.

So the purpose of Facebook's site is to get people to interact and generate activity, and now they're experimenting with that: which posts do we show? Which do we hide? They haven't shown every post for ages, so this is just a tweak to the algorithm they were already running.

What's really interesting here is how this could actually be used for good. Someone feeling crappy? Show them fewer negative posts and more positive posts. Maybe that will help. Show negative posts less often, and make society in general a little more positive.


Facebook didn't experiment with "mood manipulation" any more than OKC did. Facebook experimented with changing the site layout, which content to show, and how much of it. Exactly the same stuff all sites test.

Facebook did theorize that the changes it was making would affect people's moods in a certain way. That might make it sound like intentional mood manipulation.

But OKC's changes also affected people's moods. More than Facebook's, I would guess.

All these sites are constantly experimenting on humans. That's what changing the site content means. Everything that affects us affects our moods and everything else.


I'd also note that the proxy they are using ("apparent mood of subsequent content") hasn't been shown to correlate with the subjects'/users' actual mood.

All the FB experiment actually shows is that by manipulating the mood of content seen, you can affect the mood of content produced.

Here's a couple of contrarian hypotheses: "when some users see more 'happy' content, they feel worse about themselves in comparison, but post more 'happy' content to pretend that isn't so." Or, "when some already-sad users see more 'sad' content, this does not affect their mood directly, but does give them tacit permission to share how they are already feeling. Subsequently, their mood actually improves."
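The proxy issue above is easy to see concretely. The Facebook study relied on counting emotion words in posts (LIWC-style word counts), which measures the mood of the *content*, not of the person. Here is a minimal sketch of that kind of proxy; the word lists are tiny illustrative stand-ins, not the real LIWC dictionaries:

```python
# Minimal sketch of a LIWC-style word-count mood proxy.
# The word sets below are made-up examples, not real dictionaries.
POSITIVE = {"happy", "great", "love", "fun", "awesome"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post by counting positive vs. negative words."""
    words = [w.strip(".,!?'\"") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("I love this, it's awesome!"))   # positive
print(classify_post("What a terrible, awful day."))  # negative
```

Note that both contrarian hypotheses above are fully compatible with this proxy: a user can post "positive" words while feeling worse, and the classifier has no way to tell the difference.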


They experimented through misrepresentation. The misrepresentation is the problem, not the experimentation.

If your bank experimented with not completing transactions or your email provider experimented with not delivering emails, the problem wouldn't be that they experimented.


'Match percentage' is a fuzzy, subjective measure. To a large extent it's asking whether OkCupid thinks two people having a conversation is a good idea, which basically can't be a lie.

And they're doing it to fight against users being hideously misrepresentative, which is hilarious.


The percentage is insanely fuzzy.

I often endure rage fits from one of my buddies, who shows me case after case where, for example, he answered "often" to a question a lady answered "usually," resulting in a mismatch.

The only good things about it are the key individual questions that let you judge someone's intelligence and determine if they're racist. The scalar number is a crock.
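For what it's worth, the "often" vs "usually" mismatch falls straight out of how OkCupid has publicly described the score: you only earn a question's importance points if your answer is in the set the other person marked acceptable, and the two satisfaction scores are combined as a geometric mean. A hedged sketch (the exact point weights are the commonly cited ones; treat them as assumptions):

```python
import math

# Sketch of OkCupid's published match-percentage scheme. You earn a
# question's points only if your answer is in the partner's accepted
# set; adjacent answers like "often" vs "usually" count for nothing.
WEIGHTS = {"irrelevant": 0, "a little": 1, "somewhat": 10, "very": 50}

def satisfaction(my_answers, their_prefs):
    """Fraction of their importance-weighted points my answers earn."""
    earned = possible = 0
    for q, (accepted, importance) in their_prefs.items():
        w = WEIGHTS[importance]
        possible += w
        if my_answers.get(q) in accepted:
            earned += w
    return earned / possible if possible else 0.0

def match_percent(a_answers, a_prefs, b_answers, b_prefs):
    s_ab = satisfaction(a_answers, b_prefs)  # how well A satisfies B
    s_ba = satisfaction(b_answers, a_prefs)  # how well B satisfies A
    return 100 * math.sqrt(s_ab * s_ba)      # geometric mean

# The rage-fit scenario: he answered "often", she answered "usually".
him = {"q1": "often"}
her = {"q1": "usually"}
him_prefs = {"q1": ({"usually", "often"}, "somewhat")}  # he accepts both
her_prefs = {"q1": ({"usually"}, "somewhat")}           # she accepts only "usually"
print(match_percent(him, him_prefs, her, her_prefs))  # 0.0
```

Because the combination is a geometric mean, a single zero on one side zeroes out the whole question's contribution, which is exactly the behaviour the buddy keeps hitting.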


If your pal is having "rage fits" about an online dating site, he shouldn't be on there in the first place.



