[Image: Mark Zuckerberg staring down a leaning tower of Trust, yelling, “We don’t need trust!”]

Facebook leadership recently decided to conduct a psychology experiment, with you as the test subject. They didn’t tell you this, but in essence, they tried to manipulate your emotions. It’s okay, though, because buried somewhere in the 9,000 words of their legal agreements, contracts, and use policies, it said they may use the information they receive from you for “research.” (UPDATE: Forbes discovered this term was added months after the study. So it turns out, maybe you didn’t agree to it after all.)

The experiment sought to influence subscribers’ emotions based on what Facebook allowed those users to see in their content streams. If a user saw mostly negative updates, would they, in turn, begin posting negative content? What if the stream content was mostly positive?

One Facebook employee, an author of the study, offered this moderate apology:

I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.

Yet Facebook leadership has deflected, pointing back to its terms and conditions and saying that what it did was covered by terms all users approved. Perhaps what they did, therefore, is legal. But is it ethical?

At the heart of the matter is this: can you trust Facebook leadership to treat you with respect or, at least, not cause you emotional harm? Following their string of recent trust breaches, it seems Facebook leadership does not value the trust of subscribers.

Question: What do you think? Was this an unethical breach of trust, and should subscribers expect leaders they can trust?