Chrissy Clary

ponder. conspire. digitize.


How does trust work?

Anyone with a little determination can build a website, set up a Twitter account, and position themselves as a trustworthy corporation. We as users may be spending too little time considering the objectivity and sources of the information we receive and believe.

My son (a twelve-year-old gifted student and self-proclaimed gamer) and I were discussing trust on the Internet. He is in middle school and in the early stages of developing his junior researcher skills.

Hoping he had learned a thing or two about researching an idea and trying to find reputable information, I asked him how he knows if something he finds on the Internet should be trusted.

He confidently explained that the best way to know if something is true is to look at the search results page. If there are several entries that appear to support a particular idea, then most likely it is true. He also shared with me that one of his teachers reassured his class that Wikipedia is a trusted resource because they only let people with PhDs become editors on the site. I thought “Oh, crap.”

To set the record straight, a PhD is not in fact a requirement to be a Wikipedia editor. According to the Who Writes Wikipedia? page on the nonprofit’s site, all it really takes to be a Wikipedian is a little bravado: “Yes, anyone can be bold and edit an existing article or create a new one, and volunteers do not need to have any formal training” (Wikipedia, 2017).

When I tried to explain that much of what we see is being presented to us algorithmically based on a code developed by someone at a corporation such as Google, my son assured me that the people at Google can be trusted. I asked him why he would trust people he has never met, and he responded, “Why would they lie to us?”

Michael Patrick Lynch discussed this idea in his 2016 book, The Internet of Us, writing that “an organism’s default attitude toward its receptive capacities—like vision or memory—is trust.” In other words, it is natural to trust what we see or what someone tells us; our default is to believe.

Lynch also explains that if we are not knowledgeable on a topic, we may choose to look for answers from someone with expertise in the domain, but he stresses the need for reference checks.

Of concern is how we consume information today. With more than 60 percent of adults getting their news from social media, according to a study by the Pew Research Center, it is important that users understand that not all facts are checked and that there is no governing body for the content published to the Internet.

Lynch makes the point that “digital media gives us more means for self-expression and autonomous opinion-forming than humans have ever had. But it also allows us to find support for any view, no matter how wacky” (2016). This last bit is the concerning part because it suggests that no matter what your particular truth is, “no matter how wacky,” you could find verification for the idea on the Internet.

On November 19, 2016, Mark Zuckerberg, founder and CEO of Facebook, posted a message to Facebook users about “misinformation” on the site and outlined what the corporation plans to do moving forward to reduce the prevalence of fake news within the social media network:

The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.

I don’t believe we have the maturity as a species to reason through all the information that is being presented to us in a logical, fact-finding way. We used to pay people to do that for us—they were called journalists. Today, we are still believing information the same way we always have and perhaps are taking it at face value and swallowing it whole. Is this ignorance or laziness? Both? Neither?

Next Read: Who and What Do We Trust?


Lynch, M. P. (2016). The Internet of Us: Knowing More and Understanding Less in the Age of Big Data. New York: Liveright Publishing Corporation.

Wikipedia. (May 4, 2017). Who Writes Wikipedia? Wikipedia. Retrieved from

Zuckerberg, M. (November 19, 2016). Facebook. Retrieved from

What is the effect of algorithmically delivered news?

Algorithmically delivered news, based on user actions, is reducing the diversity of the news we receive and increasing the echoing of ideas.

I was lucky enough to work in a real newsroom right before the industry started to tumble. Every day, the editors gathered at 10 a.m. to discuss what would make the paper the next day. It was not unusual for a heated argument to erupt over what story should be featured. These people believed in what they were doing, they wore their ethics on their sleeves, and they cared deeply about the community they influenced. They were the gatekeepers.

Today, much of our news is delivered through social media platforms. According to the Pew Research Center, 62 percent of US adults get their news from social media (Gottfried and Shearer, 2016):

News plays a varying role across the social networking sites studied. Two-thirds of Facebook users (66%) get news on the site, nearly six-in-ten Twitter users (59%) get news on Twitter, and seven-in-ten Reddit users get news on that platform. On Tumblr, the figure sits at 31%, while for the other five social networking sites it is true of only about one-fifth or less of their user bases (Gottfried and Shearer, 2016).

[Chart: Reddit, Facebook and Twitter users most likely to get news on each site]
The thing is, there is no daily editor meeting at Facebook. There are no local groups of people or community members deciding what is important. The news is delivered to you algorithmically, based on a variety of data points gathered by that particular platform to identify your likes, friends, interests, and actions. Such formulas for news delivery take only you into account. Does this method really give you the information you need to be a healthy, contributing part of a local community? Isn’t that what the editors were doing?
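To make the idea concrete, here is a minimal sketch of interest-based feed ranking. Every signal name and weight below is a hypothetical illustration, not Facebook’s actual formula: the point is simply that scoring stories against a user’s past behavior pushes familiar content to the top.

```python
# Toy interest-based feed ranking. All signals and weights are
# hypothetical; this is not any real platform's algorithm.

def score(story, user):
    """Score a story by how well it matches past engagement."""
    s = 0.0
    if story["author"] in user["friends"]:
        s += 3.0                                   # posted by a friend
    s += 2.0 * user["topic_clicks"].get(story["topic"], 0)  # past clicks
    if story["topic"] in user["liked_topics"]:
        s += 1.0                                   # explicitly liked topic
    return s

def rank_feed(stories, user):
    """Sort stories so the highest-scoring appear first."""
    return sorted(stories, key=lambda st: score(st, user), reverse=True)

user = {
    "friends": {"amy"},
    "liked_topics": {"gaming"},
    "topic_clicks": {"gaming": 4, "politics": 1},
}
stories = [
    {"author": "bob", "topic": "science"},
    {"author": "amy", "topic": "politics"},
    {"author": "bob", "topic": "gaming"},
]
feed = rank_feed(stories, user)
# The gaming story (score 9.0) outranks the friend's politics story (5.0)
# and the science story (0.0): the feed mirrors past behavior.
```

Nothing here is malicious; each weight is a reasonable relevance signal. The narrowing comes from the aggregate: stories unlike anything you have clicked before score near zero.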

In a video posted in the Facebook Newsroom, Adam Mosseri, VP of product management for News Feed, explains how News Feed works. He also comments that the goal is to “connect people with the stories that matter most to them” (Mosseri, 2016).

In 2016, Facebook updated the News Feed algorithm. Now, “what you see will depend more on who your friends are, what they share, what you click on” (Sunstein, 2017).

News Feed uses data on its users to make decisions about what those users most likely want to read. Thus, News Feed is helping us sort through thousands of articles and delivering exactly what we want, when we want it (Mosseri, 2016). That doesn’t sound so bad, right?

In an attempt to deliver the news you are most likely to interact with, News Feed appears to be strengthening the echo-chamber effect. With regard to media, the echo-chamber effect occurs when your opinions and preferences are echoed back at you.

As a point of reference, an echo chamber can be described as “a bounded, enclosed media space that has the potential to both magnify the messages delivered within it and insulate them from rebuttal” (Jamieson and Cappella, 2010).
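The feedback loop behind this can be shown in a few lines of code. This is a purely illustrative toy model (a classic reinforcement process, not any real platform’s code): the feed shows topics in proportion to past clicks, each click reinforces the topic shown, and the click history drifts away from its uniform starting point.

```python
# Toy echo-chamber simulation: showing what was clicked before,
# then counting the resulting click, reinforces itself.
# Purely illustrative; not modeled on any real platform.
import random

random.seed(0)
topics = ["politics", "science", "sports", "gaming"]
clicks = {t: 1 for t in topics}  # start with equal interest in every topic

def pick_story():
    # A topic's chance of being shown is proportional to its past clicks.
    return random.choices(topics, weights=[clicks[t] for t in topics])[0]

for _ in range(200):
    shown = pick_story()
    clicks[shown] += 1           # the user clicks what they are shown

share = max(clicks.values()) / sum(clicks.values())
# 'share' is the fraction of all clicks held by the dominant topic.
# It typically ends well above the 0.25 an even split would give,
# even though all four topics started exactly equal.
```

The only ingredient needed is the loop itself: recommend from history, add the result back to history. No editor ever decided that the other topics mattered less.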

In an opinion piece for Wired Magazine, Kartik Hosanagar, a professor at the Wharton School of the University of Pennsylvania, calls echo chambers problematic because “social discourse suffers when people have a narrow information base with little in common with one another” (Hosanagar, 2016).

In his book #Republic, Cass Sunstein cites research by Facebook employees that appears to indicate that the algorithms are responsible, in part, for our political echo chambers: “Evidence shows the algorithm suppresses exposure to diverse content by 8 percent for self-identified liberals and 5 percent for self-identified conservatives” (2017).

In his book The Internet of Us, Michael Patrick Lynch raises the concern that only reading about the things we already agree with is giving rise to “group polarization – that we are becoming increasingly isolated tribes” (2016).

Wasn’t the Internet supposed to open us all up to new people and cultures? It appears the opposite is happening. We are being profiled based on our online actions. Without proactive steps on the part of the user to counteract these effects, it is possible that our scope of knowledge and understanding will shrink.

Next Read: What data do they have on us?


Mosseri, A. (April 22, 2016). Adam Mosseri explains how the Facebook News Feed works [Video]. Facebook Newsroom. Retrieved from

Lynch, M. P. (2016). The Internet of Us: Knowing More and Understanding Less in the Age of Big Data. New York: Liveright Publishing Corporation.

Gottfried, J., & Shearer, E. (May 26, 2016). News Use Across Social Media Platforms 2016. Pew Research Center, Journalism & Media. Retrieved from

Hosanagar, K. (November 25, 2016). Blame the Echo Chamber on Facebook. But Blame Yourself, Too. Wired. Retrieved from


Love, Choice and Listening: Tips for reducing frustration in the design process

I had an employee once who diligently worked on each design and architecture plan. She functioned as part architect and part UX strategist for the team. Working from home, she would spend hours perfecting one unadulterated proposal. The problem arose when she presented her plan to the customer: the meetings dissolved into agitated debates over best practices versus customer wants.

Often she had not spent much time hashing out the customer’s needs and wants. Instead she trusted only what she had learned from books and experts. She would grow agitated with the customer and what she saw as their stupidity. How could they not recognize the gift she had just presented? Did they not understand the amount of time and knowledge that had been invested?

This is not a unique story in the business of communication design, but here are a few ideas for overcoming it.

  1. Don’t fall in love with your work. For the creative type, that can be easier said than done. But you will find, in the long run, that if you are open to critique and criticism, your work will improve quickly. Everyone needs a good editor, and if you are open to the feedback of both experts and laymen, your perspective will be enhanced along with your talent. Yes, it does sting a little, especially when you are first starting out, but over time your skin will become thicker.
  2. Always design two options. Not 3 or 15, just two. Too many options can paralyze your decision maker. Design two options that you are happy with and proud of. When you present two options, the customer may feel they have some control and voice in the design process, leading to increased buy-in. Together you can work through both designs and select the best bits from each. Designing a second option is also important for stretching yourself as a designer. While you may think the first design was perfect, you may be pleasantly surprised at what you come up with when you force yourself to create a second version.

For more on choosing: Too Many Choices: A Problem That Can Paralyze, written by Alina Tugend and published by The New York Times

  3. For goodness sake, you are not designing the thing for you, so listen to the people who hired you. And don’t just listen; watch and observe a little too. Get up close and cozy with your customer’s process, who they are, and what they need. It is your job to create a solution to their problem, so make sure you fully understand the problem and then apply your expert opinion and best practices to it. Just don’t forget that while your understanding of UX or communications may far exceed theirs, their understanding of the business they run and the needs of their customers far exceeds yours.

More on listening: Why You Should Listen to the Customer, by Braden Kowitz, published by The Wall Street Journal

© 2018 Chrissy Clary
