Chrissy Clary

ponder. conspire. digitize.

Category: Strategy (page 1 of 2)

The Dilemma of the
Digital Strategist

Since 2001, I have been building devices designed to communicate using the Internet. I’m a digital strategist with a portfolio of websites and digital marketing campaigns.

I have grown curious about the effects of my work, and the work of others like me, on people around the world.

In an attempt to scratch this itch, I’ve been asking a lot of questions and digging around in the literature. The bumper sticker on my worn-out Honda CR-V reads, “I’m not lost, I am exploring.” That concept is the lens I used as I explored the idea that the Internet has had, and is still having, a profound influence on how we think, what we know, and how we relate to one another.

One concern I have centers on ethics, or the potential lack thereof, within the community of people working in digital strategy.

Corporations and large organizations now hold astounding amounts of data on all of us. With machine learning, algorithmic targeting, and carefully curated messaging, they are trying to deliver you messages that you will find engaging (Davis and Patterson, 2012).

Eric Schmidt explains how it works at Google in this interview on YouTube.

Here, Schmidt says that at Google, the idea “is to get right up to the creepy line but not cross it.” He talks about the “creepy line” but does not draw that line for us. How do we know where that line is? In many cases, companies are not clarifying their ethics regarding online communications. Instead, the line is a proprietary idea held and perhaps only partially understood within that organization.

Don’t get me wrong, I love googling just as much as the next person. My concern is about the ethics practiced by many organizations regarding data mining and targeting. The dilemma is becoming clear. Digital strategists are working to expand the Internet without defined ethical guidelines.

Perhaps a larger concern for me is with respect to my own ethical practices. This exploration has brought me to the realization that my ethics are based on practices of the past even as I help to build the communication machines of the future.

As a digital leader at my organization, I have a responsibility to help establish and communicate our ethics and attitudes on this subject. Having a core understanding of how my decisions affect others will aid me in drawing a creepy line of my own.

I start by attempting to answer the following questions:

  • What is the effect of algorithmically delivered news?
  • What data do they have on us?
  • How do we know what to trust?
  • Who do we trust?
  • Where does the conversation around digital marketing ethics sit today?

This list is incomplete; there are many more questions than I could list here, let alone answer. My journey is only beginning.

Now Read: What is the effect of algorithmically delivered news?


References:

Davis, K. & Patterson, D. (September 20, 2012). Ethics of Big Data. O’Reilly Media, Inc.

Eric Schmidt discusses how algorithmic targeting works at Google (2011). Retrieved from https://www.youtube.com/watch?v=uB-2n6KSYWk


What data do
they have on us?

Corporations are gathering large amounts of data on us: data about our actions, our interests, what we search for and read, and how we interact with others. Of concern is the idea that interpretations of our right to digital privacy are in the hands of the corporations and government organizations that own the data, leaving us little control over our own digital data points.

Have you ever heard of a data broker? This term refers to companies that gather information on consumers through a variety of channels and then sell that information to corporations. The corporations use the data to build consumer profiles. They then use those profiles to market to those consumers in a targeted, personalized way.
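To make the brokerage process above a little more concrete, here is a purely illustrative sketch of the profile-building step: records gathered from different channels get merged, keyed on a shared identifier, into a single composite profile. Every name, field, and value here is invented for illustration; real brokers operate at vastly larger scale.

```python
# Hypothetical sketch of a data broker's profile-building step.
# All field names and values are invented for illustration.

def build_profile(records):
    """Merge records gathered from different channels into one consumer profile."""
    profile = {}
    for record in records:
        for field, value in record.items():
            profile.setdefault(field, set()).add(value)
    return profile

records = [
    {"email": "jane@example.com", "interest": "hiking"},     # from a retail loyalty program
    {"email": "jane@example.com", "interest": "mortgages"},  # from a web-browsing tracker
    {"email": "jane@example.com", "zip": "97201"},           # from a public-records scrape
]

profile = build_profile(records)
# profile["interest"] now holds both "hiking" and "mortgages" -- a composite
# picture that no single channel could have produced on its own.
```

The point of the sketch is the merge itself: each channel contributes only a sliver, but joined on one identifier, the slivers become a profile.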

According to Brian Naylor’s article on the NPR website, data brokerage is not anything new, “but the Internet upped the ante considerably” (2016).

Naylor explains how it works:

Once these companies collect the information, the data brokers package and sell it — sometimes to other brokers, sometimes to businesses — that then use the information to target ads to consumers. And it’s a lucrative industry. One of the largest brokers, Acxiom, reported over $800 million in revenue last year (2016).

This raises concerns around the right to privacy. We do have that right, right?

The Fourth Amendment reads:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized (National Constitution Center, 2017).

Unfortunately, the Founding Fathers did not anticipate digital media or the Internet, so the Fourth Amendment does not include anything about large corporations selling, buying, storing, or mining your data.

The American Civil Liberties Union expresses its concerns about the privacy of our data on its website:

Technological innovation has outpaced our privacy protections. As a result, our digital footprint can be tracked by the government and corporations in ways that were once unthinkable. This digital footprint is constantly growing, containing more and more data about the most intimate aspects of our lives. This includes our communications, whereabouts, online searches, purchases, and even our bodies. When the government has easy access to this information, we lose more than just privacy and control over our information. Free speech, security, and equality suffer as well (2017).

To compound this concern, we are freely sharing intimate information on websites and applications owned by large corporations. That information is being archived, mined, and sold. Based on data gathered, corporations can use machine learning to understand and make assumptions about who we are and what actions we are likely to take.
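What does it mean, mechanically, for a machine to “make assumptions” about us? A deliberately tiny sketch, with weights and features entirely invented for illustration: a toy model scores how likely a profile is to engage with an ad, using weights a real system would learn from millions of past interactions.

```python
# Toy propensity model. The feature names and weights are invented for
# illustration; a real system would learn them from behavioral data.
WEIGHTS = {"searched_flights": 0.6, "read_travel_blog": 0.3, "age_25_34": 0.1}

def click_likelihood(profile_features):
    """Sum the learned weights for the features present in a profile."""
    return sum(WEIGHTS.get(f, 0.0) for f in profile_features)

score = click_likelihood({"searched_flights", "read_travel_blog"})
# score is roughly 0.9: the model now "assumes" this person wants a travel
# ad, whether or not that assumption is fair, correct, or welcome.
```

Real systems replace this handful of weights with models far too large to inspect by hand, which is exactly the opacity Weinberger describes below.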

Curious about my own data floating around on the Internet, I spent 30 minutes googling myself. I found information on my age, design style, address, career and education history, cities I’ve lived in, homes I’ve owned, things I am interested in, projects I’ve worked on, awards I’ve won, court records, and my relationship and marital history.

This search did not take into account the mounds of search and activity data about me that corporations have stored but not made public. If you are interested, Google will share an archive of your data from the Google apps you use. You can delete those accounts and clear your browser history, but I was unable to find a statement indicating that doing so erases the data from Google’s files (Google Support, 2017).

In the book The Internet of Us, Michael Patrick Lynch notes, “In some cases—many cases in fact—we trade information in situations where trust has already been established to some degree.” For instance, if you have an affinity for a particular brand you may be more comfortable completing an online form and giving over your personal information than if you had never heard of the company (2016). Can we assume that humans who give their data up freely trust the organizations to which they are giving that data?

We may well be trusting blindly. We have little visibility into what data are being stored on us, and even less into how those data are being mined and harvested. Even if we did know what data these corporations hold, we most likely still would not know what assumptions their algorithms are making about us.

Alas, in an article on Backchannel.com, David Weinberger explains how machine learning works and how little we can hope to understand:

Clearly our computers have surpassed us in their power to discriminate, find patterns, and draw conclusions. That’s one reason we use them. Rather than reducing phenomena to fit a relatively simple model, we can now let our computers make models as big as they need to. But this also seems to mean that what we know depends upon the output of machines the functioning of which we cannot follow, explain, or understand (2016).

Next Read: How Does Trust Work?

References: 

ACLU. (2017). Privacy & Technology. Retrieved from https://www.aclu.org/issues/privacy-technology

Google Support. (2017). Retrieved from https://support.google.com/accounts/answer/3024190?hl=en

Lynch, M. P. (2016). The Internet of Us: Knowing More and Understanding Less in the Age of Big Data. New York: Liveright Publishing Corporation.

National Constitution Center. (2017). Amendment IV. Retrieved from https://constitutioncenter.org/interactive-constitution/amendments/amendment-iv

Naylor, B. (July 11, 2016). Firms Are Buying, Sharing Your Online Info. What Can You Do About It? NPR. Retrieved from http://www.npr.org/sections/alltechconsidered/2016/07/11/485571291/firms-are-buying-sharing-your-online-info-what-can-you-do-about-it

Weinberger, D. (April 18, 2016). Alien Knowledge: When Machines Justify Knowledge. Backchannel. Retrieved from https://backchannel.com/our-machines-now-have-knowledge-well-never-understand-857a479dcc0e

What the heck are technical requirements and how do I get some?

So, you want to build a website or an application, and some tech/geek is at your door asking for technical requirements. What do you do?

Don’t panic.

A project manager with strong technical skills can help you navigate the task, but just in case you find yourself in a newfound project-manager role, here are four tips:

1. Keep an eye on the prize
The key to getting the technical requirements right is to focus on your project goals first.

Clearly identify what your outcomes should be and get feedback from your stakeholders. If you don’t know what your goals or outcomes should be, your project will be all over the place and it will have a hard time getting off the ground.

Continue reading

What is Purpose Driven Communications?

In a nutshell, it is communicating with an identified purpose. It is when you decide, prior to hiring designers, identifying your target audience, or developing a quippy tagline, to clearly identify and support the goals of your organization through the use of communication techniques. To be clear, we are not talking about your communication goals; we are talking about the goals of the organization you are communicating for. I know that sounds obvious and basic, and maybe it is. But all too often I have seen professional communicators who work reactively, not proactively, and who tend to communicate purely for the sake of communicating.

If every decision you make does not strategically support the goals of your company, then you run the risk of wasting money and energy. Once you identify the purpose, you will have an easier time developing targeted, effective messages and identifying the audience to which you should deliver them.


Crowd Accelerated Innovation

“TED’s Chris Anderson says the rise of web video is driving a worldwide phenomenon he calls Crowd Accelerated Innovation — a self-fueling cycle of learning that could be as significant as the invention of print. But to tap into its power, organizations will need to embrace radical openness. And for TED, it means the dawn of a whole new chapter.” — Ted.com

Watch the video. Then step back and ask yourself, “How can I, as a communicator, help my organization accelerate innovation?” (Hint: think outside the box.)


© 2018 Chrissy Clary
