Serving communities with positivity

The Ferguson debate continues, and although I don't think I can add anything else to the conversation, I do think Tufekci's article on Medium raises another issue. Yes, Ferguson has become the poster child for race relations and the role of police in the United States. That's not necessarily a bad thing in itself, but now the city has an international reputation as a high-crime area of unrest with a violently racist police force. That is such a shame, because I know there is more to that area than the negative image it receives from the media. Behind the crime, violence and injustice lies a city of everyday people. For every racist police officer, there's also a person trying to reach across racial boundaries. For every crime committed, there's also someone working to keep Ferguson's youth out of jail. Ultimately, Ferguson is the place these people call home.

So, when the media pays attention to an area only when something bad is happening, it does a huge disservice to that community. As journalists, we are charged with telling the whole truth and nothing but the truth, but parachute crime reporters aren't doing that. They're focusing on the bad when there's a lot of good happening as well.

This mindset has become particularly important to me as I've tackled my job at the Durham VOICE this semester. The area we cover is no stranger to crime and poverty, yet the people living there still call it home. And the residents continually tell us how they wish the media would find some positive things in the community. That's the duty we're charged with at the VOICE. The residents don't want to hear any more about the bad stuff happening; that's old news to them. They want to read about the positive things happening instead.

That's how you serve a community and do the people justice. Yes, the crime stories need to be told, but let's balance them out with some positivity. Ferguson is so much more than snipers and tear gas.

Moving too fast for Incubate

Morgan wrote a blog post about a new app called Incubate, which basically lets you send delayed messages (text, video or voice) that are delivered up to 25 years after they were originally sent. I had never heard of this before, and my first instinct was to think what a cool idea it is. It's kind of like the digital version of a time capsule. Or like one of those letters your teacher used to make you write to yourself at the beginning of the year, which you'd get back at the end. Morgan made the point that it may be a marketing ploy to ensure the app's survival for the next couple of decades.

But then I got to thinking. Technology and the platforms on which we engage with it are rapidly changing, now turning over at a rate of about every five years. And it's only going to continue spiraling faster. We can imagine where the future will go, but ultimately no one knows for sure, whether it be 10 or 20 years down the road. So what happens when apps become a thing of the past? You guessed it: goodbye, Incubate. The app can hold messages for up to 25 years, but who knows if we'll have anything similar to a smartphone at that time. Apps on iPhones will probably be the equivalent of the brick phone 25 years from now. Unless a tech guru does some major overhaul of the company, there will be no technological platform to support Incubate. How are you going to receive text messages or videos if there is no smartphone to receive them? It'll be a bit like trying to play a VHS tape on your laptop: not going to happen.

So, Incubate's designers had better think fast, or they'll become a thing of the past before they even get a chance at success. It's the blessing and the curse of technology.

'It was bad enough when the grand kids knew more about technology than me.'

Didn’t you see my status?

It's fairly common knowledge that Facebook uses a specific algorithm to determine which items show up on your news feed. Although we mere mortals outside of the site don't know exactly how it all works, we do know that it has to do with a variety of factors, including word usage and how many people like or comment on a status. Caleb Garling, a writer from The Atlantic, tried out a little experiment where he posted a fake status but used words indicating that something big and important was happening.


Once Facebook picked up on phrases like “big news,” “so excited to begin” and “all of your support,” it began putting Caleb’s status at the top of his friends’ news feeds.  As his friends continued to like and comment, Facebook showed the status to more and more people. 

It's an interesting set-up and makes a lot of logical sense. You'd much rather hear about someone's big life change than what was for dinner, right? But speaking from personal experience, Facebook's algorithm has often been a disappointment to me. I've noticed that I see posts from the exact same people over and over again, usually those annoying people from high school ranting about how life sucks. Sometimes there's a good variety of posts on my news feed, but other times it feels like I only have 10 Facebook friends. It can get frustrating.

I'd be curious to see what would happen if the site created some sort of personal algorithm for each user. Rather than tracking phrases that generally interest people, the site could track phrases in the statuses you tend to like or comment on. This would be a sort of personalization for the user and would also guarantee updates that interest the user. I imagine our social networking communities would become smaller and more tailored to what we care about. That would be a positive change for the user, but it would limit the global character of a social networking site. To every positive there will be a negative, and we have to determine if the change is worth it.
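To make the idea concrete, here's a minimal sketch of what that kind of personal ranking could look like: build a word profile from posts a user has engaged with, then score new posts by how much they overlap with that profile. This is purely illustrative; Facebook's real ranking is far more complex and not public, and all the function names here are my own invention.

```python
from collections import Counter

def tokenize(text):
    # Lowercase and split on whitespace; a real system would do far more.
    return text.lower().split()

def build_interest_profile(engaged_posts):
    """Count the words from posts the user has liked or commented on."""
    profile = Counter()
    for post in engaged_posts:
        profile.update(tokenize(post))
    return profile

def score_post(post, profile):
    """Score a candidate post by how often its words appear in the profile."""
    return sum(profile[word] for word in tokenize(post))

def rank_feed(candidates, engaged_posts):
    """Order candidate posts by overlap with the user's engagement history."""
    profile = build_interest_profile(engaged_posts)
    return sorted(candidates, key=lambda p: score_post(p, profile), reverse=True)
```

Even a toy scorer like this shows the trade-off I mentioned: a post about a topic you've never engaged with scores zero, so your feed narrows toward what you already like.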

#droptheplus

It's happening, and boy am I glad! I saw an article on Buzzfeed last night about an Australian model who is supporting the campaign against the term "plus-size." The model, Stefania Ferrario, called the term misleading and damaging to young girls. She is labeled plus-size herself and argued that being called it is not empowering. Yes, yes and YES!

You might recall when I wrote about how harmful this particular term is. The label is given to so many normal-size women and unfortunately is often used interchangeably with the word "fat." Plus-size has a negative connotation, and mislabeling women who are not actually plus-size, or trying to glorify the label, does more harm than good. It is so encouraging to see someone from inside the fashion industry take a stand on this. And not only that, but supporters are joining in on social media with #droptheplus. The campaign is a great start, and lighting the fuse on social media has the potential to reach millions of people all over the globe. The fashion industry uses mass communication and social media to its advantage, but now critics of the industry can use the same tool to fight back. This wouldn't have been possible even five years ago. The change a simple hashtag can bring about is amazing and will undoubtedly put some steam behind the campaign.


Analyzing my tweets

After Sarah’s presentation today in class, I decided to see what all the fuss is about and try out Analyze Words.

Here are my results:

Analyze Words

Overall I think it was pretty accurate. My emotional style is usually upbeat because I have a tendency to put my best foot forward on social media. I wouldn't want my tweets to be all negative; that's the type of account I wouldn't care to follow myself. That being said, many of my tweets were categorized as worried and depressed. I know I use Twitter once in a while to rant about something or put in my two cents on current events. But depressed? I never viewed my tweets like that. A little harsh, don't you think, Analyze Words?

As for my social style, I was glad to see my tweets categorized as plugged in and personable. Since I'm hoping to soon be using my account as a professional journalist, those are two categories I think are important. If my readers look at my tweets, I would hope they'd see my thoughts as in-the-know and relatable. It's a way to connect with readers on a more personal level. For the same reasons, I was glad to see my thinking style come back as analytic and sensory. I would hope my tweets add some sort of insight to a situation, even when I'm just ranting. Honestly, I would have expected more in-the-moment tweets, since that's when I have the most to say. My tweets aren't usually planned; I tweet when experiences or moments inspire me to say something.

I’ve got to say this site is pretty cool.  There were a couple of questions I had about the analysis, but I think the most helpful aspect of it is that it gets you thinking about what you write.  It’s an easy way to get some outside feedback on your account.
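For the curious, tools in this family generally work by matching your words against dictionaries of category words and tallying the hits. Here's a toy illustration of that dictionary-based approach; the category word lists below are invented for the example and are not Analyze Words' actual dictionaries.

```python
# Invented mini-dictionaries; real tools use much larger, validated word lists.
CATEGORIES = {
    "upbeat":   {"great", "glad", "love", "excited", "happy"},
    "worried":  {"worried", "afraid", "problem", "trouble"},
    "analytic": {"because", "therefore", "however", "think"},
}

def categorize(tweets):
    """Tally how many words in each tweet fall into each category."""
    counts = {name: 0 for name in CATEGORIES}
    for tweet in tweets:
        words = set(tweet.lower().split())
        for name, vocab in CATEGORIES.items():
            counts[name] += len(words & vocab)
    return counts
```

Seeing it this way also explains the "depressed" surprise: a rant that happens to use a few flagged words gets counted, whatever you actually meant by them.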

Getting the next generation to look up

On Tuesday our class talked about how new technology has affected our attention spans and essentially rewired our brains.  This could turn into a major problem, but honestly, will it convince us to put down our smartphones? Absolutely not. So where do we go from here? How can we avoid this problem? Let me suggest that perhaps the solution should start at home.

The human brain is mostly developed during childhood, especially before the age of 10. In the first months of a child's life, each neuron in the brain is attaching itself to about 15,000 other neurons. This is incredible stuff, but it also means the way a child is raised from infancy is vitally important to his or her growth into adulthood. Ultimately, parents, guardians and caregivers play a large role in how the child develops.

Studies have shown that children who are exposed to fast-paced, colorful, and highly stimulating media tend to be at a higher risk of attention problems. This goes hand-in-hand with new technology, such as iPhone/iPad games, video games, stimulating television shows and the general need for information at the tip of one's fingers. If a child is allowed to play video games for several hours a day, or if a 10-year-old is given unlimited access to a smartphone and Instagram, then yes, I would argue that this causes a rewiring of the brain. However, if a child's technology use is monitored and the guardian encourages other types of entertainment, a lot of damage would be avoided.

Our generation may be addicted to our technology, but we still have the ability to put our phones down once in a while because we remember a time when they weren't always in our hands. By limiting a child's technology use, you can give him/her experiences outside of a screen.

Unanswered questions of science

Earlier this week Clay shared a video on his blog of Dr. Daniel Siegel explaining the harmful effects of looking at screens before bedtime. The photons from the light of the screen discourage your brain from releasing melatonin and thus disrupt your ability to fall asleep. Scientifically, this all makes sense, and I think we can all comfortably agree that this information is true. But I would argue that Dr. Siegel left out a vital part of the discussion: how do the size of the screen and the number of photons it emits play a role in disrupting sleep?

I’m really asking this question based on personal experience.  When I watch a show on my laptop before bed, I have noticed that it’s harder to unwind and fall asleep after I’ve finished watching.  Usually I attribute this to a gripping plot that has stimulated my brain in some way, and while that’s bound to be partially true, I now know the photons are another cause.


But here's where I start to question. I scroll through Facebook, Twitter and Buzzfeed every night on my smartphone before I go to sleep. It's a way for my body to relax and get sleepy. Several times I've caught myself dozing off with my phone still in my hand and my Buzzfeed app still open. I'm willing to admit this has turned into a bad habit, to the extent that I have trouble falling asleep if I don't first scroll through my phone. And yes, this ritual technically takes away some of the time I could have spent sleeping, as Dr. Siegel also mentions.

But why does my laptop keep me awake while my smartphone helps put me to sleep? Is it because the limited amount of light coming from a phone screen, as opposed to a laptop screen, is not enough to bother me? Or is it just a fluke of my inherent sleep patterns working along with the bedtime habit I've created? I would love to know these answers, and I think they're important aspects that have been left out of the discussion. If I had to take a wild guess, though, I'd say it's all of the above.