A New User Experience at the Newseum’s New Media Gallery

Today, I went to DC’s Newseum, a museum exploring the ways in which the news-reporting industry has grown and evolved over the years. I went with my class, #internet (how digitalcocoon-esque), and was fairly unsure of what to expect. We visited the HP New Media Gallery, and it was a social media extravaganza. I felt as if I’d been dropped into a digital empire from a neo-noir film, except a new governor had been put in place to clean up the hardware rubbish, eliminate cyborg corruption, and Web 3.0 the whole dang thing (can I start using that term yet?).

The HP New Media Gallery explores the new ways in which we receive our news, interact with those sources, and communicate with others about the stories we care about. Rather than attaching a grand definition of “New Media” to a daunting digital display, the back wall of the round viewing room is lined with four large screens, each projecting a different visual set to a single audio track, each video seamlessly interacting with the audio, pieces of the same puzzle.

[photo: visitors answer daily poll questions about their use of New Media]
The “user” of the New Media Gallery (formerly known as the “museum-goer”) can decide which screen to pay attention to, whether it’s milestones in social/new media infiltration (2011: Tweeting is Allowed in Supreme Court) or headshots of the founding fathers and maiden mothers of New Media, like Facebook’s Mark Zuckerberg or Huffington Post’s Arianna Huffington. Forgoing a definition, the Newseum attempts instead to capture an “essence” of New Media. Smart, if you ask me, as the meaning, function, and impact are bound to keep changing. “What Does New Media Mean to You?” flashes before the user in hot pink block-faced letters. “I love it.” “A world of information.” “Participation.” The four screens can be a bit distracting: do I focus here, my peripherals are calling me over there, clips of an interactive Angry Birds game entice my gaze elsewhere. And yet, the user walks away with a full understanding of the aura that is “New Media.” Incidentally, this is exactly the way the user interacts with the Internet. The content is a bit distracting. A tad fragmented. But overall, the user gets the gist of what’s going on.

[photo: top news stories, from a New Media perspective]
An interactive wall of bouncing orbs features 30 news stories, all of which found a voice through viral media sources. The user is invited to explore the evolution of the Apple brand, or how an event like Michael Jackson’s death grew from inception to public awareness. Users are asked simple yes-or-no poll questions at the end of each story. Two teenage boys giggle and prod at the story of Justin Bieber’s rise to fame. “Do you think Justin Bieber would have been discovered without YouTube?” An awkward two-step toward the screen and a bursting “NO!” comes from one boy as he casts his vote (I humbly disagree).

[photo: an interactive experience in the Apple takeover]
The HP New Media Gallery at the Newseum is a perfect example of a successful interactive exhibit. The very phenomena it explores are put to use, as the New Media Gallery also exists on the web. Pictures of visitors are shared online when they virtually “check in,” a stream visible anywhere with an internet connection. Inside the exhibit, a Twitter stream broadcasts posts from those talking about the exhibit.

The user experience was fluid: visitors entered the exhibit with curious eyes, some skimming the surface, others diving into the interactive Xbox Kinect games designed for the gallery. The Newseum is located at 555 Pennsylvania Avenue in DC. Coming highly recommended to you from the digital cocoon.


I Am Almond Milk (and Other Thoughts From a Gen-Y)

In the adventure of my journey into the workforce, my age is no longer a hindrance. It’s not a hurdle in the race. It’s in some ways (and in some ways not) an advantage. When a rainstorm hits the horse track and makes the turf sloppy, my lack of age-acquired experience doesn’t make me an underdog. In fact, these conditions make it so that almost anything goes. A less experienced maiden horse may win the race over those who have been around the track a few times. The frontrunners know how to run a mile and how to do it well; but when a storm hits, some simply cannot adapt to the new conditions.

When a CEO is told that their business needs to have an online presence, they most likely respond with, “Let’s hire an intern to do it.” Ah, the intern. An unpaid existence. Once thought to involve coffee runs and sending appointment-based e-mails, the intern now builds a web presence. They set up Facebook pages; they send out timely, consumer-centric tweets. They build the face of a brand for their own kind: the Gen Y-ers.

It’s evident that the ways in which consumers interact with their favorite brands and companies are very different than they used to be. If businesses want to appeal to a huge portion of their demographic, they must have a web presence. People aren’t looking in the Yellow Pages for a dog groomer. They’re googling. They’re facebooking their friends for recommendations. They’re reading first-hand reviews on Yelp. If businesses aren’t there to monitor their presence, they could be exposed, their reputations tarnished in the permanent prints of the web.

As the rules of the game change, there is inevitably, somewhere, a game-changer. The new generation of consumers is reaching for a new mode of interaction, and on the other side, there must be a new generation of producers to give them what they want. Sure, this may not apply to certain older industries like banking. But when my generation begins investing in their 401(k)s and managing their stock portfolios, do you think they’ll want to be talking on the phone to a broker? Or navigating a convoluted and fragmented sitemap? I think they’ll be itching for a means to consume and interact similar to the tools with which they grew up.

By no means am I putting down the generation above me. Rather, I’m responding to some of the negativity I hear surrounding the seemingly fruitless search for employment. I’m also reacting to the editorial pieces I’ve read bashing our generation’s lack of compliance with societal standards or lack of traditional ambitions. I just want to put it out there: the world is changing! So are the qualifications for leadership, the definition of success, and the means with which to achieve them. The jobs are out there. They may not be the jobs your parents had out of college, but they are there. And if you can’t seem to find them, there is this extremely beautiful quality of our time that we have the luxury of enjoying: we can create our own jobs with a little passion, innovation, and a whole lot of crazy.

I’ll always remember an article I read in BOP Magazine when I was about eight years old. For those of you who don’t know, BOP Magazine was where I got my fix of Leonardo DiCaprio glamour shots and Jonathan Taylor Thomas interviews (“My one wish? World peace.” Oh, the wisdom…). A fellow pre-teen wrote into BOP to let the world know that the Backstreet Boys were famous before ‘NSYNC, and so they were the better of the two. A wise-beyond-her-years Alison, 12, from Maine, retorted, “That’s like saying last week’s old milk is better than today’s new carton.” To my much wiser, well-versed and experienced elders: I do not mean to call you old milk, necessarily. Nor do I mean to say that my peers and I are the freshest carton out of the fridge. Rather, we’re… a new kind of milk. And we’re not all the same. I may be the Almond Milk. My roommate the Soy, my sister the Lactaid. Really what I’m getting at is that we’re worth something, and we’re not lazy. We’re just going about this thing we call life a little differently. And as a really quick side note and concluding thought from a 20-something who enjoyed the luxury of a paycheck on her last internship: start paying your (qualified) interns 🙂

Becoming an Adult: More than Ditching the Neon and Wayfarers

When I arrive past the dust of my Millennial youth, I hope I will remember the meat of things as more than neon-tinted vision, text-message based love affairs and rainbow displays of wayfarer sunglasses. Instead of plot lines in a life story, these ephemeral phenomena will set the tone of a realized youth. They’ll serve as the glowing Instagram filter coating the everyday forks in the road. The golden aura lighting this age of possibility. Behind the irreverent tweets and the ever-revolving viral memes, I’ll see not a transition into adulthood. Rather, I’ll see the image of my youth comfortably yet ambiguously straddling the line between digital girl and analog woman.

Where the prior generation’s analog woman may be established in reputation, a master of her niche, the digital “girl” may appear fragmented, spread over various social networks: Pinterest boards on jewelry organization, Twitter afterthoughts on the White House Correspondents’ Dinner, Facebook mobile uploads of an epic sushi dinner. Different mediums call for different correspondence, multiple modes of self-expression. I’ve heard baby boomers say, “I just don’t have the time to keep up with that many outlets.” There is nothing wrong with this statement. It’s a lot to keep up with. But my peers and I don’t really have a choice. For many of us, engaging with this many outlets is not only second nature; it’s something we’ve evolved a need for.

The internet, open-access and the nature of our “beta world” have conjured up an environment foreign to many of our elders. They call my generation the digital natives, and we’ve grown up in a technological petri dish our entire lives. For us, the so-called “digital self” was not a new persona or presence that had to be developed and understood after establishment in the analog world. By the time we had our first AIM screen names, we knew pretty much nothing about the “adult world.” We ventured through adolescence, developing our analog selves alongside our digital.

So for us, developing into an adult is somewhat of a gray area. Qualities that defined adulthood in the past are changing. The foundations of our persons are rooted in a completely different realm than our parents’. So the question of youth versus adulthood is a tough one; the line, blurry and obscure. There is some underlying classification of the digital as the eager, progressive, wide-eyed young’un ready for revolution, whereas the analog is a stuffy biz exec, talking at a boardroom, following the standard protocol of a 20th-century business model. Neither the digital nor the analog should be constrained to an age group, a limited arena or path. Each has its place in our developing world. But when it comes to our identities, could we possibly be both? Could I at once be a digital woman as easily as an analog girl? Is Instagramming keeping us younger, starry-eyed, and illusioned even longer, past the years of Spring Fair yearning and late nights in the library? More importantly, is this kind of digital-social behavior hindering our transition into the adult world?

If you ask me, I say no. The virtual realms we interact in every day are certainly changing how we’re growing into adulthood, but I wouldn’t say they “hinder” our development. The conceptions of leadership, maturity and achievement are changing, and they’re changing conditionally with the ways in which we are actively altering our progression into adulthood. I don’t think posting pictures or making a wise-crack observation about a movie is a sign of self-importance and thus, immaturity. I think it’s an exploration in expression, and a signature of this age’s obsession with “sharing.” Sharing feelings, sharing links, sharing e-books, etc. Share I will, while I feel the need; and being involved in my youth, the need feels present. More importantly, it feels like a beautiful time to be a young adult, writing the conditions of our stories as we go along…

Emotional Addicts: Get Your Fix by Remixing Your iPhone App Folders

Once I crossed over into technological adulthood and started organizing my iPhone apps, I couldn’t understand those amateurs who just throw apps around without purpose. I’ve reorganized the system a few times as I acquire more and more apps, but it’s become pretty intuitive which apps fall into Tools versus Information, and which apps get the bonus bump up to Social. As I was about to show a friend my sleek setup, he absolutely upstaged me. After reading an article on verb- or action-phrased folder names (Play, Listen, Look Up) versus function-based ones (Productivity, Social Media), he was inspired to change his folder names to something a bit more intuitive. Games are found in a folder called “Weeeee”; utilities that don’t give him a huge reaction, like Calculator or Reminders, go in “Meh” (incidentally his largest folder, which has also yielded a “Meh Vol. 2”). Viewing apps like HBO Go and Hulu are labeled “Ahh,” and, my personal favorite, social tools like Facebook and Twitter go in “Ooh.”

If I categorize my apps this way, I’m literally attaching an emotional response to the software associated with those feelings. The apps on the screen are clustered according to their potential to elicit a mental response. So every time I want that “Ooh” feeling of social connection or digital gazing, my thumb gravitates to that folder. As my muscle memory takes over, I’ll find my physical self navigating to the Ooh folder when I subconsciously want to feel social warmth. If I find another app, a new social tool (say, Instagram) that has that same power, I’ll put it in the Ooh folder. When I’m craving more “Ooh,” I’ll click it again, finding not just a new app easily accessible, but a familiar feeling. As this association goes deeper, we become stage-5 clingers to our phones (and technology in general). It begins to sound like an addiction, doesn’t it? Of course, that’s what happens when we begin to associate our internal emotions with anything external.

We can try to technologically detox. We can give up Facebook for Lent, vow to check our e-mail only three times a day, and limit mindless, directionless trips into the interwebs. We can try. And some will succeed. But the real question is what will grow faster: our willpower, or the attractiveness of our technologies?

Tip of the Tongue, Now Located in Your Macbook!

You know when there’s a thought on the tip of your tongue, but you just can’t get it out? Holding tightly to some magnetic tastebud, the unspeakable words torture you, your brain unable to get its act together, take the elevator down to your vocal cords and push that baby out. What was I going to say? What was I going to do? It’s frustrating.

One time, I had a pretty magical tip-of-the-tongue moment. Irate with my inability to remember what I was about to say or do, I kept fiddling away on my keyboard at a probably over-analyzed gchat conversation. I subconsciously hit my CTRL+V paste shortcut, and what I saw amazed me. It was a sentence I’d overzealously typed to my partner in conversational crime and decided to hold onto for later use. However, it was not just any mildly out-of-line thought. It was that verbal foliage that had bloomed in my head, at one time something important I felt the need to say. Yet it withered away in absent-minded distraction: my brain decided to let it go once it was copied into my computer. That tip-of-the-tongue thought wasn’t really on the tip of my tongue. It wasn’t even in my subconscious once it made its way to the screen via my fingertips. It was simply a sentence stored in my other subconscious memory system. My computer. My virtual-mental cloud, for when my actual mind is clouded.

I often talk of my third hand(s). My iPhone, which has taken the place of so many of my previously cognitive-based functions over the past five years. My laptop, which has been a portal into different worlds, constantly affecting conscious thoughts of my own reality. But now, the locus of my consciousness isn’t just centered in my own hardware. My human capability is not only enhanced or aided by the power of my external technological parts. I’ve made a shift, where my subconscious thoughts are located in some intangible cloud that I can’t necessarily see or touch. My brain has lost the ability to hold onto a thought of my own creation, which I’ve decided to store in the ever-changing copy-paste pocket of my computer.

It makes me nervous, of course. What other functions of my cognition have been lost, or rather, replaced by my constant use of technology? Is the plasticity of my brain finding purpose through technical adaptation versus humanistic mental work? Can the ways in which my brain functions actually be changing as my technological tools become more powerful, more present, and even more woven into my everyday activities?

In short: I have a feeling the answer is yes–my brain is changing. What will the ultimate effect of it all be? Well, if things keep up this way, then–wait, I was going somewhere with this… it’s right on the tip of my tongue…