Microsoft Dresses Up as Apple for Halloween: Get Ready for Windows 8

The last time Microsoft gave me something to talk about that wasn’t a wisecrack was back in 2007. The technological climate was begging for a revolution. The iPhone was on the brink of release and iPads were barely a rumor. Microsoft began to release promo videos for their new prototype, the “Surface.”

Microsoft engineers boasted about multi-touch computing, a new wave of interaction that would give users a more (quite literally) hands-on approach to their hardware. The Surface promised a whole new set of tools, and a whole new way to interact with them. It was essentially an oversized, table-top version of today’s popular tablet computers, such as Apple’s iPad or Samsung’s Galaxy.

Microsoft tackles design

Since 2007, the focus of Microsoft’s computing revolution has shifted from the Surface’s hardware to its latest operating system, Windows 8 (due October 26th). The Surface has been downsized from the table-top prototype into what many are calling the PC’s answer to the iPad. Windows 8, however, is an operating system that will run on both the Surface tablet and Windows desktops. Sam Moreau, the director of user experience at Microsoft, tells Fast Company that Windows 8 is tackling the OS wars from a design-centric perspective, a field previously dominated by Apple. They call the interface “Metro,” and it’s a redesign of the Windows platform from a world of organizational toolbars and framed windows into a full-screen experience painted in grids of vibrantly colored tiles, each leading the user into a different digital space (games, weather, messaging, mail).


This redesign is going to fundamentally change the way PC users interact with and consume their technologies. This is not simply a face-lift, but something more like an organ transplant. When PC users find their muscle memory directing their mice toward the ever-present Start menu, they’ll come up empty-clicked. Instead, Internet Explorer is a square blue tile sitting between a picture of your girlfriend and another tile showing real-time updates of the weather in your town. This is the type of organization familiar from mobile devices like smartphones and tablets. It’s sleek. It’s clean. It’s simplified. It’s totally not Windows.

Isn’t design Apple’s territory?

It remains to be seen, though, whether PC users like developers and data analysts will flock to or run from Windows 8. The preference for PCs often comes down to two things: 1) the open ecosystem of software development in the Windows world, which allows for the types of tools and software that developers and analysts prefer, and 2) the highly transparent nature of the Windows environment, which lets users find, edit, and configure almost anything. Apple, on the other hand, appeals with a fluid and aesthetically pleasing user experience ideal for designers: an operating system where many files are hidden so as to minimize a user’s chances of “messing up” anything. I often hear from Apple-haters, “I have work to do, I can’t just be looking at a pretty screen all day.” If Apple is the place where desktops bloom in flowers and morning dew, Microsoft is where cogs turn and command lines are bred.

It feels like Microsoft is trying to pick up where it left off when Apple re-joined the personal computing game as a competitor in 2002 with its OS X operating system (the sleek interface Mac users today have grown to love). Back then, Apple held only a 2.9% share of the personal computing market. Reinventing the entire system would not put off that many users, and felt more like a last-resort move to either go big or go home. Today, Microsoft owns around 92% of the operating system market share, a number that matters when considering not only how many users have held out on the switch to Mac despite its cleaner, more modern interface, but also how many will be affected by Windows 8.

Apple & Microsoft share a common enemy: Google 

Microsoft has maintained a sense of consistency in that its audience is loyal to its offerings. Gamers and developers use Microsoft; artists and designers use Apple. It feels like a sort of left-brain/right-brain distinction we’ve made about the two brands over the years. But as the rise of Google as a common enemy has proven, both Microsoft and Apple are getting their toes stepped on. Google Maps has come to dominate a market Microsoft used to own, one that Apple is only beginning to respond to now (without much success). Google’s free tools such as Gmail, Drive (formerly Docs), and Calendar are eliminating the need for and cost of Microsoft Office. Google’s Android platform for smartphones is constantly competing with Apple’s iPhone for a majority of the market. It’s no secret that each of these huge companies has had its fair share of flops (Google+, Microsoft’s Zune, Apple’s iTunes Ping), which means nobody is safe from failure, and it could be anyone’s chance to step up and revolutionize the digital game once again. At least this is what Microsoft is hoping for.

Adapting to a New Windows

When any seemingly ubiquitous piece of software or platform changes, many users are up in arms. I remember the day Facebook introduced its signature “News Feed” feature as the summer of 2006 came to a close. My peers and I were furious: “What are these statuses about? We have away messages for that. I couldn’t care less about the shopping spree my best friend from third grade went on today.” But Facebook was about ten steps ahead of us. They weren’t simply trying to replace the away message; they were elevating the Facebook status to an interactive forum for conversation. They were changing the face of digital self-expression, where our personalities are often interpreted through our Facebook activity, the camera always recording. They were developing a new environment in which we’ve all become voyeurs and exhibitionists, constantly viewing content (many times in silence) or narcissistically boasting about our own activities and whereabouts. Facebook reinvented how we interact with our social networks, from the digital realm to our real lives.

This seems to be the reinvention challenge Microsoft is looking to tackle with the release of Windows 8. They are well aware of the risks they are taking; Moreau calls it “the ultimate design challenge. You’ve got 24 years of Windows behind you. There’s a responsibility to preserve it, but also to evolve, knowing that when you change something, you’re changing how computing works.” To me, this feels less like a computing revolution and more like Microsoft’s attempt to “go big” and join the rest of us. The real question is: what will happen to the toolbar-loving users they leave behind?

A New User Experience at the Newseum’s New Media Gallery

Today, I went to DC’s Newseum, a museum exploring the ways in which the news-reporting industry has grown and evolved over the years. I went with my class, #internet (how digitalcocoon-esque), and was fairly unsure of what to expect. We visited the HP New Media Gallery, and it was a social media extravaganza. I felt as if I’d been dropped into a digital empire from a neo-noir film, except a new governor had been put in place to clean up the hardware rubbish, eliminate cyborg corruption, and Web 3.0 the whole dang thing (can I start using that term yet?).



The HP New Media Gallery explores the new ways in which we receive our news, interact with those sources, and communicate with others about the stories we care about. Rather than attaching a grand definition of “New Media” to a daunting digital display, the back wall of the round viewing room is lined with four large screens, each projecting a different visual set to a single audio track, each video seamlessly interacting with the audio, pieces of the same puzzle.

[Image: Visitors answer daily poll questions about their use of New Media.]
The “user” of the New Media Gallery (formerly known as the “museum-goer”) can decide which screen to pay attention to, whether it’s milestones in social/new media infiltration (2011: Tweeting is Allowed in Supreme Court) or headshots of the founding fathers and maiden mothers of New Media, like Facebook’s Mark Zuckerberg or Huffington Post’s Arianna Huffington. Forgoing a definition, the Newseum attempts instead to capture an “essence” of New Media. Smart, if you ask me, as the meaning, function, and impact are bound to keep changing. “What Does New Media Mean to You?” flashes before the user in hot pink block letters. “I love it.” “A world of information.” “Participation.” The four screens can be a bit distracting: do I focus here? My peripherals are calling me over there; clips of an interactive Angry Birds game entice my gaze elsewhere. And yet the user walks away with a full understanding of the aura that is “New Media.” Incidentally, this is exactly the way the user interacts with the Internet. The content is a bit distracting. A tad fragmented. But overall, the user gets the gist of what’s going on.

[Image: Top news stories, from a New Media perspective.]
An interactive wall of bouncing orbs features 30 news stories, all of which found a voice through viral media sources. The user is invited to explore the evolution of the Apple brand, or how an event like Michael Jackson’s death traveled from inception to public awareness. Users are asked simple yes-or-no poll questions at the end of each story. Two teenage boys giggle and prod at the story of Justin Bieber’s rise to fame. “Do you think Justin Bieber would have been discovered without YouTube?” An awkward two-step toward the screen and a bursting “NO!” comes from one boy as he casts his vote (I humbly disagree).

[Image: An interactive experience in the Apple takeover.]
The HP New Media Gallery at the Newseum is a perfect example of a successful interactive exhibit. The phenomena it explores are actually put to use, as the New Media Gallery exists on the web as well. Pictures of visitors are shared online when they virtually “check in,” a stream visible anywhere with an internet connection. Inside the exhibit, a Twitter stream is on display, broadcasting what people are saying about the exhibit.

The user experience was fluid: visitors entered the exhibit with curious eyes, some skimming the surface, others diving into the interactive Xbox Kinect games designed for the gallery. The Newseum is located at 555 Pennsylvania Avenue in DC. Coming highly recommended to you from the digital cocoon.

I Am Almond Milk (and Other Thoughts From a Gen Y-er)

In my journey into the workforce, my age is no longer a hindrance. It’s not a hurdle in the race. It’s in some ways (and in some ways not) an advantage. When a rainstorm hits the horse track and makes the turf sloppy, my lack of age-acquired experience doesn’t make me an underdog. In fact, these conditions make it so that almost anything goes. A less experienced maiden horse may win the race over those who have been around the track a few times. The breadwinners know how to run a mile and how to do it well, but when a storm hits, some simply cannot adapt to the new conditions.

When a CEO is told that their business needs to have an online presence, they most likely respond with, “Let’s hire an intern to do it.” Ah, the intern. An unpaid existence. Once thought to involve coffee runs and sending appointment-based e-mails, the intern now builds a web presence. They set up Facebook pages; they send out timely, consumer-centric tweets. They build the face of a brand for their own kind: the Gen Y-ers.

It’s evident that the ways in which consumers interact with their favorite brands and companies are very different than they used to be. If businesses want to appeal to a huge portion of their demographic, they must have a web presence. People aren’t looking in the Yellow Pages for a dog groomer. They’re Googling. They’re Facebooking their friends for recommendations. They’re reading first-hand reviews on Yelp. If businesses aren’t there to monitor their presence, they could be exposed, their reputations tarnished in the permanent prints of the web.

As the rules of the game change, there is inevitably, somewhere, a game-changer. The new generation of consumers is reaching for a new mode of interaction, and on the other side, there must be a new generation of producers to give them what they want. Sure, this may not apply to certain older industries like banking. But when my generation begins investing in their 401(k)s and managing their stock portfolios, do you think they’ll want to be talking on the phone to a broker? Or navigating a convoluted and fragmented sitemap? I think they’ll be itching for ways to consume and interact that feel like the tools they grew up with.

By no means am I putting down the generation above me. Rather, I’m responding to some of the negativity I hear surrounding the seemingly fruitless search for employment, and to the editorial pieces I’ve read bashing our generation’s lack of compliance with societal standards or lack of traditional ambitions. I just want to put it out there: the world is changing! So are the qualifications for leadership, the definition of success, and the means to achieve them. The jobs are out there. They may not be the jobs your parents had out of college, but they are there. And if you can’t seem to find them, there is this extremely beautiful quality of our time that we have the luxury of enjoying: we can create our own jobs with a little passion, innovation, and a whole lot of crazy.

I’ll always remember an article I read in BOP Magazine when I was about eight years old. For those of you who don’t know, BOP Magazine was where I got my fix of Leonardo DiCaprio glamour shots and Jonathan Taylor Thomas interviews (“My one wish? World peace.” Oh, the wisdom…). A fellow pre-teen wrote in to BOP to let the world know that the Backstreet Boys were famous before ‘Nsync, and so they were the better of the two. A wise-beyond-her-years Alison, 12, from Maine retorted, “That’s like saying last week’s old milk is better than today’s new carton.” To my much wiser, well-versed, and experienced elders: I do not mean to call you old milk, necessarily. Nor do I mean to say that my peers and I are the freshest carton out of the fridge. Rather, we’re… a new kind of milk. And we’re not all the same. I may be the Almond Milk. My roommate the Soy, my sister the Lactaid. Really what I’m getting at is that we’re worth something, and we’re not lazy. We’re just going about this thing we call life a little differently. And as a really quick side note and concluding thought from a 20-something who enjoyed the luxury of a paycheck on her last internship: start paying your (qualified) interns :)

Becoming an Adult: More than Ditching the Neon and Wayfarers

When I arrive past the dust of my Millennial youth, I hope I will remember the meat of things as more than neon-tinted vision, text-message-based love affairs, and rainbow displays of Wayfarer sunglasses. Instead of plot lines in a life story, these ephemeral phenomena will set the tone of a realized youth. They’ll serve as the glowing Instagram filter coating the everyday forks in the road, the golden aura lighting this age of possibility. Behind the irreverent tweets and the ever-revolving viral memes, I’ll see not a transition into adulthood. Rather, I’ll see the image of my youth comfortably yet ambiguously straddling the line between digital girl and analog woman.

Where prior generations’ analog woman may be established in reputation, a master of her niche, the digital “girl” may appear fragmented, spread over various social networks: Pinterest boards on jewelry organization, Twitter afterthoughts on the White House Correspondents’ Dinner, Facebook mobile uploads of an epic sushi dinner. Different mediums call for different correspondence, multiple modes of self-expression. I’ve heard baby boomers say, “I just don’t have the time to keep up with that many outlets.” There is nothing wrong with this statement. It is a lot to keep up with. But my peers and I don’t really have a choice. For many of us, engaging with this many outlets is not only second nature, it’s something we’ve evolved a need for.

The internet, open access, and the nature of our “beta world” have conjured up an environment foreign to many of our elders. They call my generation the digital natives; we’ve spent our entire lives growing up in a technological petri dish. For us, the so-called “digital self” was not a new persona or presence that had to be developed and understood after we were established in the analog world. By the time we had our first AIM screen names, we knew pretty much nothing about the “adult world.” We ventured through adolescence, developing our analog selves alongside our digital ones.

So for us, developing into an adult is somewhat of a gray area. Qualities that defined adulthood in the past are changing. The foundation of our persons is rooted in a completely different realm than our parents’ was. So the question of youth versus adulthood is a tough one; the line is blurry and obscure. There is some underlying classification of the digital as the eager, progressive, wide-eyed youngster ready for revolution, whereas the analog is a stuffy biz exec holding forth in a boardroom, following the standard protocol of a 20th-century business model. Neither the digital nor the analog should be constrained to an age group, a limited arena, or a single path. Each has its place in our developing world. But when it comes to our identities, could we possibly be both? Could I at once be a digital woman as easily as an analog girl? Is Instagramming keeping us younger, starry-eyed, and full of illusions even longer, past the years of Spring Fair yearning and late nights in the library? More importantly, is this kind of digital-social behavior hindering our transition into the adult world?

If you ask me, I say no. The virtual realms we interact in every day are certainly changing how we’re growing into adulthood, but I wouldn’t say they “hinder” our development. The conceptions of leadership, maturity, and achievement are changing, and they’re changing in step with the ways in which we are actively altering our progression into adulthood. I don’t think posting pictures or making a wisecrack observation about a movie is a sign of self-importance and thus immaturity. I think it’s an exploration in expression, and a signature of this age’s obsession with “sharing”: sharing feelings, sharing links, sharing e-books, and so on. Share I will, while I feel the need; and to be involved in my youth, the need feels present. More importantly, it feels like a beautiful time to be a young adult, writing the conditions of our stories as we go along…

Emotional Addicts: Get Your Fix by Remixing Your iPhone App Folders

Once I crossed over into technological adulthood and started organizing my iPhone apps, I couldn’t understand those amateurs who just throw apps around without purpose. I’ve reorganized the system a few times as I’ve acquired more and more apps, but it’s become pretty intuitive which apps fall into Tools versus Information, and which apps get the bonus bump up to Social. As I was about to show a friend my sleek setup, he absolutely upstaged me. After reading an article on verb- or action-phrased folder names (Play, Listen, Look Up) versus function-based ones (Productivity, Social Media), he was inspired to change his folder names to something a bit more intuitive. Games are found in a folder called “Weeeee”; utilities that don’t give him a huge reaction, like Calculator or Reminders, go in “Meh” (incidentally his largest folder, which has also yielded a “Meh Vol. 2”); viewing apps like HBO Go and Hulu are labeled “Ahh”; and, my personal favorite, social tools like Facebook and Twitter go in “Ooh.”

If I categorize my apps this way, I’m literally attaching an emotional response to the software associated with those feelings. The apps on the screen are clustered according to their potential to elicit a mental response. So every time I want that “Ooh” feeling of social connection or digital gazing, my thumb gravitates to that folder. As my muscle memory takes over, I’ll find my physical self navigating to the Ooh folder whenever I subconsciously want to feel social warmth. If I find another app, a new social tool (say, Instagram) that has that same power, I’ll put it in the Ooh folder. When I’m craving more “Ooh,” I’ll click it again, finding not just a new app easily accessible, but a familiar feeling. As this association goes deeper, we become stage-five clingers to our phones (and to technology in general). It begins to sound like an addiction, doesn’t it? Of course, that’s what happens when we begin to associate our internal emotions with anything external.

We can try to technologically detox. We can give up Facebook for Lent, vow to check our e-mail only three times a day, and limit mindless, directionless trips into the interwebs. We can try. And some will succeed. But the real question is what will grow faster: our willpower, or the attractiveness of our technologies?