Microsoft Dresses Up as Apple for Halloween: Get Ready for Windows 8

The last time Microsoft gave me something to talk about that wasn’t a wisecrack was back in 2007. The technological climate was begging for a revolution. The iPhone was on the brink of release and iPads were barely a rumor. Microsoft began to release promo videos for their new prototype, the “Surface.”

Microsoft engineers boasted about multi-touch computing, a new wave of interaction that would give users a (quite literally) hands-on way to work with hardware. The Surface promised a whole new set of tools, and a whole new way to interact with them. It was essentially an oversized, table-top version of today’s popular tablet computers, such as Apple’s iPad or Samsung’s Galaxy Tab.

Microsoft tackles design

Since 2007, the focus of Microsoft’s computing revolution has shifted from the Surface’s hardware to its latest operating system, Windows 8 (due October 26th). The Surface has been downsized from the table-top prototype into what many are calling the PC’s answer to the iPad. Windows 8, however, is an operating system that will run on both the Surface tablet and Windows desktops. Sam Moreau, the director of user experience at Microsoft, tells Fast Company that Windows 8 is tackling the OS wars from a design-centric perspective, a field previously dominated by Apple. Microsoft calls the interface “Metro,” and it’s a redesign of the Windows platform from a world of organizational toolbars and framed windows to a full-screen experience painted in grids of vibrantly colored tiles, each leading the user into a different digital space (see: games, weather, messaging, mail).


This redesign is going to fundamentally change the way PC users interact with and consume their technologies. This is not simply a face-lift, but something more like an organ transplant. When PC users find their muscle memory directing their mice toward the ever-present Start button, they’ll come up empty-clicked. Instead, Internet Explorer is a square blue tile sitting between a picture of your girlfriend and another tile showing real-time updates of the weather in your town. This is the type of organization familiar to mobile devices like smartphones and tablets. It’s sleek. It’s clean. It’s simplified. It’s totally not Windows.

Isn’t design Apple’s territory?

It remains to be seen, though, whether PC users like developers and data analysts will flock to or run from Windows 8. The preference for PCs often comes down to two things: 1) the open ecosystem of software development in the Windows world, which allows for the types of tools and software that developers and analysts prefer, and 2) the highly transparent nature of the Windows environment, which lets users find, edit, and configure almost anything. Apple, on the other hand, offers a fluid and aesthetically appealing user experience ideal for designers: an operating system where many files are hidden so as to minimize a user’s chances of “messing up” anything. I often hear from Apple-haters, “I have work to do, I can’t just be looking at a pretty screen all day.” If Apple is the place where desktops bloom in flowers and morning dew, Microsoft is where cogs turn and command lines are bred.

It feels like Microsoft is trying to pick up where it left off when Apple re-joined the personal computing game as a competitor in 2001 with its OS X operating system (the sleek interface Mac users today have grown to love). Back then, Apple held only about 2.9% of the personal computing market, so reinventing the entire system wouldn’t alienate that many users; it felt more like a last-resort move to go big or go home. Today, Microsoft owns around 92% of the operating system market share, a number that matters when you consider not only how many users have held out on switching to Mac despite its cleaner, more modern interface, but also how many will be affected by Windows 8.

Apple & Microsoft share a common enemy: Google 

Microsoft has maintained a sense of consistency in that its audience is loyal to its offerings. Gamers and developers use Microsoft; artists and designers use Apple. It feels like a sort of left-brain/right-brain distinction we’ve made about the two brands over the years. But as the rise of Google as a common enemy has proven, both Microsoft and Apple are getting their toes stepped on. Google Maps has dominated a market Microsoft used to own, one that Apple is only beginning to respond to now (without much success). Google’s free tools such as Gmail, Drive (formerly Docs), and Calendar are eliminating the need for, and cost of, Microsoft Office. Google’s Android platform for smartphones is constantly competing with Apple’s iPhone for majority market share. It’s no secret that each of these huge companies has had its fair share of flops (Google+, Microsoft’s Zune, Apple’s iTunes Ping), which means nobody is safe from failure, and it could be anyone’s chance to step up and revolutionize the digital game once again. At least this is what Microsoft is hoping for.

Adapting to a New Windows

When any seemingly ubiquitous piece of software or platform changes, many users are up in arms. I remember the day Facebook introduced its signature “News Feed” feature as the summer of 2006 came to a close. My peers and I were furious: “What are these statuses about? We have away messages for that. I couldn’t care less about the shopping spree my best friend from third grade went on today.” But Facebook was about ten steps ahead of us. They weren’t simply trying to replace the away message; they were elevating the Facebook status to an interactive forum for conversation. They were changing the face of digital self-expression, where our personalities are often interpreted through our Facebook activity, the camera always recording. They were developing a new environment in which we’ve all become voyeurs and exhibitionists, constantly viewing content (many times in silence) or narcissistically broadcasting our own activities and whereabouts. Facebook reinvented how we interact with our social networks, from the digital realm to our real lives.

This seems to be the re-invention challenge Microsoft is looking to tackle with the release of Windows 8. They are well aware of the risks they are taking; Moreau calls it “the ultimate design challenge. You’ve got 24 years of Windows behind you. There’s a responsibility to preserve it, but also to evolve–knowing that when you change something, you’re changing how computing works.” To me, this feels less like a computing revolution and more like Microsoft’s attempt to “go big” and join the rest of us. The real question: what will happen to the toolbar-loving users they leave behind?


I Am Almond Milk (and Other Thoughts From a Gen-Y)

In my journey into the workforce, my age is no longer a hindrance. It’s not a hurdle in the race. It’s in some ways (and in some ways not) an advantage. When a rainstorm hits the horse track and makes the turf sloppy, my lack of age-acquired experience doesn’t make me an underdog. In fact, these conditions make it so that almost anything goes. A less experienced maiden horse may win the race over those who have been around the track a few times. The breadwinners know how to run a mile and how to do it well; but when a storm hits, some simply cannot adapt to the new conditions.

When a CEO is told that their business needs to have an online presence, they most likely respond with, “Let’s hire an intern to do it.” Ah, the intern. An unpaid existence. Once thought to involve coffee runs and sending appointment-based e-mails, the intern now builds a web presence. They set up Facebook pages, they send out timely, consumer-centric tweets. They build the face of a brand for their own kind–the Gen Y-ers.

It’s evident that the ways in which consumers interact with their favorite brands and companies are very different than they used to be. If businesses want to appeal to a huge portion of their demographic, they must have a web presence. People aren’t looking in the Yellow Pages for a dog groomer. They’re Googling. They’re Facebooking their friends for recommendations. They’re reading first-hand reviews on Yelp. If businesses aren’t there to monitor their presence, they could be exposed, their reputations tarnished in the permanent prints of the web.

As the rules of the game change, there is, inevitably, a game-changer somewhere. The new generation of consumers is reaching for a new mode of interaction, and on the other side, there must be a new generation of producers to give them what they want. Sure, this may not apply to certain older industries like banking. But when my generation begins investing in their 401(k)s and managing their stock portfolios, do you think they’ll want to be talking on the phone to a broker? Or navigating a convoluted and fragmented sitemap? I think they’ll be itching for a means to consume and interact similar to the tools they grew up with.

By no means am I putting down the generation above me. Rather, I’m responding to some of the negativity I hear surrounding the seemingly fruitless search for employment, and reacting to the editorial pieces I’ve read bashing our generation’s lack of compliance with societal standards or lack of traditional ambitions. I just want to put it out there–the world is changing! So are the qualifications for leadership, the definition of success, and the means by which to achieve them. The jobs are out there. They may not be the jobs your parents had out of college, but they are there. And if you can’t seem to find them, there is this extremely beautiful quality of our time that we have the luxury of enjoying: we can create our own jobs with a little passion, innovation, and a whole lot of crazy.

I’ll always remember an article I read in BOP Magazine when I was about eight years old. For those of you who don’t know, BOP Magazine was where I got my fix of Leonardo DiCaprio glamour shots and Jonathan Taylor Thomas interviews (“My one wish? World peace.” Oh, the wisdom…). A fellow pre-teen wrote in to BOP to let the world know that the Backstreet Boys were famous before ’N Sync, and so they were the better of the two. A wise-beyond-her-years Alison, 12, from Maine, retorted, “That’s like saying last week’s old milk is better than today’s new carton.” To my much wiser, well-versed and experienced elders: I do not mean to call you old milk, necessarily. Nor do I mean to say that my peers and I are the freshest carton out of the fridge. Rather, we’re…a new kind of milk. And we’re not all the same. I may be the Almond Milk. My roommate the Soy, my sister the Lactaid. Really what I’m getting at is that we’re worth something, and we’re not lazy. We’re just going about this thing we call life a little differently. And as a really quick side note and concluding thought from a 20-something who enjoyed the luxury of a paycheck on her last internship: start paying your (qualified) interns 🙂

Becoming an Adult: More than Ditching the Neon and Wayfarers

When I arrive past the dust of my Millennial youth, I hope I will remember the meat of things as more than neon-tinted vision, text-message based love affairs and rainbow displays of wayfarer sunglasses. Instead of plot lines in a life story, these ephemeral phenomena will set the tone of a realized youth. They’ll serve as the glowing Instagram filter coating the everyday forks in the road. The golden aura lighting this age of possibility. Behind the irreverent tweets and the ever-revolving viral memes, I’ll see not a transition into adulthood. Rather, I’ll see the image of my youth comfortably yet ambiguously straddling the line between digital girl and analog woman.

Where prior generations’ analog woman may be established in reputation, a master of her niche, the digital “girl” may appear fragmented, spread over various social networks: Pinterest boards on jewelry organization, Twitter afterthoughts on the White House Correspondents’ Dinner, Facebook mobile uploads of an epic sushi dinner. Different mediums call for different correspondence, multiple modes of self-expression. I’ve heard baby boomers say, “I just don’t have the time to keep up with that many outlets.” There is nothing wrong with this statement. It’s a lot to keep up with. But my peers and I don’t really have a choice. For many of us, engaging with this many outlets is not only second nature, it’s something we’ve evolved a need for.

The internet, open access, and the nature of our “beta world” have conjured up an environment foreign to many of our elders. They call my generation the digital natives; we’ve grown up in a technological petri dish our entire lives. For us, the so-called “digital self” was not a new persona or presence that had to be developed and understood after we were established in the analog world. By the time we had our first AIM screen names, we knew pretty much nothing about the “adult world.” We ventured through adolescence developing our analog selves alongside our digital ones.

So for us, developing into an adult is somewhat of a gray area. Qualities that defined adulthood in the past are changing. The foundations of our persons are rooted in a completely different realm than our parents’. So the question of youth versus adulthood is a tough one. The line, blurry and obscure. There is some underlying classification of the digital as the eager, progressive, wide-eyed young’un ready for revolution, whereas the analog is a stuffy biz exec, holding forth in a boardroom, following the standard protocol of a 20th-century business model. Neither the digital nor the analog should be constrained to an age group, a limited arena or path. Each has its place in our developing world. But when it comes to our identities, could we possibly be both? Could I at once be a digital woman as easily as an analog girl? Is Instagramming keeping us younger, starry-eyed, and illusioned even longer, past the years of Spring Fair yearning and late nights in the library? More importantly, is this kind of digital-social behavior hindering our transition into the adult world?

If you ask me, I say no. The virtual realms we interact in every day are certainly changing how we’re growing into adulthood, but I wouldn’t say they “hinder” our development. The conceptions of leadership, maturity, and achievement are changing, and they’re changing along with the ways in which we are actively altering our progression into adulthood. I don’t think posting pictures or making a wisecrack observation about a movie is a sign of self-importance and thus immaturity. I think it’s an exploration in expression, and a signature of this age’s obsession with “sharing.” Sharing feelings, sharing links, sharing e-books, etc. Share I will, while I feel the need; and in the thick of my youth, the need feels present. More importantly, it feels like a beautiful time to be a young adult, writing the conditions of our stories as we go along…

Tip of the Tongue, Now Located in Your MacBook!

You know when there’s a thought on the tip of your tongue, but you just can’t get it out? Holding tightly to some magnetic tastebud, the unspeakable words torture you, your brain unable to get its act together, take the elevator down to your vocal cords and push that baby out. What was I going to say? What was I going to do? It’s frustrating.

One time, I had a pretty magical tip-of-the-tongue moment. Irate with my inability to remember what I was about to say or do, I kept fiddling away on my keyboard at a probably over-analyzed gchat conversation. I subconsciously hit my CTRL+V paste shortcut and what I saw amazed me. It was a sentence I’d overzealously typed to my partner in conversational crime and decided to hold onto for later use. However, it was not just any mildly out-of-line thought. It was that verbal foliage that had bloomed in my head, at one time something important I felt the need to say. Yet it withered away in absent-minded distraction–my brain decided to let it go once it was copied into my computer. That tip-of-the-tongue thought–it wasn’t really on the tip of my tongue. It wasn’t even in my subconscious once it made its way to the screen via my fingertips. It was simply a sentence stored in my other subconscious memory system. My computer. My virtual-mental cloud, for when my actual mind is clouded.

I often talk of my third hand(s). My iPhone, which has taken over so many of my previously cognitive functions over the past five years. My laptop, which has been a portal into different worlds, constantly affecting conscious thoughts of my own reality. But now, the locus of my consciousness isn’t just centered in my own hardware. My human capability is not only enhanced or aided by the power of my external technological parts. I’ve made a shift, where my subconscious thoughts are located in some intangible cloud that I can’t necessarily see or touch. My brain has lost the ability to hold onto a thought of my own creation, which I’ve decided to store in the ever-changing copy-paste pocket of my computer.

It makes me nervous, of course. What other functions of my cognition have been lost, or rather, replaced by my constant use of technology? Is the plasticity of my brain finding purpose through technical adaptation rather than humanistic mental work? Can the ways in which my brain functions actually be changing as my technological tools become more powerful, more present, and even more woven into my everyday activities?

In short: I have a feeling the answer is yes–my brain is changing. What will the ultimate effect of it all be? Well, if things keep up this way, then–wait, I was going somewhere with this…it’s right on the tip of my tongue…

Don’t Taze Me Bot: On Respecting Your Technology

A group of US Navy scientists is in the midst of developing a robotic jellyfish powered by the same material as an organic jellyfish–the water around it. Can you imagine the possibilities this presents? It means our electronics could be powered by the same natural elements as biotic beings. Cell phones could breathe air. iPads could recharge on pizza. Laptops would function more efficiently after a great workout.

Alright, alright, maybe it’s not that simple (or whimsical), but think about the conceptions we have of artificial intelligence, cyborgs, and robots. Robin Williams’s Bicentennial Man plugs into the wall. Minority Report’s predictive pre-cogs live in an electro-wave-sensing jelly. We don’t see robots or future cyborgs as our equals. They are very much still considered the “other,” and to many, a negative external entity. But imagine a robot on your level of humanity. One that feeds off of the same fuels as you. Not necessarily that your Roomba is going to start sharing your love of a bangin’ BLT or anything, but rather that a personal assistant ’borg might be able to sit down with you to dinner. It might be able to sense that you used more basil in tonight’s pesto penne and compliment you on the vibrant flavor. It might note that you haven’t had a meal with fish in over a month, and maybe the lack of protein is what’s been making you feel so sluggish? It might suggest the one-day sale on tilapia and shrimp at the supermarket. It might remind you that the Comcast building is right next to the grocery store, and you’ve been meaning to trade out that old box for one with DVR. It might recommend the sitcom Parks & Recreation to watch tonight after dinner, because you finished all the seasons of 30 Rock on Netflix (and because the ’bot has been missing Amy Poehler ever since she left SNL).

Do you see what I am getting at? Maybe we wouldn’t view the robots of the future through a lens of apocalyptic fear if we were able to simply relate. There may be people in your life who you (even subconsciously) view as “tools.” That girl from my class who I make outlines with before the test. That guy at work who always helps me figure out the espresso machine. They are helpers. Aids. Acquaintances. And they are treated as such. But when that guy at work who helps with the espresso mentions that he saw the photos from the Mediterranean islands at your desk and suggests a place to buy real beans like they have in Europe, suddenly the relationship has been elevated. There is a personal connection. Your social discourse will evolve and expand as you begin to relate to one another as equals.

And this is the future I see for the transhumanists: those who believe not only in the future of computing power, but in the future of humanity as well. While many of us may fear the future of computers, I personally fear the future of humans. We yell at our computers when they take more than 30 seconds to boot up. We mutter under our breath, “stupid iPhone,” when we don’t have service. We are disrespectful to our electronics, yet we rave about how much we love them when questioned about their value to us.

Awareness, as always, is the key to avoiding internal collapse. Perhaps our relationship with the cyborgs of the future can evolve into a digital friendship. At the least, a level of personal respect must be established with our electronics if we are to evolve in a healthy and symbiotic relationship. I mean really, do you think all those robots in the movies were revolting because they were getting too much respect?

Only the Good Die Young: Grieving a Hard Drive Crash

I woke up groggy last Wednesday after a series of vividly convoluted dreams. My toes stumbled upon my MacBook at the edge of my bed. Ah, yes–this again. Hulu dreams: the condition of falling asleep with your laptop open while Hulu broadcasts an infinite playlist of suggested shows all night long, inspiring seemingly strange yet perfectly narrated dreams. I skillfully shut the laptop with my lower appendages and hugged my pillow tighter; 15 more minutes.

When I do eventually wake up, I do what I do every morning: 1) reach for phone, 2) check personal e-mail, 3) check work e-mail, 4) check Facebook notifications, 5) skim public Twitter account stream, 6) skim private Twitter account stream, 7) write down bullet notes about my dream in my Momento diary app. Then, and only then, have I sufficiently briefed myself for the day ahead of me.

(Image caption: If you want to give me a heart attack, set this image as my background when I’m not looking.)

This morning, at some point between checking e-mail and type-scribbling details about my dream, I decided to reach for my laptop and go in for “the real thing” (iPhone, do not cry from under-utilization; I will return to you soon enough on my elevator ride or while in line for a coffee). I open my laptop to a familiar start-up tone, paired with an awful clicking sound. I know that sound well, and the little optimist living somewhere buried in the folds of my brain says, “Don’t worry, it’s nothing.” But alas, it is something. The entire screen is grayscale, aesthetics reminiscent of pre-OS X days, and a flashing folder dons a single symbol: one giant question mark. Oh, how many questions that punctuation mark contained: What the hell happened? Is my hard drive really gone-zo? When’s the last time I backed up? Why me? Is this some kind of karma? Did I not hold the door for the couple with groceries behind me last week?

It’s a terrible feeling when you lose your data. It’s not just inconvenient; it’s a tugging sensation at the emotional level of your internal organs. Your stomach wrenches. You walk around all day with that inexplicable feeling of confusion and self-loathing; a tragedy has occurred. And how could you not? It’s not just your computer, it’s a part of you. You identify in some way with the songs in your iTunes library. You hold onto memories of a trip to Europe with a folder of 900 pictures and 35 videos. You store literary accomplishments like that 45-page thesis that nearly took your sanity the last semester of college. Your data exists on your computer as pockets and piles of information that make up who you are. Then one day, without warning, your computer crashes and you lose a huge part of yourself. You remember the melodies of your favorite Lynyrd Skynyrd songs, but you can no longer pull up any of their hits at any given moment. You’ll never forget how amazing EuroTrip 2010 was, but the image of that sheep meat you almost had to eat in Spain, and the look on your roommate’s face after she actually did–gone. Any memory of the tone of your voice and point of view from college exists solely in your head, forever re-written as most of our memories eventually become as we age.

In a recent New York Times article, “The Dilemma of Being a Cyborg,” Carina Chocano points out that these types of data losses do not mimic the natural human process of forgetting: “It happens all at once, not gradually or imperceptibly, so it feels less like an unburdening than like a mugging.” But this is what happens when we rely on technology for needs that were previously filled by our natural biology. I tend to look at technological tools as an enhancement to human capability, not a replacement. Though I am beginning to see this as me re-branding the implications of my technologies’ capabilities, the way you would excuse the subpar behavior of a love interest you’ve romanticized. Sure, I see it as a +1 that my iPhone will remind me to grab Post-its on my lunch break today. I don’t think it negates my ability to remember something extremely important if I need to. But what about the ability to remember, 20 years from now, the wide-eyed, adrenaline-overloaded thoughts of my first day of college? Or the first words I ever spoke to someone who’d go on to change the course of my life?

I don’t remember emotional details the way I used to. These days, I journal as a form of therapy. There are thoughts clouding my vision, and I must wrestle them out of my skull and onto a Word document. Once the pieces are put onto my screen, I can read them clearly, analyze them, and evaluate my sanity on said subject. Then I click save and put them away. I’m comforted by the fact that if I need them again, I know they’re there in my “Thoughts” folder, but I don’t carry them around with me at all times. When my hard drive crashed, the first thing I thought was, “My journal!” All these thoughts I’d untangled–the progressive pits and peaks of a young 20-something, elaborately spilled into my MacBook–gone. I knew I would go on to read these in the future, a sort of checking in on the past. But now, there’s no record of this huge part of my self-work. It’s impossible to recreate the musings of a moment, too emotional to navigate the jungles of the past, and too disappointing to know that I won’t have the vivid memories of this time.

But alas, c’est la vie. It’s the trade-off we make when we rely on our technological counterparts as an extended sixth sense, as a part of our self, an external brain. Do we attempt to live presently, without the necessity of documenting our past performances? Or do we simply accept that our digital extensions are imperfect, sometimes failing us the way our own bodies do? Luckily, I get to postpone my decision a bit longer. My 15-inch portal of glory has pulled through in a miraculous recovery, allowing me to keep straddling my nostalgic and present selves. Oh, and I’ve also updated the backups of my data on two separate external hard drives. Just as on the morning I woke up to that awful question mark, there’s no real good answer as to why or how it happened. I guess my digital karma kicked in this time.
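(A quick aside for the similarly paranoid: the “two separate drives” habit is easy to automate. Here is a minimal, hypothetical Python sketch of the idea; the folder name and drive paths are made up, so swap in your own.)

```python
# back_up_journal.py -- copy a journal folder to two external drives.
# All paths here are hypothetical placeholders; adjust them to your setup.
# Requires Python 3.8+ for shutil.copytree(..., dirs_exist_ok=True).
import shutil
from pathlib import Path

SOURCE = Path.home() / "Documents" / "Thoughts"        # the journal folder
DESTINATIONS = [
    Path("/Volumes/BackupDriveOne/Thoughts"),          # external drive #1
    Path("/Volumes/BackupDriveTwo/Thoughts"),          # external drive #2
]

def back_up() -> None:
    for dest in DESTINATIONS:
        if not dest.parent.exists():                   # drive not mounted
            print(f"Skipping {dest}: drive not mounted.")
            continue
        shutil.copytree(SOURCE, dest, dirs_exist_ok=True)
        print(f"Backed up {SOURCE} -> {dest}")

if __name__ == "__main__":
    back_up()
```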

Welcome to the World, Baby ‘Borg

I was sitting at my then-favorite restaurant, Joe’s American Bar & Grill, on my 11th birthday. My parents told me that when they were buying my present, the guy behind the counter at Best Buy asked if it was for a colleague or a friend. “No actually, for my daughter’s 11th birthday.”

(Image caption: I don’t think I knew how to correctly spell Cassiopeia at this point in my life.)

My very own PocketPC. In the pre-Mac OS X days, I was a PC girl, of course (thankfully Steve Jobs changed all of that *pours a sip out for the big man*). After teaching my mom how to do things like “save a contact” in her Palm Pilot (and subsequently stealing it to play Tetris and Dope Wars), my parents realized what I’d been pining for that year for my birthday.

Sure, it seems like no big deal. For an 11-year-old it was just a sleeker-looking Game Boy (unfortunately, without Super Mario Bros.). But for me it was the beginning of something much bigger. I’d been used to writing down my thoughts in a small journal. Now, I was typing them into my handheld PC and e-mailing them to myself. I could access them at my cousin’s house, in the library at school. I stopped using a planner and started making to-do lists in my PocketPC. Suddenly my mental content, which had previously occupied only my mind or little slips of paper, had moved on to a different space. There was now this virtual realm I was inhabiting at all times without consciously realizing it. Though I shouldn’t have been able to until I was 18, part of me had already checked out of typical childhood that year. The full repercussions wouldn’t occur to me until years later. But I’ll always remember that birthday as the summer I moved out of my house, and onto my digital cloud 9.

Do not worry. This blog won’t be about my childhood stories. It will, however, be about the moves we all make forward in our technological endeavors that carry meaningful consequences for our progression as humans. How did our cell phones become our extended limbs? When did our online social network presence begin to affect our peace of mind outside of the digital world? There are plenty of futurists out there who believe these steps are necessary to our advancement as humans. And plenty of naturalists who say that technology is an inevitable enemy to our humanity. I don’t believe the issue is as black and white as some may think. I do think that many of the negative consequences of this digital age can be avoided by simply cultivating an awareness of the impact technology has on our selves. Stick around with me as I explore this digital cocoon we’re all transforming inside of. Hopefully we’ll all come out alive, and maybe even–bionic.