Anyone on board with me here? Probably wouldn’t hurt to learn from Pandora’s algorithms either…
13 Free Design Tools for Visual Marketers on a Budget
Check out my latest post on the HubSpot blog. I outline some of the tools I use every day to create easy-to-digest infographics, design projects, and more.
Microsoft Dresses Up as Apple for Halloween: Get Ready for Windows 8
The last time Microsoft gave me something to talk about that wasn’t a wisecrack was back in 2007. The technological climate was begging for a revolution. The iPhone was on the brink of release and iPads were barely a rumor. Microsoft began to release promo videos for their new prototype, the “Surface.”
Microsoft engineers boasted about multi-touch computing, a new wave of interaction that would give users a more (quite literally) hands-on approach to their hardware. The Surface promised a whole new set of tools and a whole new way to interact with them. It was essentially an oversized, table-top version of today’s popular tablet computers, such as Apple’s iPad or Samsung’s Galaxy.
Microsoft tackles design
Since 2007, the focus of Microsoft’s computing revolution has shifted from the Surface’s hardware to its latest operating system, Windows 8 (due October 26). The Surface has been downsized from the table-top prototype into what many are calling the PC’s answer to the iPad. Windows 8, however, is an operating system that will run on both the Surface tablet and Windows desktops. Sam Moreau, the director of user experience at Microsoft, tells Fast Company that Windows 8 is tackling the OS wars from a design-centric perspective, a field previously dominated by Apple. They call the interface “Metro,” and it re-designs the Windows platform from a world of organizational toolbars and framed windows into a full-screen experience painted in grids of vibrantly colored tiles, each leading the user into a different digital space (games, weather, messaging, mail).
This re-design is going to fundamentally change the way PC users interact with and consume their technologies. This is not simply a face-lift; it’s something more like an organ transplant. When PC users find their muscle memory directing their mice toward the ever-present Start button, they’ll come up empty-clicked. Instead, Internet Explorer is a square blue tile sitting between a picture of your girlfriend and another tile showing real-time updates of the weather in your town. This is the type of organization familiar from mobile devices like smartphones and tablets. It’s sleek. It’s clean. It’s simplified. It’s totally not Windows.
Isn’t design Apple’s territory?
It remains to be seen, though, whether PC users like developers and data analysts will flock to or run from Windows 8. The preference for PCs often comes down to two things: 1) the open ecosystem of software development in the Windows world, which allows for the types of tools and software that developers and analysts prefer, and 2) the highly transparent nature of the Windows environment, which lets users find, edit, and configure almost anything. Apple, on the other hand, appeals with a fluid and aesthetically appealing user experience ideal for designers: an operating system where many files are hidden so as to minimize a user’s chances of “messing up” anything. I often hear from Apple-haters, “I have work to do, I can’t just be looking at a pretty screen all day.” If Apple is the place where desktops bloom in flowers and morning dew, Microsoft is where cogs turn and command lines are bred.
It feels like Microsoft is trying to pick up where it left off when Apple re-joined the personal computing game as a competitor in 2001 with its OS X operating system (the sleek interface Mac users today have grown to love). Back then, Apple held only about 2.9% of the personal computing market. Re-inventing the entire system would not put off that many users, and felt more like a last-resort move to either go big or go home. Today, Microsoft owns around 92% of the operating system market share, a number that matters when considering not only how many users have held out on switching to Mac despite its cleaner, more modern interface, but also how many will be affected by Windows 8.
Apple & Microsoft share a common enemy: Google
Microsoft has maintained a sense of consistency in that its audience is loyal to its offerings. Gamers and developers use Microsoft; artists and designers use Apple. It feels like a sort of left-brain/right-brain distinction we’ve made about the two brands over the years. But as the rise of Google as a common enemy has proven, both Microsoft and Apple are getting their toes stepped on. Google Maps has dominated a market Microsoft used to own, one that Apple is only beginning to respond to now (without much success). Google’s free tools such as Gmail, Drive (formerly Docs), and Calendar are eliminating the need for and cost of Microsoft Office. Google’s Android platform for smartphones is constantly competing with Apple’s iPhone for majority market share. It’s no secret that each of these huge companies has had its fair share of flops (Google+, Microsoft’s Zune, Apple’s iTunes Ping), which means nobody is safe from failure, and it could be anyone’s chance to step up and revolutionize the digital game once again. At least, this is what Microsoft is hoping for.
Adapting to a New Windows
When any seemingly ubiquitous piece of software or platform changes, many users are up in arms. I remember the day Facebook introduced its signature “News Feed” feature as the summer of 2006 came to a close. My peers and I were furious: “What are these statuses about? We have away messages for that. I couldn’t care less about the shopping spree my best friend from third grade went on today.” But Facebook was about 10 steps ahead of us. They weren’t simply trying to replace the away message; they were elevating the Facebook status to an interactive forum for conversation. They were changing the face of digital self-expression, where our personalities are often interpreted through our Facebook activity, the camera always recording. They were developing a new environment in which we’ve all become voyeurs and exhibitionists, constantly viewing content (many times in silence) or narcissistically broadcasting our own activities and whereabouts. Facebook literally reinvented how we interact with our social networks, from the digital realm to our real lives.
This seems to be the re-invention challenge Microsoft is looking to tackle with the release of Windows 8. They are well aware of the risks they are taking; Moreau calls it “the ultimate design challenge. You’ve got 24 years of Windows behind you. There’s a responsibility to preserve it, but also to evolve–knowing that when you change something, you’re changing how computing works.” To me, this feels less like a computing revolution and more like Microsoft’s attempt to “go big” and join the rest of us. The real question is: what will happen to the toolbar-loving users they leave behind?
A New User Experience at the Newseum’s New Media Gallery
Today, I went to DC’s Newseum, a museum exploring the ways in which the news-reporting industry has grown and evolved over the years. I went with my class, #internet (how digitalcocoon-esque), and was fairly unsure of what to expect. We visited the HP New Media Gallery, and it was a social media extravaganza. I felt as if I’d been dropped into a digital empire from a neo-noir film, except a new governor had been put in place to clean up the hardware rubbish, eliminate cyborg corruption, and Web 3.0 the whole dang thing (can I start using that term yet?).
The HP New Media Gallery explores the new ways in which we receive our news, interact with those sources, and communicate with others about the stories we care about. Rather than attaching a grand definition of “New Media” to a daunting digital display, the round viewing room lines its back wall with four large screens, each projecting a different visual over a single audio track, each video interacting seamlessly with the sound, pieces of the same puzzle.
The “user” of the New Media Gallery (formerly known as the “museum-goer”) can decide which screen to pay attention to, whether it’s milestones in social/new media infiltration (2011: Tweeting is Allowed in Supreme Court) or headshots of the founding fathers and maiden mothers of New Media, like Facebook’s Mark Zuckerberg or the Huffington Post’s Arianna Huffington. Forgoing a definition, the Newseum attempts instead to capture an “essence” of New Media. Smart, if you ask me, as the meaning, function, and impact are bound to keep changing. “What Does New Media Mean to You” flashes before the user in hot pink block-faced letters. “I love it.” “A world of information.” “Participation.” The four screens can be a bit distracting: do I focus here? My peripherals are calling me over here; clips of an interactive Angry Birds game entice my gaze over there. And yet, the user walks away with a full understanding of the aura that is “New Media.” Incidentally, this is exactly the way the user interacts with the Internet. The content is a bit distracting. A tad fragmented. But overall, the user gets the gist of what’s going on.
An interactive wall of bouncing orbs features 30 news stories, all of which found a voice through viral media sources. The user is invited to explore the evolution of the Apple brand, or how an event like Michael Jackson’s death sprouted from inception to public awareness. Users are asked simple yes-or-no polls at the end of each story. Two teenage boys giggle and prod at the story of Justin Bieber’s rise to fame. “Do you think Justin Bieber would have been discovered without YouTube?” An awkward two-step toward the screen and a bursting “NO!” comes from one boy as he casts his vote (I humbly disagree).
The HP New Media Gallery at the Newseum is a perfect example of a successful interactive exhibit. The phenomena it explores are put to use, as the New Media Gallery also exists on the web. Pictures of visitors are shared online when they virtually “check in,” a stream visible anywhere with an internet connection. Inside the exhibit, a Twitter stream is on display, broadcasting what people are saying about the exhibit.
The user experience was fluid: visitors entered the exhibit with curious eyes, some skimming the surface, others diving into the interactive Xbox Kinect games designed for the gallery. The Newseum is located at 555 Pennsylvania Avenue in DC. Coming highly recommended to you from the digital cocoon.
I Am Almond Milk (and Other Thoughts From a Gen-Y)
In my journey into the workforce, my age is no longer a hindrance. It’s not a hurdle in the race. It’s in some ways (and in some ways not) an advantage. When a rainstorm hits the horse track and makes the turf sloppy, my lack of age-acquired experience doesn’t make me an underdog. In fact, these conditions make it so that almost anything goes. A less experienced maiden horse may win the race over those who have been around the track a few times. The breadwinners know how to run a mile and how to do it well; but when a storm hits, some simply cannot adapt to the new conditions.
When a CEO is told that their business needs to have an online presence, they most likely respond with, “Let’s hire an intern to do it.” Ah, the intern. An unpaid existence. Once thought to involve coffee runs and sending appointment-based e-mails, the intern now builds a web presence. They set up Facebook pages; they send out timely, consumer-centric tweets. They build the face of a brand for their own kind: the Gen Y-ers.
It’s evident that the ways in which consumers interact with their favorite brands and companies are very different than they used to be. If businesses want to appeal to a huge portion of their demographic, they must have a web presence. People aren’t looking in the yellow pages for a dog groomer. They’re googling. They’re facebooking their friends for recommendations. They’re reading first-hand reviews on Yelp. If businesses aren’t there to monitor their presence, they could be exposed, their reputations tarnished in the permanent prints of the web.
As the rules of the game change, there is inevitably, somewhere, a game-changer. The new generation of consumers is reaching for a new mode of interaction, and on the other side, there must be a new generation of producers to give them what they want. Sure, this may not apply to certain older industries like banking. But when my generation begins investing in their 401(k)s and managing their stock portfolios, do you think they’ll want to be talking on the phone to a broker? Or navigating a convoluted and fragmented sitemap? I think they’ll be itching for a means to consume and interact similar to the tools with which they grew up.
By no means am I putting down the generation above me. Rather, I’m responding to some of the negativity I hear surrounding the seemingly fruitless search for employment, and reacting to the editorial pieces I’ve read bashing our generation’s lack of compliance with societal standards or lack of traditional ambitions. I just want to put it out there: the world is changing! So are the qualifications for leadership, the definition of success, and the means with which to achieve them. The jobs are out there. They may not be the jobs your parents had out of college, but they are there. And if you can’t seem to find them, there is this extremely beautiful quality of our time that we have the luxury of enjoying: we can create our own jobs with a little passion, innovation, and a whole lot of crazy.
I’ll always remember an article I read in BOP Magazine when I was about eight years old. For those of you who don’t know, BOP Magazine was where I got my fix of Leonardo DiCaprio glamour shots and Jonathan Taylor Thomas interviews (“My one wish? World peace.” Oh, the wisdom…). A fellow pre-teen wrote into BOP to let the world know that the Backstreet Boys were famous before *NSYNC, and so they were the better of the two. A wise-beyond-her-years Alison, 12, from Maine, retorted, “That’s like saying last week’s old milk is better than today’s new carton.” To my much wiser, well-versed, and experienced elders: I do not mean to call you old milk, necessarily. Nor do I mean to say that my peers and I are the freshest carton out of the fridge. Rather, we’re… a new kind of milk. And we’re not all the same. I may be the Almond Milk, my roommate the Soy, my sister the Lactaid. Really, what I’m getting at is that we’re worth something, and we’re not lazy. We’re just going about this thing we call life a little differently. And as a really quick side note and concluding thought from a 20-something who enjoyed the luxury of a paycheck on her last internship: start paying your (qualified) interns 🙂
Becoming an Adult: More than Ditching the Neon and Wayfarers
When I arrive past the dust of my Millennial youth, I hope I will remember the meat of things as more than neon-tinted vision, text-message based love affairs and rainbow displays of wayfarer sunglasses. Instead of plot lines in a life story, these ephemeral phenomena will set the tone of a realized youth. They’ll serve as the glowing Instagram filter coating the everyday forks in the road. The golden aura lighting this age of possibility. Behind the irreverent tweets and the ever-revolving viral memes, I’ll see not a transition into adulthood. Rather, I’ll see the image of my youth comfortably yet ambiguously straddling the line between digital girl and analog woman.
Where a prior generation’s analog woman may be established in reputation, a master of her niche, the digital “girl” may appear fragmented, spread over various social networks: Pinterest boards on jewelry organization, Twitter afterthoughts on the White House Correspondents’ Dinner, Facebook mobile uploads of an epic sushi dinner. Different mediums call for different correspondence, multiple modes of self-expression. I’ve heard baby boomers say, “I just don’t have the time to keep up with that many outlets.” There is nothing wrong with this statement. It’s a lot to keep up with. But my peers and I don’t really have a choice. For many of us, engaging with this many outlets is not only second nature; it’s something we’ve evolved a need for.
The internet, open-access and the nature of our “beta world” have conjured up an environment foreign to many of our elders. They call my generation the digital natives, and we’ve grown up in a technological petri dish our entire lives. For us, the so-called “digital self” was not a new persona or presence that had to be developed and understood after establishment in the analog world. By the time we had our first AIM screen names, we knew pretty much nothing about the “adult world.” We ventured through adolescence, developing our analog selves alongside our digital.
So for us, developing into an adult is somewhat of a gray area. Qualities that defined adulthood in the past are changing. The foundations of our persons are rooted in a completely different realm than our parents’. So the question of youth versus adulthood is a tough one, the line blurry and obscure. There is some underlying classification of the digital as the eager, progressive, wide-eyed young’un ready for revolution, whereas the analog is a stuffy biz exec, talking at a boardroom, following the standard protocol of a 20th-century business model. Neither the digital nor the analog should be constrained to an age group, a limited arena or path. Each has its place in our developing world. But when it comes to our identities, could we possibly be both? Could I at once be a digital woman as easily as an analog girl? Is Instagramming keeping us younger, starry-eyed, and dreaming even longer, past the years of Spring Fair yearning and late nights in the library? More importantly, is this kind of digital-social behavior hindering our transition into the adult world?
If you ask me, I say no. The virtual realms we interact in every day are certainly changing how we’re growing into adulthood, but I wouldn’t say they “hinder” our development. The conceptions of leadership, maturity, and achievement are changing, and they’re changing along with the ways we are actively altering our progression into adulthood. I don’t think posting pictures or making a wisecrack observation about a movie is a sign of self-importance and thus immaturity. I think it’s an exploration in expression, and a signature of this age’s obsession with “sharing.” Sharing feelings, sharing links, sharing e-books. Share I will, while I feel the need; and in the thick of my youth, the need feels present. More importantly, it feels like a beautiful time to be a young adult, writing the conditions of our stories as we go along…
Emotional Addicts: Get Your Fix by Remixing Your iPhone App Folders
Once I crossed over into technological adulthood and started organizing my iPhone apps, I couldn’t understand those amateurs who just throw apps around without purpose. I’ve re-organized the system a few times as I acquire more and more apps, but it’s become pretty intuitive which apps fall into Tools versus Information, and which apps get the bonus bump up to Social. Yet as I was about to show a friend my sleek setup, he absolutely upstaged me. After reading an article on verb- or action-phrased folder names (Play, Listen, Look Up) versus function-based ones (Productivity, Social Media), he was inspired to change his folder names to something a bit more intuitive. Games are found in a folder called “Weeeee”; utilities that don’t give him a huge reaction, like Calculator or Reminders, go in “Meh” (incidentally his largest folder, which has also yielded a Meh Vol. 2). Viewing apps like HBO Go and Hulu are labeled “Ahh,” and, my personal favorite, social tools like Facebook and Twitter live in “Ooh.”
If I categorize my apps this way, I’m literally attaching an emotional response to the software associated with those feelings. The apps on the screen are clustered according to their potential to elicit a mental response. So every time I want that “Ooh” feeling of social connection or digital gazing, my thumb gravitates to that folder. As my muscle memory takes over, I’ll find my physical self navigating to the Ooh folder when I subconsciously want to feel social warmth. If I find another app, a new social tool (say, Instagram) that has that same power, I’ll put it in the Ooh folder. When I’m craving more “Ooh,” I’ll click it again, finding not just a new app easily accessible, but a familiar feeling. As this association goes deeper, we become stage-5 clingers to our phones (and technology in general). It begins to sound like an addiction, doesn’t it? Of course, that’s what happens when we begin to associate our internal emotions with anything external.
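For fun, here’s what that emotional indexing looks like spelled out as a data structure. This is a toy sketch in Python, not anything iOS actually exposes; the folder names come from my friend’s setup above, and the apps and helper function are my own illustrative inventions.

```python
# A playful sketch: emotion-named folders as a dictionary keyed by the
# feeling each app elicits, not the function it performs. All folder
# and app names are illustrative, borrowed from the examples in this post.
folders = {
    "Weeeee": ["Angry Birds", "Temple Run"],  # games
    "Meh":    ["Calculator", "Reminders"],    # low-reaction utilities
    "Ahh":    ["HBO Go", "Hulu"],             # passive viewing
    "Ooh":    ["Facebook", "Twitter"],        # social warmth
}

def file_new_app(app_name, feeling):
    """File a new app under the feeling it triggers."""
    folders.setdefault(feeling, []).append(app_name)

# Instagram goes where the feeling lives, not where a category chart
# says it belongs.
file_new_app("Instagram", "Ooh")
print(folders["Ooh"])  # ['Facebook', 'Twitter', 'Instagram']
```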
We can try to technologically detox. We can give up Facebook for Lent, vow to check our e-mail only three times a day, and limit mindless, directionless trips into the interwebs. We can try. And some will succeed. But the real question is what will grow faster: our willpower, or the attractiveness of our technologies?
Tip of the Tongue, Now Located in Your Macbook!
You know when there’s a thought on the tip of your tongue, but you just can’t get it out? Holding tightly to some magnetic tastebud, the unspeakable words torture you, your brain unable to get its act together, take the elevator down to your vocal cords and push that baby out. What was I going to say? What was I going to do? It’s frustrating.
One time, I had a pretty magical tip-of-the-tongue moment. Irate with my inability to remember what I was about to say or do, I kept fiddling away on my keyboard at a probably over-analyzed Gchat conversation. I subconsciously hit my CTRL+V paste shortcut, and what I saw amazed me. It was a sentence I’d overzealously typed to my partner in conversational crime and decided to hold onto for later use. But it was not just any mildly out-of-line thought. It was that verbal foliage that had bloomed in my head, at one time something important I felt the need to say. Yet it withered away in absent-minded distraction; my brain decided to let it go once it was copied into my computer. That tip-of-the-tongue thought wasn’t really on the tip of my tongue. It wasn’t even in my subconscious once it made its way to the screen via my fingertips. It was simply a sentence stored in my other subconscious memory system: my computer. My virtual-mental cloud, for when my actual mind is clouded.
I often talk of my third hand(s): my iPhone, which has taken over so many of my previously cognitive-based functions over the past 5 years, and my laptop, which has been a portal into different worlds, constantly affecting conscious thoughts of my own reality. But now, the locus of my consciousness isn’t just centered in my own hardware. My human capability is not only enhanced or aided by the power of my external technological parts. I’ve made a shift, where my subconscious thoughts are located in some intangible cloud that I can’t necessarily see or touch. My brain has lost the ability to hold onto a thought of my own creation, one I’ve decided to store in the ever-changing copy-paste pocket of my computer.
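If you wanted to make that copy-paste pocket a little less ephemeral, a few lines of code could act as the external memory I’m describing. Here is a minimal sketch, assuming macOS (this is a Macbook post, after all), where the built-in pbpaste command prints the current clipboard; the polling loop and the log-file name are my own inventions.

```python
# Sketch of a "virtual-mental cloud": poll the macOS clipboard and save
# every unique thing that gets copied, so a lost thought can be re-found.
import subprocess
import time

def read_clipboard():
    # pbpaste ships with macOS and writes the clipboard to stdout.
    return subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

def remember_forever(logfile="tip_of_the_tongue.txt"):
    last = None
    while True:
        current = read_clipboard()
        if current and current != last:  # something new was copied
            with open(logfile, "a") as f:
                f.write(current + "\n---\n")
            last = current
        time.sleep(2)  # check every couple of seconds

if __name__ == "__main__":
    remember_forever()
```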
It makes me nervous, of course. What other functions of my cognition have been lost, or rather, replaced by my constant use of technology? Is the plasticity of my brain finding purpose through technical adaptation rather than humanistic mental work? Can the way my brain functions actually be changing as my technological tools become more powerful, more present, and ever more woven into my everyday activities?
In short: I have a feeling the answer is yes; my brain is changing. What will the ultimate effect of it all be? Well, if things keep up this way, then… wait, I was going somewhere with this… it’s right on the tip of my tongue…
Don’t Taze Me Bot: On Respecting Your Technology
A group of US Navy scientists is in the midst of developing a robotic jellyfish powered by the same material as an organic jellyfish: the water around it. Can you imagine the possibilities this presents? It means our electronics could be powered by the same natural elements as biotic beings. Cell phones could breathe air. iPads could re-charge on pizza. Laptops would function more efficiently after a great workout.
Alright, alright, maybe it’s not that simple (or whimsical), but think about the conceptions we have of artificial intelligence, cyborgs, and robots. Robin Williams’s Bicentennial Man plugs into the wall. Minority Report’s predictive pre-cogs live in an electro-wave-sensing jelly. We don’t see robots or future cyborgs as our equals. They are very much still considered the “other,” and to many, a negative external entity. But imagine a robot on your level of humanity, one that feeds off of the same fuels as you. Not that your Roomba is going to start sharing your love of a bangin’ BLT or anything, but rather that a personal-assistant ’borg might be able to sit down to dinner with you. It might sense that you used more basil in tonight’s pesto penne and compliment you on the vibrant flavor. It might note that you haven’t had a meal with fish in over a month, and ask whether the lack of protein is what’s been making you feel so sluggish. It might mention the one-day sale on tilapia and shrimp at the supermarket. It might remind you that the Comcast building is right next to the grocery store, and you’ve been meaning to trade out that old box for one with DVR. It might recommend the sitcom Parks & Recreation to watch tonight after dinner, suggested because you finished all the seasons of 30 Rock on Netflix (and because the ’bot has been missing Amy Poehler ever since she left SNL).
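Strip away the whimsy and that dinner-table ’borg is mostly bookkeeping plus a few rules. Here’s a toy sketch of the fish example; every meal, threshold, and suggestion below is invented for illustration, not any real assistant’s logic.

```python
# Toy rule-based "noticing": log meals, then flag patterns worth a nudge.
from datetime import date, timedelta

meal_log = [
    {"day": date(2012, 9, 20), "dish": "pesto penne", "protein": "none"},
    {"day": date(2012, 10, 1), "dish": "BLT", "protein": "pork"},
    # ...a month of dinners, none of them fish...
]

def suggestions(log, today):
    tips = []
    recent = [m for m in log if m["day"] >= today - timedelta(days=30)]
    if not any(m["protein"] == "fish" for m in recent):
        tips.append("No fish in a month. Feeling sluggish? "
                    "Tilapia and shrimp are on sale at the supermarket.")
    return tips

for tip in suggestions(meal_log, today=date(2012, 10, 21)):
    print(tip)
```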
Do you see what I am getting at? Maybe we wouldn’t view the robots of the future through a lens of apocalyptic fear if we were able to simply relate. There may be people in your life whom you (even subconsciously) view as “tools”: that girl from class I make outlines with before the test, that guy at work who always helps me figure out the espresso machine. They are helpers. Aids. Acquaintances. And they are treated as such. But when that guy at work who helps with the espresso notes that he saw your photos from the Mediterranean islands at your desk and suggests a place to buy real beans like they have in Europe, suddenly the relationship has been elevated. There is a personal connection. Your social discourse will evolve and expand as your relationship becomes leveled with one another.
And this is the future I see for the transhumanists, those who believe in not only the future of computing power, but the future of humanity as well. While many of us may fear the future of computers, I personally fear the future of humans. We yell at our computers when they take more than 30 seconds to boot up. We mutter “stupid iPhone” under our breath when we don’t have service. We are disrespectful to our electronics, yet we rave about how much we love them when questioned about their value to us.
Awareness, as always, is the key to avoiding internal collapse. Perhaps the future of our cyborgs can evolve into a digital friendship. At the least, a level of personal respect must be established with our electronics if we are to evolve in a healthy and symbiotic relationship. I mean really, do you think all those robots in the movies were revolting because they were getting too much respect?
How a Tongue Piercing Will Change the Lives of Paraplegics
Now THIS is what I’m talking about when I hashtag #thecyborgsarecoming on Twitter. For all you skeptics who think the merger of technology and the body is nothing but trouble, please check this article out. This clinical tongue piercing allows the user to control a wheelchair via sensors implanted in a retainer. Check it out here at Popular Science.
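For the curious, the core idea is simple to picture in code: sensors near the mouth report where the magnetic stud is, and software translates that position into a wheelchair command. The sketch below is my own hypothetical rendering of that mapping; the region names and commands are assumptions for illustration, not the actual clinical system the article describes.

```python
# Hypothetical tongue-position-to-wheelchair-command mapping.
TONGUE_COMMANDS = {
    "left_cheek":  "turn_left",
    "right_cheek": "turn_right",
    "front_teeth": "forward",
    "roof":        "stop",
}

def drive(sensor_reading):
    """Translate a sensed tongue position into a wheelchair command."""
    # Anything unrecognized fails safe to "stop".
    return TONGUE_COMMANDS.get(sensor_reading, "stop")

print(drive("left_cheek"))    # turn_left
print(drive("unknown_blip"))  # stop
```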