The History of the World (of Social Media), Part 1 – The Myth of Social Media

Focus can be a dangerous thing.

Have you ever played golf? I do on occasion. It’s generally not pretty. Whenever I get a golf club in my hands, even if it’s just at a driving range, I face a battle. The battle is between trying to be mindful of how to swing a thin stick made of carbon fiber composite with this funny little bend at the end at a ball in such a way as to make the ball go straight and far…and completely forgetting I’m doing any of that. Usually what happens is one or two good shots and then some wicked hooks and slices.

…and then that’s when I knocked over Doc Ock’s mailbox.

There are many pursuits that are similar — learning a musical instrument, playing video games, and for me — driving a stick shift. Everything goes along swimmingly until the moment you realize what the hell you are doing, and then it proceeds to all go sideways (which, if you’re driving a car, is usually the wrong way to go about any sort of forward progress). However, when it’s right, you forget what you’re trying to do, the world falls away, and you just do it.

What is usually referred to as “social media” is like that. The best examples of social media gone wrong are usually the result of someone over-thinking and trying too hard. It feels false to anyone who sees it, and thus whatever effect was intended is lost. (See the recent Applebee’s fiasco for an excellent example.)

Almost three years ago, I wrote a piece titled “A Look Back @ Twitter.” It was, by far, the most successful, most read piece of writing I’ve posted here and continues to be found and read (more than 2,000 views out of the more than 10,000 I’ve received on this blog). At the time I wrote it, I’d been on Twitter for almost three years but had only been using it seriously for about a year. I have now been on Twitter for more than five years (it will be six next July) – which is like a millennium in web timescales. If you’re interested, I think the post is still pretty true and has held up relatively well, so it may be worth your time if you haven’t read it. I only bring it up to point out that since I wrote it, Twitter has changed — as well it should.

When the modern Internet developed (not back in the ARPA days, but more recently in the boom times of the 90s when the idea of it went mainstream), it was a tool in search of a problem. A number of people and companies came forward, sure in the knowledge they had figured out the secret, and they tried to make the Internet and the World Wide Web (that term sounds so freaking archaic now!) fit that vision: commerce, communication, whatever. Most of them failed. Some had some success and everyone kept trying because it was just this huge, wondrous thing that everyone knew would be vital…somehow.

Then came “Web 2.0” — a defunct marketing term if ever there was one — and after that “social media.” While there was much corporate verbiage thrown about related to leveraging communication, targeting consumers, engaging audiences, and other such nonsense, what it all basically boiled down to was a bunch of people throwing stuff up on the wall and seeing what stuck and never being quite sure why it did.

But what sticks is this: LIFE.

Zen on the Beach
“And we’ll be saying a big hello to all intelligent life forms everywhere. And to everyone else out there, the secret is to bang the rocks together, guys.” – Hitchhiker’s Guide to the Galaxy, Douglas Adams

People want to do what they’ve been doing since we started banging rocks together: find and acquire things (food, love, a good place to find the right kind of rocks), talk to other people about the things that interest them (food, love, what kinds of rocks are best to bang together), and knit themselves into a supportive social web of people that will make it a little easier to bear all the times when you can’t find food or love, or when you bang your thumb with a rock.

Where a lot of people (and companies spending obscene amounts of money) went wrong was in thinking that the “technology revolution” would change society. Instead, what it has meant is that technology itself has changed. If you’re as old as I am, you remember what technology used to be: centralized, top-down, and hierarchical. Think mainframes. Think Ma Bell. Think broadcast network television. It’s all still based on being pushy with electrons, but now it’s more often (but by no means always) crowd-sourced, bottom-up, and nonlinear.

And that brings us back to Twitter (and Facebook, Tumblr, Reddit, Google+ and anything else ever referred to under the umbrella term of ‘social media’). Some of the people I met online during my early Twitter days have bemoaned, as I have on occasion, that Twitter back then was more fun. No one knew what the hell we were doing and it worked. Most of my strongest relationships with people I’ve met online started during that time period, with many having become close friends in my life-away-from-keyboard (aka #LAFK).

Outside the Twitter bubble, I’m always surprised to still encounter a lot of antagonistic feelings about Twitter and other social media services, often expressed as: “Why would I want to post everything I do online?” “No one really cares to hear that I am having coffee and a bagel!” and so on.

What I think almost everyone missed (even the folks working at Twitter) was that the reason it works is not that it’s some genius piece of technology, that people are incredibly narcissistic, or that it’s a revolutionary communication tool — no, the reason it works is that it’s Life with a capital letter. Life is full of messy conflicts, vacillating between order and chaos, between the breathtakingly mundane and the prosaically entrancing. But when Life is presented concisely, with most of the uninteresting bits edited out, it’s pretty damn riveting.

Yep, pretty much.

So for all the self-described social media gurus, experts, wizards (and all the other inflated, meaningless titles) out there — stop it. Just stop it. Quit trying to con people into thinking that good technology requires an elite priesthood to understand or use it when the exact opposite is true. The better the technology becomes, the less separation there is between it and us. That’s the whole point really.

I guess this is my roundabout way of revisiting that “A Look Back @ Twitter” piece I wrote ages ago. Twitter has grown since then, and the ways we all use it have changed, but it continues to be a part of my life because it is inseparable from my life – I don’t mean I couldn’t live without it…just that there is no part of my life that hasn’t always had a place in how I use Twitter. As my wife knows better than anyone, my Twitter posts are a pretty damn accurate representation of who I am — random interesting bits I want to share, snarky commentary on things I don’t like, and keeping in touch with the people who are important to me. And the people I follow on Twitter reflect what I look for in the world around me – humor, intelligence, beauty, new ideas, and people basically not being dicks. For better or worse (and it’s probably both), it’s authentically me.

This is why the idea of “social media” is a myth. It’s not new; it’s the same thing humans have always been doing. We get too hung up on the details of the mode of communication and spend too little time focusing on what we’re communicating. This does not require a digital priesthood of gurus showing us the way; it does not require us to engage in the “right” way – it merely requires that we communicate in a meaningful fashion. This is true for corporations just as it is true for people.

Having a “social media strategy” is like having a “telephonic device strategy” at the beginning of the 20th century. If you have to compartmentalize a method of communication that thoroughly, chances are you’re doing it wrong. Technology and buzzwords change too quickly for that ever to work. Just be who you are and, as I wrote in my earlier piece, “a ‘tribe’ of like-minded people organically grows out of that.”

The more any technology allows that to happen, the more successful it will be and the more ubiquitous it will become. The further a technology or service strays from that by attempting to subvert, control, or manipulate (*ahem* Facebook), the less successful it will be.

Communication is older than humans. Older than mammals. Bees doing a dance to show the way to a food source, ants identifying others from their own colony — even these aren’t as far back as it goes. Think single-celled organisms releasing and receiving chemical signals. But for the past 100,000 years humans have communicated better than any other life form on the planet, and we’re still not that great at it a lot of the time. We’re getting better, though, and technology is the only real way it’s going to continue to improve.

What has been called “social media” is part of that improvement, I think, and at the moment, I still think Twitter does it better than any other similar technology. But identifying it all as something separate from “communication” is a pointless exercise – it’s like picking one fork of a river and saying “This is separate and distinct from everything else! I declare this water behaves differently than that water over there!” I’m pretty sure that would come as a surprise to a fish swimming downstream.

I’ll conclude here with my not-so-secret secret for social media success, if that kind of thing is important to you: Just stop thinking about what you’re doing and be who you are — and if you don’t like the results of that, try becoming who you want to be. If you can do that, I promise you will never need a social media guru. If you can’t do that, the problem isn’t how you use social media, it’s you.

Yes, the title of this piece is a reference to the Mel Brooks movie, and like the movie, I’m not sure there will ever be a part 2.

Let’s Not Forget What the Enemy Is

. . . bad code! Some random thoughts on a technological love triangle, why I hate Microsoft, and how the world inside our computers should be more like the world outside of them

So, Apple and Adobe aren’t getting along. And Google and Apple are on the outs. There are cries that Apple/Google/Adobe needs to be more open and transparent. Google needs this. Adobe needs that.

What fools these mortals be! Let’s all stop for a second and remember where we’re coming from. Picture, if you will, a timber-industry forest . . . row after row of trees, and not just trees, but all the same species, oftentimes trees that are essentially identical. That’s the landscape that existed 10 years ago, brought to you by Microsoft. Remember them? Small little company up someplace in the Pacific Northwest called Redmond, I think. They’ve accounted for 85%-90% of the client-side OS market share for seemingly forever. Oh, and in that article link (published in 2000), an industry analyst said “the Mac OS continues to be a non-threatening element in the market.”

[Forest picture taken someplace close to Redmond, I’m sure]

So why do I equate Microsoft with a monotonous, undifferentiated forest? Well, there’s the obvious dominance of the market, but also because, despite what they claim in every press release, they are the antithesis of innovation. Oh, they may purchase a company and/or technology once in a while and bring it out to the masses, but once that happens, it’s dullsville. The technology languishes. This isn’t because of some malevolent effort by the Microsoft suits — no, it’s just a function of the fact that Microsoft is essentially as old-school as IBM at this point. They’d love to innovate, but that market share of installed tech is an albatross around their necks. By the time they’re able to integrate a new technology, they’ve reached a decision-tree fork: option a) continue to expand the use and integration of that technology, or option b) shoehorn in whatever newer technology is out there that people are clamoring for. As far as I can tell, they always choose option b.

Why does all that matter when discussing Adobe/Apple/Google? Since the popularization of personal computing technology, there have always been only a handful of dominant players in terms of hardware and software, and very little balanced competition — usually you’ve got a dominant player and one or two also-rans that help keep the antitrust folks from breaking someone up. You’ve got close to a 90-percent chance that, if you’re reading this on a computer, it’s got an Intel processor and is running some flavor of Windows. In ecological circles, you know what they call that? Monoculture. And it is invariably a sign of a weak, sick, or damaged ecosystem. There’s never really been any evenly matched competition on PCs in terms of hardware or software. (Note: just to be clear, I’m not talking about sellers of PCs, e.g. Dell, HP, etc. — they don’t count, as they’re all selling the same damn thing.)

The lack of diversity and healthy competition shows — Windows, for all of its iterations, is a dinosaur. To mix my metaphors, every new version of Windows strikes me as just another layer of lipstick on the dinosaur. Great – they add the “Aero” interface to Vista and Windows 7, but can they bother to include something better than Notepad to edit text files? Of course not. They now support 64-bit, but can’t come up with a better file system than NTFS? You know how old NTFS is? 17 years! And the most recent version (used in XP, Vista, and Windows 7!) is 9 years old!

Compare that to Mac OS X or Ubuntu – not only do those OSes manage to introduce major new features fairly regularly, but along the way, basic tools and services get improved as well.

So is Microsoft the enemy? No, and I ascribe no malice to them either. The enemy is, as Steve Jobs might say, software that is crap. And at the moment, Microsoft’s OS, web browser, and office suite are all crap. From antiquated technology to awkward interfaces to a lack of support for standards, Microsoft serves as the perfect negative example of how to code software.

Which brings us back to our love triangle of Adobe/Apple/Google. I believe the 800-lb-gorilla-and-everyone-else paradigm we’ve been operating under for decades is about to change. And it shouldn’t be a question of who is more open and transparent, or who is more or less “evil” — the only things that should matter are: does the code work? Is it elegant in design and function? Does it fulfill a need? Other than that, there is no right or wrong. Now, that’s not to say that other personal preferences can’t come into play. You may only want to use open-source software, which is perfectly fine, but realize that what works best for you may not be the “superior” choice for everyone else. You may only want to use technology X and never technology Y, but again, while that may work for you, it almost certainly doesn’t matter to most other people. What we often forget is that most users don’t care how something works, just as long as it does.

The lesson Microsoft provides all of us is that a monoculture is bad. We need diversity not only in the hardware and software, but in the approaches to coding the software, to sharing or selling it, and to how it is used. That is the only way sustainable progress happens. Every time some blowhard pontificates on why some company’s product or service will bring about the end of the world, all they’re doing is displaying their backwards view. No company has that much control anymore – not even Microsoft. The computing world has gotten too big – too many devices, running too many different OSes, with too many different types of users – and that’s a good thing.

So let Apple keep being Apple – Steve Jobs turned around a company headed to the trash heap of history and made it a dominant player in industries that didn’t even exist 10 years ago. They will never be as open as some want, but that’s what works for them and their users, and you can’t argue with that. They control the horizontal and the vertical, and that’s part of why the user experience on any Apple device is superior to almost every competitor’s. No one is forcing anyone to use their stuff, so what’s with all the complaints?

And Adobe – don’t whine about Apple not accepting Flash for the iPhone. Flash is a crappy, kludgy platform that I hope to see disappear within 5 years. It’s a clumsy tool that 95% of the time is used to create clumsy applications, it’s an accessibility nightmare, and your own tools for developing in it reflect all that perfectly. Oh, and considering your own history of working with others who develop software to work with your products, I wouldn’t be complaining about Apple too loudly.

As for Google — are they a perfect company? No, and I don’t believe such a thing exists. But at the end of the day, they continue to push the borders of what we can do on the Internet, and they genuinely seem interested in doing things in the most responsible manner possible. Do they probably need to focus on actually finishing and/or fixing things occasionally? Hell yeah. And for all those worried that they’re going to be the next Microsoft – relax, there will never be another Microsoft. The technological and economic environments will never again sync up in such a way as to create something on par with Microsoft.

In my ideal world, this would be the progression for any new service or technology: Google should create or develop it, Apple should then take it and make the interfaces functional and intuitive, and then Adobe can take on the maintenance, expansion, and support of it. Actually, in my ideal world, these three companies and all the others out there would do all of these steps well, but that has about as much chance of happening as Microsoft delivering software that doesn’t make me start swearing.

Closing thought: Technodiversity? Should that be a term? If it isn’t, I think it should be. I believe that in the future it will become as important a topic as biodiversity. Just as in natural spaces, where biodiversity yields a healthier environment (more biomass, resistance to disease and pests, evolutionary progress), our cyberspace depends on technodiversity for the dissemination of information, security, and further advances in technology.