Nostalgia, Toys, and Making Connections in a Small World

Much has been said of how the Internet has made the world smaller and more connected. So often, in fact, that it has become cliché to comment on it at all. But occasionally you get reminded of it so strongly that you can’t help but shake your head in disbelief …and feel just a touch of wonder.

Let’s go back to 1980. There’s a small boy, nine years old, sitting on the floor of his family’s living room staring with quiet intensity at what’s before him. We notice he’s small for his age, both in height and in weight (not quite at the point where he disappears if he turns sideways, but it’s a near thing). He has large blue eyes, a mop of dark blonde hair, and a head that is quite a bit larger than the rest of him. He’s been sitting where he is for about two hours, quietly doing what makes him happiest — building.

He has a Millennium Falcon and an X-Wing nearby, along with the requisite action figures. But he’s not playing with those right now. Instead he’s building an Imperial prison. Then he’s building a Rebel base. Now it’s another spacecraft, but one never dreamed of by the masters of model building and practical effects employed by George Lucas. This isn’t LEGO, ruled by right angles or poor stepped approximations of diagonals. This is something infinitely more flexible. It’s a construction set called “Ramagon” and it inspired that young boy like no other toy before or since.

[Image: vintage 1979 Ramagon 2000 construction system]

It also doesn’t exist anymore. It’s been out of production for years now, and I had even forgotten the name of it for a while.

As you might guess, that boy was me, and that Ramagon construction set was, without a doubt, my favorite toy ever. It had a unique hub-and-strut building system that allowed you to make beautiful and strange creations that were not only large but also looked like the very definition of “the future,” circa 1979. While you could still make right angles, you could actually make connections in twenty-six separate directions off a single piece. And while the structures you built looked and felt lightweight, they were substantial and sturdy.

It was not only a fascinating toy to build with for its own sake, it was also the perfect way to build things that you could use with other toys and action figures. With triangular and square panels, you could create platforms and give your creations heft and solidity. Without the panels you could create airy, skeletal constructions that looked a lot like the space station NASA was planning at the time. I built elaborate worlds for my Star Wars toys. I built towers taller than I was. But the most fun I had was just building really complex geometrical shapes and seeing what I could do with them.

[Image: Ramagon Micro base]

I got older of course, and my Ramagon set eventually disappeared – probably in some charity donation. But I played with that set for a good six or seven years. Looking back later I realized that it hadn’t been just a toy used for entertainment, but something that helped me learn problem-solving and spatial visualization. I learned how to break big problems down into smaller pieces. I learned to balance having a plan with spontaneity and imagination. And while I love LEGO too, just connecting one brick to another isn’t very exciting – the building process with LEGO felt like a grind, the focus being on what you were building more than how you built it. Ramagon on the other hand opened up a whole world of possibility — not only allowing you to think about making connections in all directions, but encouraging it.

Flash forward a number of years: I now had two children of my own, and I wanted to give my kids the same toy I’d had and, more importantly, the same experience I’d had. The first hurdle was one I’m ashamed to admit: while the toy had stayed fresh in my memories, I’d forgotten its name decades ago. I did a lot of web searches for “1980s construction toy” and looked at a lot of pictures. I even searched for “1970s construction toy” as, with a child’s self-centeredness, I had no idea how long it had existed before I got mine.

Finally I had my eureka moment and found references and pictures on some sites that listed older toys. It was… Ramagon. Honestly, how I forgot a name like that I’ll never know. And to be fair, the Ramagon pieces were never emblazoned with a brand name the way LEGO pieces are.

Well, now I had a name, but my jubilation was short-lived. It turns out that by the time my kids were old enough to play with Ramagon and I went looking for sets, they had been discontinued. I was crushed. As parents, we all tend to want our children to be introduced to the things we loved best from our own childhoods, and it looked like I wasn’t going to be able to do that. This was especially discouraging as I thought that Ramagon was the ultimate building toy, one that could be enjoyed by both my son and my daughter. Both of them already have tons of LEGO, and the later Ramagon sets had added panels that allowed kids to integrate their creations with LEGO bricks. I knew they’d love the possibilities it represented. It was frustrating knowing the perfect toy existed at one point but was now effectively gone.

I’d occasionally look for people selling Ramagon sets and would find some outrageously priced sets on eBay, sigh dramatically, and go about my business. My kids continued to get more and more LEGO sets and other construction toys and I continued to comment “Those are cool, but back in my day, I had the perfect building set…” They would roll their eyes and go back to what they were doing.

In the second half of last year I started wondering about where Ramagon came from. Who had invented it? It’s funny – so many commercial toys are completely divorced in the public mind from the person who invented them. Big toy companies don’t have much interest in promoting creative talent the same way tech companies do (obvious break-out hits like Rubik’s Cube being the exception). But I had a feeling that it would be possible to identify a single individual as the inventor – the set, its history, and everything I’d found out so far made me feel like this was someone’s passion, not the result of corporate focus groups and demographic targeting.

I’d already learned that it was never a toy in the same league as LEGO or Erector (or the later K’NEX) in terms of popularity, and I was met with blank stares and shrugs whenever I told people about it. After a consulting job that had me researching various patents, I decided to try looking through registered patents to see if I could find the person who had created, in essence, some of the happiest moments of my childhood.

Thanks to the Internet, and specifically Google, searching patents is much easier than it used to be. That said, trying to find a patent without knowing the inventor or even the company that originally manufactured the toy (I knew the license had changed hands over the years) is very difficult, especially as the Ramagon name itself likely wasn’t even going to be mentioned in the patent (though later patents for similar toys did mention the toy by name). After much searching and looking at crazy toy designs (most of which were probably never sold anywhere) I found it: U.S. Patent 4129975 A. Inventor: Richard J. Gabriel.

So Mr. Gabriel invented the toy I still thought about all these years later. My question was answered, but I didn’t know what to do with that information. However, as I sometimes do, I drafted a letter in my head, thanking Mr. Gabriel for having created something that meant so much to a quiet, shy kid who found a way to express himself by building what he saw in his imagination. I was sure it was a letter that would never be sent. How could I even find him to send it? Would he even care? Was he even still alive?

And once again we come back to the point I made at the beginning – the world is smaller than it used to be. I grew up at the end of the era of three TV networks and rotary phones, and while I’m frequently an early adopter of new technologies, I can’t say that my thinking isn’t a little colored by a worldview now several decades out of date.

I went ahead and searched using Mr. Gabriel’s name and the word “Ramagon.” I found quite a few hits, mostly the meta cruft that is often associated with business listings. Lots of information, but none of it especially useful. I paged through more results, and finally… unbelievably… I found not just a website, but his website. Fittingly, he’s been an architect for more than 25 years, and there on his website was his email address.

I typed out basically what I’d already drafted in my head and sent him an email, not really expecting anything, but just wanting more than anything to say “Thank you.” That same day I received a reply from his wife Ann letting me know he’d get back to me in a couple of days. I was astounded.

Richard (and his wife Ann) wrote back and thus began a correspondence we’ve sporadically maintained in the midst of busy schedules. Richard and Ann have led fascinating lives, and I’ve loved hearing about what they’ve done and what they have planned. I even managed to provide a little help to them involving web design and online marketing. It was literally the least I could do in return for what I’d already received from Richard. I consider myself lucky to now count Richard and Ann as friends.

This had all started with the itch of unsatisfied nostalgia. I had gone looking for an old toy, and by extension, my childhood. I wanted to find a way to express appreciation for something that gave me so much joy as a child. I found so much more than that.

I found a link to my past that gave me a new perspective. I found new friends it felt like I had known for years. And thanks to the unbelievable generosity of Richard and Ann, I found something else too. In the mail this week, I received the following:

[Images: the Ramagon sets that arrived in the mail]

Richard had, at my request, even signed the boxes for me. And with that, I was finally able to pass along to my children that idolized toy from my childhood. And along with it, a connection to a world that is both smaller and more amazing than the world I lived in some thirty-five years ago.

From the moment I pulled the sets out of the box they were shipped in, my kids’ eyes lit up. There were appreciative oohs and ahhs from both of them. My oldest, who just turned 13 and who has begun to have a pretty good idea of the value of such things, commented “It almost seems a shame to open them up.” I answered back “It would be a bigger shame not to.” And with that, we set about building.

I may have bogarted the toys a bit at the beginning. The pieces felt comfortably familiar in my hands. The click as pieces came together provided the same satisfying completeness it had so many years ago. We built a spaceship. We built a Martian base. We built.


This isn’t a story about nostalgia, or toys, or being an uber geek about something (though it obviously includes all those things). For me, this experience has been about the sort of connections possible in the small, connected world we live in, and the connections that exist within ourselves. How those connections can go off at any angle but, together, make something beautiful, strange, and the very definition of “the future.” It’s been about how, when things click together just right, there’s a sense of completion.

And I hope for Richard that this is a story about how if you build with passion and creativity, as he did, what you built will last far longer than you could have dreamed.

I want to once again express my heartfelt thanks and deepest appreciation to Richard and Ann. Nine times out of ten, or maybe even ninety-nine times out of a hundred, if someone in a similar situation had received my email, assuming they even read it, they’d likely just smile and move on. I think it says something that they didn’t. Maybe with all their experiences across the globe, they realize that while it may be a small world, it’s full of large stories and the greatest fun comes either from making your own or from being a part of as many of them as you can.

The History of the World (of Social Media), Part 1 – The Myth of Social Media

Focus can be a dangerous thing.

Have you ever played golf? I do on occasion. It’s generally not pretty. Whenever I get a golf club in my hands, even if it’s just at a driving range, I face a battle. The battle is between trying to be mindful of how to swing a thin stick made of carbon fiber composite with this funny little bend at the end at a ball in such a way as to make the ball go straight and far…and completely forgetting I’m doing any of that. Usually what happens is one or two good shots and then some wicked hooks and slices.

…and then that’s when I knocked over Doc Ock’s mailbox.

There are many pursuits which are similar — learning a musical instrument, playing video games, and for me — driving a stick shift. Everything goes along swimmingly until that moment you realize what the hell you are doing, and then it proceeds to all go sideways (which, if you’re driving a car, is usually the wrong way to go about any sort of forward progress). However, when it’s right, you forget what you’re trying to do, the world falls away, and you just do it.

What is usually referred to as “social media” is like that. The best examples of social media gone wrong are usually the result of someone over-thinking and trying too hard. It feels false to anyone who sees it, and thus whatever effect was intended is lost. (See the recent Applebee’s fiasco for an excellent example.)

Almost three years ago, I wrote a piece titled “A Look Back @ Twitter.” It was, by far, the most successful, most read piece of writing I’ve posted here and continues to be found and read (more than 2,000 views out of the more than 10,000 I’ve received on this blog). At the time I wrote it, I’d been on Twitter for almost 3 years but had only been seriously using it for about a year. I have now been on Twitter for more than 5 years (it will be 6 next July) – which is like a millennium in web timescales. If you’re interested, I think the post is still pretty true and has held up relatively well, so it may be worth your time if you haven’t read it. I only bring it up to point out that since I wrote it, Twitter has changed — as well it should.

When the modern Internet developed (not back in the ARPA days, but more recently in the boom times of the 90s when the idea of it went mainstream), it was a tool in search of a problem. A number of people and companies came forward, sure in the knowledge they had figured out the secret, and they tried to make the Internet and the World Wide Web (that term sounds so freaking archaic now!) fit that vision: commerce, communication, whatever. Most of them failed. Some had some success and everyone kept trying because it was just this huge, wondrous thing that everyone knew would be vital…somehow.

Then came “Web 2.0” — a defunct marketing term if ever there was one — and after that “social media.” While there was much corporate verbiage thrown about related to leveraging communication, targeting consumers, engaging audiences, and other such nonsense, what it all basically boiled down to was a bunch of people throwing stuff up on the wall and seeing what stuck and never being quite sure why it did.

But what sticks is this: LIFE.

[Image: Zen on the Beach]
“And we’ll be saying a big hello to all intelligent life forms everywhere. And to everyone else out there, the secret is to bang the rocks together, guys.” – Hitchhiker’s Guide to the Galaxy, Douglas Adams

People want to do what they’ve been doing since we started banging rocks together: find and acquire things (food, love, a good place to find the right kind of rocks), talk to other people about the things that interest them (food, love, what kinds of rocks are best to bang together), and knit themselves into a supportive social web of people that will make it a little easier to bear all the times when you can’t find food or love, or when you bang your thumb with a rock.

Where a lot of people (and companies spending obscene amounts of money) went wrong was in thinking that the “technology revolution” would change society. Instead what it has meant is that technology has changed. If you’re as old as I am, you remember what technology used to be: centralized, top-down, and hierarchical. Think mainframes. Think Ma Bell. Think broadcast network television. It was all still based on being pushy with electrons, but now it is more often (but by no means always) crowd-sourced, bottom-up, and nonlinear.

And that brings us back to Twitter (and Facebook, Tumblr, Reddit, Google+ and anything else ever referred to under the umbrella term of ‘social media’). Some of the people I met online during my early Twitter days have bemoaned, as I have on occasion, that Twitter back then was more fun. No one knew what the hell we were doing and it worked. Most of my strongest relationships with people I’ve met online started during that time period, with many having become close friends in my life-away-from-keyboard (aka #LAFK).

Outside the Twitter bubble, I’m always surprised to still encounter a lot of antagonistic feelings about Twitter and other social media services, often expressed as: “Why would I want to post everything I do online?” “No one really cares to hear that I am having coffee and a bagel!” and so on.

What I think almost everyone missed (even the folks working at Twitter) was that the reason it works is not that it’s some genius piece of technology, that people are incredibly narcissistic, or that it’s a revolutionary communication tool — no, the reason it works is that it’s Life with a capital letter. Life is full of messy conflicts, vacillating between order and chaos, between the breathtakingly mundane and the prosaically entrancing. But when Life is presented concisely and with most of the uninteresting bits edited out, it’s pretty damn riveting.

Yep, pretty much.

So for all the self-described social media gurus, experts, wizards (and all the other inflated, meaningless titles) out there — stop it. Just stop it. Quit trying to con people into thinking that good technology requires an elite priesthood to understand or use it when the exact opposite is true. The better the technology becomes, the less separation there is between it and us. That’s the whole point really.

I guess this is my roundabout way of revisiting that “A Look Back @ Twitter” piece I wrote ages ago. Twitter has grown since then, and the ways we all use it have changed, but it continues to be a part of my life because it is inseparable from my life – I don’t mean I couldn’t live without it…just that there is no part of my life that hasn’t always had a place in how I use Twitter. As my wife knows better than anyone, my Twitter posts are a pretty damn accurate representation of who I am — random interesting bits I want to share, snarky commentary on things I don’t like, and keeping in touch with the people who are important to me. And the people I follow on Twitter reflect what I look for in the world around me – humor, intelligence, beauty, new ideas, and people basically not being dicks. For better or worse (and it’s probably both), it’s authentically me.

This is why the idea of “social media” is a myth. It’s not new; it’s the same thing humans have always been doing. We get too hung up on the details of the mode of communication and spend too little time focusing on what we’re communicating. This does not require a digital priesthood of gurus showing us the way, it does not require us to engage in the “right” way – it merely requires that we communicate in a meaningful fashion. This is true for corporations just as it is true for people.

Having a “social media strategy” is like having a “telephonic device strategy” at the beginning of the 20th century. If you have to compartmentalize a method of communication that thoroughly, chances are you’re doing it wrong. Technology and buzzwords change too quickly for that ever to work. Just be who you are and, as I wrote in my earlier piece, “a ‘tribe’ of like-minded people organically grows out of that.”

The more any technology allows that to happen, the more successful it will be and the more ubiquitous it will become. The further a technology or service strays from that by attempting to subvert, control, or manipulate (*ahem* Facebook), the less successful it will be.

Communication is older than humans. Older than mammals. Bees doing a dance to show the way to a food source, ants identifying others from their own colony — even these aren’t as far back as it goes. Think single-celled organisms releasing and receiving chemical signals. But for the past 100,000 years humans have communicated better than any other lifeform on the planet, and we’re still not that great at it a lot of the time. We’re getting better at it though and technology is the only real way it’s going to continue to improve.

What has been called “social media” is part of that improvement I think, and at the moment, I still think Twitter does it better than any other similar technology. But identifying it all as something separate from “communication” is a pointless exercise – it’s like picking one fork of a river and saying “This is separate and distinct from everything else! I declare this water behaves differently than that water over there!”  I’m pretty sure that would come as a surprise to a fish swimming downstream.

I’ll conclude here with my not-so-secret secret for social media success, if that kind of thing is important to you: Just stop thinking about what you’re doing and be who you are — and if you don’t like the results of that, try becoming who you want to be. If you can do that, I promise you will never need a social media guru. If you can’t do that, the problem isn’t how you use social media, it’s you.

Yes, the title of this piece is a reference to the Mel Brooks movie, and like the movie, I’m not sure there will ever be a part 2.

A taxing issue: Why California is being stupid

As some of you may have heard, California just enacted an Internet sales tax. Some states have already done so, and in difficult budgetary times, who can blame them – right? I mean, if you walk into a store and buy something, chances are you have to pay some sort of tax on it. And when it comes to budgetary difficulties, none of the other states can really hold a candle to California – I mean, they’re not just broke, they’re like super-broke. As it happens, if California were a country, its economy would be on the scale of Spain or Italy. So they’re like a broke 800-lb gorilla, and they’re turning to companies like Amazon and saying “Hey bub, pay up!” (okay, so now it’s a broke talking 800-lb gorilla — sue me and my overtaxed metaphors…and puns)

For me, the problems with what California (and other states) is doing were hammered home on my drive into work this morning while listening to this story on NPR’s Morning Edition. Take, for example, this quote from the piece:

California needs money and state officials say the new sales tax could bring in two to three hundred million dollars a year. Amazon wouldn’t be paying more in taxes, just collecting the sales tax on customer purchases and passing it on to the state just like brick-and-mortar retailers do now.

You’ll hear that refrain from the bricks-and-mortar retail folks all the time. In fact, Bill Dombrowski, the head of the California Retailers Association had this to add:

In California, that meant [Internet retailers] had a 10-percent, roughly,  price advantage every day of the year, and shoppers were turning our stores into showrooms and then going out and shopping on the Internet.

This is presented as an issue of fairness and how they’re not really picking on Amazon because Amazon wouldn’t be paying the taxes, just their customers.

BULLSHIT

This is simply those with a failed business model teaming up with desperate policy makers to stifle online retail sales with one hand while, with the other, filling state coffers with money that is not the state’s due. “Hold on,” some will say, “why doesn’t California or any other state deserve its slice of that sweet, sweet revenue?” The answer is obvious – the states DO NOT support the infrastructure that allows those transactions to take place.

At their best, taxes are simply a method for any government to recover costs associated with supplying the benefits of government and infrastructure in a way that hopefully distributes the costs fairly.  You want roads? Fine, but they need to be paid for, and that means taxes. You want a water processing plant so your water is clean? Fine, but there will be taxes to pay for it. And so on. Are all taxes reasonable or fairly applied – hell, no. But that is the idea behind them, and there is a recourse if you’re, say, a California businessman who doesn’t like having to collect a state tax. You can work to vote out the people who put the tax in place and find someone else who agrees with you or even run yourself. What recourse do I have in Virginia, or someone in Washington, or Illinois for taxes collected in California? None. You know, I believe that’s called taxation without representation. Generally not regarded as a good idea since about 1776.

So let’s take your bricks-and-mortar retailer. The local, state, and federal governments provide law and order, police and fire departments, regulation of utilities, schools, roads, etc. All of which the retailer benefits from when there are people around who can easily and safely shop at their store and have the money to do so. And that’s why he pays his taxes. Not because the government is due some tithe like a church – no, governments collect taxes to pay for all the stuff we ask them to do.

Now, look at an online retailer. Do they need roads? They might if they’re producing goods that need to be shipped, but if they’re selling ebooks, well then, not so much. What about all those other government services? Again, they’re either not applicable or the retailer pays for them through local, state, or federal taxes wherever they actually have offices. The only absolute-must-have thing for most online retailers is the Internet, and the states pay absolutely nothing for the maintenance and operation of that.

So what does Amazon “owe” the fine state of California or any other individual state? Not a whole hell of a lot.

As for the bricks-and-mortar retailers who feel this is so unfair? I guess they should have listened to that kid who told them 15 years ago that selling stuff online would someday be huge. I mean, I do feel for them – they provide jobs, locally accessible tax revenue, etc. And some people like buying stuff in person from someone they know. Great! Brick-and-mortar shops that have great service and products will continue making money because the people who like to shop in person are willing to pay a little extra for that. But if your business model is so focused on price that the difference between having to charge tax and not is putting you in the red, chances are it’s your business model that has failed, not the government for collecting taxes.

Going back to the point above about states paying nothing for the maintenance and operation of the Internet — so who does pay? Well, the federal government created it and does provide national regulation of a sort across it, so perhaps it’s due something. But the infrastructure that provides us the Internet has already been bought and paid for — by us. First, obviously, through federal taxes to fund things like DARPA and the creation of the Internet. But that’s really just a pittance and has almost no bearing on the Internet today. No, the existence and continued maintenance of the Internet is provided by telecommunications companies that own the lines, the networking equipment, and the servers, and they in turn charge people and businesses for services. No government “runs” the Internet – so unlike those non-Internet superhighways, state governments really can’t make a case that they’re doing anything else but reaching into people’s wallets and grabbing a few bills because they feel like it… er, because they really need it.

Now, as a commie-pinko liberal, I like taxes as much as anybody else — okay, not really, but as I stated above, when done right, they are justified. A minimal Federal tax on Internet purchases is within the realm of reason, I believe. Something small, like a 1- or 2-percent tax. It could go to pay for Federal oversight of the Internet infrastructure within the U.S. and the companies that own it, as well as initiatives to pay for research into faster bandwidth and carrying broadband to rural and poor areas. I believe a case can be made that without federal oversight, online retailers would find the Internet a much more hostile place to do business, so the tax would be justified. And if there is any leftover money, add it into federal transportation expenditures so that all those highways that trucks drive on to deliver us our online-purchased goods are kept well-maintained.

That’s a reasonable discussion to have. How much, what exemptions might be needed, etc. There are details that will need to be worked out, but they should be worked out at the national level, since that really is the only place all of us have the representation to have our voices heard. And as it is largely an issue of interstate commerce, the federal government’s role is well precedented.

However, for anyone to claim that companies like Amazon are engaging in “blackmail” when they, as Amazon did, cut off relationships with affiliates or close offices as a result of state-based sales taxes being enacted is idiotic. It’s like a mugger getting miffed that you don’t come around anymore after having mugged you repeatedly in the same neighborhood. Even worse, a patchwork quilt of state-by-state online sales taxes would have to be just about the most inefficient collection of revenue ever conceived. Also, since some states will see the advantage of NOT taxing online sales, what will happen is that online companies will keep their headquarters, and do business, only in those states.

And I’m sorry, California, but your Internet sales tax is ill-conceived, counter-productive, and in my opinion probably not legal. I know you all have already cut to the budget bone in some cases, but if you hadn’t tied up so much of your budget in nondiscretionary spending through voter referendums without also voting in the tax hikes to pay for them, you wouldn’t be in this mess to begin with. Maybe institute some taxes on bad movies, raise taxes on untalented celebrities, or look for other creative revenue streams? Hell, just tax your ex-Gov. Arnold every time he says or does something stupid — that should help.

What do you meme?

So I was looking through my news feeds and saw an article on Slashdot titled “Researchers Claim 1,000 Core Chip Created” and I knew without even looking at it that in the comments there would be at least one reference to a Beowulf Cluster and at least one each of “Will it run Linux?” and “Will it run Crysis?” And lo and behold, right there, out of 60 or so comments, there they were.

This got me thinking about memes, and specifically Internet memes, and why they have become so pervasive. The broad term itself was coined by Richard Dawkins in his 1976 book The Selfish Gene and refers to a transfer of an idea, symbol, or practice, sort of a cultural DNA if you will. So, in a sense, democracy is a meme, as is fascism. But so are the iconic happy face and the skull and crossbones. So it’s not just data, but data with cultural significance that replicates from person to person. Now, on top of that, add the Internet and you have an explosion of memes. Why?

The facile answer is that with all those bits flying about, we and the machines we use to communicate are more easily able to act as vectors for memes that would have been restricted regionally before the dawn of the Information Age (or, as I prefer to call it, “The Time of the Blinky Boxes”). I believe that’s probably a large part of it. With the explosion of various ways to communicate with others on a one-to-one or one-to-many basis, we are able to communicate like never before, and with that communication, the virus-like memes hitch along for the ride.

However, I believe it’s also a signal-to-noise ratio issue. Most of the transfer of any information across any medium contains large swaths that are just noise – transient bits of no value, and it’s up to the receiver to parse the signal for information of value. This is as true of radio waves as it is of you and your Aunt Agatha who is blathering on about her summer vacation to Maine when she was seven when all you really need to know is where the fire extinguisher is at. You’ll tune out your dear Aunt’s tale of youthful adventure and not remember a word of it, because all you really want to know is where that damn extinguisher is at. It’s important to you. It relates to you in that your cousin Freddy was playing with a magnifying glass and is currently burning up your pile of comic books. It has, in a word, significance.

With the Internet, we are able to attach significance to a variety of sources, and with the advent of social media, we’re able to have those sources be actual people, but such interactions long predate anything referred to as “Web 2.0” or even “Web” anything for that matter. Internet memes are the progeny of similar interactions that took place on dial-up bulletin boards, website forums, and USENET groups way back when we wore onions on our belts because that was the style at the time. In some cases, for especially long-lived memes, they have actually survived the technological tsunami and been around from that day to this. Need an example? Think emoticons.

Whether it was onion-on-your-belt olden times or today, I think a large part of the reason we continue to exchange these memes like crazy is that when dealing with other people over the Internet, we’re communicating in a fundamentally different way than ever before. While people have written letters to each other as long as there’s been writing (“Dear John, Sorry I haven’t written sooner, but my tribe just recently developed an alphabet.”), exchanges over the Internet are often in real-time or near-real-time. There’s the immediacy of in-person contact, but without all those biological and cultural cues that we’ve evolved to deal with and understand each other. With that comes the risk not only of misunderstanding (all too easy in nonverbal, non-face-to-face communication) but also, without those cultural cues, a lack of information on that sort of significance I mentioned earlier.

In a way, culture is a way for hairless apes to get together and decide what has significance and what doesn’t. And part of what keeps those cultures coherent and stable is the exchange and affirmations of what those things are. So when we interact with each other over the Internet, battling the lack of geographic and temporal cohesion, we seek to reinforce the cultural ties we have by sending out these little memes and expecting to have them transmitted to us as well. It’s how we evaluate what is the signal and what is the noise. When it comes down to it, there is an Internet culture, or rather several, and we all seek to maintain our ties to what we identify with most.

And I guess this is where my ponderings ended up. We’ve all seen “news” articles or TV show segments about how technology is “ruining” us and our young, and how our culture as we know it is under siege. I disagree with almost all of that, as I am a firm believer in different not being bad (or good, for that matter). It’s just different, and the emergence of new nongeographic and atemporal cultures will bring about both good and bad in probably about the same degree as every other culture humans have ever created. But they are indeed unique cultures, and it’s the memes we trade back and forth that identify them as new and separate entities. Oh, and the part above that I did agree with? Every culture, at every moment in its existence, is under siege — either from older cultures trying to wring out a few more moments of existence, or from newer or more successful cultures looking to supplant it. Survival of the fittest — the oldest meme of all.

So please continue to use “All your base are belong to us,” “FUUUUUUUUUUUUU,” and hashtag your Tweets with obscure references to movies, comics, games, and LOLcats. It’s who you are, and it’s who I am, and it helps tie together the whole messy ball that is Geek Culture. It’s double-rainbow, all the way.

Note: I am not trained in the study of memetics, and in case someone so trained comes across this post at some point, I offer my pre-emptive apology. I probably got some stuff wrong, but I did try and find some good sources to balance out my ignorance on the subject, including one article from The Guardian on Internet memes from 2000 titled “It’s all in the memes.” Any errors I blame on being reminded of dancing hamsters.

Walking on Water

Apple Computer. Barack Obama.

Wondering where I’m going with this? What do these two have in common? High expectations . . . I would even venture to say, extraordinarily high expectations. Perhaps even unreasonably high expectations?

I am 40 years old, and I was around for the original Mac vs. PC war, if you can even call it that. Apple was Betamax to Microsoft’s VHS (see, I told you – even my metaphors are old) — a technically superior alternative that never seemed to get the traction it deserved (and yes, I know that Betamax went on to a long life in professional video circles, but it disappeared out of people’s homes). Steve Jobs was thrown over for a guy famous for selling sugar water. Apple languished, even having to rely on its old enemy Microsoft for a deal to help it limp along. Then Jobs came back, the iPod came out, OS X came out, Apple switched to Intel processors, and then came the iPhone. Suddenly Apple was not only a going concern, it was profitable, and it was leading not just one industry but several. Apple users could hold their heads high once again. Now of course, they also have the iPad, which if you count it as a computer (which I certainly do), has led Apple to be not only successful but the single largest manufacturer of computers in the US.

So when Apple devoted their homepage to advertising a big announcement yesterday, the rumor mill started grinding away with fresh fervor. A music streaming cloud service? An iOS update? Something that they’d managed to slip by everyone until now? Beatles on iTunes? Wait, what?! The Beatles haven’t been a band for decades, half the members of the band are no longer with us – how the hell could adding their music to iTunes be relevant or considered important by anyone? Then the Wall Street Journal broke the actual story last night – the announcement was going to be about the Beatles coming to iTunes.

This morning, as the official word came down from Cupertino, I saw my Twitter stream fill with reaction — most of which seemed to land on a spectrum of emotional response somewhere between “meh” and “Oh, come on, who cares?”

Well, as it happens, I do. I will likely buy some of the Beatles songs – I have my vinyl collection and somehow never got around to buying the CDs, and because I believe in supporting artists (even those as rich as Sir Paul), I don’t pirate music. But that’s not the reason I really care. The reason I care is my kids —  my 8-year-old son and my 2-year-old daughter. I seriously doubt either of them will ever buy a CD. To them, it’s already a technological dinosaur. My son’s iPod Nano is filled with music that I’ve put on there for him – hundreds and hundreds of songs across multiple genres and from multiple eras. My son likes Johnny Cash and Tom Petty just as much as Bowling For Soup and They Might Be Giants. And I want him, and his sister, to experience the Beatles as well.

The Beatles were progenitors of so much that has happened in music since their start 50 years ago. They were only together as a band for 10 years! And yet everyone knows who they are, and they are an influence, in one way or another, on pretty much anyone who has ever picked up an instrument and wanted to play a song for somebody else.

Will the Beatles being on iTunes change the face of computing, technology, music, or anything else? No. But will it mean that the generation of my kids and all the generations after them will have a better chance to discover something wonderful? I think so. As Jon Stewart has spent a bit of time alluding to recently, the current environment for public debate and news is an overheated, blazing ball of hot air that somehow manages to shed no light on anything. Have we become so jaded that everything has to rise to a messiah-returning-level to even get 15 seconds of our attention? We’re like baseball fans suddenly wanting our team to hit every pitch out of the ballpark with the bases somehow magically loaded for each at bat.

And that brings me to poor President Obama. Much has been made of the shift of power on the Hill to the GOP and the rise of the Tea Party. Will Obama pull a Clinton and seek to “triangulate” his way forward? Or will he pull an FDR, who said the following in 1936 as he ran for re-election:

“We had to struggle with the old enemies of peace — business and financial monopoly, speculation, reckless banking, class antagonism, sectionalism, war profiteering. … Never before in all our history have these forces been so united against one candidate as they stand today. They are unanimous in their hate for me. And I welcome their hatred!”

Hmm, that certainly seems rather fitting, doesn’t it? As a Democrat, and as a progressive, I hope against hope that he chooses the path of FDR. But I also know that is not the type of man Obama is. And that’s okay. I supported him early on in the primaries against Hillary Clinton because I believed he was a liberal pragmatist, a.k.a. a progressive, and I voted for him in the general election not because I believed he was some sort of liberal messiah who would guide us to the socialist promised land, but because after 8 years of Dullard-in-Chief, I wanted a President who was thoughtful and considered in his responses. I wanted a President who would not only lead, but would lead by example.

Am I 100-percent happy with Obama’s term so far? No, I’m not. But do I think there is anyone out there who could have done better? No — and I still believe he’s the best person to have in the Oval Office right now and for the next six years. To all my fellow travelers within the liberal wing of the Democratic Party who have been bitching and moaning about whether Obama will or won’t cave on X or Y, I ask the following: Did you think that the work towards the country you want ended on Election Day 2008? Did we elect Obama to carry us forward or to give us the opportunity to move ourselves forward? Where were you when the Democrats on the Hill were adding flotsam and jetsam to health care reform? Where was your outrage when the Democratically-controlled Senate sat on its hands as the House passed bill after bill that would have created jobs, true financial reform, or a national green energy policy? Where were your howls of frustration that both the House and the Senate could not repeal DADT or that they refused to allow Obama to close Guantanamo Bay?

In short, while he is President, Obama is still just one man, and he leads just a third of the federal government. Will he AND the Democrats on the Hill need to compromise to move anything through Congress? Undoubtedly. But is that a sign of failure? Compromise has become a dirty word in Washington and indeed across the country, but the only way to move forward as a country is to compromise. No defeat or victory is ever the last in politics, no matter how much the 24/7 noise (I mean “news”) machine builds it up. To go back to my baseball analogy: be happy when we get on base and don’t boo whenever it isn’t a home run.

So that’s how I see Apple and Obama linked — by a shared perception that anything less than “insanely great” equals failure. It’s a fine narrative for the media to use to fill in the spaces between ads, but it’s not reality.

I mean, come on, it’s not like anyone is claiming to be bigger than Jesus, right? 😉

Hacking The Body Politic

The intersection (or lack thereof) between geeks and politics

I’m going to start off by apologizing. I’m about to do something I hate to do, because I find it to be a cliché – I’m starting off a piece of writing with a definition. I only do it because I will be addressing several meanings of the word, and rather than stating it all mixed in with the text, I figured I’d tee it up here right at the top.

politics: noun (Etymology: Greek politika, from neuter plural of politikos political; Date: circa 1529)

1 a : the art or science of government; b : the art or science concerned with guiding or influencing governmental policy; c : the art or science concerned with winning and holding control over a government
2 : political actions, practices, or policies
3 a : political affairs or business; especially : competition between competing interest groups or individuals for power and leadership (as in a government); b : political life especially as a principal activity or profession; c : political activities characterized by artful and often dishonest practices
4 : the political opinions or sympathies of a person
5 a : the total complex of relations between people living in society; b : relations or conduct in a particular area of experience especially as seen or dealt with from a political point of view <office politics> <ethnic politics>

Sorry! Okay, now on to defining “geek” . . . Just fooling! Really, that’s just a tar baby I’m not going to rassle with. Anyway, for a word that gets thrown around so much, those are a lot of definitions!

Of course, politics is a very topical word right now as we’re about to come up on another Election Day, and yet another occasion where I’ll probably look around afterward and say “Bu..but, we’re smarter than that, I swear!” (Note: I’m not saying all liberals are smart or that all smart people are liberal — far from it. More just that smart people on either side of the aisle seem to be a vanishing breed.)

Every Election Day highlights personally for me another aspect of my multi-faceted geekdom. In addition to the techie, movie, foodie, and other types of geek I seem to express, I’m also a huge political geek – both by having been involved on a more intimate-than-usual level and through my own personality and interests. What has always struck me as shocking is how few other types of geeks cross over into politics as well. I mean, I am used to a certain level of ambivalence to politics from the general population (amazing what randomly knocking on people’s doors and asking them to vote for somebody will reveal), but geeks of all stripes generally seem to eschew politics more strongly than is usual.

Why is that?

I’ve given this a lot of thought over the years — as much in an effort to reconcile different strains of my own personality as to explain any broader social trend. I don’t think I’ve got all the answers (*gasp!* Yes, you all now have that in writing), but I do think I have some outline of the issues. The following are some generalizations, and are to be taken as such. I’m describing what I’ve seen in my own experience, not the individual and unique snowflake that is you or anyone in particular. 🙂

  1. Geeks tend to be introverts
  2. Geeks tend to avoid conflict, except in certain controlled or vaguely ritualized circumstances
  3. Geeks tend to shy away from anything that is too fuzzy or ill-defined (entirely subjective of course)
  4. Geeks have few good “geek” role-models to look to in public office (More politicians are like the former actor Ronald Reagan than the former nuclear engineer Jimmy Carter).
  5. Geeks see politics as nothing but a beauty contest/popularity contest and we all know how geeks fare in those, right? 🙂
  6. Even if a geek is willing to be engaged and active in politics, the underlying political structures are messy, inefficient, and not prone to rigid analysis (we all love analysis in some form or another, right?)

Why is this a problem? Why do geeks need to be involved in politics? Because geeks have what the system needs to work! Think about it!

  1. Geeks are passionate (my pseudo-definition of “geek” = passionate about a topic beyond reason)
  2. Geeks tend to be smart  (not necessarily “educated” as I refuse to say the two equate) and interested in the world around them.
  3. Geeks are problem-solvers – we’re all about hacking something to get it to do what needs to be done (my pseudo-definition of “hacking” = an elegant solution to an inelegant problem)
  4. Geeks are creative. This can be debated, but I stand by it — hands down, all the geeks I know are far more creative (in a broad sense) than others
  5. Geeks are good at analysis. We like details. Hell, whether you’re talking about video games, movies, comic books, or food — every geek culture is built on the analysis and debate of minutiae that nobody else pays any attention to.

Geeks, by and large, have the tools to be a force in politics, just seemingly not the will. I’ve seen and heard it a hundred times, “I don’t talk politics,” or “Why bother? It’s just politics” or something similar. This from the same people who will have no problem debating Kirk vs. Picard for the 173rd time, or who will happily engage in holy war over how the latest film adaption got everything wrong from the comic book. It’s really one of the only times I feel disappointed in my fellow geeks.

The rules by which we govern ourselves as a society, and the process by which those rules are discussed and created, are arguably the most important subjects we can debate. No matter how big a fan we are, who is directing The Hobbit, where it’s being filmed, or who will be starring in it are not that important against that larger backdrop. More fun to discuss sometimes certainly, and by no means am I saying we should stop — I’m simply saying we need to carve a little time and mental cycles out of our day to focus on this other stuff too.

The media is currently at a fevered hum spinning out new political coverage, chewing it up and regurgitating it back out – then saying they don’t like the look of the dog’s breakfast that it is, and starting it all over again. We have the Tea Party, about which much has been said and written, but about which not much is known — mostly because it simply exists to fill the vacuum that was left after the Democrats’ wins in 2006 and 2008, and it is as much a media creation as an honest and reasoned ideological reaction to anything going on. To put it in the parlance of a particular geek subset – Obama winning was like Superman killing Luthor, Darkseid, and Brainiac all at once, and the writers needed to create a new villain to keep the narrative going.

You know what’s missing from what I described above? Reasoned debate. Honest exchange of differing opinions. Rational compromises that move us closer to shared goals. In short, all stuff that geeks are better suited than most to contribute to.

Yes, politics is messy. Yes, it can be filled with the kind of internecine exchanges that demonstrate the worst of ourselves. And yes, there are 1,001 reasons not to become involved.

But as geeks, I ask you this – when presented with an opportunity to make the world what it should be (United Federation of Planets, flying cars, cures for cancer, jet packs, lightsabers, the whole shebang), how can we refuse? Do you want to continue to live in a world where celebrity is a more valuable asset than knowledge? Do you want to live in a world where glib, easy answers are accepted because no one stood up and said “That’s not right!”?

Or do you want to live in a world where merit truly is rewarded, where the opportunity exists to invent our future, not reinvent our past, and where those rules we govern ourselves by are arrived at through intelligent discourse and debate, and yes, sometimes compromise? How you think we should arrive at such a world will largely shape where you fall on the political spectrum, and honestly I don’t care if you land in the same place I do. I do care about whether you’re on that spectrum to begin with – because otherwise you’re letting other people decide your fate. All our geek heroes, in fine Joseph Campbell tradition, helped to shape their own fates – that’s why their stories captured our attention and interest. Sure, there was always stuff outside their control, but that’s what makes the story interesting.

So, what can you do? Well, I’ll make it easy for now — Go vote tomorrow. If you haven’t followed the candidates in your area up until now, spend some time researching on their websites and Google tonight and then go vote. Even if your candidate isn’t forecast to win. Just go vote. Really.

After tomorrow? Well, I will be writing more about this in upcoming articles. Essentially (and hopefully!) providing a series of geek-centric tutorials on how to be politically informed and involved. Think of it as cutting through the crappy GUI that gets put on politics, learning to get to the command line, and being given root on the political process.

Again, whether you’re liberal or conservative is immaterial – it’s about being informed and involved. And as I’ve discovered, geeks have a wonderful ability to be a transformative force on so many things, I’m hoping the same holds true here.

“Be the change you want to see in the world” – Gandhi  (Philosophizer, activist, and honestly, a bit of a geek himself)

Oh, and for those looking for a few chuckles, read my Election Day post from 2008:  A Portrait of a Poll Worker

Where has all the news gone? (a.k.a WaPo is now CraPo), Part 1

Part 1 – Introduction and “News (sources) You Can Use”

I grew up in a very political, very news-oriented household. Unusually so, even for the well-educated and urbane suburbs of Washington, D.C., and probably for a lot of other places as well. My father would read at least a couple of papers a day, starting off with the Washington Post each morning. My mother read the paper and magazines, including Newsweek, and pretty much whatever else she could find. Myself, I started out reading the comics (one of my earliest memories), and even before my teenage years, I’d branched out to other sections of the paper. For a year or two, starting at about the age of 13, I even became obsessed with the Business section, tracking commodity and stock prices and how different stories would impact those values. We also watched the early local evening news and the national news (CBS of course), and my parents would stay up and watch the late local evening news.

It wasn’t until quite a bit later in life that I noticed that this was indeed unusual. However, I continued on thinking of TV and newspapers (and the media in general) as having some redeeming value – they served a valuable public good. If you were interested in a topic or issue, some reporter somewhere was covering it and if it was a big story, other reporters and papers would follow. And for most of my life, I would swear that’s how it actually worked during my formative years (’70s and ’80s). Yes, I know journalism wasn’t perfect, but I grew up in a town with a paper that had a national reputation; a paper that not only broke the story on Watergate, but broke national and local news on a seemingly regular basis. It was the paper of record when it came to Washington and the Federal government.

So why, in the last six months, have I not only given up reading the newspaper (WaPo subscription canceled and barely even noticed, other than the recycling bin being a helluva lot lighter) but, for the most part, stopped even reading the Washington Post website? Why does leafing through Newsweek, which my parents still dutifully get, actually seem to make me feel ill? And why, other than for primary or election day coverage and other major events, do I rarely watch a single news program or channel anymore?

Well, because it stopped being news. I stopped my “habit” and yet I don’t consider myself any less informed about the facts related to current events. Before I go into any more detail on why I find most news sources to be complete crap, I’ll describe how I do get my news now. The first thing to know is that it takes some hunting around, and I think the “perfect” solution is still in flux, but this is what works so far for current events:

  1. NPR – both my local radio station (88.5 WAMU) and the NPR website, which has some pretty well-written stories published on the site (not just copies of audio!)
  2. My personalized iGoogle page, where I have the top headlines from the following RSS feeds: Google News “Top Stories,” Ars Technica, the NY Times home page, Techdirt, and Wired Top Stories; and yes, I still pull in the headlines from the Washington Post, but I haven’t clicked on one of those links in months. (For the curious, there’s a small sketch of do-it-yourself feed aggregation after this list.)
  3. Twitter – this one was a bit of a surprise to me, but thanks to a global group of folks that I follow, I’ve heard about earthquakes thousands and thousands of miles away and received countless tidbits of analysis and news items that I literally would not have seen otherwise. This works especially well if you follow folks you don’t necessarily agree with; otherwise it’s just a circle jerk of the same things you already think or know . . . but that’s a subject for another post. (As I was writing this, a 7.4 quake struck Indonesia – I found out about it on Twitter about ten minutes after it struck. The first hit on a Google News search didn’t show up until 24 minutes after.) For sites like the Huffington Post, I don’t even interact with their websites – I just wait to see what’s making its way around the Twittersphere.
  4. Google News/Newsmap – Most folks reading this post are probably familiar with the Google News aggregator, and it is indeed a great service, but the interface is poor: What stories are important? What stories do other people think are important? What’s new? What’s old? It’s all a bit of a hash. Which is why, after running across it more than five years ago, I use a cool little app called newsmap, which I think is one of the most underrated items on the Internet. Why? Because you can see the news as you never have before. Here’s the page that explains what it’s all about. Try it out for yourself and you’ll see.
  5. Various other social networking/social bookmarking sites: Slashdot, Digg, delicious, Reddit, etc. (Once in a blue moon, I’ll actually find a useful news-related link on Facebook. Guess they somehow let that get by their “relevance filters.”) I used to use these sites a lot more, but they all seem a little clunky now. [TANGENTIAL RANT: Hell, Twitter wins over them all if for no other reason than there isn’t always some asshat posting “First!” and then the subsequent argument/discussion about posting “First” and the type of people who feel the need to post that, and then the subsequent discussion/argument about people who feel the need to comment on people posting “First.” Usually it’s only ended by a couple dozen postings on the grammar/spelling failings of various posters, or by the invoking of Godwin’s Law when someone compares someone else to Hitler. So glad to see that Web 2.0/3.0/*insert irrational number here* has risen so far above the behavior seen on every BBS, USENET group, and mailing list ever. END OF RANT]
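
Since I mentioned my iGoogle setup in item 2, here’s roughly what that kind of do-it-yourself headline aggregation looks like under the hood. This is only a minimal sketch using the third-party Python library feedparser – the feed URLs below are just examples, so substitute whatever sources you actually read.

```python
# A minimal sketch of DIY headline aggregation (roughly what iGoogle does for me).
# Requires the third-party feedparser library: pip install feedparser
import feedparser

# Example feeds only -- swap in the sources you actually follow.
FEEDS = {
    "Ars Technica": "http://feeds.arstechnica.com/arstechnica/index",
    "Wired Top Stories": "http://feeds.wired.com/wired/index",
    # ...add the NY Times, Techdirt, the Washington Post, etc.
}

def top_headlines(feeds, per_feed=5):
    """Return the newest few entries from each feed as (source, title, link) tuples."""
    headlines = []
    for source, url in feeds.items():
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:per_feed]:
            headlines.append((source, entry.get("title", "(no title)"), entry.get("link", "")))
    return headlines

if __name__ == "__main__":
    for source, title, link in top_headlines(FEEDS):
        print(f"[{source}] {title}\n    {link}")
```

The appeal is the same as with iGoogle: you pick the sources and how much of each you see, rather than having an editor pick for you.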

What do all these sources have that the Washington Post, Newsweek, and the news on television don’t? Well, in the case of NPR it’s the straightforward nature of the reporting. Now, those on the right will always claim there is a liberal bias on public radio, because, you know, they’re *whisper* publicly funded. However, in my experience, they evidence no more liberal bias than reality does (as Mr. Colbert noted). From All Things Considered to Marketplace, I get more information and quality analysis from public radio than from almost any other single source. Also, when opinion is offered, it is clearly labeled as such.

In the case of my personal news aggregation in iGoogle, it’s the flexibility and diversity I enjoy. With so many feeds, I get the information I’m interested in, as well as what I should be interested in.

With Twitter, as I mentioned, the diversity of views is wonderful, and as it turns out, interesting people read interesting things and write interesting things about them. So if you follow interesting people (which I do!), then the rest takes care of itself.

I like newsmap because it puts the news, and what people think is important, into perspective. On a big news day, whatever the story is will dominate the screen in the brightest colors. On a slow news day, it’s fascinating to see what pops up out of the primordial news ooze, seeking to gain our attention. It usually seems to have something to do with sex, money, or death. Whenever we’re not given something specific to worry about, we seem to enjoy finding new and uninteresting ways to focus on the first two while pretending we’re not afraid of the third and spending all of our money to avoid it.
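
If you’re curious, the treemap idea behind newsmap is simple enough to sketch out. The following is a toy illustration, not newsmap’s actual code – the stories and article counts are invented, and it assumes the third-party Python libraries matplotlib and squarify.

```python
# A toy sketch of the newsmap idea: a treemap where each story's rectangle is
# sized by how much coverage it's getting. The counts are invented for illustration.
# Requires third-party libraries: pip install matplotlib squarify
import matplotlib.pyplot as plt
import squarify

# Hypothetical "number of articles" per story on a given day
stories = {
    "Big election story": 120,
    "Earthquake overseas": 80,
    "Tech company lawsuit": 45,
    "Celebrity scandal": 30,
    "Local interest piece": 10,
}

sizes = list(stories.values())
labels = [f"{title}\n({count} articles)" for title, count in stories.items()]

# Bigger coverage = bigger rectangle; the eye goes straight to the dominant story.
squarify.plot(sizes=sizes, label=labels, alpha=0.8)
plt.axis("off")
plt.title("What a news day looks like, newsmap-style")
plt.show()
```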

As for the other sites listed, they used to be useful, and I’ll still occasionally find something worthwhile to read, but generally it’s the most slanted, biased pieces that get the most attention on those sites. (I’ll leave it to someone else to write the definitive piece on how websites with “voting” mirror the ideological extremism you see on the increase in elections – I’ll just offer the observation that everyone who does well on those sites generally plays to an ideological base, and those wishing to get votes consciously or subconsciously play to that awareness.)

Now that I’ve explained what I specifically like in my news sources, in Part 2, coming next week, I’ll explain what I find so horrible and soul-crushing about mainstream news sources. Here’s a hint: modern journalists (by and large) suck at their jobs, or work for organizations that no longer allow them to be good at them.

Do you have a favorite news source (mainstream or otherwise)? I’d love to hear about it, and why you like it, in the comments!

Let’s Not Forget What the Enemy Is

. . . bad code! Some random thoughts on a technological love triangle, why I hate Microsoft, and how the world inside our computers should be more like the world outside of them

So, Apple and Adobe aren’t getting along. And Google and Apple are on the outs. There are cries that Apple/Google/Adobe need to be more open and transparent. Google needs this. Adobe needs that.

What fools these mortals be! Let’s all stop for a second and remember where we’re coming from. Picture, if you will, a timber-industry forest . . . row after row of trees, and not just trees, but all the same species, and oftentimes trees that are essentially identical. That’s the landscape that existed 10 years ago, brought to you by Microsoft. Remember them? Small little company up someplace in the Pacific Northwest, called Redmond I think. They’ve accounted for 85%-90% of the client-side OS market share for seemingly forever. Oh, and in that article link (published in 2000), an industry analyst said “the Mac OS continues to be a non-threatening element in the market.”

Forest picture taken someplace close to Redmond, I’m sure

So why do I equate Microsoft with a monotonous, undifferentiated forest? Well, there’s the obvious dominance of the market, but also because, despite what they claim in every press release, they are the antithesis of innovation. Oh, they may purchase a company and/or technology once in a while and bring it out to the masses, but once that happens, it’s dullsville. The technology languishes. This isn’t because of some malevolent effort by the Microsoft suits – no, it’s just a function of the fact that Microsoft is essentially as old-school as IBM at this point. They’d love to innovate, but that market share of installed tech is an albatross around their necks. By the time they’re able to integrate a new technology, they’ve reached a decision-tree fork: option a) continue to expand the use and integration of that technology, or option b) shoehorn in whatever newer technology is out there that people are clamoring for. As far as I can tell, they always choose option b.

Why does all that matter when discussing Adobe/Apple/Google? Since the popularization of personal computing technology, there have always been only a handful of dominant players in terms of hardware and software, and very little balanced competition – usually you’ve got a dominant player, and one or two also-rans that help keep the antitrust folks from breaking someone up. You’ve got close to a 90-percent chance that, if you’re reading this on a computer, it’s got an Intel processor and is running some flavor of Windows. In ecological circles, you know what they call that? Monoculture. And it is invariably a sign of a weak, sick, or damaged ecosystem. There’s never really been any evenly matched competition on PCs in terms of hardware or software. [Note: just to be clear, I’m not talking about sellers of PCs, e.g. Dell, HP, etc. – they don’t count, as they’re all selling the same damn thing.]

The lack of diversity and healthy competition shows – Windows, for all of its iterations, is a dinosaur. To mix my metaphors, every new version of Windows strikes me as just another layer of lipstick on the dinosaur. Great – they added the “Aero” interface to Vista and Windows 7, but can they bother to include something better than Notepad to edit text files? Of course not. They now support 64-bit, but can’t come up with a better file system than NTFS? You know how old NTFS is? 17 years! And the most recent version (used in XP, Vista, and Windows 7!) is 9 years old!

Compare that to Mac OS X or Ubuntu – not only do those OSes manage to introduce major new features fairly regularly, but along the way, basic tools and services get improved as well.

So is Microsoft the enemy? No, and I ascribe no malice to them either. The enemy is, as Steve Jobs might say, software that is crap. And at the moment, Microsoft’s OS, web browser, and office suite are all crap. From antiquated technology to awkward interfaces to a lack of support for standards, Microsoft serves as the perfect negative example of how to code software.

Which brings us back to our love triangle of Adobe/Apple/Google. I believe the paradigm we’ve been operating under for decades – the 800-lb gorilla and everyone else – is about to change. And it shouldn’t be a question of who is more open and transparent, or who is more or less “evil” – the only things that should matter are: Does the code work? Is it elegant in design and function? Does it fulfill a need? Other than that, there is no right or wrong. Now, that’s not to say that other personal preferences can’t come into play. You may only want to use open-source software, which is perfectly fine, but realize that what works best for you may not be the “superior” choice for everyone else. You may only want to use technology X and never technology Y, but again, while that may work for you, it almost certainly doesn’t matter to most other people. What we often forget is that most users don’t care how something works, just as long as it does.

The lesson Microsoft should provide all of us is that a monoculture is bad. We need diversity not only in hardware and software, but in the approaches to coding software, to sharing or selling it, and to how it is used. That is the only way sustainable progress happens. Every time some blowhard pontificates on why some company’s product or service will bring about the end of the world, all they’re doing is displaying their backwards view. No company has that much control anymore – not even Microsoft. The computing world has gotten too big – too many devices, running too many different OSes, used by too many different types of users – and that’s a good thing.

So let Apple keep being Apple – Steve Jobs turned around a company headed for the trash heap of history and made it a dominant player in industries that didn’t even exist 10 years ago. They will never be as open as some want, but that’s what works for them and their users, and you can’t argue with that. They control the horizontal and the vertical, and that’s part of why the user experience on any Apple device is superior to almost every competitor’s. No one is forcing anyone to use their stuff, so what’s with all the complaints?

And Adobe – don’t whine about Apple not accepting Flash for the iPhone. Flash is a crappy, kludgy platform that I hope to see disappear within 5 years. It’s a clumsy tool that 95% of the time is used to create clumsy applications, it’s an accessibility nightmare, and your own tools for developing in it reflect all of that perfectly. Oh, and considering your own history of working with others who develop software for your products, I’d not be complaining about Apple too loudly.

As for Google – are they a perfect company? No, and I don’t believe such a thing exists. But at the end of the day, they continue to push the borders of what we can do on the Internet, and they genuinely seem interested in doing things in the most responsible manner possible. Do they need to focus on actually finishing and/or fixing things occasionally? Hell yeah. And for all those worried that they’re going to be the next Microsoft – relax, there will never be another Microsoft; the technological and economic environments will never again sync up in such a way as to create something on par with Microsoft.

In my ideal world, this would be the progression for any new service or technology: Google should create or develop it, Apple should then take it and make the interfaces functional and intuitive, and then Adobe can take on the maintenance, expansion, and support of it. Actually, in my ideal world, these three companies and all the others out there would do all of these steps well, but that has about as much chance of happening as Microsoft delivering software that doesn’t make me start swearing.

Closing thought: Technodiversity? Should that be a term? If it isn’t, I think it should be. I believe it will become as important a topic in the future as biodiversity. Just as in natural spaces, where biodiversity yields a healthier environment (more biomass, resistance to disease and pests, evolutionary progress), our cyberspace depends on technodiversity for the dissemination of information, for security, and for further advances in technology.

A Look Back @ Twitter

Okay, so I joined Twitter on July 4, 2007, only a year after the full-scale version of the service was launched. This was my first tweet:

01:52:40 Creating my twitter account – woohoo!

And there that tweet sat, all by its lonesome, for nearly two years, until I tweeted the following:

11:34:17 Adding my second tweet after two years so I won’t be listed at http://www.slate.com/id/2219995

The headline of the piece, by John Swansburg and Jeremy Singer-Vine, was “Orphaned Tweets: When people sign up for Twitter, post once, then never return.” While I wasn’t specifically mentioned, the article did sting a little – after all, for most of Web 2.0 I’ve been an early adopter, as a Web 2.0 world fits in with my preconceptions of how the Internet and technology in general should work: nonlinear, connected, and enriching.

I was still interested in Twitter, I just really had no Twitter-shaped hole in my life to fill, and so once again my account languished – though this time for only about two months. And then a strange thing happened: on a music site I frequently browsed for new, independent acts, I noticed that a number of artists I really liked listed Twitter accounts – namely, the ever-enchanting @Meiko and the ever-entertaining @jonathancoulton. I started thinking, “Hmm, I wonder if some of my favorite authors are on Twitter?” And so I started following two personal writing heroes of mine, William Gibson (@GreatDismal) and Bruce Sterling (@bruces). From there, it snowballed – I found TV show hosts, comedians, political activists, more musicians, more authors, etc. I followed everyone I could find who seemed to tweet often enough to be interesting.

Two things happened next that completely changed how I used Twitter. The first was that I found out that a number of the artists and creative folks that I liked knew each other. This led me to old favorites, as well as some new discoveries. The second was that I started discovering other folks out there who liked many of the same people/things/ideas that I did and I started following them.

The “ripple in the pond” metaphor is a hack cliché, but you know what . . . that’s because it works, damn it! – and in this case, it’s very appropriate. Because not only was I out there throwing stones in the water, but evidently so was everyone else, and interesting patterns develop when those ripples interact. Sometimes they magnify, and sometimes they cancel each other out, but in the end it’s all about interaction. That realization completely changed how I thought of Twitter.

And because of those interactions, I made the online acquaintance of some fascinating people. People I likely wouldn’t have found in “real” life, on Facebook, or in other ways. There’s @shamrockjulie, who reintroduced me to Lost, and who has an abiding love of midgets, monkeys, unicorns and drag queens. Oh, and she regularly beats me at Scrabble, which is, let’s face it, pretty damn impressive. And there’s @KyleeLane, who makes soap — but not just any soap, she makes Han Solo in Carbonite soap and “Abby Normal” Brain soap and Fight Club soap (Hint: buy her soap – how often can you say you got clean with art?). Oh, and she has a sweet tooth to rival even my own and is one of the most creative, down-to-earth people you could ever imagine.

There’s also @tomupton33, who finds the best quotes; @iA, who I think knows more about interface design than any other single person on the planet; @barryintokyo, a book editor and writer in Tokyo who has given me book recommendations on all things Japanese; and @anjkan, who gets regularly retweeted by William Gibson (which gives you some idea how fricking far in the cool future she lives) and is also just about one of the nicest people in the world. There are literally dozens of others now that I interact with regularly to find out the latest in web design, politics, and all things geeky.

Following creative people on Twitter (I refuse to call the folks I follow “celebrities” because of the lack of substance that word implies) is a lot of fun, and stuff I’ve read there has increased my enjoyment of TV shows, movies, books, and music. And I’ve had great and surprising Twitter conversations or retweets with the likes of Kevin Smith, Rosanne Cash (very cool!), and the aforementioned William Gibson, but it’s the organic circle of like-minded Tweeters that is why I stay on Twitter.

This came especially to mind during the Vancouver Olympics Opening Ceremony some months ago, as I was following along on Twitter while watching the coverage. During that, I realized I was a member of a fluid, dynamic tribe all doing the same thing at the same time (and having very similar impressions), and I was okay with that. I mean, I’m the least likely person to “join” anything – it’s just not my thing – but what Twitter allows is for you to identify yourself by what you tweet and who you follow, and a “tribe” of like-minded people organically grows out of that, and it’s not at all static or limiting. There is no joining, there is just you being you, and then you start running into people with similar interests. It is, in essence, nonlinear, connected, and enriching.

Now, I was also an early adopter of Facebook, and it has its uses, but to my mind what it’s missing is that nonlinear experience that Twitter encourages. I have 220 “Friends” on Facebook, and 99% of them are people I’ve actually met in real life: family, classmates, former and current colleagues, etc. I can share pictures of my kids, and I already know them and they already know me – but where do you go with that? Even outside of family, I’ve known some of them for 35 years, and – no offense – as nice as they are and as much as they may have in common with me, they’re not that likely to surprise me or introduce me to new things or ideas. Oftentimes, all I may share with them is an experience – same school, same job, etc.

Whereas on Twitter, I currently have almost 200 followers and am following around 300 people, only about 5% of whom are people I’ve actually met, and generally we have no shared experiences at all. However, the people I interact with there have introduced me to new things, new viewpoints, and other people I have things in common with. Part of that is that while we may not have shared experiences, I’ve found there are experiences we have in common – e.g., Star Wars being a pivotal film, eating bacon being a transcendental experience, being confused by the latest episode of Lost – and when we talk about them, it becomes a shared experience in a way.

I’m not saying one approach is better than another, but what I can say is that as my tweeting has increased, my time on Facebook has greatly diminished. When I go back to Facebook now, it seems somehow dated and AOL-ish (not exactly surprising considering my mother is on it *waves* Hi Mom!). We used to call AOL “the Internet with training wheels,” and to me at least, Facebook is “social networking with training wheels.” Even after hiding Farmville and Mafia Wars and whatever other app is clogging my feed at the moment, it doesn’t feel quite real.

Twitter, in comparison, feels more sincere in some way. Perhaps it’s the distillation of people’s thoughts into 140 characters, perhaps it’s a different style of user – whatever the case, there’s an immediacy and authenticity to what I read on Twitter that seems to be lacking from many other online interactions.

Is Twitter perfect? Hardly. It’s got spam. It’s got wacky nutjobs. It’s got everything that makes the Internet annoying on occasion, but what it also has is a reason to keep coming back. I don’t know how long it will last (and let’s face it, it won’t last forever – your mom and my mom will be out there tweeting soon enough), but if the best of the web and technology is at heart a sort of iterative design, I do think Twitter is a step in the right direction.

I’d be interested in hearing what everyone else’s thoughts are, so please feel free to add your comments!