Your Unique Amalgam: On the Fluidity of Geekhood

Geek in training.

I had assigned myself* the task of writing a post about what it is to be a “geek.” It’s obviously not the same as it was when I was in school, as geekhood no longer implies utter alienation from the mainstream, but rather membership in a kind of cultural elite, a priesthood that knows the Most Holy Secrets of computers (once nerdy, now cool), comic books (same), and science fiction. So am I a geek?

The easy answer is yes. I’m into a whole slew of traditionally geeky things like Star Trek and Macs and Monty Python and The Hitchhiker’s Guide to the Galaxy. But while I check many of the geek boxes, I can’t help but feel like the label still doesn’t suit me. And not because I’m too “preppy” or conformist, but because I don’t feel like I conform sufficiently to the geek clique.

I like Star Trek and some superhero movies, but I never got into comic books. I like science fiction generally, but I’ve never gotten deeply into that genre of books. I love my Mac and my iThings almost more than my children, but I don’t know anything about coding or software development, or even really how the damn things work.

And beyond some of the cultural (or pop-cultural) differences, there’s a class barrier as well, at least that I perceive. Being a geek, or so it seems from the tech blogosphere, is getting expensive. You not only need to have expensive devices (which I barely manage), but clothes and glasses and bags and notebooks and pens and coffee-makers and spirits and cameras and sometimes even cars that meet a certain aesthetic, and cost more than they probably ought to. Do I not qualify as a geek if I can’t see my way to purchasing $300 headphones or a $500 computer bag?

No one’s told me I can’t call myself a geek if I don’t meet all these criteria, of course. But I do feel outside the circle when I can’t match all these references, when I can’t afford the right paraphernalia, when I can’t speak the whole language but just get by with a phrasebook (a Moleskine phrasebook, of course). It all makes me recall the old King Missile song that stirred me as a sophomore in high school, “It’s Saturday,” in which John S. Hall says, with wide-eyed eagerness:

I want to be different, like everybody else I want to be like
I want to be just like all the different people
I have no further interest in being the same
Because I have seen difference all around
And now I know that that's what I want
I don't want to blend in and be indistinguishable
I want to be a part of the different crowd
And assert my individuality along with the others
Who are different like me

What I’m perceiving, I suppose, is really just a popular modern conception and generalization of geekhood. But if you broaden the definition to mean “someone who is passionate about niche subjects,” then I still qualify. Not just on Star Trek and the more consumer-centric aspects of technology, but on things like media criticism, secularism, politics, acting, songwriting, and prose writing. I even retain some geekiness about certain niche policies like electoral reform! Allow me to go into detail about why you should be into instant runoff voting.

And I reserve the right to get geeky about things down the road. I’m rediscovering a love of drawing (dumb cartoony things), and I may yet delve into Doctor Who one day, which would make my dad happy. Other things, like guitar playing and theatre, I’ve gotten less geeky about as other parts of my life have left little room for them.

But geekhood needs to be fluid, I’d say. Especially if we want it to refer to something other than the relatively small circle of technological elites on the west coast. I’d like to think of geekhood, then, not as a description of a group that’s into a certain, prescribed set of things, but as the state of being deeply into one’s own unique amalgam of interests. That hodgepodge of perhaps-unrelated passions is your geekiness. That’s different.

_ _ _

* It's not so much that I assigned myself, but that I was asked to write about geekdom by the folks at SingleHop, who do private cloud hosting and are doing a whole thing around what it is to be a geek. If you'd like more info about SingleHop, check out their new private cloud hosting page. Thanks, guys!

The Old School Transformers Movie You've Been Wishing For

There hasn't been a Generation-1 Transformers animated movie since Transformers: The Movie (discussed in depth on my podcast) in 1986. And as excited as many folks my age were when the Transformers came to live-action film in 2007, even with the return of Peter Cullen as the voice of Optimus Prime, the Michael Bay versions clearly aren't what a true fan was hoping for.

But why? Aside from terrible writing, which even the best iterations of Transformers were always plagued by, what about the Bay Transformers movies doesn't work? Too many humans, for one. Sorry, but Spike never will be interesting enough to carry a movie.

But I think the biggest problem is that in the live action movies, the Transformers don't look like Transformers. Now, the bots never sported svelte, Jony Ive-approved designs, and much to my disappointment, as the years passed, newer versions of Prime and other characters had so many guns, blades, spikes, and other protrusions glommed onto them, that they looked like big mechanical jumbles.

The Michael Bay movies take that into the stratosphere, making the Transformers all so busy-looking, so, well, messy, that they no longer resemble the robots we used to know. They're robots in disguise in disguise as, well, junkyards? If shape-shifting robots were to emerge from Saruman's forges, I think they'd look a lot like Michael Bay's Transformers.

This is part of what makes this video so great. Harris Loureiro of Malaysia has taken what look to be the “masterpiece” versions of Generation-1 Transformers toys (so-called because they are built to mimic their platonic ideals from the cartoons and comics, articulating and transforming just as they would) and used stop-motion animation to create his own short Transformers films.

He's using sound and music from the 1986 animated film for this, a battle between some latter-day version of Optimus Prime and the Constructicons, which of course form Devastator. (They apparently have different names in different parts of the world, but it's them.)

But it's tremendous. There's nuance, there are graceful moves, suspense and surprises, and yes, they look like Transformers. Loureiro's done an excellent job with this. He has some others on his YouTube channel, which seem more like experiments and expositions of what he's able to do, but they're worth checking out too. But this is the one with an actual conflict, and it's rather bad-ass. Hollywood may need to hire this man.

Many thanks to Len Sanook for pointing me to this.

Stuck Outside of the Comic Books Multiverse

As a nerd, it really does seem that I ought to be into comic books, but it just never happened. As a pre-teen, I eventually became devoted to The Transformers comics, but that had more to do with my love of the bots than any inclination toward comic books. I also dabbled around that age with Teenage Mutant Ninja Turtles (pre-Archie) and Usagi Yojimbo, because they seemed safe, graspable, and, I think, not overtly hyper-masculine. I briefly had a subscription to Web of Spider-Man, but even then I felt alienated by the idea that this was one version of a franchise with multiple incarnations, with no sense of where I was in the story’s arc, or whether this arc even “counted.”

Now, all grown up and no longer in need of a ride from my parents to a store that actually sold comic books, and with the vast trove of comic knowledge now available on the Internet, it seems a ripe time for me to try again. And yet I pause.

(Let’s presume for the sake of this post that we’re just talking about big, established franchises, your Batmans, your Supermans, your Avengers, etc. I know it’d be easy for me to pick a more recent and independent universe, but goddamn it I want Batman.)

First, there’s the whole fact that money doesn’t grow in my carpet, and these things ain’t free. But mainly, I know that I’d have to choose a franchise and then pick a spot, more or less arbitrarily, to start from. What will I be expected to know in advance? What will it be assumed I have read already? If I haven’t been sufficiently saturated in the mythos of the given superhero, will I reap the full experience?

It’s all too much stress for what’s supposed to be fun.

Andy Ihnatko just wrote about this very thing, and it carries weight for me because a) Andy is awesome and b) he’s a comics-nerd’s comics-nerd, a true blue fan of the art form, and even he is now ready to throw up his hands:

My obstacle with Marvel is that I have no idea how to get a single unit of story from them. Stories start in the middle and they’re resolved later (sometimes after months) in another book entirely. Marvel’s “Avengers” books are such a mess that they often include a little chart of what books you need to buy and what order you need to read them in. Good lord! [ … ]

I rarely get to the end of a Marvel comic and feel like the curtain has closed and the lights in the theater have come up. It’s frustrating and unsatisfying. And Marvel isn’t entirely immune to DC’s troubles, either. Marvel’s story continuity is deeply contaminated with characters who are someone’s son in an alternate-reality, but a future alternate reality, from an Earth that’s a parallel-Earth to the Earth of that alternate reality, who traveled back in time to reach this character who turns out to be a clone of a robot of…

I know, right?!? It’s my understanding that what Andy is describing here is a somewhat recent phenomenon, such that the situation’s complexity dwarfs the one I faced in the late 80s and early 90s. So if it’s crazier now than it was then, I really have no hope, and should settle for enjoying the movies (when possible – Transformers is even now split into a thousand sub-incarnations, and the movies are terrible).

Maybe there’s some self-contained storyline I can get into and be content with. But even then, you know, them comics get expensive. So I dunno.

What Old Dad Can Offer

Lee Siegel on being the father of young kids while in his 50s:

[I]t isn’t too difficult to squelch the regret that I didn’t have children at a younger age. If I had, I wouldn’t be experiencing the joy of these two particular precious darlings. I wouldn’t have known a little more about life, as I do now, or had the same ironic distance from myself that the years have brought me. Blissfully, I experience no yuppie torments about the duties and sacrifices of parenthood. On the contrary. I’m grateful to my children for helping me grow out of my own childish narcissism.

This mostly rings true as the dad of two wee ones at the age of 36. I was far too stupid in my early 20s to have been responsible for children, and while I'm no dad of the century now, I'm far more able to keep things in relative perspective, and to see things from a useful distance at this age. I'm a little wiser than I might have been, though the bar is low.

But with both my wife and me in our 30s, we're still subject to the "yuppie torments," the endless comparison of our own parenting to others' -- especially since so many within our cohort have reproduced at roughly the same time. It's pointless baggage; it never helps us parent any better, but we're still vulnerable to it. Thanks, Facebook!

More Siegel:

The plan is to make myself so present in their thoughts and feelings that my immortality will be guaranteed—life cycles be damned.

I have no illusions of immortality of any kind through my kids, but this is my goal nonetheless; to be their absolute safe place, their lives' olly olly oxen free, the person from whom they will always draw strength and love. If I do it right, it will still be so when I'm gone.

(Here is a drawing I just did about how I feel about my amazing boy as he outgrows me and everything else. The girl amazes me much the same, but she's a baby to me, and therefore not yet outgrowing me for now...for now!)

A Ravenous Insistence on Having an Opinion

Andy Greenwald, in a post that’s really about the show Louie, diagnoses the tweetosphere:

“We live in an era of opinions. In the Internet economy — in which I am a loyal and grateful participant! — loud voices are more than just currency, they’re coal. The Outrage Industrial Complex burns all day and all night with Twitter as its blistering engine room. A constant stream of fuel is necessary to keep the entire enterprise afloat, and so any event, be it the collapse of a government or the cancellation of a sitcom, is greeted with a near instantaneous torrent of reaction. Though the appeal of the virtual yawp can be undeniably intoxicating, I’m gradually finding it less and less tolerable. It’s no secret that nuance and doubt are rarely retweeted, but as Twitter has metastasized, its vaunted panoply of voices has grown more strident and, oddly, more unified — not in their positions but in their ravenous insistence on having one. It’s become less a conversation and more a crusade. Being silent is far worse than being wrong.”

This is my experience as well, though I’m not so sure about that last bit. I stay silent on a lot of things, partly because of my job, and partly because I don’t necessarily want to burn along with the rest of the coals, or whatever Strong Feeling I have about something has already been expressed by someone else, and better than I would have. And I don’t feel judged for this.

The one kernel of truth I find is that, if anything, having been silent, it makes it all the more apparent when I’m not. And then it’s not so much that I’d be coal, but something far worse: fresh meat. To some group of folks or other, I’d have the Wrong Opinion which would instantly render me a Bad Person, and they would let me know. A lot.

Freddie deBoer, in a post that’s really about reading, seems to get at this:

“You’re doing it wrong” is the internet’s truest, most genuine expression of itself.  For whatever reason, the endless exposure to other people’s minds has made the vague feeling that someone, somewhere, is judging you into the most powerful force in the world.

Silly, isn’t it? I know it’s real people behind those tiny avatars, but they’re so removed, there’s no real idea of who they are. And even if you also know them personally, the Twitter experience is so ephemeral, yet perceived sins seem to last forever.

There Will Be No Hybrid Device (And That's Good)

Microsoft insists that the Surface is "the tablet that can replace your laptop." But even judging from the press event's enthusiastic demonstrations, and never having held the thing, it's clear that this simply isn't true. Yes, it seems passable as a laptop, I suppose, but to qualify as something that "replaces" a laptop, it has to be as good as or better than a standard one (and really, better than a MacBook Air, a much higher bar). It's clearly not. Its laptop functionality, for one, is a separate add-on, its keyboard not included. Its trackpad is reported to be inferior. Even with the keyboard attached, and with the improvements Microsoft has made, it's still unstable and awkward on a lap, according to reviews. So it fails here.

And as for the tablet part? No one is going to want to use the Surface as a tablet. It's enormous, it's heavy, and it has a poor tablet-app ecosystem. Is it passable as a tablet? Maybe? But again, passable isn't good enough. 

The other selling point left is that it reduces the load of gadgets you carry. But for that to be the kicker, the benefit of having fewer things to lug around has to be great enough to overshadow the device's other drawbacks. The Surface plus a keyboard weighs 2.42 pounds. A MacBook Air and an iPad Air together weigh 3.96 pounds. We're not talking about back-breaking differences here. And the tradeoff is that by having one device instead of two, you have one heavily compromised and inferior device instead of two excellently refined devices (and that presumes you even want to carry both around). I'll take the extra pound and a half, please.

So while the Surface may be a very well made and interesting device, it's not the Grand Unified Device it's being trumpeted as. And it goes a step further in proving that such a device may not exist, or at least oughtn't.

Bringing this back to Apple: when the top brass at the company decried the idea of a unified tablet-Mac hybrid, I presumed their denials could be standard Apple evasion. Remember, no one wanted to watch videos on an iPod, until Apple made a video iPod. No one wanted to read books anymore, until Apple made an entire bookstore platform. They often look askance at features or ideas, only to adopt them later. I don't fault them for this. They want the attention on the products they have now, not what they might make someday.

But after Monday's WWDC keynote, it's clear to me that they weren't bluffing about not melding OS X and iOS. Nor were they just being obstinate. For I certainly thought I wanted this unicorn device, the One Thing I'd Ever Use for both relaxing on the couch and for serious work. And when Apple said they'd never make that device, I also thought that perhaps they were being stubborn, the we-know-better company that they get panned for being so often, by simply folding their arms, sticking their noses up and saying "no!"

I revisit this interview that Craig Federighi, the operating systems guy, and Phil Schiller, the marketing guy, did with Jason Snell at Macworld last year, and you can see just how prescient it is, or rather, how the guys at Apple were telling us exactly what they were doing.

“The reason OS X has a different interface than iOS isn’t because one came after the other or because this one’s old and this one’s new,” Federighi said. Instead, it’s because using a mouse and keyboard just isn’t the same as tapping with your finger. “This device,” Federighi said, pointing at a MacBook Air screen, “has been honed over 30 years to be optimal” for keyboards and mice. Schiller and Federighi both made clear that Apple believes that competitors who try to attach a touchscreen to a PC or a clamshell keyboard onto a tablet are barking up the wrong tree.

“It’s obvious and easy enough to slap a touchscreen on a piece of hardware, but is that a good experience?” Federighi said. “We believe, no.”

“We don’t waste time thinking, ‘But it should be one [interface]!’ How do you make these [operating systems] merge together?’ What a waste of energy that would be,” Schiller said. But he added that the company definitely tries to smooth out bumps in the road that make it difficult for its customers to switch between a Mac and an iOS device—for example, making sure its messaging and calendaring apps have the same name on both OS X and iOS.

“To say [OS X and iOS] should be the same, independent of their purpose? Let’s just converge, for the sake of convergence? [It’s] absolutely a nongoal,” Federighi said. “You don’t want to say the Mac became less good at being a Mac because someone tried to turn it into iOS. At the same time, you don’t want to feel like iOS was designed by [one] company and Mac was designed by [a different] company, and they’re different for reasons of lack of common vision. We have a common sense of aesthetics, a common set of principles that drive us, and we’re building the best products we can for their unique purposes. So you’ll see them be the same where that makes sense, and you’ll see them be different in those things that are critical to their essence.”

Unlike Microsoft and a handful of other manufacturers, Apple sees a unique place for each device in the gadget triad of phone, tablet, and PC. Rather than meld them, and worry about merging for the sake of merging -- for the sake of reducing the number of devices one has -- they work on perfecting each device within the contexts of their individual places. And instead of hybridizing them, they build bridges, highways, tunnels, and even wormholes between them, drastically reducing the friction for making them cooperate, without making them the same. If they make good on their promises from WWDC, they will have proven that strategy to be very right.   

(How this will play out with a larger-screened iPhone, or dare I say it, an iPhablet, remains to be seen, though I feel some dread about it.)

And for Microsoft and its would-be customers, the question remains: why would I want my tablet to replace my laptop? Yes, it's great when new functionality comes to existing device categories. More data-sharing and third-party support on iOS will be great for letting me do more on my iPad, for example, but I don't want new features at the cost of the iPad being a crummier tablet. And I really do want to replace the laptop I have (an aging 2011 11" MacBook Air), but I want to replace it with a better laptop, not a worse laptop that also happens to be tablet-like. Who would?

I usually really don't like going to the beach, but I acquiesce for the sake of getting the kids out of the house, plus my wife really loves it. But today, I had a blast. It was cool, the sun was unoppressive, the kids were having a good time, but mostly, I became fascinated by the extraordinary variety of rocks and pebbles strewn all over the shore. And whatever was visible before a wave came in could change entirely after. I took some photos of what I saw, and while I also have plenty of my own personal photos of the family, mostly these are of the landscape and the rocks and the water. 

Twitter Tsunamis of Desperate Signaling

Alan Jacobs on the swarm of me-too righteousness online, in the form of “Twitter tsunamis.”

This kind of thing always makes me want to flee Twitter, even when I am deeply sympathetic to the positions people are taking. It’s a test of my charity, and a test I usually fail. To me these tsunamis feel like desperate signaling, people trying to make sure that everyone knows where they stand on the issue du jour. I can almost see the beads of sweat forming on their foreheads as they try to craft retweetable tweets, the kind to which others will append that most wholehearted of endorsements: “THIS.” I find myself thinking, People, you never tweeted about [topic x] before and after 48 hours or so you’ll never tweet about it again, so please stop signaling to all of us how near and dear to your heart [topic X] is.

Here’s one of the things I love about Jacobs. Even when he hates what you’re doing, he gives you so much benefit of the doubt.

Likewise, Freddie deBoer:

Indeed: sincerity, in these instances, is in abundant supply. What’s lacking is the understanding that good people being publicly sincere makes nothing happen. But what else are you going to do? What am I doing? What can I do? I don’t know. I don’t know.

I don’t know either, but I am more cynical. While I agree there’s sincerity behind these tsunamis, I also suspect that the impulse to act on them en masse, to no other end than to add a “+1” to what has already been said innumerable times, is a true expression of vanity in the digital age; it’s not much different from dressing in fashion, only here it’s done with text rather than fabric, joining the in-group via a conviction instead of clothes. It’s the yellow ribbon gaudily displayed by those who have never done anything to support a troop.

And lord do I hate it when people just type “THIS” and then a link. As though by doing so they have presented the final word on a subject, and can thereby bask in the glow of having delivered it to the rest of us.

Worse is “THIS. SO MUCH THIS.” It’s the triple-dog-dare of conviction-bearing tweets.

DeBoer again:

The trouble with talking about right and wrong in the age of the internet is that our communicative systems are oriented towards communicating only with those whom we wish to.

There’s no risk in taking part in one of these storms (unless you’re a woman, in which case you’re going to get a lot of shit from assholes, because you always do no matter the topic), because you know in advance that everyone shares your opinion. I of course can’t read anyone’s mind and can’t prove anything here, but my deep suspicion is that hand in hand with the vain in-grouping of these tsunamis is the pose of courage, that by expressing such and such an opinion, which I know is shared by everyone who will read it, I have somehow really put myself out there in a vulnerable position, but dammit, I can’t remain silent about this any longer. Never mind that no one else in this 48-hour period is being silent about it either, and being not-silent in the exact same way.

Oh, except for this writer from a favored partisan journalistic outlet, who really nailed it, beyond all dispute. This. So much this.

We Are All Short Now

Reihan Salam writes on short men’s failure to collectively reject heightism, and it’s a piece so good I found myself highlighting more than half of it for potential excerpt here. Rather than do that, let’s see if I can get to the meat of it.

First, a good description of the problem:

As I go through life, I will occasionally say, “well, as a short person …” before making some observation. And I’ve found that my interlocutor will often interject something to the effect of, “Hey, you’re not that short,” as if to reassure me. But why would this be reassuring if there were nothing wrong with being short?

And there’s not, of course. One thing I particularly like about Salam’s take is his acceptance that the preference for taller men in certain areas, such as in women’s choice of a mate, is a totally understandable, if regrettable, vestige of our biology. There’s no point in tying one’s guts in knots over an instinctive preference, which, thanks to civilization, can be overcome. (My wife is a bit taller than me and she likes me just fine.)

But there’s no reason to extrapolate that archaic preference into presumed height-based superiority. Being short is not an affliction, and it’s not a modern-world physical disadvantage. (Salam addresses the societal disadvantages, which of course all spring from these erroneous perceptions.) There’s nothing “better” about being tall. But we all behave – really, almost all of us – as though being short is bad, something to be ashamed of, and indeed, something to fudge.

And that’s the problem Salam wants to tackle here. Heightism is aided and abetted by short men themselves. They perpetuate the false idea that being short is a bad thing by doing things like rounding up their heights, or making fun of men who are shorter than they are. This must not stand, says Salam:

To be sure, rounding up is not the worst thing in the world. I’ll tell you what is the worst thing in the world. It is that short men who have internalized heightist attitudes are more likely to stand by as those shorter than them are casually mistreated. In our culture, men who are 5-foot-8 don’t see men who are 5-foot-1 as comrades. They treat their shorter brothers as strangers, or perhaps even as objects of pity or contempt. … To the short men among you, I’d like to ask: Have you ever poked fun at someone for their size? Have you done so to delight your taller friends, and to establish that you are truly one of them? If so, I’d like you to think hard about the place in hell that is reserved for your ilk.

Like many other accidents of biology such as skin color or sexual orientation, the stigmatization of shortness is arbitrary and baseless, and humans would do well to discard it right along with all of its other stupid prejudices. Of course, this particular stigma is not nearly comparable in severity to those based on race or sexual orientation (no one tries to ban short people from marrying each other, or from marrying tall people for that matter). But that only makes a call for short men to back each other up all the more compelling and sensible: Relative to other struggles, it’s just so damn easy. All we need to do is not buy into the myths and prejudices about height, and reject them out loud when we hear them, and things could change. We could start to make a lot of people’s lives easier and less filled with shame about something for which none should be felt.

For the record, I used to say I was 5-foot-6, when technically I measure 5 feet and 5 and a half inches. I rounded up. (My feelings about my height are a major focus of one of my songs as well.) But many years ago I decided it was absurd to try to eke out an additional half-inch for…well, for what? I’m 5-foot-5, and while there are many, many things wrong with me, that is not one of them.

Peak Outrage and the Exhausted Amygdala

Why have I lost interest in politics, when it was once such a passion of mine that I left theatre and performing Shakespeare for a living to pursue it? John Dickerson gets it. In a piece about the titanic clusterfuck that is the VA, he writes:

One primary reason to despair is that we’re already living at peak outrage. Fake umbrage taking and outrage production are our most plentiful political products, not legislation and certainly not interesting solutions to complicated issues. We are in a new political season, too—that means an extra dose of hot, high stakes outrage over the slightest thing that might move votes. How does something get recognized as beyond the pale when we live beyond the pale?

This is of a piece with the utter lack of generosity of spirit from even the most well-meaning progressives out there, who have been socialized to salivate at the prospect of uncovering the heretics in their midst, taking as much pleasure in siccing the mob on the perceived transgressions of fellow liberals as they do in substantive policy wins. How can you be truly moved to tackle problems like Veterans Affairs, climate change, or the Boschian hellscape of our prison system, when you're consumed by your fury over Alec Baldwin on Stephen Colbert?

More Dickerson:

As FDR said, the public cannot “be attuned for long periods of time to a constant repetition of the highest note on the scale.” If we are constantly yelling outrage, it leaves us with nothing when the real thing comes along.

True, but perhaps even worse, I suspect the constant repetition of outrage trains our lizard brains to be in a constant state of threat. Our collective amygdalae are pumping out fight-or-flight chemicals at such a rate that either everything looks like an equally existential threat or unpardonable offense, or we become exhausted and cease to care about much at all.

For myself, I have to wonder, now that I've passed both of these stages, is there any coming back? 

"Just Bring the Kids" is an Option. But it is One That Sucks.

Maybe this will help you understand. Christine Skoutelas at A Morning Grouch enlightens those without children as to why she and other people with kids seem to socially disappear. (Now, I’m inclined to socially disappear anyway, but having kids turns it from a predilection into a kind of existential necessity.)

Here are two parts of her post that really stood out to me. First, on why we don’t want to “bring the kids along,” even if it’s totally cool with you:

We just can’t focus on you very well when we have to simultaneously keep an eye on our kids making sure they don’t choke, drown in randomly placed vat of water or get a head injury bumping into the pointy corner of a table. We spend a lot more time and energy worrying about keeping our brood alive than you might imagine. A lot of times we host events you don’t get invited to. Again, this isn’t because YOU aren’t fun, it’s because our events aren’t fun, at least not for most adults. They are loud, obnoxious, and strategically located where there are wide open spaces or playscapes that allow toddlers to run and bounce off padded surfaces, screaming like banshees, that allow us to leave the Xanax at home since we don’t have to fear death by pointy edge.

Imagine the anxiety level in the room when it’s parents with kids doing stuff with other parents with kids! It’s like the other parents aren’t even there. But at least everyone’s on the same page.

But here was the big one for me, on the value of idleness:

We’re spending so much energy carrying, wiping, toting, cleaning, chasing after, listening to, reasoning with, teaching, and doing, that sometimes we need to just sit, in a quiet space, for ten or thirty or one-hundred-and-twenty minutes in a row, for our own sanity, and for the safety of those around us. There is no sleeping in, or afternoon naps, or resting on the weekend, so these moments are critical to help our bodies and minds recover and recharge for the remainder of our day or week. God help you if you infringe on our time we’ve allotted to revive ourselves.

Exactly. I’ve always been overly precious of my free time, since I’m a severe introvert and I value beyond measure my chances to get the hell away. But caring for kids, worthwhile as it absolutely is for me, just leaves me with nothing to run on. Day-to-day activity and interactions were already draining; being a parent has utterly sapped all reserves. It’s actually a miracle that I get to work from home, which significantly reduces my in-person time. But when I once donned the Blue Shirt in the Brushed Aluminum Retail Location, that was nothing but personal interactions.

Back then, though, I only had one kid.

Being a Bully is Good for You

The old trope has it that while bullies make your life hell during your years in school, once you’re all grown up and in the world, the bullies’ targets all become successful and self-assured while the bullies themselves wind up in crappy, dead-end jobs, miserable and full of regret and self-loathing.


Not so, say researchers at Duke: “Enhanced social status seems to have a biological advantage.” You don’t say. Apparently it has something to do with inflammation:

In adults, a high social status, including income or education level, is associated with lower levels of inflammatory markers, the researchers wrote.

“The finding of lower increases in [c-reactive protein] levels for pure bullies into adulthood is novel,” the researchers said, adding that previous work tended to focus on those who struggled through adversity.

Meanwhile, we bullying targets, the future-Bill-Gateses of the world, have a different fate.

I had always been skeptical of the bullies-are-doomed myth, which I know is uttered primarily to give the bullied a small sense of hope or justice, but rings incredibly hollow. Whatever the ugly motivations for bullying, the results are pretty much the same: the asshole gets to feel even better about himself than he did before. They get a little burst of confidence, validation, and a sense of superiority. Presuming they are not also getting bullied themselves (by peers, parents, what have you), it’s hard to see how bullying wouldn’t help them with their well-being into adulthood.

Goddamn it.

The iPad's Deep Niche


Jared Sinclair (whom I found by way of Alan Jacobs), in a well-reasoned post, comes to an “uncomfortable conclusion” about the iPad:

In order for the iPad to fulfill its supposed Post-PC destiny, it has to either become more like an iPhone or more like a Mac. But it can’t do either without losing its raison d’être.

I’m not at all convinced that this is true. First, though, I agree with him on some key points, such as:

Although both the iPhone and the iPad are multi-purpose devices, it seems only the iPhone fills a multi-purpose need in customers’ lives. A typical customer’s iPhone is put to work in all its capacity, while her iPad is relegated to only one or two niche uses. An iPhone is a phone, a flashlight, a GPS navigator, a camera, etc. An iPad can be most of those things, but in practice it gets stuck being just one or two of them.

I’d word this somewhat differently, but largely this is correct: the smartphone is a Swiss Army Knife of tools that most folks who are even tangentially involved in the information economy need to have on them today. And even if they’re not, it’s come to replace many of the other devices most consumers would otherwise think of as standard. You need a smartphone (here, specifically an iPhone) because you need a smartphone.

Sinclair argues that the iPad over-serves its users relative to the iPhone in many areas, and then goes on to contrast where the iPad falls short versus phones and Macs. Here’s what he says about the Mac:

A Mac is Better Than an iPad for…

-Workplace Productivity – The Mac has an exposed file system, physical keyboard, a pixel-accurate pointing device, and multitasking applications, all of which contribute to more efficient workflows.

-Power Computing – There are some professional tasks that require powerful processors, expansion ports, large storage devices, multiple displays, etc. These features are only available on a PC.

No arguments here. The Mac/PC is what you have to go to if you want to do most serious work in an efficient way. Yes, there are some blurry lines and overlap (lately I’ve found iMovie on the iPad more convenient than on the Mac for quick video editing of my kids’ antics, for example), but in 95% of cases, excepting those special circumstances that happen mostly in iPad commercials, you need a Mac to do “work-work.”

Here’s where Sinclair comes down:

I think the future of the iPad is for it to disappear, absorbed at the low end by iPhones with large displays and at the high end by Macs running a more iOS-like flavor of OS X. Perhaps it won’t disappear completely. After all, for certain niche uses [Sinclair is referring to things like reading, movies, and the things that happen in iPad commercials] … the iPad is great because it’s neither a phone nor a PC. But these are still niche uses and can’t possibly sustain the long, bountiful future that many hope the iPad has.

And here’s where I disagree. I can grant all the facts he presents. Yes, the iPad is not as good at utility-belt type stuff as the iPhone, and it’s not as good at get-work-done-at-my-desk stuff as the Mac. But I think that’s because, as I noted in the opening paragraphs of my iPad Air review, the iPad shines as the device you reach for when you don’t need the other two. Speaking in broad terms, you use a smartphone because you need, at that moment, what it provides. You use a Mac/PC because you need what it can do in order to do your job, or what have you.

You use an iPad, however, because you’re now on your time, and can do the things you want to do, rather than what you need to do. You can kick back and read, or watch movies, or draw, paint, fiddle with music, chat on social media, futz around on the Web, work on your novella, play Tiny Wings, and so on.

Again, there’s enormous overlap for these functions among all three of these device categories, but I still submit that smartphones and PCs broadly exist as “necessities” for modern work, and the iPad, broadly, is your off-time device, the device you use when “need” or “must” becomes “want” or “choose.”

I don’t see that going away by any means. For some, of course, a phone is enough, especially if they opt for phablets. For others, a laptop is sufficiently “casual” to use as one’s kick-back machine. But I think that tablets (iPads, really) excel in this space, being the device-of-choosing. That doesn’t mean that they will remain explosive in terms of sales. Once you have a good iPad, you don’t need to upgrade often at all. But if the space that the iPad occupies can be described as a “niche,” then it’s a deep and wide niche, one that will not be covered over any time soon.

(And a small note on phone sizes: Having now owned, briefly, a couple of medium-sized Android phones that were overall excellent, there’s just no getting around it: I highly, highly prefer the smaller size of the iPhone. One-handed use is something you don’t realize how much you miss until it’s gone, as it’s so incredibly useful and convenient that its utility dwarfs the benefits a larger and more dazzling screen might provide – and I include the 4.7" Moto X as too big. So I actually suspect/hope that phones won’t subsume the iPad in that particular direction by increasing in size. And I hope Apple doesn’t abandon the current iPhone size when it almost certainly introduces larger iPhones this year.)

Public Discourse, Public Persona

Marjorie Romeyn-Sanabria counters the eulogy for Twitter, which I responded to here, with thoughts about what makes Twitter valuable:

Twitter is a portal into public discourse, a tool that allows a glimpse into groupthink, and provides a platform to build your own public persona.

I have used the hell out of it for this specific purpose. When I began as a skepto-atheist blogger in earnest in 2008, still early for Twitter, I made a point of arbitrarily following almost anyone who had “atheist” in their profile bio, just to get the attention of a potential audience of net-savvy nonbelievers. I actually think this really helped, and I don’t think it’s something that you could pull off today, now that there’s a critical mass of both early adopters and normals using the service.

Today, I intentionally use it not just to vent or trade quips with friends, but to serve as a sort of marketing service for “Paul,” a kind of regularly-updated reminder that, hey, there’s this witty guy who’s always anxious and worried who writes pretty well. Maybe you will find that fact useful or monetarily valuable.

But I don’t know if that kind of usage, beyond that of celebrities, is what Twitter-corporate needs to stay solvent and relevant. It may, because while I’m a nobody, somebodies still do this all the time, and that has to matter.

For one’s own use of Twitter, counteracting my lament about dudgeon and finger-wagging, she recommends this:

In order to maintain the “good neighbors” aspect of Twitter, users may have to put up some good fences to protect their conversations from the wake of larger-scale shouting matches.

Right, which is why more often you see folks in the know recommending you start pruning your follows. TechRepublic’s Jason Hiner has a whole piece about this, where he even says he emails folks he unfollows to tell them there are no hard feelings. I wouldn’t necessarily go that far, but I am already becoming more merciless about who I follow on social media. There are still many I’d rather kick out of my streams, but I fear that fact being noticed. God forbid someone disapprove of something I do!

Twitter lists are key for this kind of curation, if you use the service as heavily as I do. There’s the big stream of everybody I follow, and then I have lists based on who I just don’t want to miss out on, or by topic/beat (like tech or politics). The problem is that, as far as I’ve seen, the only really good way to get everything I want out of lists and notifications is Tweetdeck, which, while great on the desktop, works only as a janky web app on the iPad, and not at all on iPhone. Time was I would tweet at least once a day at @Twitter to please, please make a native Tweetdeck client for iOS. I have thus far been ignored (see: me being a nobody).

Lament for a Pre-Dudgeon Twitter

The enemy of Twitter? It's us.

Well, not me. But possibly you.

Here's Adrienne LaFrance and Robinson Meyer with a eulogy for Twitter:

Twitter used to be a sort of surrogate newsroom/barroom where you could organize around ideas with people whose opinions you wanted to assess. Maybe you wouldn’t agree with everybody, but that was part of the fun. But at some point Twitter narratives started to look the same. The crowd became predictable, and not in a good way. Too much of Twitter was cruel and petty and fake. Everything we know from experience about social publishing platforms—about any publishing platforms—is that they change. And it can be hard to track the interplay between design changes and behavioral ones. In other words, did Twitter change Twitter, or did we?

Twitter changed, for sure, but that's not the real problem. It was totally us that spoiled it. And, again, not me. But maybe you, and a lot of other people who came on board (unwittingly, I presume) to find things that fire them up emotionally and allow them to feel morally superior, either by dint of being offended or as part of an upright citizens' mob against someone who said The Wrong Thing.

More LaFrance and Meyer:

…When it was good—when it is good—Twitter created an environment characterized by respect and jokes so funny you wanted to show the person sitting next to you in real life. Not agreeing could be productive, and could happen without devolving into histrionics. The positive feedback loop of faves and interactions didn’t hurt, either.

It can still be this way from time to time. The authors say that nobody “hangs out there” anymore, but I still do. It's like a neighborhood you grew up in, and love and know intimately, but then the place starts getting developed and folks who don't appreciate the place's quirks move in and try and sanitize it.

So there's Google+. I'm there a lot more lately, but as others have noted, this has a lot to do with the fact that so few people are there. That it hasn't taken off with the general public is a feature, not a bug. The folks that are there, well, they're not unlike those who were on Twitter in Olden Times. Early adopters, a little more technologically sophisticated, and eager to experiment with a new publishing platform. But of course, now Google+'s future is in doubt.

But in the abstract I prefer Twitter, because of its parameters, its limitations. The modern Web is too full of bells and whistles, of full-bleed images and dynamic content, of Choruses and Snowfalls. Twitter is (was) 140 characters of text, and we embraced the quirks and kludges that needed to be adopted within those parameters to make a little more sense of it all. It was simple, it was busy, it was a percolator of thoughts, both profound and profoundly silly.

Now, it's people finger-wagging and high-horsing. Now, it's people trivializing the grave and ascribing gravity to the trivial. Now, it's high dudgeon as parlor game. Now, it's a lot of sadness.

For me, I mean. Maybe not you.

I hold out hope that there will be a boiling point, where the finger-waggers become so chronically incensed that they'll move on, and a little of what Twitter was might come back. I'll wait it out a while longer.

Hey, there's always


Do You Swipe Your Thumb at Me, Sir? (The Big iPhone vs. Android Problem)

I miss my iPhone, but there are a few small things and one big thing keeping me from going back.

To recap, I had traded in my iPhone to T-Mobile to get out of my AT&T contract and lower my monthly payments substantially, and switched to a cheaper, unsubsidized Android device, which is now a Moto X. (I'm actually back on AT&T now because they offered far better coverage, an even lower rate thanks to a new discount, and I'm still not on contract.)

The Moto X is a great phone. It's as thoughtfully designed as an iPhone, free of the crapware that plagues the Android universe, and full of genuinely useful, subtle features. 

But I really miss having an iPhone, and I'm considering seeking out a cheap, used one I can just drop into my existing plan (meaning no new contract). There are two big reasons:  one being ergonomic (even being a smaller phone for an Android, the Moto X is still not quite small enough for my wee paws), and the other being access to my music library. No matter what I try to do, I can't get my music listening experience on Android to come anywhere close to the way it was on iOS. And that wasn't even all that great, but it was a dream compared to the hacky kludge-fest I've been living through with Android. No, I won't subscribe to Spotify. 

But! On my Moto, I have an app that disables the PIN code lock screen when I'm on preapproved WiFi networks. I can activate its voice commands without ever touching the phone ("Okay, Google Now..."). I can share data from any app to any app. Those are quite nice, and would be sorely missed if I were back in iPhoneland. 

The big thing, though, is the keyboard. Here's an explanation from Andy Ihnatko from when he made the iPhone-to-Android switch:

[T]he real Win of an Android keyboard is its enhancements to the classic “tap and type” mechanism.

Android offers Swype-style typing as a built-in option. By sliding my finger from key to key instead of lifting and tapping, I’m sending more information about my intentions to the OS. It makes this mechanism faster and more accurate than tap-tap-tap. Swipe-style typing also makes the phone easier to manage one-handed. I can search for a name in my contacts without even slowing down my walking.

And if you don’t like any of the keyboards that ship with Android, you can install one of your own. My add-on keyboard of choice is SwiftKey. It’s doubleplus-brilliant and costs just four damn dollars. …

I find that typing on an Android device is faster and much less annoying than typing on my iPhone. It’s not even close.

This example also points out some of the philosophical differences that often allow Android to create a better experience for the user. Why is the iOS keyboard so stripped-down? Why can’t the user customize the experience? Because Apple’s gun-shy about adding features at the cost of simplicity and clarity. They’re not wrong; it’s a perfectly valid philosophy, and usually an effective one.

But sometimes, an Apple product’s feature lands at the wrong side of the line that divides “simple” from “stripped down.” The iPhone keyboard is stripped-down.


This is huge, and it's by far the biggest thing keeping me from running back to Apple. The difference between tap-typing and swipe-typing on a phone is night and day. Swipe-typing is much faster, much more accurate, and even a little fun. It's perplexing to me that the folks who work at Apple don't want this functionality for themselves. This is not some geeks-only niche add-on, it's a fundamentally superior way of inputting text on a smartphone. And guess what one does a whole hell of a lot on a smartphone.

The wisest thing for me to do is wait, see what Apple comes up with for iOS 8, or hope some developer at least makes a quality standalone word-processing iPhone app that uses swipe-typing. I don't expect it, though. It's not like this is a new idea. Apple's had plenty of time to add this feature, and if they haven't yet, there's no reason to suspect they will. 

I still think the overall experience of using an iPhone is superior to Android's, with less friction, more general enjoyability, and a nicer class of apps. But goddamn that "tap-tap-FUCK-taptaptap" keyboard. Goddamn it.


Your $2000 iPhone

An interesting infographic (source) on the full cost of iPhone ownership -- and really, it applies to all modern mid- to high-end phones, not just iPhones. I can't vouch for all the numbers, but this is good to have in the back of your mind when thinking about phone purchases. 

The first mistake people make is believing that the subsidized cost of the hardware (usually $200) is the real cost of the device, when it's far wiser to remember that this small device you're haphazardly tossing around and shoving in your jeans pocket is actually a super-advanced computer that costs roughly $650 or $700. 

But if you consider all that goes into owning one to make it usable, namely the cellular data and accessories, you really are in the $1500-$2000 range. That's more than almost anyone other than professionals pays for a brand new, top-of-the-line PC!
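The arithmetic above is easy to sketch out yourself. Here's a minimal back-of-the-envelope calculation; the specific dollar figures below are my own illustrative assumptions, not the infographic's numbers:

```python
# Rough cost of owning a subsidized smartphone over a two-year contract.
# All figures are illustrative assumptions for the sake of the sketch.
subsidized_price = 200     # what you pay up front at the store
monthly_plan = 60          # voice + cellular data, per month
contract_months = 24       # typical carrier contract length
accessories = 100          # case, cables, extra charger, etc.

# The carrier recoups the other ~$450 of hardware cost through the plan,
# which is why the monthly rate is what dominates the total.
total = subsidized_price + monthly_plan * contract_months + accessories
print(total)  # 1740, comfortably inside the $1,500-$2,000 range
```

Plug in your own plan and you'll likely land in the same ballpark, which is the point: the sticker price is a rounding error next to two years of service.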

Which may be fitting when you think about it, since these phones really are our PCs these days. (As opposed to tablets, which we all thought would be our PCs, and probably aren't going to be, but that's another conversation.) And in many ways they do more for us -- more of what we actually want to be doing -- than our desktops and laptops ever could.