The Tablet Reconsidered: A High Bar for the Middle Space

The tablet computer as we know it is about to turn 5 years old. Yes, tablet computers in some form or another existed before then, but on January 27, 2010, Steve Jobs introduced the iPad, which truly formed the basis for what we currently understand a tablet to be. Whatever we considered a tablet computer to be before then no longer counted. The iPad line itself is now in its sixth generation. So now that we’ve had tablets for half a decade, what can we fairly say about their impact and their role in our digital lives? Have they lived up to their promise, and do they continue to justify their existence as a product category unto themselves?

Yesterday, I wrote about how I was considering abandoning the iPad, despite my iPad evangelism of the recent past. As delighted by iPads as I have been, I’ve lately found them to be less useful and less necessary, as other products, namely smartphones and laptops, encroach on their territory.

I thought it might help to go back to the beginning, to the iPad introduction where Steve Jobs laid out the case for Apple’s decision to make the iPad to begin with. (As a side note, man, does going back to those old videos make me miss Steve.) As was his wont, Steve made some pretty bold claims about what role the iPad should play, and how it excelled in comparison to other devices, and all before anyone outside of Apple, Inc. had ever even used one.

“All of us use laptops and smartphones now,” he said, and asked, “Is there room for a third category of device in the middle?” He posed this question obviously as a fan of smartphones and laptops, being the guy who makes iPhones and MacBooks.

“The bar is pretty high,” he said, and he was right. In order to justify their existence, “those devices [in the middle] are going to have to be far better at doing some key tasks.” Among those tasks were:

  • Browsing the Web (which he said was a “tall order”)
  • Doing email
  • Viewing and sharing photos
  • Watching video
  • Enjoying music
  • Playing games
  • Reading ebooks

This product would have to be not just adequate at these tasks, but superior to laptops and phones. “Otherwise it has no reason for being,” Steve declared.

It’s interesting to note how definitive this position is. Steve wasn’t saying that the iPad was merely going to be a nice or novel way to do some computing tasks and consume some media, but that it would be better at all of those things than even the very products Apple made at the time.

And at the time of its introduction, and for some time after, it was hard to argue with him. Consider the state of things then, even in the best case. The iPhone 3GS, then the most current phone, had a 3.5″ screen – unthinkably small by today’s standards. On the flip side, the smallest laptop screen Apple produced was that of the 13″ MacBooks. The MacBook Air as we know it (with the tapered design and all-solid-state storage) didn’t yet exist. MacBooks weighed just under 5 lbs., and their batteries lasted a handful of hours at best. So in 2010 we had a situation in which mobile screens were tiny, and laptops were relatively heavy, short-lived on battery, relatively slow with traditional hard disks, and often hot. The iPhone was super-personal, but limited in functionality in comparison to a MacBook. The MacBook was super-powerful, but too much of a “machine” to feel personal or “fun.”

And for the sake of this discussion, let’s grant Steve’s position at the time that netbooks “aren’t good at anything” and dismiss them as viable options.*

Coming into this environment, the iPad is very compelling. It’s personal, in that you hold it in your hands, sitting back in a chair or what have you, and manipulate the content with your fingers. “It is the best browsing experience you’ve ever had,” Steve said, and repeated variations on “Holding the Internet in your hands; it’s an incredible experience.” And it was! It was also relatively powerful; powerful enough, anyway. You could take care of email, and get real work done if needed. “It’s a dream to type on,” said Steve, which is arguable at best, though I’ve found it to be a better typing experience than most. Perhaps due to my wee little hands.

He made other claims. He said it was superior for watching TV and movies, which I think was and still is true today.

He said it was the best way to enjoy music, which I think was and is still untrue. There’s no beating a pocketable device for being the central repository of one’s audio content. The iPad is arguably not even superior to a MacBook for music, as a MacBook can sit at a desk attached to a good sound system, where it’s easy to assemble playlists and do other fiddly things with one’s collection. An iPad simply has sub-par speakers and a big screen going to waste on album art. So, sorry, Steve.

He said it was the best way to read e-books, granting that Apple was “standing on [Amazon’s] shoulders” to do the Kindle one better. I think that claim was a wash at the time, as the iPad did not yet sport a high-res/Retina display, and text on the E-ink Kindles was much nicer to read. Today, it’s also a tough call. The glowing, high-resolution displays of the Kindle Paperwhite and Voyage are wonderful for reading, but a Retina iPad mini is in many ways just as nice, and there’s no way a Kindle can best the iPad for ease of use in its interface. So I don’t think this match-up has been sufficiently settled.

One claim I think we can say is settled is the idea that the iPad boasts “the best interface we’ve ever seen” for productivity apps, which at the time were the first iOS iWork apps. Maybe they were the best interface for tablet productivity apps, but that bar was so low that it was probably underground. But then as now, despite the many strides Apple and developers have made in the productivity space, the iPad still can’t come close to matching the laptop. I considered for novelty’s sake writing this piece on my iPad, but I couldn’t bear the thought of doing longform writing, editing, and formatting on it. So here I am on my MacBook Pro.

That said, I decided to do my writing at the local Starbucks (my “satellite office”), and since I hadn’t charged my Mac in a while, I had to make sure I brought my AC cable and found a seat near an electrical outlet. With the iPad, there would have been a much better chance that I wouldn’t have even had to think about whether the battery would last.

It’s not 2010 anymore, of course. In five short years, the landscape has changed enormously. iPads have evolved and improved to be almost unbelievably thin and powerful, and now come in two distinct screen sizes.

Other manufacturers, who once rushed out laughable “competitors” to the iPad, now make all manner of quality hardware, from the inexpensive-but-dead-simple Amazon Fires and the svelte and slick Nexus 7 and Nvidia Shield to the high-end, ultra-high-resolution Samsung Galaxy Tab S line. The biggest problem faced by these devices is the simple fact that the Android software ecosystem is still pretty lame for tablet-optimized apps, and that these manufacturers often build overly-complicated interfaces in order to squeeze in unnecessary “features.” (Not the case for Amazon or Google/Nexus, of course.) iPads remain almost indisputably superior, but they are no longer the only good choice, and the gap narrows more and more all the time.

But the real challenge to the iPad and to tablets generally is that the space in the middle that Steve Jobs talked about in 2010 has shrunk. A lot.

Let’s look at laptops. Again, in 2010, good laptops (meaning MacBooks) were 5-pound hunks of metal that needed power and heat dissipation and had slow spinning-disk hard drives. Today, MacBooks are light and svelte, have incredible battery life (with MacBook Airs well outlasting iPads), and pack game-changingly fast solid-state storage. Even MacBook Pros are lighter, thinner, and longer-lasting than ever before, coming close to the iPad in battery life. A full-size iPad has a 9.7-inch screen, but one can also buy a MacBook Air with an 11-inch display in a similar footprint, and if you match storage capacities between devices, the prices are only about $100 apart.

If I need to run out to, say, Starbucks and want to get some work done, is it really more convenient to bring an iPad than a MacBook Air? The MacBook will have a far better keyboard and vastly superior functionality for things like working with multiple apps in multiple windows, text editing and formatting, and simple things like copying and pasting.

But what if you’re hitting the coffee shop in order to kick back and relax, browse the web and read a book? Well, the iPad is great for that, and if you need to do some work, it can be done reliably. So is that the trump card for the iPad? As I’ve written many times before, the iPad for me is a “choose-to-use” device, the thing you reach for when the work you have to do is done, which usually happens on a PC or a phone. And in this scenario, sure, the iPad absolutely beats the laptop.

Of course we now have to look at the state of the smartphone. In 2010, phones were mostly small. The iPhone 5 with its 4-inch display was more than two years away, and even that is considered small by today’s standards; the smallest screen in the current line of new iPhones is 4.7 inches (though the 5S and 5C are still being sold as new by Apple).

But that’s just the tip of the iceberg. Big phones are now the norm in the Android space, and Android commands the largest chunk of smartphone market share. Anything under 5 inches is now considered “compact” or “mini” in Android-world. The most highly-regarded Android phones of the last couple of years (the 2014 Moto X, the HTC Ones M7 and M8, the Samsung Galaxies S4 and S5, and the Nexus 5) are all at or around 5 inches. (The 2013 Moto X was 4.7 inches, and it was regarded as adorably small.)

[Note: As I write at this moment, my battery on my MacBook is at 18% and I need to dig out my cable and plug into the wall. So there’s that.]


And this doesn’t even get into the world of phablets, loosely characterized by displays of 5.5 inches or more. Even Apple now produces a phablet in the iPhone 6 Plus (5.5″), and its fans are zealous ones; the Samsung Galaxy Note 4 (5.7″) is nearly universally adored. Both are loved for their large, high-resolution displays, and for one other big benefit: they obviate the need for tablets for many people. In particular, I hear anecdotal tales of people forgetting about their iPad minis, and I include myself in that. And then of course there’s the Nexus 6 at 5.96 inches.

With my excellent new LG G3 (5.5″), my iPad mini 2, at 7.9 inches, was rendered almost entirely redundant. I sold it, and since I assumed I’d still require a tablet in my life, and certainly still want some iOS device, I got a used iPad Air. But I found I still wasn’t using it much, despite the fact that it remains superior in some ways to the phone and the laptop. But maybe not in enough ways.

Last year I wrote in defense of the iPad as a writing device, comparing it to smartphones as cameras, adapting the adage that the best camera is the one you have with you. The iPad was there, and sufficiently capable to make it a great device to write with right now when the thought strikes. It is with you, isn’t it?

But my iPad stopped being with me all the time, and if I have to seek it out to write, I might as well seek out the better writing device, my MacBook.

What about the other areas in which Steve Jobs said the iPad was superior? Web browsing stands out, certainly, as there really is nothing like having an entire web page in your hands, one that you control with your fingertips. But it’s not so much better than doing the same thing with a large phone. It’s better, but not enough that it means I’m going to stop what I’m doing and seek out my iPad.

Book reading? Nope, the phablet is better. Ultra-light, with a screen comparable in size to a mass-market paperback, and ultra-high-resolution, crisp text. The iPad is a great e-reader, but the large phone with a high-res screen is perhaps the best one, maybe even better than the best Kindle.

Games? I’ll give this one to the iPad, certainly, at least for the games I like. Scrabble, Monument Valley, Robot Unicorn Attack 2, Tiny Wings, Bejeweled Blitz, Crossy Road – these games are much better on the larger display of an iPad. For other games, like Threes, it’s more of a wash.

For me, as an evangelical enthusiast of gadgets like these, there is an element of sentimentality attached to the iPad and tablets. Though I didn’t even own one until the iPad 3 in 2012, they entrenched themselves in my psyche very quickly (and, I suspect, in the psyches of millions of others as well). Whereas in 2007 almost nobody even had a smartphone, out of nowhere we’ve reached a place where it seems like a middle-class person in an economically advanced country is “supposed” to have a PC, a phone, and a tablet. (Kudos to Apple’s marketing for convincing us of this, whether or not it’s true.) I still love my iPad; it’s a beautiful, powerful, fun device. I have a genuine affection for it, and for the brand (again, a bow to Apple marketing). To disavow the use of an iPad, to even consider it, well, feels like a kind of apostasy. Like I’m going to disappoint someone or some higher power. (Stop glaring at me like that.)

Having an Android phone, I would also miss being in both conversations, as it were, because with no iPad, I have no iOS device. It also throws out all the time and money invested in that software ecosystem. But this is my personal issue, not a facet of the broader discussion. My personal decision is not yet made, and these things are always fluid, particularly for me. A change of mind a few weeks after any decision could mean more buying, selling, and trading to reconfigure my setup once again. I’m lucky that such a thing is even feasible, with a little work. And it’s fun.

At the end of the 2010 iPad event, Steve Jobs summed up what he had introduced as “Our most advanced technology in a magical and revolutionary device at an unbelievable price.” But now even price is no longer a marquee aspect of the iPad. At the time, people were shocked it wasn’t $1000. Today, very good tablets can be had for just over $100. iPads remain the best tablets, and really good ones (like the 16GB iPad mini 2) can be had for about $300, which is fair.

But with the encroachment of phones and laptops onto the iPad’s “middle space,” it’s hard to beat the price of zero dollars: No tablet at all.

– – –

Update: I have an addendum post that also takes into account something Steve Jobs didn’t: Comics and graphic novels.


* As my friend Tom Loughlin pointed out in the comments of my previous post, Chromebooks occupy an interesting position in all of this: not-quite full-power PCs and not-quite tablets, but a kind of secondary or “spare” PC for portability, battery life, and kicking around on a budget. They don’t quite qualify for being part of this discussion per se, and one could for the most part transpose Chromebook for MacBook throughout this post, though I also understand that for many a Chromebook could be seen as an iPad replacement.

Here’s to the Dirty One: Tim Cook and His 53 Minutes with Charlie Rose

Apple CEO Tim Cook sat down on Thursday for the first half of what will be a two-part interview with Charlie Rose. I’m embedding the video, and below that I’ll have some thoughts.

Some things that got me thinking:

  • This is a good venue for Cook. He’s not tied to a script or performing for a crowd. He’s at a table with another dude in a black room just chatting. He’s humanized.
  • There was a lot of talk about Steve Jobs and the shadow he casts over Cook and Apple as a whole. I don’t think any particular statement by Cook about Jobs is new or newsworthy, but I will say that it’s the first time I got what seemed like a genuine sense of the emotional attachment Cook feels toward Jobs and his memory. You can hear his voice become louder, and see his eyes widen and his gestures sharpen as he talks about Steve and what he continues to mean to Apple. I think it takes an interviewer like Charlie Rose to bring something like that out of as reserved a person as Cook.
  • I really get the sense that when Cook talks about Jobs, he’s talking about a paternal figure, a kind of dad-who’s-my-hero. “I literally think about him every day,” he says. “He’s in my heart.” He talks about how he never really took to heart the idea that he could lose Steve to cancer. “I always thought he would bounce,” he says.
  • It was also telling how sincere his enthusiasm for the Beats acquisition seemed, but not because of headphones, which barely warranted a mention. It really did seem to be all about how the music service “felt” to him, and of course the unspoken part was how much better it was than Apple’s own, which also uses human curation, the Beats service’s claim to fame.
  • When asked who Apple’s competitors are, Cook mentions only Google. Rose offers Samsung as an obvious competitor, but Cook demurs and makes sure to note that Google makes Samsung devices’ operating system. Even Amazon is dismissed as a company that makes a phone that “you don’t see it in a lot of places,” and, yes, “they have some tablets.” And that’s about it.
  • Closer to the end, Cook mentions that Apple is working on “products that haven’t been rumored yet,” and I think I may have some fun with speculation on that one.

Oh right, the Teddy Roosevelt quote they talk about is this:

It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows the great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat.

And what does Cook say of himself in this context? “I’m the dirty one.”

The Waning Religious Experience of Apple Keynotes

When I first became an Apple fan about a decade ago, it took an inordinate amount of effort for me to actually see one of their keynote events or product announcements. They were rarely livestreamed, for one (though few things were back in 2005 or so), and even when they were made available online, I had crummy Internet connections or spotty wifi, and the stream would often come in pixelated and jittery.

Even as stream quality and accessibility improved over the Steve Jobs era, it never seemed like Apple much cared whether you watched. These are technically press events (or developer events, in the case of WWDC keynotes), and while it was fine if the general public watched them, that wasn’t really the point. Whatever the actual thinking behind them, it always seemed to me that Apple just didn’t care whether fans could see these things. If anything, that only made me want to see them more. As a true believer, I would seek out their wisdom.

As someone who found little to be enthusiastic about in any medium, the Apple keynotes were a kind of religious mass for me, where I and a diffuse and relatively small collection of enthusiasts would go out of our way to see what our Dear Leader, The Steve (peace be upon him), would bless us with, and what lessons he would have to teach. I didn’t (and don’t) care about sports events; I didn’t (and don’t) care about entertainment industry award shows. Everyone else did, it seemed to me, but Apple keynotes, they were my thing.

When I first began watching these (and I use this term specifically) religiously, Apple was unveiling things like Macs, computer operating systems, and iPods. Thanks to the popularity of the iPod, these events did grow more popular, but I suspect they still remained mainly the purview of zealots like myself.

I suppose something changed with the introduction of the iPhone, and later the iPad. The masterful presentation of both those devices, along with their imminent ubiquity, made something shift in the public hunger for the Big Apple Show. I suppose that even for the nominal Apple fan, the thinking went something like, “That Steve Jobs keeps showing us amazing things every time he steps on stage – I want to watch it happen!” And the charisma of Jobs, even when he was visibly ill, was undeniable.

But even as the audience for these events grew, it didn’t seem like the flock of Apple devotees was necessarily growing at the same clip as the raw number of consumers. Yes, people were terribly excited about new iPhones and iPads, no doubt. But whatever religious feeling Jobs and Jony Ive instilled in so many of us, our particular devotion wasn’t shared by the masses who were merely buying cool gadgets. Apple was now dominant in the industry, and its place in the culture was no longer that of a scrappy upstart that knew better but still couldn’t break through to the unwashed masses. Now everybody, washed and unwashed, wanted iThings.

And as we saw with this latest event unveiling the iPhones 6 and 6 Plus and the Apple Watch, Apple has completely changed its outward attitude toward the events themselves. No longer treating them as bone-throws or afterthoughts, Apple positioned this one as a first-order mass media event, akin to the Oscars or the Super Bowl, complete with a homepage-dominating countdown clock. (The fact that the livestream was a technical disaster is beside this immediate point.)

Part of this, I think, is because Tim Cook will never be Steve Jobs. Neither will Tim Cook + Phil Schiller + Bono + insert-name-here. Steve can no longer draw us in, so if Apple wants to keep up the mystique of the must-see product unveiling, they will have to fan some of the flames themselves.

I don’t know how many people saw (or tried to see) this latest Apple event. But I think it’s a safe bet that the vast majority, while excited about the new products, were not “Cult of Apple” zealots. We don’t get the same kind of sermons we used to from Steve; instead we get beautifully made opening videos, or narrations of a design process from Jony Ive. That’s all fine. It’s not the same, but it’s good, it’s fun.

And I still try and squeeze some religious feeling out of these things, difficult as it can be. I want to be bowled over, I want to be convinced of Apple’s ethos as they present it. At least for those couple of hours.

But just as the twinge of ecstasy some folks feel after church wears off upon returning to their regular lives, so too can an Apple zealot like myself step back and then think more critically. It is just about gadgets, right? They’re just a big, rich company, right?

Kind of. But for a while it was my thing.



Related: Three friends of mine joined me for a one-off podcast discussion about the Apple event. Check it out here.

The Apple Ethos is Bigger Than Apple Itself

The idea of Apple as a “cult” or “religion” is often expressed somewhat derisively, but there’s no doubt that for many, many people the company represents something beyond the products it produces, particularly when referring to Steve Jobs’s Apple. When Jobs died, I wrote about how the grieving wasn’t exclusively about the loss of a man, but of an ethos, a way of thinking and working. Here’s John Gruber explaining that ethos back when Jobs resigned in 2011, as his illness was overtaking him:

The company is a fractal design. Simplicity, elegance, beauty, cleverness, humility. Directness. Truth. Zoom out enough and you can see that the same things that define Apple’s products apply to Apple as a whole. The company itself is Apple-like. The same thought, care, and painstaking attention to detail that Steve Jobs brought to questions like “How should a computer work?”, “How should a phone work?”, “How should we buy music and apps in the digital age?” he also brought to the most important question: “How should a company that creates such things function?”

And then Gruber ends with the refrain that was often heard during the transition from the Jobs era to that of Tim Cook:

Jobs’s greatest creation isn’t any Apple product. It is Apple itself.

Jobs himself probably knew this: that the ethos of Apple was the product, the one that would birth all others to come. And that’s why he established Apple University in 2008, an institution internal to Apple that would formalize the propagation of the company culture long after he was no longer there to do it himself. Brian X. Chen at the New York Times got some rare glimpses into Apple University, which, like the rest of Apple, is kept under tight wraps.

Read the article if you’re curious about things like the individual classes, but this quote from analyst Ben Bajarin gives a good idea of the motivation behind it:

When you do the case studies on Apple decades from now, the one thing that will keep coming out is this unique culture where people there believe they’re making the best products that change people’s lives. That’s all cultural stuff they’re trying to ingrain. That becomes very difficult the bigger you get.

And from Apple folks Chen spoke to:

[Employees] described a program that is an especially vivid reflection of Apple and the image it presents to the world. Like an Apple product, it is meticulously planned, with polished presentations and a gleaming veneer that masks a great deal of effort.

Most folks will hear about Apple University and the attempts to codify the Jobs ethos and presume this applies more or less exclusively to the goings-on in Cupertino. But I can tell you from my own experience as a blue-shirted retail drone that, university or no university, the culture and values of Jobs and the company are instilled across the corporation’s many manifestations.

Matthew Panzarino understood this when he wrote his own response to Jobs’s resignation:

This philosophy has been instilled in Apple employees from the Retail Stores to the executive staff. No other major technology company employs staff as convinced that they are producing some of the best products in the world. This is a result of Jobs’ ability to lead by example, infusing the corporate culture with that same passion and pride of a creator.

To the retail employees, the store was (and I assume is) as much an iconic product as any iThing, where even those of us at the bottom of the ladder felt a little drunk on the effects of the Reality Distortion Field and googly-eyed from the glow of the logo. There’s a reason why a customer’s experience with employees at Apple Stores is so vastly different from anything else in the mall, and most anywhere else for that matter.

It seems to me, given that the Apple ethos is being instilled in a formalized way within the company, that it’s kind of a shame that it isn’t done outside the company as well. Not by Apple itself, of course, as that would be against its interests as a company; why teach competitors how to do what you do?

But go back to the retail example. Now that I know how things were done within Apple Retail, imperfect as it was, I’m ruined for all retail experiences outside the metallic box of the Apple Store. From high-end to low, I leave most retail experiences sorely disappointed, shaking my head and thinking about how whatever I just went through would never fly at Apple. There would have been an extra mile not traveled, a question unasked, a consideration not made.

This is just one example. Imagine if more aspects of life – be they commercial, governmental, artistic…anything – were to take more of an intentional cue from the way Apple works, or at least strives to work. Maybe there’s something funny about the idea of the “Cult of Apple” and the messiah Steve Jobs (peace be upon him). But if someone truly qualified offered a course or a certification in this ethos, I’d sign up right quick.

Apple may have been Steve Jobs’s greatest product, but the ethos that fuels it is an even better one. I certainly wouldn’t want a world of short-fused, megalomaniacal Steve Jobses, but I would like a culture that aspires more intentionally to the fractal Gruber described: “Simplicity, elegance, beauty, cleverness, humility. Directness. Truth.” That’s a church I could believe in.

Collective Genius and Brains in Vats

I’ve at times felt some discomfort with the idea of the “genius,” maybe chiefly because I discovered I am not one, much to my delusional childhood chagrin. But the more one knows about one’s brilliant heroes, who despite having powerful creative and intellectual gifts are also rife with human flaws, the more one sees that for “genius” to flower and actually become something meaningful, someone else needs to work alongside the genius. Last year Hélène Mialet wrote at Wired about Stephen Hawking (someone we regard as easily qualifying as a “genius”) and how in many ways Hawking is a “brain in a vat,” and what we think of as “Stephen Hawking” is really a larger gestalt of brilliant people:

In another version of Hawking’s story, we notice that he is more “incorporated” than any other scientist, let alone human being. He is delegated across numerous other bodies: technicians, students, assistants, and of course, machines. Hawking’s “genius,” far from being the product of his mind alone, is in fact profoundly located, material, and collective in nature. … What I discovered was that to understand Hawking, you had to understand the people and the machines without whom he would be unable to act and think; you had to understand the ways in which these entities augmented and amplified Hawking’s competencies.

And I think this applies to a lot of people we think of as either geniuses or extremely important or powerful. There’s a reason we can refer to President Obama, the White House, or “the administration,” and mean the same thing each time. The language we use reflects the understanding that the words and deeds of “President Obama” are actually those of a vast array of human beings working extremely hard, all under the banner of “Obama.” Barack Hussein Obama the person may be the hub of that network, the brain in the vat, but he is only one part of it. The successes and failures we ascribe to him are really borne by that network.

Joshua Wolf Shenk in a piece at the New York Times narrows this idea of “group genius” to the pair.

[A]n impressive body of research in social psychology and the new field of social neuroscience…contends that individual agency often pales next to the imperatives of a collective. The elemental collective, of course, is the pair. Two people are the root of social experience — and of creative work. When the sociologist Michael Farrell looked at movements from French Impressionism to that of the American suffragists, he found that groups created a sense of community, purpose and audience, but that the truly important work ended up happening in pairs, as with Monet and Renoir, and Susan B. Anthony and Elizabeth Cady Stanton. In my own study of pairs, I found the same thing — most strikingly with Paul McCartney and John Lennon.

I found this both enlightening and alarming. As readers of my previous blog have been told several times, I am a pretty severe introvert, and the idea that “genius” really requires pairs of people, or at least thrives best in them, worries me a bit. (The fewer the people in a group, the greater the intimacy, and the more threatening that can be to us introverts.) This is not to say I have never been or am incapable of being collaborative with another human being — I used to be a professional stage actor! — but especially these days it doesn’t come up much. I don’t even work in physical proximity to my coworkers, but alone in my home office.

Perhaps Lennon needed McCartney, Carl Sagan likely needed Ann Druyan, and presumably Steve Jobs needed Steve Wozniak and later Jony Ive (or Tim Cook?) — not as the person behind them, but as creative partners — and they were fortunate enough to find each other and decide to collaborate. But all these collaborations took place before the Internet dominated so much of our social lives. These people had to interact in meatspace. Even the founders of Google, Larry Page and Sergey Brin, came together in person in a garage.

But what Google and others helped enable with the rise of the Internet and the Web was collaboration — deep, meaningful, substantive collaboration — between people who have never been in each other’s physical presence. That gives me hope.

Now, just because it’s been enabled doesn’t mean it’s necessarily happening much. I genuinely don’t know, and we’re really only on the cusp of this being a viable way to work and create together, via our connected devices, never in the same room. As someone who works from home, I feel pretty confident that good work and collaboration are not only possible, but in many ways improved and augmented by remote interaction, and that I can thrive and excel in a vat-brain support network. But is this how the next “White Album” or Cosmos or iPhone will come to be?

I’d bet that probably yes, eventually, when more areas of friction in communication are removed, and there’s no meaningful difference between popping into someone’s office with a sudden whim and doing its equivalent online.

Our Finite Lives During the Tech Revolution: Hello from iMortal

Anyone with a passing interest in technology will be familiar with the “cult of Apple” cliché, the idea that Apple’s core users are less customers or fans, and more devotees and fanatics.

I’ve had a lot of fun with this idea, casting myself as a fundamentalist disciple of “The Steve” (peace be upon him), asserting, with tongue partially in cheek, that Apple could do no wrong, that all of its design choices and marketing messages were divinely revealed through Steve himself, as well as certain chosen prophets like Jony Ive, he of the Perfect White Heaven of Industrial Design. For a time, I even donned the blue shirt and belonged officially to Apple’s priesthood.

But as much as I do admire and connect with Apple and its devices, I am kidding on the square with all of my quasi-religious proselytizing and rhetorical genuflecting.

Along these lines, last year Brett T. Robinson wrote about the mystical properties being ascribed to the iPhone, and how Apple had succeeded in selling a mix of the physical and the metaphysical:

The iPhone and its touchscreen interface engage the technological faithful at a heightened level of intimacy. The iPhone is not a cold and lifeless machine; it is an enchanted talisman, animated by touch. It mimics an encounter with the transcendent by mediating the infinite body of online information and communication possibilities.

I’m not certain whether Robinson’s use of the word “talisman” in this context was the first I’d seen, but something about this idea, the iPhone-as-talisman, the tech gadget as religious artifact, really resonated with me. Instead of poking fun at the religious fervor of Apple fanboys, it suggested to me entirely new avenues of thought when it comes to the human relationship to our devices, the seemingly endless layers of technologies that surround us, support us, guide us, keep us, and in many ways define us.

Supernaturalistic religion is entirely false, baseless, dangerous, and on the decline. What moves in to fill some of the gaps it leaves behind? Ethics, science, humanistic compassion, and each individual’s own efforts toward making meaning within their lives, certainly. But I think that additionally technology, permeating our culture and superimposed over our day-to-day lives, is part of that. I think it’s a big part of that.

And I want to think out loud about that here.

iMortal will look at technology and the human experience, written from a skeptical and humanist perspective. Not every post will be about tech-as-religion per se, but the main focus of this blog will be this interplay of modern technology and the way we live our lives.

You like the banner? It was made by my most excellent friend Justin Sapp, who often does these nice things for me.

The contents of my other blog Near-Earth Object will be migrated here over the coming weeks, so don’t let that particular hodgepodge of subject matter confuse you, but also don’t expect dogmatic adherence to a singular topic from here on out either. I reserve the right to continue to indulge in writing about my absurd adventures as a parent, non-philosophical gadget reviews, laments about politics, and other sundry things that catch my interest. (I will also continue to contribute to Friendly Atheist.) But the intent of this blog is to be more specifically focused than the previous one.

We are mortal beings riding an incredible wave of technological change the likes of which our species has never seen. At the same time, so many of us are still mired in Bronze Age myths and magical thinking. What’s it all mean? What are we in for? I certainly don’t know, but I’m dying to find out.

So this is iMortal, a blog about our finite lives during the tech revolution.

It’s Not Your Job to Know What You Want: A Jobsian Lesson for Government?

Matt Bai of the New York Times wishes Washington could learn some lessons from Steve Jobs.

In his obituary of Mr. Jobs on The Times’s Web site, John Markoff quoted him as explaining his aversion to market research this way: “It’s not the consumers’ job to know what they want.” In other words, while Mr. Jobs tried to understand the problems that technology could solve for his buyer, he wasn’t going to rely on the buyer to demand specific solutions, just so he could avoid ever having to take a risk. This is what’s commonly known as leading.

Bai goes on to lament that modern politics are too inflexible to adapt to change in the way Jobs forced technology to adapt. But I want to take Bai’s analogy far further than he probably intends.

Let’s say we really want politics and government to learn from Apple and Steve Jobs. Let’s then take the above quote and amend it slightly to, “It’s not the voters’ job to know what they want.”

Apple is often chided by many of the more hack-prone members of the technorati for having a “closed” system; there’s no upgrading the hardware or tweaking the software of an iPhone, for example. It’s exactly as Apple intended it. Indeed, Apple’s whole point is to say to the consumer, ‘You are in no way expected to understand how this works. Just use it and enjoy it.’

Is there an analogy here for politics? Let me take this all the way down the line for the sake of bloggy brevity: Should government be more of a closed system? Should not voters simply elect the representatives and leaders who reflect their values and whom they trust to manage the almost-unmanageable machinery of government, and leave all the comprehension to those office holders? Might then government and politics be more flexible, more adaptable, and more efficient if every iota of their machinations were not broadcast and dissected for the under-informed electorate to (mis)evaluate?

I am not endorsing this approach, but it’s a worthy thought experiment. Could we still have an accountable and uncorrupt government and political system if the electorate is mainly in the dark — as a favor to them? Is there something simply leaning in this direction that does make sense and is less reminiscent of, well, 1984?

After all, if Apple’s approach is distasteful, the market will take care of them right quick, and folks can choose another option. Not so easy with government, I suppose, but as long as the principles of republican representative democracy remain intact, cannot this radical idea be beholden to similar “market-based” (read: electoral) forces?

I’m not sure. Indeed, there may be nothing to this. But I’m going to keep thinking on it.

Grieving an Ethos, Ctd.

Following up from my post on Steve Jobs, I decided soon after posting that I needed to make one addendum.

Lots of folks have been throwing Jobs’ name into lists including Henry Ford and Thomas Edison, among others, and I think those particular two men raise an interesting point: they were both, reportedly, pretty abysmal human beings. Edison was apparently rabidly litigious and brazen in his filching of credit for others’ ideas. Ford was, of course, a virulent anti-Semite. But both men changed the world in ways that have proved to far outshine their uglier sides.

Steve Jobs, according to most reporting, was not the easiest guy to work for, accused by many of getting what he wanted by being abusive or a bully. I can’t speak to any of this, of course, and it’s unsettling to think of it. To be fair, he is also widely reported to have been a great dad, a great friend, and a genuine lover of humanity. The point is that he was, like everyone, not perfect, and he probably caused a lot of unnecessary grief in his time. As the New Yorker’s Ken Auletta wrote yesterday:

You know, one of the things that’s interesting about Jobs is that in many ways he was not a nice man. And yet he was a brilliant man. He was not particularly kind to people. Ultimately, we won’t remember the personal cruelty; we’ll remember the great products. Ideally what you want to have is a greater balance between being nice and being effective than he achieved. But one of the questions is whether he could have achieved what he achieved if he were nice.

I’ll take that further. It’s okay to acknowledge his failings, his coulda-beens, and still feel a deep, painful sense of loss and despondency at his passing.

As someone who may not have been “a nice man,” he was certainly one of the crazy ones, in the best sense. No diss to Richard Dreyfuss, but Steve’s was always the voice that belonged in this bit of transcendent ad copy.

Grieving an Ethos: Thoughts on the Loss of Steve Jobs

Most of the people I work with are, naturally, having very strong feelings about the death of Steve Jobs. But not all, and that’s fine. My wife is also not what I would call crushed by his passing, but she is extremely sympathetic and supportive; she understands how I revered the man and what he built. But I do tend to get hung up on the contrarians: why should I mourn — why should I feel so deeply sad — over this super-rich guy who I’ve never met?

It is a deep sadness. There’s no getting around it; my heart has been very heavy, as though it were soaked in thick liquid. The idea of his absence is creating a physiological distortion that my mind can’t entirely process.

But I am also that guy who tends to roll his eyes at the gushing over dead celebrities. I didn’t understand the writhing over Princess Diana; she really did seem to me to be just a lucky rich person, who was probably very nice, and died tragically. But that happens all the time to good people who don’t happen to be really rich. I’m even worse with public figures who at least in part bring it on themselves through excessively damaging choices. I am the eye-roller over public gnashing of teeth at the expiration of the otherwise super-fortunate.

My feelings about the loss of Steve Jobs are different, though it took me some thinking to figure out how.

Steve Jobs was not tapped by fortune; he was not swept into history by forces outside his control. He built everything he had bit by bit. He suffered for his sins, took his punishments, and climbed back out of the ditch to become even stronger than he or anyone else could have imagined. He did it with his own talents, his own mind, his own nearly-insane work ethic. His own love of the work.

And what he built is, of course, more than a company. Just like the Beatles did more than just record some albums. Like the Beatles, Steve Jobs affected the culture. He crafted an ethos, he championed a way of thinking, he embodied a particular spirit. Not all aspects of what he created were of his devising, and his successes were not due to him exclusively.

But he is responsible for putting this ethos into practice so that it served as the framework for Apple’s products, for the company itself, and for all those who sought to learn and grow by that ethos. It is not so much the stuff he made for all of us to buy. It is the culture he spawned, something that now lives on in countless thousands of people who have opened their minds and hearts to it, this philosophy, this ethic, this credo.

I grieve for Steve Jobs much like I grieved for Carl Sagan, like many who grieved for John Lennon, before I was old enough to know who he was. These are people who not only added to the catalogue of human-borne discoveries, ideas, and products, but they changed — and in their way created — the best aspects of what we consider human culture.

I did not lose a friend or family member, but my species lost an important source for its culture. If my tears fall, they fall for the great hole left in our civilization by the loss of Steve Jobs. It’s hard to believe he could die at all. But he’s gone, and all we can do is work to fill that chasm with our own emanations, our own representations of that ethos.

It won’t be the same, but it might be enough. But to fill that hole, we’ll have to be crazy enough to think we can.