

a weblog by Paul Fidalgo
Time flies when you’re having fun, and it flies at Mach 5 when you’re not. When I hear my kids complain, “I’m bored,” I tell them how much I envy them. Oh, to be bored! To have no immediate demands on my time, energy, and attention! Boredom may appear to be an unpleasant state, but it’s also a harbinger and a breeding ground of things worth doing. It’s the preamble for activities of choice, not obligation.
By mere coincidence I read in succession two pieces on how terrible we humans are at perceiving time and its passage, and how we might alter those perceptions in a more meaningful and satisfying way. They are both entirely convincing, and yet they each offer conflicting ideal states of mind. Or they might not.
First, Alan Jacobs in The Guardian. (I have never met this man, but I swear I count him among the most valuable teachers of my life.) Jacobs refers to our culture, as driven by our various media, as “presentist.” He writes, “The social media ecosystem is designed to generate constant, instantaneous responses to the provocations of Now.” There’s no way to think deeply or consider alternate or broader perspectives because the fire hose of stimuli never ceases.
The only solution is to cultivate “temporal bandwidth,” which Jacobs defines as “an awareness of our experience as extending into the past and the future.” Less “now” and more “back then, now, and later.” And the way we do that is to read books. Old books, preferably. “To read old books is to get an education in possibility for next to nothing.”
That education sets the stage for one’s mind to not only absorb the wisdom and the mistakes of the past, but to contemplate how they “reverberate into the future”:
You see that some decisions that seemed trivial when they were made proved immensely important, while others which seemed world-transforming quickly sank into insignificance. The “tenuous” self, sensitive only to the needs of This Instant, always believes — often incorrectly — that the present is infinitely consequential.
But cultivating temporal bandwidth is happening less and less, it seems. And as Jacobs says in a separate post, “Those who once might have been readers are all shouting at one another on Twitter.”
But while Jacobs recommends steering us away from believing the present to be of prime significance, David Cain at Raptitude urges us to grasp the present more tightly, and let concerns about the past and future fade to the periphery.
And it is all to address the same basic problem: we feel washed away by the force and flow of time. Comparing an adult’s perceptions of time to a child’s, Cain writes:
As we become adults, we tend to take on more time commitments. We need to work, maintain a household, and fulfill obligations to others. […] Because these commitments are so important to manage, adult life is characterized by thoughts and worries about time. For us, time always feels limited and scarce, whereas for children, who are busy experiencing life, it’s mostly an abstract thing grownups are always fretting about. There’s nothing we grownups think about more than time — how things are going to go, could go, or did go.
Cain doesn’t point to social media or cultural illiteracy as culprits, but rather our disproportionate fixation on the past and the future. It may be that Cain is largely discussing a different scale of time than is Jacobs. Cain seems to be referring to our fixation on what has happened in the relatively recent past (10 minutes ago or 10 years ago, for example) and what the immediate future bodes (say, the next couple of hours or the next couple of months). Jacobs, by emphasizing the reading of “old books” (and by quoting lines from Horace) is certainly thinking of a much deeper past and a more distant future, spans that transcend our own lifetimes.
But as I said, Cain recommends regarding the past and future less, and homing in on the present. “The more life is weighted towards attending to present moment experience, the more abundant time seems,” he says. And the way to attend to that present moment, as clichéd as it might sound these days, is through mindfulness, which can mean meditation or any activities “that you can’t do absent-mindedly: arts and crafts, sports, gardening, dancing.” Here’s why:
It’s only when we’re fretting about the future or reminiscing over the past that life seems too short, too fast, too out of control. When your attention is invested in present-moment experience, there is always exactly enough time. Every experience fits perfectly into its moment.
Note that Cain never mentions reading as one of those activities that one can’t do absent-mindedly. I don’t know about you, but if I read absent-mindedly I’m probably not actually reading at all, or at least not in such a way that I’ll retain anything. So whether or not he intended it or agrees with it, I’m throwing “reading books” into that list.
This is the bridge that connects these seemingly conflicting viewpoints, making them complementary. Much of this rests on the difference in time scale I referred to, which, if taken into account, begins to form a complete picture. Few would dispute that fretting about the immediate past and future is detrimental to one’s experience of time, and few would claim that contemplation and consideration of history and the long-term repercussions of our actions is a waste of time.
The key word here might indeed be “fretting.” In this sense, the definition of “fretting” isn’t limited to “worrying,” but describes a broader practice of wasting energy and attention on things within a narrow temporal scope without taking any meaningful action to address whatever concerns might be contained within. We fret about choices we’ve made and what such-and-such a person is thinking about us or how we’ll ever manage to get through the day, week, or year with our sanity intact. We rarely fret about how the Khwarazmian Empire was woefully unprepared for the Mongol army under Genghis Khan in 1219, or how the human inhabitants of TRAPPIST-1d will successfully harvest the planet’s resources to support a growing populace.
And of course, nothing engenders fretting like social media. Already primed for fretting by the demands of work, family, and self-doubt, now we can fret in real time (and repeatedly) over anything relatives, acquaintances, total strangers, politicians, celebrities, and algorithms flash before our awareness. It is possible to exist in a state of permanent fret.
Let me tell you, time really freaking zooms when you’re fretting.
So let’s combine the recommendations of Jacobs and Cain to address our temporal-perception crisis. Let’s get off of Facebook and Twitter, let’s turn off the television, and let’s get to that stack of books (or list of ebooks if you prefer) and read. Let’s allow our brains to expand our awareness, considerations, and moral circle beyond this moment, this year, this era. Let’s not burden ourselves with the exhausting worries about what we’re reading or how long it will take to read it or what else we should be reading but aren’t. Let’s make time to chat with our kids and our parents, and write, tinker, draw, arrange, organize, build, repair, or tend as best suits us. Let’s stop and breathe and think of nothing for a few minutes as we focus on the present instant in time and space, even to the atomic level. And then let’s think big, daring, universe-spanning thoughts beyond all measure.
Let’s be bored, and let that boredom nudge, inspire, or shock us into activity, be it infinitesimal or polycosmic.
It will take practice. It will not be easy. Let’s accept that this, too, is a journey of time and effort and moments.
And let us fret no more.
If you feel so inclined, you can support my work through Patreon.
At Wired, Issie Lapowsky summarizes research confirming something unsurprising: more or less no one is ever persuaded to change their mind about a political position because of a post they saw on Facebook.
I suppose people do actually think that their social media posts are badly needed ammunition in the political war of ideas, and that their fierce, impassioned, and ironclad arguments will surely win over the misguided. I assume they really do think that. Intellectually.
But the truth, which I believe they at least feel at a gut level, is that these political social media posts are social tokens, signifiers of belonging to a particular group, earning good will and social capital by reaffirming that which they all already believe. That’s largely why I write political tweets, usually because I think I can do so in a funny way and get some positive validation that might begin to fill the abyss that is my self-esteem. My zinger about Trump or my spirited defense of Hillary isn’t going to move the needle one teeny tiny little bit in anyone’s mind, and I have no expectation that it will.
At this level it’s harmless (other than those perilous moments when my tweets are not affirmed and I fail to achieve validation). The problem arises when the posts and tweets and memes go from social tokens to something more like Madame Defarge’s knitting. Outside of the more black-and-white world of election-year D vs. R posts, social media posts involving politics and heated social issues are designed to affirm via othering, by striking clear delineations between the good people and everyone else who is irredeemably bad for failing to check every ideological box, whether they know those boxes exist or not.
And it’s not just reactions to one’s own posts that do this work. It’s the posts of others. Lapowsky writes:
The majority of both Republicans and Democrats say they judge their friends based on what they write on social media about politics. What’s more, 12 percent of Republicans, 18 percent of Democrats, and 9 percent of independents who responded say they’ve unfriended someone because of those posts.
So it’s not political persuasion, as we might like to believe, it’s shaking the trees for villains to fall out of, it’s political partitioning.
In the film Bananas, the Castro-like ruler Esposito delivers his first speech to his people, and tells them, “All citizens will be required to change their underwear every half-hour. Underwear will be worn on the outside so we can check.”
The kind of social media I’m talking about is that underwear you just changed, and you’re pretty damned proud that you did it after only 29 minutes.
Yesterday, I tweeted:
Get really mad. Together.
Twitter.
Ha ha I’m so witty. Anyway, it’s an expression of my feeling of alienation from the mob attacks that pass for “debate” on Twitter and other online outlets. Last year I put it this way:
There is plenty of argument online. But actually relatively little open disagreement. [It’s] really just agreement on the position that those other people (or that one poor dumb bastard) on the other side are wrong.
It’s people, astride very tall horses, agreeing at other people.
At Big Think, Jason Gots traces this phenomenon to annoyance, the power that being irritated by a person or an idea can have on us emotionally. And where do we go to vent our emotions? Twitter! Gots writes:
Irritation is a powerful force. It has the whiff of righteousness.
Think about how you feel when (if) you’re annoyed at a smoker, or the way someone drives, or how they dress, or how they parent. Admit it, you feel bothered by their wrongness, as though a moral principle has been broken. Ugh, look at that person just lighting up like it doesn’t even matter. Ugh, look at that mother letting her kid behave that way. Most of the time, these things don’t even affect you. You just feel like they’ve violated some sacred rule even though it has nothing to do with you.
More Gots:
[Irritation] inspires dread in the meek. If you read old accounts of any society that eventually erupted in some form of ethnic cleansing or witch-hunting, you’ll hear people gossiping and commiserating about the annoying habits of the marginalized group, nodding their heads in agreement about the ways in which these people obviously don’t “get” the rules of society that are perfectly obvious to anyone with common sense.
Have you ever known that a romantic relationship was more or less over, as far as you were concerned, but couldn’t really justify a breakup with any ironclad offenses? They haven’t wronged you or cheated on you, you’re just done. So (and I know I’ve done this in my stupider days) you unconsciously begin to invent things that bother you about them, or the small annoyances that never mattered before suddenly become deal-breakers. Now apply that to one cultural, ethnic, political, religious, or any other identity group’s feelings toward another. (Android people can’t stand how snooty Apple people are. Apple people can’t stand how crass and tasteless Android people are. They are so annoying.)
Gots doesn’t just shrug it off, though. He wants us to stop it. And it’s harder than it seems to stop.
Words have power, and the line between opinion and fact is not nearly so clearly demarcated as it once was. So when your rhetoric suggests that something you’re saying should be completely obvious to anyone who isn’t an idiot, you’re basically bullying people into agreeing with you.
Remember bullies in the classic middle and high school sense? Well I sure do! And they were always so annoyed by my existence. The fact that I was, well, the way I was, way over there, away from them, really irritated them. That feeling justified their ruining several years of my life. What I looked like, what I was into, the way I stood or sat or walked, it wasn’t right, so I had it coming.
This happens all the easier if you, the hearer of a given annoyance, don’t know any better. If I’m an Apple fan, and I hear all this irritation with Android people, I’ll be pretty likely to share that opinion and that annoyance, even if I have no experience with Android or its users. If I know nothing about feminism, and a bunch of dudebros I follow go on about how annoying they are, what with their always asking for equality and whatnot, I will likely share this opinion of them whether I intend to or not.
Because you see I probably don’t know any better, and I certainly don’t want to be on the outs of my group, right? I can’t be sticking up for Android or feminism or whatever, because then I’m the one my group gets annoyed at. Then it’s open season on me.
Oh hey, it’s open season on Rachel Dolezal, isn’t it? We’re so annoyed and irritated that she thought it would be okay to just pretend to be black. She’s fair game. It doesn’t matter that we don’t know why she does it, or what might have happened to her, or been done to her, to make her want to escape her identity, to turn away from her other life. Let’s make her feel even worse because we’re annoyed.
Arthur C. Brooks had an op-ed in the Times a few days ago about political hating, but it applies to all of these things. His advice:
Declare your independence by not consuming, celebrating or sharing the overheated outrage and negative punditry — even if it comes from those with whom you agree. Avoid indulging in snarky, contemptuous dismissals of Americans on the other side. And always own up to your views.
Imagine that.
You may recall that last year paleontologists announced the discovery of a dinosaur species they called Dreadnoughtus, thought to be the largest land animal ever to live. Cool, right? Suck it, Argentinosaurus!
Anyway, my 5-year-old son has a project this week in his preschool class on dinosaurs, his favorite subject. He had to choose one to report on, and build a poster based on what he learned. Well he already knows gobs of facts about all manner of dinosaur species, so in order to up the ante and challenge him a bit, we chose, you guessed it, Dreadnoughtus.
He was really enthusiastic about it, he knew he’d be the only kid to choose it, and he threw himself into learning new facts about it, and especially drawing his masterful picture.
I snapped a picture of my wonderful boy and his project, and shared it to the inter-social-webs. And guess who responded to the tweet? None other than Ken Lacovara, the paleontologist who discovered Dreadnoughtus! (He describes himself in his Twitter bio as “Papa to Dreadnoughtus.”) He’s the guy lying next to the fossil in the picture on my son’s poster above. He tweeted:
Nice! Please tell him I said he did a great job!
And on my contention that my boy would “kick those other kids’ [projects] butts,” Lacovara said:
Totally
I echo what my wife Jess said about this: It’s this kind of thing that’s so wonderful about the social Internet. That my preschool-age son could excitedly work on a project about a dinosaur, and almost instantly be encouraged and congratulated by the very person who discovered it.
Anyway, thank you, Dr. Lacovara!
Technology is all about change, and rapid change at that. But even with the pace of technological development being dizzyingly fast, there are still larger paradigms, grander assumptions and codes of conventional wisdom, that are more or less static. In 2014, though, a lot of those paradigms shifted, and many of our preconceptions and understandings were altered, enlightened, or totally overturned. Here’s a short list of some of those paradigm shifts in tech in 2014.
Microsoft the Scrappy Upstart
In another age, Microsoft was the Borg, the unstoppable and loathed behemoth that destroyed all in its path. Then, sometime in the middle-to-late twenty-aughts, it became the ridiculous giant, releasing silly products, failing to even approach the hipness cachet of its once-defeated rival Apple, and headed by a boorish clown prince. Zunes? Windows Vista? The Kin smartphone? Windows 8? “Scroogled”? Each risible in its own way.
And then Microsoft got a new boss, and Satya Nadella’s ascent immediately changed the public perception of the company, especially among the tech punditocracy. The products still weren’t fantastic (Windows 8.1, Surface Pro 3), but the company began to emphasize its role as a service provider, ubiquitous not in terms of desktop machines, but in terms of the various services through which all manner of machines and OSes did their work. Think OneNote, Office 365 on iPad and Android, Azure, and OneDrive. The tide had turned, and now as Google and Apple (and Facebook and Amazon) battled for supremacy, Microsoft would simply work with anyone.
To get a strong sense of the change in attitude toward Microsoft, listen to premier Apple blogger John Gruber’s interview of Microsoft beat reporter Ed Bott on The Talk Show early this year, recorded at a Microsoft conference, at which Gruber was featured as a marquee user of Microsoft services. Gruber and Bott were full of hope and admiration for the old Borg, which would have been unthinkable even five years ago. It is a new day indeed.
“I Was into Big Phones Before it Was Cool”
When Samsung unveiled the Galaxy Note in 2011, it was ridiculed for being absurdly huge, as though anyone who bought one should be embarrassed about it. Today, the original Galaxy Note would be considered “medium sized” next to current flagship phones, almost all of which have displays over 5 inches. Meanwhile, even larger phablets are objects of high desire and status, such as the Galaxy Note 4 and the iPhone 6 Plus. Even “mini” phones (the 4.3-inch HTC One Mini, for example) have displays bigger than the biggest Apple offered as recently as 2013, when its lineup topped out at 4 inches.
No longer silly, phablets are now considered high-productivity machines, the mark of a busy, engaged technophile, and are perceived to be eating well into the tablet market. (They’re still too big for me, but even I could be turned.) Big phones are now just phones.
Podcast Inception
At some point in 2014, it was decided that everyone in tech must have a podcast. If you worked for a tech site, you had a podcast (like me!). If you worked at a tech company, you had a podcast. If you’d just lost your tech job, your new tech job was to have a podcast. And on those podcasts, they would have as guests and co-hosts people who also had podcasts, because, of course, everyone had a podcast. On those podcasts, they would talk to their fellow podcast hosts about podcasts, making podcasts, the future of podcasts, the monetization of podcasts, and podcast apps.
I predict that sometime in the middle of 2015, there will be a Podcast Singularity which will swallow up all tech podcasts into an infinitely dense pundit which will consider how this will affect the podcast industry, and will be sponsored by Squarespace.
Amazon’s Weird Hardware
Amazon was on a roll. The Kindle had proven itself to be an excellent piece of hardware years ago, and solidified this position with the magnificent Paperwhite in 2012. In 2013, its Fire tablets had become genuinely high-quality devices that were well-suited to most of the things anyone would want a tablet for, with strong builds, good performance, and beautiful screens. It seemed like Amazon was a serious hardware company now.
Then it released the Fire Phone, and everyone got a queasy feeling in their stomachs. A half-baked, gimmicky device that was incredibly overpriced, it landed with a thud, and Amazon continues to slash its price to clear out its inventory. (People really like the Kindle Voyage, I should note, and the Fire TV has been much better received as a set-top box, though my own experience with the Fire TV Stick was very poor.)
And then they awkwardly previewed the Amazon Echo, the weird cylinder that caters to the dumb informational needs of a creepy family, and the head-scratching turned to scalp-scraping. Amazon’s status as a serious hardware maker was no longer a given.
The Revolution Will Not Be Tablet-Optimized
The iPad was going to be the PC for everyone. Most people would not even bother with a computer with a monitor and a keyboard, they’d just get a tablet, and that’d be it. PCs would be for professionals in specific situations that required a lot of power and peripherals. For the rest of humanity, it would be tablets all the way down.
Of course, now we know that in 2014, tablet growth slowed, and few people use their tablets as their primary computing device. Instead, they’re casual devices for reading, browsing, and watching video. Despite the niche cases heralded in Apple’s “Verse” ads, on the whole, tablets have become the kick-back magazines of the gadget world.
That’s fine! I’ve written before that iPads/tablets are “zen devices of choice,” the computer you use when you don’t have to be using a computer, unlike smartphones and PCs which are “required” for work and day-to-day business.
The shift this year is the realization that tablets are (probably) not going to take over the PC landscape, especially as phones get bigger and laptops get cheaper and sleeker. Could there be any better argument against an iPad-as-PC-replacement than Apple’s own 11″ MacBook Air? Even Microsoft, which once positioned its Surface machines as iPad replacements, now markets them as MacBook competitors. Why? Because tablets just don’t matter that much; they’re more for fun, and the Surface is for serious business.
Forcing the tablet to be a PC has proven so far to be awkward and hacky, and PCs themselves are better than ever. The iPad revolution may never be. Which, again, is fine, but in 2014, we realized it.
(And relatedly, e-readers aren’t dead!)
The Souring of Twitter
Twitter hasn’t always made the best decisions, and sometimes even its staunchest defenders have had to wonder what the company really wants to make of its crucial service. But to my mind, in 2014 the overall feeling toward Twitter has tipped from reluctant embrace to general disapproval. It’s gotten worse on privacy, it’s been MIA or unhelpful in handling abuse and harassment, and it’s begun to seriously monkey with what makes Twitter Twitter. And more and more, I read pieces by once-avid Twitterers saying just how miserable the torrent of negativity makes people feel. Once the underdog to Facebook that all those in the know called home, it now looks like a hapless, heartless, clueless company that has no idea how good a thing it has.
You Have Died of Ethics in Games Journalism
Tech has always been a boys’ club, but in 2014, a lot of the industry decided it shouldn’t be anymore. As more and more instances of harassment, abuse, sexism, and overt misogyny were exposed – in the wider tech industry and in gaming particularly – more and more people stood up to declare the status quo unacceptable. A wider embrace of inclusiveness and encouragement of women in tech emerged, along with, of course, a counter-reaction of hatred and attacks from those who liked things as they were.
2014 forced the tech universe to confront some very, very ugly things about itself. But it will likely prove a net win, as more of us work to fix it than don’t.
(I have this shirt with the above image, and it’s here.)
Google’s Glass Jaw
In 2013, Google Glass was the future, the way all things tech would soon be. In 2014, no one wears them, a consumer version seems to remain a fuzzy concept, and even those who were breathlessly enthusiastic about it have felt their novelty wane. The tech punditocracy is now waking up from its Google Glass hangover, and they’re all a little embarrassed.
Now, of course, we’re all excited about watches. It remains to be seen what we feel like the next morning.
Twitter released its strategy statement to investors on November 12, 2014, to (at best) mixed reception. Here, Kermit the Frog performs the statement, word-for-word.
See also Kermit performing Mitt Romney’s explanation about releasing his tax returns from 2012.
How much do you care what people think of you? How much do you care what people you’ve never met think of you? How much do you care what people you’ve never met think about any individual choice you make or opinion you share? How much do you care what people you’ve never met think of the specifics of the format, timing, wording, tone, or technological means of the opinion you’ve shared?
If you use Twitter, or social media in general, you already kind of know the answer, or at least you’re learning it.
I have been learning some lessons myself about social media: how it can be used either passively or with intention; how it informs our personal identity; and how I have allowed it too much unfettered access to my nervous system, among other things. Clearly, for all its benefits, Twitter is also an enormous source of potential stress, eliciting what I call the Torrent of Feelings. I won’t get into the myriad factors that make this so. Browse this here blog, and you’ll see other ruminations on this subject.
What occurs to me lately is that a lot of the stress that Twitter (et al.) engenders has to do with our perceptions of being judged. The more you present yourself on one of these platforms (and I’ll just use Twitter for now, since it’s my traditional platform and I’m tired of typing provisos indicating “et cetera”), the more you have your sense of self and identity wrapped up in it. And that can make one sensitive to the scrutiny that comes with such exposure.
Freddie deBoer recently put it like this:
“You’re doing it wrong” is the internet’s truest, most genuine expression of itself. For whatever reason, the endless exposure to other people’s minds has made the vague feeling that someone, somewhere, is judging you into the most powerful force in the world.
But what is being judged? The more I think about it, the more I think the answer is “everything.” And not “everything” in the sense of one’s whole self. That is happening, but it’s piecemeal. Very piecemeal, granular in the extreme. Because of course no one can encapsulate their whole selves in a tweet, or even a series of them, so judgment comes in small units. The hyperscrutinization that people experience (I know I do) on Twitter happens tweet by tweet, and on down.
Of course you can be called out for the substance of your opinions and choices, whether deservedly or not. But you can also be derided for your word choice, the timing of your tweet, your grammar, your nuance, your lack of nuance, your hashtag use, your frequency of tweeting or lack thereof, what client you’ve chosen to tweet from, and so on. And in those instances, though they are highly focused, the effect on the recipient is to add it to the collection of judgments about themselves as people. As Boone Gorges puts it, “A life spent on Twitter is a death by a thousand emotional microtransactions.”
And while I strongly advocate using Twitter and social media with great intention, there’s not much you can do about this micro-judgment phenomenon besides not using Twitter. That’s because Twitter is used by humans (usually), and humans, even the ones we really like, also tend toward the shallow and the knee-jerk response in an environment that fosters that kind of thing. Gorges again:
Every tweet I read or write elicits some small (or not so small) emotional reaction: anger, mirth, puzzlement, guilt, anxiety, frustration. I’ve tried to prune my following list so that when I do find myself engaging in a genuine way, it’s with a person I genuinely want to engage with. But there’s a limit to how much pruning can be done, when unfollowing a real-life friend is the online equivalent of punting his puppy across the room. So all day long, I’m in and out of the stream, always reacting to whatever’s coming next.
And there’s a domino effect. Especially during times of collective stress (such as the siege on Ferguson, the death of someone notable, etc.), those on the periphery peek in, see the Torrent of Feelings swirling around them, and judge the validity of it all. Erin Kissane writes:
In the flood of information and emotion from something like Ferguson (or war crimes or an epidemic) … there we all are, gradually drowning. So people get huffy about the volume of emotion that these events arouse—angry that others are angry about the wrong things or too many things or in the wrong register. … (I am properly angry, you are merely “outraged.”)
It should be noted that of the three writers quoted here, all three have left Twitter. DeBoer’s been gone for a while I think, and the other two announced their exit in the quoted posts.
Now, I’m not leaving. I have too much invested socially and professionally in Twitter to forswear it. I will have to make do with diligent pruning, and accept that it will require a degree of fluidity: maybe I mute or unfollow certain people at certain times, and then bring them back to my feed at other times, for example. I will probably screw some of it up.
All of this is to say that Twitter is valuable, but we human beings are so damned vulnerable. The Twitter service does not care at all about this vulnerability, and probably thrives as a result of it. But I think we can do a lot to both harness Twitter’s positive value and remain highly mindful of its power to kill by a thousand cuts (and this is before we even get to outright abuse, harassment, and threats, which is a related problem at a much higher temperature). I’ll be thinking about these things as I tweet and react, but also as I take in the reactions of others to me. It won’t be easy.
—
Image by Shutterstock.
You are what you tweet? Do your “likes” tell the story of who you are?
For so many of us, major portions of our lives are lived non-corporeally. We don’t define ourselves solely by what we do in physical space in interaction with other live bodies, but also through pixelated representations of ourselves on social media.
How do we strike a balance between the real world and the streams of Twitter and Facebook? Can we be more truly ourselves online? When we tweet and share and comment, what parts of ourselves do we reveal, and what do we keep hidden or compartmentalized?
The way social media defines our identity is what we’re talking about on Episode 2 of the iMortal Show, with my guests: activist and communications expert Sarah Jones, and master of tech, Twitter, and self-ridicule Chris Sawyer.
Subscribe in iTunes or by RSS.
The iMortal Show, Episode 2: “Your Self, in Pixels”
Originally recorded September 3, 2014.
Produced and hosted by Paul Fidalgo.
Theme music by Smooth McGroove, used with permission.
Running time: 43 minutes.
Links from the show:
Sarah Jones on Twitter, at Nonprophet Status, and her blog Anthony B. Susan.
Chris Sawyer (with “sentient skin tags”) on Twitter.
Previous episode: “Ersatz Geek”
iMortal posts: