Thursday, November 21, 2013

So, Getting Used To Working At A Teaching School Is Weird, But Good.

I want to start this post by mentioning some things that I like. I'm going to do this because after that, I'm going to talk about how I don't like some of them as much as I like others, or, at least, how some have become more important to me than others, their value more obvious, and I don't want to give the impression that I dislike the less-liked things...I just like them less than maybe I once did.

When I switched my undergraduate major from Computer Science to English Literature way back in 2001 (or maybe 2002), I did it because over the first 2.5 or so years of college, I'd realized that I fundamentally, absolutely loved the lifestyle higher education allowed me to have. I was perfectly happy working 12 hours a day if that work was reading, learning, questioning, writing, exploring, and if that time was spent with and around others who also valued those activities. I was happy enough with this lifestyle and wanted to live it badly enough to give up a major that would have easily (in the early 2000s) landed me a job right out of college that would have paid significantly more from the outset than my current tenure track job will ever pay even at the full Professor level.*

My enthusiasm for this lifestyle has waxed rather than waned over the years. It's become less and less financially practical to actually make a living by working in higher education as an actual educator (it's another story if you want to be an administrator), but that continues to not really matter much to me. I still love writing, I still love reading. During my time at WSU, where I was required to teach courses while pursuing my graduate degrees in order to receive funding, I even discovered that I actually really like teaching, something I would never have imagined about myself back in the KSU days, when I was just desperately looking for an excuse to stay enmeshed in the intellectual environment of a large public university after getting my bachelor's degree. That said, some of my favorite memories of my graduate career involve participating in great seminar discussions after wading through hundreds of pages of theory, and then condensing and tweaking that theory to plan and write a dissertation that I thoroughly enjoyed and am (mostly) proud of. So, by the time I received my Ph.D., I had, usefully, realized that I loved both teaching and researching.

Interestingly, my new job wants me to do one of these and not the other. It's 2013, so I'm pretty sure you can guess which is which.

When I first learned this, during a post-interview phone call with my then-potential employer, my heart sank a little. Taking a job where I taught a heavy course load and wasn't expected to engage in research (read: "We won't give you time or money to do any research") seemed utterly depressing to me. I loved the research I was doing, and couldn't imagine giving it up to "just" teach more classes.

Well, as you can probably guess, this is the part where I decide that I don't like some things as much as others: flash forward about seven months, and I feel completely different about the whole teaching vs. research thing. I realize that my opinion isn't for everybody, and I'm not writing it here to force it on anyone else, but just to share my own experience. So.

Working at a teaching-oriented university is awesome.

See, the always-underlying issue I had with the work I was doing before was that it was pulling me in two opposing directions basically constantly. I love teaching and I love doing research, but when you're doing both at the same time, you're always doing both to about half of your ability. Or, at least that's how I've always felt. When I was able to just sit and read and write for a week, I was able to create some of my best-ever work. When I was able to just focus on lesson planning and teaching for a week, I was able to teach some of my best-ever classes. Out of eight years of work at WSU, how many times did I get these "one-only" opportunities? Maybe 2-3 times apiece. That's not an indictment of any particular person or department or policy, just an observation of how the general expectations that come down from on high at a huge public research university force you to mix, mingle, and often ultimately shortchange your priorities.

Here, it's easy: I work on teaching all the time, except for when there's a committee meeting or another service-related opportunity, and then I put the teaching aside for an hour and focus on that. The transition back is much easier, because helping to guide the course of a small department has a lot in common with teaching (and learning). Research, though it can certainly inform one's teaching, doesn't have a lot in common with pedagogy itself. Unless, I suppose, you are researching pedagogy. Which I was not.

So, practically speaking, my job's not nearly as confusing. I'm switching gears much less often, and though I work just as hard and often get just as tired, I'm not burdened with that feeling of being stretched too thin, of knowing I could teach better if I didn't have to attend my fourth conference of the year or knowing that I could really knock this article out of the park if I didn't have to teach three classes this semester, and so on.

Existentially, focusing just on teaching is a lot nicer, too. This is what's really surprised me, and what's possibly going to piss off some of the gentle readers out there. Surprisingly, "service" is no longer a bad word to me. Once, it was that pesky thing that you half-assed on your CV, the grad student equivalent of putting Who's Who Of American High School Students on your college application. Now, it's things that I do naturally every day as a result of being part of a small department, in a small town, with a recognizable and manageable community of students and faculty and administrators.

Because this is such a small school and the only thing that's more important on faculty evaluations than "service" is teaching, there's (generally speaking, at least) a positive attitude toward university- and department-related activities built into the way most people approach their work here. And this is great for me. Since I started in September, I've already made progress on helping to introduce a new academic program to the school, created six new courses that should all be seeing the light of day over the next 2 years or so, made inroads toward getting placed on two faculty committees whose work I think of as very important to the university and community at large, and met informally with a large number of faculty members to discuss how we can improve our Humanities offerings to more directly suit their programs' needs. I have a lot more time to meet with students individually, I've talked to every member of my department at length, both about work and more mundane things, and I'm setting up one-on-one, face-to-face meetings with the Provost and President of the university to discuss my thoughts on revamping the Humanities offerings and general education requirements.

I don't list all of those things to brag, but to make the points that a) I'm doing lots of things that are going to (hopefully) be directly valuable to the university and its student population and b) all of these things are way more meaningful to me than getting another article published in some obscure journal after months of toiling away alone in my office.

I once read an article (and I really wish I could remember where, because it would help this post out a lot) that suggested, pretty compellingly, that the "research" requirement in the humanities was something colleges and universities just sort of started imposing on mid-level faculty as a prerequisite for tenure, for the sole reason that there wasn't much else for them to do besides teach classes. In larger universities, departments got so big that there was really no meaningful way to include all faculty in the processes of governance and upper-level decision-making, so they were, essentially, given something else to do instead to hide the fact that the system had become too cumbersome to care about their input anymore. Now, I don't know how accurate that is (like I said, the article was pretty convincing, but I also didn't follow up by investigating its claims in detail or anything), but it rings fairly true based on my own experience.

Like I said at the outset, I love doing research and writing articles, and writing my dissertation is one of the most rewarding things I've ever done academically. But it's hard to regularly put so much time and effort into publications that are likely to be read only by a few people who are already hanging out in the same echo chamber as you, especially when the time spent writing could be applied to teaching or to working to make a positive change in your immediate department or university. Research is great, but if given the choice between working on an essay of mine for an hour or serving on a committee that's working toward a positive infrastructural change at my university for an hour, I'll always choose the latter. It's more immediate, and to me, more ultimately meaningful.

Certainly, when you write to publish, you're learning a lot about your field and then you're "creating new knowledge." But I can still do that by reading a bunch of books before choosing one for a class (which I do every semester) and then discussing that text in class, requiring my students (and thus myself) to critically think about the material in it for a couple of weeks. I can blog, tweet, talk to my colleagues, and engage in other forms of meaning-making based off of the things I'm reading and the concepts I'm pondering, all without having to resort to the draconian and ultimately somewhat hollow experience of hurling myself into the maw of academic publishing again and again. I haven't written any new article material yet this year and I don't really plan to, but I don't feel dumber as a result. I feel both smarter and more like my energies and talents are being applied to tasks that have a more immediate benefit both for me and for the rest of the university.

Of course, I don't mean this as a condemnation of humanities research or anything dramatic like that; it's just that it's been striking to me to realize how much more focused and simultaneously more relevant and productive I feel without having to worry about it as much anymore. I've been surprised at how hollow it seems (again, just to me) now that I have an actually well-defined and relatively powerful role to play in a small department that's excited about my input in a way that a larger department just couldn't be. It's a combination of factors that obviously isn't available to everyone who is in this line of work, and even if it was, I imagine a lot of people wouldn't want it. But it's working out really well for me by allowing me to stop focusing on the things that I've been told Matter Greatly and focus instead on things that matter to me intrinsically.

* At least two friends of mine who majored in CS around this time immediately started jobs with six-figure salaries after receiving their B.S.

Monday, November 18, 2013

Low-Attendance Classes Are Awesome. I Think.

This is something I think about and talk about face-to-face, in meatspace, with friends and colleagues a lot. But I've always been hesitant to post about it on the internet for fear that it will get a lot of those knee-jerk, judgmental responses the internet is so famous for. I've had a few good discussions arise from my blog posts lately, though, and very few heads have been bitten off. So I thought I'd try.

I take attendance when it's required that I do so by the school. Insofar as I'm allowed to formulate my own attendance policy, I have none. I make it clear to my students up front that a big portion of my class is discussion, participating and "meaning-making" in the classroom, on a day-to-day basis. If they choose not to show up for this, I don't directly penalize them, because I believe (and have seen after teaching over 1,000 students) that they are indirectly penalized and almost entirely across the board perform worse than students who have perfect or near-perfect attendance and participation.
For better or worse, this is generally my approach to teaching at a university: you, the student, are (nominally, at least) an adult, and you're free to make your own decisions about attending class, about doing the work, about putting in the time, etc. I make my expectations clear up front, and I make myself readily available to you if you want to do the work but don't quite know how to do the work. Even if you don't know how to correctly want to do the work, I'm willing to help. But I'm hesitant to force students to put in a good effort because then it won't be a good effort...at least that's how it seems to me. This has been my approach for a long time, and I'm mostly happy with it, though sometimes I feel bad about not being more prescriptive, as in "Student A wouldn't have failed if I'd forced him to turn in more drafts of his essay!" But that brings its own set of problems, I suppose.

Anyway, the point I'm trying to get at is that I find myself frequently happy when I have a low-attendance day or week in class rather than frustrated (which seems to be the "normal" faculty reaction to such things). I think it's because it reinforces the implicit assumptions I'm making in my attendance policy as I've described it above: if the students who show up are choosing to show up, and their motivation for showing up is interest, if not in the subject matter then at least in GPA-based success, then a low-attendance day actually raises the percentage of interested students in the classroom.

As someone whose courses are dialogically-based 95% of the time, I'm all in favor of smaller class sizes as a rule, but even courses with a twenty-student cap are typically going to have around ten students who would just barely not rather be dead than be in my classroom. On the other hand, every single class session I've had where there were 30-40 students enrolled and only 3-4 showed up (something that happens depressingly frequently) has been absolutely brilliant.

There's a part of my brain that feels like I should feel guilty for thinking this, but I can't tell if it's a logical part, or the part that occasionally insists that we should give a trophy to every child, even if they come in last place, because trophies make people feel good and sad people are sad.

Monday, November 11, 2013

An Ode To Instagram

So, thus far on my new work-related blog, I've had three posts. One is arguably work-related and the other two not so much.

But I realized something awesome about Instagram the other day and wanted to share, and since I at least nominally research and write on new media for a living, I don't think it's too far off the mark to post it here as well as my Super-Secret Blog of Feelings and Secrets.

As if it wasn't clear from my original post on this blog, I have a pretty ambivalent relationship with social media. I used MySpace for a while, but I was never quite sure what it was for besides streaming other people's music. I started using Facebook because my friend Toria signed me up using my laptop while I was in the bathroom one day at work. I started using Twitter because it was required for a class. I started using Foursquare because I needed another example of digital storytelling to talk about in my dissertation. I started using Google+ because I was just curious if it was actually better than Facebook.*

In short, I didn't really start using any of these services for what they were meant for; namely, for socializing. Like a dog with a type of food he's never encountered before, I've always been content to just take a lick here and there and then spit the thing back out, because, frankly, the food I usually eat has always been good enough.** I started researching and writing on Americans' use of social media not because I was a huge fan, but because it was so fascinating to me that we could collectively become obsessed with a form of expression that was simultaneously so full of promise and yet so empty in execution.

I post a lot on Facebook, but it feels fairly pointless, unless I'm sharing news that I know will interest or entertain someone specific. I post a lot on Twitter, but it's the internet equivalent of having "conversations" with the colleagues that speed past my open door en route to their class: it's nice for a quip or shout-out here or there, but ultimately not much beyond that. I still use Foursquare from time to time, but only because it's fun from a gaming perspective: I like earning new badges and mayorships in that shallow Unlock Steam Achievement way. If each Foursquare interaction took about a second longer, I'd probably stop using it entirely.

Instagram, though...Instagram is cool. I realized this the other day as I was waiting for one of those things you occasionally find yourself waiting for, like someone to call you back or your girlfriend to use the restaurant bathroom. During those little spaces nowadays, when you know you don't have enough time to Accomplish Something but there's also not enough time to Fall Into A Deep Reverie And Reflect On The Meaning Of Life, I usually pull out my phone and tool around. In this case, I went to my Instagram feed and started looking at the pictures posted recently by the people I follow. And I realized after about four pictures that it was just really awesome, interesting, and creative in a way that all those other social networks rarely are.

Part of it, of course, is the requirement that one use visuality to express oneself on Instagram. There's a huge gap between posting "I ate veal tonight" on Facebook or Twitter and turning that written expression of mundane fact into poetry.*** However, even if you take the most soulless, robotlike photograph of your food and post it to Instagram, you're still making decisions about how to frame the photo, what light you're taking it in, the angle you're taking it at, etc., etc. You are forced, at the options screen, to make choices about the frame that goes around your picture, whether you want a caption, whether you want to apply a filter, and so on. Even if you opt out of all of these choices, that itself is a choice. It's a social media app that requires you to express yourself artistically and creatively, even if only in the smallest of ways, and for that I love it.

Of course, this also democratizes interesting content creation in the sense that you don't need to be able to describe your plate of veal like Whitman or Keats to make it interesting: visually, to me at least, it's really exciting to be able to actually see what other people I know find important in their lives. Written language is a fun tool to play with, but not everybody likes to play with it. Not everybody is comfortable enough using it to play with it. A neat photo on Instagram is just a click away (a few more if you want to apply the Amaro filter first, which you should probably do).

There hasn't been a single time over the last year or so that I've checked my Instagram feed that I haven't found something that struck me as a legitimate piece of point-and-click art. I'm lucky if that happens once a week or three on Facebook or Twitter. Other social media networks feel a bit like I'm living on a street with everyone I know and we all leave our curtains open and our windows open and talk really loudly, even when we're just talking to ourselves. Instagram, instead, feels a little bit like those times in my undergrad Creative Writing classes when we'd all write some super-sloppy, semi-outrageous, but totally earnest poem and read it out loud to the rest of the class. There was always something a little bit silly about everyone's poem (including your own), but the fact that everyone was sharing made it so everyone deserved to be taken seriously.

I think this happens with Instagram at least in part because the platform forces you to share in this way if you choose to use it at all, but I think I'm okay with that.

* It is, but nobody uses it, so it's sort of a hollow victory.
** This is a strained metaphor, but it's also happening on the carpet in front of me right now, so it's what came to mind.
***And here I mean lowercase-"p" poetry, not uppercase-"P" Poetry.

Sunday, November 10, 2013

Linux Mint Wants To Be Your (Work) Friend

Note: This was cross-posted on Recomposing the Pines today.

It's true. I won't go into depth on why it's true, I'd rather leave you to figure that out yourself if you're so inclined. But I wanted to do a little mini-post here simply to say that I've been using Linux Mint (first version 13 and then version 15) for all of my work-related tasks since mid-summer, and it's been absolutely fine.

A lot of people seem to have the perception that Linux = Difficult, or that you have to be some sort of crazy computer whiz to be able to use it. I say this because that was exactly my perception until I tried out Mint.

There are, it turns out, a lot of different distributions of Linux. Many are obviously built to serve particular purposes (for example, Puppy Linux is built to work well on old computers and netbooks, but has less flash and functionality than something like Ubuntu). Many are unique in ways that would be basically unnoticeable to the non-guru, but are, I'm told, mightily significant to The People Who Know Computer Things.*

But, generally speaking, Linux is appealing to a certain kind of person (read: "not just a computer person") because it is the early-90s, manual-transmission car of the operating system world. At a time when it's getting harder and harder to look under the hood of our "car" and find the radiator, to learn to change the oil ourselves, or to diagnose a fluid leak without hooking everything up to a computer, Linux still has an easily navigable, wide-open space under the hood, easily identifiable and removable (and installable) parts, and so on.

Obviously, in this metaphor, a lot of people are going to prefer to have a car that "just works." Of course, that is, until it doesn't work, in which case you either have to take it in to the mechanic (Windows) or just buy an entirely new car (Mac). But if you have a modicum of interest in at least slightly understanding the machine in front of you that likely contains your entire professional (and maybe personal) life and on which your ability to discharge your workday responsibilities relies, Linux gives you a foot in the door.

Mint, specifically, is built to mimic the Windows UI (pre Win7) in a lot of ways. It comes prepackaged with so much software and so many drivers that if you're one of those weirdos who can happily sit at a work-provided computer for years and never think about installing new software, or changing your desktop background, or rearranging the taskbar icons, you might hardly notice you've switched.
Mint 15 UI with the taskbar and an image preview open.
I've never really had any trouble with Mint on install, and I've installed it on five different computers now, from three different manufacturers. The only times I've had issues that required some serious troubleshooting were when I started trying to install new programs, change system settings, etc., and most of those "errors" and system failures were more a result of my own clueless flailing around than any inherent instability in Mint.

Fortunately, basic functions work well "out of the box," as it were: pretty much all the software you'll need to perform typical work-related tasks is already installed, and your sound, graphics, etc. should work right away, to a surprising degree. I installed Mint recently on a two-monitor setup in my new office, and both monitors came online immediately -- something that didn't happen even when I installed Windows 7 on the same computer.
I guess the big catch for using Linux for typical work tasks would be that you'll need to learn how to navigate some programs that are going to be a little different than what you're used to. For example, there is no Microsoft Office. There is, however, LibreOffice, which for my money is faster, more reliable, and nicer-looking than MS Office anyway.
Plus, it can save documents, slide shows, spreadsheets, etc. in all the typical Microsoft formats, so you can pull the same documents up later on a computer in the meeting room and have them run just fine. Admittedly, sometimes there's some formatting, etc. lost in the process of moving from LibreOffice to MS Office or vice versa, but honestly I've never had as much trouble as I've had moving from, say, MS Office 97 to 2003 and back or 2003 to 2010 and back, and so on.
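For the curious, this kind of format round-tripping can even be done from the command line, without opening the GUI at all. This is just a sketch: it assumes LibreOffice is installed and on your PATH, and the filenames and output directory are made-up examples.

```shell
# Convert an ODT document to Word's .docx format headlessly.
# (The input file and output directory here are example names.)
libreoffice --headless --convert-to docx syllabus.odt --outdir converted/

# The same flag handles spreadsheets and slide shows, e.g. ODS to XLSX:
libreoffice --headless --convert-to xlsx grades.ods --outdir converted/
```

Handy if you ever need to batch-convert a folder of documents before a meeting instead of opening them one at a time.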

If you want to try Linux Mint, but not totally commit, the site I linked to above will explain how to make a LiveCD and boot your computer from it. In this state, you'll be able to play around with Mint "on" the CD, so there are no changes made to your computer while you test it out and see how you like it. If you want to install Mint alongside Windows 7 (or lower), so you're not making One Big Choice as much as letting yourself make the choice every single time you boot up the computer, you can do this. In fact, this is how all my computers are set up: they start up to a simple menu at which I choose either Linux (for 95% of my day-to-day tasks) or Windows (for the occasional movie-editing or sound-editing task that it does better than Linux, plus video games). There's a really easy-to-follow tutorial for this, which is right here.
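If, like me, you'd rather use a USB stick than burn an actual CD, the same "try before you commit" approach works. A rough sketch of the process follows; the ISO filename is an example, and /dev/sdX is a placeholder for your USB device, so double-check which device is which (e.g. with lsblk) before running anything.

```shell
# Check the download against the MD5 sum published on the Mint site.
md5sum linuxmint-15-cinnamon-dvd-64bit.iso

# Write the image to the USB stick. WARNING: this erases the target device,
# so make absolutely sure /dev/sdX is the USB stick and not your hard drive.
sudo dd if=linuxmint-15-cinnamon-dvd-64bit.iso of=/dev/sdX bs=4M

# Flush write buffers before unplugging the stick.
sync
```

Once written, you just boot from the stick the same way you'd boot from the LiveCD.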

Anyway, once I got everything installed, I found Mint to generally be more reliable, faster, and better-looking than Windows, at least as far as doing the typical word-processing, emailing, and messaging tasks that I do at work every day. It boots up way faster, too.

If you want to explore the system a little bit more, it actually lets you, and as long as you don't jump into things like editing the kernel right off the bat, the learning curve is pretty shallow and there are a number of great communities online that are willing to help you figure things out and answer newbie questions.

Oh, plus everything's free. I suppose I should have mentioned that before. Mint itself is free, and all the software that comes with it is free. All the programs you might want to install are free. Don't like the default music player? There are literally hundreds more to choose from, and none of them cost a cent. The Adobe Photoshop analogue? Free. High-end music editing programs used by real DJs? Free.

You get the idea.

Anyway, just something to think about. Switching to Mint will likely stretch your brain just a little bit, but in a good way. Plus, over the last few months I've come to find that it's not just a good alternative to your typical paid operating system for work; it's actually better in a lot of significant ways.

* I finished my undergrad a measly nine credits short of a Bachelor of Science in Computer Science, and do not consider myself smart enough to be counted among these people.

Tuesday, October 22, 2013

(Not So) Defiantly (Maybe): Profile Fatigue and Why I'm Hiding My Blog From You Now

So, tonight marks sort of a big moment for me and my relationship with the internet.

I started my first blog in December of 2004, and ever since then, first on Livejournal, then Blogger, then Tumblr, and then Blogger again, I've written whatever came to mind about whatever came to mind, and left it all available for anyone to read, both in the interest of generating discussion and as an exercise in self-expression. Before blogs, before social networking was a thing, I wrote pretty consistently (both fiction and just general thoughts) on my computer from 1995 or so up until 2004, and though there wasn't as ready a mechanism for making those writings widely available at the time (thank god!), I was willing to share them with whoever was interested.

As of tonight, I'm changing my primary blog over to private, so the entries can only be read by people who I've personally okay'd. Here's why.

One of my absolute favorite times to be doing anything on the internet was when I was blogging c. 2006-2008. Facebook wasn't a huge thing yet (at least not in my social circles), I had a MySpace, but it was a secondary form of communication, for the most part, to email, and blogging was fun. I've never had a large readership on any of my blogs, but I never wrote for other people, just for myself, so that wasn't a big worry. What I did have were a few friends who followed my posts, occasionally responded to something that interested them directly, and often blogged in their own right. Not only did it feel like I had a small, invested group of friends/readers who would take an occasional personal interest in what I was writing (an important thing for an insecure writer, thinker, and human), I was allowed access to their thoughts in kind, and some of the most interesting conversations I've ever had arose from back-and-forth comments during this time. These experiences were in large part what led to my declaring, when I started writing professionally on new media issues, that relationships and communication online could be just as meaningful and "deep" as those offline.

Then something changed. It would be reductive to say that the impetus for this change was the rise of Facebook, but that certainly had something to do with it. Suddenly, nobody (including me, much to my own surprise) read blogs anymore, and took to reading status updates instead. More importantly, nobody used blogs to communicate anymore, but took to writing status updates instead.*

This change wasn't as immediate as I'm making it sound, but at some point in 2009, I realized I wasn't really blogging as much anymore, and people weren't responding anymore. I still had conversations with the same people, but they were on Facebook (and later on Twitter). These conversations were, by necessity, faster but also shorter, at times near-synchronous but oddly unsatisfying for all that. Blogging started to seem like something that took too much effort, when other, faster methods of online communication were so readily available. In 2008, I posted 723 times on my blog. In 2009, it was down to 378. By last year, it was down to 32.

Obviously, this wasn't some sort of dystopian disintegration of virtual community. It didn't happen to me; I just stopped writing, albeit slowly over the course of a few years. For one thing, I was working harder at grad school, I was in a relationship, I had more friends in "the real world," and I was spending a decent amount of time virtually socializing on Facebook (and later Twitter) each day as it was. I had hobbies and other interests, and it seemed natural that pursuing those hobbies and interests was of primary importance, while sitting in front of a blog writing about pursuing them should be of secondary importance. It wasn't until last week, in the midst of a discussion with my Digital Diversity students about Sherry Turkle's new book Alone Together that I realized how drastically things had changed: I realized, during one student's particularly strident defense of texting as an emotionally nuanced form of person-to-person conversation, that relationships and communication online were not as meaningful or "deep" as "real world" interaction, at least not for me. Not anymore.

I don't have any fancy academicese to back this up with at the moment, and I don't really want to cast about for any. I don't need it. That golden age of blogging is gone, and it's been replaced by a fistful of different social media profiles, a tangle of different audiences, all with their own expectations of me as a writer, and as a person, and it feels more and more, day to day, that my interactions on Facebook and Twitter, even with my best, dear friends from "real life" are really just a series of carefully calculated acts, impression management, with at least one eye always toward a bigger audience that might or might not be watching. And it feels like it's sucking all the fun out of this whole internet thing.

Now, obviously, I'm not going to stop using Facebook, or Twitter, or blogging (clearly). I study these things for a living because they're interesting to me, and they're just as fascinating when they don't work as they are when they do work. But. I can't help but feel, more and more as time goes on, that a lot of my time spent on social networking sites is just sort of a waste. I have a very difficult time having meaningful conversations with real human beings over a medium like Facebook ("meaningful" academically, philosophically, emotionally, comedically, whatever). Everything is bite-sized pieces of information that people (including me!) have a tendency to attach planet-sized amounts of meaning to, and if you try to explain a point of view or opinion or whatnot "at length" (by which I mean "in the amount of text you might be able to speak in 45 seconds"), half the comments that follow are "TL;DR lol" or similar. I'm not a huge fan of what it's done for our discourse. It seems pretty often that your options for communication on Facebook are: 1) talk about very trivial things constantly, or 2) try to talk about something more substantive, but spend most of the following thread defending yourself from wild misinterpretations of what you were trying to say from generally well-meaning people who naturally project their own agendas onto your comments on a site that's really all about self-absorption (I'm just as guilty of this as anyone else, of course).

It's not that I have anything against trivial conversation. Between the ages of 16 and 32, the percentage of my daily conversations that end up being about bodily functions has only slightly decreased from 70% to about 55%. But I don't like the fact that after a day of trading Facebook/Twitter info-bites, the idea of writing a long-form blog post deeply exploring some issue seems like "too much." I don't like that I would rather refresh my News Feed for the twentieth time instead of reading a few pages out of a book that's been sitting on my desk for a month, gathering dust. Clever and funny as they are sometimes, I don't want to think in memes for the rest of my life, even if the instant-gratification part of my brain keeps trying to convince me I do by pumping me full of dopamine.

When I have a particularly full day at work, or some other professional/personal/emotional difficulty, I really enjoy going for a walk or a jog, or even a drive (without music). It's about the only way I've found to give my brain space to sort through the knots and try to make sense of what's going on so that I can respond to stress productively. I love climbing mountains; it's a beautifully rewarding hobby to have, both physically and spiritually. But half of what makes me get out there and do it on a regular basis is that it's one of the few ways I have nowadays to actually get away from all the Facebook-level stuff (both online and in "real life") and let my brain process at what feels like the actual speed a brain should be allowed to process at. I still have some spaces like this in the physical world, clearly. And I really appreciate them, because they keep me from going nuts. I used to have a space in the virtual world like this as well, but then everyone else and I stopped going there. I'm still not sure why, but now, finally, I know that I wish we hadn't.

So what does making my blog private have to do with this?

An even more important question might be "Why are you still reading this?" But I digress.

I want to have a chance to reestablish this lost virtual space. Doing that means that I need to set up my online presence in such a way that I'm encouraged to write long-form posts again instead of the shorter, spur-of-the-moment content that social networking requires. To do this, I need to have at least one space where I know my audience and feel like I'm able to screw up in front of them without being harshly judged. At one point in my life it was important for me to prove to myself that I could be honest about who I was publicly, in front of anyone who happened to stumble across my blog. I know I can do this now. Now, I worry about losing a friend, a colleague, or a job over saying too much in a virtual space where we've all agreed to only say very little.

I write about a lot of things. Most of them are stupid. Some of them might be smart, but I doubt it. But I enjoy this, and, like going for a walk, it helps clear my head. It helps even more if certain people are willing and able to chime in with constructive observations along the way. And maybe the worst part of the blog-to-Facebook transition is not the dumbing-down of post content, but the fact that there's no easy way to share with these people without sharing with everyone you've ever friended, or with everyone who's ever chosen to "follow" you. This, I believe, fundamentally alters the way in which we perceive our own identity formation and our written communication with others. No, seriously.

In the past, if you wanted to be weird, foul-mouthed, juvenile-y poetic, or even (the worst sin when all your online friends are academics!) self-contradictory, you could probably get away with it as long as you were audience-literate and made sure that each of the selves you presented to different people/groups was consistent for the particular audience you presented it to. Now that option is gone. The people who know the Facebook You also know the Physical World You. The people who read your tweets also sign your paychecks. And so on. People can say "Oh, just be yourself!" or "Just use Facebook Lists!" or "Just don't put anything stupid online!"

To these I say, in order:
1) The world must be a wonderfully uncomplicated place for you, random 13-year-old objector!

2) These sorts of workarounds are possible, for sure, but notice how they're always fairly complicated to enact. It's almost as if the people behind the social networks have no real interest in making it easy to share anything less than everything with everyone all the time...

3) Well, then what's the point of being online? What's the point of socializing at all if you're not going to say anything interesting, make a few mistakes, learn from them, and then make a few more in the process of being a human being?

Back in the old days, we socialized in private. We talked on phones, we hung out in rooms, we passed notes that were quickly disposed of, lest our various social circles intersect, with disastrous results. The sender of the message and the receiver understood the rules of the particular circle that they shared: some circles were playgrounds, some were deadly serious, many were in-between. You could tell a friend about a serious misgiving you were having about, say, your current relationship and get some meaningful feedback as well as a little discretion. Now imagine having this same conversation in the middle of a restaurant completely full of people you know while your friend sits at the other end of the room and the only way you're allowed to talk to him/her is by using a megaphone set to full blast. It's easier to not talk at all.

And that's what I worry about, at least for myself, if not for Today's Society at large: that these "heavy" conversations are happening less and less because it's just difficult to have them within the logistics of online impression management. Certainly, not all talk goes on online, but more and more of it does. Research from the aforementioned book by Sherry Turkle shows that teens growing up today frequently prefer texting to phone calls because the phone "gives too much away": emotions can be betrayed; information is conveyed through voice, breathing, and inflection; and it's much less scary to be able to perfect the presentation of your message, of yourself, by writing a text, posting on Facebook, etc.

This scares me. If people aren't willing to let the "information" of emotions into their discourse because it's "easier" to reduce everything to text, short paragraphs, soundbites, then it can't be terribly far from here to the point where bothering to think about difficult things becomes inconvenient enough that we simply prefer to pretend our problems, our brain-knots as it were, don't exist. Some people might be okay with this, and that's fine. This isn't a call to arms by any means. I'm not okay with it, though, not for me, and I'm hoping to get back to something that strikes me as a more holistic form of online communication and expression in the hopes that it'll bleed back into my "real life" a bit. I guess what I've been trying to say is that my day-to-day life is, to the smallest but scariest degree, starting to resemble Facebook. When I write in tweets all day, I start thinking in tweets all day. And there's too much world out there for tweets to contain.

Obviously, I'm going to keep using social networks. For one thing, they're part of what I study. For another, there's a lot of social inertia built up there; if I simply disappear from the FaceScape, Bad Things will happen personally, professionally, etc. I spend the majority of my 50 or so hours of work per week in front of an internet-connected computer, and it's fun to tab away from work from time to time to throw up an observation on Facebook or a photo on Twitter or Instagram. But, as I said above, these networks are, from my perspective, mostly about the self, built on the supposition that if you have something to say, not just someone but everyone you know is potentially interested in hearing it. I think that's pretty rarely true (at least for me), and I don't write to get people's attention. 95% of the posts I make on social networking sites are made to specifically inform a few people of something (to share pictures with my parents, to let a few of my friends know where I'm going to be in 4 hours, etc.); it just so happens that because we're all in the same restaurant and all I've got is this damn megaphone, everyone else gets to hear, too, whether I/they like it or not.

So I want to go back to having a space where I know who I'm writing to, just one space, because I think that might just make a lot of difference for me. I've created this new blog, Recomposing the Pines, so that I can continue to cross-post academically relevant and publicly-appropriate thoughts (and then, yes, share those thoughts on Facebook and Twitter), but Defiantly, Maybe will be invite-only.

And yes, I realize the irony of making yet another blog as a response to profile fatigue, but whatever. This felt right, so I'm giving it a shot.

The reason I really wrote all of this, if you're still with me, was to say this: anybody who is interested in reading my old, less orderly and less presentable blog, I would absolutely love for you to. I'll continue posting, hopefully more frequently and more freakishly now, and we'll probably all have a good ol' time together. You don't have to comment, you don't have to post back; just knowing people are reading is a neat thing, and will likely keep me more interested in keeping the project going than if I'm just writing into (virtual) space. I know there are a few of you out there who still read that blog now, and I would be sad if you got shut out at this point. If you can bring yourself to click a few times on the internet on my behalf, get ahold of me in whatever way is easiest and I'll add you to the reader list, because you are all awesome and I like having you read my stuff and things. It's at locke456.blogspot.com.

For the 95% of you who aren't interested in the weirdo-Ben blog at all, awesome! You can do nothing at all and your reward will likely be having your Facebook and Twitter feeds a little more free of my inane rambling. What a deal, right?!

I'm certainly not doing this to hide anything (except the GPS coordinates of the bodies), but instead to just make sure I know who's looking when I jump from video game reviews to webcam videos of yoga ball Plinko to the blog equivalent of going on a walk to clear my head all in the space of a week. The web used to be a wilderness and now there are fences everywhere and gates and fees and frowning police officers and it'd be nice to just have a little piece of land to build something that's just mine again.

Is that stretching the metaphor too far?

Ah, overburdened metaphors and pages and pages of rambling text typed in the middle of the night. It's already starting to feel like old times in here...

Thanks for paying attention, and at the very least look for future academic-y, smart-people discussions about pedagogy and new media and conferences and whatnot here on Recomposing the Pines!

* I should clarify here that when I say that "nobody was reading/writing blogs anymore," I mean specifically as a means of general, day-to-day narration and mundane philosophizing. Of course blogs continued to be (and continue to be) a popular medium in other contexts.