Thursday, November 21, 2013

So, Getting Used To Working At A Teaching School Is Weird, But Good.

I want to start this post by mentioning some things that I like. I'm going to do this because after that, I'm going to talk about how I don't like some of them as much as I like others, or, at least, how some have become more important to me than others, their value more obvious. I don't want to give the impression that I dislike the less-liked things...I just like them less than maybe I once did.

When I switched my undergraduate major from Computer Science to English Literature way back in 2001 (or maybe 2002), I did it because over the first 2.5 or so years of college, I'd realized that I fundamentally, absolutely loved the lifestyle higher education allowed me to have. I was perfectly happy working 12 hours a day if that work was reading, learning, questioning, writing, exploring, and if that time was spent with and around others who also valued those activities. I was happy enough with this lifestyle and wanted to live it badly enough to give up a major that would have easily (in the early 2000s) landed me a job right out of college that would have paid significantly more from the outset than my current tenure-track job will ever pay even at the full Professor level.*

My enthusiasm for this lifestyle has waxed rather than waned over the years. It's become less and less financially practical to actually make a living by working in higher education as an actual educator (it's another story if you want to be an administrator), but that continues to not really matter much to me. I still love writing, I still love reading. During my time at WSU, where I was required to teach courses while pursuing my graduate degrees in order to receive funding, I even discovered that I actually really like teaching, something I would never have imagined about myself back in the KSU days, when I was just desperately looking for an excuse to stay enmeshed in the intellectual environment of a large public university after getting my bachelor's degree. That said, some of my favorite memories of my graduate career involve participating in great seminar discussions after wading through hundreds of pages of theory, and condensing that theory and tweaking it to plot and write up a dissertation that I thoroughly enjoyed and am (mostly) proud of. So, by the time I received my Ph.D., I had, usefully, realized that I loved both teaching and researching.

Interestingly, my new job wants me to do one of these and not the other. It's 2013, so I'm pretty sure you can guess which is which.

When I first learned this, during a post-interview phone call with my then-potential employer, my heart sank a little. Taking a job where I taught a heavy course load and wasn't expected to engage in research (read: "We won't give you time or money to do any research") seemed utterly depressing to me. I loved the research I was doing, and couldn't imagine giving it up to "just" teach more classes.

Well, as you can probably guess, this is the part where I decide that I don't like some things as much as others: flash forward about seven months, and I feel completely different about the whole teaching vs. research thing. I realize that my opinion isn't for everybody, and I'm not writing it here to force it on anyone else, but just to share my own experience. So.

Working at a teaching-oriented university is awesome.

See, the always-underlying issue I had with the work I was doing before was that it was pulling me in two opposing directions basically constantly. I love teaching and I love doing research, but when you're doing both at the same time, you're always doing both to about half of your ability. Or, at least that's how I've always felt. When I was able to just sit and read and write for a week, I was able to create some of my best-ever work. When I was able to just focus on lesson planning and teaching for a week, I was able to teach some of my best-ever classes. Out of eight years of work at WSU, how many times did I get these "one-only" opportunities? Maybe 2-3 times apiece. That's not an indictment of any particular person or department or policy, just an observation of how the general expectations that come down from on high at a huge public research university force you to mix, mingle, and often ultimately shortchange your priorities.

Here, it's easy: I work on teaching all the time, except for when there's a committee meeting or another service-related opportunity, and then I put the teaching aside for an hour and focus on that. The transition back is much easier, because helping to guide the course of a small department has a lot in common with teaching (and learning). Research, though it can certainly inform one's teaching, doesn't have a lot in common with pedagogy itself. Unless, I suppose, you are researching pedagogy. Which I was not.

So, practically speaking, my job's not nearly as confusing. I'm switching gears much less often, and though I work just as hard and often get just as tired, I'm not burdened with that feeling of being stretched too thin, of knowing I could teach better if I didn't have to attend my fourth conference of the year or knowing that I could really knock this article out of the park if I didn't have to teach three classes this semester, and so on.

Existentially, focusing just on teaching is a lot nicer, too. This is what's really surprised me, and what's possibly going to piss off some of the gentle readers out there. Surprisingly, "service" is no longer a bad word to me. Once, it was that pesky thing that you half-assed on your CV, the grad student equivalent of putting Who's Who Of American High School Students on your college application. Now, it's things that I do naturally every day as a result of being part of a small department, in a small town, with a recognizable and manageable community of students and faculty and administrators.

Because this is such a small school and the only thing that's more important on faculty evaluations than "service" is teaching, there's (generally speaking, at least) a positive attitude toward university- and department-related activities built into the way most people approach their work here. And this is great for me. Since I started in September, I've already made progress on helping to introduce a new academic program to the school, created six new courses that should all be seeing the light of day over the next 2 years or so, made inroads toward getting placed on two faculty committees whose work I think of as very important to the university and community at large, and met informally with a large number of faculty members to discuss how we can improve our Humanities offerings to more directly suit their programs' needs. I have a lot more time to meet with students individually, I've talked to every member of my department at length, both about work and more mundane things, and I'm setting up one-on-one, face-to-face meetings with the Provost and President of the university to discuss my thoughts on revamping the Humanities offerings and general education requirements.

I don't list all of those things to brag, but to make the points that a) I'm doing lots of things that are going to (hopefully) be directly valuable to the university and its student population and b) all of these things are way more meaningful to me than getting another article published in some obscure journal after months of toiling away alone in my office.

I once read an article (and I really wish I could remember where, because it would help this post out a lot) that suggested, pretty compellingly, that the "research" requirement in the humanities was something colleges and universities just sort of started imposing on mid-level faculty before they could qualify for tenure, for the sole reason that there wasn't much else for them to do besides teach classes. In larger universities, departments got so big that there was really no meaningful way to include all faculty in the processes of governance and upper-level decision-making, so they were, essentially, given something else to do instead to hide the fact that the system had become too cumbersome to care about their input anymore. Now, I don't know how accurate that is (like I said, the article was pretty convincing, but I also didn't follow up by investigating its claims in detail or anything), but it rings fairly true based on my own experience.

Like I said at the outset, I love doing research and writing articles, and writing my dissertation is one of the most rewarding things I've ever done academically. But it's hard to regularly put so much time and effort into publications that are likely to only be read by a few people who are already hanging out in the same echo chamber as you, especially when the time spent writing could be applied to teaching or to working to make a positive change in your immediate department or university. Research is great, but if given the choice between working on an essay of mine for an hour or serving on a committee that's working toward a positive infrastructural change at my university for an hour, I'll always choose the latter. It's more immediate, and to me, ultimately more meaningful.

Certainly, when you write to publish, you're learning a lot about your field and then you're "creating new knowledge." But I can still do that by reading a bunch of books before choosing one for a class (which I do every semester) and then discussing that text in class, requiring my students (and thus myself) to critically think about the material in it for a couple of weeks. I can blog, tweet, talk to my colleagues, and engage in other forms of meaning-making based off of the things I'm reading and the concepts I'm pondering, all without having to resort to the draconian and ultimately somewhat hollow experience of hurling myself into the maw of academic publishing again and again. I haven't written any new article material yet this year and I don't really plan to, but I don't feel dumber as a result. I feel both smarter and more like my energies and talents are being applied to tasks that have a more immediate benefit both for me and for the rest of the university.

Of course, I don't mean this as a condemnation of humanities research or anything dramatic like that; it's just that it's been striking to me to realize how much more focused and simultaneously more relevant and productive I feel without having to worry about it as much anymore. I've been surprised at how hollow it seems (again, just to me) now that I have an actually well-defined and relatively powerful role to play in a small department that's excited about my input in a way that a larger department just couldn't be. It's a combination of factors that obviously isn't available to everyone who is in this line of work, and even if it was, I imagine a lot of people wouldn't want it. But it's working out really well for me by allowing me to stop focusing on the things that I've been told Matter Greatly and focus instead on things that matter to me intrinsically.

* At least two friends of mine who majored in CS around this time immediately started jobs with six-figure salaries after receiving their B.S.

Monday, November 18, 2013

Low-Attendance Classes Are Awesome. I Think.

This is something I think about and talk about face-to-face, in meatspace, with friends and colleagues a lot. But I've always been hesitant to post about it on the internet for fear that it will get a lot of those knee-jerk, judgmental responses the internet is so famous for. I've had a few good discussions arise from my blog posts lately, though, and very few heads have been bitten off. So I thought I'd try.

I take attendance when the school requires me to. Insofar as I'm allowed to formulate my own attendance policy, I have none. I make it clear to my students up front that a big portion of my class is discussion, participation, and "meaning-making" in the classroom, on a day-to-day basis. If they choose not to show up for this, I don't directly penalize them, because I believe (and have seen after teaching over 1,000 students) that they are indirectly penalized and almost entirely across the board perform worse than students who have perfect or near-perfect attendance and participation.
[Image: Pudding, meat, etc.]
For better or worse, this is generally my approach to teaching at a university: you, the student, are (nominally, at least) an adult, and you're free to make your own decisions about attending class, about doing the work, about putting in the time, etc. I make my expectations clear up front, and I make myself readily available to you if you want to do the work but maybe don't know how to do the work. Even if you don't know how to correctly want to do the work, I'm willing to help. But I'm hesitant to force students to put in a good effort because then it won't be a good effort...at least that's how it seems to me. This has been my approach for a long time, and I'm mostly happy with it, though sometimes I feel bad about not being more prescriptive, as in "Student A wouldn't have failed if I'd forced him to turn in more drafts of his essay!" But that brings its own set of problems, I suppose.

Anyway, the point I'm trying to get at is that I find myself frequently happy when I have a low-attendance day or week in class rather than frustrated (which seems to be the "normal" faculty reaction to such things). I think it's because it reinforces the implicit assumptions I'm making in my attendance policy as I've described it above: if the students who show up are choosing to show up, and their motivation for showing up is interest, if not in the subject matter then at least in GPA-based success, then a low-attendance day actually raises the percentage of interested students in the classroom.

As someone whose courses are dialogically based 95% of the time, I'm all in favor of smaller class sizes as a rule, but even courses with a twenty-student cap are typically going to have around ten students who would just barely not rather be dead than be in my classroom. On the other hand, every single class session I've had where there were 30-40 students enrolled and only 3-4 showed up (something that happens depressingly frequently) has been an absolutely brilliant class session.

There's a part of my brain that feels like I should feel guilty for thinking this, but I can't tell if it's a logical part, or the part that occasionally insists that we should give a trophy to every child, even if they come in last place, because trophies make people feel good and sad people are sad.

Monday, November 11, 2013

An Ode To Instagram

So, thus far on my new work-related blog, I've had three posts. One is arguably work-related and the other two not so much.

But I realized something awesome about Instagram the other day and wanted to share, and since I at least nominally research and write on new media for a living, I don't think it's too far off the mark to post it here as well as on my Super-Secret Blog of Feelings and Secrets.

As if it wasn't clear from my original post on this blog, I have a pretty ambivalent relationship with social media. I used MySpace for a while, but I was never quite sure what it was for besides streaming other people's music. I started using Facebook because my friend Toria signed me up using my laptop while I was in the bathroom one day at work. I started using Twitter because it was required for a class. I started using Foursquare because I needed another example of digital storytelling to talk about in my dissertation. I started using Google+ because I was just curious if it was actually better than Facebook.*

In short, I didn't really start using any of these services for what they were meant for; namely, for socializing. Like a dog with a type of food he's never encountered before, I've always been content to just take a lick here and there and then spit the thing back out, because, frankly, the food I usually eat has always been good enough.** I started researching and writing on Americans' use of social media not because I was a huge fan, but because it was so fascinating to me that we could collectively become obsessed with a form of expression that was simultaneously so full of promise and yet so empty in execution.

I post a lot on Facebook, but it feels fairly pointless, unless I'm sharing news that I know will interest or entertain someone specific. I post a lot on Twitter, but it's the internet equivalent of having "conversations" with the colleagues that speed past my open door en route to their class: it's nice for a quip or shout-out here or there, but ultimately not much beyond that. I still use Foursquare from time to time, but only because it's fun from a gaming perspective: I like earning new badges and mayorships in that shallow Unlock Steam Achievement way. If each Foursquare interaction took about a second longer, I'd probably stop using it entirely.

Instagram, though...Instagram is cool. I realized this the other day as I was waiting for one of those things you occasionally find yourself waiting for, like someone to call you back or your girlfriend to use the restaurant bathroom. During those little spaces nowadays, when you know you don't have enough time to Accomplish Something but there's also not enough time to Fall Into A Deep Reverie And Reflect On The Meaning Of Life, I usually pull out my phone and tool around. In this case, I went to my Instagram feed and started looking at the pictures posted recently by the people I follow. And I realized after about four pictures that it was just really awesome, interesting, and creative in a way that all those other social networks rarely are.

Part of it, of course, is the requirement that one use visuality to express oneself on Instagram. There's a huge gap between posting "I ate veal tonight" on Facebook or Twitter and turning that written expression of mundane fact into poetry.*** However, even if you take the most soulless, robotlike photograph of your food and post it to Instagram, you're still making decisions about how to frame the photo, what light you're taking it in, the angle you're taking it at, etc., etc. You are forced, at the options screen, to make choices about the frame that goes around your picture, whether you want a caption, whether you want to apply a filter, and so on. Even if you opt out of all of these choices, that itself is a choice. It's a social media app that requires you to express yourself artistically and creatively, even if only in the smallest of ways, and for that I love it.

Of course, this also democratizes interesting content creation in the sense that you don't need to be able to describe your plate of veal like Whitman or Keats to make it interesting: visually, to me at least, it's really exciting to be able to actually see what other people I know find important in their lives. Written language is a fun tool to play with, but not everybody likes to play with it. Not everybody is comfortable enough using it to play with it. A neat photo on Instagram is just a click away (a few more if you want to apply the Amaro filter first, which you should probably do).

There hasn't been a single time over the last year or so that I've checked my Instagram feed that I haven't found something that struck me as a legitimate piece of point-and-click art. I'm lucky if that happens once a week or three on Facebook or Twitter. Other social media networks feel a bit like I'm living on a street with everyone I know and we all leave our curtains open and our windows open and talk really loudly, even when we're just talking to ourselves. Instagram, instead, feels a little bit like those times in my undergrad Creative Writing classes when we'd all write some super-sloppy, semi-outrageous, but totally earnest poem and read it out loud to the rest of the class. There was always something a little bit silly about everyone's poem (including your own), but the fact that everyone was sharing made it so everyone deserved to be taken seriously.

I think this happens with Instagram at least in part because the platform forces you to share in this way if you choose to use it at all, but I think I'm okay with that.

* It is, but nobody uses it, so it's sort of a hollow victory.
** This is a strained metaphor, but it's also happening on the carpet in front of me right now, so it's what came to mind.
***And here I mean lowercase-"p" poetry, not uppercase-"P" Poetry.

Sunday, November 10, 2013

Linux Mint Wants To Be Your (Work) Friend

Note: This was cross-posted on Recomposing the Pines today.

It's true. I won't go into depth on why it's true; I'd rather leave you to figure that out yourself if you're so inclined. But I wanted to do a little mini-post here simply to say that I've been using Linux Mint (first version 13 and then version 15) for all of my work-related tasks since mid-summer, and it's been absolutely fine.

A lot of people seem to have the perception that Linux = Difficult, or that you have to be some sort of crazy computer whiz to be able to use it. I say this because that was exactly my perception until I tried out Mint.

There are, it turns out, a lot of different distributions of Linux. Many are obviously built to serve particular purposes (for example, Puppy Linux is built to work well on old computers and netbooks, but has less flash and functionality than something like Ubuntu). Many are unique in ways that would be basically unnoticeable to the non-guru, but are, I'm told, mightily significant to The People Who Know Computer Things.*

But, generally speaking, Linux is appealing to a certain kind of person (read: "not just a computer person") because it is the early-90s, manual-transmission car of the operating system world. At a time when it's getting harder and harder to look under the hood of our "cars" and find the radiator, to learn to change the oil ourselves, or to diagnose a fluid leak without hooking everything up to a computer, Linux still has an easily navigable, wide-open space under the hood, easily identifiable and removable (and installable) parts, and so on.

Obviously, in this metaphor, a lot of people are going to prefer to have a car that "just works." That is, of course, until it doesn't work, in which case you either have to take it in to the mechanic (Windows) or just buy an entirely new car (Mac). But if you have a modicum of interest in at least slightly understanding the machine in front of you that likely contains your entire professional (and maybe personal) life and on which your ability to discharge your workday responsibilities relies, Linux gives you a foot in the door.

Mint, specifically, is built to mimic the Windows UI (pre-Windows 7) in a lot of ways. It comes prepackaged with so much software and so many drivers that if you're one of those weirdos who can happily sit at a work-provided computer for years and never think about installing new software, or changing your desktop background, or changing the arrangement of the taskbar icons, you may never have to change a single thing.
[Image: Mint 15 UI with the taskbar and an image preview open.]
I've never really had any trouble with Mint on install before, and I've installed it on five different computers now, from three different manufacturers. The only times I've had issues that required some serious troubleshooting were when I started trying to install new programs, change system settings, etc., and most of those "errors" and system failures were more a result of my own clueless flailing around than any inherent instability in Mint.

Fortunately, basic functions work well "out of the box," as it were, and pretty much all the software you'll need to perform typical work-related tasks is already installed; your sound, graphics, etc. should work right away, to a surprising degree. I installed Mint recently on a two-monitor setup in my new office, and both monitors came online immediately -- something that didn't happen even when I installed Windows 7 on the same computer.
[Image: Different is terrifying!]
I guess the big catch for using Linux for typical work tasks would be that you'll need to learn how to navigate some programs that are going to be a little different than what you're used to. For example, there is no Microsoft Office. There is, however, LibreOffice, which for my money is faster, more reliable, and nicer-looking than MS Office anyway.
Plus, it can save documents, slide shows, spreadsheets, etc. in all the typical Microsoft formats, so you can pull the same documents up later on a computer in the meeting room and have them open just fine. Admittedly, sometimes there's some formatting, etc. lost in the process of moving from LibreOffice to MS Office or vice versa, but honestly I've never had as much trouble as I've had moving from, say, MS Office 97 to 2003 and back or 2003 to 2010 and back, and so on.
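(A side note for the slightly more adventurous: LibreOffice can also do these conversions from the command line, without opening a window at all, which is handy if you ever need to turn a whole folder of files into .docx at once. Here's a minimal sketch of how that might look, assuming the libreoffice command is installed and on your PATH; the folder names are just hypothetical examples, not anything Mint sets up for you.)

import subprocess
from pathlib import Path

# Hypothetical folders; point these at wherever your files actually live.
source_dir = Path("~/Documents/syllabi").expanduser()
output_dir = Path("~/Documents/syllabi-docx").expanduser()
output_dir.mkdir(parents=True, exist_ok=True)

# LibreOffice's headless mode converts files without launching the GUI.
for odt_file in sorted(source_dir.glob("*.odt")):
    subprocess.run(
        [
            "libreoffice",
            "--headless",
            "--convert-to", "docx",
            "--outdir", str(output_dir),
            str(odt_file),
        ],
        check=True,
    )
    print(f"Converted {odt_file.name}")

None of this is required to use Mint day to day, of course -- the normal "Save As" dialog in LibreOffice works just like you'd expect -- but it's a nice illustration of that "foot in the door" idea from above: the same tools you click around in can also be scripted when you want them to be.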

If you want to try Linux Mint, but not totally commit, the site I linked to above will explain how to make a LiveCD and boot your computer from it. In this state, you'll be able to play around with Mint "on" the CD, so there are no changes made to your computer while you test it out and see how you like it. If you want to install Mint alongside Windows 7 (or lower), so you're not making One Big Choice as much as letting yourself make the choice every single time you boot up the computer, you can do this. In fact, this is how all my computers are set up: they start up to a simple menu at which I choose either Linux (for 95% of my day-to-day tasks) or Windows (for the occasional movie-editing or sound-editing task that it does better than Linux, plus video games). There's a really easy-to-follow tutorial for this, which is right here.

Anyway, once I got everything installed, I found Mint to generally be more reliable, faster, and better-looking than Windows, at least as far as doing the typical word-processing, emailing, and messaging tasks that I do at work every day. It boots up way faster, too.

If you want to explore the system a little bit more, it actually lets you, too, and as long as you don't jump into doing things like editing system kernels right off the bat, the learning curve is pretty shallow and there are a number of great communities online that are willing to help you figure things out and answer newbie questions.

Oh, plus everything's free. I suppose I should have mentioned that before. Mint itself is free, and all the software that comes with it is free. All the programs you might want to install are free. Don't like the default music player? There are literally hundreds more to choose from, and none of them cost a cent. The Adobe Photoshop analogue? Free. High-end music editing programs used by real DJs? Free.

You get the idea.

Anyway, just something to think about. Switching to Mint will likely stretch your brain just a little bit, but in a good way. Plus, over the last few months I've come to find that it's not just a good work alternative to your typical paid operating system; it's actually better in a lot of significant ways.

* I finished my undergrad a measly nine credits short of a Bachelor of Science in Computer Science, and do not consider myself smart enough to be counted among these people.