Why Simplify and Not Forgive

A friend emailed a week ago with a challenge: pick a word and set an intention to focus on that word in 2017.

Two words immediately sprang to mind: play & forgive.

Educators know that play…well…plays a significant role in human development and in human social endeavors. As an educator, I try (or rather USED to try) to incorporate play into my classes. Students completed simulation games to learn about technical report writing or journaled and freewrote in creative writing classes to come up with a kernel of an idea.

Play involves both winning and losing. Failure opens the portal to success by building neural structures. Closing off avenues to play harms humans and leads to emotional and social disorders.

Over the course of the last few years as I earned tenure at one school, promptly resigned that position and moved from Oregon to Arkansas, had an unexpected (and WONDERFUL) pregnancy at 38, started a new job, worked to earn tenure AGAIN, and moved into new residences three additional times, I didn’t always have (or MAKE) time for play.

But even now, I have other work to do mentally before I really dedicate myself to a year of play. Some of that work revolves around forgiveness. Forgiving myself for less-than-stellar performance (the perfect is the enemy of the good) and forgiving others for a variety of slights. For the last few decades, I have been living in a constant Festivus Airing of Grievances.

While I need to center on both play and forgiveness, the more I dwelled on those terms, the less I felt ready for them. What I need to do first is simplify.

A few years ago, I subscribed to Real Simple in an effort to declutter, destress, and clarify. I never made time to read it, and the issues I didn’t read piled up, cluttering and stressing my life.

What I need now is that decluttering and destressing, but first, I need to clean out my head. So, I have decided to ask, when I am charged up, angry, or depressed, “What is the simplest choice I can make right now?”

I am a professional at worrying over the possibilities of damn near every decision I have to make. When I put on a shirt in the morning, I mentally picture myself going through the day…considering what possible scenarios I might face in said shirt. Would that shirt be comfortable if I sat at my desk? If I walked to the copier? In various meetings? Sprinting across campus in case of a….you get the picture.

Instead of what I call “running the plays,” I need to work on simplicity. So, instead of asking myself the barrage of questions above, I need to just say, “What is the simplest choice I can make right now?” If the simplest choice is to put on a shirt that is comfortable RIGHT NOW, then I assume that the shirt will be sufficiently comfortable the remainder of the day. Done.

My new simplify-my-mind (and live-in-simple-mindfulness) motto comes from the Word Porn website:

[motto image from Word Porn]

In these first 10 days of January, I have not been so successful at this. Last night, I spent a good two hours running the plays on various imagined scenarios instead of simply being present.

The good news? I have 355 more days (and hopefully many years after that) to get it right(er).

Why I Have(n’t) the Time to Blog

Last night, I spent an hour and a half playing a game called Jelly Splash on my cell phone. I played from when the kids went to bed until my phone battery died.

And, for an hour and a half, I cursed that game for clearly cheating me by not acting in the ways I wanted it to act. I wanted to win. And, like a gambling addict tapping the “ROLL!” button at a casino slot machine, I kept hitting “Play Again!”

So, this morning, when I faced the first day of classes with coffee-fueled energy, I considered blogging.

“You probably don’t have time,” my Self said, cradling the cell phone where Jelly Splash rested behind the sleepy screen.

But the truth is, I have as much time as I make. Like Rumpelstiltskin, that gold-spinning trickster, my Self spins all the time it needs for shiny things (JELLY SPLASH) while talking me out of blogging or exercising or eating a salad.

No more lack of time! So far, this post has taken five minutes to write, and while it isn’t the most shining example of professional prose I have ever developed, it’s writing. It’s five more minutes of writing than I had completed five minutes ago.

Last year, I decided to write for just five minutes a day and downloaded vJournal (which works with Evernote) to capture my five minutes of golden straw. I journaled for five days.

This year, I am asking my students to keep a blog, and as I am a masochist, I need to make myself suffer through what they are suffering through so that I can understand their plight. Which means…blogging. And preparing for blogging. Modeling good blogging practice. Talking about and researching blogging.

To be honest, the thought that I will write out words that have an instant (potential but not likely) audience makes me a little woozy. Sure, I can toss out something quick and easy on Facebook, the chocolate protein bar in the online publishing buffet. But to actually put thought into something that I push out nearly instantly into the netherworld of the interwebs?? Ack. I don’t want you to read this. Stop reading. I insist.

—————-

So, here’s my goal. I will blog at least 500 words. I won’t blog every day. On good weeks, I want to blog Monday morning and Friday morning. On bad weeks, I want to blog Monday mornings only.

After all, young Doogie Howser, M.D. wrote after every episode, and he wasn’t even a REAL doctor. And there was no such thing as the internet! He just wrote for a journal that no one would read (except all the show’s viewers, but still).

See? I’ve already written 450 words!

Not only will I write 500 words weekly (apparently, numbers don’t count as words…), I will announce at the end of each blog what the next blog is about so I will be motivated to think about that next blog when I am oh, say, working on trying to possibly…

Time’s up! 🙂

(p.s. The next blog will be about my theme for the year: Simplify! And a bit about why I tossed out the previous theme: Forgiveness.)

Introducing Online Writing Instruction

So, as is custom, I am posting my once-yearly blog on the first day of the fall term in hopes that I will, you know, blog more than once a year. I am certain that, had I a magic 8-ball, it would read “Outlook not so good.”

But still. Here it is.

So, this fall, I am teaching two courses in what will hopefully be the Graduate Certificate in Online Writing Instruction: Intro to Online Writing Instruction and Multimedia in Online Writing Instruction. These courses, and the potential certificate, have taken four years to materialize. I first pitched them as an e-Learning Certificate at Eastern Oregon University. No go. So, when I came to UALR in 2013, I revamped my proposal to focus more on writing instruction than e-Learning in general, and voila! Only two years later, the courses are in the books, up in Blackboard, and ready to go.

For those of you who haven’t been close to me in the last two years and thus haven’t heard me incessantly discuss this, the OWI Certificate program is designed to help faculty develop courses based on the CCCC OWI Position Statement of Principles and Example Effective Practices. In designing the first class, I modified Principles 1, 3, 4 & 11 to make them operational as learning outcomes. The process of doing this was fascinating. How do you indicate, in an online class for online teachers who are online learners, how you want them to be able to demonstrate, understand, etc. effective principles of online writing instruction? I found myself combining some example effective practice statements from different areas of the document that were very similar and modifying the language on other statements that didn’t really make sense to me when operationalized.

The fact that the course is both 1) online and 2) teaching students about online principles has kept me up at night. Am I demonstrating every principle I am asking my students to demonstrate??? Is my course fully accessible? Are my assignments chunky enough? Am I allowing multiple opportunities for interaction?

Time will tell. I hope that I am practicing what I preach and preaching what I practice.

In the meantime, here’s an inspirational video that my daughter’s first grade teacher showed at orientation. It pretty much sums up what I hope to do/be this term as I learn how to teach by teaching.

Reconstructing Rhetoric

My M.A. in Writing and Ph.D. in Comp/Rhet required six rhetoric classes (classical rhetoric, history of rhetoric, medieval rhetoric, 19th century rhetoric, modern rhetoric, contemporary rhetoric). Because I assumed I would always teach at a primarily undergraduate institution as more of a composition scholar than a rhetorician, the study of rhetoric for me was less about preparing to teach it than about having six semesters to play with theories I could pull on at leisure for other academic pursuits (and to bore people at parties with my esoteric ability to parse etymology with a shallow knowledge of Greek root words). Dr. Linda Hanson at Ball State University taught us that the 19th century basically began in the mid-1700s and extended into the early 1900s. Her seventeen-page syllabus, replete with about five pages of “recommended” (read: also essential) readings, was the first syllabus I received in grad school…and almost the last. Dr. Paul Ranieri so vehemently and eloquently disagreed with my ideas about critical pedagogy that I immediately wanted him for the second reader on my dissertation. A single sentence he mentioned in an office chat about the nature of Platonic reality still haunts my dreams.

This fall, my first graduate class in Rhetorical Theory goes online. All graduate students in our professional and technical writing program (with tracks in technical writing, nonfiction, and editing) are required to take this class…and it is the only rhetoric class required. I am facing the mind-numbing task of condensing approximately 2500 years of rhetorical theory and history into sixteen weeks. In doing so, I’ve been rethinking what makes rhetoric a fascinating study for me and how best to communicate that passion to students who will primarily be technical communicators, nonfiction writers, editors, and composition teachers. In other words, students who are very much who I was as a graduate student…not necessarily seeking to be rhetorical scholars but who very much need to understand the foundations and theory of rhetoric in order to navigate 21st century communicative possibilities.

In keeping with the list-happy nature of popular online media in early- to mid-2014, here are five reasons that I was happy I had those six courses in rhetoric (and why nearly everyone would benefit from a little bit of rhetoric in their lives).

1. Most of what holds true about rhetorical theory today was laid out 2500 years ago by a man named Aristotle and some guys called the Sophists.

If you were to study nothing but the first 500 years or so of rhetorical history, you’d have a pretty good grasp on most of the key terms, concepts, and structure of rhetorical theory. In fact, Aristotle’s Rhetoric alone (written around 350 B.C.E.) will provide about 80% of what you need to know in order to either be a practicing rhetorician (provided you are a good man speaking well–sorry about the sexism) or pull off a solid rhetorical analysis. For grins, you can also dabble in the Sophists, a much maligned but fascinating group of teachers who, again, practiced before the Common Era. A good chunk of the last 2014 years of rhetorical writing from the Middle East to the Midwest involves destroying, upholding, apologizing for, condemning, modifying, sanctifying, or reclaiming these basic rhetorical principles. In other words, we have a few basic precepts, then 2000 years of variations on a theme.

2. Rhetoric is more art (techne) than it is a science or philosophy.

Much of rhetoric’s history has been spent in a shoving match with science and philosophy, or rhetoric has been relegated to the handmaiden of the two. However, to me, the discussion around the epistemological nature of rhetoric (i.e., does rhetoric create or convey truth?) raises the question, what’s so great about the search for “truth” (other than that most academic fields are fairly obsessed with it)? Rhetoric’s true nature as an art form, as the infinite realm of potential and not the finite realm of the real or “true,” makes rhetoric more a dance than a dissection, more cantata than cadaver. Although there is some dissection involved…just not the bloody kind.

3. Studying rhetoric is like buying a blue car. Once you buy a blue car, you start seeing blue cars everywhere.

That commercial for bath soap? Rhetoric. Stump speech for your favorite (or least favorite) candidate for Congress? Rhetoric. Infuriating Facebook post? Argument about socks with your significant other? Yep and yep. Study rhetoric and you will soon find otherwise innocuous daily activities imbued with a special hidden meaning. Dabble in stasis theory, and you might look at CSI: Miami in a whole new light. Delve into visual rhetoric, and you’ll begin questioning why your children’s teacher feels it so necessary to use Comic Sans as a default font. (Ok, to be fair, you might question that choice without the formal study of rhetoric.)

4. You have been operating under the Platonic view of reality and you probably didn’t even know it. And that is a terrible, terrible shame.

No, the Platonic view of reality does not mean you can be really great friends with the opposite sex. What Plato did was set up a view of reality that…he posed the ephemeral against the…damn it, Dr. Ranieri! I can’t even start talking about it without getting angry. Read this excellent blog that explains the problem with Plato as it relates to the sciences while I gather myself.

5. No matter what you do or choose not to do, rhetoric will enrich your understanding of language and human nature (and might help you win a few more word battles with the important people in your life).

Aristotle defines rhetoric as “the faculty of observing in any given case the available means of persuasion.” The study of rhetoric will make you a more observant human, better able to navigate the core of most human endeavors, the rock-riddled waters of speaking with, writing to, and/or gesturing at other humans. The ability to open our mouths and stick our feet directly in them separates us from the rest of the animal kingdom (except for cats, who can literally open their mouths and…well…you know). And no field…not even literature or creative writing, rhetoric’s closest cohorts (in academe, at least)…will better equip an individual for a life lived among others in the polis. Or in the country, for that matter.

One of my favorite quotes is from a comp/rhetorist named James Berlin: “To teach writing is to argue for a version of reality and the best way of knowing and communicating it.” If this is true, then to teach rhetoric is to challenge students to argue for their version of reality with all the means available (which they are pretty much forced to do because Plato…no…I won’t get started on this again…).

In short (she says after nine paragraphs of rattle), I can sit in front of my computer every day and open my online rhetoric course with a smile because the study of rhetoric makes us ontologically more human.

That and coffee. But mostly rhetoric.

MOOCs: Menace or Magic? Both?

This morning, I read the NY Times article The Year of the MOOC. The title is interesting. One year. “I like to call this the year of disruption,” says Anant Agarwal, president of edX, “and the year is not over yet.” The discussion about MOOCs, of course, has gone on for much longer than a year (the first MOOC, arguably, was taught in 2007).

But MOOC news has gone mainstream. Because it is a part of the larger discourse, and as part of a project for my day job, I am in the process of gathering all of the news about MOOCs. How I went about doing so says something about why MOOCs exist, why so many people are worried, and why they are the natural product of a number of factors that pull higher education and technology together and apart.

I’ve followed the MOOC debate mostly through news digests from Inside Higher Ed, the Chronicle of Higher Education, and Campus Technology (or, as I call them, Nerd Breakfast Reading). I receive links about MOOCs from people also interested in online learning and digital technology through email and Facebook, and occasionally, articles will pop up on the WPA-listserv that feeds into my private email account. Because I am working on this project, my supervisor sent me a list of MOOC sources she had compiled, links pasted into a Word document. So up to this point, news about MOOCs has come to me. I haven’t gone to it.

But this morning, after getting all jazzed up about the NYT article (more on that later), I decided I would go and search for sources on the either disastrous or miraculous (and certainly game-changing, no matter what your stance) Massive Open Online Course (which, as I composed this post, I realized I had been calling the Massively Open Online Course).

If only there were an aggregator that would search for content (like Google) but not have Wikipedia be the inevitable first result (unlike Google). And if only that aggregator would let me arrange the sources so that I could then share them and allow comments (like Facebook) and allow access to those sources without requiring people to also endure endless pictures of my amazingly cute daughter (unlike Facebook).

Ah…yes…there is. Scoop.it. I can now gather, curate, and disseminate information (with commentary) quickly and easily, and reach a much wider audience (potentially) who can then use that information to curate, create, disseminate…etc. etc. For nearly free.

My point? My search for information, and the places I went to get that information, is the first step in understanding how online learning (and particularly MOOCs) changes the game in education. Part of that starts with the tools to which almost everyone has access, and the nature of how those tools work in the complex reality of human lives. As I see it, MOOCs are going to revolutionize higher education, for good or bad, because:

1) MOOCs have the potential to use the affordances of technology that already exist for the forces of good…education. 

The NY Times article states that “In September, Google unleashed a MOOC-building online tool, and Stanford unveiled Class2Go with two courses.” Right. But MOOC-building online tools have always existed in bits and pieces on the internet: Google Search, Facebook, blogs, YouTube. Google is just packaging existing concepts into a product for a specific purpose…open-source education. What higher education as a bureaucratic (and revenue-requiring) entity has resisted, for the most part, is the systematic integration of technology into learning and support for that integration.

The most recent Department of Commerce report emphasizes the connections between education, technology, innovation, and a strong economy. The strong economies of the past have fueled higher education, and conversely, our current stagnant economy hinders higher education from supporting innovative teaching using technology by overloading and overworking faculty (popular opinion to the contrary). Because of funding, tradition, or what have you, traditional higher education minimizes the use of the same tools MOOCs embrace, or uses these tools in very limited ways while charging increasingly steep tuition. Add to this print textbooks and other materials that are outrageously expensive (an advisee just told me she was hesitant about changing to a course that would better suit her learning needs because she had already purchased the $200 textbook for a different course), the high cost of living in a dormitory and eating on a meal plan, and the changing nature of the student body, and you have the perfect storm that is allowing MOOCs to go mainstream. MOOCs explore the boundaries of what technology, particularly well-designed and well-delivered courses based on learning theory and leveraging elements like adaptive release and adaptive learning, can provide for learners at a greatly reduced cost to students (albeit at the price of venture capitalism).

2) College students seek an education to enhance their lives, which often means first and foremost supporting themselves and their families. 

Coming from a background in the liberal arts as a first-generation student from a rural area, I can understand the desire for education to be the halcyon days of Alcyone, the beautiful period where the indecencies of life do not interfere with the pursuit of cerebral bliss. Alas, few students coming to the gates of higher education are doing so to find seven days in winter without a storm. Or in some cases, they are. More and more undergrads (and grads) are seeking a path through the grounds of higher education that will lead them to a brighter future. Higher education is a road to travel quickly, not a garden to explore at leisure.

This metaphor is particularly true for students taking online courses. The average age of students in online courses, according to a study in the Online Journal of Distance Learning Administration, is 24.4 for women and 24.5 for men. Surveys by Noel-Levitz show that 80% of respondents in their online priorities surveys were over 25. To make a sweeping generalization, students in that age range might be looking for a way to begin or support families, move out of their parents’ basements, and otherwise become self-sufficient (and/or sufficient for a group of dependents). Because the affective is a powerful motivator (and the root of all long-lasting learning), those students are most likely going to be looking for institutions, programs, courses, and educations that will benefit them and their families and keep them active in those families. As much as those of us who came from a leisurely, government-subsidized, loan-supported liberal education might wish otherwise, our students are immersed in a stark reality where education = job = better life. MOOCs offer, at least on the surface, a means to an education that comes to students where they are: in their homes, at the 6-year-old’s soccer match, in Afghanistan.

MOOCs come to students in the places life happens, on a road that might not travel through a traditional brick-and-mortar university. MOOCs can be rewound, fast-forwarded. Concepts can be mastered or reviewed. For free (or for the price of a laptop and Starbucks wireless). And while at this point, only Colorado State University offers to translate MOOCs into college credits, the deal between Coursera and Antioch University indicates that MOOCs might be a path for students to bypass traditional brick-and-mortar. Which might be because…

3) The traditional brick-and-mortar university is not designed for these students. 

Not to say that all universities aren’t. But I would venture to say that most universities are designed for the 18-22 demographic. Welcome Weeks involve trust-building activities, a tour of campus clubs and organizations, information on campus life. But the students above are focused more on their off-campus lives. A majority of traditional courses (again, wildly generalizing here) are also geared toward the 18-22 demographic, a population comfortable with (if not satisfied by) the lecture-and-test mentality they are trained to master by the misfortunes of teach-to-the-test regulations in K-12. In the STEM fields (where America desperately needs innovation per the Department of Commerce report), the primary method of instruction is still lecture and test (sorry to those who cannot access that article without a subscription). Adult learning theory demonstrates that adult learners are motivated and challenged by a different set of conditions (cognitively and affectively) than the target audience/consumer at the traditional university. Those conditions, not coincidentally, align with the skills students most need, which leads to the next point:

4) In the 21st century world, where high-paying jobs are increasingly linked to technology, students will need to effectively use digital tools and technologies to critically consume and rhetorically create and distribute information. 

Clay Shirky brilliantly elaborates on this point in his TED Talks on social media, institutions, and government. His concept of cognitive surplus, the “shared online work we do with our spare brain cycles,” pairs with increased access to technological tools that allow that cognitive surplus to be channeled into causes (both good and LOLCat) and the idea of many-to-many distribution. Thus, in the digital, crowdsourced, multi-verse reality of the internet, people have the opportunity to both critically consume and rhetorically create (as James Gee and Elizabeth Hayes confirm).

What makes these elements possible are the affordances of technology, some of the same tools that I used to research and create this blog–all for free (or the price of a laptop and the taxes to pay for the public library where I am using free wi-fi access). Higher education (again, speaking in broad generalizations to which there are always exceptions) fights the two primary elements of Shirky’s theory: many-to-many distribution and the use of technological tools by students.

The traditional higher education classroom, in spite of great leaps to transform “the sage on the stage” into “the guide on the side,” still operates primarily on the principle of “the sage” or “the guide” as bearer and creator of the knowledge (measured by his or her production of scholarly text in the form of text-based, print publications that are “peer-reviewed” and studiously guarded). Meanwhile, outside the classroom, students are operating in a completely different reality. Online students, primarily non-traditional students, are mixing, mashing, and maintaining complex lives, all while getting educations mediated (though not always well) in the digital realm.

My online students look more like I do now than what I looked like at 18. As I type this, I am at the public library parenting a four-year-old who needs attention every, oh, 15 to 20 seconds (I will not say where I am standing while I type this sentence). My online adult learners live in this reality. And many of my traditional (and non-traditional) students also live in this technological reality. I have the luxury of creating this MOOC manifesto using the internet, TED, online journals, and other open-source and freely-provided resources, and I get to hit “Publish” and have it read. It is published in the same medium as the NY Times piece that spawned my blog.

In the academic world, I would write this article, submit it to a publisher, and wait…oh, sometimes years…to have a few people possibly read my work. Few of our students will ever exist in that reality.

Is the blog more exciting? You bet! Am I being any less “academic” or “scholarly” standing here in the library and blogging? Maybe. Would my sources be taken more seriously had I photocopied them from print books at a library? Perhaps.

But I digress. Shirky’s points relate to MOOCs because of the nature of how humans access, organize, process, manipulate, and communicate knowledge in the digital realm. Reality isn’t changing. It HAS already changed. Not for all, but for many.

Faculty and administrators in higher education have tough choices to make in deciding whether they are going to adjust to and position themselves within this reality, which can be uncomfortable for those of us who grew up and experienced the “sage on the stage” 19th century Germanic model of education and thrived (read: most college professors). MOOCs have the resources and the vision to suggest something different, albeit on an unsustainable scale (most of the eager MOOC registrants do not actually complete the courses and earn the “certificates” for their labors). They are, at least in their public statements, putting teaching ahead of research, a distinctly anti-19th century Germanic model of education. From the NYT article: “In a poke at its university-based competition, Dr. Stavens [Udacity] says they pick instructors not because of their academic research, as universities do, but because of how they teach. ‘We reject about 98 percent of faculty who want to teach with us,’ he says. ‘Just because a person is the world’s most famous economist doesn’t mean they are the best person to teach the subject.’”

Not to say that research is not an important part of innovation. But for years, universities have been putting researchers in the classroom with little to no pedagogical training or desire to teach. Or, in order to reserve upper-division or graduate courses for experienced educators who have clawed and scratched their way through the tenure process and want their just deserts, graduate assistants (who also, conveniently, provide enrollments for those graduate classes) cut their teaching teeth in those undergrad classrooms with students who most need the experience and skills of the faculty teaching higher-level courses. Once those graduates graduate, many will find themselves in contingent lines, teaching the same courses year after year, as the tenure-track lines they dreamed of dry up (why this is happening is a separate argument). MOOCs might offer a different vision and model of undergraduate education.

Some of my on-campus students came to my class a few days ago carrying the Norton anthologies of literature (which they were assigned for a different course) that I so painfully remember from my undergraduate English degree. These 10,000-page volumes (and there are always multiple volumes) with translucent pages and 10-point font were the bane of my undergraduate existence in the early ’90s. My traditionally aged, very diligent students discussed the material in these volumes before my class began. They were engaged, challenged. In my class, we watched the Presidential Debates on YouTube and analyzed a cartoon from the Daily Kos and an article from the Atlantic online about political responses to Hurricane Sandy. Students were engaged, challenged.

My point is that there is room for both traditional, liberal arts education (even the sentence diagramming I so loved in 6th grade), the leisurely garden of knowledge, and technological advancement and change. Because human affairs exist solely in the realm of the probable and not the certain, and because my crystal ball is on the fritz (poor wireless connection), I am not sure what that combination will look like in ten months or ten years. And because, for the most part, higher education has not believed this combination can be so, MOOCs are moving in and testing the margins, the only place where, Clay Shirky claims, revolution in education, government, and institutions is possible.

With economic, political, and social uncertainty, we now, more than ever, need students who have access to a challenging, motivating, and quality education. If higher education wants to provide this education, then faculty and administrators need to seriously consider what that education looks like and how we can provide a quality, engaging education to a broad spectrum of students, where those students are, in spite of how WE learned or what the academy has “always done.” Just as I leverage technology to make MOOC news reach me, so can education leverage technology so that teaching and learning can effectively reach 21st century students.

Bottom line: this educational revolution will not be televised. It will be tweeted, Facebooked, streamed, remixed, and mashed up. The research I did for this blog, the ways I gathered, processed, re-combined, and distributed information, gives us insight into the possibilities for our students, for the educational future of post-empire America. Where we as faculty stand in the revolution will depend on our doing exactly what we ask from our students: making informed, educated decisions that might challenge our values and beliefs about education and why we embrace (or dismiss) technological change.

Some Thoughts on Opinion and Individuality

“Ideologies exist in language, but they are worked out in practices.” — Ancient Rhetorics for Contemporary Students

In our Writing 222: Introduction to Rhetoric class this fall, we started out by playing with the terms “ideology,” “commonplace,” and “truth/Truth” (through the lenses of the Platonists and Sophists). For those not familiar with these terms, Plato and his gang believed that we could (and should) use rhetoric and philosophy to reach the Ideal (Truth). The Sophists (including Isocrates) said, “Eh.” They were more concerned with the role of language in the everyday sphere, in how humans used language to negotiate the complex reality of everyday life.

For those who do know these terms, forgive my reductionist interpretation for the sake of time and space.

The textbook chapter that my students have to read for Tuesday next adamantly disconnects “opinion” from the individual and locates it in the community. The authors claim that the ancients (listed above) would “find fault with the equation of opinion and personality on three grounds…there is no such thing as ‘just your opinion,’…they would object to the assumption that opinions aren’t important, [and] they would argue that opinions can be changed” (15).

Hold on…I’m going somewhere with this…so if your eyes are glazing, snap to!

So, after reading these passages, I saw an interesting image this morning on my Facebook news feed.

Hmmm…I thought. My immediate reaction was a lump somewhere in my gut (as most sensate humans — particularly those with offspring — will react given the emotional appeal of the visual). But removing the visual, you are left with this sentence.

“Women’s rights do not trump human rights.”

Of course, taken on its own, this sentence is a logical fallacy. Women’s rights and human rights are not two opposite ends of a spectrum. Last time I checked, I am both woman and human (even though I feel like a robot some days).

But one might say this is an argument of degree…that one of the two sides (still a false dichotomy) does not outweigh the other. But it also isn’t less than the other, which is the tacit flip side of that argument.

What the image is, then, is a demonstration of how an ideology, worked out in practice, becomes fallacy. The ideology is that human life (at a particular stage of existence) is precious and valuable. Once that child grows up to be a woman, that is another story…

And the practice of that ideology often plays out in emotional posts to Facebook, or shouting rants on circus shows or television. The points our WR 222 textbook makes quite eloquently are that opinion, when tied to personal identity, cannot afford the playful nature of considering opposite sides, and that rhetoric comes into play at exactly this point: the point at which we disagree about ideology and the commonplaces that underlie it.


The Tomato Jungle

Summer is the only time that I really get to do two things that I yearn to do muchly all year: grow food and turn that food into more elaborate other foods. Sure, I cook in the winter, but my school schedule leaves much to be desired in the “leisurely home cooking department.”

A unique summer delicacy that I have been missing after growing up near the South is okra. Those who know okra fall into two categories, generally: the okraphiles and the haters (see definition here just to be sure you fully understand those in category number two). Yes, there are those who love the veggie, and those who just can’t be happy for it, no matter what, or criticize it unjustly. “Oh, it’s too slimy!” or “I had it this one time, and it sucked.” Well, maybe the cook sucked. Good okra, cooked well, is delicious. Period. If you don’t like it, chances are you haven’t had it cooked properly. Or you are just jealous of its green goodness.


Last Saturday at the Farmer’s Market, I was humbled and gleeful to find okra! Two years ago, a woman in a stand had okra. When I acted like she was actually selling golden goose eggs, she said, “I’m glad somebody knows what to do with it.” Somebody indeed! Last summer, no okra. Then again this summer, there it was! I bought a cup Saturday and two cups on Tuesday.

Over the last few days, I have had not one, but two batches of fried okra, and tonight, I am experimenting with a curried okra dish. The Indians call okra “bhindi.” You have not quite lived until you have had bhindi Indian style. Apparently, Indians love okra so much, they name jewelry stores after it. That’s right. Okra Jewellers.

But I digress.

My little garden patch is teeming with two things: tomatoes and green beans. The green beans are volunteer, coming up at the edge of the garden. They are mediocre as far as green beans go. But tonight, I will be jazzing them up with some…you guessed it…curry, which should make them fabulous!

The tomatoes have become the tomato jungle. I have never successfully grown a tomato plant.

No. That isn’t true. One year, I successfully grew some cherry tomatoes that a bird planted in my garden. The last year we were in Springfield, I grew one plant that produced two hard, nearly inedible fruits.

I planted four cherry tomatoes because of my ultimate goal to dehydrate roughly one million of them for snacks all winter. My mother-in-law dehydrated cherry tomatoes this last winter, and they were AMAZING snacks right up through this spring. So…I planted four plants with nary a bird’s help. Normally, cherry tomatoes are like zucchini. One plant is not enough, two is too many. But dehydrated? I am hoping for as many as I can get.

[photo: the garden, volunteer beans in front of the Tomato Jungle]

In the foreground of the above picture are the volunteer beans. In the background is what I now fondly refer to as “The Tomato Jungle.” Other plants are either lettuce going to seed, volunteer potatoes that don’t seem to be doing anything, or weeds I have been too lazy to pull. Also, the kinder souls among you will suspend judgement on my bean trellis, which was a very last-minute affair, thrown together once I realized that the volunteer beans were actually going to produce something. I am positive that better bean trellis technology exists, and next year, I will investigate and implement said technology.

In addition to the cherry Tomato Jungle, I have a single plant called “Oregon Spring” whose info card assured me that she would be an “early producer.” So far, no ripe tomatoes off her. She has some tomatoes, still green. So much for “early.”

In addition to those five, I have three mystery tomatoes given to us by a friend…in the back yard plot. Two look like they might possibly be Romas. But really, who knows? Actually, my friend who gave them to me probably knows, and when she is done gallivanting across Germany, maybe I will ask her.

So, the total number of tomato plants in my backyard is 8.

THEN…the same friend who gave me the three mystery tomato plants gave me an additional 10 or so other mystery plants, which I promptly ignored while I traveled to the AP reading. They started to look pretty puny. So…I shoved them in around an azalea plant that seemed to be doing pretty well in this big bare patch in our front yard. I used the logic that azaleas love acid, and tomatoes love acid, and the azalea loved that patch of earth, so “What the hell?” I thought. I could have 10 dead tomato plants in pots in my carport, or 10 dead tomato plants in a patch in my front yard.

Here are those ten plants today…

[photo: the ten transplanted mystery tomato plants around the azalea]

Yep. Every one survived. Most of them have blooms. One of them has an actual, almost-ripe tomato.

So, to recap, I now have four cherry tomatoes, one Oregon Spring, and 13 mystery tomato plants for a total of 18 tomato plants, alive, in my yard right now.

Next year, Universe-willing, I will have as many okra plants. And I will not have to depend on a few good vendors at the farmer’s market to determine my okra universe.

In addition to the tomato/okrathon, I have decided to venture into the world of the refrigerator pickle. A woman who teaches here makes delicious ones, so I thought I would give it a shot. I bought some pickling cucumbers at the same market, gathered spices, and found a recipe online. I rounded up my two quart jars, traipsed all over La Grande to find regular jar lids, then…

I was immediately foiled by having no fresh dill. I take for granted, having lived in places with more than a few grocery stores, that fresh dill would be available in July. In a grocery store. HAHAHAHAHA!

So I substituted freeze-dried dill, and now my pickles look like they are swimming in a very thick, dill-laden lake of brine. We will see what happens. If they turn out even close to amazing, I will grow pickling cucumbers and dill in next year’s garden. And okra.

When we asked our friend Jason, who tilled the garden this year having just received his master gardener certification, what we should plant, he answered, “Plant what you like to eat.” Now I am remembering, too late, that some of the things I like to eat are things I need to plant to make other things (like dill), things I forget that I like (like green beans), and things I have no confidence in growing (like tomatoes and okra). Maybe the garden this year will actually yield enough to build my confidence in growing not just things that I like, but the foods that sustain me.

Also, here are our newbie asparagus.

[photo: our newbie asparagus]