Thursday, July 30, 2015
There’s a piece making the rounds about Baumol’s cost disease that’s both smart and confused.
It starts by implying that explanations of tuition spirals that draw on Baumol’s cost disease are mistaken. But it sort of shifts ground halfway through, implying that it actually has been true until now, but that technical innovations will render it moot.
Then it claims that people use Baumol’s cost disease as an excuse not to innovate. At that point, it’s both factually incorrect and in tension with its claims in the first part of the piece.
People who put stock in Baumol’s as a driver of cost increases -- and I am one who does -- often use that as an argument for innovation. Either we change what we’re doing, or we’re locked into a cost spiral indefinitely. Very few people just shrug at the prospect of an indefinite cost spiral. (To be fair, Baumol himself sort of does in his recent book. I took him to task for that.)
Arguing that the status quo is structurally flawed is not arguing in favor of the status quo. If anything, it’s creating space for innovations to thrive.
As a writer, seeing the phrase “the collected works of Donald J. Trump” in the Washington Post is sort of bracing. It tends to generate a crisis of purpose.
Still, the review is worth reading. I salute the author’s tenacity.
On the topic of disruptions, I’m thoroughly convinced at this point that the real estate industry needs to be disrupted. Interstate moving is a nightmare.
Here’s an occupation waiting to be born: an event planner for long-distance moves. Basically a project manager who knows the sequence of what needs to happen, and who can follow up on the cascade of phone calls, emails, and requests to fax (!) documents. (My favorite was the request for a copy of a check I deposited two weeks prior. I’d love to make a copy but, um, I deposited it…)
The number of calls and emails with variations on “I need this obscure document faxed to me RIGHT NOW” is absurd. Everyone acts like it’s the first time they’ve ever worked over distance. That cannot possibly be true.
On the bright side, though, I’ll finally be able to bid Comcast adieu. There’s something deeply satisfying about that.
Program note: every summer, I take a week-long break from blogging. Most years, it’s for a vacation. This year, it’s for moving a house full of stuff.
Assuming catastrophe doesn’t strike, the blog will be back on Monday, August 10, once again reporting from New Jersey.
Wednesday, July 29, 2015
Meanwhile, in the Provinces…
I have a history of living in places that get overshadowed by other places. Growing up around Rochester, New York, I learned quickly that when I told people where I was from, I couldn’t say “New York.” New York City was the center of gravity for the entire state. It was a solid seven-hour drive from where I lived, but that didn’t matter; as far as the state was concerned, if you didn’t live in the metro NYC area, you were an afterthought. (Quick test: name the only NFL team that plays its home games in New York. People from western New York are the only ones who answer quickly and without Googling.)
Living in Western Massachusetts brought a similar feeling. To the rest of the world, Massachusetts is Boston. In fact, within the state, it often felt like the state was Boston. But living on the other end of the state from Boston, in a region with a completely different economy (and even accent), it was hard not to feel ignored. And of course, New Jersey has long been overshadowed by New York City and Philadelphia; Ben Franklin’s characterization of it as a barrel tapped at both ends is still recognizable.
Working in the community college sector, I get the same feeling when I hear popular discussion of higher education that takes Ivies or flagship state universities as the norm. The Ivy League consists of eight schools, and there are maybe sixty state flagships (NY has four, California has, well, never mind…), but there are over 1100 community colleges in the United States. Still, in our politics, community and state colleges are afterthoughts.
That’s probably why IHE’s piece about the University of Wisconsin at Eau Claire, and its efforts to deal with catastrophic cuts in state funding, resonated with me as much as it did. Political decisions based on stereotypes about research universities have real consequences out in the provinces, though nobody bothered to consider conditions there before making drastic changes. So the folks in the provinces are forced to make the best of a bad situation created by people who never had them in mind. They cut financial aid staff, advisors, and numerous behind-the-scenes people, as well as some nontenured faculty, all while being told to do a better job.
From the article, it sounds like the leadership at UW-Eau Claire is playing a bad hand pretty well. It avoided costly consultants and chose to be as transparent as it could, given that it was dealing with personnel issues. (With personnel matters, there is always a limit to transparency.) It acknowledged the impact on morale, and is trying to get acknowledgement at the state level. Those are all to the good, even though the battle is distinctly uphill.
The travails at UW-Madison have received tremendous coverage, both in formal and social media. Some of that is due to the monumental efforts of Sara Goldrick-Rab, and some of it stems from a combination of Madison’s prominence and Governor Walker’s presidential ambitions. Eau Claire hasn’t had nearly the exposure, though one could argue that the impact on Eau Claire is at least as bad, if not worse.
To the Eau Claires of the world, I say, keep fighting. No disrespect to Madison, or Boston, or New York City, but sometimes things look different from here. It would be nice to see that acknowledged before terrible decisions were made.
Tuesday, July 28, 2015
Multifactor Placement for Adults
How should a college determine whether an adult needs developmental coursework?
For years, most community colleges used a single placement exam and a set of “cut scores” for nearly everybody. (You could avoid them if you had AP credit, high SAT’s, or transfer credits in math, but most students didn’t.) An entering student would be directed to the testing center, where she would take tests cold in both math and English. (English was often divided into reading and writing, but the point still stands.) In states in which high school students were required to take four years of math, a new grad was probably still fairly fluent in what she knew.
Of course, many states don’t require four years of math in high school, so even new high school grads can show up with some serious rust on their algebra skills. But that’s another issue.
Even when memories of high school math are relatively fresh, though, recent research has shown pretty strongly that a single high-stakes exam is a poor predictor of success in college-level courses. That makes sense, if you think about it. On any given day, a student could be sleep-deprived, nervous, sick, or distracted. And performance on a single test doesn’t indicate tenacity, which isn’t evenly distributed. A high school GPA reflects performance over several years, and offers some insight into tenacity. A consistent record of good academic performance in high school isn’t a perfect indicator, but it’s the best one we’ve found.
But common sense -- and more practical issues -- suggest that a high school GPA is a better indicator when it’s a year old than when it’s ten years old. So using high school GPA as part of a “multi-factor placement” system is fine for traditional-age students, but not so great for adults.
If high school GPA isn’t a good or practical indicator, though, what should we use?
For most returning adults, SAT’s are not a serious option. Many colleges use some version of a one or two week “boot camp” for test preparation, to remind students of what they knew in the past and to help them shake off some rust before the placement test. That’s a good idea, as far as it goes, but it’s still basically fine-tuning the single high-stakes test.
To be fair, issues of placement apply to English as well as to math. With English, in some ways, the boot camp approach is even less promising. Solving for the sides of a 30-60-90 triangle is easy enough once you remember how, but writing well typically requires more than a quick refresher.
So, wise and worldly readers, I’m hoping some of you have found reasonably elegant solutions to this one. Other than the high-stakes placement test, are there specific practical, fair, and effective ways to assess entering adult students?
Monday, July 27, 2015
A Few More Rookie Dean Mistakes
For obvious reasons, I enjoyed the recent IHE piece on the mistakes that rookie deans sometimes make. The five mistakes it highlights, it gets right, but I’d add a few.
For the record, none of these reflect on anyone I know, since they are practically perfect in every way. I’ll apply these to academic deanships and to folks who come up through the faculty ranks; deanships in, say, student services may have very different issues.
In no particular order…
Applying the standards of proof for an academic publication to daily decision making.
This is a tough one to get over, because it comes from good intentions and years of training.
I remember being struck, when I first moved into administration, by how quickly a few facts or anecdotes became conclusive. If you start picking those apart, though, you quickly discover why it happens: if you wait for anything decisive, you will wait years, by which time the issue at hand will have become moot. The owl of Minerva spreads its wings at dusk, but you don’t have that long. So you have to learn when the call for more analysis is actually helpful, as opposed to when it comes across simply as a delaying tactic. “Paralysis by Analysis” is a real thing.
That’s not to excuse ignorance or write off data, by any means; every place I’ve worked, I’ve given the Institutional Research office a hearty workout. (Hi, Veena!) It means you need some basic truths grounded in serious research, coupled with the ability to improvise on application. If you wait for publishable certainty, circumstances will run you over.
Taking the first answer as the last answer.
Managing people means managing people, with all of their quirks. Many people will respond to any suggestion with a knee-jerk “no” that sounds definitive, but is really a version of “I’m not used to that yet.” Early in my career, I missed some windows because I mistakenly took the knee-jerk no as definitive.
Acceptance of new ideas isn’t usually instant or automatic. It’s a process. It takes time, it progresses unevenly, and while it can be encouraged, it can’t be forced. That means building some of that time into your process, and accepting that some initial reactions may be discouraging. Don’t give the first response undue weight.
Being the smartest person in the room.
When teaching, it’s easy to fall into the trap of being the smartest person in the room most of the time. But in administration, if you feel the need to prove yourself all the time, you’ll burn bridges and make yourself look ridiculous.
The best administrators I’ve known make a point of surrounding themselves with very smart people, and listening to them. That can mean allowing someone lower on the food chain to win, simply by having the better argument. When you defer to the better argument -- when you allow truth to trump rank -- you create an environment in which all that intelligence becomes an asset. If the chief has to win every time, then the organization is limited to the vision of the chief.
Closely related to that is the ability to laugh at yourself. If you don’t have that, you might want to find another line of work…
Rationalizing the quirks.

Every college I’ve seen has some quirky arrangements that make no sense on paper, but that work as functions of the personalities involved. Or sometimes they’re the least-bad available compromises among warring factions. Sometimes they’re historical holdovers with an emotional resonance for some folks that couldn’t be explained rationally.
It can be tempting to regard those as low-hanging fruit, and to make a splash by bringing rationality to where before there was darkness. (The logical implication of that sentence is “don’t pick dark fruit.” Don’t mix metaphors like this at home, people. I’m a trained professional.) But be careful. Ask some questions first, and listen for the pauses when people answer. The part of the sentence that tails off in the telling is often the most important. “We would have changed that, but, well, you know…”
Cultural shifts are glacial in both senses: gradual, but drastic. Like glaciers, they remake the terrain. If you can harness those forces, instead of fighting them, you’ll be far more successful.
Remembering too much.
Finally, accept the reality that you’ll make some mistakes, and that sometimes your best-available moves will be seen as mistakes. Learn from them, but don’t dwell on them. Forgive yourself the honest goofs, own them, and move on. Getting caught up in “I shoulda…” doesn’t lead anywhere good.
Even the best ones mess up from time to time. The trick is in moving forward anyway.
Wise and worldly readers, what would you add?
Sunday, July 26, 2015
Transparency, Debt, and Context
Should community college students borrow more?
It seems like a ridiculous question, given the recent focus on student loan burdens and the move towards free community college. But there’s actually a serious argument for it.
Libby Nelson’s recent piece in Vox focuses on Indiana University’s successful effort to reduce student borrowing by actually disclosing to students how much they already owe, and what their future monthly payments will be. The idea behind IU’s effort is that students are less likely to be cavalier about future debt if they see it in terms as concrete and clear as the disclosures on a car loan or a credit card bill. (I give credit card bills less, well, credit, for transparency, but that’s a side issue.) Now IU gives students a one-page statement showing total loans taken out so far, the expected payment period, the expected interest rate, and the expected monthly payment.
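The monthly payment on a statement like IU’s comes from the standard fixed-payment amortization formula. A minimal sketch, with purely illustrative figures (not IU’s actual numbers or disclosure format):

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed monthly payment on a fully amortizing loan."""
    n = years * 12        # total number of monthly payments
    r = annual_rate / 12  # monthly interest rate
    if r == 0:
        return principal / n
    # Standard amortization formula: P * r / (1 - (1 + r)^-n)
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical example: $10,000 borrowed at 4.5% on a 10-year term.
payment = monthly_payment(10_000, 0.045, 10)
print(f"${payment:.2f} per month")  # roughly $103.64
```

Seeing a balance translated into a concrete number like that, rather than an abstract total, is the whole point of the disclosure.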
Apparently, since IU started disclosing this information, students have reduced their borrowing by eleven percent. In a context of increasing costs, that’s striking. And all else being equal, it’s probably a good thing.
In a community college context, though, I suspect the results might be more complicated.
It’s true that community college students typically borrow much less than their counterparts at four-year colleges. That’s to be expected; tuition is far lower, and most community colleges don’t have dorms. But their default rates tend to be higher, as I noted last week. That’s largely because of a higher dropout rate. Graduates do quite well with repayment, but dropouts don’t. I’ve mentioned before some serious concerns with taking a “graduation rate” as if it were some sort of transparent indicator -- for my money, a student who transfers after the freshman year and graduates on time with a bachelor’s is a success, not a dropout -- but it’s still true that too many students either stop out and get lost or walk away altogether.
One reason that students walk away is that they work so many hours per week for pay. Being a “full-time” student is supposed to be a full-time job. When you add a second full-time job on top of that -- or, worse, a full-time equivalent with unstable and shifting hours -- it’s that much harder. Add family obligations, and even the most dedicated student will face an uphill battle.
Loans can reduce the amount of hours a student has to work. They can open up the time and space to focus on academic success.
For people from college-educated families, that may seem like a tautology or a no-brainer. But to someone from a background in which borrowing money for school seems profoundly suspect, it can look bizarre. And that’s why I’d worry about applying IU’s formula too quickly in a community college context.
Transparency works really well when there’s a context for understanding what you see. But if the context is missing, or an inappropriate one is applied, transparency can be detrimental.
In my observation, the problem isn’t that community college students are too cavalier about borrowing. It’s that they’re too afraid of not finishing. If finishing seems like a risky proposition, then hedging one’s bet by minimizing borrowing and maximizing work hours makes sense. But that hedge often becomes self-fulfilling. Work a lot of hours to avoid the debt that would be hard to repay without a degree, and you probably won’t finish the degree. Jump into the degree program with both feet -- which may entail borrowing enough money that you don’t have to work ungodly amounts of time -- and you’re much likelier to finish. And the statistics show clearly that the folks who finish are far more able to repay their debts than the folks who don’t.
It’s a sort of academic Calvinism. Act like the sort of person who graduates, and you’re much likelier to graduate. Put differently, fake it ‘til you make it.
I’d like to see us give students the payoff information IU does, but in a context that also shows how working too many hours a week for pay affects the chances of finishing. Show the downside risk, yes, but also the upside. Because the student loan crisis doesn’t stem from cardiologists who have trouble paying down $100,000 from medical school; they’re doing just fine. It stems from students who walk away owing $2,000 because they couldn’t keep up while putting in too many hours at Foot Locker. If we give that student only the information IU gives its students, she may double down on her hedges, to no good effect.
IU’s move is admirable, in its way, and probably appropriate to its context. But if we were to try something similar, we’d have to build in a lot more context. Without it, we’d be transparently showing only half of the picture, and doing real harm to our students.
Thursday, July 23, 2015
This may be the loveliest sentence I’ve read in years. Thanks to Mary-Kim Arnold for highlighting it.
This piece on succession planning in higher education occasioned some snark on Twitter, but it has its merits. Briefly, it argues that a paucity of good candidates for advancement within colleges and universities speaks badly of higher ed’s personnel practices.
I’d add that the taboo against “crossing over to the dark side” is stronger in this industry than in most. When actors say they really want to direct, nobody blinks. But for an early-career professor to declare an intention to become a dean would be considered odd in most places.
It shouldn’t be. Having folks in these roles who understand the reality of teaching makes a difference. The core of the administrative role is working to create the environment -- within the myriad constraints that can’t be wished away -- in which the best teaching and learning can happen. Having a practitioner’s sense of teaching and learning to inform decisions can only help.
The other variable, of course, is the trend toward adjunctification of the faculty. To the extent that future administrators come from the full-time faculty, hollowing out the full-time faculty will have predictable effects farther down the pipeline. It’s possible to hire deans from the ranks of the adjuncts -- I did it once at Holyoke -- but it’s still unusual.
For any early-career faculty considering a move, I can recommend a pretty good book...
The family is still in Massachusetts; I’m doing weekdays in NJ and driving back on weekends until the big move.
It’s a kind of flashback to grad school. My room here has a little kitchen, so I’m doing the kind of cooking I did as a grad student. I’m writing every night -- again like grad school -- but getting up much earlier.
It was fun for the first day. But I’ll be really, really glad when the band is back together.
Wednesday, July 22, 2015
The Dreaded Computer Lab
(I should clarify: “dreaded” modifies “lab,” not “computer.” Although they aren’t mutually exclusive…)
Like any good future-oriented sort with a nerdy streak, I harbor a host of techno-fantasies about the future. They have varying degrees of plausibility, to the extent that it’s possible to estimate these things. (Tricorders? Check. Beaming up? Not so much…) But this one seems plausible enough, given technology that already exists.
Imagine: a campus with good enough wifi, and a sophisticated enough printing system, that students could borrow or bring Chromebooks or something similarly simple and cheap, and work on papers or assignments wherever they happen to be. We wouldn’t need to dedicate nearly as much square footage to general-purpose computer labs. Students wouldn’t have to fight for spaces in labs, or try to concentrate on a paper while surrounded by other students clackety-clacking away under fluorescent light.
We could still have specialized labs in areas where that makes sense: graphic design, say, or CIS. And we’d still need a few open-access labs for folks with miscellaneous circumstances. But homework would be infinitely easier if students could work on it wherever, whenever.
Even better, ubiquitous access would enable ‘flipped’ classrooms, hybrid delivery, and all sorts of improved access for students with disabilities.
In a comment to yesterday’s post, an astute reader noted that many students who currently lack access to a laptop try to do homework on their phones. Leaving aside the issue of typing -- the typical phone keyboard is not a friendly environment for a six-foot-two adult male -- many sites and formats don’t adapt well to mobile. Tablets are less bad, but the issue of a keyboard remains. With a cheap laptop, though -- or even a tablet-and-keyboard combo that doesn’t suck -- a student should be able to do most of what would realistically be assigned.
General-purpose computer labs have their charms, but at this point, they seem like monuments to the 1990’s. They’re often impersonal, cluttered, and dreary. From an institutional perspective, they’re expensive and high-maintenance. They’re the current version of ditto machines.
Has anyone out there seen a large-scale Chromebook or laptop rental program work on a commuter campus? Is there a trick to it? Is there an obstacle I’m not seeing?
Tuesday, July 21, 2015
Offsite and Online
This interview with Christine Chairsell, the Vice President for Academic and Student Affairs at Portland Community College (OR), piqued my curiosity. It’s about the relationship between off-campus instructional sites or centers, and online education.
In other settings, I’ve seen online enrollments chip away at evening and weekend programs. The working adults who poured into evening programs in the 90’s have largely moved online, where the convenience of not having to drive to and from campus after work makes a real difference. Given that offsite locations are often justified in terms of physical distance, and that online education is often called “distance” education because it defeats the barrier of physical distance, I’d expect to see online instruction chip away at the underpinnings of offsite locations.
But generally, that doesn’t seem to be happening.
Some of that seems to stem from programs in which physical presence matters. In a CNA training course, for instance, students use manikins to learn how to move patients, take blood pressure, and the like; it’s hard to duplicate that kind of experience online. Culinary programs require actual kitchen experience. As Chairsell notes, you don’t want a mechanic who has never lifted a wrench. For many vocational programs, online instruction is still of only limited relevance.
But the staying power of onsite instruction isn’t limited to facility-specific programs. And that’s where I’d like to supplement Chairsell’s thoughts, and get some thoughts from my wise and worldly readers.
In many cases, community colleges set up sites specifically to reach students who otherwise might not have an easily (or realistically) available option. Many of those students are first-generation, and most of them work significant numbers of hours per week for pay. We know from the literature that one of the strongest predictors of success for first-generation students is the presence or absence of a personal relationship with someone at the college. Small sites lend themselves to that in a way that online instruction generally doesn’t.
The students may not know the literature, but they know that personal relationships matter. Having people staffing those centers who are genuinely welcoming and supportive makes a difference.
Chairsell refers to hybrid instruction as an increasingly popular option, and I agree. But I wonder if centers could take it to the next level. Susanna Williams and I did a think piece last March trying to envision what that might look like: the short version is that centers provide human connection and social support even while students do online work.
Wise and worldly readers, what kind of relationships have you seen between online instruction and offsite locations? Can they support each other?
Monday, July 20, 2015
When Too Little is Too Much
Quick, who’s likelier to default on a loan: someone with $100,000 of debt, or someone with $2,000 of debt?
If you answered $100,000, sorry. The correct answer is $2,000. And the reasons for that shed light on the whole “student loan crisis” debate.
Robert Kelchen had a great piece yesterday pointing out that, among other things, “[s]ixty-three percent of students who started college in 2003-04 and defaulted on their loans by 2009 were college dropouts, while students with a bachelor’s or associate’s degree were only 4% of defaults.” Dropouts often leave with far less debt than graduates, which makes sense; they left sooner. But they left with far less earning power, making their smaller debt loads much harder to pay. What looks like a “high loan balance” issue is, in fact, a “low earning power” issue.
The Boston Globe offered confirmation with more recent data. Yesterday’s story included a handy table listing the default rates for the various colleges throughout Massachusetts. A sample:
Greenfield Community College: 22% default rate
Westfield State University: 4.6% default rate
Williams College: 1.8% default rate
As the cost of the college goes up, the default rate goes down. The relationship isn’t perfect: despite nearly identical costs, Amherst College’s default rate was more than double that of Williams’. But in broad terms, the community college sector -- the most affordable, by far -- had the highest default rates. And that’s in a state in which the community college default rate was lower than the national average.
Though counterintuitive in some ways, these findings actually offer some grounds for optimism. At a really basic level, I don’t think it’s a coincidence that loan default rates spiked during the Great Recession; as the job market recovers -- slowly, but still -- I’d expect to see default rates retreat. And to the extent that colleges do a better job of keeping students on track to graduation, default rates should drop even as average balances increase. That’s because attending long enough to graduate may entail taking out more loans than walking away after a semester or two, but the payoff from the degree is more than worth it.
Are these findings a rebuttal to the “free community college” movement? If less debt is more bad, would no debt be the worst of all?
Um, no. Because the default problem is really a wage problem. For students who drop out anyway, less debt is better than more debt. The real public policy problem is the student who drops out with enough debt to matter, but without a serious income. The “free community college” movement addresses the first directly, by eliminating tuition, and the second indirectly, by making it easier to pay living costs. Students who can cover their living costs are likelier to persist and graduate.
Statistically, too little debt behaves like too much. But that’s a trick of the math. The real crisis isn’t around student loans; it’s around entry-level salaries. Until we figure that out, we’ll keep trying to solve the wrong problems.
Sunday, July 19, 2015
Ask the Administrator: Pictures of Muhammad
I could feel the urge to duck as I read this one. A returning correspondent, who is preparing a class on the history of the Middle East, writes:
An issue has, however, arisen regarding one of my new lectures. I happen to have an extremely image-intense lecture style, often showing over a hundred slides in a 50 minute lecture. I confidently assume most of my not-particularly-diverse student body doesn't know anything about Islam. So, I'm doing a lecture explaining the basics of Muhammad's life and teachings. I've therefore made slides showing images depicting various moments in Muhammad's life. On second thought, however, I wonder if showing those images is wise.
The details matter, so, specifically, I've got (1) a sixteenth-century picture of Muhammad going up to his cave to meditate, now held in the Topkapi palace museum in Istanbul, (2) a fifteenth-century Afghan image of Muhammad being visited by the archangel Gabriel, (3) a nineteenth-century Ottoman image of Muhammad's army destroying the idols upon the conquest of Mecca, and (4) an Algerian postcard from the 1920s showing him hiding from the pagan Meccans in the cave with the spiderweb. Some of the images show Muhammad without a face: (1) shows him as a white outline with no detail, and (3) shows him as a pillar of flame. Image (2) shows him with a face, presumably because it's a Shia drawing, and one school of Shia thought holds that depicting Muhammad before he gives his first recitation is allowed, since before he becomes a prophet he's an ordinary man. Image (4), finally, shows him as a sort of cartoon figure. Image (4) is thus the most sensitive, and the only image not from a Muslim-governed society, but since it's bilingual in Arabic and French, it still seems aimed at a Muslim audience.
The case for showing the images runs as follows. The existence of all these Islamic depictions of Muhammad is itself a teachable moment, because they illustrate that the taboo against depicting the prophet has been more or less strict at different times. The Shia rule that it's okay to depict Muhammad before his revelations is also interesting. Furthermore, I'm not a Muslim and feel some desire to assert my right to show whatever historical artifacts I please, given that it's not against my own beliefs to do so. In the 1980s, Adam Michnik, a Polish dissident I admire, urged his fellow Poles to "live as if you lived in a free country," and I've often tried to take his advice to heart. Well, if I lived as if I enjoyed the freedom of speech I think ought to exist, I'd show these historical artifacts: they're cool images that will help explain the points I want to make.
On the other hand, I'm not keen to get in the newspaper as the controversial professor who offends Islam. Actually, I'm not keen to get in the news at all. I'm also not keen to make friends with the Islamophobic types who would applaud me for doing so. Furthermore, I'm very keen to not get death threats, which I suppose is a possibility that can't be ruled out. I like my quiet life, and if I am honest with myself, I must admit that I lack the courage of Adam Michnik. So, perhaps I should censor my own slides, particularly image (4) but possibly (2) as well, and explain to the class that I've done so out of fear. So, those are the options that I've been considering.
Pondering these options, however, I wondered what the stance of the university administration would be. Perhaps the administration would like to not be blindsided by this lecture? So I wrote an email explaining my thinking to the head of school, and asking for feedback from the powers that be. And that's what leads me to write you, oh community college dean: he forwarded the email to a dean and said he'd get back to me, but now, six weeks later, it seems that everybody has simply forgotten about me. The course won't be taught for several months, there's plenty of time ... but I think it's just fallen off everybody's radar. I find being ignored over this issue discouraging and unsettling.
So, while you and your wise and worldly readers may have opinions about the issue of "showing the images" vs. "censoring my own slides," I'm really writing to ask: what do you think about the administration's silence? Do you think I'm right to be upset? Or am I taking it too hard? Should I just ... well, send a reminder?
I’ll start by answering the question that was asked. Send a reminder. From an administrator’s perspective, it’s easy sometimes to slip into “triage” mode, especially with email. That involves sorting emails into “on fire” and “not on fire,” and then (often) forgetting the latter group. With a topic as complex as this one, the temptation to look at the date, decide it can wait, and move on to something both simpler and more urgent is real. So I’d start with the simplest, lowest-cost strategy. You can always raise a bigger stink later.
Of course, they may be quite aware of it, and sort of hoping that it goes away. A tactful reminder can let them know that it isn’t going away.
In terms of the heart of the matter, I’m thinking this may be the best case I’ve seen for “trigger warnings.” There’s a strong academic freedom argument for following the research where it goes. There’s also a strong pedagogical argument for not jamming students’ radar and overshadowing the point of it all. I’d be inclined to suggest splitting the difference by offering links to the pictures, without showing the pictures themselves, so the students could decide individually whether they wanted to see them or not. Those students who would be severely offended by the images wouldn’t be subjected to them; those who were curious could see them. That would also offer a reasonably elegant opportunity to discuss the point about changing rules, without turning the class into a circus.
In talking with your dean, you might want to raise the possibility of having someone from student services alerted to it, so if a student has a strong visceral response, you’ll have a trained professional at the ready to help them process it. You also might want to time that particular lecture to avoid religious holidays, since showing the images on one would just add insult to injury.
The closest parallel to that I’ve dealt with in my own teaching involved historical documents that included what we’d now consider racial slurs. I didn’t eliminate every source that included them -- history isn’t always pretty -- but I did make a point of providing extra context both before and after. I told, and showed, the students that I assumed they were adults, and they could handle difficult issues if they chose to. They always did, which was gratifying. That said, this case is much more electric, so I’m thinking a little extra circumspection would be in order.
If you come to the discussion from a framework of how best to reach students, rather than how to make a political point, you’ll probably have better results.
Wise and worldly readers, what say you?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, July 16, 2015
Apparently, Udacity is doing a variation on the old “security deposit refunded upon graduation” idea. It will refund half of tuition upon completion of any “nanodegree.”
Some of us are old enough to remember when Udacity and other MOOC providers were going to be entirely free. Now they’re half off, if you act quickly. Economic gravity is a real thing.
I’ve toyed with the idea of the graduation security deposit before. It has a certain appeal, especially for the student who completed nearly everything but just never bothered with the paperwork. But I have to think it would tend to reward the students who are already the wealthiest, and further punish the ones who are already struggling the most.
As Tressie McMillan Cottom likes to point out, one of the reasons that for-profit providers were successful for so long at attracting low-income students was that they understood the difference between money now and money in the distant future. When you’re riding the ragged edge of disaster financially, being told that you need to put aside a significant chunk of money now in order to get it back later is a non-starter. You need it now. For-profits get that, and go out of their way to move costs into “later.”
Some read that as a character flaw in the poor, but I think of it as a reasonable response to an unreasonable situation. If your life is unstable and unpredictable, wagering too much on any single outcome is a bad bet.
In the rush to improve graduation rates, it’s easy to fall back on versions of cherry-picking, and to focus only on the students who are likeliest to make it anyway. The real challenge is in improving outcomes among students for whom it matters most.
Packing sucks. And that is all I will say about that.
On the bright side of packing, though, Thursday was my first day at Brookdale Community College. The sun shone, everyone was gracious and welcoming, and even the parking was easy.
A first day, in the middle of July, is a representative sample, right? Right?...
Wednesday, July 15, 2015
Thoughts on “Rise of the Robots”
In what feels like a previous life, I used to read a lot of midcentury social theory. Some very smart people -- I’m thinking here of folks like David Riesman -- used to argue that the great crisis of the coming decades would be the sudden abundance of leisure. Given the outsize productivity gains of the previous decades, and the declines in the average workweek, it seemed reasonable to look forward and suggest that the future held ever-shorter workweeks. In that scenario, what to do with so much “found time” seemed like a burning issue. (Riesman titled one of his books Abundance for What?, which gives a pretty accurate idea of what he took as given.) No less an economic thinker than Keynes believed that workweeks would get steadily shorter, freeing people up to pursue more interesting things.
Somewhere along the way, that didn’t happen.
Productivity is much higher now than it was then, though living standards for middle-class Americans peaked somewhere around 1970. We’ve become astoundingly good at coming up with ways to consume leisure time. My kids have trouble believing that there were once only four channels on TV, and the idea that we didn’t have the internet strikes them as horrifying. I’m old enough to remember when seeing a movie meant hoping that it came to the local theater, or that it would be “movie of the week” on one of the four channels. Now, I have to explain to the kids why they shouldn’t stream Netflix over 4G. The “abundance of leisure” problem has been solved.
But the “how to make a living” problem is real. Rise of the Robots, by Martin Ford, makes a strong case that technology-driven job elimination is outpacing technology-driven job creation, and it isn’t confined to blue-collar jobs anymore. As robotics and artificial intelligence have advanced with increasing speed, they’ve been able to displace humans in progressively higher-skilled occupations. Over time, human labor is split: a small elite either owns or designs the machines, and makes an absolute killing. A majority is shunted into work that’s both idiosyncratic -- and therefore hard to automate -- and low-value, making it not worth automating. (Much low-end service sector work fits this description.) And the rest are pushed outside of formal employment altogether.
In the world predicted at midcentury, this wouldn’t have been a problem. Many critics at that time assumed that increased wealth would be spread evenly across society; they assumed a de facto decoupling of ownership from profits. Instead, we’ve doubled down on the tight connection between ownership and profits. In many fields, we’ve increasingly decoupled work from meaningful income.
Ford notes that healthcare and education have been relatively immune to the rise of the robots so far, with a few exceptions. (He claims that pharmacists have taken it on the chin; I don’t know whether he’s right about that.) By the logic of Baumol’s cost disease -- which he briefly outlines but never names -- we should expect cost increases in healthcare and education to continue to outstrip most other sectors, simply because their productivity is increasing much more slowly than everyone else’s. As that happens, we should expect the pincer movement of higher prices and internal austerity to tighten. (Ford may be more right than he knows; for example, he places far greater faith in robo-graders for papers than I do. If I’m right, then the productivity nut is even harder to crack than he thinks it is.)
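The logic of Baumol’s cost disease can be made concrete with a toy simulation. The numbers below (3% vs. 0.3% annual productivity growth, a 50-year horizon) are my own illustrative assumptions, not figures from Ford or Baumol; the point is only the mechanism: two sectors share one labor market, so wages in both track the fast-improving sector, while the stagnant sector’s unit costs balloon.

```python
# Toy model of Baumol's cost disease.
# Assumptions (illustrative, not from the book): two sectors draw on the
# same labor market, so wages in both rise with the fast sector's
# productivity; the stagnant sector's own productivity barely moves.

years = 50
wage = 1.0               # common wage, indexed to 1.0 at year 0
prod_progressive = 1.0   # output per worker, fast-improving sector
prod_stagnant = 1.0      # output per worker, slow-improving sector (e.g. teaching)

for _ in range(years):
    prod_progressive *= 1.03   # assumed 3% annual productivity growth
    prod_stagnant *= 1.003     # assumed 0.3% annual productivity growth
    wage *= 1.03               # wages follow the fast sector (shared labor market)

# Unit labor cost = wage / productivity.
cost_progressive = wage / prod_progressive   # stays flat at 1.0
cost_stagnant = wage / prod_stagnant         # rises steadily

print(f"unit cost, progressive sector: {cost_progressive:.2f}")
print(f"unit cost, stagnant sector:    {cost_stagnant:.2f}")
```

Nothing in the stagnant sector got worse; its relative price rose anyway, simply because everyone else got faster. Under these assumed rates, its unit cost roughly triples while the fast sector’s stays flat.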
What Ford gets right, and the midcentury critics got wrong, is that a strong middle class is neither inevitable nor natural. It’s a relatively recent, and fragile, development, and it was the result of a set of conscious political choices. As those choices are reversed, the conditions under which a middle class can thrive go away, and the middle class struggles to reproduce itself. The savvier members engage in “opportunity hoarding,” or pulling up the drawbridge behind them; the less savvy ones gradually (or quickly) lose ground, and wonder just what the hell happened.
From the perspective of someone working in the community college world, Ford’s diagnosis is bracing. Community colleges are designed to create a middle class for a society that’s increasingly moving away from a middle class economy. The task is getting objectively harder. Jobs that once seemed like sure bets for long-term economic security come under attack, one after the other. Yes, some still exist, and I’m happy to prepare students for them. But if you compare, say, the number of people who work at Amazon to the number of people who used to work at Borders, you can see the problem.
Perversely, I actually draw some hope from the staggering wrongness of Riesman’s and Keynes’ predictions. They were very smart people who wrote books that captured important truths about their times, but they got the future badly wrong. As smart as they were, they couldn’t capture the entire picture, and the cracks in their systems let in so much water that the whole thing sank. If we’re lucky, fifty years from now, someone will say the same about Martin Ford.