Of course, we've all heard about how any system with rigid requirements can be manipulated by the undeserving and crush those who rightfully belong. Ultimately, some balance must be found between rigidity and flexibility to enable creativity and true learning while keeping out various manipulative "quick" elements, all while remaining resistant to subjective biases and 'unfairness'. This problem actually runs even deeper, and seems to be an integral element of many, if not most, political debates plaguing humanity: should every individual have an equal starting point, or an equal end point? In very crude terms, capitalism favours the former while socialism favours the latter. This debate has raged on since long before those concepts were even conceived, and is perhaps as old as structured society itself. Education and academia are no exception.
So to sum up my views on rigidity of rules and regulations: Hell if I know. I'll get back to you once someone definitively and unarguably proves the ultimate superiority of either the capitalistic (equal at start) or socialistic (equal at finish) approach. We've got plenty of time...
But aside from that, there are still plenty of other issues that need attention. And perhaps they are actually much less controversial, so solving them would be a rather productive pursuit. Among the many, I can point out two somewhat related problems: the obsession with knowledge, and the failure to encourage, or even enable, personal curiosity in students.
The futility of pursuit of knowledge
We all know that universities are centres for the pursuit of knowledge, and that knowledge is priceless, long-lasting and leads to power. However, I'll argue that this is not actually the case, or at least not what the case should be. For one thing, knowledge is highly perishable -- for example, can you remember what you so desperately crammed (or even wisely accrued gradually over the whole term) for your organic chemistry final? Calculus? Even an exam from your own field? Doesn't it get rather fuzzy? I like to think of stocking up on knowledge as being similar to buying 10 years' worth of groceries in a single trip -- rather useless, as food is perishable. Knowledge, too, is highly perishable.
That said, there are aspects of knowledge that seem to last. While you've probably forgotten all the intricate equations and reaction names from o-chem, there may be some foundational relics lingering in your memory: resonance, or the mutarotation of sugars (had to google to remember the exact term...), or the concept of how electrons can move. The nomenclature has probably all but vanished (unless you are a biochemist), although some basic concepts may remain -- the backbone goes last, for example. Still, this is a very poor return on the countless hours that were spent slaving over the subject. Beyond those very primitive concepts, I can recall very little usable information. It's pretty much impossible for me to read and understand an organic chem paper, which indicates that despite the courses, I'm still embarrassingly illiterate in the field.
I'm no model student (by FAR... heh), nor does my chemistry talent shine particularly brightly (in fact, I REALLY suck at it), but I find it difficult to believe that a better student has a brain fundamentally different from mine in what it prefers to retain. Thus, in our grocery store analogy above, what is important is not to grab a crapload of food, nor even to remember where all the food is sold (every store is a bit different), but rather to get an idea of what types of food exist, how they tend to be organised, and how to find them.
I [used to?] consider myself a cell biologist, and have been involved in research in that field for three years now. I'm not claiming to be anywhere near an expert in ANY of it, but I feel I've been acquiring some 'sense' for how cells work. In any case, I can read cell biology papers with a certain degree of confidence, especially once I look up a few obscure gene names. So I was quite excited for this cell physiology course: I was even prepared to tolerate its anti-comparative 'phylogenetically-uninformed' approach.
Curiously enough, the course is turning out to be a bit of a disaster. Despite my experience in the field, I find it extremely difficult to focus on the content, and feel rather overwhelmed by the sheer volume of stuff. For example, I've actually worked quite a bit with microtubules, doing all sorts of fun (read: destructive) stuff with them and reading a fair amount of the literature. I find cell shaping and the molecular processes behind it simply fascinating. Don't get me started on MTOCs and nucleation and regulation of dynamic instability, etc. Oh, and unlike quite possibly every other student in the class, I've actually watched microtubules grow in real time, in vivo, with my own eyes, and made my own timelapses. And we're going to talk about them -- what could possibly go wrong?
Turns out, when presented with slide after slide of info on various molecules and their interactions and all this data about them, I simply blank out. The material magically becomes beyond me (somehow, it makes A LOT more sense in a dry research paper...), and even a bit...boring! When I first read about gamma-TuRCs (microtubule nucleating complexes present in MTOCs, and regulated in potentially very interesting ways to control cell shape!) in a Nature paper for a lab meeting, I could devour the material rather smoothly and quickly, and it all made sense, and was awesome, and I just couldn't wait to get the time to look up more papers on gamma-TuRCs in various systems and contexts and OMG IT WAS VERY EXCITING! It helped that at the time, we were working on an endomembrane trafficking mutant that showed cell shape defects, so it was interesting to see how this could relate. And we were going to cover this stuff in class!
First off, the whole approach to the cytoskeleton seems to be very bottom-up: first we examine the chemistry of tubulin polymerisation, and only afterwards [rather briefly] look at how any of this is relevant to cell biology. I prefer a top-down approach: we have a problem -- how does the cell exhibit the necessary cytoskeletal forms/organisations and switch between them? E.g. take plant cells -- the microtubules must go from a fairly chaotic cortical array to a band of tubes perpendicular to the axis, then the mitotic spindle (parallel to the axis), then the phragmoplast (perpendicular), then back to cortical arrays (as in this diagram). Basically, a crapload of moving around. How the hell does this work? For one thing, these changes require some instability, and yet a fair degree of stability at the same time. This must be regulated. Then come the properties of tubulin polymerisation, then microtubule-associated proteins, then organisation by nucleation, etc. Rather similar material is covered, but in the latter case we have context. We students know why we should care!
Those generalities are what stick, not the mundane specifics. Seriously, am I expected to remember 10 years (or even 10 days) after the final exactly what gamma-tubulin looks like and where the alpha and beta tubulins bind it? No? Then why the hell does this stuff get tested? Even presenting it as an aside is questionable, as it may be distracting despite being potentially cool. I can always look up those details myself. Meanwhile, the question of how cytoskeletal organisation may be regulated by these gamma-TuRCs (e.g. adaptor proteins regulating where those TuRCs localise) remains unexplored. Personally, I find the latter a much more memorable and interesting topic, to a cell biologist anyway. And note, it's much less specific.
(Again, not to pick on this cell physiol class -- despite its shortcomings, it's actually quite a reasonable course compared to some others I've taken. I know the instructor tries; I'm just using it as an example to illustrate issues with the overall approach prevalent in undergrad biology, at least here. I do not intend to slander the course!)
So I think there are roughly three points to make:
- necessity induces relevance, induces better learning
When you have your own particular problem to solve (either research-related or out of personal interest -- artificial problem sets do not work, in my view), you crave any piece of information that can even slightly help you solve it. This is why lab research doesn't actually have to be particularly narrow -- a researcher must scour many distant fields in hopes of finding data or ideas that may come in handy for their own problem. And the more distant fields an investigator explores, the more material they have for some potentially awesome idea. Of course, this must be balanced against the need to focus, creating a bit of a spectrum between what I've heard described as the "fuzzbrain vs pinhead" mentalities. (you can probably guess which end I gravitate towards...)
- excess information overwhelms, inhibits learning
One of the ways to induce serious loathing of PowerPoint is to flash those ridiculously complicated slides with MASSIVE GIGANTIC BLOCKS OF 8PT TEXT. Even if you don't use blocks of text, but instead use 20 concise bullet points per slide, the situation is still rather loathsome. And ineffective. There is a reason for this: the brain actually has limits to how much stuff it can process at once. If you're going at 20 bullets per slide, most of your talk will be a blur to the audience. When cutting down your slides, you often feel very attached to the pieces of information you intended to present. But it helps dramatically to ask yourself: Does anybody actually care? Is it absolutely essential for the main point? Does it really matter how many species are in Phylum Porifera if you're discussing their evolution? This may seem pedantic, but those tiny excess bits of information pile up, and overwhelm.
This applies even more strongly to course lectures. For some reason, all those great presentation skills the research profs must exercise at conferences seem to evaporate instantaneously in front of a lecture hall. Much irrelevant information is crammed into an already-long (and attention-taxing) lecture, such as protein structures in a cell biol course. As a result, it's hard to keep track of the central ideas -- the things that are much more likely to persist past the final.
- understanding is closer to modelling than to knowing facts
Ultimately, we can't really know anything. We can only create models to make predictions from. Luckily, not all models are equally good at making accurate predictions, and thus we have science. Understanding that science -- nay, the very attempt to comprehend our environment -- is essentially an optimisation algorithm (Bayesian MCMC, anyone?) makes it so much more alive and dynamic and interesting! Thus, it is the modelling, even the optimum-seeking algorithms themselves, that must be the focus of education, not the memorisation of the underlying data!
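To make the science-as-optimisation point concrete, here's a toy sketch of the Metropolis algorithm (the workhorse behind much Bayesian MCMC): a "model" (here, just a guessed mean) is repeatedly perturbed and kept when it explains the data better -- understanding as iterative refinement, not stored facts. All the numbers and names here are my own illustrative assumptions, not from any course.

```python
import math
import random

def metropolis_mean(data, steps=20000, proposal_sd=0.5, seed=42):
    """Toy Metropolis sampler: refine a guess for the mean of `data`,
    assuming a Gaussian likelihood (known sd = 1) and a flat prior."""
    rng = random.Random(seed)

    def log_like(mu):
        # Gaussian log-likelihood; constants dropped since they cancel
        # in the acceptance ratio
        return -0.5 * sum((x - mu) ** 2 for x in data)

    mu = 0.0          # initial model: a deliberately bad guess
    samples = []
    for _ in range(steps):
        proposal = mu + rng.gauss(0, proposal_sd)   # perturb the model
        # accept with probability min(1, likelihood ratio):
        # better models are always kept, worse ones only sometimes
        if math.log(rng.random() + 1e-300) < log_like(proposal) - log_like(mu):
            mu = proposal
        samples.append(mu)
    burn = steps // 4  # discard early samples while the guess settles
    kept = samples[burn:]
    return sum(kept) / len(kept)

# hypothetical measurements clustered around ~5
data = [4.8, 5.1, 5.3, 4.9, 5.2, 5.0]
estimate = metropolis_mean(data)
```

The point of the sketch isn't the statistics; it's that "understanding" here lives entirely in the update loop, not in any memorised table of answers.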
Thus, the point of education is not to fill students with knowledge, but rather to provide the tools for acquiring further knowledge (e.g. literature research skills), give a conceptual outline of where things fit, highlight particularly interesting questions, and basically provide ideas for further [personal] investigation -- which is the topic of my next point. To me, science is more about asking questions than having answers, since seeking the latter invariably leads to the former. And that's what's exciting about it -- if you just want to know things, perhaps theology is the better way to go!
I realise I'm not being too concise here myself, but this is a rant, not an article or anything.
The value of pursuit of curiosity
We are taught from a very young age that curiosity is dangerous. Children run around asking questions about everything, much to the great annoyance and suffering of their parents. Curiosity leads to nasty things, from injuries to social failures. Unrestrained curiosity is dangerous, and to a large extent, curiosity isn't really a necessity for a good life. In fact, curiosity seems to lead to a poorer quality of life, especially where it results in rather stupid career choices, like academic research. But just as curiosity leads to awful career choices, those careers provide an opportunity to achieve fulfilment and gratification from its pursuit. In fact, that's probably the only gratification you can get from an academic job -- it seems to really suck in all other regards.
Most science students will probably eventually wise up, make good career choices and run the hell away from science, and that's great because they'll be productive and make lots of money and pay taxes and ultimately fund our research. However, while they are in science in whatever form, I think curiosity is essential. Without passionate interest in the subject, the entire degree is just a waste of time. Maybe that's a bit extreme, but I stand by it. I don't understand why someone would suffer through four years of generally horrible classes with dry material just to get a Bachelor's degree if they don't give a flying fuck about the subject. (ignoring the premed problem for now; North Americans really need to start med school straight from high school like they do everywhere else in the world...) Diligent studying does not contribute to the giving-a-flying-fuck index (that is, passionate interest), but is instead an execution of one's duties as a student (something I admit to failing miserably at). In fact, I'm gonna get extreme enough to make this statement:
A 'true' biology student must, from time to time, peruse scholarly literature at their own leisure, for fun. In other words, being a fucking nerd, right? Well, if you chose to study science, being nerd-o-phobic is rather weird, if not just plain dumb. In my eyes, you are not a student of your discipline unless you read relevant materials for your own enjoyment. At the very least, reading popular literature in your field is a start, especially in first and second year. Otherwise, you'd just be flotsam as far as your majors program is concerned.
What would be really cool is if the programs that exercise those awful things called internal admissions (that is, bottlenecks after you start attending the university -- borderline fraud in my opinion, but no one cares...) actually considered the give-a-flying-fuck index when choosing applicants. I think that's more relevant to the program's mandate -- [supposedly] training biologists -- than how well a student performs in the coursework. In other words, admissions should seek out 'nerds', for those are the people who may actually become scientists.
Otherwise, you end up with practising biologists with several years of real research experience, a few hundred read papers and impending publications of their own locked out of a program full of dull flotsam drifting by, oblivious to the awesome wonders of the very fields surrounding them. This may sound quite arrogant, but the onus is on the student to be honest with themselves about whether they truly care about their discipline, and then to decide whether they should really spend time there. And I once naïvely thought the university would like to educate its future crop of academics. Hah. They'd go for the easier choice even at the expense of their own kind. Which kind of makes sense -- do they really need more competition? Judging from the quality of some of the research out there, probably not...