Math Education in the Face of Climate Change

Our profession is denying climate change.  We continue to create greenhouse gases with little regard for the planet or for the vulnerable organisms that become collateral damage of our ignorance.  Our organizations celebrate the isolated experiment in a ‘smaller carbon footprint’ while the vast majority of our companies remain focused more on tradition than on science.  We are the enemy.

This metaphor is meant to convey the tragic condition of mathematics education in the year 2019.  Some of us have had major changes imposed on us relative to developmental mathematics, changes which generally leave all other college mathematics unscathed.  Even in some of those places, we still offer just as many traditional developmental math courses.  In almost all cases, the important processes in our industry remain as they were 40 or 50 years ago.

One of the attack lines against developmental mathematics has been “remedial math is where dreams go to die”, and it is true that our traditional developmental math courses did not serve our students well.  The response to this criticism has been trimodal — some of us replaced the traditional courses with fewer and more modern courses, some of us eliminated dev math with ‘corequisite’ strategies, and the rest of us continue business as usual.

If you really want to see dream death in mathematics, study our ‘pre-calculus’ content and courses.  Students enter the preparation for calculus with dreams of becoming scientists, engineers, or computer scientists; they almost always experience a brain-deadening mix of algebraic procedures and memorization which seems to have the goal of eliminating the ‘unfit’ before calculus I.

In mainstream college mathematics, we hold to tradition … all of this ‘stuff’ is needed for calculus; the rationale: we have always taught this ‘stuff’ in pre-calculus.  We can sometimes justify the greenhouse gases of pre-calculus by citing a contrived calculus problem which happens to require a contrived pre-calculus topic.  Our current books — including “OER” materials — for pre-calculus are still descendants of a general-education college algebra course never intended to be in a calculus path (see College Algebra … an Archeological Study).

Evidence of this bizarre, dream-killing curriculum is our habit of having “college algebra for pre-calculus” as the prerequisite for pre-calculus.  College algebra has nothing to do with pre-calculus, just as pre-calculus has nothing to do with calculus (see College Algebra is Not Pre-Calculus, and Neither is Pre-calc and College Algebra is Still Not Pre-Calculus 🙁 ).

Pre-calculus is where STEM dreams go to die.

The most egregious contribution to ‘climate change’ in mathematics?  The fact that numerical methods and modeling are not integrated into the curriculum at any level.  All of our client disciplines depend heavily on a collection of matrix and modeling methods and on technology.  Nobody ever needed all of the manual calculus methods we taught, though many of them were critical before computational mathematics existed.  With computational mathematics, fewer manual methods are needed — more conceptual rigor is required, along with content to support appropriate numerical methods.
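
To make that claim concrete, here is a minimal sketch (in Python with numpy, my choice for illustration rather than anything prescribed) of the matrix and numerical methods our client disciplines lean on: solving a linear system and approximating a derivative without symbolic manipulation.

```python
import numpy as np

# A matrix method: solve the linear system A x = b directly,
# the routine computational replacement for hand elimination.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = np.linalg.solve(A, b)
print("solution:", x)                      # [0.8 1.4]

# A numerical method: approximate f'(x0) with a central difference,
# which works even when no symbolic derivative is convenient.
def derivative(f, x0, h=1e-6):
    return (f(x0 + h) - f(x0 - h)) / (2.0 * h)

print("d/dx sin(x) at x=0:", derivative(np.sin, 0.0))   # ~1.0
```

The conceptual rigor lives in knowing what the solution means and when the approximation can be trusted; the button-pushing is the easy part.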

This image comes from a page on this blog, Envisioning Our Future, and is an attempt to envision a solution to our climate change problem.  Eliminating wasted energy and keeping dreams alive are essential criteria for judging the validity of such solutions, and I have no doubt that my ‘vision’ will not be our shared solution.  We need to work together to create viable solutions locally, share these solutions regionally, and eventually develop a national pattern of ‘good college mathematics’ courses.

Are you so attached to tradition that you are willing to contribute to this ‘climate change’ in higher education?  Or do you want STEM dreams to live and thrive?  Perhaps you are willing to consider fundamental change to our curriculum based solely on the criterion “teach good mathematics”.  We might take pride in our individual teaching practices, but none of us can take pride in delivering bad or awful mathematics to tomorrow’s scientists.

Our traditional courses create dangerous levels of greenhouse gasses (bad mathematics) and contribute directly to climate change (the death of dreams).  We need to reduce our carbon footprint (more ‘good mathematics’) and actively improve the climate (student dreams).

What are you doing to ‘fight climate change’ in college mathematics?

Math Education: Changes, Progress and the Club

What works?  How do we help more of our students … all students … achieve goals supported by understanding mathematics in college?  To answer those questions, we need to correct our confusion between measurements and objects; we need to develop cohesive theories based on scientific knowledge.  We are far from that stage, and policy makers tend to be just as ill-equipped.

Co-requisite remediation in mathematics has been broadly implemented.  The process which led to widespread use of a non-solution illustrates some of my points.

The rush towards co-requisite models is based on two patterns in ‘data’:

  • Students who place into remedial mathematics courses tend to have significantly lower completion rates in programs
  • Students who pass a college math course in their first year tend to have significantly higher completion rates in programs

These patterns in the data are not in dispute.  The reason I call co-requisite remediation a non-solution, however, is the faulty reasoning which builds a solution on these ‘facts’.  I find it surprising that our administrators accept a treatment which never addresses the needs of the students — it’s all about responding to the data.  That approach is, in fact, a political method for resolving a disagreement where the people on one side hold the power to make decisions.

Eliminating stand-alone developmental math courses will not enable more students to complete their college program, beyond the 15% to 25% who were underplaced by testing — and then only to the extent that this group failed to complete their remedial math courses.  The data on this question usually shows that the issue is not students failing remedial math courses; it’s students never attempting them or not returning.

Tennessee has been the most publicized ‘innovator’, and I have been swung at with the “club” that the leader of that effort was a mathematician — the handle of the club reads “He’s a mathematician, and he thinks this is a good solution!”  As if mathematicians are never wrong :(.  Some evidence on related work in Tennessee shows a lack of impact (https://www.insidehighered.com/news/2018/10/29/few-achievement-gains-tennessee-remedial-education-initiative).  I would expect that evidence to show that the main benefit of corequisite remediation is an increase in the proportion of students getting credit for a college math course in statistics or in quantitative reasoning (QR)/liberal arts math (LAM).

When data shows that many students can pass a college math course in statistics, QR, or LAM, I don’t see evidence of corequisites ‘working’.  Many of my trusted colleagues believe that stat, QR, and LAM require very minimal mathematics; some believe that the only real prerequisite is a pulse, as long as there is effort.

Why do students placed in remedial mathematics have lower completion rates?  This group of students is not homogeneous.  Some of them have reasonably solid mathematical knowledge that is ‘rusty’ … a short treatment, or just an opportunity to review, is sufficient.  A significant portion have a combination of ‘rust’ and ‘absence’; the absence usually involves core ideas of fractions, percents, and basic algebraic reasoning.  Another portion of remedial math students have mostly ‘absence’ of mathematical knowledge.  Corequisite remediation presumes to address these needs via a ‘just in time’ approach; ‘rust’ responds well to short-term treatments — ‘absence’, not so much.

I’ve written previously on the question of ‘when will co-requisites work’ (Co-requisite Remediation: When it’s likely to work).  My comments here are more related to the meaning of ‘work’ — what is the problem we are addressing, and what measures show the degree to which we are solving it?

One of the primary reasons we have today’s mess in college mathematics is … ourselves.  For decades, we presented an image of mathematics in which calculus was the default goal, as shown in the image at the start of this post.  This image is so flawed that external attacks had an easy time imposing a political solution to an educational problem:  avoidance.  Of course, courses not preparing students for calculus are generally ‘easier’ to complete.  That is not the problem.  The problem is our lack of understanding of the mathematical needs of students, beyond what we find in guided pathways.

To clarify, the ‘our’ in that last sentence refers to everybody engaged with both policy and curriculum.  As an example of the lack of understanding, consider the question of the mathematics needed for biology in general and for future medical doctors in particular.  We hear comments like “doctors don’t use calculus on the job”; I’m pretty confident that my doctor is not finding derivatives or integrating a rational function when she determines appropriate treatment options.  However, mastering modern biology depends on a conceptual understanding of rates of change along with decent modeling skills.  The problem is not that doctors don’t use calculus on the job; the problem is that our mathematics courses do not address their mathematical needs.
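
As one small, hedged illustration (my example, not a claim about any medical curriculum): the rate-of-change concept shows up in something as routine as first-order drug elimination, where understanding the model dC/dt = -kC matters far more than integrating it by hand.

```python
import numpy as np

# Toy first-order elimination model: dC/dt = -k*C, so C(t) = C0*exp(-k*t).
# The rate constant and starting concentration below are assumed values
# chosen purely for illustration.
k = 0.173          # elimination rate constant, per hour (assumed)
C0 = 100.0         # initial concentration, arbitrary units (assumed)

t = np.linspace(0.0, 24.0, 25)     # hours 0..24
C = C0 * np.exp(-k * t)            # closed-form solution of the model

half_life = np.log(2.0) / k        # about 4 hours for this assumed k
print(f"half-life: {half_life:.1f} hours")
print(f"concentration at 12 hours: {C[12]:.1f}")
```

The point is conceptual: a student headed for the health sciences needs to reason about half-life and rates of change, not reproduce an integration technique.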

So, back to the two data points mentioned earlier.  The second of these dealt with an observation that students who complete a college mathematics course in their first year are more likely to complete their program.  This type of data is often used as a second rationale in the political effort to impose co-requisite remediation … get more students into that college math course right away.  Of course, the reasoning confuses correlation with causation; do groups of similar students have better completion rates with a college math course right away compared to a delay?  Historically, students who are able to complete a college math course in the first year have one or more of these advantages:

  • Higher quality opportunities in their K-12 experience
  • Better family support
  • Lack of social risk factors
  • Presence of enabling economic factors

If these factors are involved, then we would expect a ‘math in first year’ effort to primarily mean that more students complete a college math course — not that more students complete their program.  This is, in fact, what the latest Tennessee data shows.  I have not been able to find much research on the question of whether ‘first year math’ produces the results I predict versus what the policy marketing teams suggest.  If you know of any research on this, please pass it along.
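
A toy simulation makes the confounding concern concrete.  All the probabilities below are assumptions invented for illustration, not estimates from any real dataset; the sketch simply shows that a shared advantage factor produces the observed correlation even when first-year math has no causal effect on completion.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent 'advantage' (K-12 quality, family support, economics) raises
# BOTH the chance of passing college math in year one AND the chance
# of completing a program.  Every probability here is an assumption.
advantage = rng.random(n) < 0.4
p_pass_math = np.where(advantage, 0.7, 0.3)
p_complete  = np.where(advantage, 0.6, 0.2)   # independent of math timing

passed_math_y1 = rng.random(n) < p_pass_math
completed      = rng.random(n) < p_complete

# Completion still looks strongly 'associated' with first-year math.
print("completion | passed math in year 1:", completed[passed_math_y1].mean())
print("completion | did not:              ", completed[~passed_math_y1].mean())
```

Under these assumed numbers the first group completes at roughly 44% and the second at roughly 29%, yet by construction pushing more students into first-year math would change nothing about program completion.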

We need to own our problems.  These problems include an antiquated curriculum in college mathematics combined with instructional practices which tend not to serve the majority of students.  Clubs with data engraved on the handle are poor tools for driving change; understanding based on long-term scientific theories provides a sustainable basis for progress which will serve all of our students.

Increasing Success in Developmental Mathematics: Do we Have Any Idea?

Back in 2007, I had a conversation with two other leaders in AMATYC (Rikki Blair and Rob Kimball) concerning what we should do to make major and quantum improvements in developmental mathematics.  We shared a belief that the existing developmental mathematics courses were a disservice to students, and would remain a disservice even if we achieved 100% pass rates in those courses.  That conversation … and a whole lot of hard work … eventually led to a convening of content experts in 2009 (Seattle), the launching of the AMATYC New Life Project, and the beginnings of Carnegie Statway™ & Quantway™ as well as the Dana Center Mathways.

Imagine my surprise, then, when the National Academies of Sciences, Engineering, and Medicine released a publication dealing with “Increasing Student Success in Developmental Mathematics” (see https://www.nap.edu/catalog/25547/increasing-student-success-in-developmental-mathematics-proceedings-of-a-workshop), based on a workshop this past winter.  It is true that the planning team for this workshop did not invite me; they know that I am close to retirement, and travel has become less comfortable, so my absence is actually fine with me.  No, the surprises deal with the premise and outcomes of the workshop.

One premise of the workshop included this phrase:

” … particular attention to students who are unsuccessful in developmental mathematics …”

This phrase reflects a very biased point of view — that there is something about the students themselves which causes them to be unsuccessful in some developmental mathematics (course? courses?).  This is the biggest surprise in the report, and not a good one: we continue to believe, and act as if, there are properties of students which should determine how we change our treatment.

If you read that last sentence carefully, you may be surprised that I am surprised.  After all, it is very logical that our instructional and/or curricular treatment should vary based on some collection of student characteristics.  Yes, it is logical.  Unfortunately, the statement is anything but reasonable.  Matching instructional methods or curricular treatments to student traits is an age-old problem in education.  Research on the efficacy of trait-treatment matching has a long and somewhat depressing history, going back at least to the early 1970s.  (http://www.ets.org/research/policy_research_reports/publications/report/1972/hrav)

Trait-treatment interactions can be managed well enough at micro-levels.  My classes, like most of ours, adjust for student differences.  The problems arise when we seek to match at higher levels of abstraction — whether it’s “learning styles” or “meta-majors”.  What we know about traits quickly becomes overwhelmed by the unknown and the unknowable.  A similar flaw becomes apparent when we try to impose a closed system (ALEKS, for example) on an organic component of a process (a student).  Sure, if I had a series of one-on-one interviews (perhaps 10 hours’ worth), I could match a student with a treatment very likely to work for them — given what was known at the time.  That knowledge might become horribly incomplete within days, depending on the nature of the changes the student is experiencing.

This is an optimistic ‘theoretical’ error rate:  are you willing to be one of the “30%”?

The report itself spends quite a bit of time describing and documenting ‘promising’ practices.  The most common of these involves some form of tracking — a student is in major A, or in meta-major α, so the math course is matched to the ‘needs’ of that major or meta-major.  I suspect that several people attending the workshop would disagree with my use of the word “tracking”; they might prefer “aligned with the major”.  I am surprised at the ease with which we allow this alignment/tracking to determine what is done in mathematics classes.  Actually, I should use the word “discouraged” — because in most cases the mathematics chosen for a major deals with the specific quantitative work a student needs, often focusing on a minimal standard.  Are we really this willing to surrender our discipline?  Does the phrase “mathematically educated” now mean “avoids failing a low standard”?

To the extent the report deals with specific ways to ‘increase’ success in mathematics, I would rate the report as a failure.  However, when the report deals with concepts underlying our future work, you will find valid statements.  One of my favorites is from Linda Braddy:

Braddy asserted that administrators and educators are guilty of “educational malpractice” if they do not stop offering outdated, ineffective systems of mathematics instruction.  [pg 56]

I suspect the authors would not agree with me on what we should stop doing.  I also don’t know how to interpret the comma in “outdated, ineffective” — does something need to meet both conditions in order to be malpractice?  Should we insert an “or” in the statement where a comma shows?  How about if we drop the word “instruction”?  Seems like we should also address the outdated mathematics in our courses.

Although the workshop never claimed to be inclusive, I am also disappointed that the AMATYC New Life Project never gets mentioned.  Not only did our Project produce as many implementations as (or more than) the specific efforts described in the report, the genetic material from the Project was used to begin two efforts which are mentioned.  The result is a report which supports the notion that AMATYC has done nothing to advance ‘success in developmental mathematics’ in the past 12 years.

Do we Confuse Good Pedagogy for Good Teaching?

Our professional organizations (both MAA and AMATYC) have published references related to good pedagogy within the last two years.  MAA had the Instructional Practices guide (https://www.maa.org/programs-and-communities/curriculum%20resources/instructional-practices-guide), and AMATYC has IMPACT (http://www.myamatyc.org/).  Lots of good ideas.  References to decent research.  What could be wrong?

Let me use an illustration from the other side of our ‘desk’.  When a student uses procedures without understanding them, we uniformly provide feedback that this is not sufficient.  When a student has some understanding of procedures but cannot see connections between topics … we tell them that the connections are important.  If a student gets those connections to a reasonable level but cannot transfer the knowledge to a slightly different context, we tell them that their learning is not good enough.

However, we tolerate — or even encourage — corresponding misuses of teaching pedagogy.  We see a pedagogy at a conference and ask questions about what to do in the process, but rarely ask why it seems to work.  Very seldom do we even reach the low standard of a minimal understanding of the procedures.  Rarely do any of us reach the expert level of knowing how to transfer our understanding to a new situation.

Now, it’s true that ‘good pedagogy’ (like good procedures) creates some correct answers … ‘learning’, even if performed without much understanding.  However, the same can be said for some ‘bad pedagogy’; certainly, bad stuff has worked reasonably well for me (though I try not to do that stuff anymore).  How do we even identify a method as “good” for teaching?

Sadly, we seem to have only two standards we apply to the process of identifying good pedagogy:

  1. It’s good if the method feels right to us.
  2. It’s good if somebody has seen good results with it (either better grades or some ‘research’).

Of course, our students do a lot of bad learning by the first standard (such as their ideas about fractions or percents).  Students don’t generally use the second standard, and the second standard is actually not a bad thing when the results come from solid research comparing two or more treatments with somewhat equivalent students.  I don’t expect us to demand gold-standard research prior to using a pedagogy; I do expect us to do a better job of judging the elements of a pedagogy based on understanding the process and on the validation of those elements in research over time.
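
For concreteness, here is a sketch of the minimal comparison I have in mind: simulated scores for two pedagogies with randomized, roughly equivalent groups, summarized with an effect size.  The group means, spread, and sizes are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated end-of-course scores under two pedagogies, with students
# randomly assigned so the groups are roughly equivalent.  All numbers
# below are assumptions for illustration only.
scores_a = rng.normal(loc=72.0, scale=10.0, size=200)
scores_b = rng.normal(loc=75.0, scale=10.0, size=200)

# Cohen's d: mean difference relative to the pooled standard deviation.
pooled_sd = np.sqrt((scores_a.var(ddof=1) + scores_b.var(ddof=1)) / 2.0)
d = (scores_b.mean() - scores_a.mean()) / pooled_sd
print(f"effect size (Cohen's d): {d:.2f}")   # around 0.3: real but modest
```

A single comparison like this is still thin evidence; the point is that “somebody has seen good results” should at least mean a comparison of treatments, an effect size, and replication over time.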

We also fall into the trap of saying that diverse teaching methods are good.  Now, it is reasonable to assume that a given pedagogy might be well matched to a certain situation; we might even believe that a pedagogy is especially suited to a given mathematical topic (though this is difficult to justify with research).

What should we do differently?  My advice is to keep the classroom pedagogy simple from the student point of view.  I’ve seen teachers use multiple complicated methods over a semester, which requires students to spend effort learning our methods and necessarily reduces their learning of mathematics.  [Students have a finite supply of ‘learning energy’.]

My teaching methods are very simple.  Every day (besides test days) is team-based, with two activities for learning (at the start and end of class).  We don’t have assigned roles, and we don’t create artifacts to share with the entire class.  There is only one criterion for measuring the value of our teaching methods:

  • Every student learns the most possible mathematics with the highest level of rigor possible every day.

Making this simple method work depends upon my understanding of learning processes as they relate to each topic and concept we explore.  I have studied the learning process for my entire professional career (it’s what my graduate work was in), and what has been shown by research supported by theory is:

The amount and quality of learning are functions of the intellectual interaction of the learner with the material to be learned.

In other words, maximize a quality interaction for each student in order to impact their learning every day.  The learning needs vary with the individual, so the pedagogy must provide a structure for my intervention (based on instant interviews) during class.  My assessment of my methods involves global and individual progress in learning mathematics (including how much rigor is achieved).

One of my students commented last semester: “We could not help but learn.”  I have had more dramatic comments (usually good 🙂 ).  However, this ‘we … learn’ comment is the most valued comment I have received.

My concern involves the frequent copying of teaching methods (often based on the ‘seems right’ criterion).  If you don’t understand how a method works … you don’t understand whom you will harm with it.  Although we don’t take a professional oath about this, seeing ourselves as a profession suggests a ‘do no harm’ standard of practice.  Any specific pedagogy has the capacity to harm students; some pedagogies have a decent chance of helping them.

We should not settle for “it works for most students”.  We certainly should not settle for “this generally works, but I do not understand how it works”.  Our lack of understanding will cause harm to students.  Being an expert means that we see simpler solutions that produce broad benefits; using complicated ‘solutions’ means that we don’t understand the problem.

Resist the temptation to copy ‘methods that work’.  Copying methods is not productive for our students; it’s harmful to students if we copy methods without understanding the processes involved.  Your best bet is to keep it simple and interact with students constantly.