Assessment: Is this “what is wrong” with math education?

I have been thinking about a problem, one seen in too many of our students … after passing a test (or a course), their proficiency is still low and their understanding fragile.  Even accepting that not all students achieve high levels of learning, the results are disappointing to us and sometimes tragic for students.

Few concepts are more basic to mathematics than ‘order of operations’, so we “cover” this topic in all developmental math classes … just like it’s covered in most K-12 math classes.  In spite of this, college students fail items such as:

  • Simplify    12 – 9(2 – 7) ÷ (5)(3)
  • Write  3x²y without exponents
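To see why the first item trips students up, here is a sketch (in Python, purely as an illustration of the two common readings, not something from the original exercise) of how the same expression splits into two answers depending on whether students treat the juxtaposed (5)(3) as a grouped divisor:

```python
# The item 12 - 9(2 - 7) ÷ (5)(3), read strictly left to right,
# with multiplication and division at equal precedence:
left_to_right = 12 - 9 * (2 - 7) / (5) * (3)

# The reading a rigid "MD before ..." mnemonic often produces,
# where (5)(3) is grouped into a single divisor because the
# multiplication is implied by juxtaposition:
grouped_divisor = 12 - 9 * (2 - 7) / ((5) * (3))

print(left_to_right)    # 39.0
print(grouped_divisor)  # 15.0
```

The two readings disagree, which is exactly the kind of distinction a rote mnemonic does not equip students to notice.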

I could blame these difficulties on the inaccurate crutch called “PEMDAS”, and it’s true that somebody’s aunt sally is causing problems.  I might explore that angle (again).

However, I think the basic fault is fundamental to math education at all levels.  This fault deals with the purpose of assessment.  Our courses are driven by outcomes and measurable objectives.  What does it mean to “correctly use exponential notation”?  Does such an outcome have an implication of “know when this does not apply?”  Or, are we only interested in completion of tasks following explicit directions with no need for analysis?

Some of my colleagues consider the order of operations question above to be ‘tricky’, due to the parentheses showing a product.  Some of my colleagues also do not like multiple choice questions.  However, I think we often miss the greatest opportunities in our math classes.

Students completing a math course successfully should have fundamentally different capabilities than they had at the start.

In other words, if all we do is add a bunch of specific skills, we have failed.  Students completing mathematics are going to be asked to either apply that knowledge to novel situations OR show conceptual reasoning.  [This will happen in further college courses and/or on most jobs above minimum wage.]  The vast majority of mathematical needs are not just procedural; rather, they involve deeper understanding and reasoning.

Our assessments often do not reach for any discrimination among levels of knowledge.  We have a series of problems on ‘solving’ equations … all of which can be solved with the same basic three moves (often in the same order).  Do we ask students ‘which of these problems can be solved by the addition or multiplication properties of equality?’  Do we ask students to ‘write an equation that cannot be solved just by adding, subtracting, multiplying or dividing?’
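The contrast behind those two questions can be sketched concretely (a hypothetical illustration in Python; the particular equations are mine, not from this post):

```python
import math

# A linear equation yields to the addition and multiplication
# properties of equality alone:
#   3x + 5 = 20
x = (20 - 5) / 3          # subtract 5 from both sides, then divide by 3
assert 3 * x + 5 == 20    # x = 5.0

# But x**2 = 5 does not: no sequence of adding, subtracting,
# multiplying, or dividing both sides isolates x; an inverse
# operation outside those four (a square root) is required.
y = math.sqrt(5)
assert abs(y**2 - 5) < 1e-9
```

A student who can articulate that difference knows something beyond executing the three moves.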

For order of operations, we miss opportunities by not asking:

Identify at least two DIFFERENT ways to do this problem that will all result in the same (correct) answer.
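For instance (a hypothetical classroom illustration, sketched here in Python), a student who understands the distributive property can evaluate a piece like 9(2 − 7) by at least two routes that agree:

```python
# Route 1: evaluate inside the parentheses first
inside_first = 9 * (2 - 7)      # 9 * (-5) = -45

# Route 2: distribute, then subtract
distributed = 9 * 2 - 9 * 7     # 18 - 63 = -45

print(inside_first == distributed)  # True
```

Recognizing that both routes are legitimate, and why they must agree, is the kind of analysis the prompt asks for.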

When I teach beginning algebra, the first important thing I say is this:

Algebra is all about meaning and choices.

If all students can do is follow directions, we should not be surprised when their learning is weak or when they struggle in the next course.  When our courses are primarily densely packed sequences of topics requiring a rush to finish, students gain little of value … those procedures they ‘learn’ [sic] during the course have little to no staying power, and are not generally important anyway.

The solution to these problems is a basic change in assessment practices.  Analysis and communication, at a level appropriate for the course outcomes, should be a significant part of assessment.  My own assessments are not good enough yet for the courses I am generally teaching; the ‘rush to complete’ is a challenge.

Which is better:  100 objectives learned at a rote level OR 60 objectives learned at some level of analysis?

This is a big challenge.  The Common Core math standards describe a K-12 experience that will always be a rush to complete; the best performing students will be fine (as always) … others, not so much.  Our college courses (especially developmental) are so focused on ‘procedural’ topics that we generally fail to assess properly.  We often avoid strong types of assessment items (such as well-crafted multiple-choice items, or matching) with the false belief that correct steps show understanding.

We need conversations about which capabilities are most important for course levels, followed by a re-focusing of the courses with deep assessment.  The New Life courses (Math Literacy, Algebraic Literacy) were developed with these ideas in mind … so they might form a good starting point at the developmental level.  The risk with these courses is that we might not emphasize procedures enough; we need both understanding and procedures as core goals of any math class.

Students should be fundamentally different at the end of the course.

 Join Dev Math Revival on Facebook:

Trump Method: Complete College America

Whatever your political persuasion, I hope this comparison makes sense to you.  Most politicians use facts selectively, and it’s normal for candidates to repeat ‘information’ that fails the fact-checking process.  Mr. Trump is just a bit more extreme in his use of these strategies.  I’m actually not saying anything against “the Donald”.

However, the Trump Method is being employed by the folks at Complete College America (CCA).  The CCA is a change agent, advocating for a select set of ‘game changers’ … which rest on a foregone conclusion about whether remedial education is a useful construct.  The CCA repeats the same information that does not pass the fact-checking process, much to the detriment of developmental education and community colleges in general.

It’s not that professionals in the field believe that our traditional curriculum and methods are anywhere near what they should be.  I’ve talked with hundreds of teaching faculty over the past ten years, relative to various constructs and methods to use; though we differ on eventual solutions and how to get there, we have a strong consensus that basic changes are needed in remedial mathematics.

However, the CCA brings its anvil and hammer communication … promising simple solutions to complicated problems (just like Mr. Trump).  The recent email newsletter has this headline:

Stuck at Square One
College Students Increasingly Caught in Remedial Education Trap
[http://www.apmreports.org/story/2016/08/18/remedial-education-trap?utm_campaign=APM%20Reports%2020160902%20Weekend%20Listening&utm_medium=email&utm_source=Eloqua&utm_content=Weekend%20listening%3A%20New%20education%20documentaries]

Following up on this headline leads one to a profession-bashing ‘documentary’ about how bad things are.  Did you notice the word “increasingly”?  Things getting worse clearly calls for change … if only there were evidence of things getting worse.  Not only are the facts cited in the documentary old (some from 2004), there is no discussion of any change in the results.

Like “immigrants” for Mr. Trump, remedial education is a bad thing in the view of the CCA.  Since remedial education cannot be deported or locked up, the only option is to get rid of it.  The headline says that we ‘trap’ students in our remedial courses, as if we had criminal intent to limit students.  No evidence is presented that the outcomes are a ‘trap’; the word ‘trap’ is more negative than ‘limitations’ or ‘inefficient’ … never mind the lack of accuracy.

Some people have theorized that Mr. Trump appeals to less educated voters.  Who does the CCA material appeal to?  Their intended audience is not ‘us’ … it’s policy makers and state leaders.  These policy makers and state leaders are generally neither ignorant nor mean-spirited.  However, the CCA has succeeded in creating an atmosphere of panic relative to remedial education.  Because of the long-term repetition of simplistic conclusions (lacking research evidence), we have this situation at state-level groups and college campuses:

  • Remedial education is a failure, because the CCA has data [sic].
  • Everybody is working on basic changes, and getting rid of stand-alone remediation.
  • We better get on the bandwagon, or risk looking like we don’t care (‘unpatriotic’).

This is why the CCA work is so harmful to community colleges.  Instead of academia and local needs driving changes, we have a ‘one size fits all’ mania sweeping the country.  Was this the intent of the CCA?  I doubt it; I think their intent was to destroy remediation as it’s been practiced in this country.  Under the right conditions, I could even work with the CCA on this goal: if ‘destroy’ involved a reasoned examination of all alternatives within the framework present at individual community colleges, with transparent use of data on results.

Sadly, the debate … the academic process for creating long-lasting change … has been usurped by the Trump Method of the CCA.  I can only hope that our policy makers and college leaders will rediscover proven methods of change; at that point, all of us can work together to create changes that both serve our students and have the stability to remain in place after the CCA is long gone.


Scaling Mathematical Literacy Courses

My college (Lansing CC) has implemented a second version of Math Literacy, which allowed us to drop our pre-algebra course.  I posted previously on the version without a prerequisite (see https://www.devmathrevival.net/?p=2516).

Here is a summary of where we ended up in the first semester of having both Math Lit courses.

  • Math105 (Math Lit, with math prereq) has 8 sections with about 165 students.
  • Math106 (Math Lit with Review, no math prereq) has 9 sections with about 225 students.

With these 17 sections of Mathematical Literacy, we have quite a few instructors teaching the course for the first time.  Most of these instructors have been involved with the development of the course and its policies, where we discussed text coverage and technology expectations for students.

As part of our collaboration, we are holding bi-weekly meetings with as many of the instructors as can manage the ‘best time’.  The leading issue so far has been the textbook purchase; we’re helping as best we can with that, but whether students buy the textbook is outside our control.

We are talking about learning and teaching issues.  For example … how to balance an emphasis on concepts to enable reasoning with an emphasis on procedures so that students can actually ‘do something’ with the math (like have an answer to communicate).  We are talking about which small-group structures seem to work well in this course.

Our approach to scaling up Math Literacy is based on a long-term professional development approach.  Our bi-weekly meetings will continue as long as there seems to be a need (one semester, one year, or longer).  We are looking into setting up a shared collaboration space for the instructors, which will enable those not able to attend to be involved.

In our structure, students who do not place at beginning algebra (or higher) are required to start with Math Literacy; those at the beginning algebra level have the option to take the Math Literacy course.  After the Math Literacy course, students have 3 options:

  1. Take our Quantitative Reasoning course (Math119) … required for most health careers
  2. Take our Intro Statistics course (Stat170) … required for some other programs
  3. Take our Fast-Track Algebra (Math109) which allows progression to pre-calculus

Unlike some implementations, our vision of Math Literacy includes all students … even “STEM-bound”.  Faculty teaching our STEM math courses are pleased with the strong reasoning component of Math Literacy.  We will be collecting data on how the various progressions work for students, and can implement needed adjustments to make improvements.

For those near Michigan, we will be making a presentation at our affiliate (MichMATYC) conference next month (Oct 15 at Delta College).  See http://websites.delta.edu/math/michmatyc2016/ for details.

 

Alignment of Remediation with Student Programs

My college is one of the institutions in the AACC Pathways Project; we’ve got a meeting coming up, for which we were directed to read some documents … including the famous (or infamous) “Core Principles” for remediation.  [See http://www.core-principles.org/uploads/2/6/4/5/26458024/core_principles_nov4.pdf]  In that list of Core Principles, this is #4:

Students for whom the default college-level course placement is not appropriate, even with additional mandatory support, are enrolled in rigorous, streamlined remediation options that align with the knowledge and skills required for success in gateway courses in their academic or career area of interest.

What does that word “align” mean?  It seems to be a key focus of this principle … and the principle also implies that colleges are failing if they cannot implement co-requisite remediation.  In earlier posts, I have shared data suggesting that stand-alone remediation can be effective; the issue is length-of-sequence, meaning that we cannot justify a sequence of 3 or 4 developmental courses (up to and including intermediate algebra).

In general, to “align” simply means to put items in their proper position.  The ‘align’ in the Core Principles must mean something more than that … ‘proper position’ does not add any meaning to the statement.  [It already said ‘streamlined’ and later says ‘required for success’.]  What do they really mean by ‘align’?

In the supporting narrative, the document actually talks more about co-requisite remediation than alignment.  That does not help us understand what was intended.

The policy makers and leaders I’ve heard on this issue often use this type of statement about aligning remediation:

The remediation covers skills and applications like those the student will encounter in their required math course.

In other words, what ‘align’ means is “restricted” … restricted to those mathematical concepts or procedures that the student will directly use in the required math course.  The result is that the remedial math course will consist of the same stuff included in the mandatory support course in the co-requisite model.  The authors, then, are saying that we need to do co-requisite remediation … or co-requisite remediation; the only option is concurrent versus preceding.

If the only quantitative needs a student faced were restricted to the required math course, this might be reasonable.

I again find a basic flaw in this use of co-requisite remediation in two flavors (concurrent, sequential).  We fail to serve our fundamental charge: to prepare students for success in their PROGRAM … not just one math course.  As long as a student’s program requires any quantitative work in courses such as these, the ‘aligned’ remediation will fail to serve student needs:

  • Chemistry
  • Physiology
  • Economics
  • Political science
  • Psychology
  • Basic Physics

Dozens of non-math courses on each campus have strong quantitative components.  Should we avoid remedial math courses just to get students through one required math course … and cause them to face unnecessary challenges in several other courses in their program?

In some rare cases, the required math course actually covers most of the quantitative knowledge a student needs for their program.  However, in my experience, the required math course only partially provides that background … or has absolutely no connection to those needs.

Whom does remediation serve?  Policy makers … or students?

