Category: Research connected to practice

Does the HS GPA Mean Anything?

In the context of succeeding in college-level mathematics, does the HS GPA have any meaning?  Specifically, does a high school grade point average over some arbitrary value (such as 3.2) indicate that a given student can succeed in college algebra or quantitative reasoning with appropriate support?

Statistically, the HS GPA should not exist.  The reason is that the original measures (semester grades on a scale from 0 to 4 with 0.5 increments) are ordinal: higher values indicate better performance, and nothing more.  A mean depends upon a presumption of “interval measures” — that the difference between 0 and 1 is the same as the difference between 3 and 4.  For example, averaging treats a transcript of one D and one A (GPA 2.5) as equivalent to one C and one B (also GPA 2.5), which is justified only if each grade step represents an equal amount of achievement.  The existence of the GPA (whether HS or college) is based on convenient ignorance of statistics.

Given the long-standing existence of the HS GPA, one cannot hope for leaders to suddenly recognize the basic error.  Therefore, let’s assume that the HS GPA is a statistically valid measure of SOMETHING.  What is that something?  Is there a connection between that something and readiness for college mathematics?

The structure of the data used for the HS GPA varies considerably by region and state.  In some areas, the HS GPA is the mean of 48 values … 6 courses at 2 semesters per year for 4 years.  If the school schedule allows for 7 classes, then there are 56 values; that type of variation is probably not very significant for our discussion.  The meaning of the HS GPA is more impacted by the nature of the 6 (or 7) classes each semester.  How many of these courses are mathematical in nature?  In most situations, at the current time, we might see 8 of the 48 grades coming from mathematics courses, with another 4 to 8 coming from science courses.  Although most students take “algebra II” in high school, a smaller portion takes a mathematically intense science course (such as physics).

In other words, we have a measure with approximately a 20% basis in mathematics alone.  The other 80% represent English, social science, foreign language, technology, and various electives.  Would we expect this “20% weighting” to produce useful connections between HS GPA and readiness for college mathematics?  If these connections exist, we should see meaningful relationships between HS GPA and accepted measures of readiness.

So, I have spent some time looking at our local data.  We have only been collecting HS GPA data for a short time (less than one year), but even this limited data can be compared to other measures.  Here are the correlation coefficients for the sample (n > 600 for all combinations):

  • HS GPA with SAT Math: r = 0.377
  • HS GPA with Accuplacer College Level Math: r = 0.164
  • HS GPA with Accuplacer Algebra: r = 0.338

Compare this with the correlations of the math measures:

  • SAT Math with Accuplacer College Level Math: r = 0.560
  • SAT Math with Accuplacer Algebra: r = 0.627
  • Accuplacer College Level Math with Accuplacer Algebra: r = 0.526
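
For readers who want to run the same comparison on their own data, here is a minimal sketch in Python (the file name and column names are hypothetical stand-ins, and pandas is assumed to be available):

    import pandas as pd

    # Load local placement data; the file and column names here are
    # hypothetical stand-ins for whatever your institution collects.
    df = pd.read_csv("placement_data.csv")

    measures = ["hs_gpa", "sat_math", "accuplacer_clm", "accuplacer_alg"]

    # Pairwise Pearson correlation coefficients, as reported above.
    print(df[measures].corr(method="pearson"))

A single call to .corr() produces every pairwise coefficient, so both lists above come from one small table.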

Of course, correlation coefficients are crude measures of association; a scatterplot shows whether the measures have a useful association in practice.  Here is a scatterplot of SAT Math by HS GPA:

[Scatterplot: SAT Math (vertical axis) by HS GPA (horizontal axis)]
The horizontal lines represent our cut scores for college level mathematics (550 for college algebra, 520 for quantitative reasoning). As you can see from this graph, the HS GPA is a very poor predictor of SAT Math.  We have, of course, examined the validity of the traditional measures of readiness for our college math courses.  The overall ranking, starting with the most valid, is:

  1. Accuplacer Algebra
  2. Accuplacer College Level Math
  3. SAT Math

The order of the first two differs depending on whether the context is college algebra or quantitative reasoning.  In all cases, these measures show consistent validity for predicting student success.

Here is a display of related data, this time relative to ACT Math and HS GPA.  The curves represent the probability of passing college algebra for given ACT Math scores.

[Graph: probability of passing college algebra as a function of HS GPA, one curve per ACT Math score]
[Source:  http://www.act.org/content/act/en/research/act-scores-and-high-school-gpa-for-predicting-success-at-community-colleges.html ]

For math, this graph is saying that basing a student’s chance of success just on the HS GPA is a very risky proposition.  Even at the extreme (a 4.0 HS GPA), the probability of passing college algebra ranges from about 20% to about 80%, depending on the ACT Math score.  The ACT Math score, by itself, is a better predictor.  The data suggests, in fact, that the use of the HS GPA should be limited to predicting who is not going to pass college algebra in spite of their ACT Math score … an ACT Math score of 25 with a HS GPA below 3.0 means “needs more support”.
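
The ACT page does not spell out how those curves were fit; curves of this kind are typically produced by logistic regression on (HS GPA, ACT Math) pairs.  Here is a minimal sketch with simulated stand-in data (none of these numbers come from the ACT study; scikit-learn is assumed to be available):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Simulated stand-in data: one row per student.
    rng = np.random.default_rng(0)
    hs_gpa = rng.uniform(2.0, 4.0, 500)
    act_math = rng.integers(15, 33, 500).astype(float)
    # Fabricated pass/fail outcomes, only so the sketch runs end to end.
    logit = -14 + 1.0 * hs_gpa + 0.4 * act_math
    passed = rng.random(500) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression().fit(np.column_stack([hs_gpa, act_math]), passed)

    # One curve per ACT Math score: predicted pass probability across HS GPA.
    gpas = np.linspace(2.0, 4.0, 9)
    for act in (16, 20, 24, 28):
        grid = np.column_stack([gpas, np.full_like(gpas, act)])
        print(act, model.predict_proba(grid)[:, 1].round(2))

With real institutional data, this is one way to check whether the wide spread at a 4.0 HS GPA appears locally as well.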

So, back to the basic question: What does the HS GPA mean? Well, if one ignores the statistical violation, the HS GPA has considerable meaning — just not for mathematics.  The HS GPA has long been used as the primary predictor of “first year success in college” (often measured by first-year GPA, another mistake).  Clearly, there is an element of “academic maturity, or lack thereof” in the HS GPA measure.  A HS GPA below 3.0 seems to indicate insufficient academic maturity to succeed in a traditional college algebra course (see the graph above).

We know that mathematics forms a minor portion of the HS GPA for most students.  Although a small portion of students might have 50% of their HS GPA based on mathematically intense courses, the mode is probably closer to 20%.  Therefore, it is not surprising that the HS GPA is not a strong indicator of readiness for a given college mathematics course.

My college has recently implemented a policy allowing students with a HS GPA of 2.6 or higher to enroll in our quantitative reasoning course, regardless of any math measures.  The first semester of data indicates that there may be a problem with this … about a third of these students passed the course, compared to the overall pass rate of about 75%.

I suggest that the meaning of the HS GPA is that the measure can identify students at risk, who perhaps should not be placed in college level math courses even if their test scores qualify them. In some cases, “co-requisite remediation” might be appropriate; in others, stand-alone developmental mathematics courses are more appropriate.  My conjecture is that this scheme would support student success (a code sketch of the rule follows the list):

  • Qualifying test score with HS GPA ≥ 3.0, place into college mathematics
  • Qualifying test score with 2.6 ≤ HS GPA < 3.0, place into co-requisite if available, developmental if not
  • Qualifying test score with HS GPA < 2.6, place into developmental math
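
To make the conjecture concrete, here is a minimal sketch of that decision rule (my illustration, not an implemented policy; I also assume, since the list does not say, that a non-qualifying test score places the student in developmental math):

    def place_student(test_qualifies: bool, hs_gpa: float, coreq_available: bool) -> str:
        # Conjectured placement rule from the list above.  A non-qualifying
        # test score is assumed to mean developmental math, as is current practice.
        if not test_qualifies or hs_gpa < 2.6:
            return "developmental mathematics"
        if hs_gpa >= 3.0:
            return "college mathematics"
        # Remaining case: 2.6 <= HS GPA < 3.0
        return "co-requisite" if coreq_available else "developmental mathematics"

    print(place_student(True, 3.4, True))   # college mathematics
    print(place_student(True, 2.8, False))  # developmental mathematics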

This, of course, is not what the “policy influencers” want to hear (i.e., Complete College America and related groups).  They suggest that we can solve a problem by both ignoring prior learning of mathematics and applying bad statistics.  Our responsibility, as professionals, is to articulate a clear assessment based on evidence to support the success of our students in their college program.

 

Learning, Success and Mathematics: 100%??

A few years ago, the chief academic officer (aka “Provost”) at my institution proposed that we adopt an “Operation 100%”, which involved committing ourselves to every student passing each course and to every student completing their program of study.  Faculty reaction was more negative than positive, especially about a goal of a 100% success rate in every course.

Eventually, the 100% success rate was dropped and the 100% program completion goal was kept.  This was driven, in large part, by the faculty reaction; we correctly pointed out that the 100% success rate was not a reasonable goal, especially in a community college setting.  Although it was a relief to not have the 100% success rate goal, I have to admit … we should have taken the challenge.

In most cases, we design our courses with the assumption that a significant proportion of students will not succeed.  More specifically, courses are designed on the premise that some students will be unable to learn the material in the allowed time frame.  Sometimes, we say “this group of students was not ready for my course” or “that group of students has trouble understanding, so they try to memorize”.  We tell ourselves that many of our students have challenges in their lives which make success in a course unlikely.

And, in terms of data, each of those statements can be shown to be ‘true’.  However, that simply proves a point of view which justifies the acceptance of low pass rates as ‘normal’.  Another point of view, equally justified by data, is that faculty do not know how to help students learn and succeed when a student actually needs that help.

So, let me frame the issue more scientifically:

Instead of designing a course assuming that some students won’t learn or pass, we should consider designing our courses so that we help all students learn and succeed.

You are probably thinking that this is exactly what we do right now.  Read the statement again … it does not say that we “try to help students learn and succeed”; it says “we help all students learn and succeed”.  Of course, not all students will succeed … not all students will learn.  However, 100% success (and learning) should be our fundamental design principle.  Accepting failure, and taking ‘lack of learning’ as a given, is an exceptionally weak design goal.

Imagine a surgeon who says “Well, I know some patients will not survive heart surgery so I am not going to stress myself out with worries about whether this patient survives.”

What does “design for learning and success” look like?  I am working on a complete answer to that question.  In the meantime, here are some implications I see in “100% success” as a goal:

  • Every class is an opportunity to help every student learn more mathematics
  • Every student knows some mathematics, though some ‘knowledge’ is faulty
  • Readiness to learn a topic is part of the class where we ‘teach’ the topic
  • Every student is active, all of the time: engaging with work sequenced to proceed from readiness to learning to knowing

I’m using this design for learning and success in algebra courses.  If a class is primarily about learning to solve quadratic equations using square roots, the initiating activity makes sure that every student reviews basic concepts of radicals and the symmetry of square roots.  Teams of 4 or 5 are used, so that every student has a reasonable opportunity to contribute and participate in the process.  “Faulty knowledge” is caught by team members, or the instructor, or both — starting with the prerequisite knowledge.  The activity proceeds to explore the primary concept in the new material, often starting with a simple example to solve followed by a ‘cloze’ type statement (fill in blanks) to complete a summary of the concept.  Next, the activity involves the application of this concept to a more complex situation.
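
For instance (my example, not the actual classroom activity), that progression might move from the simple case to a more complex application:

    \[ x^2 = 25 \;\Rightarrow\; x = \pm\sqrt{25} = \pm 5 \]
    \[ (x - 3)^2 = 25 \;\Rightarrow\; x - 3 = \pm 5 \;\Rightarrow\; x = 8 \ \text{or}\ x = -2 \]

The \pm is exactly where the symmetry of square roots enters, and it is a step that faulty prior knowledge often omits.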

I have been engaged with the profession for a long time.  As you probably know, the fundamental ingredient for student success is MOTIVATION … it’s hard for a student to learn if they are not motivated to attend class.  Some of us use tricks to improve motivation — we have students play games in class, or we find some application using mathematics in a context that the student might care about.

What I am observing is that students find this intentional design innately motivating — especially the struggling student.  For example, in one class this semester I have 8 students with mathematical challenges significant enough that I might normally ‘expect’ them to fail.  In fact, prior to my current design, they all would have failed. [These challenges were obvious within the first week.]  However, all 8 of the students continued to attend class; they found it motivating that every class was designed so that they would learn some mathematics.  Initially, they did not learn enough mathematics … partly because these 8 students had a larger amount of faulty knowledge.  Two of these students eventually stopped attending class when it became clear to them that their test scores were too low for them to pass the course.  The other 6 are successful; none of these 6 struggling but succeeding students will receive a 4.0 grade, but they are not all headed towards ‘barely passing’ either.

Do you want your students to succeed?  Well, you had better start by designing a course which provides 100% of the students an opportunity to learn every day.  We cannot afford to leave learning to unknown or random processes.  Some patients do not survive … some students will not succeed.  We should plan — and design — our classes for what we want to see, rather than for what would happen without effective intervention on our part.

So, I am all in on “100% success in our courses”.  I realize that some readers are in states where they are subject to some arbitrary minimum pass rate within their courses.  That is not what I am talking about — I am talking about designing courses so that every student learns and can succeed.  The last thing we need is some uninformed, arbitrary ‘standard’ being inflicted on us and our students; that cannot help but cause harm to learning and to students.  We should focus on what we care about … learning mathematics, for every student.

If you want to base your career on failure being normal, go into politics.  Education should be based on learning and success as the goal for everybody in our classes.

 

Using Data (AMATYC 2017)

For the session “Using Data to Improve Curriculum” (Nov 10), here is the ‘stuff’:

Presentation slides (all):  Using Data to Improve Curriculum

The Handout (shorter): Using Data for Curriculum AMATYC 2017 S116


Transitioning Learners to Calculus in Community Colleges (TLC3)

You might have heard of the MAA project “National Study of College Calculus”  (see http://www.maa.org/programs/faculty-and-departments/curriculum-development-resources/national-studies-college-calculus ).  That work was very broad, as it studied calculus in all 3 settings (high school, community colleges, and universities).

A recent effort is focused on community colleges, with the title “Transitioning Learners to Calculus in Community Colleges” (info at http://occrl.illinois.edu/tlc3).  Take a look at their web site!

One component of their research is an extensive survey being completed by administrators of mathematics at associate-degree-granting public community colleges, including the collection of outcomes data.  A focus is on “underrepresented minorities” (URM), which relates closely to a number of recent posts here (on equity in college mathematics).

I am expecting that the TLC3 data will show that very few community colleges are successful in getting significant numbers of “URM” students through calculus II (the target of this project).  The ‘outliers’, especially community colleges succeeding with numbers proportional to the local population of URM, will provide us with some ideas about what needs to change.

Further, I think the recent emphasis on ‘pathways’ has actually decreased our effectiveness at getting URM students through calculus.  The reasoning (based on available data): minorities tend to come from under-performing K-12 systems, which results in larger portions being placed in developmental mathematics; the focus on pathways and ‘completion’ then results in more URM students being tracked into statistics or quantitative reasoning (QR) pathways — which do not prepare them for the calculus path.  [Note that the basic “New Life” curricular vision does not ‘track’ students; Math Literacy is part of the ‘STEM’ path. See https://www.devmathrevival.net/?page_id=8 ]

Some readers will respond with this thought:

Don’t you realize that the vast majority of students never intend to study calculus?

Of course I understand that; something like 80% of our remedial math students never even intend to take pre-calculus.  Nobody seems to worry about the implication of these trends.

Students are choosing (with encouragement from colleges) programs with lower probabilities of upward mobility.

The most common ‘major’ at my college is the “general associates” degree.  Some of these students will transfer in order to work on a bachelor degree; most will not.  Most of the other common majors are health careers (a somewhat better choice) and a mix of business along with human services.  Upward mobility works when students get the education required for occupations with (1) predicted needs and (2) reasonable income levels.  Take a look at lists of jobs (such as the US News list at http://money.usnews.com/careers/best-jobs/rankings/the-100-best-jobs).  I do not expect 100% of our students to select a program requiring calculus, nor even 50%; I think the current rate (<20%) is artificially low … 30% to 40% would better reflect the occupational needs and opportunities.

Our colleges will not be successful in supporting our communities until URM students select programs for these jobs and then complete the programs (where URM students select and complete at the same rates as ‘majority’ students).  Quite a few of these ‘hot jobs’ require some calculus.  [Though I note that many of these programs are oriented towards the biological sciences, not the engineering that often drives the traditional calculus curriculum.]

I hope the TLC3 project produces some useful results; in other words, I hope that we pay attention to their results and take responsibility for correcting the inequities that may be highlighted.  We need to work with our colleges so that all societal groups select and achieve equally lofty academic goals.

