
At least half of community college students enroll in developmental education, also referred to as remediation or dev-ed (Radford & Horn, 2012; Scott-Clayton, Crosta, & Belfield, 2014). Dev-ed courses aim to help academically underprepared students reach college-readiness standards and often must be completed prior to coursework that contributes toward degree requirements. The road to college-level coursework is particularly problematic in mathematics, where students are more likely to require remediation than in reading and writing—59% of community college students are referred to dev-ed courses in math (Bailey, Jeong, & Cho, 2010). Of the students requiring remediation in math, one third complete dev-ed coursework within 3 years (Bailey et al., 2010). Even fewer—20%—make it through their first college-level math course, also called a gateway course (Bailey et al., 2010).

Across the country, states and community colleges are working to improve dev-ed. Reforms include incorporating technology in the classroom, requiring corequisite success courses to cover study skills, offering tutoring resources and learning communities, accelerating dev-ed coursework, and/or placing students into college-level courses with additional supports (Bailey, 2009; Bonham & Boylan, 2011; Edgecombe, 2011; Hodara, 2011). Despite increased experimentation and newly implemented dev-ed reforms across the country, there is limited evidence regarding their effectiveness. Administrators and policy makers need more information about the impacts of programs as they make difficult choices to improve outcomes for students and the community. This study examined the impact of an accelerated dev-ed pathway among community college students in Texas.

Trends in Texas reflect those across the nation in terms of placement into dev-ed. Half of all first-time college students at Texas public 2-year institutions fail to meet college readiness standards for mathematics (Texas Higher Education Coordinating Board [THECB], 2016). Seeking stronger outcomes for students, 20 community colleges in the state implemented Dana Center Mathematics Pathways (DCMP)1 in fall 2014. DCMP is a broad model that aims to help students attain math skills applicable to their areas of interest—rather than focusing solely on algebra—and improve their progress toward a degree; it begins with dev-ed that is accelerated and includes revised content and support for students (Dana Center, 2013). We used state administrative data and propensity score matching (PSM) to compare students enrolled in DCMP's dev-ed pathway with those enrolled in traditional dev-ed math sequences, examining the impact of DCMP on college outcomes like persistence, enrollment and success in college math, and degree-bearing credit accumulation. In the semester after enrolling in DCMP, students showed greater momentum in college, accumulating more college-level credits and persisting at higher rates than their peers in traditional dev-ed coursework. Three years later, DCMP students were more likely to have passed college math and had accumulated more college-level credits than those in traditional dev-ed.

Literature Review

The Impacts of Dev-Ed

Despite the sizable enrollment rate in dev-ed, there is conflicting evidence about the value of placing students into dev-ed courses. Some evidence suggests that students who complete dev-ed coursework in math are more likely to persist in college and earn a bachelor's degree than peers with similar abilities who fail to complete remedial math, suggesting some positive impact (Bettinger & Long, 2009). However, the modal result appears to show no effect, with students who place into dev-ed math experiencing outcomes similar to peers who did not place into dev-ed math (Attewell, Lavin, Domina, & Levey, 2006; Bahr, 2008; Bailey, Jaggars, & Scott-Clayton, 2013; Bettinger & Long, 2005; Boatman, 2012; Martorell & McFarlin, 2011; Melguizo, Bos, Ngo, Mills, & Prather, 2016). There is also evidence of some negative effects, particularly for students who placed one level below "college ready" and may have otherwise been able to pass college math (Boatman & Long, 2017; Dadgar, 2012; Logue, Watanabe, & Douglas, 2016; Scott-Clayton & Rodriguez, 2015).2 Placement into dev-ed math increases the amount of time enrolled prior to accumulating degree-bearing credit, costing students time and money (Deil-Amen & Rosenbaum, 2002; Melguizo et al., 2016; Monaghan & Attewell, 2015).

Many students with remedial needs never complete the sequences needed to catch them up to college level (Bailey et al., 2010; Clotfelter, Ladd, Muschkin, & Vigdor, 2015). Long multicourse sequences, especially in math, may impede student progress. Recent research suggests that students who are assigned to the lowest level in the dev-ed math sequence—those who require three dev-ed courses—benefit less from their dev-ed sequence than those who are statistically comparable but placed into a two-course sequence (Xu & Dadgar, 2018). Experimental and quasi-experimental evidence suggests that many students placed into dev-ed may be able to pass college-level gateway courses (the first college-level math course students take), where they would immediately earn college credit (Attewell et al., 2006; Logue et al., 2016; Scott-Clayton & Rodriguez, 2015; Scott-Clayton et al., 2014).

Dev-ed courses may impede a student's overall success in college through several mechanisms. Because they do not count for college credit, dev-ed courses increase time to graduation and the cost of a credential (Bailey et al., 2010; Bailey et al., 2013). Spending additional semesters without making substantial progress toward degree completion may discourage students, changing their degree valuation (Deil-Amen & Rosenbaum, 2002; Scott-Clayton, 2011; Scott-Clayton & Rodriguez, 2015). Many dev-ed math students fail to make it to their gateway math course (for math, it is often college algebra), which is a necessary precursor to program-specific coursework (Adelman, 2006; Dana Center, 2017a; Goldrick-Rab, 2007). In addition, studies indicate that traditional dev-ed courses focus on procedural "skill-and-drill" pedagogy with too little emphasis on applying the training to college curricula or real-world problems (Grubb, 2010; Grubb & Worthen, 1999; Hodara, 2011).

Updating Dev-Ed in Mathematics

Helping students get through their dev-ed requirements and gateway coursework has implications for students' momentum toward a degree (Adelman, 2006; Calcagno, Crosta, Bailey, & Jenkins, 2007; Jenkins & Bailey, 2017). Stakeholders in higher education acknowledge the challenges posed by traditional dev-ed, and in response, several states have initiated dev-ed reforms (Brower, Bertrand Jones, Tandberg, Hu, & Park, 2017; Edgecombe, Cormier, Bickerstaff, & Barragan, 2013). In this section, we describe the potential solutions, including components of ongoing reforms used across the country.

Structural Reforms

To improve students' progress, dev-ed pathways need to be structured in a way that enables students to accrue college-level credit more quickly. There are two main approaches to increase the speed with which students with remedial needs can earn college-level credits: (a) Allow them to enroll immediately in gateway courses with additional supports to help them with the material, or (b) accelerate the speed with which students can get through dev-ed coursework by reducing the number of classes in the sequence. The first option makes students eligible to immediately earn college-level credits and provides a corequisite developmental course to support students who are underprepared for college-level material (Logue et al., 2016; Scott-Clayton & Rodriguez, 2015). Experimental evidence suggests that the corequisite model improves the rate of passing the gateway math course by 16 percentage points over traditional dev-ed math (Logue et al., 2016).

However, some students are substantially underprepared for college-level coursework, requiring skill development and curricular knowledge (Deil-Amen & Rosenbaum, 2002; Jaggars & Hodara, 2011). At the same time, long developmental sequences may discourage them (Deil-Amen & Rosenbaum, 2002; Jaggars & Hodara, 2011). The second structural reform—acceleration—expedites dev-ed coursework by adjusting the course structure and curricula, which can allow students in need of more remediation to quickly cover material and complete the developmental requirement (Edgecombe, 2011). Research suggests that accelerated dev-ed coursework improves persistence, as well as enrollment in and completion of subsequent college-level courses (Boatman, 2012; Edgecombe, Jaggars, Baker, & Bailey, 2013; Hodara & Jaggars, 2014; Jaggars, Hodara, Cho, & Xu, 2015; Weisburst, Daugherty, Miller, Martorell, & Cossairt, 2016).

Curricular and Advising Reforms

In addition to structural changes, there are a number of reforms, often implemented by faculty or advisors, that can be incorporated into the models noted above. For example, one challenge in improving long-term outcomes of dev-ed students is low enrollment in the next recommended course in the sequence after passing dev-ed requirements (Bailey et al., 2010; Edgecombe, 2011; Jaggars & Hodara, 2011). To address this problem and help students maintain momentum, colleges could provide tighter, more prescribed sequencing wherein students in dev-ed math must enroll in gateway math courses immediately upon passing (Jaggars & Hodara, 2011; Jenkins & Bailey, 2017).

Increasing the relevance of dev-ed math coursework to real-world applications and active learning opportunities also can improve progress to and through the gateway course (Carlson & Winquist, 2011; Epper & Baker, 2009; Goldstein, Burke, Getz, & Kennedy, 2011). Relating material to real-world situations improves students' abilities to apply math outside of the classroom, including future employment (Hodara, 2011; Marzinsky, 2002; Stigler, Givvin, & Thompson, 2010). Instructional changes that emphasize active learning help students engage with the material, improving their attitudes toward math and performance in math coursework (Carlson & Winquist, 2011; Epper & Baker, 2009; Goldstein et al., 2011; Hodara, 2011; Verhovsek & Striplin, 2003).

Although students typically are placed in dev-ed because they have inadequate knowledge of content, some students may have poor study habits and unclear educational goals (Prince & Jenkins, 2005). Building supports to improve "soft skills" (listening well, studying effectively, etc.) and connect students to campus resources, either through tutoring or corequisite "success courses," can improve students' ability to continue making progress in the sequence and beyond (Bettinger, Boatman, & Long, 2013; Cho & Karp, 2013; Zeidenberg, Jenkins, & Calcagno, 2007). Success courses can be particularly useful as part of a broader approach to dev-ed reform, helping students learn about college, hone study skills, and build relationships with professors and peers (O'Gara, Mechur Karp, & Hughes, 2009).

Program Overview and Contexts

DCMP

DCMP's dev-ed mathematics reform relies on the structural reform of accelerating dev-ed and incorporates all of the curricular and advising approaches noted above. The DCMP model was designed by the Charles A. Dana Center at the University of Texas at Austin (Dana Center). For students who do not place directly into college-level math, DCMP offers an accelerated dev-ed course that aims to broadly prepare students for entry-level math including nonalgebra options like statistics and quantitative reasoning, whereas traditional dev-ed coursework often prepares students for college algebra. Colleges using the DCMP model for dev-ed can use a curriculum called Foundations of Mathematical Reasoning, developed by the Dana Center, or their own curricular materials that align with Dana Center recommendations. In either case, the instructional approach used in classrooms following the DCMP model differs from those in traditional dev-ed courses, which tend to focus heavily on algebra and rely on lecture as the primary mode of instruction (Zachry Rutschow, Diamond, & Serna-Wallender, 2017). DCMP courses leverage a student-centered approach and present math problems using real-life examples. The approach aims to help students apply and interpret concepts rather than memorize abstract formulas and to make math feel more relevant to daily life (Hodara, 2011; Marzinsky, 2002; Zachry Rutschow et al., 2017).

The Dana Center recommends that the accelerated dev-ed course be taken with a corequisite student success course to help students connect to resources on campus, develop and maintain motivation, and build study skills and strategies (Dana Center, 2017b). Upon passing the course, students are encouraged to enroll immediately in college-level math in the subsequent semester to create a yearlong math experience and maintain momentum through the math pathway.

The accelerated dev-ed course—the focus of our study—is the first phase in DCMP's broader model of math education reform. Figure 1 compares the accelerated dev-ed pathway under the DCMP model (Panel A) with both the one-course sequence (Panel B) and the two- or three-course sequence (Panel C) of traditional dev-ed math. Because DCMP's dev-ed course is accelerated, students who would otherwise take two or three dev-ed math courses instead take one (so long as they pass). This accelerates progress through dev-ed coursework, putting the course sequence in line with the one-course sequence of traditional dev-ed. The primary differences between DCMP and the one-course-sequence traditional dev-ed pathway are that students in the DCMP model are encouraged to enroll in college math immediately after passing dev-ed math and to take a college math course that is most appropriate to their major (i.e., they could take quantitative reasoning, statistics, or algebra rather than emphasizing only algebra). As we described above, there are also curricular differences between DCMP and traditional dev-ed—those differences are not represented in the figure and would be difficult for us to assess in the current study. We anticipate that the structural reforms contribute to positive relationships between DCMP and early college milestones like enrollment in college math, where students may be eligible to enroll more quickly in college math due to acceleration. The advising reform, where students are encouraged to enroll as soon as they complete dev-ed math requirements, may improve the attainment of early milestones.


Figure 1. Dana Center Mathematics Pathways and Traditional Dev-Ed Math Pathways.

Note. The figure illustrates the dev-ed course sequences and subsequent milestones for students in the Dana Center Math Pathways accelerated dev-ed course (A) compared with students in a one-course sequence (B) and in a two- or three-course sequence (C) of traditional dev-ed math (these subgroups constituted our control groups, described in the Methods section). For each course represented in the pathways, students who receive a failing grade may retake the course (following the arrow back into that step), while students who pass may move to the next milestone. However, at each step, any student may leave college, following the arrow out of the pathway. In Panel C, the dotted line illustrates that only students in a three-course sequence would take the third dev-ed math course.

At the time of our study, the DCMP dev-ed course was geared toward students in majors that would not require algebra. DCMP includes three gateway math options, specified based on students' programs of interest: Statistical Reasoning for applied social science careers (e.g., government, psychology, allied health), Quantitative Reasoning for humanities and liberal arts, and Science, Technology, Engineering, and Mathematics (STEM) Prep (followed by calculus) for careers that require algebraic skills (Dana Center, 2013). Offering alternatives to college algebra, which is a barrier for many students, may increase enrollment and completion of gatekeeper math courses and, ultimately, allow students to attain their desired degrees (Roksa, Jenkins, Jaggars, Zeidenberg, & Cho, 2009). Other types of mathematical reasoning, including statistical reasoning or basic quantitative reasoning, may be more relevant to students seeking careers in non-STEM fields (Bryk & Treisman, 2010).

At the time of our study, DCMP was implemented in 22 colleges, including 20 community colleges in Texas—which constitute our population of interest for this study. Since fall 2014, 58 colleges and college systems in 17 states have used the DCMP model or its curriculum. Thus, the effectiveness of DCMP has important implications for students across the country. Recent preliminary results from a randomized controlled trial at 4 colleges suggest positive effects of DCMP on passing dev-ed and college math coursework within 1 year (Zachry Rutschow, 2018). Our study uses statewide data from Texas to assess the success of the model for a broad set of outcomes, including persistence, college credit accumulation, and degree attainment, over 3 years for all implementing colleges in the state.

State Contexts

Texas's public higher education system is among the largest and most diverse in the country, second in size only to California's. As in other states, a substantial proportion of college-going Texans place into dev-ed, especially in the community college sector. In 2011, 48% of Texas community college students failed to meet college-readiness standards in at least one subject, and 44% failed to meet the required score on math placement tests (THECB, 2016). Of the students who scored below the math cutoff, only 29% passed out of dev-ed math, and just 16% completed a college-level math course—which is required for many degrees—within 3 years (THECB, 2016). These suboptimal early outcomes have important implications for further outcomes in college. Texas community college students in dev-ed graduate at half the rate of their college-ready peers (Jones & Elston, 2014).

The current standard for placement into dev-ed math in Texas is a score less than 350 on the Texas Success Initiative (TSI) test, mandated by state policy in 2013. The state required remediation for students below the cutoff, but colleges chose their own standards and procedures for placing students into specific dev-ed sequences. They were able to determine criteria for placement into specific dev-ed courses and the length of the sequence.

Methods

To respond to the pressing need for evidence regarding the effectiveness of accelerated dev-ed coursework and DCMP in particular, we employed state administrative data from Texas combined with institutional measures. At each college that offered the DCMP model, advisors and faculty had autonomy to place students into DCMP. Although the Dana Center recommended their accelerated dev-ed course for students who required at least two dev-ed math classes and planned to pursue non-STEM majors (with a particular emphasis on majors that should not require algebra), a number of factors likely influenced whether students ended up in DCMP's dev-ed course or a traditional dev-ed sequence. In an effort to model and control for the selection mechanism, we relied on PSM and regression.

Data

This study used state administrative data provided through a restricted-use agreement with the Texas Education Research Center, a research center and data clearinghouse at the University of Texas. The Education Research Center holds longitudinal, student-level data for the entire population of secondary and postsecondary students in the state. We primarily relied on data collected by THECB, including college student enrollment records, placement test scores and exemptions, credits, grades, and degree outcomes, along with financial aid (Free Application for Federal Student Aid [FAFSA]) application information and demographic measures. We supplemented the THECB data with measures of math course completion status and state exit exam test scores from Texas high schools, collected by the Texas Education Agency, to assess the robustness of our results to including precollege measures of academic achievement.

Sample restrictions and constructing treatment and control groups

DCMP was implemented by 20 of the 50 Texas community colleges in fall 2014. We restricted the sample to students attending those 20 colleges, as only students enrolled at DCMP-implementing colleges had the possibility of placing into the program's dev-ed course. PSM requires that both the treatment and control group have the potential of selection into treatment (Morgan & Winship, 2007; Rosenbaum & Rubin, 1983).

The THECB schedule data capture students' course enrollments (including course and section numbers), credits, and grades for each term enrolled. To construct the sample of students enrolled in dev-ed math courses, we first identified developmental math courses. We looked up dev-ed course numbers in the Texas Academic Course Guide Manual, a list of approved lower division academic courses that includes prescribed common course numbers, contact and credit hours, and course descriptions used by all community colleges. We restricted the sample to all dev-ed math enrollees in fall 2014.

To identify the treatment group among those enrolled in dev-ed math, we determined which students were enrolled in a DCMP course (either Foundations or an equivalent accelerated dev-ed course developed by the institution) using a list of DCMP course and section numbers provided by the Dana Center (n = 582). We verified that we identified the appropriate course/section by comparing course enrollment numbers provided by each DCMP-implementing college with those in the THECB schedule data. The remaining dev-ed math students—those not in a DCMP course—constituted the control conditions. Appendix Table A2 (online) provides a breakdown of DCMP students and other dev-ed students at each college included in the sample. The number of DCMP students at each college was quite small (1–2 class sections, though some colleges offered more) compared with enrollment in the control conditions.

We created two separate control groups: students in a one-semester dev-ed math sequence (n = 6,064) and students in a two-or-three-semester sequence (n = 9,405). We expected that the latter might be the more appropriate comparison group. This group likely possesses similar academic ability to our treatment group, given that the Dana Center recommends that students required to take at least two semesters of dev-ed math register for DCMP. However, DCMP is an accelerated path to college-level coursework (it should be completed in one semester); therefore, DCMP dev-ed students face a course sequence similar to that taken by students placed in a one-semester traditional developmental math course. We cannot know the exact counterfactual for those in DCMP (it is unclear whether they would have been in a one- or two-/three-course dev-ed sequence), so we also ran analyses using the one-semester comparison group, which demonstrated higher prior math ability than did DCMP students.

Because the data include information about all placement exams, we were able to use math scores to control for student ability, as measured by the placement test.3 Ideally, we would identify students of similar underlying ability using the mandated placement test in the state—TSI. TSI determines both placement into developmental math courses and, at some colleges, the developmental course sequence required (colleges vary in their policy regarding placement into different levels of dev-ed—several colleges use holistic placement, rather than a test score, to determine course sequences). However, we found that many community college students—more than two thirds in the sample—had non-TSI placement test scores (e.g., COMPASS, ACCUPLACER) in fall 2014 rather than scores for the mandated TSI exam. For that reason, we calculated each student's z score on the test they took, relative to all other students who took the same test in the same term, as a proxy for underlying ability. This is not a perfect solution, as tests vary in how they place students into dev-ed (Ngo & Melguizo, 2016), but was a necessary step to maintain the sample. We further discuss measures of academic ability in the variable selection section. Only students with placement exam math scores were included in the study.
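The within-test, within-term standardization described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the column names and scores are hypothetical.

```python
import pandas as pd

# Hypothetical placement records: test name, term, and raw score.
scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "test":  ["TSI", "TSI", "COMPASS", "COMPASS", "COMPASS", "TSI"],
    "term":  ["fall2014"] * 6,
    "score": [340, 355, 40, 55, 70, 350],
})

# Standardize each score against all takers of the same test in the same
# term, yielding a z score usable as a common proxy for math ability
# across otherwise incomparable placement exams.
grouped = scores.groupby(["test", "term"])["score"]
scores["z_score"] = (scores["score"] - grouped.transform("mean")) / grouped.transform("std")
```

Because standardization happens within each test-by-term cell, a student's z score reflects only their standing among peers who took the same exam at the same time.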

Analytic Strategy

Without the option of random assignment, we sought to stratify students into subgroups in a manner that could control for the systematic differences between treatment (DCMP dev-ed math) and control (traditional dev-ed math). We followed Ho, Imai, King, and Stuart's (2007) recommendation to preprocess the data using PSM to make the treatment group as similar as possible to the control group, reducing systematic differences in assignment to DCMP in subsequent regression analyses.

First, we estimated the probability that an individual student was placed into DCMP by running a probit regression of his or her treatment status on demographic, academic, and institutional measures. We describe the variables included in the model in the subsequent section. The resulting propensity score sums up the probability of placement in DCMP in one number, modeling selection based on background characteristics and hypothesized selection mechanisms (Morgan & Winship, 2007; Rosenbaum & Rubin, 1983).

To match students on their propensity scores, we used an Epanechnikov kernel matching estimator with a bandwidth of 0.06. Kernel matching uses weighted averages of all cases in the control group, maximizing the use of information. This technique yields lower variance than nearest neighbor and radius matching, which do not use all available cases (Caliendo & Kopeinig, 2008). Although matching techniques without replacement, which discard observations with similar propensity scores, perform poorly compared to randomized controlled trials, matching algorithms that do not "engage in random pruning," like kernel matching, perform much better (Jann, 2017, p. 13; King & Nielsen, 2016). This supports our decision to rely on kernel matching over other matching approaches.
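The kernel weighting can be sketched directly. For one treated student, each control within the bandwidth contributes with Epanechnikov weight 0.75(1 − u²), where u is the propensity score distance scaled by the bandwidth; controls outside the bandwidth get zero weight. This is a simplified illustration of the weighting scheme, not the full estimator.

```python
import numpy as np

def epanechnikov_weights(p_treated, p_controls, bandwidth=0.06):
    """Weights each control receives when matched to one treated unit.

    Controls whose propensity score lies within `bandwidth` of the
    treated unit's score get weight proportional to 0.75 * (1 - u**2);
    all others get zero, and weights are normalized to sum to one.
    """
    u = (p_controls - p_treated) / bandwidth
    k = np.where(np.abs(u) < 1, 0.75 * (1 - u ** 2), 0.0)
    total = k.sum()
    return k / total if total > 0 else k

# One treated student (propensity 0.30) matched against four controls;
# the control at 0.50 falls outside the bandwidth and gets zero weight.
w = epanechnikov_weights(0.30, np.array([0.28, 0.31, 0.50, 0.33]))
```

Because every in-bandwidth control contributes, no observation with a similar propensity score is discarded, which is the property the text credits for kernel matching's lower variance.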

We restricted the analytic sample to observations on the common support to ensure sufficient overlap in propensity for participation across students in treatment and control groups and to ensure that any combination of characteristics observed in the treatment group also can be observed among the control group (Caliendo & Kopeinig, 2008). We dropped treatment observations whose propensity scores were greater than the maximum or less than the minimum propensity score of the controls. In both analytic samples, more than 85% of treated students were on the common support. Based on visual inspection of the propensity score distribution, we did not impose any additional trimming of the sample.
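The common support restriction described above amounts to dropping treated observations whose propensity scores fall outside the range observed among controls. A minimal sketch, with hypothetical scores:

```python
import numpy as np

def restrict_to_common_support(p_treated, p_control):
    """Keep treated observations whose propensity scores lie within the
    range spanned by the control group's scores (the common support)."""
    lo, hi = p_control.min(), p_control.max()
    on_support = (p_treated >= lo) & (p_treated <= hi)
    return p_treated[on_support], on_support

# The treated scores 0.05 and 0.90 fall outside the control range
# [0.10, 0.60] and are dropped from the analytic sample.
p_t = np.array([0.05, 0.20, 0.45, 0.90])
p_c = np.array([0.10, 0.15, 0.30, 0.60])
kept, mask = restrict_to_common_support(p_t, p_c)
```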

Next, we ran regressions (logistic regressions for dichotomous outcomes and ordinary least squares for continuous outcomes) using the propensity scores as weights to determine the effect of DCMP placement on course completion, persistence, subsequent course-taking patterns, and degree attainment. Compared with a simple comparison of means between matched groups used in traditional PSM, this "doubly robust" estimation strategy controls for the predictors of placement into treatment twice (once in the initial propensity score model and again in the model predicting the outcome). The final regression captures additional covariate imbalance across DCMP and traditional dev-ed participants that might remain after matching (Ho et al., 2007).

As with traditional PSM, the method we chose enables us to compare students with similar estimated propensities of enrolling in DCMP based on observed characteristics but different actual placement into dev-ed math coursework (DCMP vs. traditional dev-ed). We must invoke an "ignorability" assumption that, conditional on the pretreatment covariates, there are no additional confounders between students who were placed into DCMP and those who were not (Morgan & Winship, 2007). Although matching does not eliminate selection concerns, because it accounts for only observed differences between treatment and control, it is a valuable technique when used with a rich set of observed characteristics. Because PSM cannot account for preexisting unobserved differences between treatment and control groups, our findings represent associations, rather than causal estimates.

In this study, PSM served two important purposes. First, it enabled us to model and interpret how students were sorted into DCMP (the selection mechanism). Second, weighting the final regression models with propensity scores allowed us to obtain a more precise estimate of the relationship between the treatment and student outcomes. The final regression model included covariates from the propensity score model, an indicator of treatment status, and a measure capturing success course coenrollment, which would occur at the same time as the treatment/control dev-ed course.

To be prudent, we also ran regression models predicting each outcome without controlling for students' propensity scores. The results (available upon request) showed similar patterns of effects. Although preprocessing the data based on propensity scores produced somewhat different point estimates, their magnitude and significance were similar.

Variables included in the main model specification

Table 1 presents a complete description of covariates included in our main models and outcomes as well as the mean and standard deviation for each measure. We were able to include a variety of control measures, including demographic information such as race, gender, and age, which likely predict persistence at community colleges and, for those placed into dev-ed, progress through the full remedial sequence (Bailey et al., 2010; Feldman, 1993; Leppel, 2002). We also controlled for a host of educational measures, including information regarding prior course and enrollment history. Guided by prior research, we included indicators of whether students required dev-ed coursework in reading and writing (Fike & Fike, 2008; Hawley & Harris, 2005). In addition, we controlled for students' educational goals (e.g., certification, job skills, transfer) and major at the beginning of the term. We used Classification of Instructional Programs codes to develop broad major fields following examples from prior literature (Leppel, Williams, & Waldauer, 2001; Zafar, 2013). We also included an indicator of FAFSA-filing status. In the Education Research Center data, financial information is available only for students who filed FAFSAs—approximately one third of the sample. We ran an alternative specification using additional financial measures, as described in our section on sensitivity analyses.


Table 1 Variable Names and Descriptions

The PSM approach relies on our ability to model selection into treatment. Because the assignment of students to DCMP courses was at the discretion of institutional agents, we obtained information from the colleges about factors that may affect selection. The measures included whether the schools had mandatory advising and whether they actively recruited students for DCMP (e.g., used marketing materials, like posters or pamphlets, or noted the option in campus orientation). We also captured the extent to which each college complied with recommendations to place students from non-STEM majors into DCMP, as those students likely did not need college algebra for future coursework, and indicators for which colleges were codevelopers. Codevelopers piloted DCMP programs in the 2013–2014 academic year and sent at least one advisor to a training hosted by the Dana Center, which might influence how they place students into the courses. Finally, we included several measures of college characteristics obtained from the Integrated Postsecondary Education Data System, including percentage of enrollees who receive Pell Grants, percentage who are non-White, total number of students enrolled, and student-faculty ratio.

We performed several alternative models using additional measures (described in the sensitivity analyses subsection and in Appendix A [online]). It is feasible that there are other individual factors that predict student success in dev-ed math that we cannot include in our models due to the limitations of administrative data. For example, noncognitive factors, such as motivation and self-efficacy, likely predict performance in dev-ed math, enrollment in college math, and other college outcomes (Scott-Clayton, 2012). Likewise, research suggests that faculty validation (Barnett, 2011) and student engagement (Schudde, 2019) improve persistence and degree attainment among community college students and are likely to improve performance in coursework. Unfortunately, it is not feasible to include measures of these constructs, often obtained via survey, since we rely on state administrative data.

Dependent variables

Our dependent variables included a variety of early college outcomes, capturing college momentum and early progression through math pathways, as well as longer-term measures that follow students through spring of the 3rd year after enrolling in dev-ed math. Measuring progress among dev-ed students using intermediate "milestones" can inform our understanding of the degree pathway and how students perform throughout the sequence (Calcagno et al., 2007; Goldrick-Rab, 2007; Jenkins & Bailey, 2017). In Year 1 (2014–2015), we captured whether students passed dev-ed math in the first term, whether they persisted in college, whether they enrolled in and passed college math (algebra; nonalgebra, which the DCMP model emphasized; or any type of college math, an important milestone according to the literature), and the number of college-level credits earned by the end of the academic year. In Year 3 (2016–2017), we captured whether they enrolled in and passed college math (algebra, nonalgebra, or any college math), cumulative college credits earned, and whether they earned associate degrees.

Sensitivity analyses

To assess the sensitivity of our models to additional observable measures and to potential unobserved confounders, we performed a series of robustness checks. We ran three alternative specifications of our main analyses: a model adding high school math course completion and test scores, a model adding financial aid measures, and a model incorporating institutional fixed effects. Ideally, we would include these measures in our main analytic models, but doing so substantially reduced our sample size. We describe the motivation and measures for the alternative specifications in Appendix A. As we explain in the Results section, the main model results are largely robust to the inclusion of the additional high school and FAFSA measures and to the inclusion of institutional fixed effects.

Our second set of sensitivity analyses addresses the assumptions required of PSM. PSM assumes selection into treatment based on observable variables; unknown confounders may nonetheless influence both selection into DCMP and the outcomes. Following a procedure outlined by Ichino, Mealli, and Nannicini (2008), we estimate whether the findings are robust to the inclusion of a simulated unobserved binary covariate that relates to both DCMP assignment and the outcome. We include a fuller description of this method and our results in Appendix B (online).
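The logic of this simulation-based check can be sketched as follows, on hypothetical data: a binary confounder U that raises both treatment take-up and the outcome inflates the naive treatment-control contrast, while stratifying on U (as if it were observed) recovers an estimate closer to the true effect. All numbers below are illustrative, not the Ichino et al. calibration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical binary confounder U raises both treatment and outcome probabilities.
U = rng.binomial(1, 0.4, n)
D = rng.binomial(1, 0.2 + 0.3 * U)             # U shifts treatment assignment
Y = rng.binomial(1, 0.3 + 0.1 * D + 0.2 * U)   # true treatment effect ~= .10

# Naive contrast ignores U and is biased upward by the confounding.
naive = Y[D == 1].mean() - Y[D == 0].mean()

# Contrast within U strata, averaged over U's distribution, adjusts for U.
adj = sum((Y[(D == 1) & (U == u)].mean() - Y[(D == 0) & (U == u)].mean())
          * np.mean(U == u) for u in (0, 1))
```

Comparing `naive` with `adj` shows how large a shift a confounder of a given strength would induce; the published procedure repeats this for confounders calibrated to mimic observed covariates.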

Descriptive statistics and covariate balance

Table 2 presents descriptive statistics of our treatment and control groups. To facilitate exploration of differences in the subgroups of interest and whether PSM reduced differences between treatment and control groups, the table allows for the comparison of covariate means and standard deviations before and after matching.


Table 2 Covariate Balance: Means Before and After Matching

Before matching, the treatment group showed marked differences in observable characteristics from both control groups (we describe predictors of placement into DCMP in the Results section). After matching, the differences between the treatment and two-/three-semester control group were largely diminished. A few covariates still had significant differences, such as a measure for the college's compliance with the Dana Center's recommendations for preferred majors for DCMP participants and the indicator of majoring in education. After matching, observable differences between DCMP students and those assigned to the one-course traditional dev-ed sequence virtually disappeared; the only remaining difference was that control group students were slightly more likely to have filed FAFSAs. The remaining covariate imbalance after matching, though minimal, bolsters support for using Ho et al.'s (2007) doubly robust estimation strategy to reduce remaining bias.
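The doubly robust step can be sketched as a covariate-adjusted outcome regression run on the matched sample: if either the matching weights or the outcome model is correct, remaining bias is removed. The data, weights, and coefficients below are hypothetical (uniform weights stand in for a matched sample).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000

# Toy data: X confounds treatment D and outcome Y; true treatment effect = 2.0.
X = rng.normal(size=n)
D = rng.binomial(1, 1 / (1 + np.exp(-X)))
Y = 2.0 * D + 1.5 * X + rng.normal(size=n)

# Matching weights w (here uniform, as if the sample were already matched).
w = np.ones(n)

# Weighted least squares with covariates: solve (Z'WZ) beta = Z'Wy.
Z = np.column_stack([np.ones(n), D, X])
beta = np.linalg.solve(Z.T @ (Z * w[:, None]), Z.T @ (w * Y))
effect = beta[1]  # treatment coefficient after covariate adjustment
```

Including X in the post-matching regression mops up the residual imbalance that matching alone leaves behind, so `effect` recovers the true value near 2.0.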

Table 2 also shows the treatment and control group means and standard deviations of the outcomes before and after matching. The results suggest that students in DCMP were less likely to enroll in and pass college algebra than their peers in either control group but more likely to enroll in and pass nonalgebra college math (and enroll in any college-level math overall), particularly by the end of Year 3. However, as we present below, the patterns from the doubly robust estimation strategy are more conservative, likely because they allow us to further adjust for covariates in the model.

Results

Selection Into the DCMP's Dev-Ed Course

Table 3 presents the results from probit models predicting participation in DCMP. The leftmost columns provide results using the two-/three-course traditional dev-ed sequences as a control group, and the rightmost columns present results using the one-course sequence as the control. For each set of analyses, we present the coefficients (log-odds) and, for ease of interpretation, the marginal effects, along with standard errors. The results provide insight into the factors that predict placement into DCMP for students with remedial needs after controlling for other student background and institutional characteristics.


Table 3 Probit Models Predicting Propensity for Placement Into DCMP

Enrollment in dev-ed writing increases the probability of placement into DCMP by 2.4 percentage points in the two-/three-semester control analysis (marginal effect = .024, SE = .005, p < .001). Meanwhile, enrollment in dev-ed reading negatively predicts placement into DCMP compared with the control group of students placed into two or three semesters of dev-ed math (marginal effect = –.020, SE = .004, p < .001). There are no significant relationships between other dev-ed coursework and DCMP placement when using the one-semester control group. Majoring in social science or literature/linguistics, both among the majors recommended in the Dana Center's model, positively predicts propensity to participate in DCMP relative to the reference category of liberal arts. Seeking transfer to a 4-year institution negatively predicts placement into DCMP's dev-ed pathway for both control groups. Some institutional factors that we expected to positively influence selection into DCMP predicted participation only when using the two-/three-course control group: the percentage of Pell recipients positively predicts propensity to enroll in the DCMP dev-ed course, and total enrollment negatively predicts it. Recruiting for DCMP positively predicts placement, and mandatory advising appears to negatively predict placement into DCMP, though only in the analyses with the two-/three-semester control group (recruiting: marginal effect = .045, SE = .008, p < .001; mandatory advising: marginal effect = –.042, SE = .007, p < .001).

Although math placement score predicts DCMP placement in both samples, it positively predicts placement into DCMP when using a control group of students in two-/three-course sequence dev-ed (marginal effect = .015, SE = .002, p < .001) but negatively predicts placement compared with those in a one-course sequence (marginal effect = –.020, SE = .004, p < .001). This suggests that even after controlling for other background measures, students in the two-/three-course control group present lower math ability, as measured by the placement test, than DCMP students; those in the one-course control group demonstrate a higher tested math ability than do DCMP students.
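For readers translating between the probit coefficients and the marginal effects reported above: the average marginal effect (AME) of a continuous covariate is the sample mean of the standard normal density evaluated at the linear index, multiplied by that covariate's coefficient. A minimal sketch with hypothetical coefficients and data:

```python
import numpy as np

def probit_ame(X, b, k):
    """AME of continuous covariate k from a probit: mean of phi(X @ b) * b[k]."""
    z = X @ b
    phi = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)  # standard normal pdf
    return np.mean(phi) * b[k]

rng = np.random.default_rng(2)
# Hypothetical design matrix: intercept plus one continuous covariate.
X = np.column_stack([np.ones(500), rng.normal(size=500)])
b = np.array([-0.5, 0.8])  # hypothetical fitted probit coefficients
ame = probit_ame(X, b, k=1)
```

Because the normal density is at most about 0.4, the AME is always attenuated relative to the raw coefficient, which is why the tables report both.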

Relationships Between DCMP and College Milestones

In Table 4 we present our main results, which illustrate the relationship between participation in DCMP's dev-ed pathway and a variety of college milestones. For each outcome, the table shows the average marginal effect—the predicted change in probability for students in DCMP compared with each control group, while holding all covariates at their mean—and the control group mean for the outcome. We first describe the results for the two- or three-semester sequence control group, followed by the one-semester sequence control group.


Table 4 Relationship Between Dana Center Mathematics Pathways (DCMP) and Outcomes: Average Marginal Effects and Control Group Means

Results from analyses with two- or three-semester control group

Table 4, Column 1, presents our preferred results, comparing DCMP students with those in a two- or three-semester traditional developmental math sequence. We anticipated a positive relationship between the treatment and college math enrollment in the next semester. However, after adjusting for student and institutional characteristics in the doubly robust models, there was little evidence of a relationship between DCMP status and the college math course enrollment measures at the end of Year 1. We did see a positive relationship between DCMP participation and persistence, where participating in DCMP was associated with an 8.4 percentage point increase in persistence (SE = .035, p = .017). By the end of the next term, DCMP students also took, on average, 1.9 more hours of college-level coursework compared with their peers who were placed in the two-/three-semester sequence of dev-ed math (SE = .389, p < .001).

Two years later, in spring 2017, DCMP students surpassed the two-/three-semester sequence control group in college math enrollment and completion. Participating in DCMP's dev-ed math course in fall 2014 was associated with a 36 percentage point increase in the probability of completing a college math course by the end of the 2016–2017 academic year, compared with students in the two-/three-semester sequence (marginal effect = .360, SE = .053, p < .001). Enrolling in and passing nonalgebra college math coursework explained the majority of the increase in college math completion: participation in DCMP was associated with a 46.9 percentage point increase in the probability of completing a nonalgebra math course (SE = .064, p < .001). At the same time, participating in DCMP had a small negative relationship with taking and passing college algebra, lowering the probability of each by about 1 percentage point. Although students in the treatment group did not appear more likely to earn an associate degree by the end of 3 years of follow-up, they did experience a significant positive boost in the number of college-level credits earned.

Results from analyses with one-semester control group

Our second control group includes students in the one-semester traditional dev-ed math sequence (see Table 4, Column 2). Although they tended to be more academically prepared than DCMP students, they were on a similar academic trajectory in terms of sequence. Both groups should have been eligible to complete their dev-ed requirement in one term, although DCMP explicitly encourages immediate enrollment in college math. Participating in DCMP had a small negative relationship with enrolling in and completing college-level math, driven largely by the control group's higher probability of enrolling in college algebra during the 1st year. Compared with students in a one-course traditional dev-ed sequence, participation in DCMP was associated with a 1 percentage point decrease in the probability of enrolling in college algebra in the subsequent term (marginal effect = –.011, SE = .003, p < .001). However, DCMP was also associated with an increased probability of passing dev-ed math and of accumulating college-level credits (p < .001).

By the end of the 2016–2017 academic year, DCMP participants were more likely to have enrolled in and passed a college math course than their peers in the one-course traditional sequence, though participating in DCMP appeared to have a particularly strong association with nonalgebra college math enrollment. DCMP students experienced a 38 percentage point increase in the probability of completing a nonalgebra college math course compared with their one-course traditional dev-ed peers (marginal effect = .380, SE = .054, p < .001). DCMP participation also had a negative, though smaller, association with taking and passing college algebra. Of course, if college math completion is an important college milestone, as suggested by the community college literature, enrolling in and passing any type of college math moves students closer to their degree attainment goals. Compared with students in the one-course sequence of traditional dev-ed math, participating in DCMP was associated with a 13.1 percentage point increase in the probability of enrolling in any college math and a 13.4 percentage point increase in the probability of passing any college math (enroll college math: marginal effect = .131, SE = .018, p < .001; pass college math: marginal effect = .134, SE = .039, p < .001).

Sensitivity analyses

We performed a number of robustness checks to examine whether our results are sensitive to additional observable variables and to a potential unobserved confounder. Appendix A describes and presents results from three alternative model specifications, which suggest our main results are largely robust to alternative model specifications. Appendix B presents an overview, along with results and implications, of a sensitivity analysis for unobserved confounders (Ichino et al., 2008). We find that some of the Year 1 effects, particularly persistence and college credits earned for the two-/three-course control group, could be sensitive to a confounder that is a positive predictor of DCMP participation. Several Year 3 results appear robust to even a confounder with a very strong relationship with assignment to treatment or the outcome.

Discussion

In this study, we examined the impact of DCMP's accelerated dev-ed course, which uses active-learning opportunities and real-world applications for math content while speeding up the rate at which material is covered. The course is the first step in a pathway toward college-level mathematics coursework that is broadly tailored to students' interests and career ambitions. To make the goals of DCMP feasible, dev-ed students—who constitute a substantial portion of community college enrollees—must gain momentum in their pathway by passing dev-ed math and enrolling in a gateway math course. We examined the relationship between DCMP and measures of college milestones, comparing DCMP students with those enrolled in a traditional dev-ed sequence.

First, we examined placement into DCMP, comparing DCMP students with their peers in either a one-semester or two-/three-semester traditional dev-ed math sequence. Our propensity score model suggested that students in DCMP were less academically prepared, according to the placement test, than those in the one-semester sequence but more prepared than those in the two-/three-semester sequence. Second, we estimated the relationship between DCMP participation and important outcomes in Year 1, like passing the current dev-ed math course, enrolling in and passing college-level math, and the number of college-level credits accumulated, and similar outcomes plus associate degree attainment by the end of Year 3. Our study uses multiple outcomes to understand how the initial phase of DCMP—the accelerated dev-ed course—is related to college momentum, as measured by the students' movement through the math sequence and their college pathway more generally.

Our estimates indicate that students placed into DCMP's dev-ed course are more likely to enroll in and pass college-level math than are their peers in traditional dev-ed math by the end of their 3rd year. Unsurprisingly, the effects are the largest when comparing DCMP students with those in longer dev-ed sequences requiring two to three courses. By the end of Year 3, participating in the DCMP model is positively and significantly associated with taking and passing college-level math. Our results also suggest that the majority of the increase in enrolling and passing college math occurs through nonalgebra math coursework, which is unsurprising given the model's goal to move students through a math pathway that aligns with their major requirements (rather than an algebra-for-all approach). Early milestones like increased probabilities of passing dev-ed math and persistence appear to have a domino effect on other important milestones, such as number of degree-bearing credits. While we still see a positive relationship between DCMP participation and accumulation of college credits by the end of Year 3, we do not observe a relationship between DCMP status and associate degree attainment. Only 4.9 percent of students in the entire sample earned an associate degree by that point, so it may be too early to detect changes in that outcome.

In many ways, the results we present here are to be expected—the Dana Center's goal in reforming the math pathway is to create options beyond algebra-for-all, enabling more students to take and pass college-level math in order to move toward their educational aspirations. But many programs do not work as intended in the field, so it is worthwhile to note that DCMP's dev-ed pathway is facilitating momentum by increasing both enrollment in and advancement through college-level math, especially for students with the lowest math ability. Although the findings align with DCMP's goals, our results cannot empirically establish why placement into DCMP for dev-ed coursework improves outcomes more than traditional dev-ed does. We highlight patterns of effects but are unable to untangle the mechanisms driving them. Based on program design, we expect that the effects we see on enrollment in college math may stem from both accelerating the dev-ed sequence (in comparison with students in the two-/three-course sequence) and establishing an alternative to college algebra with encouragement to enroll soon after completing dev-ed math (in comparison to students in both traditional dev-ed control groups).

The majority of the increase in college math course enrollment and completion stems from DCMP students' taking nonalgebra college math courses. There are several potential consequences for students, though we cannot yet observe all of them with only 3 years of follow-up. First, we might see an increase in associate degree attainment, as more students get past a key barrier to major requirements, passing college math. Second, students in DCMP may be less likely to pursue algebra-intensive majors because it would require taking an additional entry-level math course, college algebra, for those who took a nonalgebra course. The extent to which that is a problem is unclear because most students who participated in DCMP indicated interest in non-STEM fields. Furthermore, the benefit of increasing the number of students making progress toward a degree likely outweighs the potential loss of time and money for a small number of students switching to an algebra-intensive major. Of course, we can only speculate at this point, and we hope to see additional research on the implications of nonalgebra math coursework on degree attainment and major selection.

This study suggests that an accelerated curriculum coupled with an emphasis on yearlong math experiences may be able to help community college students reach important milestones in their college careers, especially compared with students placed into a traditional dev-ed sequence of two or more courses. One area for continued exploration in the model is further improving early college-level math enrollment (during the 1st year), since the gains over the control groups appear to occur after that point in this sample. The results have important implications for students across the nation. The DCMP model continues to expand across the country as more colleges—in both the 2- and 4-year sectors—aim to efficiently prepare students for college-level coursework and reform college math to align with the skillsets demanded by careers.

Although we focus on DCMP, it is one of many reforms to dev-ed math. As institutions invest their limited resources in these interventions, evaluation of reforms and programmatic changes is crucial. The approach we take here—examining the relationship between an alternative dev-ed pathway and college milestones by comparing its effects with those of traditional dev-ed pathways—offers valuable information necessary for assessing reforms to dev-ed. As additional dev-ed reforms are implemented across the country, evaluation can illuminate whether and which reform models are effective.

Acknowledgements

This research was supported by Grant P2CHD042849, awarded to the Population Research Center at the University of Texas at Austin by the Eunice Kennedy Shriver National Institute of Child Health and Human Development, and by funding provided to the Dana Center from the Texas Association of Community Colleges and the State of Texas (legislative appropriations request). The content is the sole responsibility of the authors and does not represent the official views of the National Institutes of Health, the Texas Association of Community Colleges, or the State of Texas.

ORCID iD
Lauren Schudde https://orcid.org/0000-0003-3851-1343

1.
The Dana Center Mathematics Pathways program began in 2012 under the name New Mathways Project and was renamed Dana Center Mathematics Pathways in 2016.

2.
Although several studies suggest that students just below the placement cutoff experience the most severe negative effects from dev-ed, at least one study found that those placed in the lowest levels of dev-ed experience the most severe negative effects (Clotfelter, Ladd, Muschkin, & Vigdor, 2015).

3.
Research suggests that placement test scores are not a great proxy for students' underlying ability, though they appear more valid for math than for reading (Scott-Clayton, 2012). In light of new evidence that high school transcript information may offer useful measures of skills necessary to pass college-level coursework (Ngo & Kwon, 2015; Scott-Clayton, Crosta, & Belfield, 2014), we incorporate additional measures in our robustness checks, described in more detail in Appendix A.

4.
The reference list includes references from the main text and the online appendixes.

References4

Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through college. Retrieved from http://www.avid.org/dl/res_research/research_thetoolboxrevisited.pdf
Google Scholar
Attewell, P., Lavin, D., Domina, T., Levey, T. (2006). New evidence on college remediation. Journal of Higher Education, 77, 886924. Retrieved from https://doi.org/10.1353/jhe.2006.0037
Google Scholar
Bahr, P. R. (2008). Does mathematics remediation work? A comparative analysis of academic attainment among community college students. Research in Higher Education, 49(5), 420450. Retrieved from https://doi.org/10.1007/s11162-008-9089-4
Google Scholar
Bailey, T. R. (2009). Challenge and opportunity: Rethinking the role and function of developmental education in community college. New Directions for Community Colleges, 2009(145), 1130. Retrieved from https://doi.org/10.1002/cc.352
Google Scholar
Bailey, T. R., Jaggars, S. S., Scott-Clayton, J. (2013). Characterizing the effectiveness of developmental education: A response to recent criticism. Journal of Developmental Education, 36(3), 1822. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/response-to-goudas-and-boylan.pdf
Google Scholar
Bailey, T. R., Jeong, D. W., Cho, S.-W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29(2), 255270. Retrieved from https://doi.org/10.1016/j.econedurev.2009.09.002
Google Scholar
Barnett, E. A. (2011). Validation experiences and persistence among community college students. Review of Higher Education, 34(2), 193230. Retrieved from https://doi.org/10.1353/rhe.2010.0019
Google Scholar
Bettinger, E. P., Boatman, A., Long, B. T. (2013). Student supports: Developmental education and other academic programs. Future of Children, 23(1), 93115. Retrieved from https://doi.org/10.1353/foc.2013.0003
Google Scholar
Bettinger, E. P., Long, B. T. (2005). Remediation at the community college: Student participation and outcomes. New Directions for Community Colleges, 2005(129), 1726. Retrieved from https://doi.org/10.1002/cc.182
Google Scholar
Bettinger, E. P., Long, B. T. (2009). Addressing the needs of underprepared students in higher education: Does college remediation work? Journal of Human Resources, 44(3), 736771. Retrieved from https://doi.org/10.3368/jhr.44.3.736
Google Scholar
Boatman, A. (2012). Evaluating institutional efforts to streamline postsecondary remediation: The causal effects of the Tennessee Developmental-Course Redesign Initiative on early student academic success. Retrieved from https://eric.ed.gov/?id=ED550506
Google Scholar
Boatman, A., Long, B. T. (2017). Does remediation work for all students? How the effects of postsecondary remedial and developmental courses vary by level of academic preparation. Educational Evaluation and Policy Analysis. Retrieved from https://doi.org/10.3102/0162373717715708
Google Scholar
Bonham, B. S., Boylan, H. R. (2011). Developmental mathematics: Challenges, promising practices, and recent initiatives. Journal of Developmental Education, 34(3), 210.
Google Scholar
Brower, R., Bertrand Jones, T., Tandberg, D., Hu, S., Park, T. (2017). Comprehensive developmental education reform in Florida: A policy implementation typology. Journal of Higher Education, 88(6), 809834. Retrieved from https://doi.org/10.1080/00221546.2016.1272091
Google Scholar
Bryk, A., Treisman, U. (2010, April 18). Make math a gateway, not a gatekeeper. Chronicle of Higher Education, 56(32), B19B20. Retrieved from https://www.chronicle.com/article/Make-Math-a-Gateway-Not-a/65056
Google Scholar
Calcagno, J. C., Crosta, P., Bailey, T., Jenkins, D. (2007). Stepping stones to a degree: The impact of enrollment pathways and milestones on community college student outcomes. Research in Higher Education, 48(7), 775801. Retrieved from https://doi.org/10.1007/s11162-007-9053-8
Google Scholar
Caliendo, M., Kopeinig, S. (2008). Some practical guidance for the implementation of propensity score matching. Journal of Economic Surveys, 22(1), 3172. Retrieved from https://doi.org/10.1111/j.1467-6419.2007.00527.x
Google Scholar
Carlson, K. A., Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education, 19(1), 123. Retrieved from https://doi.org/10.1080/10691898.2011.11889596
Google Scholar
Cho, S.-W., Karp, M. (2013). Student success courses in the community college: Early enrollment and educational outcomes. Community College Review, 41(1), 86103. Retrieved from https://doi.org/10.1177/0091552112472227
Google Scholar
Clotfelter, C. T., Ladd, H. F., Muschkin, C., Vigdor, J. L. (2015). Developmental education in North Carolina community colleges. Educational Evaluation and Policy Analysis, 37(3), 354375. Retrieved from https://doi.org/10.3102/0162373714547267
Google Scholar
Dadgar, M. (2012). Essays on the economics of community college students' academic and labor market success. Doctoral dissertation, Teachers College, Columbia University, New York, NY. Retrieved from https://academiccommons.columbia.edu/doi/10.7916/D83B6696
Google Scholar
Dana Center . (2013). Curriculum design standards. Retrieved from http://www.utdanacenter.org/wp-content/uploads/NMP_curriculum_design_standards_Sept2013.pdf
Google Scholar
Dana Center . (2017a). Dana Center terminology for college mathematics reform. Retrieved from http://www.utdanacenter.org/wp-content/uploads/Dana-Center-Terminology_2013.pdf
Google Scholar
Dana Center . (2017b). Frameworks for mathematics and collegiate learning course. Retrieved from http://www.utdanacenter.org/frameworks-for-mathematics-and-collegiate-learning/
Google Scholar
Deil-Amen, R., Rosenbaum, J. E. (2002). The unintended consequences of stigma-free remediation. Sociology of Education, 75(3), 249268. Retrieved from https://doi.org/10.2307/3090268
Google Scholar
Edgecombe, N. (2011). Accelerating the academic achievement of students referred to developmental education (Assessment of Evidence Series). New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/accelerating-academic-achievement-students.pdf
Google Scholar
Edgecombe, N., Cormier, M. S., Bickerstaff, S., Barragan, M. (2013). Strengthening developmental education reforms: Evidence on implementation efforts from the scaling innovation project. CCRC Working Paper No. 61. New York, NY: Community College Research Center. Retrieved from https://files.eric.ed.gov/fulltext/ED565656.pdf
Google Scholar
Edgecombe, N., Jaggars, S. S., Baker, E. D., Bailey, T. (2013). Acceleration through a holistic support model: An implementation and outcomes analysis of [email protected] CCD. New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/acceleration-through-holistic-support-model.pdf
Google Scholar
Epper, R. M., Baker, E. D. (2009). Technology solutions for developmental math: An overview of current and emerging practices. Journal of Developmental Education, 26(2), 423.
Google Scholar
Feldman, M. J. (1993). Factors associated with one-year retention in a community college. Research in Higher Education, 34(4), 503–512. Retrieved from https://doi.org/10.1007/BF00991857
Google Scholar
Fike, D. S., Fike, R. (2008). Predictors of first-year student retention in the community college. Community College Review, 36(2), 68–88. Retrieved from https://doi.org/10.1177/0091552108320222
Google Scholar
Goldrick-Rab, S. (2007). Promoting academic momentum at community colleges: Challenges and opportunities. Working Paper No. 5. New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/academic-momentum-community-colleges.pdf
Google Scholar
Goldrick-Rab, S., Han, S. W. (2011). Accounting for socioeconomic differences in delaying the transition to college. Review of Higher Education, 34(3), 423445. Retrieved from https://doi.org/10.1353/rhe.2011.0013
Google Scholar
Goldstein, L. B., Burke, B. L., Getz, A., Kennedy, P. A. (2011). Ideas in practice: Collaborative problem-based learning in intermediate algebra. Journal of Developmental Education, 35(1), 2630.
Google Scholar
Grubb, W. N. (2010, September). The quandaries of basic skills in community colleges: Views from the classroom (NCPR working paper). Paper presented at the National Center for Postsecondary Research developmental education conference, New York, NY. Retrieved from https://files.eric.ed.gov/fulltext/ED533875.pdf
Grubb, W. N., Worthen, H. (1999). Remedial/developmental education: The best and the worst. In Grubb, W. N. (Ed.), Honored but invisible: An inside look at teaching in community colleges (pp. 171–209). New York, NY: Routledge.
Hawley, T. H., Harris, T. A. (2005). Student characteristics related to persistence for first-year community college students. Journal of College Student Retention: Research, Theory & Practice, 7(1), 117–142. Retrieved from https://doi.org/10.2190/E99D-V4NT-71VF-83DC
Ho, D. E., Imai, K., King, G., Stuart, E. A. (2007). Matching as nonparametric preprocessing for reducing model dependence in parametric causal inference. Political Analysis, 15(3), 199–236. Retrieved from https://doi.org/10.1093/pan/mpl013
Hodara, M. (2011). Reforming mathematics classroom pedagogy: Evidence-based findings and recommendations for the developmental math classroom. CCRC Working Paper No. 27. New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/reforming-mathematics-classroom-pedagogy.pdf
Hodara, M., Jaggars, S. S. (2014). An examination of the impact of accelerating community college students' progression through developmental education. Journal of Higher Education, 85(2), 246–276. Retrieved from https://doi.org/10.1353/jhe.2014.0006
Ichino, A., Mealli, F., Nannicini, T. (2008). From temporary help jobs to permanent employment: What can we learn from matching estimators and their sensitivity? Journal of Applied Econometrics, 23(3), 305–327. Retrieved from https://doi.org/10.1002/jae.998
Jaggars, S. S., Hodara, M. (2011). The opposing forces that shape developmental education: Assessment, placement, and progression at CUNY Community Colleges. Community College Journal of Research and Practice, 7, 575–579. Retrieved from https://doi.org/10.1080/10668926.2012.716754
Jaggars, S. S., Hodara, M., Cho, S.-W., Xu, D. (2015). Three accelerated developmental education programs: Features, student outcomes, and implications. Community College Review, 43(1), 3–26. Retrieved from https://doi.org/10.1177/0091552114551752
Jann, B. (2017). Kernel matching with automatic bandwidth selection. London, UK: Stata. Retrieved from https://www.stata.com/meeting/uk17/slides/uk17_Jann.pdf
Jenkins, D., Bailey, T. R. (2017, February). Early momentum metrics: Why they matter for higher education reform. CCRC Research Brief Number 65. New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/early-momentum-metrics-college-improvement.pdf
Jones, S., Elston, D. (2014, August 15). Game changers. Austin, TX: Complete College America.
King, G., Nielsen, R. (2016). Why propensity scores should not be used for matching. Working paper. Cambridge, MA: Author. Retrieved from https://gking.harvard.edu/files/gking/files/psnot.pdf
Leppel, K. (2002). Similarities and differences in the college persistence of men and women. Review of Higher Education, 25(4), 433–450. Retrieved from https://doi.org/10.1353/rhe.2002.0021
Leppel, K., Williams, M. L., Waldauer, C. (2001). The impact of parental occupation and socioeconomic status on choice of college major. Journal of Family and Economic Issues, 22(3), 373–394. Retrieved from https://doi.org/10.1023/A:1012716828901
Logue, A. W., Watanabe, M., Douglas, D. (2016). Should students assessed as needing remedial mathematics take college-level quantitative courses instead? A randomized controlled trial. Educational Evaluation and Policy Analysis, 38(3), 578–598. Retrieved from https://doi.org/10.3102/0162373716649056
Martorell, P., McFarlin, I. (2011). Help or hindrance? The effects of college remediation on academic and labor market outcomes. Review of Economics and Statistics, 93(2), 436–454. Retrieved from https://doi.org/10.1162/REST_a_00098
Marzinsky, M. (2002). Teaching and curricular practices contributing to success in gateway courses for freshman and sophomore students in math, science, engineering, and technology (MSTE) majors at a large public research university: A longitudinal study. Doctoral dissertation, University of Arizona, Tucson. Retrieved from http://arizona.openrepository.com/arizona/handle/10150/280026
Melguizo, T., Bos, J. M., Ngo, F., Mills, N., Prather, G. (2016). Using a regression discontinuity design to estimate the impact of placement decisions in developmental math. Research in Higher Education, 57(2), 123–151. Retrieved from https://doi.org/10.1007/s11162-015-9382-y
Monaghan, D. B., Attewell, P. (2015). The community college route to the bachelor's degree. Educational Evaluation and Policy Analysis, 37(1), 70–91. Retrieved from https://doi.org/10.3102/0162373714521865
Morgan, S., Winship, C. (2007). Counterfactuals and causal inference: Methods and principles for social research. New York, NY: Cambridge University Press.
Ngo, F., Kwon, W. W. (2015). Using multiple measures to make math placement decisions: Implications for access and success in community colleges. Research in Higher Education, 56(5), 442–470. Retrieved from https://doi.org/10.1007/s11162-014-9352-9
Ngo, F., Melguizo, T. (2016). How can placement policy improve math remediation outcomes? Evidence from experimentation in community colleges. Educational Evaluation and Policy Analysis, 38(1), 171–196. Retrieved from https://doi.org/10.3102/0162373715603504
O'Gara, L., Mechur Karp, M., Hughes, K. L. (2009). Student success courses in the community college: An exploratory study of student perspectives. Community College Review, 36(3), 195–218. Retrieved from https://doi.org/10.1177/0091552108327186
Prince, D., Jenkins, D. (2005). Building pathways to success for low-skill adult students: Lessons for community college policy and practice from a statewide longitudinal tracking study. New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/pathways-success-low-skill-adult.pdf
Radford, A., Horn, L. (2012). Web tables: An overview of classes taken and credits earned by beginning postsecondary students. Washington, DC: U.S. Department of Education. Retrieved from https://nces.ed.gov/pubs2013/2013151rev.pdf
Roksa, J., Jenkins, D., Jaggars, S. S., Zeidenberg, M., Cho, S. (2009, November). Strategies for promoting gatekeeper success among students needing remediation: Research report for the Virginia Community College System. New York, NY: Community College Research Center. Retrieved from https://files.eric.ed.gov/fulltext/ED507392.pdf
Rosenbaum, P. R., Rubin, D. B. (1983). The central role of the propensity score in observational studies for causal effects. Biometrika, 70(1), 41–55. Retrieved from https://doi.org/10.1093/biomet/70.1.41
Schudde, L. (2019). Short- and long-term impacts of engagement experiences with faculty and peers at community colleges. Review of Higher Education, 42(2), 385–396. Retrieved from https://doi.org/10.1353/rhe.2019.0001
Scott-Clayton, J. (2011). The shapeless river: Does a lack of structure inhibit students' progress at community college? New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/media/k2/attachments/shapeless-river.pdf
Scott-Clayton, J. (2012). Do high stakes placement exams predict college success? CCRC Working Paper No. 41. New York, NY: Community College Research Center. Retrieved from https://files.eric.ed.gov/fulltext/ED529866.pdf
Scott-Clayton, J., Crosta, P. M., Belfield, C. R. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371–393. Retrieved from https://doi.org/10.3102/0162373713517935
Scott-Clayton, J., Rodriguez, O. (2015). Development, discouragement, or diversion: New evidence on the effects of college remediation policy. Education Finance and Policy, 10(1), 4–45. Retrieved from https://doi.org/10.1162/EDFP_a_00150
Scott-Clayton, J., Schudde, L. (2016). Performance standards in need-based student aid. NBER Working Paper No. 22713. Cambridge, MA: National Bureau of Economic Research. Retrieved from http://www.nber.org/papers/w22713
Stigler, J. W., Givvin, K. B., Thompson, B. J. (2010). What community college developmental mathematics students understand about mathematics. MathAMATYC Educator, 1(3), 4–16. Retrieved from https://eric.ed.gov/?id=EJ890224
Texas Higher Education Coordinating Board. (2014). TSI operational plan for serving lower-skilled learners. Austin, TX: Author.
Texas Higher Education Coordinating Board. (2016). Texas public higher education almanac: A profile of state and institutional performance and characteristics. Austin, TX: Author. Retrieved from http://www.thecb.state.tx.us/reports/PDF/7831.PDF
Verhovsek, E., Striplin, T. (2003). Problem based learning: Applications for college mathematics and allied health. Mathematics and Computer Education, 37(3), 381–387.
Weisburst, E., Daugherty, L., Miller, T., Martorell, P., Cossairt, J. (2016). Innovative pathways through developmental education and postsecondary success: An examination of developmental math interventions across Texas. Journal of Higher Education, 88(2), 183–209. Retrieved from https://doi.org/10.1080/00221546.2016.1243956
Xu, D., Dadgar, M. (2018). How effective are community college remedial math courses for students with the lowest math skills? Community College Review, 46(1), 62–81. Retrieved from https://doi.org/10.1177/0091552117743789
Zachry Rutschow, E. (2018, July). Making it through: Interim findings on developmental students' progress to college math with the Dana Center Mathematics Pathways. Center for the Analysis of Postsecondary Readiness Research Brief. New York, NY: Community College Research Center. Retrieved from https://postsecondaryreadiness.org/interim-findings-dana-center-mathematics-pathways/
Zachry Rutschow, E., Diamond, J. (2015, April). Laying the foundations: Early findings from the New Mathways Project. New York, NY: MDRC. Retrieved from http://www.mdrc.org/sites/default/files/New_Mathways_FR.pdf
Zachry Rutschow, E., Diamond, J., Serna-Wallender, E. (2017, May). Math in the real world: Early findings from a study of the Dana Center Mathematics Pathways. Center for the Analysis of Postsecondary Readiness Research Brief. New York, NY: Community College Research Center. Retrieved from https://postsecondaryreadiness.org/wp-content/uploads/2017/05/dcmp-math-real-world.pdf
Zafar, B. (2013). College major choice and the gender gap. Journal of Human Resources, 48(3), 545–595. Retrieved from https://doi.org/10.1353/jhr.2013.0022
Zeidenberg, M., Jenkins, D., Calcagno, J. C. (2007). Do student success courses actually help community college students succeed? CCRC Brief Number 36. New York, NY: Community College Research Center. Retrieved from https://files.eric.ed.gov/fulltext/ED499357.pdf

LAUREN SCHUDDE is an assistant professor of educational leadership and policy at the University of Texas at Austin and a research affiliate of the Population Research Center. Her research examines the impact of educational policies and practices on college student outcomes, with a particular interest in broad-access postsecondary institutions.

KATHERINE KEISLER is a doctoral student in economics at the University of Texas. Her work focuses on public finance, health, and education, with particular attention to policy outcomes.

Source: https://journals.sagepub.com/doi/10.1177/2332858419829435
