Student growth percentile (SGP) data are an important tool for teachers and parents to see how students are performing in school. They show how students are progressing over time and help identify those who may need extra support to reach their academic potential. SGP data also let educators evaluate current educational programs and find ways to improve them.
Student growth percentiles (SGPs) describe a student's relative progress on MCAS compared to other students with similar prior test scores. They are calculated by comparing a student's current score against the scores of students with similar MCAS performance histories. For example, a student who grows at the 90th percentile made more progress than 90% of students with a similar score history, regardless of where either student started.
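The idea of ranking a student's growth against academic peers can be sketched in a few lines. This is an illustrative simplification, not the official SGP methodology (which uses quantile regression over multiple prior scores); the scores and the `growth_percentile` helper are hypothetical.

```python
# Illustrative sketch: estimate a growth percentile by ranking a
# student's current score among peers who had the same prior score.
# (The real SGP calculation uses quantile regression, not a simple rank.)

def growth_percentile(prior, current, peers):
    """peers: list of (prior_score, current_score) pairs.
    Compares `current` only against peers whose prior score matches
    `prior`, a stand-in for 'similar MCAS performance history'."""
    similar = [cur for (pri, cur) in peers if pri == prior]
    below = sum(1 for cur in similar if cur < current)
    return round(100 * below / len(similar))

# Hypothetical data: (prior MCAS score, current MCAS score)
peers = [(240, 235), (240, 238), (240, 242), (240, 245),
         (240, 248), (240, 250), (240, 252), (240, 255),
         (240, 258), (240, 260)]

# A student who started at 240 and scored 259 outgrew 9 of 10 peers.
print(growth_percentile(240, 259, peers))  # -> 90
```

The key point the sketch captures is that the comparison group is restricted to students with similar prior scores, so a high SGP reflects unusually strong growth, not a high absolute score.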
SGP analyses are done using a software program called R. R is available for Windows, macOS, and Linux and, because it is open source, can be compiled to run on just about any computer. Running SGP analyses requires familiarity with the R software environment; a good way to build that familiarity is to work through the various tutorials available online.
For SGP analyses to be most effective, one needs access to a dataset of student assessments containing both raw scores and SGPs. The minimal dataset for this purpose is sgpData, which provides five years of annual, vertically scaled, anonymized assessment data in WIDE format (one row per student). If you plan to run SGP analyses operationally year after year, however, it is better to use the sgptData_LONG dataset, which provides eight windows (three annually) of assessment data in LONG format (one row per student per test administration).
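The difference between the two layouts can be shown with a small reshape. This is a plain-Python sketch; the field names (`ID`, `SCORE_2021`, and so on) are hypothetical stand-ins, not the actual sgpData column names.

```python
# Sketch of WIDE vs. LONG assessment data layouts.
# WIDE: one row per student, one score column per year.
# LONG: one row per student per year -- easier to append to annually.

wide = [
    {"ID": "S1", "SCORE_2021": 240, "SCORE_2022": 251, "SCORE_2023": 259},
    {"ID": "S2", "SCORE_2021": 230, "SCORE_2022": 233, "SCORE_2023": 241},
]

def wide_to_long(rows):
    """Reshape WIDE records into LONG records."""
    long_rows = []
    for row in rows:
        for key, value in row.items():
            if key.startswith("SCORE_"):
                long_rows.append({"ID": row["ID"],
                                  "YEAR": int(key.split("_")[1]),
                                  "SCORE": value})
    return long_rows

for rec in wide_to_long(wide):
    print(rec)
```

The LONG layout suits operational use because each new testing window simply appends rows, whereas the WIDE layout requires adding a new column and revisiting every existing record.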
This dataset contains SGPs and SGP projections/trajectories for each student in the sample, along with a wide range of other statistical information. To obtain these data, you will need a subscription to the Massachusetts Comprehensive Assessment System (MCAS). To subscribe, please contact your local MCAS district office or the Massachusetts Department of Education directly.
The median SGP has been the main summary statistic for groups such as subgroups, classes, schools, and districts. However, mean SGPs align more closely with the Department's guiding philosophy that all students contribute to accountability results, and they are more meaningful than medians for comparing growth across groups. The change from median to mean SGPs is a natural transition as the Department prepares for the next-generation MCAS assessments.
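The practical difference between the two summary statistics is easy to see with a toy class. The SGP values below are hypothetical.

```python
# Toy comparison of median vs. mean SGP for a hypothetical class.
# The median reflects only the middle student; the mean moves when
# any student's growth changes, so every student "counts."

from statistics import mean, median

class_sgps = [10, 45, 50, 55, 95]  # hypothetical student SGPs

print(median(class_sgps))  # -> 50 (only the middle value matters)
print(mean(class_sgps))    # -> 51 (every student's SGP contributes)
```

If the lowest-growing student's SGP rose from 10 to 30, the median would not move at all, while the mean would rise, which is the behavior the Department's philosophy favors.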
A student's SGP is a key indicator of progress and can be a useful predictor of future achievement. However, current systems for measuring student progress fall short. SGPs are more accurate than value-added models (VAMs) at predicting student outcomes, but they do not fully capture the complexity of the learning process and fit awkwardly into existing accountability systems focused on test score measures. This is a key reason the Massachusetts Department of Education is moving toward a student growth paradigm in the new MCAS assessment system, a major policy shift for the state that will require additional effort to make the transition successful.