
Re: [Ctrl-Shift] Your thoughts on testing


  • From: "Reese, George Clifford" <reese AT illinois.edu>
  • To: "sacrophyte AT gmail.com" <sacrophyte AT gmail.com>, "ctrl-shift AT lists.mste.illinois.edu" <ctrl-shift AT lists.mste.illinois.edu>
  • Subject: Re: [Ctrl-Shift] Your thoughts on testing
  • Date: Thu, 15 Jan 2015 22:05:32 +0000
  • Accept-language: en-US
  • List-archive: <https://lists.mste.illinois.edu/private/ctrl-shift>
  • List-id: Social discussion of CS in K-12 <ctrl-shift.lists.mste.illinois.edu>

Hi Charles and shifters,  

 

PARCC. There’s lots going on with this. The pilots have been going on for a while, and IL is one of the states that has bought in and is going forward.

Many people confuse the Common Core Standards with the testing around them. So let me say at the outset that I like the CCSS for Math. However, I, like many, am troubled by the PARCC rollout. There are a lot of challenges, particularly with the computer implementation. I will copy below some concerns from Marty Gartzman at the University of Chicago.

 

There are other concerns. You can see the video http://goo.gl/eSCqL3 about concerns with the implementation from one of the high schools: Evanston. The rollout at the high school takes time and energy that can be highly problematic for high school juniors who are also taking the ACT and AP tests.

 

I took some of the computer-based high school assessments and found the interface very frustrating. In general I liked the problems that were there, but not for a one-time, high-stakes, individual test. I think more can be done, and I am wary of the speed with which the assessments are being rolled out, as well as the agendas for teacher and school evaluation based on these unproven assessments.

 

Another perspective is available in this video presentation by Heather Brown and Kathy Felt at a recent ICTM conference: http://goo.gl/k8uDXx

They have some lessons learned for helping students as these tests are coming out.

My thoughts, so far: unfortunately, some of the lessons learned are just what people feared. If you want students to do well on the test, they need to practice the test, practice with the interface, get familiar with the multiple-response expectations, recognize that an answer in the calculator is not an answer in the computer test, not click outside the test, and so on. All that training takes away from instructional time.

 

But, ...other opinions are available.

 

George

 

+++Marty Gartzman email+++7/15/14

(Please distribute to other Illinois mathematics teachers who may be interested.) 

 

Dear ICTM colleagues:

 

With the recent, extensive field test of PARCC test items now completed and some sample items posted on the PARCC web site, teachers now have a better sense of the kinds of test items that PARCC will be asking.  (Practice items from the Grades 3-8 performance-based tests will be released in Fall 2014.) I raise here one concern about plans for the performance (open-response) items in the hope of generating some discussion among ICTM members regarding the limited ways that students are allowed to input their solution strategies for open-response test items.

 

Many PARCC items, but especially the open-response items that will be part of the performance-based test, require that students show or explain how they solved the problems.  That is a good thing.  However, to the best of my knowledge, the only options students have for showing how they solved the problems are a simple equation editor or a keyboard for typing in a written explanation.  That means that some of the most powerful tools that students typically use in mathematics to solve problems and to explain or justify a solution — graphs, tables, drawings, and other representations — are not available for students to use on the PARCC exams.

 

Restricting students to verbal explanations (typed on a keyboard) or equations (typed using an equation editor) robs students of some of the most common ways they solve problems.  If the intent of the test is to assess what students know about the mathematics in a task, the ability to use drawings, rough sketches, calculations (even partial), tables, lists, charts, counting schemes, graphs, and expressions, as well as equations, functions, and written explanations, is essential. Exam scorers also need to see that information in order to accurately assess students’ understanding.

 

Some teachers whose students participated in the PARCC field test reported that students generally solved the open-response (and other) problems using paper and pencil, then attempted to recreate their problem-solving process in a step-by-step, typed-in explanation, sometimes giving up before being able to fully transfer their answer.  In some cases, hand-written solutions were submitted to PARCC along with the answers submitted online, and the teachers noted significant differences between the online and hand-written solutions.

 

The potential undesired implications of not being able to use graphs, tables, and drawings to demonstrate mathematical knowledge are significant.  First, students’ understanding is likely to appear lower than their actual proficiency.  This is true for students who struggle, as well as advanced students. Schools may begin favoring only verbal or numerical explanations while downplaying other mathematically rich ways of solving problems and justifying solutions. One could also imagine a strong push, even in the earliest grades, to spend time in mathematics classes on keyboarding skills.  Furthermore, PARCC is likely to be offering both online and paper-and-pencil options for administering the test this year, in recognition that some schools may not have the requisite technology for a computer-based-only administration. Will students doing the computer-based test be at a disadvantage compared to those doing the paper-and-pencil version, where graphs, charts, and drawings might be used with the open-response items?  If we want to assess what students know (as opposed to what they don’t know), students need access to the full range of tools that can be used to provide evidence of their understanding.

 

That said, I have seen no public discussion of this issue among teachers whose students participated in the PARCC field test. Perhaps such concerns were expressed in direct feedback to PARCC, or perhaps most do not share this concern. Especially if your students participated in the PARCC field test, I encourage you to share any thoughts you may have about this on the ICTM listserv (without discussing any specifics related to individual test items). Do you view this as a problem? Was it a problem for your students in the field test?  Note that while PARCC has not yet indicated how much the open-response items will be weighted in a student's test score, earlier PARCC documents indicated that the performance-based exam could account for up to 40% of a student’s score. (Potentially, that is a good thing.)

 

If you prefer not to share your thoughts publicly, you can send them to me and I can submit them to the listserv without attribution.  

 

I understand that developing a high quality mathematics assessment for millions of students is an extremely complex and formidable task.  But PARCC promised us “a next generation assessment system” and we should expect nothing less.  Illinois is an important state for PARCC. If this is an issue that merits concern, there may still be time to influence the test development. If this concerns ICTM members, ICTM could advocate on behalf of mathematics students and teachers—to make sure that the results on next year’s exam accurately reflect the mathematical understanding and proficiency of Illinois students. 

 

Sincerely,

 

Marty Gartzman

----------------------------

Martin Gartzman, Executive Director

Center for Elementary Mathematics and Science Education

The University of Chicago 

1225 E. 60th Street

Chicago, IL  60637

(773) 834-0023

http://cemse.uchicago.edu/

+++/Marty Gartzman+++

 

 

+++A response by Kathy Felt to Marty’s concerns+++ 7/21/14

Dear Marty and ICTM colleagues,

 

Thank you for voicing your concerns and offering this conversation to others.  Sharing concerns in an effort to improve our new assessments is how beneficial change occurs.

  

I am happy to inform you that I discussed the issues presented here with PARCC officials last week at a PARCC Educator Leader Cadre meeting. 

 

Some of these concerns have come out in the analysis of the field test.  As you know, the primary purpose of the field test was to test the test.  While the analysis is not yet complete (remember that paper-and-pencil tests take a lot longer to score before an analysis can occur), there is evidence that many students need extra practice explaining their reasoning and also with the use of the equation editor in their responses.  (As I understand it, a tutorial is being created to assist students with using the equation editor function.)  Many students are not accustomed to using a computer for open-response items.  I would think they will gain experience with this during the upcoming years as computer-based tests become the norm.  On the field test, many students used paper and pencil on the open-response items and then tried to transfer the information to the computer.  This was cumbersome and difficult for many.  So, in the short term, it tells teachers that we need to practice with this type of activity in our classrooms and day-to-day activities in order for our students to fare better with their responses.

 

I believe, in the future, it would at least be helpful to include additional options on the toolbar of the computer-based test, such as a coordinate graph, a table template, etc., for students to use in their responses if that will help them demonstrate their reasoning.  PARCC officials said this should be doable in the long term, but it is unclear if it can be done prior to next year’s tests.

 

It is good that people are speaking up and also that we had a field test to see what does/does not work well. I am optimistic that students will do better every year as their experience with the online format improves. 

 

Expect a full report about the field test before too long.  One conclusion from the feedback was that students who took the test tutorial knew what to expect and had a much better experience than students who did not.  That should be a message to all teachers to make sure students participate in the tutorial.  Also, as mentioned earlier, students need more experience with the equation editor and with explaining their reasoning.  Additionally, it would be beneficial for students to practice with released sample items and practice tests to better hone their skills with Common Core-type items.

 

Please do realize that PARCC personnel are listening and want to make this experience the best that it can be.  We live in an environment of assessments.  That brings anxiety and concern by its very nature.  Hopefully, the computer-based tests will enhance the experience, not detract from it, and better determine our students’ true understanding.

 

Kathy Felt

Illinois Educator Leader Cadre

Former ICTM Board Chair and Director 5-8

Sherrard Junior High School

Western Illinois University-Quad Cities     

+++/Kathy Felt+++

 

 

From: ctrl-shift-bounces AT lists.mste.illinois.edu [mailto:ctrl-shift-bounces AT lists.mste.illinois.edu] On Behalf Of Charles Schultz
Sent: Thursday, January 15, 2015 3:25 PM
To: ctrl-shift AT lists.mste.illinois.edu
Subject: [Ctrl-Shift] Your thoughts on testing

 

Good day, CTRL-Shift! :)

 

Today I had an opportunity to sit in on a class that was "practicing" for a PARCC test. I will say right up front that based on my own personal experiences, I have a huge dislike for standardized testing. But it was very interesting watching a group of kids from an adult perspective.

 

I would like to start a discussion on PARCC specifically, on standardized testing in general, and on depending on a multiple-answer test to properly assess a student's academic achievement. When I observe kids doing code.org, the assessment is immediate, fun, and intuitive. That is not what I saw today. When I hear you all talk about collaborative spaces and computational thinking, I grow encouraged that education is moving forward in exciting ways. But how does PARCC testing fit into the domain of computational thinking?

 

I have a host of personal thoughts based on my past experience and my opportunity today. And I know Travis Faust has some strong opinions on this topic as well. *grin* I ask of the group, what do you think?

 

--

Charles Schultz



