
Saturday, March 07, 2009

PDC Work - part 2

A few ideas I want to weave into the strategic stance I drafted in the last post.

1 - Sustained contact time on a single focus, spread over time. Consistent with a recent report titled "Professional Learning in the Learning Profession," our professional development center would emphasize choosing a focus and working with a school for at least 50 hours, spread out over 6-12 months.

2 - People on the ground have the capacity to invent their own solutions. This falls under our assets-based approach. However, there are so many specific elements to the assets-based approach that it warrants listing them out. The last post introduced the concept of "positive deviance," and now we have the belief in the capacity of people to invent their own solutions. More can also be written on the "strengths-based" movement, positive psychology, growth mindset, appreciative inquiry, and learned optimism.

3 - Building teams in this work is a high leverage point. More brains are better than one, and only different perspectives can really produce new knowledge.

4 - Whatever theory or concept we are working on, it must be grounded in the work produced at the site. Studying student work together or videotaping teacher practice provides the reality test when we are discussing more abstract concepts like differentiation, scaffolding, or project-based learning. It takes far more disciplined energy to keep returning to our own work than it does to have abstract debates about what works best for students. Our approach is more empirical.

Monday, June 26, 2006

46 graduates tomorrow, June 27, 2006

Depending on how the city calculates the cohort, our graduation rate falls somewhere between 67 and 75%. The New York Post recently reported a 39% graduation rate across city schools.

We have a lot of work to do to raise and meet internal standards. We need to graduate more. We need to know much sooner that our students are on a productive path.

However, with everything we have left to do, we accomplished an amazing feat for our first graduating class. I have turned down 3 jobs in the past two years (LIS, Director of New and Small Schools, and Baruch Faculty for the SAM program). Each time I said I needed to wait until my first graduating class. I did.

Now, I am moving on to become Director of the Professional Development Center, Eagle Rock School.

More later.

Sunday, April 10, 2005

Preliminary Report on Latest Data

Although this covers just our 11th graders, I believe it reflects school-wide progress.
Comparing students' projected credit this semester to what they earned last semester (probably our worst semester of "earnings"), we now see:

79 % projected to earn more credit than they did last semester
27 % earning just as much
2 % doing worse

Now compare how they are doing to their average earnings prior to this school year:
59 % projected to earn more than they had on average
2 % achieving their average earnings
40 % doing worse than their average

Finally, every student has a "burden." That's the number of credits they need to earn per remaining semester to get on track for a four-year graduation.

16 % projected to reduce their burden (none of these are among the undercredited 11th graders -- that is, all were already accelerated towards graduation)
3 % have the same burden
84 % have a higher burden - that is, they have to earn more credits per semester next year to reach graduation than they needed at the beginning of this semester or at the beginning of this year.
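
To make the three measures above concrete, here is a minimal sketch (in Python) of how they could be computed per student. The 44-credit diploma requirement, the 8-semester plan, and the field names are illustrative assumptions, not our actual figures.

DIPLOMA_CREDITS = 44      # assumed total credits required to graduate (illustrative)
TOTAL_SEMESTERS = 8       # four-year (eight-semester) plan

def burden(credits_earned, semesters_completed):
    """Credits needed per remaining semester to graduate in four years."""
    remaining = max(TOTAL_SEMESTERS - semesters_completed, 1)
    return max(DIPLOMA_CREDITS - credits_earned, 0) / remaining

def compare_student(prior_semesters, projected_now):
    """prior_semesters: credits earned in each completed semester before the current one
    (last entry = last semester). projected_now: credits projected for this semester."""
    last_semester = prior_semesters[-1]
    prior_average = sum(prior_semesters) / len(prior_semesters)
    burden_before = burden(sum(prior_semesters), len(prior_semesters))
    burden_after = burden(sum(prior_semesters) + projected_now, len(prior_semesters) + 1)
    return {
        "vs_last_semester": projected_now - last_semester,  # > 0: better than last semester
        "vs_prior_average": projected_now - prior_average,  # > 0: above their usual earnings
        "burden_change": burden_after - burden_before,      # > 0: burden increased
    }

# Example: a student who earned 3, 2, and 1 credits in prior semesters and is
# projected to earn 4 this semester still sees their burden rise (7.6 -> 8.5).
print(compare_student([3, 2, 1], 4))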

Michael

Friday, April 08, 2005

The Data Is In!

It has been 2 months and 8 days since the semester began.

We had intended to take actions to address the problem of the undercredited 11th graders. Our first action was to institute or increase conferencing with an emphasis on instruction. We would record conferencing events and measure the success by tracking progress towards credit achievement. The grids were to be used as ways to capture fractions of credit.

Finally, today we have a full set of credit achievement data for this subgroup. 2 months and 8 days seems to be the current delay in implementing an action cycle and receiving data on that action. At this rate we would not have data again until the end of the school year.

I want to collect this data three more times. How can we cut down the turnaround by a third?
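
A quick sketch of the arithmetic behind that question (the January 31 semester start is taken from the data posts below; the one-third cut is applied to calendar days):

from datetime import date

semester_start = date(2005, 1, 31)   # semester began January 31
data_received = date(2005, 4, 8)     # full credit data arrived today

turnaround_days = (data_received - semester_start).days   # 67 days ("2 months and 8 days")
target_days = round(turnaround_days * 2 / 3)              # cutting the turnaround by a third

print(turnaround_days, "->", target_days)   # 67 -> 45 days per action/data cycle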

Michael

Thursday, April 07, 2005

One Crew / One quarter / Zero Credits

As reported by crew leader #3 (credits earned from 1/05 through 4/1/05; projections are for June):

Student1, 10th grade: 0 credits earned
Student2, 10th grade: 0 credits
Student3, 10th grade: 0 credits
Student4, 11th grade: 0 credits; 5 projected for June
Student5, 10th grade: 0 credits
Student6, 11th grade: 0 credits; 4 projected for June
Student7, 11th grade: 0 credits; 5 projected for June
Student8, 11th grade: 0 credits; 2 projected for June
Student9, 9th grade: 0 credits
Student10, 11th grade: 0 credits; 4 projected for June
Student11, 10th grade: 0 credits
Student12, 9th grade: 0 credits
Student13, 11th grade: 0 credits; 5 projected for June
Student14, 10th grade: 0 credits

What do you make of this?
Interpretations? Hypotheses?

Does it mean conferencing does not lead to credit accumulation?
Is there some confusion about what we mean by “credits”?
Does it mean these students have accomplished nothing? Learned nothing?

This is what assignments 7 and 8 are all about: making and testing hypotheses.

Look back on the data previously reported by two other crews. Does it alter any thinking?

Michael

Wednesday, March 30, 2005

Push to collect data

Michael writes:

Hopefully this reinforces Al's previous post.

In February, staff were required to conference, maintain conferencing records and collect data on credit accumulation. To date, one-quarter through this final semester, only two staff members have provided data. One is on the leadership team and one is not. The fact that the leadership team is not getting this data together is a big problem.

The problem now, in late March, is the lack of data regarding student progress and its correlation to conferencing. We have to look at how we’re doing and ground that in data. We can’t do that if the data isn’t collected and provided. Please do that.

What I have now is limited to two crews. This data is based on progress of students between January 31st and March 28th.

Crew #1
5 undercredited 11th graders

Past Record (data from Semester 1)
5/5 earned less credit than their average earning power.
5/5 increased their burden (have more to do per semester than in the past)
1/5 fell off track (previously on track)

Based on 20 class days since January 31st (data from 1/31 through 3/28)

5/5 are earning more credit than they did last semester
5/5 are earning more credit than their past average earning power
3/5 increased their burden
2/5 meeting or decreasing burden – moving from off-track to on-track

Correlation to conferencing: unknown. No conferencing statistics provided.

Crew #2
Information was provided in aggregate form: overall, 14 students earned a total of 8 credits in the first 20 days. It was not broken out per student, although the range is 0-2 credits per student over those 20 days.

When these aggregate earnings are compared to the aggregate earnings of last semester’s undercredited 11th graders, we see no significant difference. That is, students earned an average of 2.25 credits last semester, and projecting from this current report, they will earn 2.42 credits per student this semester.

Every student in this group has had at least one conference. Each student has experienced 1-3 conferences.
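
For transparency, here is the projection behind the 2.42 figure above, as a small sketch. The number of class days in the semester is not recorded on the grids; roughly 85 class days is an assumption that reproduces the reported projection.

students = 14
credits_in_first_20_days = 8
class_days_so_far = 20
class_days_in_semester = 85   # assumption, not from the grids

per_student_so_far = credits_in_first_20_days / students   # about 0.57 credits per student
projected_per_student = per_student_so_far * class_days_in_semester / class_days_so_far
print(round(projected_per_student, 2))   # about 2.43, in line with the 2.42 reported above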

Friday, March 25, 2005

Checking in on conferencing & grids

On March 21, Marc reported the following status on use of grids amongst Explorations crew leaders:

3 crew leaders - sending grids this afternoon
1 crew leader - no updates on grids; offering learning plans instead
2 crew leaders - no updates on grids, since they are using binders to chart progress
4 crew leaders - no updates on grids, nothing offered in their place
2 crew leaders - absent

I sent out the following email on March 25th in response.

It's clear we are still struggling with collecting results on conferencing and using the grids for record keeping. You each took responsibility for checking in on conferencing with a smaller group of teachers. Please add to that task a check that those teachers are actively using the grids on a weekly basis, and show them how.

What I'm interested in now is the extent to which you have done these practices within your own crew. I would like each of you to report your status on each question:

1. Does each person in your crew have a learning plan for this quarter (whether it's foundations or explorations)?
2. How many conferences have you had with your crew members since this action was first initiated? Only count those that you recorded in your conferencing notebook.
3. What is the total credit accumulation according to your grids for your crew since January 31? Report this number as average number of credits per student (so that's a single number) and a range from lowest accumulator to highest accumulator. You would get these numbers by calculating partial credits as indicated on your grids.

So, possible responses might look like

SAMPLE
Crew of 10 students
1. 6 have learning plans.
2. I have done 1 conference each with 3 students (total of 3 conferences recorded in the manner it was presented in PD...one where you taught something).
3. On average, my students have earned 0.25 credits per student. The range is from 0 to 1. (This means you have grids that show this.)

That's it. Please do this as soon as possible.
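
If it helps, here is a minimal sketch of the item-3 calculation, assuming each grid boils down to a list of partial credits per student. The student names and numbers below are made up for illustration; they are not a prescribed format.

# Partial credits accumulated since January 31, as read off one crew's grid.
grid_credits = {
    "StudentA": 0.25,
    "StudentB": 0.0,
    "StudentC": 1.0,
    "StudentD": 0.5,
}

average = sum(grid_credits.values()) / len(grid_credits)
lowest, highest = min(grid_credits.values()), max(grid_credits.values())

print(f"Average credits per student: {average:.2f}")   # 0.44
print(f"Range: {lowest} to {highest}")                 # 0.0 to 1.0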

Michael

Thursday, February 24, 2005

Data Analysis

FROM MICHAEL:

I looked at the credits earned in January and analyzed the credit accumulation for all our 11th graders. Some of you may respond that the data is not entirely accurate, and that is true. However, in my opinion it is accurate enough to support the general conclusions below. Although we should still be working on accuracy, for the purposes of looking at our 11th graders this data is good enough. To stay off the radar screen as a school in need of improvement, we would have to be wrong about roughly 40 of our students, each by a semester’s worth of credit. If we are wrong about fewer than 40 students, or by less than a semester’s worth of credit, it is not going to make any big difference. Keep that in mind when we are talking about 3 or 4 kids with significant credit discrepancies, or even dozens who are off by a few credits; it really won’t matter much.


I looked at three things.

+++On Track Students: Did the students in your crew get on track, fall off track, or stay where they were compared to how they started in September?

Not one student who was “off track” got “on track.” Even if they did better this semester than in the past, they were still so far behind that it was not enough to get them on track (out of 65 students, only 12 actually earned more in January than they usually do per semester).

8/12 crews saw no change in this category and “did no harm.” However, 4 crews had a student who was previously “on track” fall “off track.”

+++Earning Power: How many credits did a student earn in January compared to their average credit earnings per semester? Did they earn less, as much, or more than they had in the past?

12 students earned more credit in January than they usually do. These students are spread out over 7 crews.

50 students earned less credit in January than they usually do, and in 5 crews every student earned less than his or her own average.

+++Burden: Every student – even those on track – has a certain number of credits they need to earn per future semester in order to graduate on time. Did these burdens increase, decrease, or stay stable?

59 students saw their burden increase. That is, they fell more behind than they were in September.

7 crews had every one of their students see their burdens increase. That is, all these students are in a worse position than they were in September.

4 crews each had a single student reduce their burden. 1 crew had one student stay stable.

So, as a school we lost more kids to off-track status, increased the burden of 90% of the students, and saw 77% of the students earn less credit than they had in previous semesters.
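
For anyone who wants to re-run or challenge these tallies, here is a minimal sketch of how the three categories could be computed from per-student records. The field names and the two sample rows are illustrative assumptions, not our actual export.

# Each record: crew number, on-track status in September and now, credits earned in
# January, average credits per prior semester, and burden in September vs. now.
records = [
    {"crew": 1, "on_track_sept": False, "on_track_now": False,
     "credits_jan": 1.5, "avg_prior": 2.25, "burden_sept": 5.0, "burden_now": 5.5},
    {"crew": 2, "on_track_sept": True, "on_track_now": False,
     "credits_jan": 2.0, "avg_prior": 3.0, "burden_sept": 4.0, "burden_now": 4.5},
]

total = len(records)
fell_off_track = sum(r["on_track_sept"] and not r["on_track_now"] for r in records)
earned_more = sum(r["credits_jan"] > r["avg_prior"] for r in records)
earned_less = sum(r["credits_jan"] < r["avg_prior"] for r in records)
burden_up = sum(r["burden_now"] > r["burden_sept"] for r in records)

print(f"Fell off track: {fell_off_track}/{total}")
print(f"Earned more than their average: {earned_more}/{total} ({earned_more/total:.0%})")
print(f"Earned less than their average: {earned_less}/{total} ({earned_less/total:.0%})")
print(f"Burden increased: {burden_up}/{total} ({burden_up/total:.0%})")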