Launching a new data initiative within our CT program

As we kick off our second year of Computational Thinking at Excel, we’ve decided to be more data-driven going forward. Last year focused on understanding how best to integrate CT-rich curriculum into classrooms, how to get students hooked, and how to ensure we were working towards our program goal of empowering all students to see themselves as computational thinkers. Subjectively, it all felt pretty successful – we had some great lessons, students seemed to enjoy the projects, and we attracted some national attention. This year, we’re looking to prove our success.

We have two goals we’re focused on. The first is to show that students are demonstrating knowledge transfer by connecting computer science concepts to other disciplines. This happens when a student proposes using an algorithm to help write a poem, or when they can identify when and how they used pattern recognition to analyze lab experiment results. The second, more audacious goal is to show that integrating computational thinking into core disciplines increases student mastery of targeted standards. That is, we want to show that this program can and should scale to the broader K-12 education world – it directly increases measurable student success where teachers want it most.

Now, I’ll be the first to admit that our results will not be statistically significant. We have two hundred students at a single school – our goal is not to demonstrate scientifically rigorous results. However, there are a lot of very smart people working in this space, and we hope to contribute something worthwhile to the conversation in our methods and in our findings.

To that end, we will be experimenting with a wide array of data collection methods, all of which will be documented here. We began by capturing baseline data in a short survey taken by all students, with initial results presented below. Similar surveys will be administered throughout the year to identify longitudinal trends across demographics and grades. Computational thinking questions will also show up on classroom exit tickets, measuring how students connect these concepts with what they’re learning on a daily basis. Finally (for now), specific questions on our quarterly interim assessments will be coded to computational thinking goals, allowing us to measure student mastery in a more formal setting.
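To make the longitudinal piece concrete, here’s a minimal sketch of how we might track a single survey item across waves and grades. Everything in it is an assumption for illustration – the file name, the column names, the item name, and the 1-5 agreement coding are not our actual schema.

```python
import pandas as pd

# Hypothetical long-format survey file: one row per student, per item, per wave.
# Assumed columns: student_id, wave, grade, item, agreement (coded 1-5).
responses = pd.read_csv("ct_surveys.csv")

# Mean agreement for one (hypothetical) item, per wave and grade,
# to eyeball longitudinal trends as more survey waves come in.
trend = (
    responses[responses["item"] == "sees_self_as_computational_thinker"]
    .groupby(["wave", "grade"])["agreement"]
    .mean()
    .unstack("grade")
)
print(trend.round(2))
```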

But let’s look at some baseline data, shall we? The very first thing each student did in their computer science classes was take a short survey. This came before we had even introduced the name of the class – we wanted to know what they knew coming in. Students were asked to define “computer science” and “computational thinking”, and then to respond to a series of statements on a Likert scale, quantifying how much they agreed with each.
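For anyone following along at home, the only real transformation on the Likert items is mapping labels to numbers. A minimal sketch, assuming a standard five-point scale (our survey’s exact wording may differ):

```python
# Assumed five-point scale; the survey's actual labels may differ.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def code_response(label: str) -> int:
    """Convert a Likert label to its numeric code; raises KeyError on unknown labels."""
    return LIKERT[label.strip()]
```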

[Chart: student definitions of “computer science”, coded by theme]

The open-response questions were coded to shared themes. Above, most “computer science” definitions fell into either coding / programming, or a general knowledge of how computers work – the science behind computers, technology awareness, etc. Thirty students wrote simply “science on computers”, and some talked about basic computer skills like typing and searching the internet. A handful connected computer science with computational thinking vocabulary (algorithms, decomposition) or with problem solving in general.
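We coded these responses to themes ourselves, but a simple keyword pass can serve as a first triage before human review. In the sketch below, the theme names mirror the chart above, while the keyword lists and helper function are illustrative assumptions:

```python
# Illustrative keyword lists for a first-pass triage; real coding needs human review.
THEMES = {
    "coding / programming": ["code", "coding", "program"],
    "how computers work": ["how computer", "hardware", "science behind"],
    "basic computer skills": ["typing", "search", "internet"],
    "CT vocabulary": ["algorithm", "decompos", "pattern"],
    "problem solving": ["problem", "solve"],
}

def tag_definition(text: str) -> list[str]:
    """Return every theme whose keywords appear in a student's definition."""
    lowered = text.lower()
    matches = [theme for theme, words in THEMES.items()
               if any(w in lowered for w in words)]
    return matches or ["unsure / other"]

print(tag_definition("Computer science is writing code to solve problems"))
# -> ['coding / programming', 'problem solving']
```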

[Chart: student definitions of “computational thinking”, coded by theme]

Unsurprisingly, the largest set of students weren’t sure what “computational thinking” means, or defined it as something as simple as “computer thinking”. This should be pretty straightforward to improve over time! Many students thought it meant to think like a computer, and some assumed it meant thinking about computers. More than thirty students, however, correctly identified that it was linked with problem solving or with thinking “really really hard” about something. Encouragingly, we also had students who, even after a summer vacation, were able to recall words like “algorithm” and “decomposition”, or at least describe thinking in a series of steps or breaking down a large problem.

The remainder of our data comes from a series of statements that students ranked their agreement with. They have been combined in charts primarily for the sake of brevity, not direct comparison. As we go forward, we will slice and dice this data by demographics and by grades in core disciplines, but for now, these baselines are presented below without comment.

[Charts: baseline Likert-scale agreement with each statement, combined across five charts]
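As for the slicing mentioned above, the mechanics are straightforward once survey responses are joined to demographic and grade data. A minimal sketch, again with a hypothetical file, item, and column names:

```python
import pandas as pd

# Hypothetical joined table: one row per student per statement, with
# demographics and a 1-5 agreement code already attached.
df = pd.read_csv("responses_with_demographics.csv")

# Cross-tab of mean agreement by grade and gender for a single statement.
pivot = df[df["item"] == "enjoys_solving_hard_problems"].pivot_table(
    values="agreement", index="grade", columns="gender", aggfunc="mean"
)
print(pivot.round(2))
```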

If you’ve got ideas for further data collection methods or analysis techniques, let us know! We’re definitely looking for thought partners or more formal academic partnerships this year, as long as helping our students grow and succeed as computational thinkers remains the focus.
