When the Teacher doesn’t know the answer… Oxygen and More

This past week I went into my two classes with more questions than answers for my students. See my last posting to read about Innovation Lab; this one is about Science Magic. I entered class with questions and no answers, which is quite unlike me – I like to test everything beforehand.  But this is how science works – it happens quickly and in the real world – so sometimes middle-of-the-night simulations in a kitchen just won't do.  This week we manipulated the pond water to see if we could increase its oxygen, and we analyzed data from our Eco-Project vote. There is still time to vote!

The first question was about the pond’s water quality.  In the Fall of 2013 we tested the pond in Forestville’s atrium and found that it was a bit low on dissolved oxygen.  We ended that session by brainstorming ideas for increasing oxygen in the pond.  This week we brought one of those ideas to life and tested it: whether a bubbler (like those used in home fish tanks) would work in the pond.  I brought in the apparatus for a bubbler – air pump, tubing, air stones – and the students figured out how to set it up.

Our pond water experiment setup

We then started the experiment, discussing aspects of the scientific method.  We discussed how scientists must always have a control, and how the control and the experiment must be exactly the same except for the variable being tested. So we collected two 1000 ml samples of pond water, measured in a beaker, and put them in two bowls of the same size right next to each other. We used a digital thermometer to take the temperature of each sample and the GREEN Water Quality kit to test dissolved oxygen levels. We then placed the bubbler in one sample and let it run for about 30 minutes.

The concept of keeping everything the same was a hard one for the students to remember.  In the end the samples were definitely treated differently – more fingers put in the first one, more splashing during collection of the second – but it’s a lesson that needs to start early. I introduced the idea of uncertainty: by sticking fingers into the water along with the thermometer, were students taking the temperature of the water or of their fingers?  Teaching these concepts reminded me of things that scientists take for granted, but which are not yet intuitive for our young students.

Data analysis for the younger students – on paper
Spreadsheet analysis

While we waited for our experiment to run, we analyzed results from our Eco-Project vote.  One older student worked at the computer for a bit, experimenting with spreadsheets; others used printed data sheets. My first question was “Which project won?” We counted votes for the 5 projects – this actually took quite a bit of checking and rechecking, as it was easy to miss data entries – and got the count for each project.  So far – with votes still coming in – it looks like the winner is “Planting more plants.”

However, we didn’t stop our data analysis there.  I then asked the students whether they could trust the data – for instance, had teachers swayed any of the results? We analyzed the votes from one class (we recorded first names and teachers for each voter), creating a bar chart of that class’s votes.  What we found was a spread of votes in the class – every idea had at least one vote and there was no particular leaning – so we decided that we trusted our data.

Finally, we looked at the percentage of votes each project received.  The student using the computer had worked out the percentages and had even created a pie chart to show the relative preferences. When the students saw this, they realized that the “winning” project was ahead by only a hair, and that really, more people wanted something else.  We decided that our best way of dealing with that information was to encourage more votes, ideally so one option would take a stronger lead.
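For anyone who wants to try the same tally-and-percentage math at home (on paper or in a spreadsheet), here is a rough sketch of the steps in Python. The vote counts below are made-up numbers for illustration only, not our class’s actual results:

```python
# Sketch of the vote tally and percentage calculation we did in class.
# These counts are invented example numbers, NOT our real voting data.
votes = {
    "Planting more plants": 12,
    "Composting": 10,
    "Recycling drive": 9,
    "Pond cleanup": 8,
    "Bird feeders": 7,
}

# Total up all the ballots, then work out each project's share.
total = sum(votes.values())
for project, count in votes.items():
    share = 100 * count / total
    print(f"{project}: {count} votes ({share:.1f}%)")

# The "winner" is just the project with the most votes...
winner = max(votes, key=votes.get)
print("Current leader:", winner)
```

Notice that with numbers like these the leader has only about a quarter of the total – the same “ahead by only a hair” situation the pie chart showed us.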

Initial results. Voting is ongoing.

Just as class was ending we pulled out the bubbler, took the temperature of each sample, and took sub-samples for follow-up dissolved oxygen tests.  Prior to the experiment, both samples had tested at 4 ppm (parts per million) dissolved oxygen, and the light orange color in each tube was the same. The after tests appear to show a tiny difference in color, with the sample from the bubbler seeming a bit darker (which would indicate more oxygen).  But are my eyes simply playing a trick on me because I want the second one to be darker? See the pictures below to judge for yourself.  The students will analyze the data and calculate percentage saturation next week.


Left - none. Right - bubbler.
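For curious readers: percentage saturation compares the oxygen we measured to the maximum amount the water could hold at that temperature, which comes from a standard solubility table. A minimal sketch of next week’s calculation, assuming a water temperature of about 20 °C (the tabulated saturation value at 20 °C is roughly 9.1 ppm for fresh water; our actual temperature readings may differ):

```python
# Percent saturation = measured DO / saturation DO at that temperature * 100.
# The saturation value comes from a standard freshwater solubility table;
# ~9.1 ppm at 20 degrees C is the commonly tabulated figure. The temperature
# here is an assumption for illustration, not our measured value.
measured_do_ppm = 4.0      # our before-test reading from the GREEN kit
saturation_do_ppm = 9.1    # table value for fresh water at ~20 degrees C

percent_saturation = 100 * measured_do_ppm / saturation_do_ppm
print(f"{percent_saturation:.0f}% saturation")
```

At those assumed numbers the pond sits well under half of full saturation, which is why we were trying to raise it in the first place.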

In the case of this week’s lesson, I can’t say I was pleased with my own approach to class.  I had not analyzed the data before class, and I had no idea whether the bubbler would actually work under the conditions (time, pond water) that we had.  Of course I was armed with the scientific theory behind it all – approaches to data analysis, motion and surface area facilitating oxygen diffusion – but theory doesn’t work so well with young students.  When they are young, I think some of the nuance of science goes over their heads.  So it’ll be back to the drawing board for this lesson.

Stay tuned for the final results of project voting and the dissolved oxygen experiment.

Ms. Anu

P.S. Thanks to my students and iSchool Founder Nuria for the kind Teacher Appreciation card and gift!
