Monday, August 4, 2014

How many links does it take?

How many links does it take to go from one part of science to another?  To be a little more concrete, how many steps do you have to take to get from a paper on exercise physiology to a paper on black holes?

This was the question my son and I discussed one Sunday night.  It arose because I'd suggested PubMed as a good place for him to get information about exercise (what's good, or not, for you).  PubMed is a great resource.  At least the abstract of every paper (within some range of biology) is available there.  If you want to know how much protein is too much, and just why that's too much (my last use of it), they've got the research.  Now, PubMed works great for me.  I go in, find what I'm looking for, and get out.

My son, however, has the problem that I do with research in my field.  Namely, in reading the first paper on a topic, he sees how it references several others that are also very interesting.  So read one, find 3 more that have to be read.  (I'm being conservative here.)  Read those three, and each shows you three more that are also very interesting.  So now we have nine to read.  And so on. 
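
The arithmetic of that reading list runs away quickly.  A minimal sketch, using the conservative branching factor of 3 from above (the function name and numbers are mine, purely illustrative):

```python
# Reading-list growth when every paper read points to 3 more worth reading.
# Illustrative only: assumes a constant branching factor and no repeats.
def reading_backlog(branching=3, rounds=5):
    """Total papers read after following references for `rounds` rounds."""
    total = 0
    batch = 1  # start from a single paper
    for _ in range(rounds):
        total += batch
        batch *= branching
    return total

print(reading_backlog())  # 1 + 3 + 9 + 27 + 81 = 121 papers after 5 rounds
```

In reality the growth eventually saturates: after a few thousand papers, the "new" interesting references are ones you've already read.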

He mentioned that he could start out reading about exercise physiology and wind up with a paper on black holes.  I agreed (he is my son, after all) and started wondering how many steps it would take.  The only thing which keeps me from having the same problem is that I reserve this inclination for my professional field.  Even there, I come close to satisfying it.  (Eventually -- after the first couple thousand papers I read -- the interesting papers I found from reading one paper were papers I'd already read.)

My guess is maybe 20 steps between exercise physiology and black holes.  I know that it's only 1 step between turkey vultures and sea ice.  Keep in mind, turkey vultures are not polar creatures, and do not like it to be especially cold; you don't find them closer to the Arctic than southern Canada.  But I was involved in a project that definitely did need knowledge of sea ice, and that project was then used by people studying turkey vultures.  This is part of what I call the range and unity of science.  I also know, though I never wrote it up for the blog, that it's only 1 step between trying to observe gravitational waves (LIGO) and predicting waves on the ocean.  My source: one of the LIGO people asking me for information about the ocean's waves.

It might be only two steps between exercise physiology and black holes: 1) an exercise physiology paper looking at swimming or kayaking in the ocean, and how waves affect that; 2) a paper on waves and LIGO (I'm sure some LIGO paper cites both waves and black holes at this point).

Since I've put forward two unlikely connections, each only 1 step, I'll turn it over to you all.  Can you make a connection -- in the professional literature, no fair using something like 'Guide to All Science' -- between exercise physiology and black holes?  How short a chain can you make it?  Feel free to change the targets to other things you're interested in (kumquats and functional MRI imaging of the brain?).
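
Phrased as a problem, this is shortest-path search on the citation graph.  A toy sketch; the graph below is entirely made up (these are not real citations), but breadth-first search is the standard way to find the shortest chain:

```python
from collections import deque

# Hypothetical citation links between topics -- illustrative only.
links = {
    "exercise physiology": ["ocean swimming"],
    "ocean swimming": ["ocean waves"],
    "ocean waves": ["LIGO"],
    "LIGO": ["black holes"],
    "black holes": [],
}

def chain_length(graph, start, goal):
    """Breadth-first search: fewest citation steps from start to goal."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, steps = queue.popleft()
        if node == goal:
            return steps
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, steps + 1))
    return None  # no chain found

print(chain_length(links, "exercise physiology", "black holes"))  # 4
```

With a real citation database the nodes would be papers rather than topics, and the links would run both ways (citing and cited-by), but the search is the same.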

Tuesday, July 29, 2014

Arctic Ice Guesses 2014

Have to bite the bullet here and discuss my guesses for the September 2014 Arctic sea ice extent average.  What has made this so difficult is that the guesses are so different from each other.  Now, one method I've retired.  It was simply so bad last year that there's no point in continuing it.  That is the one based on a population growth curve (of ice-free area).

That leaves, however, two different model-based guessers.  The first, which appears at the Sea Ice Prediction Network as 'Wang', is based on doing a statistical regression between what the CFSv2 (Climate Forecast System, version 2) predicts for September ice area and what is observed.  The second also uses CFSv2, but in a different way.  Namely, we know that the model is biased towards ice being too extensive (which the Wang method addresses statistically) and towards being too thick.  The Wu method is based on thinning the model's ice and then counting as ice-covered only the area where the thinned ice is thicker than a critical limit (60 cm, it turns out).  (Both Wang and Wu work with me, or vice versa, and we discuss how to work on these guesses.)

The guesses are:
June -- Wang -- 6.3 million km^2 (std. dev. 0.47)
July -- Wang -- 5.9 million km^2 (std. dev. 0.47)
July -- Wu -- 5.1 million km^2 (std. dev. 0.56)
June -- Wu -- 4.8 million km^2 (std. dev. 0.65)

The first thing to notice is that the two estimates moved towards each other from June to July: Wu rose, and the higher Wang declined.  The second is that the Wu method's standard deviation (the variability of its estimate) is double what it was last year.  Whatever is going on in the model, it is much less self-consistent than in previous years.  Much more uncertain.  This is one of the reasons for ensemble modeling (part of the Wu approach).

You can also see that the Wang estimate is the highest of all -- even higher than the Watts Up With That group's estimate.  This is true in both June and July.

So, what's up?  Well, I'm not sure.  Some of it is certainly related to sea ice thickness estimates.  Xingren (Wu) did a different approach based on thickness for June, which we didn't submit, but which landed in between the official June estimates from Wang and Wu.  With the step towards convergence from June to July between the Wu and Wang methods, I'm inclined to guess (a meta-guess) 5.5 million km^2 for September.  If that were to occur in reality, it would probably suggest something important.  What, exactly, I'm still pondering.

Monday, July 28, 2014

Yabba2 -- Construction

Katherine Monroe:

Below are the full instructions on how to build exactly what I built. There is so much that could be done to improve the design. I know it is not anywhere close to perfect. The materials I used were makeshift, whatever was lying around the house or wasn’t too expensive. But that was the point. I like spontaneity. It doesn’t have to be extremely elaborate to work and to be useful. This is for anyone who wants to do anything with it or for anyone who is just interested.
1. Vernier Flow Rate Sensor, Order Code: FLO-BTA/FLO-CBL
2. Vernier LabQuest, by Vernier Software and Technology, 13979 SW Millikan Way, Beaverton, OR 97005. 888-837-6437. (For transmitting and collecting data from the Flow Rate Sensor.)
3. 3 22” steel dowel rods
4. Compressed fiber board
5. Minwax Polyurethane Varnish
6. 24 Gauge- 100 ft. Green Floral Wire Twister
7. Small foosball
8. 2 IDEC Sensors, Magnetic Proximity Switches. Type: DPRI-019.
9. Premium Waterproof Clear Silicone Sealant (without Acetic Acid)
10. Plugable USB to RS-232 DB9 Serial Adapter (Prolific PL2303HX Rev D Chipset)
11. RS232 Breakout - DB9 Female to Terminal Block Adapter
12. Xnote stop watch, version 1.66 (downloadable at
13. Loctite Epoxy glue
14. Drill
15. Hammer
16. 2 Brass quarter inch Phillips Head screws
17. Electric hand held reciprocating saw
18. Electrical tape
19. 4” by 3/4” strip of thin steel (cut from a can)
20. Twisted Nylon string
21. 2’ long wooden slat (to be used as a handle for carrying and placing the designed device in the water.)
22. Study Site: United States Geological Survey (0164900), Northeast Branch of Anacostia River at Riverdale, MD. (Test site was just next to the USGS data collection gauge.) (38.961, -76.626)

Friday, July 25, 2014

Yabba -- Building your own stream gauge

Katherine Monroe*, the author/inventor of this stream gauge, is a graduate of Eleanor Roosevelt High School, in the same class as Elliott Rebello.  Her senior project was quite different, and you'll get to see the details in her own words.  Part 1 is today, the narrative.  Part 2 will be on Monday -- the full parts list and construction instructions.

Engineering the “Yabba Dabba Doo”

By: Katherine Monroe
June 2014
Eleanor Roosevelt High School

One year ago, as a rising senior at Eleanor Roosevelt High School in Greenbelt, MD, I was faced with the same grueling task that all students in the Science and Technology program were: RP- that is Research Practicum. This is what we had been leading up to for the past three years and now, here it was. 

RP is the year-long research project that all seniors in the Science and Technology program at Roosevelt are required to complete. By the end of the year we had to have completed a science fair backboard, a laminated poster, a PowerPoint presentation, and a five-chapter paper. We had a whole class dedicated to working on all the different aspects of the project and to learning how to analyze data quantitatively and statistically. We were told to come up with a project that was interesting to us because we would be spending the entire year working on it. Some students applied for internships with NASA, USDA, NIH, the University of Maryland, the National Zoo, Walter Reed Hospital and more. Other students applied for programs established by and within the school and other students worked separately from any structured programs. 

I chose to apply to a program started by one of our school’s AP Chemistry teachers called WISP (Watershed Integrated Study Program.) It was a program which emphasized local water quality studies. Students in WISP formed groups and measured chemical and physical properties of local waterways at a bunch of sites across the county. We measured nitrate and phosphate concentrations, dissolved oxygen levels, alkalinity, pH, turbidity, total dissolved solids, temperature and took seasonal macroinvertebrate data. We then added our values to an ongoing database which students could draw from for all sorts of studies which require long term data collection.

I applied to WISP because out of the endless ocean of things I was unsure of I was sure of at least one thing and that was my love for the environment and for being outdoors. After having been accepted to WISP I began the process of deciding what to do for my project. In the end the basis of my project came from the one other thing I was sure of which was that I enjoyed building things. So I knew I wanted to build something and I knew it should relate to the local water quality movement that WISP was promoting. I looked at what we did in WISP and thought about what we measured. One aspect of water quality that I found important to gaining a comprehensive understanding of a stream or river’s health (that we did not measure in WISP) was the speed of the water in the stream.

The water speed can provide insight into the types of organisms that can live in a stream or river, to the flow of sediment down a river, and sometimes to the oxygen levels of a river. The greater the speed of a river, the more aerated it typically is, and the higher the dissolved oxygen level. All of these can greatly affect the health of a stream or river. Stream speed can also help in understanding volume flow rate of a stream and in identifying storm water runoff patterns near and around the stream or river and in developing flood models. Overall, stream speed seemed like an important factor that we did not account for in WISP, for what I believe to be a range of reasons: the expense of the necessary equipment, the complicated nature of taking stream speed measurements at a variety of points along a stream and still getting inaccurate results due to the variability of speed along an uneven stream bed, and maybe more. 
I decided that I wanted to design and build something that would measure stream speed; something that would be cost effective and accurate, and something that would be easy for anyone who wanted to do research, like the kind we do in WISP, to build for their own purposes. The point was to encourage citizen science by going through all the steps independently and then showing people what I had done so that they could do it too. 
In the end what I came up with consisted of an open track along which a light and neutrally buoyant ball was pushed by the flowing water. On either end of the track there were magnetic sensors which timed how long it took for the ball to move from one end of the track to the other. From this the speed was computed. This is where the name of the device comes in. I decided to call it the “Yabba Dabba Doo” because it looked like something out of the Flintstones (or maybe like an old-fashioned push lawn mower.) 
Next I had to figure out if my design actually worked. In order to do that, I compared my device to an already existing speed measurement device by the company Vernier. I assumed that the Vernier data was accurate. My null hypothesis was that the average of the speeds taken with my device would be statistically equivalent to the average of the data taken with the Vernier device. Strangely enough, I wanted to FAIL to reject the null hypothesis. Statistics are weird. I collected data with each of the devices within a half an hour period of each other (assuming that the stream speed would not change in that amount of time.) Then I analyzed the data through a statistical t-test which looked for a significant difference between the two sets of speeds and their averages. 
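
Her comparison can be sketched in a few lines. The speed samples below are invented, and the real analysis presumably used standard statistics software; this hand-rolled Welch's t statistic just shows the logic (fail to reject when |t| is small):

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    """Sample variance (n - 1 denominator)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def welch_t(a, b):
    """Welch's two-sample t statistic (allows unequal variances)."""
    return (mean(a) - mean(b)) / math.sqrt(var(a) / len(a) + var(b) / len(b))

# Invented stream speeds (m/s): homemade gauge vs. the Vernier reference.
yabba   = [0.52, 0.49, 0.55, 0.51, 0.48, 0.53]
vernier = [0.50, 0.54, 0.52, 0.49, 0.51, 0.53]

t = welch_t(yabba, vernier)
# |t| far below ~2 (roughly the 5% threshold at these sample sizes), so we
# fail to reject the null hypothesis: no significant difference detected.
print(round(t, 3))
```
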
After multiple trials and readjustments to the design I got what looked like a pretty accurate result. Initially (before reaching my final design), the object moving along the track of the Yabba Dabba Doo was a metal disk attached to the metal rods of the track with metal rings. All that metal caused a lot of friction between the disk and the track which prevented the disk from reaching the speed of the water and gave me slower averages than the Vernier averages. This also yielded a significant difference in the statistics which I did not want. In order to minimize the coefficient of friction there, I changed my design to one which consisted of that lightweight, neutrally buoyant foosball (which I mentioned earlier) that was attached to the metal track with small sections of plastic drinking straw. The foosball had no tendency to float or sink in the water and caused less friction on the track. Furthermore, the coefficient of friction between the plastic straw and the metal track was much less than between the original metal rings and the metal track. After making this design change I got averages that were much closer together between the Yabba Dabba Doo and the Vernier and in a majority of my statistical t-tests there was no significant difference. In the end I had a device that seemed to be working pretty accurately and cost $350 less to build than to buy the Vernier. 
Going through that process, of trial and error and trial and error and trial and then success!!! was extremely gratifying. I got to experience the life of an engineer first hand and to learn about the plethora of unforeseen problems that can arise. 
This entire year was a great learning experience for me. I learned what a null hypothesis was and how to go about trying to reject it (or in my case, fail to reject it.) I learned all sorts of things about the materials that I used to build my device. I learned how to bear standing out in winter weather water up to my waist, wearing my mom’s baptismal waders, for the good of science! I learned about all the things that can go wrong and need to be accounted for in a field study like this one. I learned how to access all sorts of functions on Excel, PowerPoint, and Word. And I learned something about myself. I learned that engineering, and I think in particular environmental engineering, is something that I could easily be passionate about and be satisfied with in the future. And for a chronically confused and disoriented teenager about to go to college, that is reassuring.

Below are the full instructions on how to build exactly what I built. There is so much that could be done to improve the design. I know it is not anywhere close to perfect. The materials I used were makeshift, whatever was lying around the house or wasn’t too expensive. But that was the point. I like spontaneity. It doesn’t have to be extremely elaborate to work and to be useful. This is for anyone who wants to do anything with it or for anyone who is just interested. 

[Back to your host: Directions on Monday.  The * is because Ms. Monroe normally goes by a more informal version of her name and I've gone with the formal here.  Formal for publication is a rule I use myself (I'm not usually Robert), and one which I've learned is helpful for women to be taken seriously.] 

Wednesday, July 23, 2014

Data are ugly

Current news about whether there really is an increase in Antarctic sea ice cover is reinforcing my belief, shared by most people who deal with data, that data are ugly.  This work argues that the increase some trend analyses have shown has more to do with the data processing than with nature.  I encourage you to read the article itself in full; it is freely available.

From the abstract:
Although our analysis does not definitively identify whether this change introduced an error or removed one, the resulting difference in the trends suggests that a substantial error exists in either the current data set or the version that was used prior to the mid-2000s, and numerous studies that have relied on these observations should be reexamined to determine the sensitivity of their results to this change in the data set.
One of the obnoxious things about data sources is that they don't remain the same forever.  This is not so much a problem for my concerns about weather prediction, since the atmosphere forgets what you said you observed in a few days.  But for a climate trend, the entire record is important.  For the data set being discussed, the Bootstrap Algorithm (Comiso) applied to passive microwave, we immediately run into the data being obnoxious.  Since 1978, there have been several passive microwave instruments -- SMMR; SSMI on the F-8, F-11, F-13, F-14, and F-15 satellites; AMSR-E; SSMIS on F-16, F-17, and F-18; and AMSR-2.  They didn't all fly at the same time, and they don't have exactly the same methods of observation.  And none of them exactly observe 'sea ice', which leads to a universal problem which we (people who want to use these instruments to say something about sea ice) all have to deal with.
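
How much a mid-record processing change can matter for a trend is easy to illustrate with synthetic numbers (everything below is made up; the 0.2 million km^2 step is not the size of any real discontinuity):

```python
# Synthetic 'extent' record: flat at 3.0 million km^2 -- no real trend.
years = list(range(1979, 2015))
extent = [3.0] * len(years)

# Suppose a processing change shifts retrievals by +0.2 million km^2
# from 2006 onward (step size invented purely for illustration).
spliced = [e + (0.2 if y >= 2006 else 0.0) for y, e in zip(years, extent)]

def trend(xs, ys):
    """Least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

trend_true    = trend(years, extent) * 10    # per decade
trend_spliced = trend(years, spliced) * 10   # per decade
print(round(trend_true, 3), round(trend_spliced, 3))
# The step alone manufactures an apparent upward 'trend' where none exists.
```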

So, a few considerations of what's behind the scenes of this paper and the earlier Screen (2011).  The latter paper involved some of my work (read deep into the acknowledgements).  This one doesn't, but the fundamental issues are the same ...

Tuesday, June 24, 2014

Ice Science Cafe

This Thursday (June 26th) I'll be talking about ice and, better yet, answering questions about ice at the Annapolis Cafe Scientifique.  The time will be 6:30 PM.  Same location as usual -- Cafe 49 West.  Local folks are invited, and non-local are welcome to pose questions here. 
I'll also invite folks to suggest topics for me to prepare for.  My sources of information on sea ice are pretty different from my audience's, so it's hard for me to tell what people have been hearing about.

Our Sea Ice Outlook guesses this year are wildly different -- 4.8 million km^2 and 6.3 million km^2.  The former is above the median of contributions.  The latter is the highest of all, even higher than from Watts and company.  We have a third, unpublished, guess in between the two.  I need to check out a couple of things before writing it up.  Given the spread we're encountering ourselves, I think we're going to learn a lot this year.

Wednesday, May 21, 2014

A challenge and offer

The challenge is for a science teacher to incorporate Science and/or Nature into their teaching.  The offer is that I'll pay for the subscription(s) for at least the first year.  US high school teachers only (sorry, others, but I'll exercise provincialism here).  First come, first served.

The prompting here is that I've been reading some of my backlogged Science and Nature issues.  Some articles are beyond almost all K-12 students (though not some I was talking to at Eleanor Roosevelt High School's Research Practicum celebration, so even the most rarefied will be useful in some institutions).  But there are research article summaries which don't require such a high level of background.  And I think a talented enough teacher can make good use of the wealth of material in each issue of Science and Nature.

The third leg of the tripod, so to speak, is that I'll invite some discussion as to exactly how a (US) grade 9-12 teacher can make use of professional journals like Science and Nature.  I know I have a teacher or two in the readership, and look forward to their ideas.

Update 5/27/2014: I've now got a taker, @ragbag01 on Twitter.  But discussion of how to make use of the subscription is still very welcome (per anonymous1). 

Anonymous2 notes that there are free materials for the classroom on selected papers.

An article of mine on language and reading science may also be useful -- Science Jabberwocky

Tuesday, May 20, 2014

Agriculture in changing climate

If you're one of the people who thinks that food grows in grocery stores, all the talk about climate change affecting agriculture is passing you by.  You'd be wrong to think so, but most people in modern industrial countries are not involved in agriculture.  Having grown up in the corn belt I'm perhaps a little sensitized to the fact that farming is hard work.  And that farming is extremely sensitive to details of the weather.  Anything sensitive to weather is sensitive to climate.

Many foods depend on extremely specific climates.  Not just current climates, but the history of climate for thousands of years -- the soils that grow a good crop develop over that time span.  The corn belt is where it is not just because of current (well, 1950-1980) climate but because, in the thousands of years before that, the soil improved and developed to the point of being able to support such farming.  For something like corn, which is grown across a huge area, climate change can be an issue.  But someone, somewhere, will probably be able to grow corn 30 years from now.

But many items grow in relatively small areas, subject to the whims of local change.  Such specialized crops are sensitive to climate change, to the point of perhaps being eliminated.

I invite readers to check the sources linked to above.  And to contribute their own crop types that are either sensitive to climate change, those which are insensitive, and those which would even benefit from expected changes.  Please do include links to your examples.