26 October 2015

Been a while, hasn't it?

Didn't mean to disappear quite that long -- 2.5 months, it turns out.  Well, I'll be picking up my posting again.  In the interim, I've been on vacation in the Peruvian Amazon (picture below), become a manager at work, and generally been running around.

You probably think of piranha when you think of the Amazon River.   We were on a fishing expedition for piranha.  My wife, the pilot, and our guide all caught piranha -- mostly red-bellied, but a couple white-bellied.  Above is my one catch.  It is a sardine (about 10", 25 cm).  I'm amused, or puzzled, or something.  I'm happy about it.  It's a reminder that the world is more involved and weirder than you might think.  And a reminder that if there's weirdness to be found, I'll be the one to find it.

Managing, well, I'll go back to a story from college.  A friend of mine was a highly talented computer science major who went from starting his bachelor's degree to finishing his master's in 4 years.  While we were roommates, he did a group project with the other two top students in the class.  If technical skill were the only thing that mattered in doing a technical project, this group would have produced by far the best work.  Instead, it was a mediocre project.  That's when he, and I by contact, developed an appreciation for good managers.  One of their skills is to get the best out of a group of people.  So that's my aim.

23 July 2015

Data Horrors

"The great tragedy of science -- the slaying of a beautiful hypothesis by an ugly fact."  Thomas H. Huxley.

Sometimes, though, you have to pay attention to just how ugly the observation (fact) is.  And even more to how ugly a collection of observations is.  At a science fair I judged a couple of years ago, the student mentioned his methods for keeping the experiment, which had to stay untouched while running, out of reach of his young brother.  This student had a firm grasp of the ugliness of data and of trying to collect it.  I gave him high marks.

I also mentioned a story or two I knew of data collection challenges.  I'll share them and some others here, and invite you to add your own.

One family of ocean data comes from buoys floating on top of the ocean.  A lot of the ocean is far from land, therefore far from perches for birds.  Sea gulls and other birds are often grateful for the lovely perches we're putting out for them.  Unfortunately, it does not help the accuracy of your wind speed measurements to have a bird sitting on your gauge.  Birds sitting on the solar panel reduce your available energy and recharge rate, which can mean data outages while you wait for recharging.  Guano is great fertilizer, but wreaks havoc on the accuracy of your temperature, pressure, and moisture readings.

Walrus don't mind taking a rest every now and then either.  They're not normally a threat to wind speed measurement (which is at the top of the buoy).  But we also want to get wave measurements -- how high are they, how fast are they, what direction are they going.  Having a walrus or two on your buoy slows its ability to respond, and may suppress the peaks of the measured waves.

On land, your instrument enclosures (the Stevenson Screen for instance) provide a nice place for bees, wasps, small birds to nest.  Squirrels like to play with them too.  A beehive next to your thermometer does not help its accuracy.

Back at sea, I once got a call about a problem buoy.  It was reporting extremely high temperatures near noon because the paint had been stripped during a storm, and the now-bare metal was reflecting sunlight onto the marine thermometer.

That should get you started for remembering your own horror stories about data collection.

I recently saw someone on the web taking the line that if data weren't perfect, you should throw out everything from that instrument or site.  Well, no.  If you did that, you'd never have any data to work with.  For my examples, you mostly just ignore the data during the period you've got a walrus infestation.  But there are other kinds of things which affect your observing, and which you might be able to compensate for.
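A minimal sketch of that "flag, don't discard" idea.  The numbers and the contamination window here are entirely made up for illustration:

```python
# Toy quality-control sketch: hourly temperature readings (made-up numbers)
# with a known bad stretch -- say a walrus hauled out on the buoy from
# hour 4 through hour 7.  Flag that window instead of discarding everything.

readings = [12.1, 12.3, 12.2, 12.4, 19.8, 20.1, 19.9, 20.3, 12.5, 12.4]
bad_hours = set(range(4, 8))  # hours known to be contaminated

good = [t for hour, t in enumerate(readings) if hour not in bad_hours]

mean_all = sum(readings) / len(readings)
mean_good = sum(good) / len(good)
print(f"mean using everything:   {mean_all:.2f}")   # biased by the walrus
print(f"mean after flagging:     {mean_good:.2f}")  # closer to reality
```

Throwing out the whole record would leave you with nothing; flagging the four bad hours keeps six perfectly usable readings.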

08 June 2015

Spectating on Science: Length of the Game

Science doesn't move as fast as basketball, so spectators need to adjust their expectations.  The 'game' plays out over a period of years.  The first play of the game is that someone publishes their work in the peer-reviewed professional literature.  But that's something like the first pass in football/basketball/hockey -- it might _eventually_ turn into a score.  But it isn't the score itself.

The short-hand for this is 'single study syndrome'.  All sorts of things show up in the media, or scientific literature, as being interesting and perhaps revolutionary.  But almost no revolutions follow from the very first study.  Few of the potentially interesting ideas, from the first publication, really hold up for any length of time.  Something worked out to be interesting _once_.  But, chances are good it won't hold up in the long run -- the previous consensus or state of knowledge is more likely correct than the new idea with just a single supporting piece of research.

For the spectator of science, which also includes me most of the time, we can, and have to, sit back a little and wait for the confirming evidence or studies.  One active area of discussion in science now is whether the recent US weather extremes (the Eastern US has been far to the cold end of the historical distribution in the winters of 2014 and 2015, while the Western US has been far to the hot end, including setting several all-time records) are due to the reduced Arctic sea ice pack.

02 June 2015

How to build a climate model?

How is it that we go about building climate models?  One thing is that we would like to build our model to represent everything that we know happens.  If we could actually do so -- mainly meaning if the computers were fast enough -- life would be simple.  As usual, life is not simple.

I'll take one feature as a poster child.  We know the laws of motion pretty well.  I could write them down pretty easily and, with only a moderate amount more effort, write a computer program to solve them.  These are the Navier-Stokes equations.  On one hand, they're surprisingly complex (from them comes dynamical chaos), but on the other, they're no problem -- we know how to write the computer programs to do conservation of momentum.  Ok, entire books have been written on even a single portion of the problem.  Still, the books have already been written.
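For reference, one standard form of the momentum equation for an incompressible fluid -- u is velocity, p pressure, rho density, nu kinematic viscosity, g gravity:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2} \mathbf{u} + \mathbf{g},
\qquad \nabla \cdot \mathbf{u} = 0
```

The second relation is conservation of mass for an incompressible fluid; the nonlinear advection term on the left is where the dynamical chaos comes from.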

The problem is, if you want to run your climate model with a representation fine enough to capture everything we know is going on, you need to have your grid points only 1 millimeter apart.  That's fine in principle, but it means something like 10^30 times as much computing power as the world's most powerful computer today. (A million trillion trillion times as much.)
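A back-of-envelope version of that scaling.  This is my own arithmetic, not the post's exact calculation: it assumes a current model grid spacing of roughly 10 km, cost proportional to the number of grid points in three dimensions, and a time step that shrinks in proportion to the grid spacing (a CFL-type constraint).

```python
# Rough cost scaling for refining a climate model grid to 1 mm.
# Assumptions (mine, for illustration): current spacing ~10 km; cost is
# proportional to grid points (3 dimensions) times time steps, and the
# time step shrinks in proportion to the spacing.

current_spacing_mm = 10_000 * 1_000   # ~10 km, expressed in millimeters
target_spacing_mm = 1                 # 1 mm

ratio = current_spacing_mm // target_spacing_mm   # 10**7 per dimension
cost_factor = ratio**4   # ratio**3 for the three space dimensions, one more for time

print(f"about 10^{len(str(cost_factor)) - 1} times the computing")  # 10^28
```

Starting from a 100 km grid instead gives 10^32, so the post's figure of 10^30 sits squarely in that range; the exact exponent depends on what you take as the current grid.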

What do we do in the mean time?

01 June 2015

What is a model?

In the blogospheric talk about climate change, 'model' gets mentioned a lot.  Sometimes it's merely descriptive, and often it is pejorative.  But it is almost never really defined.  Like or loathe them, nobody says just what models are.  Except for me, here and now.  (And probably a number of other people at other times and places -- but still, few and far between. :-)

'Obviously' a model is a particularly attractive human.  Right?  I've actually received email at my workplace (a 'modelling branch') from people who were trying to advance the careers of their models, in this sense of model.  We don't deal with that kind of model.

'Obviously' a model is to take the original (the Apollo Saturn V rocket that took people to the moon, for example) and duplicate everything about it, but at 1/32 the original size.  Right?  Perhaps.  I know people who like this sort of thing.  But again, that's not what we mean either if we are discussing climate (or atmosphere, ocean, sea ice, land, glacier, ...) models.

For my purposes, a model is an idealized, and/or simplified, representation of the real world.  When we are interested in something as big and complex as climate, or even just the Arctic sea ice pack, we really can't cope with the whole thing in all of its glorious complexity.  We have to simplify the reality somehow.  That simplification is the model.

In this sense of 'model', models are everywhere.  We use a model for human behavior when we decide what somebody else means when they raise their hand in a certain way.  (Is it an open hand, or a fist?  Did they just say 'hello', or 'I'm going to kill you'?  And so on.)  Weather has also been modelled by using 'dishpans' -- Raymond Hide and David Fultz being two of the best examples of people taking this approach*.

22 May 2015

Bad philosophy 1

Different people are good at different things, which is no real surprise; but one of the common situations where some people suddenly become blind to this is scientists regarding philosophy.  Plus, well, most non-philosophers regarding philosophy.  I've had the good fortune to know a couple of serious philosophers of science, enough to appreciate that they've developed some understandings more profoundly than I have.  And, I'm immodest enough to extend that to 'more profoundly than most non-philosophers'.

One path of bad philosophy, the one which causes this post, follows from mistakes on the matter of certainty.  Or, naming it by way of the error it leads to, intellectual nihilism.  Certainty is a problematic concept for science, and science versus philosophy.  Errors come from both sides, so beware of throwing rocks.  From my philosophical vantage point, science is intrinsically uncertain.  My scientific excuse for that philosophical assumption is to consider the Uncertainty Principle.  It's enough for here to understand that you cannot, simultaneously, observe everything about a complex system (like an electron, an atom, or the climate system) exactly.  You can do pretty well, but there's always some uncertainty in the observations.

A different line of philosophy regards how, and how well, you can consider yourself to know something (epistemology).  One view of this derives from Karl Popper, under the label 'falsification'.  For here, it's enough to note that one can really only be confident about one's knowledge to the extent to which it has been tested.  (Do, of course, read further!)  Since you can only be confident about your knowledge to the degree to which you've tested the idea/hypothesis/theory/..., and any test of an idea (etc.) is intrinsically uncertain (uncertainty principle again), you can never be entirely certain that you have the right answer, idea, hypothesis, theory.  So some humility is in order -- for everybody.

Enter the bad philosophy.

18 May 2015

Playing With Numbers: Triangles and Squares

You can play with numbers; which will be a surprise to some and extremely obvious to others.  I'm writing for those who will be surprised.  Consider the picture of dots here:
* *
 *

We've got a triangle, a small one.  It has 3 dots.  Now put another row of dots, keeping it a triangle:
* * *
 * *
  *

There are 3 dots in the first triangle, 6 in the second.  Next triangle will have 10 (as we add in a row of 4). 

For gaming: What is the 20th triangle number?  Is there a way you can look at a number and tell whether it is triangular?

Or you can play with squares:

*

* *
* *

* * *
* * *
* * *

So the first three square numbers are 1, 4, 9.  Next, the 4th square number, will be 16.  These are actually simpler to game than the triangular numbers.  What's the 20th square number?

And of course we can make more interesting figures, like hexagons:
 * *
* * *
 * *

So the first hexagonal number is 7.  What's the second?  Can you predict the 3rd, the 20th?

On the one hand, we're just playing some games here.  On the other, there are also serious mathematical papers on hexagonal numbers, and triangular, octagonal, and so forth.
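The games above can be sketched in a few lines of code.  Indexing here is the standard one, where the 1st triangular number is 1 (the pictures above start at 3 dots, so their counting is shifted by one); the 7-dot hexagon with a center dot matches what the literature calls 'centered hexagonal' numbers.

```python
import math

def triangular(n):
    """n-th triangular number: 1, 3, 6, 10, ..."""
    return n * (n + 1) // 2

def square(n):
    """n-th square number: 1, 4, 9, 16, ..."""
    return n * n

def centered_hexagonal(n):
    """Hexagons with a center dot, as in the picture: 1, 7, 19, 37, ..."""
    return 3 * n * (n - 1) + 1

def is_triangular(x):
    """x is triangular exactly when 8x + 1 is a perfect square."""
    root = math.isqrt(8 * x + 1)
    return root * root == 8 * x + 1

print(triangular(20))   # the 20th triangle number: 210
print(square(20))       # the 20th square number: 400
print(centered_hexagonal(3))  # the next hexagon after the 7-dot one: 19
```

The `is_triangular` test answers the "can you look at a number and tell?" question: 10 passes (8*10 + 1 = 81 = 9^2), 11 fails.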

08 April 2015

Autism Awareness month 2015

A reminder that April is Autism Awareness month.  I can't say very much first hand, but won't let that stop me from writing.  (As usual.)

Couple notes.  One is, though I'm not autistic, I'm also not dead center 'normal' (whatever that is).  (What, you've noticed?)  I deviate from 'normal' in some directions that point toward autism.  Not enough to be on the autism spectrum myself, but enough that my sister found me useful as a guidepost toward her autistic students.  Partly because of this, I am irritated by people who say 'everybody in science is autistic'.

Another note is, I know a number of autistic people, at various places along the spectrum.  That's the other reason I'm irritated by such blanket generalizations.  I wouldn't be surprised to find that some working scientists/engineers/... are indeed autistic.  But it's neither necessary nor sufficient, nor does it really honestly connect to either the scientists (who may or may not be autistic) or autistic people (who may or may not be scientists).

The thing to do is, er, be aware of autism.  See also my sister's (same one) blog.  Autistic people are people.  Start, and finish.  As with any people, you get farther by understanding them as themselves rather than trying to fit them into preconceptions you may have.