Earlier I talked a bit about the math and science classes you would probably take if you went to study climate. Somehow, perhaps because they're so ubiquitous, I forgot to mention anything about the computer end of things. On the other hand, they are ubiquitous, so I should catch up a bit and mention the computer hardware, operating systems, software packages, and programming languages you might run into, or that I have, in working on climate modelling and data analysis. A different point first, though: the computers are just tools. Being good with computers (knowing many languages, whatever) is like being good with a hammer. Being good with hammers doesn't make you a carpenter, nor does being good with computers make you a scientist.
The hardware is pretty much anything you can find. For small models or data sets, a single-processor desktop is still used. Small being defined as 'what you can do on a single-processor desktop system'. Given that they're about a million times more powerful than the desktops of 30 years ago, this is actually not a minor set. In the intermediate range are multiprocessor desktops or workstations (or at least what we used to call workstations; the distinction seems far less common now), up to a few dozen home-style processors. I see that you can now get at home, if not cheaply, 8-processor systems with 8 GB of memory. The first computer I worked on had 8 KB of memory. These mid-range systems can do substantial work, particularly if used well. At the high end, you're looking at hundreds to thousands of processors, or vector processors. The latter were the domain of Cray from the mid 70s to the 90s; NEC started producing them as well. They had (from the later 80s) multiple processors, but the number was fairly small. The power of such systems was that each processor could do the same thing many times at once (16-32 in the early 90s). Since our models tend to do just that, this can be an effective design.
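To make that concrete, here is a minimal Fortran sketch of the pattern that suits vector hardware: the same arithmetic applied independently at every grid point. The array names, sizes, and numbers are invented for the illustration, not taken from any particular model.

```fortran
! Illustrative only: hypothetical names, dimensions, and values.
program vector_sketch
  implicit none
  integer, parameter :: nlon = 360, nlat = 180
  real, parameter    :: dt = 900.0      ! time step in seconds (assumed)
  real :: temp(nlon, nlat), heating(nlon, nlat)
  integer :: i, j

  temp    = 273.0        ! start everything at a uniform temperature
  heating = 1.0e-5       ! uniform heating rate, degrees per second

  ! Every pass through the inner loop does the same arithmetic on
  ! different data: no branches, no dependence between grid points.
  ! That is exactly the kind of work a vector processor pipelines well.
  do j = 1, nlat
    do i = 1, nlon
      temp(i, j) = temp(i, j) + dt * heating(i, j)
    end do
  end do

  print *, 'mean temperature:', sum(temp) / real(nlon * nlat)
end program vector_sketch
```

In Fortran 90/95 the nested loop could also be written as a single array statement, temp = temp + dt * heating, which makes the 'same operation everywhere' structure even more explicit.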
Operating systems have often been a matter of whatever the vendor shipped. In the 70s and 80s, this was often home-grown at the vendor. These days, for larger systems, it's almost always some flavor of Unix or a relative. For smaller systems, it's hardware-based (Mac, which these days is also Unix-based), Unix-related (Linux, BSD), Windows, or other, seemingly more regional, systems (Acorn?).
Programming languages are often a matter that people get ... let's say 'testy' ... about. I don't really understand it myself. For the models themselves, the main language is Fortran in whatever flavor is widespread, these days 90/95. But others get used as well or instead, including C, C++, Java, and Python. For the data processing, it is usually one of these others that is most used (mostly C, with the others increasing, at least from where I sit). The reason I don't see the problem is that I am polylingual myself (more below), and learning a new programming language just isn't a big deal and doesn't seem like it ought to be. The one significant hurdle is going between procedural languages like Fortran/C/... and object-oriented languages like C++/Java/... But if you're staying on one side of that hurdle, going from one language to another is a fairly minor matter if you learned to be rigorous in the first place.
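As a small illustration of why moving among procedural languages is the minor part, here is the sort of throwaway data-processing job that comes up constantly. The file name and one-value-per-line layout are invented for the example; the structure (declare, loop, accumulate) carries over nearly line for line into C or any other procedural language, with only the syntax changing.

```fortran
! Illustrative sketch: 'obs.txt' and its one-value-per-line layout are
! assumptions for the example, not any particular data set.
program mean_of_series
  implicit none
  integer :: n, ios
  real    :: value, total

  n = 0
  total = 0.0
  open(unit=10, file='obs.txt', status='old')
  do
    read(10, *, iostat=ios) value
    if (ios /= 0) exit        ! end of file (or unreadable record)
    total = total + value
    n = n + 1
  end do
  close(10)

  if (n > 0) print *, 'mean of', n, 'values:', total / real(n)
end program mean_of_series
```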
You'll also likely wind up using one or another graphics or toolbox sort of package. Some are: Matlab, IDL, R, GrADS, Macsyma, Maple. I'm sure there are a raft more; these are just ones I've heard of recently. As they're (mostly) commercial packages, which one gets used varies widely by which center you're at.
So a fairly typical set to know is something like:
- Fortran and at least one other language (C/C++/Java/Python/...)
- A Unix-related operating system plus a home system (Mac, Windows, ...)
- A graphics package
And, for all of them, be ready and able to learn a new one with fair speed. As you'll see from my list of systems, languages, and packages (all of which are incomplete), knowing any one of them will only suffice for a while and often not a long while.
My own hardware list (as I recall it over the decades):
Home-type computers:
Wang 8k, Apple II, Commodore 64, Mac Plus (wrote my PhD on one), Mac (IIx, IIfx, SE/30, PowerMac 120, PowerMac G5, Mac Pro), IBM PC (when it really was IBM), PC-AT, 386-type, 486-type, Pentium I, II, III, IV.
Workstations:
DEC PDP-8, PDP-11, VAX 11/750, VAX 11/780
HP (... never seemed to name theirs, but a 68030/68881, running HP-UX 5, and then, later, an HP-UX 10 system)
SGI Iris, Indigo, Origin,
Sun Sparc 1, 10, and a couple of Solaris systems
Big systems:
CDC 180, 195
IBM (old big-iron systems)
Cray 1, 2, X-MP, Y-MP, C-90, J-916
IBM RS/6000 (PowerPC-based parallel systems, PowerPC 2-6 if I remember correctly)
Operating systems:
NUCC (Northwestern University CDC system, SNOBOL-based)
NOS/VE (CDC system of the early 1980s)
COS (Cray operating system)
VAX/VMS
IBM systems: MVS, VM/CMS, ... (?)
*nix flavors: HP-UX, PDP unix, Solaris, Linux (slackware, redhat), UNICOS (Cray unix), AIX (IBM Unix), ... no doubt several more
CP/M (not really an operating system, but, for lack of a better word ...)
DOS 1.0, 3.0, 5.0, 6.0; Windows 3.1, 95, XP; DESQview/X
MacOS 1-9, X
Programming Languages:
Fortran (4, 66, 5, 77, 90/95; Ratfor, WATFOR, WATFIV)
C
C++
Pascal, Basic, Java
Logo, Algol, APL
Forth, Lisp
Not languages, but:
VAX/VMS assembler, 68030 assembler
And, again, I wouldn't call them languages, but I use them: Perl, JavaScript
... and yes, I did use punched cards. Wrestled a pterodactyl so I could use its beak to punch out the holes!