Saturday, September 25, 2010

Anthony Doerr

Read the whole thing. It starts:
During my sophomore year, 1992, 1,500 scientists, including more than half the living Nobel laureates, admonished in their Warning to Humanity: “A great change in our stewardship of the earth and the life on it is required if vast human misery is to be avoided and our global home on this planet is not to be irretrievably mutilated.”

So what have we done? Not much. From 1992 to 2007, global CO2 emissions from burning fossil fuels rose 38 percent. Emissions in 2008 rose a full 2 percent despite a global economic slump. Honeybees are dying by the billions, amphibians by the millions, and shallow Caribbean reefs are mostly dead already. Our soil is disappearing faster than ever before, half of all mammals are in decline, and a recent climate change model predicts that the Arctic could have ice-free summers by 2013. Unchecked, carbon emissions from China alone will probably match the current global level by 2030.

“The god thou servest,” Marlowe wrote in Dr. Faustus, almost four hundred years before the invention of internet shopping, “is thine own appetite.” Was he wrong? How significantly have you reduced your own emissions since you first heard the phrase “climate change”? By a tenth? A quarter? A half? That’s better than I’m doing. The shirt I’m wearing was shipped here from Thailand. The Twinkie I just ate had 37 ingredients in it. I biked to work through 91-degree heat this morning but back at my house the air conditioner is grinding away, keeping all three bedrooms a pleasant 74 degrees.

My computer is on; my desk lamp is glowing. The vent on the wall is blowing a steady, soothing stream of cool air onto my shoes.
h/t Andrew Sullivan. Anthony Doerr, whom I had not heard of until today, lives in Idaho, writes the "On Science" column for the Boston Globe, and is a 2010 Fellow of the John Simon Guggenheim Memorial Foundation.

Friday, September 24, 2010

Exploring NIO

The PyNIO install from binary went more or less without incident; I had to dig up a conformant numpy (1.5 is out, but the binary required 1.4).

I am deferring a PyNGL install until I figure out what PyNIO can do.

So now I have py 2.5.1, numpy 1.4.1, Nio 1.4.0 and a doc.
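(A quick check from the interpreter, roughly like this, confirms what actually got picked up; the Nio install path is elided.)

>>> import sys, numpy, Nio
>>> sys.version.split()[0], numpy.__version__
('2.5.1', '1.4.1')
>>> Nio.__file__
'...'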

The main doc is here

Not hard for me to dig up a netCDF file from the CAM distribution; let's take a 2-D dataset from the land surface model. No idea what is in it.


-rwxr-x--- 1 tobis G-25522 7671124 Oct 22 2009 clms_64x128_USGS_c030605.nc


Seems to work! The data object has some attributes set by the data file.


>>> import numpy as np
>>> import Nio
>>> data = Nio.open_file("nctest.nc")
>>> data

>>> dir(data)
['Conventions', 'Glacier_raw_data_file_name', 'History_Log', 'Host', 'Inland_water_raw_data_file_name', 'Input_navy_oro_dataset', 'Lai_raw_data_file_name', 'Logname', 'Revision_Id', 'Run_mode', 'Soil_color_raw_data_file_name', 'Soil_texture_raw_data_file_name', 'Source', 'Urban_raw_data_file_name', 'Vegetation_type_raw_data_filename', 'Version', '__class__', '__delattr__', '__dict__', '__doc__', '__getattribute__', '__hash__', '__init__', '__module__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__str__', '__weakref__', 'close', 'create_dimension', 'create_variable', 'set_option']
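Those capitalized names in the dir() listing are the file's global attributes, so they read back directly as Python attributes (values omitted here):

>>> data.Conventions
>>> data.Source
>>> data.History_Log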

but how to get at the actual contents?

Ah; the dir() listing is incomplete. Apparently the extension module isn't fully Pythonic. This can be patched.

So we can get at dv = data.variables, a dictionary; dv.keys() yields


['PCT_CLAY', 'LATIXY', 'LANDMASK', 'LANDFRAC_PFT', 'PCT_LAKE', 'LANDFRAC', 'NUMLON', 'LONGXY', 'MONTHLY_HEIGHT_BOT', 'PCT_WETLAND', 'MONTHLY_SAI', 'PCT_URBAN', 'PCT_SAND', 'MONTHLY_HEIGHT_TOP', 'PCT_PFT', 'MONTHLY_LAI', 'SOIL_COLOR', 'PCT_GLACIER', 'PFT']


Now, just printing data (i.e., its __str__) tells us


dimensions:
lsmlon = 128
lsmlat = 64
...
variables:
...
float PCT_LAKE [ lsmlat, lsmlon ]


so dv['PCT_LAKE'] ought to be a 64 x 128 (lsmlat by lsmlon) array

but dv['PCT_LAKE'][2000,0] works. Apparently an overflow is treated as a [-1] index and an underflow as a [0] index. Not very reassuring.
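For comparison, the way I'd expect to use this, assuming NioVariable supports the usual netCDF-module-style slicing, is to pull the whole field into a numpy array:

>>> lake = data.variables['PCT_LAKE']
>>> arr = lake[:]          # whole field as a plain numpy array
>>> arr.shape
(64, 128)

Out-of-range indexing on the numpy array then raises IndexError instead of silently clamping, which is the behavior I'd rather have.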

Exasperatingly, though PyNIO is available for py2.5 and numpy 1.4, PyNGL is not. Must get python 2.6... boring...

Tuesday, September 21, 2010

Install confusion

Trying to install PyNGL on a Red Hat machine without sudo privileges.

1) Py2.7 from source. Builds fine. Runs fine.

2) numpy from source, using the aforesaid Python. The ATLAS/BLAS instructions aren't too clear; proceed blindly ahead. Build works fine.

3) import numpy or import numarray fails:

ImportError: Error importing numpy: you should not try to import numpy from
its source directory; please exit the numpy source tree, and relaunch
your python intepreter from there.

Very little info on Google about this. Tried changing the current working directory in various ways. Failure all the same.

4) Looks like this code:


if __NUMPY_SETUP__:
    import sys as _sys
    print >> _sys.stderr, 'Running from numpy source directory.'
    del _sys
else:
    try:
        from numpy.__config__ import show as show_config
    except ImportError, e:
        msg = """Error importing numpy: you should not try to import numpy from
its source directory; please exit the numpy source tree, and relaunch
your python intepreter from there."""
        raise ImportError(msg)
    from version import version as __version__


is getting invoked. Try the setup command, but it tries to install into /usr/local/lib/python2.7.
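A tiny diagnostic (sketched here in the Python 2 idiom of the day) would have made this trap obvious: print the working directory and where numpy actually loads from.

import os, sys

print >> sys.stderr, "cwd:", os.getcwd()
try:
    import numpy
except ImportError, e:
    # the "don't import numpy from its source tree" trap
    print >> sys.stderr, "numpy import failed:", e
else:
    print >> sys.stderr, "numpy loaded from:", numpy.__file__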

5) How to tell setup where to put the numpy?

setup.py install --help

helps, yielding


Options for 'install' command:
--prefix installation prefix
--exec-prefix (Unix only) prefix for platform-specific files
--home (Unix only) home directory to install under
--install-base base installation directory (instead of --prefix or --
home)
--install-platbase base installation directory for platform-specific files
(instead of --exec-prefix or --home)
--root install everything relative to this alternate root
directory
--install-purelib installation directory for pure Python module
distributions
--install-platlib installation directory for non-pure module distributions
--install-lib installation directory for all module distributions
(overrides --install-purelib and --install-platlib)
--install-headers installation directory for C/C++ headers
--install-scripts installation directory for Python scripts
--install-data installation directory for data files


It must be one of those!

6) Used "--prefix". setup.py runs without complaint.

Haha! New error. "{PATH}/multiarray.so: cannot open shared object file: No such file"

but the file exists! Now what?

7) Wait - no. I used the system python, not the self-installed 2.7. It works! numpy is imported.
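Moral: before blaming the build, confirm which interpreter is running and whether the --prefix site-packages directory (whatever path that was; elided here) is on its path. A minimal check:

import sys

print sys.executable      # system python, or the self-built 2.7?
print sys.version.split()[0]
print [p for p in sys.path if "site-packages" in p]

import numpy
print numpy.__file__      # should sit under <prefix>/lib/python2.7/site-packages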

8) Now get PyNGL? (ncl is already working on this machine)

NO! Looks like I have to do the numpy thing again, because PyNGL recommends against the g77 compiler and requires gfortran. So my numpy, though working, is wrong.

9) While I'm at it: no BLAS/ATLAS, etc.?
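Partial answer to 9): numpy will report what its build linked against, which is the quickest way to see whether any BLAS/ATLAS got picked up at all.

import numpy

# prints the blas/lapack/atlas sections this numpy build was configured with, if any
numpy.show_config()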