
Showing posts from September, 2016

High Performance Computing: A library of utilities

A brief look into the advanced software modules available on a typical HPC cluster.
I will gradually introduce what I have been using over time to advance my research.

--------------------------- /opt/modules/modulefiles ---------------------------
   abaqus/6.12-3
   abinit/7.6.4_openmpi-1.8.1_intel-14.0.2.144
   ampl/20100429
   anaconda/2.0.1-py26
   anaconda/2.0.1-py27                                 (D)
   anaconda/2.0.1-py33
   anaconda/2.0.1-py34
   ansys/14.0
   ansys/14.5
   ansys/15.0
   ansys/16.2                                          (D)
   ansysem/16.2
   ase/3.6.0.2515
   ase/3.8.1.3440
   ase/3.9.1.4567                                      (D)
   atk/12.8.2
   atk/13.8.0                                          (D)
   atk/2014.1
   atlas/3.11.30_gcc-4.7.2                             (D)
   atlas/3.11.34_gcc-4.7.2
   autodesk/2013
   autodesk/2015                                       (D)
   bbftp/3.2.0
   bioinfo
   boost/1.49.0
   boost/1.53.0_intel-13.1.1.163
   boo…
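
To use any of these packages, the standard Environment Modules / Lmod commands apply; a minimal sketch with one version from the list above (swap in whichever module you actually need):

   module avail anaconda               # list the available anaconda versions
   module load anaconda/2.0.1-py27     # load a specific version; "(D)" marks the default
   module list                         # show the currently loaded modules
   module unload anaconda/2.0.1-py27   # unload it when finished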

IDL programming: How to design a simple program

For scientists and engineers, our scripts/programs are usually very straightforward and simple. Even for those of us who are modelers, our models may be complex, but the source code is still manageable.
However, re-usability of these simple programs is still necessary, because a lot of our work must be done over and over again. For example, we need to plot/map data in various scenarios, and we want the plot/map program to work for various types of inputs.

Specifically, for IDL programming, there are a few common blocks that should be considered in a program (a minimal skeleton covering them follows the list below).

1. Compiler option: usually just for compilation setup; you can also specify it in the start-up file.
2. Error control: handle some of the most important error information.
3. Global variables: define variables used throughout the program, such as the extension of a file.
4. Keyword check.
5. Local variables: project-specific variables, such as the data range.
6. Workspace: the workspace usually refers to the directory where all operations will …
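
A minimal sketch of such a skeleton, assuming a hypothetical routine named plot_data; the comments mark where each of the blocks above goes:

   pro plot_data, file_in, range=range, workspace=workspace
     ; compiler option: enforce modern IDL syntax and 32-bit default integers
     compile_opt idl2

     ; error control: catch any error, report it, and return cleanly
     catch, error_status
     if error_status ne 0 then begin
       catch, /cancel
       print, 'Error: ', !error_state.msg
       return
     endif

     ; global variable: e.g. the file extension used throughout the program
     ext = '.dat'

     ; keyword check: supply defaults when keywords are not passed in
     if n_elements(workspace) eq 0 then cd, current=workspace
     if n_elements(range) eq 0 then range = [0.0d, 1.0d]

     ; local variable: project-specific settings such as the data range
     missing_value = -9999.0

     ; workspace: the directory under which all file operations happen
     cd, workspace

     ; ... plotting/mapping work goes here ...
   end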

Spatial datasets operations: When the sky is failing

Not all rasters can be re-projected to another projection; at least, some tools don't allow us to.

For example, the AMSR-E/Aqua L3 Global Snow Water Equivalent EASE-Grids product cannot be directly re-projected to UTM-6N within ENVI 5.1.
Please refer to this page for the projection information: https://nsidc.org/data/docs/daac/ae_swe_ease-grids.gd.html

Fortunately, ArcMap does a better job in this scenario.
You can use the Project Raster tool just like any other operation.

In order to re-project a lot of data using ArcObjects, I have updated an ArcTool, which is basically a batch-mode data operation toolkit. For this task, the screenshot is as follows:



Below is the result:




Beneath the hood is the GP tool; a sample snippet is as follows:

///================================================
public static object ProjectRaster(string sFileIn, string sFileOut, string sNewProject, double dCellSize)
{
    ESRI.ArcGIS.Geoprocessor.Geoprocessor gp = new ESRI.ArcGIS.Geoprocessor.Geoprocessor();
    …

Spatial datasets operations: From MODIS to model ready binary workflow

I have written a couple of articles regarding the potential issues during raster operations. Here I have put down some typical workflows for these operations.
For a study area smaller than one MODIS granule, follow these steps:

The common method that will work for you:
1. Convert one HDF file to GeoTIFF using ArcMap; save this file as A.tif.
2. Convert all HDF files to GeoTIFF using IDL and A.tif; call routine "hdf2tiff". In this step, you should set all unwanted pixels to the missing value, such as -9999 (see the sketch after this excerpt).
3. Project one GeoTIFF, A.tif, to B.dat using ArcMap.
4. Project all GeoTIFFs to ENVI files using B.dat; call routine "project48".
5. Define the missing value for all the ENVI files from step 4; call routine "define_envi_missing_value".
6. Prepare the shapefile mask for the study area, C.shp, using ArcMap; C should have the same projection as the target projection.
7. Extract all ENVI files using the C.shp shapefile; call routine "extract50".
(Updated January 15, 2017.)
Since in s…
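
A minimal IDL sketch of the missing-value assignment in step 2, assuming the band has already been read into an array named data and that the valid range of the product is known (the bounds below are placeholders):

   ; set all unwanted pixels to the missing value, e.g. -9999
   missing   = -9999.0
   valid_min = 0.0      ; assumed valid range; adjust for the actual product
   valid_max = 100.0

   bad = where(data lt valid_min or data gt valid_max, count)
   if count gt 0 then data[bad] = missing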

Spatial datasets operations: From MODIS HDF to Binary file

MODIS products are widely used in Earth science; here I present some typical demos and tryouts for dealing with MODIS datasets.
All MODIS product datasets are distributed in HDF format. To get familiar with HDF, you can go to https://en.wikipedia.org/wiki/Hierarchical_Data_Format or https://www.hdfgroup.org/HDF5/

MODIS land surface products usually use a different map projection than most of our projects. Here is some information on this projection: http://modis-land.gsfc.nasa.gov/MODLAND_grid.html Therefore, a re-projection is needed to convert the HDF datasets to our projects' projection.
There are quite a few tools available to operate on HDF files, but most of them are not efficient enough: you have to repeat the same set of operations by hand. Therefore, an IDL- or ArcObjects-based script/program is desirable.
The first step is to extract the datasets from the HDF file, and we also need to define the projection to be exactly the same as the HDF's. Below is a very simple way to do so. You can use ArcMap to open one sin…
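
As a complement to the ArcMap route, here is a minimal IDL sketch for pulling one science dataset out of a MODIS HDF4 granule; the file name and dataset name below are hypothetical placeholders for whichever product you use:

   ; read one science dataset (SDS) from a MODIS HDF4 granule
   file_in  = 'MOD10A1.A2016001.h24v05.005.hdf'   ; placeholder file name
   sds_name = 'NDSI_Snow_Cover'                   ; placeholder dataset name

   sd_id  = hdf_sd_start(file_in, /read)          ; open the HDF4 SD interface
   index  = hdf_sd_nametoindex(sd_id, sds_name)   ; locate the SDS by name
   sds_id = hdf_sd_select(sd_id, index)           ; get a handle to the SDS
   hdf_sd_getdata, sds_id, data                   ; read the full array
   hdf_sd_endaccess, sds_id                       ; release the SDS handle
   hdf_sd_end, sd_id                              ; close the file

   help, data                                     ; confirm dimensions and type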