ROMS UNSW2008

Installing and Running ROMS for First Time Users

A tutorial for new ROMS users will be held at the UNSW Computer Labs on Monday 30 March 2009, immediately prior to the ROMS Sydney 2009 User Workshop at the Sydney Institute of Marine Sciences, 31 March to 2 April 2009.

Note: This tutorial is intended for complete newcomers to ROMS. It assumes basic knowledge of working in a UNIX environment, and that the essential components required to compile and execute ROMS are already installed on the host computer network. This wiki page borrows heavily from David Robertson's excellent Installing ROMS under Cygwin tutorial, where you will find more information about setting up the required computing environment (compilers, libraries, etc.) for ROMS.

In this tutorial, we cover how to download the code, configure it for an application, and run the model. Error messages that arise during the configuration process will be explained so that these can better be debugged when users return to their home institutions and try to work through this process again.

A follow-on tutorial discussing sediment in ROMS will be presented on Friday 03 April.

Note: An important resource you should use as you get started is the Frequently Asked Questions entry in WikiROMS.


Download ROMS

The disk space available on the UNSW Computer Lab machines is quite limited, so for the purposes of this tutorial we have downloaded the ROMS source code to /srv/ckpt/roms/shared/src on host matht001. Instructions below will explain how to point the build.bash script that compiles ROMS to this directory.

To download the code to your own machine, these are the steps you would follow:

  • You must have already registered on the ROMS portal and obtained your ROMS username/password as indicated in the Register.
  • Create a src folder where you will keep the ROMS source code. You can place this wherever you wish in your directory tree (here we assume under your home directory "~") and name it whatever you like.
cd ~
mkdir src
  • Check out the ROMS source code replacing bruce with the ROMS user name you registered with.
svn checkout --username bruce https://www.myroms.org/svn/src/trunk src
Note the target directory src at the end of the command. If your code ends up in the wrong place, you may have omitted this.

You will see many lines stream by indicating the files that are being added to your src directory. When it finishes, you can type ls src to see the contents of the directory.

To see the contents of the directory where the code is downloaded for this tutorial, type this:

cd /srv/ckpt/roms/shared
ls src

Customize the Build Script

The ROMS source code comes with a build script in the ROMS/Bin directory. Examples written with bash (build.bash) and csh (build.sh) are provided. The UNSW Computer Lab machines are configured to use bash as the default login shell, so we will work with build.bash. A full description of the build script can be found here.

  • In your home directory (you can use some other directory to organize your ROMS projects if you wish) create a new folder named Projects and change into it.
cd ~
mkdir Projects
cd Projects
  • Create a folder named upwelling and change into it. ROMS is distributed with several Test Cases and the Upwelling example is the default which we will compile and run here.
mkdir upwelling
cd upwelling
  • Copy the build.bash file distributed with ROMS to your Projects/upwelling directory.
cp /srv/ckpt/roms/shared/src/ROMS/Bin/build.bash .

Next we need to configure a few options inside build.bash so that it finds the directories where the source code and your Project are located.

  • Open the build.bash script you just copied into your upwelling directory using your preferred text editor, e.g. vi.
vi build.bash
  • Scroll down until you find ROMS_APPLICATION. You will notice it is set as follows:
export ROMS_APPLICATION=UPWELLING
We do not need to change this, but it is the first thing you will alter when starting your own project. It tells ROMS the name of an include file that will contain all the directives to the C-PreProcessor to configure your application at compile time. ROMS's rule is to change this string to lowercase and append ".h", so here it will search for a file called upwelling.h. It must be in the directory specified by MY_PROJECT_DIR:
  • Scroll down until you find MY_PROJECT_DIR and set it as follows:
export MY_PROJECT_DIR=${HOME}/Projects/upwelling
This obviously assumes you put Projects/upwelling under your home directory.
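The lowercase-and-append-".h" naming rule can be sketched in shell (a hypothetical one-liner, just to illustrate the convention; it is not part of build.bash):

```shell
# Sketch of ROMS's header-naming rule: lowercase the application name, append .h
ROMS_APPLICATION=UPWELLING
header="$(echo "${ROMS_APPLICATION}" | tr '[:upper:]' '[:lower:]').h"
echo "${header}"   # -> upwelling.h
```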

If you frequently move your ROMS project between hosts where you have a different directory structure, e.g. a temporary scratch space, you can use the MY_ROOT_DIR variable to minimize the changes you make to build.bash.

  • For example:
export MY_ROOT_DIR=/usr/scratch/bruce
export MY_PROJECT_DIR=${MY_ROOT_DIR}/Projects/upwelling

Next we tell build.bash where to find the ROMS source code downloaded from the svn repository (which you can keep up to date with the svn update command - see more on this at LINK ). Note that most of the source code changes you make to customize ROMS will be made in your Projects space, and need not be made to the downloaded code directly. We will discuss exceptions to this during the tutorial, and how source code modifications interact with svn.

  • Set MY_ROMS_SRC to the location of the source code:
export MY_ROMS_SRC=/srv/ckpt/roms/shared/src
In practice, you will probably do something more like this:
export MY_ROMS_SRC=${MY_ROOT_DIR}/src
assuming this is the relative path in which you keep your source code on the various machines you work on.

Make sure that MY_CPP_FLAGS is not set. Sometimes it is set in the distributed build.bash example. Comment out options with the # symbol like so:

#export MY_CPP_FLAGS="-DAVERAGES"
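If you later do want extra CPP options, a common bash idiom is to accumulate them in MY_CPP_FLAGS. This is a sketch only; AVERAGES appears in the distributed example, and DIAGNOSTICS_TS is just an illustrative second flag:

```shell
# Hypothetical: accumulate several CPP options before running build.bash
export MY_CPP_FLAGS="-DAVERAGES"
export MY_CPP_FLAGS="${MY_CPP_FLAGS} -DDIAGNOSTICS_TS"
echo "${MY_CPP_FLAGS}"   # -> -DAVERAGES -DDIAGNOSTICS_TS
```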

The UNSW Computer Lab machines are single core, so we need to tell build.bash not to assume MPI parallel compilation.

  • Comment out the options for USE_MPI and USE_MPIF90
#export USE_MPI=on
#export USE_MPIF90=on
  • If you were compiling in parallel you would leave the default entries in build.bash.
export USE_MPI=on
export USE_MPIF90=on
  • We leave the compiler option as the default because this says use the ifort (Intel FORTRAN) compiler which is what we want on the UNSW machines.
export FORT=ifort
  • In the interests of speed for this tutorial, we turn off compiler optimization by activating the debug option:
export USE_DEBUG=on
On the UNSW Lab machines compiling with optimization on will take over 15 minutes, but with optimization off (USE_DEBUG=on) it will be less than 60 seconds.

Save and close the build.bash file.
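Recapping, the build.bash settings used in this tutorial (collected from the steps above) are:

```shell
# Summary of the build.bash edits made for this tutorial
export ROMS_APPLICATION=UPWELLING
export MY_PROJECT_DIR=${HOME}/Projects/upwelling
export MY_ROMS_SRC=/srv/ckpt/roms/shared/src
#export USE_MPI=on            # commented out: single-core lab machines
#export USE_MPIF90=on
export FORT=ifort             # Intel FORTRAN compiler
export USE_DEBUG=on           # skip optimization for a fast build
echo "${ROMS_APPLICATION} FORT=${FORT} DEBUG=${USE_DEBUG}"
```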


Copy the input and CPPDEFS options files

We need three more files in Projects/upwelling to configure and run ROMS. We copy the versions downloaded with svn because these are files you will work with locally when you experiment with changes to the test case example configuration.

  • Copy files ocean_upwelling.in, varinfo.dat and upwelling.h into the Projects/upwelling directory you just created.
cd ~/Projects/upwelling
cp /srv/ckpt/roms/shared/src/ROMS/External/ocean_upwelling.in .
cp /srv/ckpt/roms/shared/src/ROMS/External/varinfo.dat .
cp /srv/ckpt/roms/shared/src/ROMS/Include/upwelling.h .

View the file upwelling.h. It contains all the C-Pre-Processor (CPP) options that the compiler interprets to activate certain source code options within ROMS.

View the file ocean_upwelling.in. It contains the input options that ROMS reads from standard input at run time to set options that need not be fixed at compile time.

View varinfo.dat. The file varinfo.dat contains descriptions of the names and attributes of input and output variables that ROMS reads and writes from netcdf files. For most applications you will not need to change the entries in this file. If you need to know the default units assumed for different variables, those are noted in this file. (Before we run ROMS, we will need to tell it where to find this file).

Now we are ready to compile ROMS by executing the build.bash script.

Compile ROMS

Before you run ROMS, you need to compile it to create an executable oceanS file (S for serial or single processor computer), or oceanM file (if using MPI on a parallel computer).

  • Go to your upwelling project directory:
cd ~/Projects/upwelling
  • Then type:
./build.bash
  • If lots of output streams by on the screen then compilation is proceeding, and may take some time.
  • If the build process ends quickly with an error, then it is likely that build.bash does not point to the correct location for the upwelling.h file, the FORTRAN compiler, or some libraries. We describe common getting started errors and solutions in the next section.
  • You may give the option -j to the build command to distribute the compilation to multiple processors if your host supports this, e.g.:
./build.bash -j 8
to compile on 8 processors at once.

If your build was successful it will not have reported any errors, and there will be an executable file in your Projects/upwelling directory called oceanG. The "G" in the file name indicates build.bash activated the USE_DEBUG option.

If USE_DEBUG were not selected, the executable would be oceanS, where the "S" indicates "serial" or "single-processor" because we deactivated MPI.

If you had activated MPI with the USE_MPI option the executable would be named oceanM.

(See also FAQ: My build finished with no errors, where is the ROMS executable?).


Common getting started compile error messages

Getting past the first few errors with compilation is often tricky. Carefully read any error messages you get for clues on what might be wrong with your configuration. Here are some common difficulties new users encounter getting started when first executing the build.bash command.

  • Compilers/../ROMS/Include/cppdefs.h:709:22:
    error: /student/0/a0000020/Projects/upwelling/upwelling.h: No such file or directory
    This says the file upwelling.h is not where Build expects it to be, which is in MY_PROJECT_DIR. You set this to ~/Projects/upwelling.
  • cp: cannot stat `/opt/intelsoft/netcdf/include/netcdf.mod': No such file or directory
    This says that netcdf is not where build.bash expects to find it. Locate the netcdf include and lib directories with steps something like:
    cd /usr
    find . -name netcdf.mod -print
    ./local/netcdf-3.6.2/include/netcdf.mod
    ./local/netcdf/intel/3.6.3/include/netcdf.mod
    This tells us the most recent (3.6.3) netcdf is in /usr/local/netcdf/intel/3.6.3. Direct ROMS to this location by making two changes to build.bash. First, advise ROMS to read your changes to the default library path by uncommenting the option for USE_MY_LIBS.
export USE_MY_LIBS=on
Then specify the correct location for netcdf:
export NETCDF_INCDIR=/usr/local/netcdf/intel/3.6.3/include
export NETCDF_LIBDIR=/usr/local/netcdf/intel/3.6.3/lib
Warning: Be careful where you make this change. You need to make it for the ifort compiler option, and NOT for the USE_NETCDF4 option (we are using netcdf-3). If you've done this correctly, your compilation with build.bash should now succeed.


Run ROMS

You run ROMS by executing the oceanG (or oceanS) binary, giving it the ocean_upwelling.in file as UNIX standard input.

./oceanG < ocean_upwelling.in
ROMS standard output will be typed to the screen. To save it to a file instead, enter, e.g.:
./oceanG < ocean_upwelling.in > my_upwelling.log

If you have compiled a parallel (MPI) executable, the syntax for running the model is slightly but critically different. The ocean_upwelling.in file is no longer read from UNIX standard input (it is read by all the MPI processes), so the "<" disappears from the command, and you need the correct syntax on your UNIX host for running an MPI process. It is probably something like:

mpirun -np 8 ./oceanM ocean_upwelling.in > my_upwelling.log
where the "-np 8" indicates use 8 processors; this number must equal the number of tiles, NtileI times NtileJ, set in ocean_upwelling.in.
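The relationship between the -np count and the ROMS tiling can be sketched in shell arithmetic (tile counts hypothetical):

```shell
# The -np count must equal NtileI * NtileJ from ocean_upwelling.in
# (2 and 4 are hypothetical values for illustration)
NtileI=2
NtileJ=4
np=$((NtileI * NtileJ))
echo "mpirun -np ${np} ./oceanM ocean_upwelling.in"
```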

(See also FAQ: What do I have to do to run ROMS?).


Common getting started run error messages

bash: oceanG: command not found
The working directory is not in your UNIX path. That's why we type "dot-slash" in front of the commands above.


Successful execution

Standard Output

When ROMS runs it will type a lot of information to UNIX standard output. This is the "logfile" you named following the ">", or your terminal if you did not redirect stdout.

STDOUT shows the following:

  • UNIX process info, run time, run TITLE
Process Information:
Thread # 0 (pid= 4449) is active.
Model Input Parameters: ROMS/TOMS version 3.2
Monday - March 23, 2009 - 10:02:39 AM
-----------------------------------------------------------
Wind-Driven Upwelling/Downwelling over a Periodic Channel
  • OS, compiler information, SVN version, and your MY_ROMS_SRC, MY_HEADER_DIR and ROMS_APPLICATION settings
Operating system : Linux
CPU/hardware  : i686
Compiler system  : ifort
Compiler command : /usr/local/intel/fc/10.1.021/bin/ifort
Compiler flags  : -heap-arrays -ip -O3 -pc80 -xW -free

SVN Root URL  : https://www.myroms.org/svn/src/trunk
SVN Revision  : 333
Local Root  : /srv/ckpt/roms/shared/src
Header Dir  : /student/0/a0000020/srv/Projects/upwelling
Header file  : upwelling.h
Analytical Dir: /student/0/a0000020/srv/Projects/upwelling

Resolution, Grid 01: 0041x0080x016, Parallel Threads: 1, Tiling: 001x001
Check that these are what you intended. In the last line above:
  • "Grid 01" pertains to future ROMS developments with multiple nested/connected grids,
  • 0041x0080x016 shows the grid size is 41 x 80 x 16 grid points in the I (xi), J (eta) and K (vertical) directions
  • The Parallel/Tiling message shows you are using a single process and a single domain tile. When using MPI, this message will describe how many tiles you are using and the MPI processes assigned.
  • Input parameters set in ocean_upwelling.in
Physical Parameters, Grid: 01
=============================

288 ntimes Number of timesteps for 3-D equations.
300.000 dt Timestep size (s) for 3-D equations.
30 ndtfast Number of timesteps for 2-D equations between
each 3D timestep.
...
Output Averages File: ocean_avg.nc
Output Diagnostics File: ocean_dia.nc
  • Then some more about the tiling when running in parallel
  • The C-PreProcessor (CPP) flags set in upwelling.h but AS MODIFIED by ROMS when interpreting and checking the selected CPP options.
Activated C-preprocessing Options:

UPWELLING Wind-Driven Upwelling/Downwelling over a Periodic Channel
ANA_BSFLUX Analytical kinematic bottom salinity flux.
ANA_BTFLUX Analytical kinematic bottom temperature flux.
ANA_GRID Analytical grid set-up.
ANA_INITIAL Analytical initial conditions.
...
You should check that the CPP options displayed here agree with what you intended. For example, if you inadvertently specify more than one horizontal advection scheme option, ROMS will have chosen only one and reported that option here.
  • The preamble in STDOUT continues with information about the space and time discretization: grid spacing, grid volume, Courant number (time step stability) and stiffness (related to s-coordinate accuracy).
  • Then the model starts time stepping:
NL ROMS/TOMS: started time-stepping: (Grid: 01 TimeSteps: 00000001 - 00000288)

STEP Day HH:MM:SS KINETIC_ENRG POTEN_ENRG TOTAL_ENRG NET_VOLUME

0 0 00:00:00 0.000000E+00 6.579497E+02 6.579497E+02 3.884376E+11
DEF_HIS - creating history file: ocean_his.nc
WRT_HIS - wrote history fields (Index=1,1) into time record = 000000
DEF_AVG - creating average file: ocean_avg.nc
DEF_DIAGS - creating diagnostics file: ocean_dia.nc
1 0 00:05:00 3.268255E-13 6.579497E+02 6.579497E+02 3.884376E+11
2 0 00:10:00 6.503587E-12 6.579497E+02 6.579497E+02 3.884376E+11
3 0 00:15:00 4.592307E-11 6.579497E+02 6.579497E+02 3.884376E+11

...
This output indicates several things:
  • the run is programmed to run from time step 1 to 288
  • the run starts at time 00:00:00
  • netcdf output HISTORY, AVERAGES and DIAGNOSTICS files are created. Every time ROMS creates a new netcdf file, and writes to an existing file, it reports this to STDOUT
  • output is written to the HISTORY file
  • then global quantities related to the model KE, PE and domain volume are reported on each time step

Note: In 99% of situations, getting started problems with model set-up and configuration can be diagnosed by carefully reading the STDOUT above. Things to look for are:

  • misconfigured CPP options (what you got is not what you thought you asked for)
  • parameter errors (e.g. you activated horizontal mixing but left the coefficient as zero)
  • misnamed output files (that's why the files from your last run got overwritten)
  • irrational choices of grid spacing or time step
  • initial/boundary/forcing data being read from the wrong file, or not read at all (because you selected analytical conditions)

At the conclusion of the run, ROMS reports information about run time:

Elapsed CPU time (seconds):

Thread # 0 CPU: 108.079
Total: 108.079

Nonlinear model elapsed time profile:

Initialization ................................... 0.016 ( 0.0148 %)
Processing of input data ......................... 0.028 ( 0.0259 %)
Processing of output time averaged data .......... 4.312 ( 3.9899 %)
...
  • about the number of output records written to each file
ROMS/TOMS - Output NetCDF summary for Grid 01:
number of time records written in HISTORY file = 00000005
...
  • and the analytical files included
Analytical header files used:

ROMS/Functionals/ana_btflux.h
ROMS/Functionals/ana_grid.h
...
If you used a modified analytical file in your MY_HEADER_DIR it will be reported here and is another thing you should check for consistency with your intentions.


Netcdf file output

As reported above, ROMS created 4 output netcdf files when it ran: ocean_his.nc, ocean_avg.nc, ocean_dia.nc, and ocean_rst.nc. These are, respectively:

  • history records or 'snapshots' of the model state at selected time intervals
  • averages of the model state over selected intervals (not necessarily the same intervals as the history)
  • diagnostics of the model state, the precise contents of which are controlled by CPP options
  • a restart file with everything ROMS needs to restart an application. This is useful if your job crashes at some point and you want to recommence from a previous state without starting over. Typically the restart file is set to keep just 2 time records, continually over-writing the oldest as the run proceeds. This behaviour is controlled in ocean_upwelling.in. Also, when ROMS "blows up" it dumps the ocean state to a 3rd record in the restart file.
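The restart cycling described above is controlled by keywords in ocean_upwelling.in; a hypothetical excerpt (values illustrative, check your own file):

```
NRST == 288          ! timesteps between writing restart records
LcycleRST == T       ! cycle between two records, overwriting the oldest
```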

You can browse the contents of netcdf files at the UNIX command line with the command ncdump, e.g.

ncdump -h ocean_his.nc | more

Note the use of the "-h" option. This restricts the output from ncdump to be just the header information, or metadata, in the netcdf file. Without the "-h" option you will get the entire contents of the file converted to ascii.

Things to notice when you ncdump the ocean_his.nc file are that it contains all the input parameters (time step, mixing coefficients, s-coordinate parameters, etc.) from ocean.in, the model grid coordinates (x, y, lon, lat, depth, Coriolis parameter, etc.) which may have been computed by the ANA_GRID option or read from an input grid netcdf file, in addition to the actual model output (ocean_time, zeta, u, v, temp, salt).

There are netcdf global attributes that echo much of the information typed to STDOUT. This includes compiler, svn version, and project directory information, and all the CPP options. This is a valuable source of information when returning to a project and trying to figure out what you did! The global attributes metadata show precisely which options were activated when creating the output in this netcdf file.


Changing the UPWELLING test case configuration

Compile time changes: upwelling.h

Changes to options that must be set at compilation time are made to the upwelling.h file. These settings are interpreted during the C-PreProcessing step.

To see what the present options are, edit the upwelling.h file:

vi upwelling.h

Recall that the actual options active after this file is interpreted will be typed to STDOUT (the "logfile") and also written to the output netcdf file in the global attributes.

To see all the options that might be set using C-PreProcessor directives, you can browse the cppdefs.h file in the ROMS/Include directory underneath the MY_ROMS_SRC location set in your build.bash. In this case:

cd /srv/ckpt/roms/shared/src
more ROMS/Include/cppdefs.h

The contents of this file are almost entirely comments and are provided to document the options available. For more information consult WikiROMS or the User Forum.

At the very bottom of cppdefs.h you will see a short code segment that loads the actual application options from ROMS_HEADER. This variable is set by the ROMS_APPLICATION value in build.bash.
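Paraphrasing from memory (check your own copy of cppdefs.h, this is a sketch rather than a verbatim quote), the construct looks something like:

```c
/* Sketch of the tail of ROMS/Include/cppdefs.h:
   ROMS_HEADER is set by build.bash from ROMS_APPLICATION */
#if defined ROMS_HEADER
# include ROMS_HEADER       /* e.g. upwelling.h */
#endif
```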


Run time changes: ocean_upwelling.in

Changes to options that are set at run time are made to the ocean_upwelling.in file.

To see what the present options are, edit the ocean_upwelling.in file:

vi ocean_upwelling.in

Comments at the beginning of this file document the KEYWORD == value syntax.
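For example, parameters appear as KEYWORD == value pairs (the values below are the upwelling defaults echoed to STDOUT earlier):

```
NTIMES == 288        ! number of timesteps for the 3-D equations
    DT == 300.0d0    ! timestep size (s)
```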

Comments at the end of file provide brief summaries of what each parameter does.

For more information consult WikiROMS or the User Forum.

Recall that the actual parameter values ROMS uses after reading this file will be typed to STDOUT (the "logfile") and also written to the output netcdf files.

A Realistic model example: LaTTE

THIS TEST CASE IS WAY OUT OF DATE AND ALMOST CERTAINLY WON'T WORK WITH A RECENT VERSION OF ROMS. IT NEEDS UPDATING. UNTIL I GET TO THAT, TREAT THIS SECTION AS A GUIDELINE ON THE BASIC STEPS TO CREATING A REALISTIC APPLICATION, BUT DON'T PERSEVERE TRYING TO RUN IT.


This section of the tutorial assumes you have successfully compiled and run the UPWELLING example above. Key concepts you should be comfortable with before you proceed are:

  • you need a new directory where you will keep the files specific to the new application
  • customize build.bash for the new application (copy the build.bash from upwelling because it has all the correct compiler and library settings)
  • set MY_PROJECT_DIR in build.bash to point to the directory for the new application
  • set ROMS_APPLICATION in build.bash to the correct name for the new application
  • if you wish to customize any of the ana_*.h files, copy just the ones you need into the new project directory
  • you don't need to make a copy the source code a new application

With these concepts in mind, we proceed by configuring ROMS to run a realistic coastal ocean application that includes open boundaries on 3 sides, open boundary tides and climatological open boundary velocity and tracer (temperature and salinity) conditions, surface meteorological forcing, and initial conditions, all provided by input netcdf files.

The example is called LaTTE_C because it simulates ocean conditions during the Lagrangian Transport and Transformation Experiment conducted on the New Jersey inner shelf in the Spring of 2006. The '_C' denotes a coarse resolution configuration suitable for this training exercise.


Create a latte_c project directory

We have placed the CPP options file latte_c.h, standard input ocean_lattec.in, and a modified varinfo.dat in /srv/ckpt/roms/shared/latte_c/Forward. Make a new Project directory for this new application and copy these 3 files. DO NOT copy all the netcdf files from /srv/ckpt/roms/shared/latte_c/in .

Edit build.bash

Set the correct entries for environment variables that define the user application.

ROMS_APPLICATION=LATTE_C causes build.bash to look for the file latte_c.h in order to set the CPP options

MY_PROJECT_DIR=${HOME}/Projects/latte_c will instruct build.bash where to look for the latte_c.h file.

Setting MY_HEADER_DIR would instruct ROMS where to look for the user functional files ana_*.h that over-ride default options. In this example, however, we don’t actually need to modify any of those functionals. This is typical of “realistic” applications where input grid, initial and boundary conditions are provided from data in input netcdf files.

Note: The format of ROMS output files (history, averages and restart) is the same as ROMS input initial conditions and climatology. This means the output of previous runs can become the initial conditions, or 3-D climatology (for nudging), for new runs.

Edit ocean_lattec.in

Open the ocean_lattec.in file in an editor. There are KEYWORDS that define the names of the input netcdf files for applications of this type:

  • GRDNAME is the grid file with coordinates, grid metrics (spacing), bathymetry, land/sea mask and Coriolis
  • ININAME is the initial conditions
  • BRYNAME is the file of open boundary sea level, velocity and tracer conditions
  • FRCNAME are the tides, river source, and surface meteorological forcing files. Notice there are multiple files; the number of files ROMS reads is set by the NFFILES parameter. On initialization, ROMS scans this list for each forcing variable it needs, using the first file that contains the variable and shadowing any entry in subsequent files. Therefore, if you want to re-run your model with a new set of wind data but happen to have other wind data in a file with all your other meteorology inputs, just put the new file at the beginning of the list.
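A hypothetical excerpt showing how multiple forcing files are listed (file names invented for illustration; the "\" continues the list across lines):

```
NFFILES == 3
FRCNAME == latte_tides.nc \
           latte_rivers.nc \
           latte_met.nc
```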

There are KEYWORDS that determine the output file names. These are:

  • RSTNAME, HISNAME, AVGNAME, DIANAME, STANAME, FLTNAME .... etc

There are keywords that set how many time steps ROMS takes between writing output. These are:

  • NRST,NHIS,NAVG,NDIA,NSTA ...etc

You can have ROMS write multiple records to each output file at these intervals, but periodically create a new file (to keep file sizes manageable) by setting the keywords:

  • NDEFHIS,NDEFAVG,NDEFDIA
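A hypothetical excerpt combining these keywords for the history stream (values invented for illustration):

```
HISNAME == ocean_his.nc    ! history file name
   NHIS == 72              ! timesteps between history records
NDEFHIS == 0               ! 0 = keep a single history file for the whole run
```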


Compile with build.bash

If you have set the entries in build.bash correctly, you compile exactly as before by executing the script.

Watch the output of the build process. You should see that instead of "Projects/upwelling/Build" the compilation is now writing temporary files to "Projects/latte_c/Build". This Build subdirectory is kept separate so you can work on two projects at once and not confuse things.

cd to the Build subdirectory and look at some of the files there, e.g.

cd ~/Projects/latte_c/Build
more u3dbc_im.f90

This is the file that sets open boundary conditions on 3-d velocity. This is the Fortran90 file that is generated after the C-PreProcessor has done its job. If you find your model is doing things you don't expect, it can be instructive to view the ".f90" file and corresponding ".F" (in MY_ROMS_SRC) to see whether the CPP options being processed are what you intended. If not, review your header file (in this case latte_c.h), and the list of CPP options typed to STDOUT. Don't edit the "f90" file directly because it gets over-written when you recompile.

Before we run latte_c

Browse the latte_c.h options

Use vi or more to browse the CPP options used in this "realistic" application. Things to notice that are different from upwelling.h are:

  • There are no ANA_INITIAL or ANA_WINDS options. If analytical initial and forcing options are not set with a #define then ROMS defaults to reading this information from input netcdf files.
  • WESTERN_WALL, EASTERN_WALL, SOUTHERN_WALL and NORTHERN_WALL. Where defined, these options close the corresponding boundary with a wall; boundaries without a _WALL option are open. (The nomenclature of the compass points assumes a grid oriented with west along i=1 and south along j=1). The upwelling case had closed boundaries to the north and south, and periodic conditions east-west. The open boundary conditions here are set with the following options:
  • WEST_FSCHAPMAN,WEST_M2FLATHER etc. indicate “west” side free surface (FS) and depth-averaged velocity/momentum (M2)
  • WEST_M3GRADIENT,WEST_TGRADIENT “west side” 3-d velocity/momentum (M3) and all tracers.
  • SSH_TIDES, UV_TIDES cause ROMS to add tidal variability in sea level and depth-average velocity using the harmonics read from the tides forcing file. See the wikiROMS entry on tides for more information.
  • #ifdef SSH_TIDES
    #define ADD_FSOBC

    This construct is a conditional test that causes prescribed mean or slowly varying sea surface height to be added to the tidal variability.
    The further option ANA_FSOBC means the prescribed value is set by one of the analytical functional include files. If ANA_FSOBC is not defined, ROMS will look for the boundary sea level in a boundary conditions file.
  • BULK_FLUXES In this application the surface meteorology forcing files give net shortwave and longwave radiation and the temperature, pressure and humidity conditions in the marine atmospheric boundary layer. These values are converted to air-sea fluxes of heat and momentum according to the Fairall et al. bulk formulae.
  • GLS_MIXING This activates the Generalized Length Scale vertical turbulence closure parameterization of Umlauf and Burchard. Parameters in ocean_lattec.in determine details such as whether the closure method is actually k-epsilon, k-kl, etc.
  • UV_PSOURCE, TS_PSOURCE These options activate point sources; in this case the inflow of the Hudson River.

The input netcdf files

Use ncdump to browse the contents of some of the input files, e.g.:

ncdump -h frc_lattec_wrf_Pair.nc | more

We see that in this case the dimensions of the data match the dimensions of the ROMS grid:

dimensions:
time = UNLIMITED ; // (1560 currently)
eta_rho = 82 ;
xi_rho = 146 ;
For time-varying 2D surface forcing data only (e.g. meteorological data), ROMS will regrid, during execution, 2D data defined on simple 1D coordinates (vectors of lon and lat). In all other cases, the forcing data must have already been interpolated to the ROMS grid.

ROMS associates the variable name of the forcing data with the appropriate internal variable by consulting the entries in varinfo.dat. Edit varinfo.dat and search for the string "Pair"

vi varinfo.dat
'Pair'  ! Input
'surface air pressure'
'millibar'  ! [millibar]
'Pair, scalar, series'
'pair_time'
'idPair'
'r2dvar'
1.0d0
This tells us 3 important things:
  • The name of the surface air pressure variable in the forcing netcdf is "Pair". If your netcdf file was made with some other name you can change it here, e.g. "press_sfc" and you DO NOT need to change the netcdf file. This is acceptable because the variable is strictly an Input variable as noted by the comment (! Input) at the end of the first line.
  • The units are assumed to be millibars. If your data are not in millibars you must either modify the data in the forcing netcdf file, or you may apply a scalar factor by editing the last line in the block above (presently set to 1.0d0).
  • The Pair data are defined at the times recorded by the variable named "pair_time" in the same forcing netcdf file. However, in the netcdf file itself this default can be reset by adding a "time" attribute to Pair. Return to viewing the netcdf file with ncdump to see this:
variables:
double Pair(time, eta_rho, xi_rho) ;
Pair:long_name = "Surface air pressure" ;
...
Pair:time = "ncep_time" ;
double ncep_time(time) ;
ncep_time:long_name = "forcing observations time" ;
ncep_time:units = "days since 01-Jan-2006" ;
The "time" attribute "Pair:time" indicates that variable "ncep_time" in this file contains the times at which the air pressure is reported. This overrides the value in varinfo.dat. The units attribute of ncep_time, "days since 01-Jan-2006", sets the time base (under the CF convention this would be written "days since 2006-01-01 00:00:00"). This time base must match in ALL your forcing netcdf files AND the initial conditions file. ROMS will not enforce consistency between different time conventions - that would be nice, but for now it's on you. (Nor will the TIME_REF keyword in ocean.in adjust this time - it only sets the units attribute string for ocean_time in output netcdf files, so you need to take care to set all these times to be consistent.) Nor will ROMS convert data units if they do not match what is assumed in varinfo.dat. The units attributes in this file are therefore purely metadata that document the file contents.


Let's also examine the format of the open boundary conditions file:

ncdump -h lattec_bndy_uv2d_half.nc | more

The file has the same dimensions as the forcing files and output netcdf files. It contains parameters and variables that define the vertical s-coordinate and then the actual open boundary data such as:

float temp_east(time, s_rho, eta_rho) ;
temp_east:long_name = "potential temperature east boundary condition" ;
temp_east:units = "Celcius" ;
temp_east:field = "temp_east, scalar, series" ;
temp_east:time = "ocean_time" ;
Notice that this variable has dimensions "(time, s_rho, eta_rho)" which describe a time-varying 2D spatial structure in depth (s_rho) and the horizontal coordinate (eta_rho). This "east" boundary is at constant "i" index (i=L) and therefore has no "xi" coordinate dimension. The data along the southern boundary, on the other hand, temp_south, have dimensions "(time, s_rho, xi_rho)".
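This naming and dimension pattern generalizes to every side of the grid. As a memory aid (purely an illustration, not ROMS code), a few lines of Python that generate the expected dimension tuple for a boundary variable on each side:

```python
# East/west boundaries lie at constant i, so they vary along eta;
# north/south boundaries lie at constant j, so they vary along xi.
def bry_dims(side, vertical=("s_rho",)):
    horiz = "eta_rho" if side in ("east", "west") else "xi_rho"
    return ("time",) + tuple(vertical) + (horiz,)

print(bry_dims("east"))   # ('time', 's_rho', 'eta_rho')
print(bry_dims("south"))  # ('time', 's_rho', 'xi_rho')
# A 2D field such as the free surface has no vertical dimension:
print(bry_dims("north", vertical=()))  # ('time', 'xi_rho')
```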

Run the latte_c example

Run the model exactly as you did in the UPWELLING example, but giving the new ocean_lattec.in as the input file.

./oceanG < ocean_lattec.in

Stdout for Latte_C

The preamble to STDOUT (which you might redirect to a logfile with ">") will resemble what you saw for UPWELLING: UNIX process information, compiler and svn version information, model input parameters read from the ocean.in file, and the list of active CPP options.

The differences from UPWELLING come shortly before the model begins time stepping. ROMS reports the following information to help you check that you have configured the model correctly.

  • Initial conditions
NLM: GET_STATE - Read state initial conditions, t = 94 00:00:00
(File: lattec_ini_94.nc, Rec=0001, Index=1)
- free-surface
(Min = -7.02081803E-01 Max = 0.00000000E+00)
This shows the initial time is 94 days, that initial conditions were read from record 1 of file lattec_ini_94.nc, and that the free-surface data read ranged from -0.702 meters to 0 meters. Check: are data being read from the correct file for the correct record/time? If you see unrealistic values in the Min/Max range you may have special values (e.g. 999) in your input netcdf file, or a units error.
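A quick sanity filter on these reported ranges can catch unit errors and unconverted fill values early. Here is a hypothetical checker in Python (the variable name, bounds and values are made up for illustration; in practice you would scan your logfile or the netcdf files themselves):

```python
def check_range(name, vmin, vmax, lo, hi):
    """Flag a field whose reported Min/Max falls outside a physically
    plausible range -- e.g. an unconverted fill value such as 999."""
    ok = lo <= vmin and vmax <= hi
    if not ok:
        print(f"WARNING: {name} range [{vmin}, {vmax}] outside [{lo}, {hi}]")
    return ok

# Free surface should be within a few meters of zero:
print(check_range("zeta", -0.702, 0.0, -5.0, 5.0))    # True
print(check_range("zeta", -0.702, 999.0, -5.0, 5.0))  # warns, then False
```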
  • Forcing data - static non time varying
GET_NGFLD - tidal period
(Min = 4.30820452E+04 Max = 9.67260840E+04)
GET_2DFLD - tidal elevation amplitude
(Min = 0.00000000E+00 Max = 1.13473975E+00)
The vector of tidal periods (in hours) is read first, followed by 2D fields of tidal elevation and velocity. Again: check the file source and the Min/Max range.
  • Forcing data - time varying
GET_NGFLD - river runoff mass transport, t = 94 00:00:00
(Rec=0465, Index=2, File: roms_lattec_river.nc)
(Tmin= -370.0000 Tmax= 267.0000)
(Min = -9.25473328E+02 Max = -1.69902756E+02)
...
GET_2DFLD - surface u-wind component, t = 94 00:00:00
(Rec=0001, Index=1, File: frc_latte_wrf_Uwind.nc)
(Tmin= 94.0000 Tmax= 158.9583)
(Min = -8.73357999E-02 Max = 4.12751995E+00)
...
GET_NGFLD - river runoff mass transport, t = 95 00:00:00
(Rec=0466, Index=1, File: roms_lattec_river.nc)
(Tmin= -370.0000 Tmax= 267.0000)
(Min = -9.88811890E+02 Max = -1.86609863E+02)
...
GET_2DFLD - surface u-wind component, t = 94 00:59:59
(Rec=0002, Index=2, File: frc_latte_wrf_Uwind.nc)
(Tmin= 94.0000 Tmax= 158.9583)
(Min = -1.26112560E-01 Max = 2.46840043E+00)
Time-varying forcing data will be read for times that bracket the current ROMS time. ROMS finds the correct times from the time variable in the files (which can be different for each variable - see the "time" attribute of the data). ROMS linearly interpolates the forcing data for these two records to the ROMS ocean_time on every model time step. The forcing data do not need to be regularly spaced in time, but they do need to bracket the ROMS initial time for the model to start. Again: check file names, data ranges and times. Recall that if forcing data of the same name occur in two of the forcing files in the list set by FRCNAME in ocean.in, then ROMS takes the first data it finds. This will be the filename in the message above.
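The interpolation itself is simple linear weighting in time. A minimal Python sketch of the idea (the real work is done internally in ROMS's Fortran source; this is only to make the behavior concrete):

```python
def interp_forcing(t, t1, f1, t2, f2):
    """Linearly interpolate a forcing field between the two bracketing
    records (t1, f1) and (t2, f2), element by element. Records need not
    be equally spaced in time; t must satisfy t1 <= t <= t2."""
    w = (t - t1) / (t2 - t1)
    return [(1.0 - w) * a + w * b for a, b in zip(f1, f2)]

# A field sampled half-way between records half a day apart:
print(interp_forcing(94.25, 94.0, [2.0], 94.5, [3.0]))  # [2.5]
```

This also makes clear why the data must bracket the model time: with only one record in hand there is nothing to weight against.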
  • The model then begins execution, reporting global energy diagnostics and any netcdf files it creates and writes to.
STEP Day HH:MM:SS KINETIC_ENRG POTEN_ENRG TOTAL_ENRG NET_VOLUME

0 94 00:00:00 9.698352E-03 2.011729E+02 2.011826E+02 1.421876E+12
DEF_HIS - creating history file: ./out/his_lattec.nc
WRT_HIS - wrote history fields (Index=1,1) into time record = 0000001
DEF_STATION - creating stations file: ./out/sta_ocean.nc
1 94 00:06:00 8.873902E-03 2.011284E+02 2.011372E+02 1.422005E+12
2 94 00:12:00 8.377911E-03 2.011090E+02 ...

If in this initialization phase you do not see a report of values you expected ROMS to read, then ROMS did not read those data. This could be because you have an ANA option set that causes ROMS to compute the information internally, or because you have set other CPP options such that there is no need to read the data. ROMS does not read data from netcdf files that it does not need to run. This information will help you diagnose misconfiguration, or a misunderstanding of the interaction of various CPP options.

  • New forcing data when required
9 94 00:54:00 6.518542E-03 2.012623E+02 2.012688E+02 1.424463E+12
GET_2DFLD - surface u-wind component, t = 94 02:00:00
(Rec=0003, Index=1, File: frc_latte_wrf_Uwind.nc)
(Tmin= 94.0000 Tmax= 158.9583)
(Min = -4.55454426E-01 Max = 4.03826709E+00)
When the ROMS ocean_time reaches the end of the interval bracketed by the forcing data it holds in memory, it reads the next set of values from the netcdf file and reports this to STDOUT. If ROMS runs for a while and then crashes shortly after reading some new forcing data, check the Min/Max range to be sure you don't have corrupt data.
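A toy Python version of that record-advancing logic (illustration only; ROMS does this internally in Fortran) shows how the bracketing pair of records shifts as model time moves forward:

```python
def bracketing_records(times, t):
    """Return the pair of record indices whose times bracket model time t,
    mimicking in spirit how forcing records are cycled as time advances."""
    for k in range(len(times) - 1):
        if times[k] <= t <= times[k + 1]:
            return k, k + 1
    raise ValueError("model time outside the forcing time range")

rec_times = [94.0, 94.5, 95.0]  # forcing record times, in days
print(bracketing_records(rec_times, 94.1))  # (0, 1)
print(bracketing_records(rec_times, 94.6))  # (1, 2)
```

The ValueError branch corresponds to the real-world failure mode above: forcing data that do not span the model time (or corrupt times in the file) stop the run.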

Eventually ROMS concludes and provides the same report it did in the UPWELLING example: an elapsed-time profile, the number of output records written, and the analytical functions used at compilation.

Plotting netcdf output with Matlab

Basics of Matlab-Netcdf

Matlab version 2008b, which is installed on the UNSW Computer Lab machines, has support for reading netcdf files. It uses built-in Java tools to do this, so the machine-dependent set of mexnc routines required by earlier versions of Matlab is no longer necessary.

To ease the process of reading netcdf files into the Matlab workspace we have however installed a set of Matlab m-files called SNCTOOLS, written by John Evans at Mathworks. These tools should be in your Matlab path. You can verify this in Matlab by entering the command which nc_varget.

nc_varget is the workhorse utility that reads subsets of data from a netcdf file. Many of the Matlab tools that people distribute for working with ROMS output use SNCTOOLS functions, like nc_varget, to provide the interface for reading netcdf files. You won't get much further in this part of the tutorial if you can't use nc_varget.

Enter help nc_varget to see the syntax.

Note that nc_varget also supports reading from OPeNDAP/THREDDS data servers in addition to reading from local netcdf files.

Using the roms_wilkin Matlab routines

ROMS output files are CF-Compliant netcdf files and therefore there are many software tools that allow you to browse, extract and plot output easily. There are also a number of collections of code written for Matlab that offer tools for plotting ROMS output. What you choose to use is a matter of personal preference and the functionality offered.

For this tutorial I show just a few tools out of the set of roms_wilkin Matlab tools described in more detail at the tiddlywiki http://romsmatlab.tiddlyspot.com and also at this thread on the ROMS forum.

On the UNSW Computer Lab machines you can add the directory of roms_wilkin Matlab routines to your Matlab path using the command:

>> addpath('/student/0/a0000020/matlab/roms_wilkin','-end')

The {z,s,i,j}view routines in roms_wilkin make simple plots directly from a ROMS file or OPeNDAP URL by slicing along coordinate directions. Enter help roms_zview to see the syntax. The functions all make use of the model coordinates loaded into a structure by function roms_get_grid.

In matlab, load this grid structure from any file containing the grid coordinates, such as an output file:

>> file = 'his_lattec.nc';
>> g = roms_get_grid(file,file);

Then to make a plot at constant z = -3 metres of salinity from time record 2 of the file, overlaying green-coloured velocity vectors at every 3rd grid point (scaled by a factor of 0.2 deg lon/lat per m/s), enter:

>> roms_zview(file,'salt',2,-3,g,3,0.2,'g');
Not much has happened because we are only a few hours into the simulation, so make it easier to see the extent of the low-salinity Hudson River water by changing the colorbar range and zooming in:
>> caxis([20 34]); colorbar
>> axis([-74.4 -73.2 39.9 41])

If you specify time as a string instead of an index, roms_zview will endeavour to parse the time/date information in the file and select the record nearest in time to plot. You can also give an optional output argument to roms_zview, in which case it returns a structure with the data that went into the plot.

>> thedata = roms_zview(file,'salt','2006-04-05 02:00',-3,g,3,0.2,'g');

The output structure thedata contains:

thedata =

x: [82x146 double]
y: [82x146 double]
data: [82x146 double]
t: 7.3277e+05
tstr: '05-Apr-2006 03:00:00'
u: [82x145 double]
v: [81x146 double]
ue: [726x1 double]
vn: [726x1 double]
xq: [726x1 double]
yq: [726x1 double]

"File" could equally well have been an OPeNDAP URL to ROMS output such as the example in the ROMS Forum.

>> file = 'http://tashtego.marine.rutgers.edu:8080/thredds/dodsC/roms/cblast/2002-050/averages'
>> g = roms_get_grid(file,file); % the grid structure
>> % temperature slice for time step nearest to 20JUN2002, at 2-m
>> % depth, with every 3rd velocity vector over-plotted
>> roms_zview(file,'temp','20-Jun-2002',-2,g,3,.1,'k')