Ocean Modeling Discussion

ROMS/TOMS

PostPosted: Sun Dec 30, 2012 8:09 pm 

Joined: Tue Oct 16, 2012 8:55 pm
Posts: 29
Location: Old Dominion University
I have the Upwelling test case up and running in serial mode on Linux with gfortran. I created a new directory in the Projects folder, EstuarySediment, and copied into it: build.bash from the successful Upwelling run, estuary_test.h, ocean_estuary_test.in, sediment_estuary_test.in, and varinfo.dat. Then I modified the build script to:

export ROMS_APPLICATION=ESTUARY_TEST
export MY_PROJECT_DIR=${MY_ROOT_DIR}/Projects/EstuarySediment

The build appears to configure properly and creates oceanS.
I try to execute oceans with the command:

./oceans < ocean_estuary_test.in sediment_estuary_test.in > sedimentTest.out

The run quickly fails, and sedimentTest.out reads as follows:

MOD_NCPARAM - Unable to open variable information file:
$HOME/ROMS/trunk/ROMS/External/varinfo.dat
Default file is located in source directory.

It doesn't seem to make any difference whether I set the path in ocean_estuary_test.in to the ROMS External source directory or to the copy of varinfo.dat in my project directory,

VARNAME = ROMS/External/varinfo.dat, or VARNAME = varinfo.dat,

I get the same error.

Any recommendations for a fix, or any other obvious flaws in my setup so far?


PostPosted: Mon Dec 31, 2012 7:07 pm 

Joined: Fri Apr 02, 2004 4:46 pm
Posts: 35
Location: USGS, Woods Hole, USA
I see a couple of things that are either typos or mistakes.

The executable is oceanS (note the uppercase S). The input file you should be redirecting to it is
ocean_estuary_test.in, not sediment_estuary_test.in (the sediment file should be specified inside ocean_estuary_test.in).
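
In other words, the run command should look like:

./oceanS < ocean_estuary_test.in > sedimentTest.out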

Unless you have a reason to change it, it is best to leave varinfo.dat where it is and, in the .in file, use the full path name, e.g.:

VARNAME = /fullpathtoromssourcecode/ROMS/External/varinfo.dat

If that doesn't work, post your build.bash file and an ls of what you have in your project directory.

_________________
Chris Sherwood, USGS
1 508 457 2269


PostPosted: Tue Jan 01, 2013 7:17 pm 

Joined: Tue Oct 16, 2012 8:55 pm
Posts: 29
Location: Old Dominion University
Chris,
I sincerely appreciate the quick reply and assistance. The typos you identified (oceans versus oceanS) were errors in my posting, not in my actual commands. I reset the ocean_estuary_test.in pointer to varinfo.dat as follows:

VARNAME = ${HOME}/ROMS/trunk/ROMS/External/varinfo.dat

Then I re-ran ./oceanS < ocean_estuary_test.in > sedimentTest.out and got the same "unable to open varinfo.dat" error.

Below is the ls from my project directory (Projects/EstuarySediment):

Build
build.bash
oceanS
ocean_estuary_test.in
sediment_estuary_test.in
estuary_test.h

Note: I was unable to upload my build script. It cycled "upload in progress" for over 30 minutes, so I stopped, tried again, and finally gave up and pasted the text below. Thanks again for your assistance.

John

#!/bin/bash
#
# svn $Id: build.bash 585 2012-01-03 18:44:28Z arango $
#::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
# Copyright (c) 2002-2012 The ROMS/TOMS Group :::
# Licensed under a MIT/X style license :::
# See License_ROMS.txt :::
#::::::::::::::::::::::::::::::::::::::::::::::::::::: Hernan G. Arango :::
# :::
# ROMS/TOMS Compiling Script :::
# :::
# Script to compile a user application where the application-specific :::
# files are kept separate from the ROMS source code. :::
# :::
# Q: How/why does this script work? :::
# :::
# A: The ROMS makefile configures user-defined options with a set of :::
# flags such as ROMS_APPLICATION. Browse the makefile to see these. :::
# If an option in the makefile uses the syntax ?= in setting the :::
# default, this means that make will check whether an environment :::
# variable by that name is set in the shell that calls make. If so :::
# the environment variable value overrides the default (and the :::
# user need not maintain separate makefiles, or frequently edit :::
# the makefile, to run separate applications). :::
# :::
# Usage: :::
# :::
# ./build.bash [options] :::
# :::
# Options: :::
# :::
# -j [N] Compile in parallel using N CPUs :::
# omit argument for all available CPUs :::
# -noclean Do not clean already compiled objects :::
# :::
# Notice that sometimes the parallel compilation fails to find the MPI :::
# include file "mpif.h". :::
# :::
#::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
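#
# Illustrative sketch of the ?= behavior described above (the UPWELLING
# default shown here is an assumption, not a line copied from the actual
# makefile):
#
#     ROMS_APPLICATION ?= UPWELLING
#
# With ?=, the "export ROMS_APPLICATION=ESTUARY_TEST" below overrides the
# makefile default; with a plain =, make would ignore the environment
# variable.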

parallel=0
clean=1

while [ $# -gt 0 ]
do
  case "$1" in
    -j )
      shift
      parallel=1
      test=`echo $1 | grep -P '^\d+$'`
      if [ "$test" != "" ]; then
        NCPUS="-j $1"
        shift
      else
        NCPUS="-j"
      fi
      ;;

    -noclean )
      shift
      clean=0
      ;;

    * )
      echo ""
      echo "$0 : Unknown option [ $1 ]"
      echo ""
      echo "Available Options:"
      echo ""
      echo "-j [N]      Compile in parallel using N CPUs"
      echo "            omit argument for all available CPUs"
      echo "-noclean    Do not clean already compiled objects"
      echo ""
      exit 1
      ;;
  esac
done

# Set the CPP option defining the particular application. This will
# determine the name of the ".h" header file with the application
# CPP definitions.

export ROMS_APPLICATION=ESTUARY_TEST

# Set a local environmental variable to define the path to the directories
# where all this project's files are kept.


export MY_ROOT_DIR=${HOME}/ROMS
export MY_PROJECT_DIR=${MY_ROOT_DIR}/Projects/EstuarySediment

# The path to the user's local current ROMS source code.
#
# If using svn locally, this would be the user's Working Copy Path (WCPATH).
# Note that one advantage of maintaining your source code locally with svn
# is that when working simultaneously on multiple machines (e.g. a local
# workstation, a local cluster and a remote supercomputer) you can checkout
# the latest release and always get an up-to-date customized source on each
# machine. This script is designed to more easily allow for differing paths
# to the code and inputs on differing machines.

#export MY_ROMS_SRC=${MY_ROOT_DIR}/branches/arango
export MY_ROMS_SRC=${MY_ROOT_DIR}/trunk

# Set path of the directory containing makefile configuration (*.mk) files.
# The user has the option to specify a customized version of these files
# in a different directory than the one distributed with the source code,
# ${MY_ROMS_SRC}/Compilers. If this is the case, then you need to keep
# these configuration files up-to-date.

export COMPILERS=${MY_ROMS_SRC}/Compilers
#export COMPILERS=${MY_ROOT_DIR}/Compilers

# Set tunable CPP options.
#
# Sometimes it is desirable to activate one or more CPP options to run
# different variants of the same application without modifying its header
# file. If this is the case, specify each option here using the -D syntax.
# Notice also that you need to use the shell's quoting syntax to enclose the
# definition. Either single or double quotes work. For example,
#
#export MY_CPP_FLAGS="-DAVERAGES"
#export MY_CPP_FLAGS="${MY_CPP_FLAGS} -DDEBUGGING"
#
# can be used to write time-averaged fields. Notice that you can have as
# many definitions as you want by appending values.

#export MY_CPP_FLAGS="-D"

# Other user defined environmental variables. See the ROMS makefile for
# details on other options the user might want to set here. Be sure to
# leave the switches meant to be off set to an empty string or commented
# out. Any string value (including off) will evaluate to TRUE in
# conditional if-statements.

#export USE_MPI=on # distributed-memory parallelism
#export USE_MPIF90=on # compile with mpif90 script
#export which_MPI=mpich # compile with MPICH library
#export which_MPI=mpich2 # compile with MPICH2 library
export which_MPI=openmpi # compile with OpenMPI library

#export USE_OpenMP=on # shared-memory parallelism

#export FORT=ifort
export FORT=gfortran
#export FORT=pgi

#export USE_DEBUG=on # use Fortran debugging flags
export USE_LARGE=on # activate 64-bit compilation
#export USE_NETCDF4=on # compile with NetCDF-4 library
#export USE_PARALLEL_IO=on # Parallel I/O with Netcdf-4/HDF5

#export USE_MY_LIBS=on # use my library paths below

# There are several MPI libraries available. Here, we set the desired
# "mpif90" script to use during compilation. This only works if the make
# configuration file (say, Linux-pgi.mk) in the "Compilers" directory
# has the following definition for FC (Fortran Compiler) in the USE_MPI
# section:
#
# FC := mpif90
#
# that is, "mpif90" defined without any path. Notice that the path
# where the MPI library is installed is computer dependent. Recall
# that you still need to use the appropriate "mpirun" to execute.

if [ -n "${USE_MPIF90:+1}" ]; then
case "$FORT" in
ifort )
if [ "${which_MPI}" = "mpich" ]; then
export PATH=/opt/intelsoft/mpich/bin:$PATH
elif [ "${which_MPI}" = "mpich2" ]; then
export PATH=/opt/intelsoft/mpich2/bin:$PATH
elif [ "${which_MPI}" = "openmpi" ]; then
export PATH=/opt/intelsoft/openmpi/bin:$PATH
fi
;;

pgi )
if [ "${which_MPI}" = "mpich" ]; then
export PATH=/opt/pgisoft/mpich/bin:$PATH
elif [ "${which_MPI}" = "mpich2" ]; then
export PATH=/opt/pgisoft/mpich2/bin:$PATH
elif [ "${which_MPI}" = "openmpi" ]; then
export PATH=/opt/pgisoft/openmpi/bin:$PATH
fi
;;

gfortran )
if [ "${which_MPI}" = "mpich2" ]; then
export PATH=/opt/gfortransoft/mpich2/bin:$PATH
elif [ "${which_MPI}" = "openmpi" ]; then
export PATH=/opt/gfortransoft/openmpi/bin:$PATH
fi
;;

esac
fi

# If USE_MY_LIBS is activated above, the paths of the libraries
# required by ROMS can be set here using environment variables,
# which take precedence over the values specified in the make macro
# definitions file (Compilers/*.mk). For most applications, only
# the location of the NetCDF library is needed during compilation.

# Notice that when the USE_NETCDF4 macro is activated, we need the
# serial or parallel version of the NetCDF-4/HDF5 library. The
# configuration script NC_CONFIG (available since NetCDF 4.0.1)
# is used to set up all the required libraries according to the
# installed options (openDAP, netCDF4/HDF5 file format). The
# parallel library uses the MPI-I/O layer (usually available
# in MPICH2 and OpenMPI) requiring compiling with the selected
# MPI library.
#
# In ROMS distributed-memory applications, you may use either the
# serial or parallel version of the NetCDF-4/HDF5 library. The
# parallel version is required when parallel I/O is activated
# (ROMS cpp option PARALLEL_IO and HDF5).
#
# However, in serial or shared-memory ROMS applications, we need
# to use the serial version of the NetCDF-4/HDF5 to avoid conflicts
# with the compiler. We cannot activate MPI constructs in serial
# or shared-memory ROMS code. Hybrid parallelism is not possible.
#
# Recall also that the MPI library comes in several flavors:
# MPICH, MPICH2, OpenMPI, etc.

if [ -n "${USE_MY_LIBS:+1}" ]; then
case "$FORT" in
ifort )
export ESMF_OS=Linux
export ESMF_COMPILER=ifort
export ESMF_BOPT=O
export ESMF_ABI=64
export ESMF_COMM=mpich
export ESMF_SITE=default

export ARPACK_LIBDIR=/opt/intelsoft/serial/ARPACK
if [ -n "${USE_MPI:+1}" ]; then
if [ "${which_MPI}" = "mpich" ]; then
export ESMF_DIR=/opt/intelsoft/mpich/esmf
export MCT_INCDIR=/opt/intelsoft/mpich/mct/include
export MCT_LIBDIR=/opt/intelsoft/mpich/mct/lib
export PARPACK_LIBDIR=/opt/intelsoft/mpich/PARPACK
elif [ "${which_MPI}" = "mpich2" ]; then
export ESMF_DIR=/opt/intelsoft/mpich2/esmf
export MCT_INCDIR=/opt/intelsoft/mpich2/mct/include
export MCT_LIBDIR=/opt/intelsoft/mpich2/mct/lib
export PARPACK_LIBDIR=/opt/intelsoft/mpich2/PARPACK
elif [ "${which_MPI}" = "openmpi" ]; then
export ESMF_DIR=/opt/intelsoft/openmpi/esmf
export MCT_INCDIR=/opt/intelsoft/openmpi/mct/include
export MCT_LIBDIR=/opt/intelsoft/openmpi/mct/lib
export PARPACK_LIBDIR=/opt/intelsoft/openmpi/PARPACK
fi
fi

if [ -n "${USE_NETCDF4:+1}" ]; then
if [ -n "${USE_PARALLEL_IO:+1}" ] && [ -n "${USE_MPI:+1}" ]; then
if [ "${which_MPI}" = "mpich" ]; then
export NC_CONFIG=/opt/intelsoft/mpich/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/intelsoft/mpich/netcdf4/include
elif [ "${which_MPI}" = "mpich2" ]; then
export NC_CONFIG=/opt/intelsoft/mpich2/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/intelsoft/mpich2/netcdf4/include
elif [ "${which_MPI}" = "openmpi" ]; then
export NC_CONFIG=/opt/intelsoft/openmpi/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/intelsoft/openmpi/netcdf4/include
fi
else
export NC_CONFIG=/opt/intelsoft/serial/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/intelsoft/serial/netcdf4/include
fi
else
export NETCDF_INCDIR=${HOME}/netcdf/include
export NETCDF_LIBDIR=${HOME}/netcdf/lib
fi
;;

pgi )
export ESMF_OS=Linux
export ESMF_COMPILER=pgi
export ESMF_BOPT=O
export ESMF_ABI=64
export ESMF_COMM=mpich
export ESMF_SITE=default

export ARPACK_LIBDIR=/opt/pgisoft/serial/ARPACK
if [ -n "${USE_MPI:+1}" ]; then
if [ "${which_MPI}" = "mpich" ]; then
export ESMF_DIR=/opt/pgisoft/mpich/esmf
export MCT_INCDIR=/opt/pgisoft/mpich/mct/include
export MCT_LIBDIR=/opt/pgisoft/mpich/mct/lib
export PARPACK_LIBDIR=/opt/pgisoft/mpich/PARPACK
elif [ "${which_MPI}" = "mpich2" ]; then
export ESMF_DIR=/opt/pgisoft/mpich2/esmf
export MCT_INCDIR=/opt/pgisoft/mpich2/mct/include
export MCT_LIBDIR=/opt/pgisoft/mpich2/mct/lib
export PARPACK_LIBDIR=/opt/pgisoft/mpich2/PARPACK
elif [ "${which_MPI}" = "openmpi" ]; then
export ESMF_DIR=/opt/pgisoft/openmpi/esmf
export MCT_INCDIR=/opt/pgisoft/openmpi/mct/include
export MCT_LIBDIR=/opt/pgisoft/openmpi/mct/lib
export PARPACK_LIBDIR=/opt/pgisoft/openmpi/PARPACK
fi
fi

if [ -n "${USE_NETCDF4:+1}" ]; then
if [ -n "${USE_PARALLEL_IO:+1}" ] && [ -n "${USE_MPI:+1}" ]; then
if [ "${which_MPI}" = "mpich" ]; then
export NC_CONFIG=/opt/pgisoft/mpich/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/pgisoft/mpich/netcdf4/include
elif [ "${which_MPI}" = "mpich2" ]; then
export NC_CONFIG=/opt/pgisoft/mpich2/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/pgisoft/mpich2/netcdf4/include
elif [ "${which_MPI}" = "openmpi" ]; then
export NC_CONFIG=/opt/pgisoft/openmpi/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/pgisoft/openmpi/netcdf4/include
fi
else
export NC_CONFIG=/opt/pgisoft/serial/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/pgisoft/serial/netcdf4/include
fi
else
export NETCDF_INCDIR=/opt/pgisoft/serial/netcdf3/include
export NETCDF_LIBDIR=/opt/pgisoft/serial/netcdf3/lib
fi
;;

gfortran )
export ESMF_OS=Linux
export ESMF_COMPILER=gfortran
export ESMF_BOPT=O
export ESMF_ABI=64
export ESMF_COMM=mpich
export ESMF_SITE=default

export ARPACK_LIBDIR=/opt/gfortransoft/serial/ARPACK
if [ -n "${USE_MPI:+1}" ]; then
if [ "${which_MPI}" = "mpich2" ]; then
export ESMF_DIR=/opt/gfortransoft/mpich2/esmf
export MCT_INCDIR=/opt/gfortransoft/mpich2/mct/include
export MCT_LIBDIR=/opt/gfortransoft/mpich2/mct/lib
export PARPACK_LIBDIR=/opt/gfortransoft/mpich2/PARPACK
elif [ "${which_MPI}" = "openmpi" ]; then
export ESMF_DIR=/opt/gfortransoft/openmpi/esmf
export MCT_INCDIR=/opt/gfortransoft/openmpi/mct/include
export MCT_LIBDIR=/opt/gfortransoft/openmpi/mct/lib
export PARPACK_LIBDIR=/opt/gfortransoft/openmpi/PARPACK
fi
fi

if [ -n "${USE_NETCDF4:+1}" ]; then
if [ -n "${USE_PARALLEL_IO:+1}" ] && [ -n "${USE_MPI:+1}" ]; then
if [ "${which_MPI}" = "mpich2" ]; then
export NC_CONFIG=/opt/gfortransoft/mpich2/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/gfortransoft/mpich2/netcdf4/include
elif [ "${which_MPI}" = "openmpi" ]; then
export NC_CONFIG=/opt/gfortransoft/openmpi/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/gfortransoft/openmpi/netcdf4/include
fi
else
export NC_CONFIG=/opt/gfortransoft/serial/netcdf4/bin/nc-config
export NETCDF_INCDIR=/opt/gfortransoft/serial/netcdf4/include
fi
else
export NETCDF_INCDIR=${HOME}/netcdf/include
export NETCDF_LIBDIR=${HOME}/netcdf/lib
fi
;;

esac
fi

# The rest of this script sets the path to the user's header file and
# analytical source files, if any. See the templates in User/Functionals.
#
# If applicable, use the MY_ANALYTICAL_DIR directory to place your
# customized biology model header file (like fennel.h, nemuro.h, ecosim.h,
# etc).

export MY_HEADER_DIR=${MY_PROJECT_DIR}

export MY_ANALYTICAL_DIR=${MY_PROJECT_DIR}

# Put the binary to execute in the following directory.

export BINDIR=${MY_PROJECT_DIR}

# Put the f90 files in a project specific Build directory to avoid conflict
# with other projects.

export SCRATCH_DIR=${MY_PROJECT_DIR}/Build

# Go to the user's source directory to compile. The options set above will
# pick up the application-specific code from the appropriate place.

cd ${MY_ROMS_SRC}

# Remove build directory.

if [ $clean -eq 1 ]; then
  make clean
fi

# Compile (the binary will go to BINDIR set above).

if [ $parallel -eq 1 ]; then
  make $NCPUS
else
  make
fi


PostPosted: Tue Jan 01, 2013 11:04 pm 

Joined: Fri Apr 02, 2004 4:46 pm
Posts: 35
Location: USGS, Woods Hole, USA
John,

I did an svn update (just to make sure we were working with the same code), copied the build.bash, .in, and .h files to a project directory, modified build.bash, and built the estuary test case.

Putting this in the .in file:

VARNAME = ${HOME}/src/my_roms/ROMS/External/varinfo.dat

does not work for me either. But if I replace ${HOME} with the actual path, it works fine. Check your syntax, and make sure the file is there as seen from the project directory:

ls /home/csherwood/src/my_roms/ROMS/External/varinfo.dat
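
The .in file is read by ROMS itself, not by the shell, so variables like ${HOME} are presumably never expanded there, which would explain the behavior above. One way to avoid typos is to let the shell expand the path and paste the literal result into the .in file, e.g. (my path; substitute your own):

echo "VARNAME = ${HOME}/src/my_roms/ROMS/External/varinfo.dat"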

If any of your files have been on a Windows system, make sure they have not gotten extra <CR><LF> characters added to the line endings.
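
A quick way to check is to run "file ocean_estuary_test.in", which reports "with CRLF line terminators" if the file is affected. To strip the carriage returns (clean.in is just an illustrative output name):

tr -d '\r' < ocean_estuary_test.in > clean.in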

_________________
Chris Sherwood, USGS
1 508 457 2269


PostPosted: Wed Jan 02, 2013 1:25 am 

Joined: Tue Oct 16, 2012 8:55 pm
Posts: 29
Location: Old Dominion University
Chris,

I use the Linux editor vim, so I don't think I have introduced any unwanted <CR><LF> characters, but this is good to know for future reference. I ran pwd in the ROMS/External directory and put in the full path it returned:

/afs/lions.odu.edu/home/j/jande023/ROMS/trunk/ROMS/External/varinfo.dat

also tried

~/ROMS/trunk/ROMS/External/varinfo.dat

The file couldn't be located in either case, so I reverted to the way I had successfully set up the Upwelling test case:

VARNAME = varinfo.dat, with varinfo.dat in my EstuarySediment project directory.

Now I get a different error: "At line 434 of file inp_par.f90 Fortran runtime error: No such file or directory". I'm not sure where inp_par.f90 is, so I can't investigate further. My next step is to svn update the relevant files, but I fear that's not the issue. My output file, sedimentTest.out, is copied below, showing where the run terminates.


Model Input Parameters: ROMS/TOMS version 3.6
Tuesday - January 1, 2013 - 8:19:08 PM
-----------------------------------------------------------------------------

Suspended Sediment Test in an Estuary

Operating system : Linux
CPU/hardware : x86_64
Compiler system : gfortran
Compiler command : /usr/bin/gfortran
Compiler flags : -frepack-arrays -O3 -ffast-math -ffree-form -ffree-line-length-none -ffree-form -ffree-line-length-none -ffree-form -ffree-line-length-none

SVN Root URL : https://www.myroms.org/svn/src/trunk
SVN Revision : 634M

Local Root : /afs/lions.odu.edu/home/j/jande023/ROMS/trunk
Header Dir : /afs/lions.odu.edu/home/j/jande023/ROMS/Projects/EstuarySediment
Header file : estuary_test.h
Analytical Dir: /afs/lions.odu.edu/home/j/jande023/ROMS/Projects/EstuarySediment

Resolution, Grid 01: 0200x0003x020, Parallel Threads: 1, Tiling: 001x001


Physical Parameters, Grid: 01
=============================

28800 ntimes Number of timesteps for 3-D equations.
30.000 dt Timestep size (s) for 3-D equations.
20 ndtfast Number of timesteps for 2-D equations between
each 3D timestep.
1 ERstr Starting ensemble/perturbation run number.
1 ERend Ending ensemble/perturbation run number.
0 nrrec Number of restart records to read from disk.
T LcycleRST Switch to recycle time-records in restart file.
2880 nRST Number of timesteps between the writing of data
into restart fields.
1 ninfo Number of timesteps between print of information
to standard output.
T ldefout Switch to create a new output NetCDF file(s).
120 nHIS Number of timesteps between the writing fields
into history file.
1 ntsAVG Starting timestep for the accumulation of output
time-averaged data.
1440 nAVG Number of timesteps between the writing of
time-averaged data into averages file.
5.0000E-06 Akt_bak(01) Background vertical mixing coefficient (m2/s)
for tracer 01: temp
5.0000E-06 Akt_bak(02) Background vertical mixing coefficient (m2/s)
for tracer 02: salt
5.0000E-05 Akv_bak Background vertical mixing coefficient (m2/s)
for momentum.
5.0000E-06 Akk_bak Background vertical mixing coefficient (m2/s)
for turbulent energy.
5.0000E-06 Akp_bak Background vertical mixing coefficient (m2/s)
for turbulent generic statistical field.
3.000 gls_p GLS stability exponent.
1.500 gls_m GLS turbulent kinetic energy exponent.
-1.000 gls_n GLS turbulent length scale exponent.
7.6000E-06 gls_Kmin GLS minimum value of turbulent kinetic energy.
1.0000E-12 gls_Pmin GLS minimum value of dissipation.
5.4770E-01 gls_cmu0 GLS stability coefficient.
1.4400E+00 gls_c1 GLS shear production coefficient.
1.9200E+00 gls_c2 GLS dissipation coefficient.
-4.0000E-01 gls_c3m GLS stable buoyancy production coefficient.
1.0000E+00 gls_c3p GLS unstable buoyancy production coefficient.
1.0000E+00 gls_sigk GLS constant Schmidt number for TKE.
1.3000E+00 gls_sigp GLS constant Schmidt number for PSI.
1400.000 charnok_alpha Charnok factor for Zos calculation.
0.500 zos_hsig_alpha Factor for Zos calculation using Hsig(Awave).
0.250 sz_alpha Factor for Wave dissipation surface tke flux .
100.000 crgban_cw Factor for Craig/Banner surface tke flux.
3.0000E-04 rdrg Linear bottom drag coefficient (m/s).
3.0000E-03 rdrg2 Quadratic bottom drag coefficient.
5.0000E-03 Zob Bottom roughness (m).
5.0000E-03 Zos Surface roughness (m).
1 Vtransform S-coordinate transformation equation.
1 Vstretching S-coordinate stretching function.
1.0000E+00 theta_s S-coordinate surface control parameter.
1.0000E+00 theta_b S-coordinate bottom control parameter.
1.000 Tcline S-coordinate surface/bottom layer width (m) used
in vertical coordinate stretching.
1028.000 rho0 Mean density (kg/m3) for Boussinesq approximation.
0.000 dstart Time-stamp assigned to model initialization (days).
0.00 time_ref Reference time for units attribute (yyyymmdd.dd)
1.2500E-01 Tnudg(01) Nudging/relaxation time scale (days)
for tracer 01: temp
1.2500E-01 Tnudg(02) Nudging/relaxation time scale (days)
for tracer 02: salt
1.0000E-03 Znudg Nudging/relaxation time scale (days)
for free-surface.
1.0000E-03 M2nudg Nudging/relaxation time scale (days)
for 2D momentum.
1.0000E-03 M3nudg Nudging/relaxation time scale (days)
for 3D momentum.
1.0000E+00 obcfac Factor between passive and active
open boundary conditions.
F VolCons(1) NLM western edge boundary volume conservation.
F VolCons(2) NLM southern edge boundary volume conservation.
F VolCons(3) NLM eastern edge boundary volume conservation.
F VolCons(4) NLM northern edge boundary volume conservation.
10.000 T0 Background potential temperature (C) constant.
30.000 S0 Background salinity (PSU) constant.
1027.000 R0 Background density (kg/m3) used in linear Equation
of State.
1.7000E-04 Tcoef Thermal expansion coefficient (1/Celsius).
7.6000E-04 Scoef Saline contraction coefficient (1/PSU).
1.000 gamma2 Slipperiness variable: free-slip (1.0) or
no-slip (-1.0).
T Hout(idFsur) Write out free-surface.
T Hout(idUbar) Write out 2D U-momentum component.
T Hout(idVbar) Write out 2D V-momentum component.
T Hout(idUvel) Write out 3D U-momentum component.
T Hout(idVvel) Write out 3D V-momentum component.
T Hout(idWvel) Write out W-momentum component.
T Hout(idOvel) Write out omega vertical velocity.
T Hout(idTvar) Write out tracer 01: temp
T Hout(idTvar) Write out tracer 02: salt
T Hout(idUbms) Write out bottom U-momentum stress.
T Hout(idVbms) Write out bottom V-momentum stress.
T Hout(idBott) Write out bottom property 01: grain_diameter
T Hout(idBott) Write out bottom property 02: grain_density
T Hout(idBott) Write out bottom property 03: settling_vel
T Hout(idBott) Write out bottom property 04: erosion_stress
T Hout(idBott) Write out bottom property 05: ripple_length
T Hout(idBott) Write out bottom property 06: ripple_height
T Hout(idBott) Write out bottom property 07: bed_wave_amp
T Hout(idBott) Write out bottom property 08: Zo_def
T Hout(idBott) Write out bottom property 09: Zo_app
T Hout(idVvis) Write out vertical viscosity: AKv.
T Hout(idSdif) Write out vertical diffusion: AKt(isalt).
T Hout(idMtke) Write out turbulent kinetic energy.
T Hout(idMtls) Write out turbulent generic length-scale.

T Aout(idFsur) Write out averaged free-surface.
T Aout(idUbar) Write out averaged 2D U-momentum component.
T Aout(idVbar) Write out averaged 2D V-momentum component.
T Aout(idUvel) Write out averaged 3D U-momentum component.
T Aout(idVvel) Write out averaged 3D V-momentum component.
T Aout(idWvel) Write out averaged W-momentum component.
T Aout(idOvel) Write out averaged omega vertical velocity.
T Aout(idTvar) Write out averaged tracer 01: temp
T Aout(idTvar) Write out averaged tracer 02: salt

Output/Input Files:

Output Restart File: ocean_rst.nc
Output History File: ocean_his.nc
Output Averages File: ocean_avg.nc

Tile partition information for Grid 01: 0200x0003x0020 tiling: 001x001

tile Istr Iend Jstr Jend Npts

0 1 200 1 3 12000

Tile minimum and maximum fractional grid coordinates:
(interior points only)

tile Xmin Xmax Ymin Ymax grid

0 0.50 201.50 0.50 3.50 RHO-points

0 0.00 201.00 0.50 3.50 U-points

0 0.50 201.50 0.00 3.00 V-points


PostPosted: Wed Jan 02, 2013 2:07 am 

Joined: Fri Apr 02, 2004 4:46 pm
Posts: 35
Location: USGS, Woods Hole, USA
It is weird that putting the full path name in did not fix the problem.

The .f90 files are in the ./Build directory: they are made when the C preprocessor is run on the .F files, and they are a good place to look to see what code is actually included after all of the #ifdef directives are resolved.
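
For example, to look at the code around the line named in your runtime error (a sketch, assuming the project-specific Build directory that SCRATCH_DIR points to in your build.bash):

sed -n '430,440p' Build/inp_par.f90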

Have you changed the path of the sediment_estuary_test.in file specified in your ocean_estuary_test.in file? It starts out as /ROMS/External, but you should remove the path prefix so that it points to the local version.
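
That is, in ocean_estuary_test.in, change something like (the exact distributed default may differ)

SPARNAM = ROMS/External/sediment_estuary_test.in

to

SPARNAM = sediment_estuary_test.in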

_________________
Chris Sherwood, USGS
1 508 457 2269


PostPosted: Wed Jan 02, 2013 2:05 pm 

Joined: Tue Oct 16, 2012 8:55 pm
Posts: 29
Location: Old Dominion University
Good news, Chris:

I had not specified the location of sediment_estuary_test.in in ocean_estuary_test.in, and doing so appears to have fixed the problem. I did not change the paths on the remaining files in the "External" folder, as shown below:

APARNAM = ROMS/External/s4dvar.in
SPOSNAM = ROMS/External/stations.in
FPOSNAM = ROMS/External/floats.in
BPARNAM = ROMS/External/bio_Fennel.in
SPARNAM = sediment_estuary_test.in
USRNAME = ROMS/External/MyFile.dat

The run appears to complete ("ROMS/TOMS: DONE... Wednesday - January 2, 2013 - 8:00:15 AM") and creates the .nc files. I guess I need to plot them to see whether the run was actually successful?
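
Before plotting, a quick sanity check is possible with ncdump, which ships with the NetCDF utilities (a sketch; ocean_time is the usual ROMS output time variable):

ncdump -h ocean_his.nc
ncdump -v ocean_time ocean_his.nc | tail

The first command dumps the header (dimensions, variables, global attributes); the second shows whether the expected time records were actually written.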

