Compiling INLET_TEST case

olabarrim
Posts: 3
Joined: Fri May 19, 2006 11:43 am
Location: University of Florida

Compiling INLET_TEST case

#1 Post by olabarrim » Fri May 18, 2007 10:13 am

Dear all! :)

I have a problem when compiling the INLET_TEST case. :( The fact is that I am not using MPI when compiling. Even though I have installed the MCT library with serial computation enabled, ocean_coupler.F still seems to depend on MPI.

Has anybody tried to compile the ROMS-SWAN coupling cases for serial applications?

Does anybody have any idea what might be going on?


Thank you very much

jcwarner
Posts: 823
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

#2 Post by jcwarner » Fri May 18, 2007 1:28 pm

Short answer: I have never tried it with 1 processor.
Long answer:
Currently, all the coupling applications require at least 2 processors; all the tests that I have done used 2 or more.

I do see that the MCT package has some way for a single-processor machine to "emulate" a multiple-processor environment, but I have never tried this. There may be other ways for a single-processor system to emulate multiple processors, but I have not tried these either.

So, as far as I know, the current methodology has only ever been tested on systems with 2 or more processors.
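The layout described above, one MPI rank per model, can be sketched without MPI at all. This is a hypothetical Python snippet, not ROMS code: multiprocessing.Pipe stands in for the MCT exchange, and the field names and values are invented purely for illustration.

```python
# Conceptual sketch of a two-process coupled run: one "ocean" process
# and one "wave" process exchanging fields, which is why the coupled
# system needs at least 2 processors. NOT ROMS code.
from multiprocessing import Process, Pipe

def wave_model(conn):
    """Plays the role of the SWAN rank."""
    currents = conn.recv()                 # import fields from the ocean rank
    hwave = 0.5 + 10.0 * currents["ubar"]  # made-up wave response
    conn.send({"Hwave": hwave})            # export forcing back to the ocean

if __name__ == "__main__":
    ocn_end, wav_end = Pipe()
    p = Process(target=wave_model, args=(wav_end,))
    p.start()
    # The parent plays the role of the ROMS rank:
    ocn_end.send({"ubar": 0.1, "vbar": 0.0})
    forcing = ocn_end.recv()
    p.join()
    print(forcing["Hwave"])  # 1.5
```

With a single process there is no second rank to receive the exchange, which is the situation the serial build runs into.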
For my development, I use a Windows XP laptop with a dual-core processor, Cygwin, MCT, and MPICH2
http://www-unix.mcs.anl.gov/mpi/mpich/
I can run inlet_test and test_head (both of which require MCT coupling with SWAN) on my laptop.

For realistic applications, we have a 72-processor Linux cluster with AMD processors. All these test cases work well there as well.

As other users apply the coupled system, we are learning about compiler flags and the best ways to set up the Compilers files. SWAN needs fixed-format source, and there have been some recent changes to Compilers to account for this in a more consistent manner.
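For readers hitting this, the distinction is that SWAN's sources are fixed-format Fortran while ROMS's are free-format, so each set needs its own form flag. A minimal sketch of the idea only; the variable names are illustrative, not the actual contents of the ROMS Compilers/*.mk files:

```make
# Illustrative only -- not the real Compilers/*.mk layout.
FREEFLAGS  := -free    # ifort free-format flag  (gfortran: -ffree-form)
FIXEDFLAGS := -fixed   # ifort fixed-format flag (gfortran: -ffixed-form)

# ROMS sources are free-format:
%.o: %.f90
	$(FC) -c $(FFLAGS) $(FREEFLAGS) $<

# SWAN sources are fixed-format:
%.o: %.f
	$(FC) -c $(FFLAGS) $(FIXEDFLAGS) $<
```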

olabarrim
Posts: 3
Joined: Fri May 19, 2006 11:43 am
Location: University of Florida

#3 Post by olabarrim » Fri May 18, 2007 2:00 pm

Thank you very much for the answer. I will try installing MPI and running it in parallel. Thanks

nobuhitomori
Posts: 22
Joined: Fri Jul 08, 2005 5:42 pm
Location: Kyoto University

ROMS-SWAN coupling test

#4 Post by nobuhitomori » Tue Jun 19, 2007 4:39 am

Dear All,

I compiled ROMS with MPICH and MCT for INLET_TEST and tried to run it with mpirun, but it failed. Errors such as "p1_20395: p4_error: net_recv read" look like MPI-level errors, but I could not figure them out. Does anybody have a similar problem?

I'm currently using Intel Compiler version 10.0.023 and MPICH 1.2.7p1.


# mpirun -np 2 oceanM ROMS/External/coupling_inlet_test.in
Coupled Input File name =
ROMS/External/coupling_inlet_test.in
Waves-Ocean Models Coupling:
Ocean Model MPI nodes: 000 - 000
Waves Model MPI nodes: 001 - 001
Process Information:
Node # 0 (pid= 10787) is active.
Model Input Parameters: ROMS/TOMS version 3.0
Tuesday - June 19, 2007 - 1:33:39 PM
READ_PHYPAR - Invalid dimension parameter, NAT = 2
make sure that NAT is either 1 or 2.
SWAN is preparing computation

p1_20395: p4_error: net_recv read: probable EOF on socket: 1
rm_l_1_20396: (0.082031) net_send: could not write to fd=5, errno = 32
bm_list_10788: (0.164062) wakeup_slave: unable to interrupt slave 0 pid 10787
p1_20395: (0.082031) net_send: could not write to fd=5, errno = 32

jcwarner
Posts: 823
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

#5 Post by jcwarner » Tue Jun 19, 2007 5:50 pm

The output from your simulation shows:

"Model Input Parameters: ROMS/TOMS version 3.0
Tuesday - June 19, 2007 - 1:33:39 PM
READ_PHYPAR - Invalid dimension parameter, NAT = 2
make sure that NAT is either 1 or 2. "

This is why your simulation stopped. After that, all the MPI errors occur because ROMS has stopped and the comm links are no longer valid. So first we need to figure out why you are getting this NAT error.
The error says NAT = 2, and it should equal 2, so I am not sure what is wrong.
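The cascade described above can be reproduced in miniature: once one side of a connection dies, the survivor's read fails with end-of-file, much like the "probable EOF on socket" in the p4_error output. A hypothetical Python sketch, not ROMS or MPICH code:

```python
# Sketch: the secondary MPI errors appear because the surviving
# component reads from a connection whose peer has already exited.
from multiprocessing import Process, Pipe

def dead_component(conn):
    """Emulates ROMS aborting (on the NAT error) before any exchange."""
    conn.close()

if __name__ == "__main__":
    surv, dead = Pipe()
    p = Process(target=dead_component, args=(dead,))
    p.start()
    p.join()
    dead.close()       # drop the parent's handle on that end too
    try:
        surv.recv()    # the surviving model waits for data...
    except EOFError:
        print("probable EOF on socket")  # ...and gets EOF instead
```

The lesson carries over directly: fix the first error in the log (here, READ_PHYPAR) and the downstream net_recv/net_send noise disappears with it.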

I had tested the inlet_test application before ROMS 3.0 was released, and it works on my systems.

To test it again, I compiled the INLET_TEST application, and here is my output:

"mpiexec -np 2 ./oceanM.exe ROMS/External/coupling_inlet_test.in

Coupled Input File name =
ROMS/External/coupling_inlet_test.in


Waves-Ocean Models Coupling:

Ocean Model MPI nodes: 000 - 000

Waves Model MPI nodes: 001 - 001
Process Information:

Node # 0 (pid= 0) is active.

Model Input Parameters: ROMS/TOMS version 3.0
Tuesday - June 19, 2007 - 1:40:09 PM
-----------------------------------------------------------------------------

SWAN is preparing computation


Inlet Test Case

Operating system : CYGWIN
CPU/hardware : i686
Compiler system : ifort
Compiler command : ifort
Compiler flags : /align /G7 /MD /Ox -Id:\data\models\MPICH2\include -Id:\data\models\MCT\MCT_2.2.0_0u\mct -Id:\data\models\MCT\MCT_2.2.0_0u\mpeu /noextend_source -assume:bytere

Input Script : ROMS/External/ocean_inlet_test.in

SVN Root URL : https://www.myroms.org/svn/cstm/branches/jcw_branch
SVN Revision : 817:819M

Local Root : /cygdrive/d/data/models/roms/roms_sed_rutgers_cygwin/branches/jcw_branch
Header Dir : d:\data\models\roms\roms_sed_rutgers_cygwin\branches\jcw_branch\ROMS\Include
Header file : inlet_test.h
Analytical Dir: ./ROMS/Functionals

Resolution, Grid 01: 0075x0070x008, Parallel Nodes: 1, Tiling: 001x001


Physical Parameters, Grid: 01
=============================

34560 ntimes Number of timesteps for 3-D equations.
5.000 dt Timestep size (s) for 3-D equations.
20 ndtfast Number of timesteps for 2-D equations between
each 3D timestep.
120.000 TI_WAV_OCN Time interval (s) between coupling WAV-OCN models.
24 nOCN_WAV Number of OCN timesteps between coupling to WAV.
1 ERstr Starting ensemble/perturbation run number.
1 ERend Ending ensemble/perturbation run number.
0 nrrec Number of restart records to read from disk.
T LcycleRST Switch to recycle time-records in restart file.
720 nRST Number of timesteps between the writing of data
into restart fields.
1 ninfo Number of timesteps between print of information
to standard output.
T ldefout Switch to create a new output NetCDF file(s).
720 nHIS Number of timesteps between the writing fields
into history file.
1.0000E-03 visc2 Horizontal, harmonic mixing coefficient (m2/s)

............................................ (continues on, and runs .............) "


As you can see, I am not getting the error that you are getting.
Did you modify any of the settings in the input files?

-john

jcwarner
Posts: 823
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

#6 Post by jcwarner » Thu Jun 21, 2007 5:18 pm

Well, I just updated to Intel compiler 10.0.023.
I am not running the inlet test right now, but for a different application I get the same error that you had:
"READ_PHYPAR - Invalid dimension parameter, NAT = 2
make sure that NAT is either 1 or 2. "

I did not get this error with Intel 9.1.032 or earlier.

So let me try some magic and see what I can find out.

nobuhitomori
Posts: 22
Joined: Fri Jul 08, 2005 5:42 pm
Location: Kyoto University

#7 Post by nobuhitomori » Fri Jun 22, 2007 2:30 am

John,

Thank you for your kind analysis.
I also ran oceanM with the ocean_upwelling.in case and got the same message as with the coupling_inlet case, as you pointed out.
This is probably an Intel compiler 10.0 and MPICH problem, and I cannot figure it out by myself. I wonder if you could give me a solution.

-----------------------------------------------------------------------------
# mpirun -np 2 ./oceanM ROMS/External/ocean_upwelling.in
Process Information:
Node # 0 (pid= 20005) is active.
Model Input Parameters: ROMS/TOMS version 3.0
Friday - June 22, 2007 - 11:12:07 AM
Node # 1 (pid= 21551) is active.
READ_PHYPAR - Invalid dimension parameter, NAT = 2
make sure that NAT is either 1 or 2.
