Double Periodic Boundary Conditions

annalisa
Posts: 6
Joined: Fri Sep 03, 2004 5:48 pm
Location: Georgia Institute of Technology

#1 Post by annalisa »

Hello,

I'm using ROMS version 2.2 (I may try version 3 tomorrow, but I don't have much hope) in a doubly periodic configuration. On a single processor I have no problems, using 128x128 grid points in the horizontal, but with MPI I run into trouble. I tried changing the size of the domain (+1, +2, -1 in each direction; in some cases the model would not even run) and nothing worked. I get 'noise' that follows the tiling I use. If, for example, I set NtileI == 2 and NtileJ == 8, I get a noisy line in the center of the domain, plus two half-lines at the borders and 8 'bumps' along the y-axis. With NtileI == 1 I get 16 bumps along y, at x=0 and x=128.
The problem is that the 'noise' propagates into the domain and affects the physics of the solution. Any suggestions? Am I simply forgetting to set something properly?

thanks

Annalisa

jcwarner
Posts: 1182
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

#2 Post by jcwarner »

Well, Annalisa, it would be nice if you could update to ROMS 3.0. I have run many applications with MPI and am not getting the tiling lines that you seem to be getting. But if there are problems, I will spend the time to correct them.
Did you add any 'ana_*' stuff that could be causing the tiling issues?
Try the upwelling case first to see whether the tiling works OK in that application.

annalisa
Posts: 6
Joined: Fri Sep 03, 2004 5:48 pm
Location: Georgia Institute of Technology

#3 Post by annalisa »

> Did you add any 'ana_*' stuff that could be causing the tiling issues?

Yes... is there a simple way to correct it? We're using a very idealized wind stress forcing.
Is there a routine I can look at to understand how the tiling has to be taken into account?

thanks a lot

Annalisa

jcwarner
Posts: 1182
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

#4 Post by jcwarner »

Why don't you just paste into this board the part of ana_smflux that you added? I do not need to see the whole thing - just the DO loops that you added.

annalisa
Posts: 6
Joined: Fri Sep 03, 2004 5:48 pm
Location: Georgia Institute of Technology

#5 Post by annalisa »

Here it comes.

For the wind stress:

# elif defined QG
      idum=-27
      DO j=JstrR,JendR
        DO i=Istr,IendR
          val2=0.3_r8*(ran2(idum)-0.5_r8)                  ! random component
          val1=SIN(2.0_r8*pi*3.0_r8*yr(i,j)/el(ng))        ! 3 wavelengths in y
          val1=val1*(SIN(2.0_r8*pi*3.0_r8*xr(i,j)/xl(ng))) ! and 3 in x
          sustr(i,j)=0.0001_r8*(val1+val2)
        END DO
      END DO

ran2 is defined in the standard Numerical Recipes form.
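
For readers without Numerical Recipes at hand, only the calling convention matters here. A minimal stand-in with the same interface (a much simpler, lower-quality generator than the real ran2) would be:

      FUNCTION ran2 (idum)
!  Stand-in with the same interface as Numerical Recipes ran2: the
!  state lives entirely in the integer seed, which is updated on
!  every call; a negative idum (re)starts the stream.  r8 is the
!  ROMS double-precision kind from mod_kinds.
      USE mod_kinds
      IMPLICIT NONE
      integer, intent(inout) :: idum
      real(r8) :: ran2
      integer(kind=8) :: s
      s=mod(48271_8*abs(int(idum,8))+1_8,2147483647_8)
      idum=int(s)                              ! state carried in the seed
      ran2=real(s,r8)/2147483647.0_r8          ! uniform value in [0,1)
      END FUNCTION ran2

The property that matters below: the returned value depends on the whole history of calls carried in idum, not on the grid point (i,j).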

For the tracer initial conditions:

# elif defined QG
      DO k=1,N(ng)
        DO j=JstrR,JendR
          DO i=IstrR,IendR
!           t(i,j,k,1,itemp)=T0(ng)+22.2_r8*EXP(0.0017_r8*z_r(i,j,k))
!           t(i,j,k,1,isalt)=S0(ng)+2.36_r8*EXP(0.0024_r8*z_r(i,j,k))
            t(i,j,k,1,itemp)=T0(ng)+7.0_r8+12.0_r8*EXP(0.017_r8*z_r(i,j,k))
            t(i,j,k,1,isalt)=S0(ng)+2.0_r8*EXP(0.024_r8*z_r(i,j,k))
            t(i,j,k,2,itemp)=t(i,j,k,1,itemp)   ! copy into both time levels
            t(i,j,k,2,isalt)=t(i,j,k,1,isalt)
          END DO
        END DO
      END DO

Thanks once more!

Annalisa

inga
Posts: 10
Joined: Wed May 25, 2005 10:08 pm
Location: GEOMAR | Helmholtz Centre for Ocean Research Kiel

#6 Post by inga »

Hi,

I'm working with Annalisa on this application, trying to configure it with MPI during the European day. And if there is a user-generated mess, I'm the one to blame, not her...
Just to make sure the problem is not in the kernel itself, I reloaded the Rutgers 2.2 version on the cluster with the freshest correction patch (with changes from Dec. 2006). Thus, the only user-modified files were analytical.F, cppdefs.h, and the *.in file. I reran the model under MPI with the same tiling as before (16 and 2, on 32 processors). All the same: a nasty stripe in the middle of the domain after 1 day and a subsequent blow-up.

Then I started switching the various options used (e.g., climatology) off and on.
And I found the problem: it was ran2 in the forcing. When val2=0.0, I see nice spatial variability in my fields according to the scales of the imposed forcing, and no nasty stripes even after 30 days.
Is it some exchange problem, or does calling ran2 this way somehow break the parallelization of the loop?
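
For the record, the change that isolates the problem is a one-line edit in the forcing loop above, disabling the random component:

          val2=0.0_r8                ! was: val2=0.3_r8*(ran2(idum)-0.5_r8)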
inga

inga
Posts: 10
Joined: Wed May 25, 2005 10:08 pm
Location: GEOMAR | Helmholtz Centre for Ocean Research Kiel

#7 Post by inga »

P.S. By "kernel" here I meant the model kernel, i.e., the part that is usually not modified by an average user - not the operating system kernel.
inga

jcwarner
Posts: 1182
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

#8 Post by jcwarner »

inga and annalisa -
Sorry for the late response; I was traveling today.
I am not seeing anything that should be wrong, and I am not sure why the random function is an issue. The model should only compute the sustr values once at each (i,j) point. Then the mp_exchange routines would exchange the information to fill the halo regions. If the model were computing the same information several times at the same (i,j) locations, then a random function would cause issues, because the values would not be the same each time. But the model should only compute the data once at each (i,j) location.
I will ask around to see if anyone else sees anything.
Anyone else want to chime in?
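Schematically, the pattern I mean is this (a self-contained serial mock, not actual ROMS source; the real model does the exchange with mp_exchange2d across MPI tiles):

      PROGRAM halo_demo
!  Serial mock of compute-once-then-exchange: a periodic 1-D array is
!  split between two "tiles"; each computes only its own interior
!  exactly once, then the halo points are filled by copying.
      IMPLICIT NONE
      integer, parameter :: n=8
      real :: a(0:n+1)               ! one halo point on each side
      integer :: i
      DO i=1,n/2                     ! "tile 1" interior
        a(i)=real(i)**2
      END DO
      DO i=n/2+1,n                   ! "tile 2" interior
        a(i)=real(i)**2
      END DO
!  The exchange step: periodic halos copy the opposite interior.
      a(0)=a(n)
      a(n+1)=a(1)
      PRINT *, a
      END PROGRAM halo_demo
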

arango
Site Admin
Posts: 1351
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University

#9 Post by arango »

You cannot use this type of random number routine in parallel! Your code above is completely wrong in parallel. By the way, random numbers in parallel are extremely tricky; see Numerical Recipes in Fortran 90, which has a full chapter explaining the parallel difficulties of random numbers. You cannot use ran2!
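
To make the failure mode concrete, here is a minimal self-contained sketch (a toy generator standing in for ran2, not ROMS code): every tile re-seeds the same sequential stream and then walks it over its own subrange, so the value landing at a given global point depends on the domain decomposition.

      PROGRAM rng_tiling_demo
!  Why a sequential, seed-carrying RNG is decomposition dependent:
!  a "serial" pass over i=1..8 and a "two-tile" pass (i=1..4 and
!  i=5..8, each re-seeded) produce different fields.
      IMPLICIT NONE
      integer :: i, idum
      real :: serial(8), tiled(8)
!  Serial run: one tile covers the whole range.
      idum=-27
      DO i=1,8
        serial(i)=lcg(idum)
      END DO
!  Parallel run: two tiles, each starting from the same seed.
      idum=-27
      DO i=1,4
        tiled(i)=lcg(idum)
      END DO
      idum=-27
      DO i=5,8
        tiled(i)=lcg(idum)
      END DO
      PRINT *, 'serial: ', serial
      PRINT *, 'tiled:  ', tiled    ! second half repeats the first
      CONTAINS
      REAL FUNCTION lcg (seed)
!  Toy stand-in for ran2: any generator whose state lives in the
!  seed shows the same behavior.
      integer, intent(inout) :: seed
      integer(kind=8) :: s
      s=mod(48271_8*abs(int(seed,8))+1_8,2147483647_8)
      seed=int(s)
      lcg=real(s)/2147483647.0
      END FUNCTION lcg
      END PROGRAM rng_tiling_demo

In two dimensions the same effect stamps the forcing field with the tile layout, which is exactly the NtileI/NtileJ pattern reported above.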

If you check ROMS version 3.0, you will find how random numbers are done in parallel. See routine white_noise.F, which uses gasdev.F, ran_state.F, and ran1.F.
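
For intuition only - this is not what white_noise.F does, and the hash constants and the name point_noise below are made up - one decomposition-independent alternative is to derive the random value from the global (i,j) indices alone, so every tile computes the identical value at a given physical point no matter how the domain is split:

      FUNCTION point_noise (i, j)
!  Hypothetical sketch: a per-point pseudo-random value in [-0.5,0.5)
!  computed from the global indices alone, hence identical on every
!  tile and for every NtileI/NtileJ choice.  r8 from mod_kinds.
      USE mod_kinds
      IMPLICIT NONE
      integer, intent(in) :: i, j
      real(r8) :: point_noise
      integer(kind=8) :: s
      s=2654435761_8*int(i,8)+40503_8*int(j,8)+1_8   ! made-up hash
      s=mod(abs(48271_8*s+1_8),2147483647_8)         ! one LCG step to mix
      point_noise=real(s,r8)/2147483647.0_r8-0.5_r8
      END FUNCTION point_noise

With something like this, the forcing loop could use val2=0.3_r8*point_noise(i,j) and give bit-identical fields in serial and MPI runs.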

inga
Posts: 10
Joined: Wed May 25, 2005 10:08 pm
Location: GEOMAR | Helmholtz Centre for Ocean Research Kiel

#10 Post by inga »

Thank you very much. Motivated by your tip, I remembered that roms-2.2 must also have a parallel-proof random generator, because it has a vertical-random-walk option for floats. So instead of porting the module from roms-3.0, I found nrng and urng in utility.F of roms-2.2. It's working perfectly - no more MPI stripes in the solution. :D
inga
