SWAN wind input

General scientific issues regarding ROMS

Moderators: arango, robertson

zduvims
Posts: 9
Joined: Fri Feb 10, 2023 4:06 pm
Location: Virginia Institute of Marine Science

SWAN wind input

#1 Unread post by zduvims »

Hi, I have recently been running a coupled SWAN-ROMS model, and I have a question about the wind input data. My swan.in contains:


&& KEYWORD TO CREATE WIND GRID &&
INPGRID WIND CURVILINEAR 0 0 487 362 EXC 9.999000e+003 &
       NONSTATIONARY 20050701.000000 1 HR 20060101.000000
READINP WIND 1 './swan_ERA5_2005.dat' 4 0 FREE
After reading the SWAN manual, my understanding is that "NONSTATIONARY 20050701.000000 1 HR 20060101.000000" means my wind input data start on 2005/07/01 and run until 2006/01/01 at an hourly interval. I used the COAWST MATLAB tool to create this wind input file, and I noticed it does not contain any time information.

Now I ran the model as a cold start from 2005/07/01; it stopped at some point, say 2005/07/03, and I want to use the restart files to do a hot start.

Here is my question: if I keep "NONSTATIONARY 20050701.000000 1 HR 20060101.000000", will SWAN skip the data records before 2005/07/03 in the input file? If not, do I need to create a new wind input file with data beginning at 2005/07/03 and change the command to "NONSTATIONARY 20050703.000000 1 HR 20060101.000000"? Thank you.

jcwarner
Posts: 1220
Joined: Wed Dec 31, 2003 6:16 pm
Location: USGS, USA

Re: SWAN wind input

#2 Unread post by jcwarner »

For the restart you can keep using the same wind file and the same wind command lines; the command lines go along with that file.
Yes, it is an archaic ASCII file that contains only data, so the command line tells SWAN when the data start, the time step, and when the data end.
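In other words, the time of each record is implied by its position in the file, not stored with it. A minimal sketch of that mapping (the helper name `record_index` is hypothetical, for illustration only, not part of SWAN or COAWST):

```python
from datetime import datetime, timedelta

def record_index(t_start: datetime, dt: timedelta, t: datetime) -> int:
    """Return the 0-based wind record implied for time t, given that
    record k in the data-only file corresponds to t_start + k * dt."""
    return int((t - t_start) / dt)

start = datetime(2005, 7, 1)      # NONSTATIONARY 20050701.000000
step = timedelta(hours=1)         # 1 HR
hotstart = datetime(2005, 7, 3)   # hot-start time

# Two days of hourly records precede the hot start, so the model
# picks up at record 48 without any change to the file or commands.
print(record_index(start, step, hotstart))  # → 48
```

This is why the original NONSTATIONARY line still works on a hot start: the start time in the command anchors record 0, and the restart time simply lands partway into the same sequence.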
Next time you have a COAWST question, please post it on the tracker:
https://github.com/DOI-USGS/COAWST/issues

thanks -j

zduvims
Posts: 9
Joined: Fri Feb 10, 2023 4:06 pm
Location: Virginia Institute of Marine Science

Re: SWAN wind input

#3 Unread post by zduvims »

Thank you very much, John!
