<div class="title">4D-PSAS Observation Impact</div>
The various files in the <span class="twilightBlue">PSAS_impact</span> folder are needed to run the strong/weak constraint, dual form 4-Dimensional Variational ('''4D-Var''') data assimilation observation impact based on the Physical-space Statistical Analysis System ('''PSAS''') algorithm in the California Current System 1/3&deg; resolution application ('''WC13''').
 
 
<div style="clear: both"></div>
==Introduction==
During this exercise you will apply the strong/weak constraint, dual form 4-Dimensional Variational ('''4D-Var''') data assimilation observation impact based on the Physical-space Statistical Analysis System ([[Options#W4DPSAS|PSAS]]) algorithm to ROMS configured for the U.S. west coast and the California Current System (CCS). This configuration, referred to as [[Options#WC13|WC13]], has 30 km horizontal resolution, and 30 levels in the vertical. While 30 km resolution is inadequate for capturing much of the energetic mesoscale circulation associated with the CCS, [[Options#WC13|WC13]] captures the broad-scale features of the circulation quite well, and serves as a very useful and efficient illustrative example of 4D-PSAS observation impact.
{{#lst:4DVar_Tutorial_Introduction|setup}}
==Running 4D-PSAS Observation Impact==
To run this exercise, first go to the directory <span class="twilightBlue">WC13/PSAS_impact</span>. Instructions for compiling and running the model are provided below and can also be found in the <span class="twilightBlue">Readme</span> file. The recommended configuration for this exercise is one outer loop and 50 inner loops, and <span class="twilightBlue">roms_wc13.in</span> is configured for this default case. The number of inner loops is controlled by the parameter [[Variables#Ninner|Ninner]] in <span class="twilightBlue">roms_wc13.in</span>.


==Important CPP Options==
The following C-preprocessing options are activated in the [[build_Script|build script]]:
<div class="box">  [[Options#W4DPSAS_SENSITIVITY|W4DPSAS_SENSITIVITY]]    4D-PSAS observation sensitivity driver<br />  [[Options#ANA_SPONGE|ANA_SPONGE]]              Analytical enhanced viscosity/diffusion sponge<br />  [[Options#AD_IMPULSE|AD_IMPULSE]]              Force ADM with intermittent impulses<br />  [[Options#BGQC|BGQC]]                    Background quality control of observations<br />  [[Options#IMPACT_INNER|IMPACT_INNER]]            Write observation impacts for each inner loop<br />  [[Options#MINRES|MINRES]]                  Minimal Residual Method for 4D-Var minimization<br />  [[Options#OBS_IMPACT|OBS_IMPACT]]              Compute observation impact<br />  [[Options#OBS_IMPACT_SPLIT|OBS_IMPACT_SPLIT]]        Separate impact due to IC, forcing, and OBC<br />  [[Options#RPCG|RPCG]]                    Restricted B-preconditioned Lanczos minimization<br />  [[Options#TIME_CONV|TIME_CONV]]              Weak-constraint 4D-Var time convolution<br />  [[Options#WC13|WC13]]                    Application CPP option</div>


==Input NetCDF Files==
[[WC13]] requires the following input NetCDF files:
<div class="box">                      <span class="twilightBlue">Grid File:</span>  ../Data/wc13_grd.nc<br />          <span class="twilightBlue">Nonlinear Initial File:</span>  wc13_ini.nc<br />                <span class="twilightBlue">Forcing File 01:</span>  ../Data/coamps_wc13_lwrad_down.nc<br />                <span class="twilightBlue">Forcing File 02:</span>  ../Data/coamps_wc13_Pair.nc<br />                <span class="twilightBlue">Forcing File 03:</span>  ../Data/coamps_wc13_Qair.nc<br />                <span class="twilightBlue">Forcing File 04:</span>  ../Data/coamps_wc13_rain.nc<br />                <span class="twilightBlue">Forcing File 05:</span>  ../Data/coamps_wc13_swrad.nc<br />                <span class="twilightBlue">Forcing File 06:</span>  ../Data/coamps_wc13_Tair.nc<br />                <span class="twilightBlue">Forcing File 07:</span>  ../Data/coamps_wc13_wind.nc<br />                  <span class="twilightBlue">Boundary File:</span>  ../Data/wc13_ecco_bry.nc<br /><br />        <span class="twilightBlue">Adjoint Sensitivity File:</span>  wc13_ads.nc<br />    <span class="twilightBlue">Initial Conditions STD File:</span>  ../Data/wc13_std_i.nc<br />                  <span class="twilightBlue">Model STD File:</span>  ../Data/wc13_std_m.nc<br />    <span class="twilightBlue">Boundary Conditions STD File:</span>  ../Data/wc13_std_b.nc<br />        <span class="twilightBlue">Surface Forcing STD File:</span>  ../Data/wc13_std_f.nc<br />    <span class="twilightBlue">Initial Conditions Norm File:</span>  ../Data/wc13_nrm_i.nc<br />                <span class="twilightBlue">Model Norm File:</span>  ../Data/wc13_nrm_m.nc<br />  <span class="twilightBlue">Boundary Conditions Norm File:</span>  ../Data/wc13_nrm_b.nc<br />      <span class="twilightBlue">Surface Forcing Norm File:</span>  ../Data/wc13_nrm_f.nc<br/>              <span class="twilightBlue">Observations File:</span>  wc13_obs.nc<br />            <span class="twilightBlue">Lanczos Vectors File:</span>  wc13_lcz.nc</div>
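If you want to see which observations enter the impact calculation, the observation file can be inspected directly in Matlab. The fragment below is illustrative only and assumes the standard ROMS observation file layout (variable names such as <span class="twilightBlue">obs_type</span> and <span class="twilightBlue">obs_value</span>); verify the contents with <span class="twilightBlue">ncdisp</span> first if in doubt:
<div class="box"><span class="twilightBlue">% Illustrative only: summarize the observations that enter the impact calculation.<br />obs_file  = 'wc13_obs.nc';                          % observation file for this 4D-Var cycle<br />ncdisp(obs_file);                                   % list all variables and attributes<br />obs_type  = double(ncread(obs_file, 'obs_type'));   % state variable id of each datum (assumed name)<br />obs_value = ncread(obs_file, 'obs_value');          % observed values (assumed name)<br />disp(accumarray(obs_type, 1));                      % number of observations per state variable type</span></div>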


==Various Scripts and Include Files==
The following files are found in the <span class="twilightBlue">WC13/PSAS_impact</span> directory after downloading from the ROMS test cases SVN repository:
<div class="box">  <span class="twilightBlue">Readme</span>                instructions<br />  [[build_Script|build_roms.bash]]       bash shell script to compile application<br />  [[build_Script|build_roms.sh]]         csh Unix script to compile application<br />  [[job_psas_sen|job_psas_impact.sh]]   job configuration script<br />  [[roms.in|roms_wc13.in]]          ROMS standard input script for WC13<br />  [[s4dvar.in]]            4D-Var standard input script template<br />  <span class="twilightBlue">wc13.h</span>                WC13 header with CPP options</div>


==Important parameters in standard input <span class="twilightBlue">roms_wc13.in</span> script==
*Notice that this driver uses the following adjoint sensitivity parameters (see input script for details):
:<div class="box">      [[Variables#DstrS|DstrS]] == 0.0d0                      ! starting day<br />      [[Variables#DendS|DendS]] == 0.0d0                      ! ending day<br /><br />      [[Variables#KstrS|KstrS]] ==  1                        ! starting level<br />      [[Variables#KendS|KendS]] == 30                        ! ending level<br /><br />      [[Variables#Lstate|Lstate(isFsur)]] == T                ! free-surface<br />      [[Variables#Lstate|Lstate(isUbar)]] == T                ! 2D U-momentum<br />      [[Variables#Lstate|Lstate(isVbar)]] == T                ! 2D V-momentum<br />      [[Variables#Lstate|Lstate(isUvel)]] == T                ! 3D U-momentum<br />      [[Variables#Lstate|Lstate(isVvel)]] == T                ! 3D V-momentum<br />      [[Variables#Lstate|Lstate(isWvel)]] == F                ! 3D W-momentum<br /><br />      [[Variables#Lstate|Lstate(isTvar)]] == T T              ! tracers</div>


*Both '''FWDNAME''' and '''HISNAME''' must be the same:
:<div class="box">      FWDNAME == wc13_fwd.nc<br />      HISNAME == wc13_fwd.nc</div>


==Instructions==
To run this application you need to take the following steps:


#We need to run the model application for a period that is long enough to compute meaningful circulation statistics, such as the mean and standard deviation of all prognostic state variables ([[Variables#zeta|zeta]], [[Variables#u|u]], [[Variables#v|v]], [[Variables#T|T]], and [[Variables#S|S]]). The standard deviations are written to NetCDF files and are read by the 4D-Var algorithm to convert modeled error correlations to error covariances. The error covariance matrix, '''D'''=diag('''B<sub>x</sub>''', '''B<sub>b</sub>''', '''B<sub>f</sub>''', '''Q'''), is very large and not well known. '''B''' is modeled as the solution of a diffusion equation as in [[Bibliography#WeaverAT_2001a|Weaver and Courtier (2001)]]. Each covariance matrix is factorized as '''B = K &Sigma; C &Sigma;<sup>T</sup> K<sup>T</sup>''', where '''C''' is a univariate correlation matrix, '''&Sigma;''' is a diagonal matrix of error standard deviations, and '''K''' is a multivariate balance operator.<div class="para">&nbsp;</div>In this application, we need standard deviations for the initial conditions, surface forcing ([[Options#ADJUST_WSTRESS|ADJUST_WSTRESS]] and [[Options#ADJUST_STFLUX|ADJUST_STFLUX]]), and open boundary conditions ([[Options#ADJUST_BOUNDARY|ADJUST_BOUNDARY]]). If the balance operator is activated ([[Options#BALANCE_OPERATOR|BALANCE_OPERATOR]] and [[Options#ZETA_ELLIPTIC|ZETA_ELLIPTIC]]), the standard deviations for the initial and boundary conditions error covariance are expressed in terms of the unbalanced error covariance ('''K B<sub>u</sub> K<sup>T</sup>'''). The balance operator imposes a multivariate constraint on the error covariance such that information about unobserved variables is extracted from observed data by establishing balance relationships (i.e., T-S empirical formulas, hydrostatic balance, and geostrophic balance) with other state variables ([[Bibliography#WeaverAT_2005a|Weaver ''et al.'', 2005]]).  The balance operator is not used in this tutorial.<div class="para">&nbsp;</div>The standard deviations for [[Options#WC13|WC13]] have already been created for you (a schematic example of how such a field can be computed is given after this list):<div class="box"><span class="twilightBlue">../Data/wc13_std_i.nc</span>    initial conditions<br /><span class="twilightBlue">../Data/wc13_std_m.nc</span>    model error (if weak constraint)<br /><span class="twilightBlue">../Data/wc13_std_b.nc</span>    open boundary conditions<br /><span class="twilightBlue">../Data/wc13_std_f.nc</span>    surface forcing (wind stress and net heat flux)</div>
#Since we are modeling the error covariance matrix, '''D''', we need to compute the normalization coefficients to ensure that the diagonal elements of the associated correlation matrix '''C''' are equal to unity. There are two methods to compute the normalization coefficients: exact and randomization (an approximation).<div class="para">&nbsp;</div>The exact method is very expensive on large grids. The normalization coefficients are computed by perturbing each model grid cell with a delta function scaled by the area (2D state variables) or volume (3D state variables), and then convolving with the square-root adjoint and tangent linear diffusion operators.<div class="para">&nbsp;</div>The approximate method is cheaper: the normalization coefficients are computed using the randomization approach of [[Bibliography#FisherM_1995a|Fisher and Courtier (1995)]]. The coefficients are initialized with random numbers drawn from a normal distribution with zero mean and unit variance. Then, they are scaled by the inverse square-root of the cell area (2D state variables) or volume (3D state variables) and convolved with the square-root adjoint and tangent linear diffusion operators over a specified number of iterations, [[Variables#Nrandom|Nrandom]].<div class="para">&nbsp;</div>Check the following parameters in the 4D-Var input script [[s4dvar.in]] (see the input script for details):<div class="box">[[Variables#Nmethod|Nmethod]] == 0            ! normalization method: 0=exact (expensive), 1=randomization (approximate)<br />[[Variables#Nrandom|Nrandom]] == 5000          ! randomization iterations<br /><br />[[Variables#LdefNRM|LdefNRM]] == F F F F        ! create new normalization files<br />[[Variables#LwrtNRM|LwrtNRM]] == F F F F        ! compute and write normalization coefficients<br /><br />[[Variables#CnormM|CnormM(isFsur)]] =  T      ! model error covariance, 2D variable at RHO-points<br />[[Variables#CnormM|CnormM(isUbar)]] =  T      ! model error covariance, 2D variable at U-points<br />[[Variables#CnormM|CnormM(isVbar)]] =  T      ! model error covariance, 2D variable at V-points<br />[[Variables#CnormM|CnormM(isUvel)]] =  T      ! model error covariance, 3D variable at U-points<br />[[Variables#CnormM|CnormM(isVvel)]] =  T      ! model error covariance, 3D variable at V-points<br />[[Variables#CnormM|CnormM(isTvar)]] =  T T    ! model error covariance, NT tracers<br /><br />[[Variables#CnormI|CnormI(isFsur)]] =  T      ! IC error covariance, 2D variable at RHO-points<br />[[Variables#CnormI|CnormI(isUbar)]] =  T      ! IC error covariance, 2D variable at U-points<br />[[Variables#CnormI|CnormI(isVbar)]] =  T      ! IC error covariance, 2D variable at V-points<br />[[Variables#CnormI|CnormI(isUvel)]] =  T      ! IC error covariance, 3D variable at U-points<br />[[Variables#CnormI|CnormI(isVvel)]] =  T      ! IC error covariance, 3D variable at V-points<br />[[Variables#CnormI|CnormI(isTvar)]] =  T T    ! IC error covariance, NT tracers<br /><br />[[Variables#CnormB|CnormB(isFsur)]] =  T      ! BC error covariance, 2D variable at RHO-points<br />[[Variables#CnormB|CnormB(isUbar)]] =  T      ! BC error covariance, 2D variable at U-points<br />[[Variables#CnormB|CnormB(isVbar)]] =  T      ! BC error covariance, 2D variable at V-points<br />[[Variables#CnormB|CnormB(isUvel)]] =  T      ! BC error covariance, 3D variable at U-points<br />[[Variables#CnormB|CnormB(isVvel)]] =  T      ! BC error covariance, 3D variable at V-points<br />[[Variables#CnormB|CnormB(isTvar)]] =  T T    ! BC error covariance, NT tracers<br /><br />[[Variables#CnormF|CnormF(isUstr)]] =  T      ! surface forcing error covariance, U-momentum stress<br />[[Variables#CnormF|CnormF(isVstr)]] =  T      ! surface forcing error covariance, V-momentum stress<br />[[Variables#CnormF|CnormF(isTsur)]] =  T T    ! surface forcing error covariance, NT tracer fluxes</div>These normalization coefficients have already been computed for you ('''../Normalization''') using the exact method since this application has a small grid (54x53x30):<div class="box"><span class="twilightBlue">../Data/wc13_nrm_i.nc</span>    initial conditions<br /><span class="twilightBlue">../Data/wc13_nrm_m.nc</span>    model error (if weak constraint)<br /><span class="twilightBlue">../Data/wc13_nrm_b.nc</span>    open boundary conditions<br /><span class="twilightBlue">../Data/wc13_nrm_f.nc</span>    surface forcing (wind stress and net heat flux)</div>Notice that the switches [[Variables#LdefNRM|LdefNRM]] and [[Variables#LwrtNRM|LwrtNRM]] are all '''false''' (F) since these coefficients have already been computed.<div class="para">&nbsp;</div>The normalization coefficients need to be computed only once for a particular application provided that the grid, land/sea masking (if any), and decorrelation scales ([[Variables#HdecayI|HdecayI]], [[Variables#VdecayI|VdecayI]], [[Variables#HdecayB|HdecayB]], [[Variables#VdecayB|VdecayB]], and [[Variables#HdecayF|HdecayF]]) remain the same. Notice that large spatial changes in the normalization coefficient structure occur near the open boundaries and land/sea masking regions.
#Before you run this application, you need to run the standard [[PSAS_Tutorial|4D-PSAS]] ('''../PSAS''' directory) since we need the Lanczos vectors. Notice that in [[job_psas_impact.sh]] we have the following operation:<div class="box"><span class="red">cp -p ${Dir}/PSAS/wc13_mod.nc wc13_lcz.nc</span></div>In 4D-Var (observation-space minimization), the Lanczos vectors are stored in the output 4D-Var NetCDF file <span class="twilightBlue">wc13_mod.nc</span>.
#In addition, to run this application you need an adjoint sensitivity functional. This is computed by the following Matlab script:<div class="box"><span class="red">../Data/adsen_37N_transport.m</span></div>which creates the NetCDF file <span class="twilightBlue">wc13_ads.nc</span>. This file has already been created for you.<div class="para">&nbsp;</div>The adjoint sensitivity functional is defined as the time-averaged transport crossing 37&deg;N in the upper 500 m.
#Customize your preferred [[build_Script|build script]] and provide the appropriate values for:
#*Root directory, <span class="salmon">MY_ROOT_DIR</span>
#*ROMS source code, <span class="salmon">MY_ROMS_SRC</span>
#*Fortran compiler, <span class="salmon">FORT</span>
#*MPI flags, <span class="salmon">USE_MPI</span> and <span class="salmon">USE_MPIF90</span>
#*The paths of the MPI, NetCDF, and ARPACK libraries for each compiler are set in [[my_build_paths.sh]]. Notice that you need to provide the correct locations of these libraries for your computer. If you want to ignore this section, set <span class="salmon">USE_MY_LIBS</span> to '''no'''.
#Notice that the most important CPP options for this application are specified in the [[build_Script|build script]] instead of <span class="twilightBlue">wc13.h</span>:<div class="box"><span class="twilightBlue">setenv MY_CPP_FLAGS "${MY_CPP_FLAGS} -DW4DPSAS_SENSITIVITY"<br />setenv MY_CPP_FLAGS "${MY_CPP_FLAGS} -DANA_SPONGE"<br /><br />setenv MY_CPP_FLAGS "${MY_CPP_FLAGS} -DAD_IMPULSE"<br />setenv MY_CPP_FLAGS "${MY_CPP_FLAGS} -DOBS_IMPACT"<br />setenv MY_CPP_FLAGS "${MY_CPP_FLAGS} -DOBS_IMPACT_SPLIT"<br /><br />setenv MY_CPP_FLAGS "${MY_CPP_FLAGS} -DRPCG"</span></div>This is to allow flexibility with different CPP options.<div class="para">&nbsp;</div>For this to work, however, any '''#undef''' directives MUST be avoided in the header file <span class="twilightBlue">wc13.h</span> since it has precedence during C-preprocessing.
#You MUST use the [[build_Script|build script]] to compile.
#Customize the ROMS input script <span class="twilightBlue">roms_wc13.in</span> and specify the appropriate values for the distributed-memory partition. It is set by default to:<div class="box">[[Variables#NtileI|NtileI]] == 2                              ! I-direction partition<br />[[Variables#NtileJ|NtileJ]] == 4                               ! J-direction partition</div>Notice that the adjoint-based algorithms can only be run in parallel using MPI. This is because of the way that the adjoint model is constructed.
#Customize the configuration script [[job_psas_impact.sh]] and provide the appropriate location of the [[substitute]] Perl script:<div class="box"><span class="twilightBlue">set SUBSTITUTE=${ROMS_ROOT}/ROMS/Bin/substitute</span></div>This script is distributed with ROMS and is found in the <span class="twilightBlue">ROMS/Bin</span> subdirectory. Alternatively, you can define the ROMS_ROOT environment variable in your .cshrc login script. For example, I have:<div class="box"><span class="twilightBlue">setenv ROMS_ROOT /home/arango/ocean/toms/repository/trunk</span></div>
#Execute the configuration script [[job_psas_impact.sh]] '''before''' running the model. It copies the required files and creates the <span class="twilightBlue">psas.in</span> input script from the template '''[[s4dvar.in]]'''. This has to be done '''every time''' that you run this application. We need a clean and fresh copy of the initial conditions and observation files since they are modified by ROMS during execution.
#Run ROMS with data assimilation:<div class="box"><span class="red">mpirun -np 8 romsM roms_wc13.in > & log &</span></div>
#We recommend creating a new subdirectory <span class="twilightBlue">EX5</span>, and saving the solution in it for analysis and plotting to avoid overwriting solutions when playing with different parameters. For example:<div class="box">mkdir EX5<br />mv Build_roms psas.in *.nc log EX5<br />cp -p romsM roms_wc13.in EX5</div>where <span class="twilightBlue">log</span> is the ROMS standard output specified in the previous step.
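As noted in step 1, the standard deviation files characterize the expected size of the background errors. The fragment below is a minimal, schematic Matlab illustration (not the script actually used to build <span class="twilightBlue">../Data/wc13_std_i.nc</span>) of how a temporal standard deviation field could be computed from a long model run; the history file name is hypothetical, and the real STD files also contain the other state variables and the required ROMS metadata:
<div class="box"><span class="twilightBlue">% Schematic sketch: temporal standard deviation of the free-surface from a long<br />% (hypothetical) history file; the distributed STD files were built from a long<br />% WC13 run and contain all prognostic state variables plus ROMS metadata.<br />his_file = 'wc13_his_long.nc';               % hypothetical multi-year history file<br />zeta     = ncread(his_file, 'zeta');         % SSH, size (xi_rho, eta_rho, time)<br />zeta_std = std(zeta, 0, 3);                  % standard deviation at each grid point<br />pcolor(zeta_std'); shading flat; colorbar;   % quick look at the spatial structure<br />title('Free-surface standard deviation (m)');</span></div>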


==Results==
The <span class="twilightBlue">WC13/plotting/plot_psas_analysis_impact.m</span> Matlab script will allow you to plot the [[Options#W4DPSAS|4D-PSAS]] analysis observation impacts:
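The plotting script reads the impact values from the 4D-Var output file. If you want to examine the numbers directly, something along the following lines can be used; note that the variable name (<span class="twilightBlue">ObsImpact_total</span>) and output file are assumptions here and should be verified with <span class="twilightBlue">ncdisp</span>, since the variables written depend on the CPP options used:
<div class="box"><span class="twilightBlue">% Illustrative only: read the analysis observation impact values written by ROMS.<br />% The variable name below is an assumption; list the file contents first with ncdisp.<br />mod_file = 'wc13_mod.nc';                      % 4D-Var output file (MODNAME)<br />ncdisp(mod_file);                              % verify which impact variables are present<br />impact = ncread(mod_file, 'ObsImpact_total');  % assumed name: impact of each observation<br />fprintf('Sum of observation impacts on the 37N transport: %g\n', sum(impact));</span></div>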


{|align="center"
|-
|[[Image:psas_impact_2019.png|500px|thumb|center|<center>4D-PSAS Analysis Observation Impact<br />''prior'' saved daily</center>]]
|[[Image:psas_impact_2hour_2019.png|500px|thumb|center|<center>4D-PSAS Analysis Observation Impact<br />''prior'' saved every 2 hours</center>]]
|}
<div style="clear: both;"></div>
