# ROMS/TOMS Developers

Algorithms Update Web Log

arango - August 16, 2006 @ 17:38
Incremental, Strong Constraint 4DVAR - Comments (0)

The incremental, strong constraint 4DVAR (IS4DVAR) approach was proposed by Courtier et al. (1994) to reduce computational cost and to facilitate better preconditioning. We follow the algorithm proposed by Weaver et al. (2002), which defines the increment as the difference from a previous reference state.

Let x_k denote the state vector on the k-th outer loop, such that δx_k is the increment relative to the previous state:

```
\delta x_k = x_k - x_{k-1}
```

where x_{k-1} is the current estimate of the ocean state. It is equal to the background state for the first minimization (x_0 = x_b, δx_1 = 0). The cost function on the k-th loop can be written as

```
J(\delta x_k) = \frac{1}{2} (\delta x_k + x_{k-1} - x_b)^T B^{-1} (\delta x_k + x_{k-1} - x_b)
              + \frac{1}{2} (H \delta x_k - d_{k-1})^T O^{-1} (H \delta x_k - d_{k-1})
```

where d_{k-1} = x^o - H(x_{k-1}) is the innovation vector, x^o is the observation vector, x_b is the background state, H is a linear approximation of the observation operator H, and B and O are the background and observation error covariance matrices. Let's introduce a new minimization variable δv, such that:

```
\delta x_k = B^{1/2} \, \delta v_k
```

It is more convenient to work in terms of δv (minimization space: v-space). The conditioning of J_b in v-space is optimal since its Hessian, ∂²J_b/∂v², yields the identity matrix. The gradient of J in v-space, denoted ∇_v J, is given by:
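The effect of this change of variables on the conditioning can be checked numerically. The following is a toy numpy sketch (not ROMS code; the matrix B and its size are illustrative): it builds a random symmetric positive-definite B, forms B^{1/2} from its eigendecomposition, and verifies that the background-term Hessian transformed into v-space, B^{T/2} B^{-1} B^{1/2}, is the identity matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Toy symmetric positive-definite background-error covariance B.
A = rng.standard_normal((n, n))
B = A @ A.T + n * np.eye(n)

# Square root B^{1/2} via the symmetric eigendecomposition, so B = B^{1/2} B^{T/2}.
w, V = np.linalg.eigh(B)
B_half = V @ np.diag(np.sqrt(w)) @ V.T      # symmetric, hence B^{T/2} = B^{1/2}

# Hessian of Jb is B^{-1} in x-space; in v-space it becomes B^{T/2} B^{-1} B^{1/2}.
H_v = B_half.T @ np.linalg.inv(B) @ B_half
assert np.allclose(H_v, np.eye(n))          # identity: optimal conditioning
```

In x-space the condition number of the Hessian B^{-1} depends on the spread of the eigenvalues of B; in v-space it is exactly one, which is why the minimization is performed over δv.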

```
\nabla_v J = \delta v_k + B^{T/2} \, \nabla_x J_o
```

where B = B^{1/2} B^{T/2} and B^{T/2} denotes (B^{1/2})^T, and ∇_x J_o is the gradient of J_o in model space (x-space), which is computed using the adjoint model and is given by:

```
\nabla_x J_o = H^T O^{-1} (H \, \delta x_k - d_{k-1})
```
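The v-space gradient can be verified against a finite-difference approximation in a toy setting. The sketch below (illustrative matrices and sizes, not ROMS code) takes the first outer loop where x_0 = x_b, so J(δv) = ½ δvᵀδv + ½ (H B^{1/2} δv − d)ᵀ O⁻¹ (H B^{1/2} δv − d), and checks that δv + B^{T/2} Hᵀ O⁻¹ (H δx − d) matches the finite-difference gradient:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 4                          # toy state and observation dimensions

A = rng.standard_normal((n, n))
B = A @ A.T + n * np.eye(n)          # toy background-error covariance
w, V = np.linalg.eigh(B)
B_half = V @ np.diag(np.sqrt(w)) @ V.T

H = rng.standard_normal((m, n))      # linearized observation operator
O = 0.5 * np.eye(m)                  # observation-error covariance
d = rng.standard_normal(m)           # innovation vector d_{k-1}

def J(dv):
    r = H @ (B_half @ dv) - d
    return 0.5 * dv @ dv + 0.5 * r @ np.linalg.solve(O, r)

def grad_J(dv):
    # adjoint step: grad_x Jo = H^T O^{-1} (H dx - d), then map to v-space
    grad_x_Jo = H.T @ np.linalg.solve(O, H @ (B_half @ dv) - d)
    return dv + B_half.T @ grad_x_Jo

dv = rng.standard_normal(n)
eps = 1e-6
fd = np.array([(J(dv + eps * e) - J(dv - eps * e)) / (2 * eps)
               for e in np.eye(n)])
assert np.allclose(grad_J(dv), fd, atol=1e-5)
```

In the real system the products with H, Hᵀ, and B^{1/2} are never formed as matrices; they are applied by the tangent-linear model, the adjoint model, and the covariance operators, respectively.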

Following Weaver and Courtier (2001), the background-error covariance matrix can be factored as:

```
B = S C S^T, \quad C = C^{1/2} C^{T/2}, \quad C^{1/2} = G L^{1/2} W^{-1/2}
```

where S is a diagonal matrix of background-error standard deviations, C is a symmetric matrix of background-error correlations which can be factorized as C = C^{1/2} C^{T/2}, G is the normalization matrix which ensures that the diagonal elements of C are equal to unity, L is a 3D self-adjoint filtering operator, and W is a diagonal matrix of grid cell areas (2D fields) or volumes (3D fields).
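The roles of S, G, L, and W in this factorization can be illustrated on a 1-D grid. The following is a hypothetical numpy sketch (the Laplacian filter, grid size, and standard deviations are made up for illustration; ROMS applies these as operators, never as explicit matrices): a self-adjoint smoothing operator L builds an unnormalized covariance L W⁻¹ Lᵀ, G rescales it to unit diagonal, and S then restores the desired variances.

```python
import numpy as np

n = 20
dx = 1.0
W = dx * np.eye(n)                   # grid cell "areas" on a uniform 1-D grid

# Self-adjoint filter L: repeated explicit Laplacian smoothing steps.
Lap = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
L = np.linalg.matrix_power(np.eye(n) + 0.25 * Lap, 10)   # symmetric matrix

# Unnormalized correlation matrix: L W^{-1} L^T.
C0 = L @ np.linalg.inv(W) @ L.T

# Normalization G forces unit diagonal, giving the correlation matrix C.
G = np.diag(1.0 / np.sqrt(np.diag(C0)))
C = G @ C0 @ G.T
assert np.allclose(np.diag(C), 1.0)

# B = S C S^T with S a diagonal matrix of standard deviations.
sigma = 0.3 * np.ones(n)
S = np.diag(sigma)
B = S @ C @ S.T
assert np.allclose(np.diag(B), sigma**2)
```

In practice the diagonal of G is computed once (exactly or by randomization) and stored, since forming L W⁻¹ Lᵀ explicitly is only feasible in a toy problem like this one.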