
Trac Ticket Queries

In addition to reports, Trac provides support for custom ticket queries, which can be used to display tickets that meet specified criteria.

To configure and execute a custom query, switch to the View Tickets module from the navigation bar, and select the Custom Query link.

Filters

When you first go to the query page, the default filter will display tickets relevant to you:

  • If logged in, it will display all open tickets assigned to you.
  • If not logged in but you have specified a name or email address in the preferences, then it will display all open tickets where your email (or name if email not defined) is in the CC list.
  • If not logged in and no name/email is defined in the preferences, then all open issues are displayed.

Current filters can be removed by clicking the button to the left with the minus sign on the label. New filters are added from the dropdown lists at the bottom corners of the filters box; 'And' conditions on the left, 'Or' conditions on the right. Filters with either a text box or a dropdown menu of options can be added multiple times to perform an Or on the criteria.
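When a filter is added multiple times, the field is simply repeated in the resulting query string, which produces an Or on that field. As a sketch (the values here are only illustrative), a query string such as:

  status=new&status=assigned

matches tickets whose status is either new or assigned.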

You can use the fields just below the filters box to group the results based on a field, or display the full description for each ticket.
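Grouping corresponds to a group parameter in the query string. As a sketch (the field values are hypothetical), a saved wiki link that groups open tickets by milestone might look like:

  [query:status=new&group=milestone New tickets by milestone]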

After you have edited your filters, click the Update button to refresh your results.

Clicking on one of the query results will take you to that ticket. You can navigate through the results by clicking the Next Ticket or Previous Ticket links just below the main menu bar, or click the Back to Query link to return to the query page.

You can safely edit any of the tickets and continue to navigate through the results using the Next/Previous/Back to Query links after saving your changes. When you return to the query, any tickets which were edited will be displayed with italicized text. If one of the tickets was edited such that it no longer matches the query criteria, the text will also be greyed. Lastly, if a new ticket matching the query criteria has been created, it will be shown in bold.

The query results can be refreshed and cleared of these status indicators by clicking the Update button again.

Saving Queries

Trac allows you to save the query as a named query accessible from the reports module. To save a query, ensure that you have updated the view, then click the Save query button displayed beneath the results. You can also save references to queries in Wiki content, as described below.

Note: an easy way to build queries like the ones below is to build and test them in the Custom Query module and, when ready, click Save query. This will build the query string for you; all you need to do is remove the extra line breaks.

Note: you must have the REPORT_CREATE permission in order to save queries to the list of default reports. The Save query button will only appear if you are logged in as a user that has been granted this permission. If your account does not have permission to create reports, you can still use the methods below to save a query.

You may want to save some queries so that you can come back to them later. You can do this by making a link to the query from any Wiki page.

[query:status=new|assigned|reopened&version=1.0 Active tickets against 1.0]

Which is displayed as:

Active tickets against 1.0

This uses a very simple query language to specify the criteria; see Query Language below.

Alternatively, you can copy the query string of a query and paste that into the Wiki link, including the leading ? character:

[query:?status=new&status=assigned&status=reopened&group=owner Assigned tickets by owner]

Which is displayed as:

Assigned tickets by owner

Customizing the table format

You can also customize the columns displayed in the table format (format=table) by using col=<field>. You can specify multiple fields and what order they are displayed in by placing pipes (|) between the columns:

[[TicketQuery(max=3,status=closed,order=id,desc=1,format=table,col=resolution|summary|owner|reporter)]]

This is displayed as:

Results (1 - 3 of 865)

Ticket Resolution Summary Owner Reporter
#867 Done Impotant: Miscellaneous Update arango
#866 Fixed Missing Drivers directory makefile macros xavierC
#865 Done IMPORTANT: Split 4D-Var algorithms, Phase II arango

Full rows

In table format you can also have full rows by using rows=<field>:

[[TicketQuery(max=3,status=closed,order=id,desc=1,format=table,col=resolution|summary|owner|reporter,rows=description)]]

This is displayed as:

Results (1 - 3 of 865)

Ticket Resolution Summary Owner Reporter
#867 Done Impotant: Miscellaneous Update arango
Description

Several changes are made to support new capabilities that will be released in the future. It is a safeguard for uniformity between official and research repositories.

  • mod_scalars.F: Added LreadFRC and LreadQCK switches that will be used in adjoint-based applications when NL_BULK_FLUXES is activated to process basic state, nonlinear surface fluxes from the QCK instead of the FWD (or HIS) NetCDF file. This change is needed for coupled 4D-Var applications.
  • cgradient.F: Corrected the reading of the converged Ritz eigenvalues in routine cg_read. We need to have:
    !
    !  Read in converged Ritz eigenvalues.
    !
          CALL netcdf_get_fvar (ng, model, DAV(ng)%name,                    &
         &                      'Ritz', Ritz,                               &
         &                      ncid = DAV(ng)%ncid,                        &
         &                      start = (/1/),                              &
         &                      total = (/Ninner/))
          IF (FoundError(exit_flag, NoError, __LINE__,                      &
         &               __FILE__)) RETURN
    
  • initial.F: Corrected a shared-memory bug in the scaling of the nesting interpolation weights with the Land/Sea masking. We need to have:
    !
    !-----------------------------------------------------------------------
    !  If nesting and Land/Sea masking, scale horizontal interpolation
    !  weights to account for land contact points.
    !-----------------------------------------------------------------------
    !
          DO ng=1,Ngrids
            CALL nesting (ng, iNLM, nmask)
          END DO
    
    instead of
    !
    !-----------------------------------------------------------------------
    !  If nesting and Land/Sea masking, scale horizontal interpolation
    !  weights to account for land contact points.
    !-----------------------------------------------------------------------
    !
          DO ng=1,Ngrids
            DO tile=first_tile(ng),last_tile(ng),+1
              CALL nesting (ng, iNLM, nmask)
            END DO
    !$OMP BARRIER
          END DO
    
    
    
    Many thanks to Andy Moore for bringing this to my attention.
  • extract_obs.F: A new CPP option ALLOW_BOTTOM_OBS is introduced to avoid rejecting observations (default behavior) in the lower bottom grid cell (k=1). That is, between -h < Zobs < Adepth(:,:,1).
    #  ifndef ALLOW_BOTTOM_OBS
    !
    !  Reject observations that lie in the lower bottom grid cell (k=1) to
    !  avoid clustering due shallowing of bathymetry during smoothing and
    !  coarse level half-thickness (-h < Zobs < Adepth(:,:,1)) in deep
    !  water.
    !
                IF ((Zobs(iobs).gt.0.0_r8).and.(Zobs(iobs).le.1.0_r8)) THEN
                  ObsVetting(iobs)=0.0_r8
                END IF
    #  endif
    
  • edit_multifile.F: Added loading I/O structure QCK into BLK:
    !
    !  Load QCK information into the BLK structure so it can be used to
    !  process the NLM background surface forcing to be read and processed
    !  by the TLM, RPM, and ADM kernels.
    !
              CASE ('QCK2BLK')
                IF (ndefQCK(ng).gt.0) THEN
                  IF (QCK(ng)%ncid.ne.-1) THEN
                    CALL netcdf_close (ng, iNLM, QCK(ng)%ncid)
                  END IF
                  Nfiles=ntimes(ng)/ndefQCK(ng)
                  IF (nQCK(ng).eq.ndefQCK(ng)) Nfiles=Nfiles+1
                  CALL edit_file_struct (ng, Nfiles, BLK)
                  DO ifile=1,Nfiles
                    BLK(ng)%files(ifile)=TRIM(QCK(ng)%files(ifile))
                  END DO
                  BLK(ng)%name=TRIM(BLK(ng)%files(1))
                ELSE
                  BLK(ng)%ncid=-1
                  BLK(ng)%name=TRIM(QCK(ng)%name)
                  BLK(ng)%files(1)=TRIM(QCK(ng)%name)
                END IF
    
    
  • stdinp_mod.F: Updated stdinp_unit function to take into account the different standard input script and unit in coupling applications.
    !
    !-----------------------------------------------------------------------
    !  Determine ROMS standard input unit.
    !-----------------------------------------------------------------------
    #ifdef DISTRIBUTE
    !
    !  In distributed-memory configurations, the input physical parameters
    !  script is opened as a regular file.  It is read and processed by all
    !  parallel nodes.  This is to avoid a very complex broadcasting of the
    !  input parameters to all nodes.
    # ifdef MODEL_COUPLING
    !  However, in ESM coupling, the submission argument is 'coupling.in'
    !  and the Iname was read and processed in the coupling configuration
    !  elsewhere.
    # endif
    !
          InpUnit=1
    # ifndef MODEL_COUPLING
          IF (localPET.eq.0) CALL my_getarg (1, Iname)
          CALL mp_bcasts (1, 1, Iname)
    # endif
          OPEN (InpUnit, FILE=TRIM(Iname), FORM='formatted', STATUS='old',  &
         &      ERR=10)
          GotFile=.TRUE.
    
  • makefile: The ROMS executable name is now assigned with the conditional operator (?=) so the BIN macro value can be overridden in the build script.
    #--------------------------------------------------------------------------
    #  Set ROMS/TOMS executable file name.
    #--------------------------------------------------------------------------
    
    ifdef USE_DEBUG
      BIN ?= $(BINDIR)/romsG
    else
     ifdef USE_MPI
       BIN ?= $(BINDIR)/romsM
     else
      ifdef USE_OpenMP
        BIN ?= $(BINDIR)/romsO
      else
        BIN ?= $(BINDIR)/romsS
      endif
     endif
    endif
    
  • nesting.F: Added missing IF-directives in fine2coarse to avoid unnecessary exchanges.
#866 Fixed Missing Drivers directory makefile macros xavierC
Description

The makefile in the git repository needs to be modified in order to compile the Drivers sub-directory library:

 modules  +=    ROMS/Nonlinear \
                ROMS/Nonlinear/Biology \
                ROMS/Nonlinear/Sediment \
                ROMS/Functionals \
                ROMS/Drivers \
                ROMS/Utility \
                ROMS/Modules

The WC13 I4d-Var gives the following error during compilation:

libDRIVER.a(i4dvar.o): In function `i4dvar_mod_mp_background_':
i4dvar.f90:(.text+0x60f): undefined reference to `initial_'
i4dvar.f90:(.text+0x90d): undefined reference to `main3d_'

libDRIVER.a(i4dvar.o): In function `i4dvar_mod_mp_increment_': 
i4dvar.f90:(.text+0x1977): undefined reference to `tl_main3d_' 
i4dvar.f90:(.text+0x19c3): undefined reference to `ad_initial_' 
i4dvar.f90:(.text+0x1a4b): undefined reference to `tl_initial_'
i4dvar.f90:(.text+0x1c99): undefined reference to `ad_main3d_' 
i4dvar.f90:(.text+0x1d9d): undefined reference to `ad_variability_mod_mp_ad_variability_'
i4dvar.f90:(.text+0x1dba): undefined reference to `ad_convolution_mod_mp_ad_convolution_'
i4dvar.f90:(.text+0x24ee): undefined reference to `tl_convolution_mod_mp_tl_convolution_'
i4dvar.f90:(.text+0x2507): undefined reference to `tl_variability_mod_mp_tl_variability_'
i4dvar.f90:(.text+0x2660): undefined reference to `tl_wrt_ini_' 
i4dvar.f90:(.text+0x275a): undefined reference to `ad_wrt_his_'
i4dvar.f90:(.text+0x3140): undefined reference to `tl_wrt_ini_'
i4dvar.f90:(.text+0x3f0a): undefined reference to `tl_wrt_ini_'
i4dvar.f90:(.text+0x3f46): undefined reference to `tl_wrt_ini_'
i4dvar.f90:(.text+0x3f82): undefined reference to `tl_wrt_ini_'
#865 Done IMPORTANT: Split 4D-Var algorithms, Phase II arango
Description

Testing is completed for the basic split 4D-Var algorithms: I4DVAR_SPLIT, RBL4DVAR_SPLIT, and R4DVAR_SPLIT. Now, we get identical solutions when comparing unsplit and split solutions with the WC13 test case.

The split strategy is scripted to run with different executables in the elementary background, increment, and analysis 4D-Var phases, as mentioned in src:ticket:862. The modules i4dvar.F, rbl4dvar.F, and r4dvar.F have additional code activated with the internal CPP option SPLIT_4DVAR to load values to several variables which are assigned independently in other phases or are in memory in the unsplit algorithm.

  • The minimization solvers congrad.F, cgradient.F, and rpcg_lanczos.F now include routines cg_read_congrad, cg_read_gradient, and cg_read_rpcg, respectively, to read previous outer loop variables from MODNAME (DAV structure), which would be in memory in the unsplit algorithm. This will come in handy when re-starting 4D-Var in the future. This capability is straightforward in the split drivers. Such drivers (split_i4dvar_ocean.h, split_rbl4dvar_ocean.h, and split_r4dvar_ocean.h) will be released in the future.
  • The rpcg_lanczos solver is now inside of a module for strong type checking of routine arguments.
  • The routine wrt_impulse now reports statistics including the checksum to facilitate comparison of unsplit and split algorithms. For example, we can have:
    Converting Convolved Adjoint Trajectory to Impulses: Outer = 001 Inner = 004
    
       WRT_IMPULSE - processing convolved adjoint impulses, records: 1 to 2
                     file: wc13_roms_tlf_20040103.nc
                   - zeta
                      (Rec = 2 Min = -1.30850216E-02 Max =  1.55209772E-02 CheckSum = -1441498552)
                   - u
                      (Rec = 2 Min = -5.60027244E+01 Max =  5.89141275E+01 CheckSum = -1370629773)
                   - v
                      (Rec = 2 Min = -6.30660677E+01 Max =  8.89201837E+01 CheckSum = -1908057756)
                   - temp
                      (Rec = 2 Min = -3.88886922E+01 Max =  1.61558954E+01 CheckSum = -2119489230)
                   - salt
                      (Rec = 2 Min = -8.91395412E+02 Max =  3.81362699E+02 CheckSum = 310816551)
                   - zeta
                      (Rec = 1 Min =  0.00000000E+00 Max =  0.00000000E+00 CheckSum = 14278099)
                   - u
                      (Rec = 1 Min = -1.61792544E+01 Max =  1.64542330E+01 CheckSum = 919474362)
                   - v
                      (Rec = 1 Min = -1.44458820E+01 Max =  1.80490759E+01 CheckSum = 661583172)
                   - temp
                      (Rec = 1 Min = -2.87144548E+01 Max =  1.04972849E+01 CheckSum = -500278367)
                   - salt
                      (Rec = 1 Min = -3.36527609E+02 Max =  3.27248419E+02 CheckSum = -205855439)
    
  • The initial conditions for the next R4DVAR data assimilation cycle are now written to the DAINAME NetCDF file. Here, we use the last record of the representer model (RPM) for the final outer loop instead of the nonlinear model:
            DO ng=1,Ngrids
              LdefDAI(ng)=.TRUE.
              CALL def_dai (ng)
              IF (FoundError(exit_flag, NoError, __LINE__,                  &
         &                   __FILE__)) RETURN
    !
              WRITE (TLM(ng)%name,10) TRIM(FWD(ng)%head), Nouter
     10       FORMAT (a,'_outer',i0,'.nc')
              lstr=LEN_TRIM(TLM(ng)%name)
              TLM(ng)%base=TLM(ng)%name(1:lstr-3)
              IF (TLM(ng)%Nfiles.gt.1) THEN
                Nfiles=TLM(ng)%Nfiles
                DO ifile=1,Nfiles
                  WRITE (suffix,"('_',i4.4,'.nc')") ifile
                  TLM(ng)%files(ifile)=TRIM(TLM(ng)%base)//TRIM(suffix)
                END DO
                TLM(ng)%name=TRIM(TLM(ng)%files(Nfiles))
              ELSE
                TLM(ng)%files(1)=TRIM(TLM(ng)%name)
              END IF
    !
              CALL netcdf_get_dim (ng, iRPM, TLM(ng)%name,                  &
         &                         DimName = 'ocean_time',                  &
         &                         DimSize = InpRec)
              IF (FoundError(exit_flag, NoError, __LINE__,                  &
         &                   __FILE__)) RETURN
              Tindex=1
              CALL get_state (ng, iRPM, 1, TLM(ng)%name, InpRec, Tindex)
              IF (FoundError(exit_flag, NoError, __LINE__,                  &
         &                   __FILE__)) RETURN
    !
              KOUT=Tindex
              NOUT=Tindex
              CALL wrt_dai (ng, tile)
              IF (FoundError(exit_flag, NoError, __LINE__,                  &
         &                   __FILE__)) RETURN
            END DO
    
    Many thanks to Andy Moore for the suggestion.
  • Added new variable NLmodel_final to MODNAME NetCDF file containing the final values of the nonlinear model solution interpolated at the observation locations in a 4D-Var cycle:
           double NLmodel_final(datum) ;
                   NLmodel_final:long_name = "final nonlinear model at observation locations" ;
                   NLmodel_final:units = "state variable units" ;
                   NLmodel_final:_FillValue = 1.e+37 ;
    
    It is useful to have this variable for processing output MODNAME NetCDF file in ERDDAP when computing the Desroziers et al. (2005) 4D-Var diagnostics. Many thanks to John Wilkin for suggesting this additional variable.

WARNING: The file varinfo.dat was modified to include metadata for NLmodel_final:

'NLmodel_final'                                    ! Output
  'final nonlinear model at observation locations'
  'state variable units'
  'nulvar'
  'datum'
  'idNLmf'
  'nulvar'
  1.0d0
  • The nonlinear solution can be split into multiple files in 4D-Var applications to avoid huge files. See src:ticket:825 for detailed information. I have been testing this capability for a while and found that sometimes the solutions are not bit-by-bit identical. After some testing, I concluded that the differences are due to roundoff, and the 4D-Var convergence and solution are the same. For example, in our US West Coast DOPPIO application, we get the following cost function convergence (Nouter=2 and Ninner=8):

http://www.myroms.org/trac/doppio_cost_MvsS.png

The above figure compares two Split RBL4D-Var runs showing the total cost function for the NLM multi-file trajectory case (cyan line) versus the NLM single-file trajectory case (green triangles).

http://www.myroms.org/trac/doppio_cost_SvsM.png

The above figure compares two runs showing the total cost function for Split RBL4D-Var with NLM single-file trajectory (cyan line) versus Unsplit RBL4D-Var with multi-file NLM trajectory (green triangles).


Corrected Bug:

Corrected a multi-file bug in output.F, tl_output.F, rp_output.F, and ad_output.F for all ROMS output files in restart applications. For example, in output.F, the counter HIS(ng)%load needs to be reset to zero in restart applications:

         IF ((nrrec(ng).ne.0).and.(iic(ng).eq.ntstart(ng))) THEN
           IF ((iic(ng)-1).eq.idefHIS(ng)) THEN
             HIS(ng)%load=0                  ! restart, reset counter
             Ldefine=.FALSE.                 ! finished file, delay
           ELSE                              ! creation of next file
             Ldefine=.TRUE.
             NewFile=.FALSE.                 ! unfinished file, inquire
           END IF                            ! content for appending
           idefHIS(ng)=idefHIS(ng)+nHIS(ng)  ! restart offset
         ELSE IF ((iic(ng)-1).eq.idefHIS(ng)) THEN
           ...
         END IF

The same correction is needed for the load field of the AVG(ng), ADM(ng), QCK(ng), and TLM(ng) structures. Many thanks to Julia Levin for bringing this to my attention.


Query Language

query: TracLinks and the [[TicketQuery]] macro both use a mini "query language" for specifying query filters. Filters are separated by ampersands (&). Each filter consists of the ticket field name, an operator, and one or more values. Multiple values are separated by a pipe (|), meaning that the filter matches any of the values. To include a literal & or | in a value, escape the character with a backslash (\).
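As a small illustration (the fields and values are hypothetical), a filter matching tickets whose version is 1.0 or 2.0 and whose summary contains the literal text "a|b" could be written as:

  version=1.0|2.0&summary~=a\|b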

The available operators are:

= the field content exactly matches one of the values
~= the field content contains one or more of the values
^= the field content starts with one of the values
$= the field content ends with one of the values

All of these operators can also be negated:

!= the field content matches none of the values
!~= the field content does not contain any of the values
!^= the field content does not start with any of the values
!$= the field content does not end with any of the values
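For example (the value is illustrative), a filter excluding tickets whose summary contains the word cleanup:

  summary!~=cleanup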

The date fields created and modified can be constrained by using the = operator and specifying a value containing two dates separated by two dots (..). Either end of the date range can be left empty, meaning that the corresponding end of the range is open. The date parser understands a few natural date specifications like "3 weeks ago", "last month" and "now", as well as Bugzilla-style date specifications like "1d", "2w", "3m" or "4y" for 1 day, 2 weeks, 3 months and 4 years, respectively. Spaces in date specifications can be omitted to avoid having to quote the query string.

created=2007-01-01..2008-01-01 query tickets created in 2007
created=lastmonth..thismonth query tickets created during the previous month
modified=1weekago.. query tickets that have been modified in the last week
modified=..30daysago query tickets that have been inactive for the last 30 days
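Date filters combine with other filters using ampersands, as elsewhere in the query language. A hypothetical example matching closed tickets created during the last two weeks:

  status=closed&created=2weeksago..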

See also: TracTickets, TracReports, TracGuide, TicketQuery

Last modified on Jun 20, 2017 2:27:28 PM