


Database Maintenance

Last significant change: 2002/02/22

Eventually this chapter will probably be moved out of the User Manual into its own document.

Introduction

The DatabaseMaintenance package produces a single binary application, dbmjob, which provides very basic database maintenance support. Specifically, its current function is only to distribute data between databases.

Figure 18.1: The flow of data between databases (image: dbm_db_distribution.eps).

The flow of data is shown schematically in Figure 18.1. At the heart of the system is the Master Database at Soudan; most database updates enter the database realm here. At regular intervals dbmjob is used to export all recently updated data, and these export files are distributed to all other databases, where the data is imported, if not already present, by the local database manager, again using dbmjob. These primary data flows are shown in red.

Smaller amounts of data come from secondary databases, e.g. at CalDet; these are exported up to the Master Database, where they join other updates for distribution.

This system relies on the ability to identify data that has been added since the last export, so that updates can be incremental, and to recognise data that is already present, so that it is never duplicated.

dbmjob exploits the fact that all Dbi-compliant database tables come in pairs: the main data table and an auxiliary validity range table. The auxiliary table records insertion dates and has globally unique SeqNos (Sequence Numbers). Figure 18.1 shows how globally unique numbers are assigned: every database that is a source of data has a GlobalSeqNo table that is used to generate sequence numbers, and each time one is allocated the count in the table is incremented. For each database the table operates in a different range of numbers, hence ensuring that all are unique. dbmjob moves data in ``Validity Packets'', i.e. a single row in the auxiliary table together with all its associated data rows. The insertion date and SeqNo on the auxiliary row allow dbmjob to support incremental updates and avoid data duplication.

All this implies a very important restriction on dbmjob:-

dbmjob can only distribute Dbi-compliant database tables, i.e. ones that come in pairs: the main data table and an auxiliary validity range table.
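
As an orientation, the following sketch shows one complete distribution cycle in terms of the dbmjob commands described in the rest of this chapter. The file name is only illustrative, and the annotations after # are commentary, not input:-

  # At the Master Database (exports are always taken from the
  # first database in the cascade):
  Export * todays_update.dat

  # Copy the file to each remote site, where the local database
  # manager imports it.  SeqNos already present are skipped, so
  # importing the same packet twice does no harm.
  Import todays_update.dat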

Building and Running dbmjob

Building

The DatabaseMaintenance package is a standard Framework package and the dbmjob application is built in the standard way:-

  cd $SRT_PUBLIC_CONTEXT
  gmake DatabaseMaintenance.all

Running

Before running, a database cascade must be defined using the ENV_TSQL_* variables as described in [*]. Alternatively, use the -d, -u and -p switches that are also described there, or use the ENV_TSQL_UPDATE_* set of variables (e.g. ENV_TSQL_UPDATE_USER). Where they exist, they take precedence over the equivalent ENV_TSQL_* variable. This allows a safe, read-only setting of the ENV_TSQL_* variables that can be shared by a group, with just the local database manager also having the ENV_TSQL_UPDATE_* set for write access. Note that the job switches take priority over everything else.
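
For example, a group might share a read-only cascade definition while only the database manager adds the write-access settings. The following is just a sketch (csh syntax): the URL, account names and passwords are invented, and the exact set of ENV_TSQL_* names is the one defined in the DatabaseInterface documentation, with ENV_TSQL_URL and ENV_TSQL_PSWD assumed here by analogy with ENV_TSQL_USER:-

  # Shared, read-only cascade definition (illustrative values):
  setenv ENV_TSQL_URL  "mysql://dbhost.fnal.gov/offline"
  setenv ENV_TSQL_USER "reader"
  setenv ENV_TSQL_PSWD "readerpw"

  # Local database manager only: write-access account.  Where set,
  # these take precedence over the equivalent ENV_TSQL_* variables.
  setenv ENV_TSQL_UPDATE_USER "writer"
  setenv ENV_TSQL_UPDATE_PSWD "writerpw"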

To run, just type:-

  dbmjob

dbmjob then enters interactive mode. For help type Help; to quit type Quit. The following sections illustrate simple exporting and importing; for more detail consult the Help command.
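
A minimal session might look like this (the text after each <- is commentary, not input; the Export command is described in the next section):-

  dbmjob                      <- start the program
  Help                        <- list the available commands
  Export * full_backup.dat    <- export every table to a file
  Quit                        <- leave the program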

Exporting Data

dbmjob always exports data from the first database in the cascade.

To export data use the Export command. The syntax is:-

 Export {--Since <date>}  <table>  <file>

This exports the contents of <table> into <file>, which can subsequently be imported into another database using the Import command. <table> can be a specific table, e.g. PlexPixelSpotToStripEnd, or * for all tables. For example:-

  Export * full_backup.dat
  Export --Since "2001-09-27 12:00:00" PlexPixelSpotToStripEnd update.dat

The first exports the entire database whilst the second just records updates to PlexPixelSpotToStripEnd since midday on 27 September 2001.
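
Since <table> can be *, the two forms can be combined to pick up everything that has changed since a given date, which is the natural command for the routine incremental distribution described in the Introduction. This combination is assumed from the syntax above rather than shown in the examples:-

  Export --Since "2002-02-15 00:00:00" * incremental_update.dat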

Importing Data

By default dbmjob imports into the first database in the cascade, but this can be overridden.

To import data use the Import command. The syntax is:-

 Import {--Test} {--DatabaseNumber <no>} <file>

This imports the contents of <file> into the database. The insertion dates in the file's validity records are replaced by the current date and time, so that the insertion dates in the database reflect the local insertion date. Any SeqNo already present will be skipped, but the associated data is compared with the corresponding entries in the database to confirm that they are identical, neglecting differences in insertion dates. For example:-

  Import full_backup.dat
  Import --DatabaseNumber 1 update.dat
  Import --Test full_backup.dat

The first updates the first database (cascade number 0) whilst the second updates the second database in the cascade. The last does not import at all but still performs the comparisons, so it is a convenient way to compare a database with an import file.
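
The --Test option therefore provides a simple consistency check between two databases: export from one and test-import into the other, and any packet whose data does not match should show up in the comparison. The file name below is only illustrative:-

  # At the Master Database:
  Export * master_snapshot.dat

  # At the database to be checked (no data is written):
  Import --Test master_snapshot.dat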

