The offline event builder and orderer is a repackaging of the online event builder for use on data that has already been written to disk. It enables the user to stream a run file through an event builder and write the output to disk.
The Offline Event Builder and Orderer effectively takes a built event file, separates the built events into individual fragments, pushes all of the fragments through an ordering stage and an optional correlation stage, and ultimately writes the output to disk. The output file is formatted similarly to the original file, but with the fragments properly ordered and, typically, a different correlation window. By default, the correlation stage is disabled, so the fragments are simply ordered and written to disk uncorrelated. It is the user's responsibility to enable the correlation stage and specify the correlation window when setting up the job(s).
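Conceptually, the ordering and correlation stages amount to sorting fragments by timestamp and then grouping together fragments whose timestamps fall within the chosen correlation window. The sketch below illustrates that idea only; it is not the implementation used by this program, and the Fragment type and function names are hypothetical.

    // Illustrative sketch of the ordering and correlation logic:
    // sort fragments by timestamp, then group fragments whose timestamps
    // lie within a user-chosen correlation window.  All names here are
    // hypothetical and not part of this program's actual code.
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Fragment {
        uint64_t timestamp;            // taken from the fragment header
        uint32_t sourceId;             // identifies the originating data source
        std::vector<uint8_t> payload;  // the fragment body
    };

    // Ordering stage: emit fragments in non-decreasing timestamp order.
    void orderFragments(std::vector<Fragment>& frags) {
        std::stable_sort(frags.begin(), frags.end(),
                         [](const Fragment& a, const Fragment& b) {
                             return a.timestamp < b.timestamp;
                         });
    }

    // Correlation stage: group ordered fragments into built events.  A fragment
    // joins the current event if its timestamp is within 'window' ticks of the
    // first fragment in that event; otherwise it starts a new event.
    std::vector<std::vector<Fragment>>
    correlate(const std::vector<Fragment>& ordered, uint64_t window) {
        std::vector<std::vector<Fragment>> events;
        for (const auto& f : ordered) {
            if (events.empty() ||
                f.timestamp - events.back().front().timestamp > window) {
                events.push_back({});      // start a new built event
            }
            events.back().push_back(f);
        }
        return events;
    }

With the correlation stage disabled, only the ordering step would apply and each fragment would be written out as its own item.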
Note that one side effect of processing a file containing ring items that originally lacked body headers is that the newly reordered and built stream will be written with body headers added, using the information provided by the user during configuration.
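A body header, in this context, carries a timestamp, a source id, and a barrier type; the information supplied during configuration is used to fill these in for items that lacked them. The structure below is only a simplified sketch of that information; the field names and exact on-disk layout shown are illustrative, not definitive.

    #include <cstdint>

    // Simplified sketch of the information carried by a body header.
    // Field names and exact layout here are illustrative only.
    struct BodyHeader {
        uint32_t size;         // size of the body header itself, in bytes
        uint64_t timestamp;    // event timestamp assigned to the item
        uint32_t sourceId;     // id of the data source the item came from
        uint32_t barrierType;  // nonzero for barrier items such as BEGIN_RUN
    };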
The offline orderer and event builder is aimed at addressing three needs:
Fixing out-of-order fragments caused by a late fragment error during an online experiment.
Rebuilding already built data using a different correlation time.
Inserting body headers that were missing for non-PHYSICS_EVENT items from a single data source. It is important to know that this treats all data lacking body headers identically, so if more than one data source was missing body headers, the previously separate data streams will be treated as one and the processing will fail.
Any of these tasks can be accomplished by constructing command pipelines at the terminal, but this program abstracts away many of the details and provides a simplified interface for doing so.
The tool does, however, have limitations. It is unable to reorder or correlate data more than one event builder stage deep. In other words, if the disordered data occur in the first stage of event building but the data were written to disk after multiple stages, fixing the problem requires first splitting the data apart with a filter program before running it through this software, which is a considerably more involved undertaking.
Processing the data for the same run with different correlation windows cannot be performed in a single processing job. Doing so requires a different stagearea directory for each set of correlation parameters.
Data files that were written with more than one data source missing body headers cannot be handled by this tool. The BEGIN_RUN items from the previously separate data streams will be labeled as originating from the same data stream (i.e. they will be assigned the same source id). However, the PHYSICS_EVENT items from those data streams will retain their unique identities because the FragmentHeaders within those streams contain the proper source ids. As a result, the ordering stage will undergo a barrier timeout, waiting 1 minute and 20 seconds before continuing. In that time, the process writing data to disk will stop waiting and terminate. In short, it will fail.