diff --git a/doc/conf.py b/doc/conf.py
index 5ece0ac..6d9a328 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -29,6 +29,7 @@
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.pngmath']
 
 autodoc_member_order = 'bysource'
+autoclass_content = 'both'
 
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']
diff --git a/doc/gui.rst b/doc/gui.rst
index 14214ec..5a437a6 100644
--- a/doc/gui.rst
+++ b/doc/gui.rst
@@ -25,6 +25,14 @@ GUI Modules
     :undoc-members:
     :show-inheritance:
 
+:mod:`gisans_dialog` Module
+---------------------------
+
+.. automodule:: quick_nxs.gisans_dialog
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
 :mod:`gui_logging` Module
 -------------------------
 
@@ -41,6 +49,14 @@ GUI Modules
     :undoc-members:
     :show-inheritance:
 
+:mod:`help_widgets` Module
+--------------------------
+
+.. automodule:: quick_nxs.help_widgets
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
 :mod:`ipython_widget` Module
 ----------------------------
 
@@ -64,3 +80,20 @@ GUI Modules
     :members:
     :undoc-members:
     :show-inheritance:
+
+:mod:`polarization_gui` Module
+------------------------------
+
+.. automodule:: quick_nxs.polarization_gui
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+:mod:`rawcompare_plots` Module
+------------------------------
+
+.. automodule:: quick_nxs.rawcompare_plots
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
diff --git a/doc/index.rst b/doc/index.rst
index 286a7db..195e138 100644
--- a/doc/index.rst
+++ b/doc/index.rst
@@ -3,8 +3,8 @@
    You can adapt this file completely to your liking, but it should at least
    contain the root `toctree` directive.
 
-Welcome to QuickNXS's documentation!
-====================================
+Welcome to QuickNXS's API documentation!
+========================================
 
 Contents of the quick_nxs Package:
 
@@ -20,6 +20,10 @@ Contents of the quick_nxs Package:
 Indices and tables
 ==================
 
+.. toctree::
+
+   references
+
 * :ref:`genindex`
 * :ref:`modindex`
 * :ref:`search`
diff --git a/doc/other.rst b/doc/other.rst
index 8d862c7..becfb0d 100644
--- a/doc/other.rst
+++ b/doc/other.rst
@@ -1,6 +1,15 @@
 Other Modules
 =============
 
+:mod:`config` Module
+--------------------
+
+.. automodule:: quick_nxs.config
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+
 :mod:`output_templates` Module
 ------------------------------
diff --git a/doc/references.rst b/doc/references.rst
new file mode 100644
index 0000000..1272c74
--- /dev/null
+++ b/doc/references.rst
@@ -0,0 +1,17 @@
+
+Scientific References
+=====================
+
+.. [ARWildes2007] A.R. Wildes,
+   **Neutron Polarization Analysis Corrections Made Easy**,
+   *Neutron News* **17:2**, 17-25 (2007)
+
+.. [PDu2006] P. Du, W.A. Kibbe and S.M. Lin,
+   **Improved peak detection in mass spectrum by incorporating continuous
+   wavelet transform-based pattern matching**,
+   *Bioinformatics* **22**, 17 (2006)
+
+
+.. [CTorrance1998] C. Torrence and G.P. Compo,
+   **A Practical Guide to Wavelet Analysis**,
+   *Bull. Amer. Meteor. Soc.* **79:1**, 61-78 (1998)
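Entries defined in references.rst can be cited from module docstrings (rendered through autodoc) or from any other page of the documentation using standard reStructuredText citation references. A minimal sketch, with a purely illustrative function:

    def correct_polarization(data):
        """Apply the polarization analysis corrections of [ARWildes2007]_.

        Peak positions are located with the CWT pattern-matching approach
        of [PDu2006]_; see [CTorrance1998]_ for background on the wavelet
        transform.
        """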
diff --git a/doc/user_manual/QuickNXS_Users_Manual.pdf b/doc/user_manual/QuickNXS_Users_Manual.pdf
index 3198aec..b7f7385 100644
Binary files a/doc/user_manual/QuickNXS_Users_Manual.pdf and b/doc/user_manual/QuickNXS_Users_Manual.pdf differ
diff --git a/doc/user_manual/QuickNXS_Users_Manual.tex b/doc/user_manual/QuickNXS_Users_Manual.tex
index 441c1e2..2af205d 100644
--- a/doc/user_manual/QuickNXS_Users_Manual.tex
+++ b/doc/user_manual/QuickNXS_Users_Manual.tex
@@ -1,6 +1,6 @@
 % header file needs to be changed for latex2html to render formulas correctly
-%\input{header.tex}
-\input{header_html.tex}
+\input{header.tex}
+%\input{header_html.tex}
 
 \begin{document}
 \include{titlepage}
diff --git a/doc/user_manual/advanced.tex b/doc/user_manual/advanced.tex
index 607159a..338f7a8 100644
--- a/doc/user_manual/advanced.tex
+++ b/doc/user_manual/advanced.tex
@@ -2,16 +2,18 @@ \chapter{Advanced Usage}
 \label{chap:advanced_usage}
 \section{Event mode data}
   When the event mode is selected for file import, additional options for the desired binning are shown.
   You can define the number of \textbf{Bins}, the bin steps (constant ToF or Q steps) and whether to split datasets in time.
+  In all other aspects event mode data is not treated differently from histogram data.
   To be able to use a direct beam measurement, it needs to be extracted with the same binning as the actual measurement.
 
 \section{Re-reduction of already exported data}
-  You can read all datasets and options from an already exported reflectivity using the \textbf{File->Load Extraction...} menu and select a ASCII file.
+  You can read all datasets and options from an already exported reflectivity using the \textbf{File->Load Extraction...} menu and selecting an ASCII file.
   Afterwards the options in the reduction table can be changed as desired.
 
 \section{Overwrite direct beam parameters}
+  \label{sec:overwrite}
   If the direct beam position was not saved correctly during instrument alignment, all calculated \Qz-values for the reflectivity will be wrong.
   To correct this there are two overwrite parameters in the \textbf{Reflectivity Extraction (Advanced)} area: Direct Pixel and Dangle0.
   These parameters are ignored if they have the values -1 and None.
-  To overwrite the correct values you can open a direct beam dataset and activate the \textbf{Adjust Direct Beam} \icon{tthZero} to save the current DANGLE as \textbf{Dangle0} and the fitted x-position as \textbf{Direct Pixel}.
+  To overwrite them with the correct values you can open a direct beam dataset and activate the \textbf{Adjust Direct Beam action} \icon{tthZero} to save the current DANGLE as \textbf{Dangle0} and the fitted X-position as \textbf{Direct Pixel}.
 
 \clearpage
 \section{Advanced background subtraction}
@@ -26,20 +28,23 @@ \chapter{Advanced Usage}
   You can use polygons to define the extraction area for the background precisely; these polygons are then shown in the X vs. λ plot as gray areas.
   The normal extraction region defined by the main window parameters is shown in red.
   For each λ channel the average value of all pixels in the gray areas is taken as background.
-  If for a given value of λ no points are defined with a polygon the red area is taken.
+  If, for a given value of λ, no points are defined with a polygon, the red area is taken.
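+  Schematically, the background estimate for a wavelength channel $\lambda$ is therefore
+  \begin{equation*}
+    B(\lambda) = \frac{1}{N_\lambda} \sum_{(x,y)\,\in\,\text{polygons}} I(x,y,\lambda) ,
+  \end{equation*}
+  where $N_\lambda$ is the number of polygon pixels contributing at this channel.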
   As an additional option it is possible to assume a background that depends directly on the incident intensity to reduce the error bars.
 
 \section{"Fan"-Reflectivity}
+  \label{sec:fan}
   Some samples have a wavy or bent surface and reflect the neutron beam into different angles.
   Treating these as one reflection will destroy the \Qz resolution of your measurement or requires restricting the extraction to a small X-region, which reduces the statistics.
-  For these cases the \textbf{Reflectivity Extraction (Advanced)} area has a \textbf{"Fan"-Reflectivity} option, which treats each pixel on the detector separately to calculate I(\Qz) and combines these reflectivity afterwards to get better statistics.
+  For these cases the \textbf{Reflectivity Extraction (Advanced)} area has a \textbf{"Fan"-Reflectivity} option, which treats each pixel on the detector separately to calculate I(\Qz) and combines these reflectivities afterwards to get better statistics.
   In this case you can widen the red area on the X-projection plot to take into account your full reflectivity.
   The underlying algorithm reduces the total width of a selected dataset in \Qz, so it is possible that for lower angles you do not get overlapping areas to stitch the data together.
   In this case you should select a smaller area for the lower angle measurements and/or extract two regions in X for one dataset.
 
 \section{Off-specular and GISANS scattering}
   Off-specular scattering can easily be extracted when the specular reflectivity is already defined in the reduction table.
   You can use the \textbf{OffSpec Preview} tab to inspect the result in advance (it does not show the active dataset, only the reduction table entries).
-  The reduction dialog as a separate option to export the off-specular data, where \textit{Raw} refers to the data as extractec, \textit{Corrected} applies an algorithm to reduce detector artifacts from high intensity areas and \textit{Smoothed} will interpolate the data to a regular grid with \Qz-dependent Gaussian smoothing (parameters are defined in a separate dialog when exporting).
+  The reduction dialog has a separate option to export the off-specular data, where \textit{Raw} refers to the data as extracted, \textit{Corrected} applies an algorithm to reduce detector artifacts from high intensity areas and \textit{Smoothed} will interpolate the data to a regular grid with \Qz-dependent Gaussian smoothing (parameters are defined in a separate dialog when exporting).
+
+  \textit{The scaling factors to combine datasets for off-specular extraction will only be correct if the \textbf{full reflectivity} is within the X-width and if you \textbf{keep the X-width constant}! These conditions do not need to be fulfilled when only specular data is extracted.}
 
   For GISANS measurements another dialog appears when exporting, where you can define the wavelength bands, which will be combined in one image.
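The per-pixel curves from the "Fan"-reflectivity above are presumably combined with some error weighting; a standard inverse-variance weighted average (a sketch, not necessarily the program's exact rule) over the pixel reflectivities R_i contributing at a given Q_z would read:

    R(Q_z) = \frac{\sum_i R_i(Q_z) / \Delta R_i^2(Q_z)}{\sum_i 1 / \Delta R_i^2(Q_z)} ,
    \qquad
    \Delta R(Q_z) = \left( \sum_i 1 / \Delta R_i^2(Q_z) \right)^{-1/2}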
diff --git a/doc/user_manual/data_reduction.tex b/doc/user_manual/data_reduction.tex
index 189547d..4d688ff 100644
--- a/doc/user_manual/data_reduction.tex
+++ b/doc/user_manual/data_reduction.tex
@@ -3,16 +3,16 @@ \chapter{Data Reduction}
 \section{Open and view a dataset}
 \label{sec:open_file}
-  The first step to start with is to enter the number of a normalization dataset in the "Open Number:" entry of the Files area and press enter. The program will now locate the file and open it.
+  The first step is to enter the number of a direct beam dataset in the "Open Number:" entry of the Files area and press enter.
+  The program will now locate the file and open it.
   The Files area list will be populated with all files in your current proposal data folder and the plot windows should look similar to this:
 
 \begin{tabular}{ccc}
   \includegraphics[width=155pt]{screenshots/normalizemap1.png} &
   \includegraphics[width=155pt]{screenshots/normalizemap2.png} &\\
   Overview X-Y & Overview ToF-X & \\
-  \includegraphics[width=155pt]{screenshots/normalize1.png} &
-  \includegraphics[width=155pt]{screenshots/normalize2.png} &
-  \includegraphics[width=120pt]{screenshots/normalize3.png} \\
+  \includegraphics[width=145pt]{screenshots/normalize1.png} &
+  \includegraphics[width=145pt]{screenshots/normalize2.png} &
+  \includegraphics[width=145pt]{screenshots/normalize3.png} \\
   X-Projection & Y-Projection & Reflectivity
 \end{tabular}
 
@@ -21,10 +21,10 @@ \section{Open and view a dataset}
 \section{Go full-automatic: Reduction for dummies}
   For good quality data (enough intensity and narrow reflection) the program supports a fully automated mode, where all reduction parameters are automatically calculated.
   This mode will be applied automatically when more than one dataset is selected in the File Open dialog.
-  The direct beam measurement have to have lower scan numbers than the actual measurements or need to be set in advance for this method to work.
+  The direct beam measurements have to have lower scan numbers than the actual measurements or need to be set in advance for this method to work.
 
   The automatic algorithm performs the same steps as described in section \ref{sec:quick_start}, while trying to guess the best parameters.
-  The datasets are read one-by-one and, depending on the reflection angle, they are either set as normalization or reflectivity data in the reduction list.
+  The datasets are read one-by-one and, depending on the reflection angle, they are either set as direct beam or reflectivity data in the reduction list.
   Here is an example of how the interface might look after the algorithm has finished:
 
 \includegraphics[width=460pt]{screenshots/overview.png}
 
@@ -33,52 +33,53 @@ \section{Go full-automatic: Reduction for dummies}
   You can now scale individual datasets as described in \ref{sec:scaling}, if the stitching was not performed optimally.
   When satisfied with the result, you can save the data as described in the export section \ref{sec:export}.
-
+\clearpage
 \section{Quick start: Step-by-step standard reduction}
 \label{sec:quick_start}
-  For most datasets the reduction is done very similar to the fully automatized method but with more control of the user.
-  Every dataset is examined by the operator to select the best extraction parameters.
+  For most datasets the reduction is done very similarly to the fully automated method but with more control by the user.
+  Every dataset is examined by the operator to select the best extraction parameters. This description should work in
+  almost all circumstances.
 
-  \subsection{Step 1: Set wavelength normalization from direct beam}
-  \begin{wrapfigure}[11]{r}{0.55\textwidth}
+  \subsection{Step 1: Set direct beam runs}
+  \begin{wrapfigure}[9]{r}{0.55\textwidth}
   \begin{tabular}{cc}
    Reflectivity before & Reflectivity after \\
    \includegraphics[width=115pt]{screenshots/normalize3.png} &
    \includegraphics[width=115pt]{screenshots/normalize_after.png}
   \end{tabular}
   \end{wrapfigure}
-  \textbf{Open your normalization file} as described in section \ref{sec:open_file}.
-  Make sure the SANGLE-calc value shown in the overview tab is close to zero and that the X- and Y-projections show the correct regions with the red indicators.
-  Activate the \textbf{Set Normalization action} \icon{extractNormalization}, this will add the current dataset to the "Normalization" list, the "Direct Beam Runs:" label will show the number of the dataset and the reflectivity will show the normalized intensities, which should all be one.
-  Repeat this step for each direct beam measurement needed for your dataset.
+  \textbf{Open your direct beam file} as described in section \ref{sec:open_file}.
+  Make sure the SANGLE-calc value shown in the overview tab is close to zero and that the X- and Y-projections show the correct regions indicated with vertical lines.
+  Activate the \textbf{Set Normalization action} \icon{extractNormalization}; this will add the current dataset to the "Direct Beam" list, the "Direct Beam Runs:" label will show the number of the dataset and the reflectivity plot will show the normalized intensities, which should all be one. You can use the \textbf{Cut Points (L/R) action} \icon{cutPoints} here to set reasonable values for the Cut Pts entries right away.
+  Repeat this step for each direct beam measurement needed for your datasets.
 
   \subsection{Step 2: Define a suitable background and Y-region}
-  \begin{wrapfigure}[15]{r}{0.55\textwidth}
+  \begin{wrapfigure}[16]{r}{0.6\textwidth}
   \centering
   X-projection with background region (green)\\
-  \includegraphics[width=160pt]{screenshots/background.png}
+  \includegraphics[width=155pt]{screenshots/background.png}
   \begin{tabular}{cc}
    Y-projection of small sample & X-Y map \\
-   \includegraphics[width=160pt]{screenshots/yregion.png} & \includegraphics[width=70pt]{screenshots/yregionmap.png}
+   \includegraphics[width=150pt]{screenshots/yregion.png} & \includegraphics[width=100pt]{screenshots/yregionmap.png}
   \end{tabular}
   \end{wrapfigure}
   Although it is in principle possible to define the extraction and background region for each dataset separately, it is recommended to use the same parameters for all files.
-  From this perspective it is often a good idea to start with the dataset with the highest incident angle, as there the signal to background ration is the lowest.
-  To produce the best results you should select a large region (statistics), keeping enough distance from the reflected beam (especially when off-specular Bragg-sheets are present) and to not include regions where the background drops (shadowed by the right detector slit for example).
+  From this perspective it is often a good idea to start with the dataset with the highest incident angle, as the signal-to-background ratio is lowest there.
+  To produce the best results you should select a large region (for statistics), keep enough distance from the reflected beam (especially when off-specular Bragg sheets are present) and not include regions where the background drops (shadowed by the left detector slit, for example).
 
   The Y-region, shown in the Y-projection of the first (low Q) dataset, is often detected very well automatically.
   Just check that it fits the reflected intensity area.
-  For very small samples it can sometimes make sense to manually restrain the area to the sample reflection.
+  For very small samples it can sometimes make sense to manually restrict the area to the sample reflection using the right mouse button on the X-Y map.
  \subsection{Step 3: Normalize to total reflection and add the first dataset}
-  \begin{wrapfigure}[13]{r}{0.4\textwidth}
+  \begin{wrapfigure}[12]{r}{0.4\textwidth}
   \includegraphics[width=165pt]{screenshots/totalreflection.png}
   \end{wrapfigure}
   Go to your dataset starting at the lowest \Qz value and remove points from the low \Qz region which are not reasonable, using the \textbf{Cut Pts parameters}:
-  \includegraphics[width=115pt]{screenshots/cutpoints.png} (can be done automatically with the \textbf{Cut Points (L/R)} \icon{cutPoints} action). Than activate the \textbf{Set Scaling action} \icon{totalReflection} to normalize the total reflection to one.
+  \includegraphics[width=115pt]{screenshots/cutpoints.png} (can be done automatically with the \textbf{Cut Points (L/R)} \icon{cutPoints} action if not already performed after direct beam selection). Then activate the \textbf{Set Scaling action} \icon{totalReflection} to normalize the total reflection to one.
   This should now look like the image on the right.
 
-  Next add the dataset to the refinement list using the \textbf{Keep Item in List action} \icon{addRef} to add the dataset with the current parameters in the list.
+  Next add the dataset to the refinement list using the \textbf{Keep Item in List action} \icon{addRef}, copying the current parameters to the list.
   This will automatically switch off the \textbf{Automatic Y Limits} \icon{limitYauto}, so all datasets will be reduced with the same Y-range.
   This is important for the high \Qz region as the background often inhibits a good automatic detection of the Y-region.
 
@@ -89,7 +90,7 @@ \section{Quick start: Step-by-step standard reduction}
   \includegraphics[width=175pt]{screenshots/stitching1.png}
   \end{wrapfigure}
   Now you can continue adding each subsequent dataset one after another.
-  If nothing goes wrong, the only thing that needs to be changed from dataset are the \textbf{Cut Pts} and \textbf{Scaling} values.
+  If nothing goes wrong, the only thing that needs to be changed from dataset to dataset is the \textbf{Scaling} parameter.
   If the scaling of subsequent datasets does not fit, activate the \textbf{Set Scaling action} \icon{totalReflection} again.
   This fits a polynomial to the logarithmic data of both adjacent datasets including a scaling factor for the second, which is then used for the scaling after the fit.
   The error weighted $\chi^2$ used for this refinement is:
@@ -98,25 +99,24 @@ \section{Quick start: Step-by-step standard reduction}
   \text{with } p(Q) =& a\cdot Q^2 + b\cdot Q + c \text{ for a polynomial of order 3}
   \end{eqnarray*}
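+  Schematically, with a scaling factor $s$ for the newly added dataset and error weights
+  $\sigma_i$ on the logarithmic intensities, the quantity minimized over $s$ and the
+  polynomial coefficients has the form
+  \begin{equation*}
+    \chi^2 = \sum_i \left( \frac{\log_{10} I_1(Q_i) - p(Q_i)}{\sigma_i} \right)^2
+           + \sum_j \left( \frac{\log_{10}\left( s \cdot I_2(Q_j) \right) - p(Q_j)}{\sigma_j} \right)^2 ,
+  \end{equation*}
+  a sketch consistent with the description above; the exact weighting used by the program may differ.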
   The resulting fit function is shown in the reflectivity plot together with the scaled data, as can be seen in the figure on the right.
-  For some datasets with very sharp features like multilayer Bragg-peaks this method will not work, in those cases you need to change the \textbf{Scale 10\^} parameter manually until the datasets fit together nicely.
-  For polarized measurements it can sometimes be helpful to switch back and forth between different polarization channels as the variation in contrast can lead to smooth transitions, where the other channel has a sharp feature.
+  For some datasets with very sharp features like multilayer Bragg-peaks this method will probably not work; in those cases you need to change the \textbf{Scale 10\^} parameter manually until the datasets fit together nicely.
+  For polarized measurements it can sometimes be helpful to switch back and forth between different polarization states, as the variation in contrast can lead to smooth transitions where the other state has a sharp feature.
 
   Now add the dataset to the reduction list with the \textbf{Keep Item in List action} \icon{addRef} again and repeat the procedure for all datasets belonging to this measurement.
 
  \subsection{Step 5: Refine the reflectivity scaling and cutting}
  \label{sec:scaling}
-  \begin{wrapfigure}[10]{r}{0.6\textwidth}
+  \begin{wrapfigure}[10]{r}{0.33\textwidth}
   \centering
   \begin{tabular}{cc}
-   As added to reduction list& With changed cut points\\
-   \includegraphics[width=129pt]{screenshots/stitching2.png} &
-   \includegraphics[width=129pt]{screenshots/cleanpoints.png}
+   \includegraphics[width=145pt]{screenshots/cleanpoints.png}
   \end{tabular}
   \end{wrapfigure}
   When all datasets of one measurement have been added, as can be seen in the image on the right, you can try to improve the scaling of the different parts, if needed, and change the cutting parameters.
   To change the scaling of one dataset you can either change the value of the \textbf{I0 column} entry in the reduction list or move the mouse \textbf{cursor on top of the curve} you want to scale and \textbf{move the mouse wheel}.
   To remove unwanted points you need to change the values of the \textbf{NL} and \textbf{NR column} entries, as they define the number of points cut from the low- and high-Q side respectively.
-  If the number of time of flight channels in the histogram dataset is larger than the wavelength window used for the measurement it is possible that large values are needed (<=60) to see changes in the dataset.
+  If the number of time of flight channels in the dataset is larger than the wavelength window, it is possible that large values (up to 60) are needed to see changes in the dataset.
+  (Removal of overlapping points with low statistics can be done with \textbf{Strip Overlap} \icon{stripOverlap}.)
 
  \subsection{Step 6: Export your data}
  \label{sec:export}
@@ -127,9 +127,9 @@ \subsection{Step 6: Export your data}
   Now you are ready to export your reflectivity!
   Activate the \textbf{Reduce... action} \icon{reduce} from the menu, toolbar or the button below the reduction list.
   The reduce dialog has several options for the export of the dataset.
-  You can select which reductions should be stored, choose the channels to export and define which data formats should be created.
+  You can select which reductions should be stored, choose the spin states to export and define which data formats should be created.
   By default, the specular reflectivity of all available channels will be exported to separate ASCII files and a dialog with a plot of the resulting data will be shown afterwards.
-  Additional output options are a combined ASCII file containing all channels, a matlab or numpy datafile for later processing, a Gnuplot script and image file to plot the ASCII data and a GenX reflectivity modeling template already containing the measured data.
+  Additional output options are a combined ASCII file containing all states, a Matlab or numpy datafile for later processing, a Gnuplot script and image file to plot the ASCII data and a GenX reflectivity modeling template already containing the measured data.
   If you want to send the resulting data to your email address you can use the \textbf{Email Results} tab to enter your address and select whether and which data should be sent after the export.
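Since the exported ASCII files are plain columnar text with commented headers, they can be read back for custom processing with standard tools. A minimal Python sketch (the file name and the assumption that the first three columns are Qz, R and dR are illustrative; check the header of your own export):

    import numpy as np
    import matplotlib.pyplot as plt

    # lines starting with '#' (the header) are skipped automatically
    qz, refl, d_refl = np.loadtxt('REF_M_1234_Specular_++.dat',
                                  comments='#', usecols=(0, 1, 2), unpack=True)

    plt.errorbar(qz, refl, yerr=d_refl, fmt='.')
    plt.yscale('log')
    plt.xlabel('$Q_z$ ($\\AA^{-1}$)')
    plt.ylabel('Reflectivity')
    plt.show()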
@@ -137,4 +137,10 @@ \subsection{Step 6: Export your data}
 \section{Examples}
   This section will give three example datasets, which you can use to try the reduction yourself and compare the result with the images in this manual.
 
-\section{Common problems to be aware of}
+\section{Common complications to be aware of}
+  \begin{description}
+  \item[Bent/twisted samples] Samples which do not have a flat surface can produce "fan"-like or split reflections. In these cases the peak fitting can result in different extraction areas for different incident angles. Use a small X window and/or the \textbf{"Fan" Reflection} option (see section \ref{sec:fan}) from the \textbf{Reflectivity Extraction (Advanced)} tab and define the reflection position manually.
+
+  \item[Runs don't fit together] This is often caused by one of two things: One possibility is that there are very sharp features in the reflectivity which are measured differently for the different angles, as the resolution changes with the angle. In this case the only option is to try to merge the datasets manually as well as possible. If the measurement is polarized, take a look at the other spin state; sometimes the feature won't be as pronounced, making it easier to find the scaling factor. The other possibility is that the direct beam position, which should be measured before the experiment and written to the datafile, is not correct. This can be checked with the corresponding direct beam run. If the \textbf{SANGLE-calc} value for the direct beam run is not close to zero you can use the \textbf{Adjust Direct Beam action} \icon{tthZero} to overwrite the values read from the datafile. Afterwards it should be possible to extract the reflectivity normally. Don't forget to reset this overwrite afterwards using \textbf{Clear Overwrite} (see section \ref{sec:overwrite}).
+  \end{description}
+
diff --git a/doc/user_manual/introduction.tex b/doc/user_manual/introduction.tex
index 8d18ca5..37fe3ca 100644
--- a/doc/user_manual/introduction.tex
+++ b/doc/user_manual/introduction.tex
@@ -2,5 +2,29 @@ \chapter{Introduction and Background}
 \label{chap:introduction}
 \section{The data recorded at Beamline 4A}
+  The raw data recorded at the magnetism reflectometer stores each detected neutron as one event, which includes information on
+  the position on the detector, the relative time passed after the neutron pulse was created at the target, the absolute time of the
+  corresponding pulse and some instrument flags such as the flipper ON/OFF state.
+  After one run is finished this data is translated to the NeXus (.nxs) file format (which is based on the HDF5 standard) as two separate files,
+  one with event information (flexible) and one with 3D histograms in X, Y and time of flight coordinates (fast).
+
+  While the X and Y positions can be used in conjunction with the instrument motor positions to gain information about the scattering
+  angle, the time of flight (ToF) together with the moderator-to-detector distance allows the neutron wavelength to be deduced.
+  Combining these three degrees of freedom allows the data to be transformed into reciprocal space coordinates (for reflectivity
+  only the \Qz coordinate is relevant).
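+  For a total flight path $L$ and time of flight $t$ these relations are simply
+  $\lambda = \frac{h}{m_n} \cdot \frac{t}{L}$, with $h$ Planck's constant and $m_n$ the neutron mass,
+  and $Q_z = \frac{4\pi}{\lambda} \sin\theta$ for a reflection angle $\theta$.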
+
+  In order to derive the correct \Qz-dependent intensities a normalization to the incident beam is necessary.
+  For this the direct beam intensity of the same wavelength band and with the same instrumental setup is measured
+  prior to the actual data. This measurement is later used as reference when calculating the reflectivity.
+  When different incident angles are measured (as is most often the case), each incident angle takes a different part out of
+  the direct beam and generally won't have the same scaling factor as the earlier measurement. For the extracted reflectivity
+  these different runs need to be combined.
 
 \section{What does QuickNXS data reduction do?}
+  QuickNXS is a comprehensive tool to carry out the operations described above, starting from the NeXus files, and to export the
+  data in a form usable for plotting and fitting with reflectivity modeling software.
+  The program includes real-time plots of several projections of the raw file data and previews of the exported data.
+  Several automated algorithms aid the process, easing the learning curve for non-expert users and speeding up
+  the extraction of normal datasets while keeping the flexibility to apply special treatment to more
+  complicated cases.
+
+  \textbf{For a quick start reflectivity extraction guide see section \ref{sec:quick_start}.}
diff --git a/doc/user_manual/quicknxsusersmanual.kilepr b/doc/user_manual/quicknxsusersmanual.kilepr
index feb169c..2ebad54 100644
--- a/doc/user_manual/quicknxsusersmanual.kilepr
+++ b/doc/user_manual/quicknxsusersmanual.kilepr
@@ -3,7 +3,7 @@ def_graphic_ext=png
 img_extIsRegExp=false
 img_extensions=.eps .jpg .jpeg .png .pdf .ps .fig .gif
 kileprversion=2
-kileversion=2.1.0
+kileversion=2.1.3
 lastDocument=QuickNXS_Users_Manual.tex
 masterDocument=QuickNXS_Users_Manual.tex
 name=QuickNXS Users Manual
@@ -76,7 +76,7 @@ Mode=LaTeX
 ReadWrite=true
 
 [document-settings,item:user_interface.tex]
-Bookmarks=
+Bookmarks=43
 Encoding=UTF-8
 FoldedColumns=
 FoldedLines=
@@ -90,17 +90,17 @@ archive=true
 column=0
 encoding=UTF-8
 highlight=LaTeX
-line=2
+line=1
 mode=LaTeX
-open=false
-order=5
+open=true
+order=0
 
 [item:advanced.tex]
 archive=true
-column=0
+column=4
 encoding=UTF-8
 highlight=LaTeX
-line=7
+line=46
 mode=LaTeX
 open=false
 order=4
@@ -117,17 +117,17 @@
 order=-1
 
 [item:data_reduction.tex]
 archive=true
-column=18
+column=307
 encoding=UTF-8
 highlight=LaTeX
-line=122
+line=141
 mode=LaTeX
 open=false
 order=2
 
 [item:header_html.tex]
 archive=true
-column=17
+column=0
 encoding=UTF-8
 highlight=LaTeX
 line=0
 mode=LaTeX
 order=0
 
@@ -137,10 +137,10 @@
 [item:introduction.tex]
 archive=true
-column=0
+column=96
 encoding=UTF-8
 highlight=LaTeX
-line=6
+line=29
 mode=LaTeX
 open=false
 order=1
@@ -157,62 +157,62 @@
 order=-1
 
 [item:titlepage.tex]
 archive=true
-column=52
+column=15
 encoding=UTF-8
 highlight=LaTeX
-line=17
+line=12
 mode=LaTeX
 open=false
 order=6
 
 [item:user_interface.tex]
 archive=true
-column=85
+column=23
 encoding=UTF-8
 highlight=LaTeX
-line=60
+line=50
 mode=LaTeX
 open=false
 order=3
 
 [view-settings,view=0,item:QuickNXS_Users_Manual.tex]
 CursorColumn=0
-CursorLine=2
+CursorLine=1
 JumpList=
 ViMarks=
 
 [view-settings,view=0,item:advanced.tex]
-CursorColumn=0
-CursorLine=7
+CursorColumn=4
+CursorLine=46
 JumpList=
 ViMarks=
 
 [view-settings,view=0,item:data_reduction.tex]
-CursorColumn=18
-CursorLine=122
+CursorColumn=307
+CursorLine=141
 JumpList=
 ViMarks=
 
 [view-settings,view=0,item:header_html.tex]
-CursorColumn=17
+CursorColumn=0
 CursorLine=0
 JumpList=
 ViMarks=
 
 [view-settings,view=0,item:introduction.tex]
-CursorColumn=0
-CursorLine=6
+CursorColumn=96
+CursorLine=29
 JumpList=
 ViMarks=
 
 [view-settings,view=0,item:titlepage.tex]
-CursorColumn=52
-CursorLine=17
+CursorColumn=15
+CursorLine=12
 JumpList=
 ViMarks=
 
 [view-settings,view=0,item:user_interface.tex]
-CursorColumn=85
-CursorLine=60
+CursorColumn=23
+CursorLine=50
 JumpList=
-ViMarks=
+ViMarks=a,43,0
diff --git a/doc/user_manual/screenshots/background.png b/doc/user_manual/screenshots/background.png
index c283caf..5941f1b 100644
Binary files a/doc/user_manual/screenshots/background.png and b/doc/user_manual/screenshots/background.png differ
diff --git a/doc/user_manual/screenshots/normalize1.png b/doc/user_manual/screenshots/normalize1.png
index a492ef9..d55d2f6 100644
Binary files a/doc/user_manual/screenshots/normalize1.png and b/doc/user_manual/screenshots/normalize1.png differ
diff --git a/doc/user_manual/screenshots/normalize2.png b/doc/user_manual/screenshots/normalize2.png
index 2d3e12a..6b520aa 100644
Binary files a/doc/user_manual/screenshots/normalize2.png and b/doc/user_manual/screenshots/normalize2.png differ
diff --git a/doc/user_manual/screenshots/normalize3.png b/doc/user_manual/screenshots/normalize3.png
index 8ff5d14..21a22ad 100644
Binary files a/doc/user_manual/screenshots/normalize3.png and b/doc/user_manual/screenshots/normalize3.png differ
diff --git a/doc/user_manual/screenshots/normalize_after.png b/doc/user_manual/screenshots/normalize_after.png
index 6982dca..2bbeb3f 100644
Binary files a/doc/user_manual/screenshots/normalize_after.png and b/doc/user_manual/screenshots/normalize_after.png differ
diff --git a/doc/user_manual/screenshots/normalizemap1.png b/doc/user_manual/screenshots/normalizemap1.png
index ae6b0e0..a2734bd 100644
Binary files a/doc/user_manual/screenshots/normalizemap1.png and b/doc/user_manual/screenshots/normalizemap1.png differ
diff --git a/doc/user_manual/screenshots/normalizemap2.png b/doc/user_manual/screenshots/normalizemap2.png
index d589b78..a311bd5 100644
Binary files a/doc/user_manual/screenshots/normalizemap2.png and b/doc/user_manual/screenshots/normalizemap2.png differ
diff --git a/doc/user_manual/screenshots/overview.png b/doc/user_manual/screenshots/overview.png
index 967f0a8..102efc7 100644
Binary files a/doc/user_manual/screenshots/overview.png and b/doc/user_manual/screenshots/overview.png differ
diff --git a/doc/user_manual/screenshots/overview.svg b/doc/user_manual/screenshots/overview.svg
index f63a522..2032f8f 100644
--- a/doc/user_manual/screenshots/overview.svg
+++ b/doc/user_manual/screenshots/overview.svg
@@ -14,11 +14,11 @@
    height="590"
    id="svg2"
    version="1.1"
-   inkscape:version="0.48.3.1 r9886"
+   inkscape:version="0.48.4 r9939"
    sodipodi:docname="overview.svg"
    inkscape:export-filename="/home/agf/Software/Scripte/QuickNXS/doc/user_manual/screenshots/overview_labels.png"
-   inkscape:export-xdpi="300"
-   inkscape:export-ydpi="300">
+   inkscape:export-xdpi="300.04153"
+   inkscape:export-ydpi="300.04153">
    activeFolder
    Toolbar
@@ -319,7 +319,7 @@
      xml:space="preserve"
      id="flowRoot4558"
      style="font-size:40px;font-style:normal;font-weight:normal;text-align:center;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;font-family:Sans"
-     transform="translate(267.14286,-73.57143)">
+     transform="translate(267.14286,-73.57143)">
diff --git a/doc/user_manual/user_interface.tex b/doc/user_manual/user_interface.tex
--- a/doc/user_manual/user_interface.tex
+++ b/doc/user_manual/user_interface.tex
-  \item[{\icon{document-open} File -> Open... (CTRL+O)}] Shows a dialog to select a file to be loaded. The filename filter depends if "Histogram" or "Event" mode are selected in the Files area.
+  \item[{\icon{document-open} File -> Open... (CTRL+O)}] Shows a dialog to select a file to be loaded. The filename filter depends on the selection of "Histogram" or "Event" mode in the Files area.
   \item[File -> Open Sum... (CTRL+SHIFT+O)] Allows opening several files to sum up their intensities.
   \item[{\icon{listDown} File -> Next File (CTRL+D)}] Opens the file below the active selection in the Files area.
   \item[{\icon{listUp} File -> Previous File (CTRL+SHIFT+D)}] Opens the file above the active selection in the Files area.
+  \item[File -> Load Extraction... (CTRL+ALT+O)] Reads the header of an exported dataset (.dat) to reconstruct the options and load the data files used for the reduction. This can be used to improve an already exported dataset.
 
-  \item[{\icon{extractNormalization} Reduction -> Set Normalization (CTRL+W)}] Use the data extracted from the current file as a normalization dataset. You can add as many of these datasets as you like or remove them by activating this action again. The appropriate normalization file is selected using the number of time of flight channels in a file and the central wavelength, if this is ambiguous a dialog is shown to the user to select one dataset.
-  \item[{\icon{clearNorm} Reduction -> Clear Normalization (CTRL+SHIFT+W)}] Use the data extracted from the current file as a normalization dataset. You can add as many of these datasets as you like or remove them by activating this action again. The appropriate normalization file is selected using the number of time of flight channels in a file and the central wavelength, if this is ambiguous a dialog is shown to the user to select one dataset.
+  \item[{\icon{extractNormalization} Reduction -> Set Normalization (CTRL+W)}] Use the data extracted from the current file as a direct beam dataset. You can add as many of these datasets as you like or remove them by activating this action again. The appropriate direct beam run is selected using the number of time of flight channels in the file and the central wavelength; if this is ambiguous, a dialog is shown to select one dataset.
+  \item[{\icon{clearNorm} Reduction -> Clear Normalization (CTRL+SHIFT+W)}] Empty the list of direct beam datasets.
 
-  \item[{\icon{totalReflection} Reduction -> Set Scaling (CTRL+S)}] For the first dataset it tries to find the edge of total reflection and fit a constant to all points before it to normalize it to one. For the second dataset it fits a polynomial to the overlapping region of the active dataset and the closest one found in the reduction table to stitch them together. It is helpful to first define a suitable range of cut points to improve the results.
-  \item[{\icon{cutPoints} Reduction -> Cut Points (L/R) (CTRL+SHIFT+C)}] Tries to select good cut points for the given wavelength band based on the corresponding direct beam measurement
-  \item[{\icon{addRef} Reduction -> Keep Item in List (CTRL+Q)}] Use the reflectivity from the current dataset and add it the the reduction list. Only works for already normalized dataset. The options in the reduction list can still be changed later.
+  \item[{\icon{totalReflection} Reduction -> Set Scaling (CTRL+S)}] For the first dataset this tries to find the edge of total reflection and fits a constant to all points before it, which is then normalized to one. For subsequent datasets it fits a polynomial to the overlapping region of the active dataset and the closest one found in the reduction table to stitch them together. It is helpful to first define a suitable range of cut points to improve the results.
+  \item[{\icon{cutPoints} Reduction -> Cut Points (L/R) (CTRL+SHIFT+C)}] Tries to select good cut points for the given wavelength band based on the corresponding direct beam measurement.
+  \item[{\icon{addRef} Reduction -> Keep Item in List (CTRL+Q)}] Use the reflectivity from the current dataset and add it to the reduction list. Only works for datasets with a direct beam run.
+  The options in the reduction list can still be changed later.
   \item[{\icon{delRef} Reduction -> Remove Line}] Remove the selected line from the reduction list.
   \item[{\icon{clearRef} Reduction -> Clear List (CTRL+SHIFT+Q)}] Clear the full reduction list to start a new set of reflectivities.
   \item[{\icon{reduce} Reduction -> Reduce...}] Use the items and options in the reduction table to export a dataset. Shows a dialog to select how the export should be done.
 
-  \item[{\icon{findXauto} Advanced -> Automatic Peak Finder}] If checked, the program runs a peak finder and peak fitting algorithm on the X-projection of the data each time a new dataset is loaded and sets the X-center parameter accordingly.
-  \item[{\icon{limitYauto} Advanced -> Automatic Y Limits}] If checked, the program detects the region, where the intensity in the Y-projection drops below a certain threshold and sets the Y-center and Y-width parameters accordingly. After adding the first dataset to the reduction table the option is switched off automatically.
-  \item[{\icon{fitXPos} Advanced -> Refine X}] If checked, each time the user clicks on the X-projection plot to select another X-center position, a Gaussian fit is executed to refine the position.
-  \item[{Advanced -> Advanced Background ...}] Open a dialog with additional options for the background subtraction.
+  \item[{Advanced -> Advanced Background ... (CTRL+B)}] Open a dialog with additional options for the background subtraction.
   \item[{\icon{tthZero} Advanced -> Adjust Direct Beam}] For datasets where the direct pixel and/or DANGLE0 values are not correctly defined, this action can take the current X-position of a direct beam measurement to set overwrite parameters accordingly.
   \item[{ Advanced -> Clear Overwrite}] This clears the overwrite parameters defined with "Adjust Direct Beam".
+
+  \item[{ Advanced -> Polarization...}] Open a tool to plot polarization parameters of dedicated measurements. Not intended for users.
+  \item[{ Advanced -> Raw Data Comparison...}] Show a dialog to compare the raw data of the current run (not divided by direct beam) with the direct beam measurement and background. Can be helpful to identify problems with the direct beam normalization or the extraction window.
   \item[{ Advanced -> Open Compare Window...}] Open a dialog which allows the direct comparison of different reflectivity measurements; it can be used several times and is equivalent to the Compare tab of the main window.
-  \end{description}
+
+  \item[{\icon{findXauto} Automatics -> Automatic Peak Finder}] If checked, the program runs a peak finder and peak fitting algorithm on the X-projection of the data each time a new dataset is loaded and sets the X-center parameter accordingly. This works very reliably in general and is activated by default.
+  \item[{\icon{limitYauto} Automatics -> Automatic Y Limits}] If checked, the program detects the region where the intensity in the Y-projection drops below a certain threshold and sets the Y-center and Y-width parameters accordingly. After adding the first dataset to the reduction table the option is switched off automatically to prevent issues at higher incident angles, where the intensity is comparable to the background. Clearing the reduction list will reactivate the option.
+  \item[{\icon{autoRef} Automatics -> Auto Reflectivity}] When extracting several reflectivities with a similar experimental setup the extraction parameters will likely be similar as well.
+  Activate this action at the total reflection run and it will use the current settings to subsequently scale and add all following datasets as long as the incident angle increases. Afterwards the \textbf{Strip Overlap} action is invoked. For good quality data this is a convenient way to reduce the full reflectivity very quickly.
+  \item[{\icon{stripOverlap} Automatics -> Strip Overlap}] This option can be invoked directly before reducing the data to remove overlapping points between subsequent runs. The points are removed from the lower \Qz run, as these normally will have much lower statistics.
+
+  \item[Help Menu] Online access to this manual and an about dialog.
+
+  \item[Debug Menu] Useful things for program debugging.
+  \end{description}
 
 \section{The Overview Tab}
-  This central tab shows information on the current dataset and the data reduction. A label at the top indicates the current file number, experiment ID, measurement type and the currently selected channel.
+  This central tab shows information on the current dataset and the data reduction. A label at the top indicates the current file number, experiment ID, measurement type and the currently selected spin state.
   The two map plots below the label show the projected intensities on the horizontal and vertical detector axes (left) and on the time of flight and horizontal detector axes (right), in the same way as they are shown during data acquisition.
-  In the center some important parameters, extracted from the datafile header, are displayed. The αi, 2Θ and Counts ROI parameter also depends on the selected X- and Y- region and is thus not directly read from the file.
+  In the center some important parameters, extracted from the datafile header, are displayed. The \textit{SANGLE-calc} and \textit{Counts ROI} parameters also depend on the selected X- and Y-region and are thus not directly read from the file.
   The mouse can be used to define the X- and Y-region in these plots, similar to the projection areas described below.
-  At the bottom you can find the reduction table and an additional tab with a list of defined normalization datasets.
+  At the bottom you can find the reduction table and an additional tab with a list of defined direct beam datasets.
   These tables show the parameters used for the respective intensity extractions.
   The parameters in the Data tab can be edited afterwards and will be applied directly to the reflectivity curve shown in the Reflectivity area.
-  Directly above the table is a label showing the numbers of all defined normalization files and a drop-down selection for the current dataset channel shown in the Overview, projections and reflectivity plots.
-  Selecting a channel not present in the current file will result in a fallback to the first channel.
+  Directly above the table is a label showing the numbers of all defined direct beam runs and radio buttons to select the current dataset's spin state shown in the Overview, projections and reflectivity plots.
 
 \section{Plots and Options}
   The areas on the left and right of the window contain e.g. the projection plots and extraction parameters and are visible on any tab of the main interface.
   The left side contains option fields for the extraction parameters and file readout while the right side is dedicated to important plots.
   Here is a list of the available areas from top to bottom, first on the left and then on the right-hand side.
 \begin{description}
-  \item[Files] A list of all datafile in the current directory together with an entry to search for a file by number and select to extract either histogram or event mode data. In the event mode setting additional options will be displayed.
+  \item[Files] A list of all datafiles in the current directory, together with an entry to search for a file by number and a selector to extract either histogram or event mode data. In the event mode setting additional options will be displayed.
 
-  \item[Reflectivity Extraction (Basic)] The parameters used to extract the active reflectivity. When adding a dataset to the reduction list, these parameters are stored.
+  \item[Reflectivity Extraction (Basic)] The parameters used to extract the active reflectivity. When adding a dataset to the reduction list, these parameters are stored in the Data table.
   \item[Reflectivity Extraction (Advanced)] Settings to change the extraction method or overwrite parameters otherwise read from the datafile. Options for the stitching algorithm can be found here as well.
   \item[Algorithm Parameters] Settings for the peak finder algorithm, not important for normal user operation.
-  \item[Plot Options] Global settings for the shown plots, does not effect the data reduction in any way. Here you can also chose to show the 2D datasets in wavelength and angle instead of time of flight and pixel.
+  \item[Plot Options] Global settings for the shown plots; they do not affect the data reduction in any way. Here you can also choose to show the 2D datasets in wavelength and angle coordinates instead of time of flight and pixel.
 
-  \item[X-Projection] A plot with the data of the loaded file projected on the detector X-axis. Green lines indicate the background region defined at the moment. The X-position is marked with a black line and the X-width with two red lines. The mouse can be used to change the background region and X-center using the left mouse button and set the X-width using the right mouse button.
-  \item[Y-Projection] An equivalent projection on the detector Y-axis, showing the selected Y-region with red lines. The mouse can be used to change the Y-region using left clicks.
+  \item[X-Projection] A plot with the data of the loaded file projected on the detector X-axis. Black lines indicate the currently defined background region. The X-position is marked with a black line and the X-width with two red lines. The mouse can be used to change the background region and X-center using the left mouse button and set the X-width using the right mouse button.
+  \item[Y-Projection] An equivalent projection on the detector Y-axis, showing the selected Y-region with green lines. The mouse can be used to change the Y-region using left clicks.
 
-  \item[Reflectivity] Show all datasets already added to the reduction list and the currently selected one. For unnormalized datasets it show intensity and background vs. wavelength. Datasets in the reduction list can be scaled with the mouse wheel when at the right x-coordinates (faster scaling when CTRL is pressed while scrolling).
+  \item[Reflectivity] Shows all datasets already added to the reduction list (colored lines) and the currently selected one (black line labeled 'Active'). For datasets without a matching direct beam run it shows intensity and background vs. wavelength.
+  Datasets in the reduction list can be scaled with the mouse wheel when the cursor is at the corresponding x-coordinates (faster scaling when CTRL is pressed while scrolling).
+  Note that a dataset can be shown twice (as "Active" and with its run number) when it is already in the reduction list and selected.
 \end{description}
 
 \section{Convenient Parameter Alteration with the Mouse Wheel}
-  You can use the mouse wheel when you cursor is on top of a value entry to increase or decrease the according parameter.
+  You can use the mouse wheel when your cursor is on top of a value entry to increase or decrease the corresponding parameter.
   This can be very convenient to see the result of e.g. changing the scaling factor for the current reflectivity.
   Holding down the CTRL key while scrolling increases the speed of the parameter changes.
   The same method can also be used to scale datasets in the reduction table, simply by moving the mouse onto the curve in the reflectivity plot and scrolling with the mouse wheel.
@@ -94,9 +106,10 @@ \section{Plots}
   Each of the plots described above is created with the same framework and has a toolbar below it:\\
 \includegraphics[width=6cm]{screenshots/plottoolbar.png}\\
   The first five items allow navigation on the plot, like zooming in and out or moving the current view position.
-  The third icon from the left opens a dialog, which can be used to change the amount of freespace around the plot to fit the current window scaling.
-  The last two icons can be used to save or print the plot.
-  \textbf{Keep in mind that the X- and Y-projection as well as the overview maps can be used to select the extraction parameters.}
+  The fourth icon from the left opens a dialog, which can be used to change the amount of free space around the plot to fit the current window scaling.
+  The following two icons can be used to save or print the plot. The last button toggles between logarithmic and linear plotting (this is not persistent after changing the dataset).
+
+  \textbf{Keep in mind that the X- and Y-projections as well as the overview maps can be used to select the extraction parameters.}
   This will only work when no scaling tool is selected from the plot toolbar.
 
 \section{Data reduction table}
diff --git a/quick_nxs/config.py b/quick_nxs/config.py
index 1e7ebf7..a200ca5 100644
--- a/quick_nxs/config.py
+++ b/quick_nxs/config.py
@@ -1,7 +1,7 @@
 #-*- coding: utf-8 -*-
 '''
-  Global configurations for e.g. default paths etc. Some of these are stored
-  in the users account for easy changes.
+Global configurations for e.g. default paths etc. Some of these are stored
+in the users account for easy changes.
 '''
 
 import os
diff --git a/quick_nxs/htmldoc/QuickNXS_Users_Manual.html b/quick_nxs/htmldoc/QuickNXS_Users_Manual.html
index 7f78186..4b6d66f 100644
--- a/quick_nxs/htmldoc/QuickNXS_Users_Manual.html
+++ b/quick_nxs/htmldoc/QuickNXS_Users_Manual.html
@@ -90,7 +90,7 @@
 HREF="node3.html#SECTION00330000000000000000">3.3 Quick start: Step-by-step standard reduction
-Babel and hyphenation patterns for english, usenglishmax, dumylang, noh
-yphenation, farsi, arabic, croatian, bulgarian, ukrainian, russian, czech, slov
-ak, danish, dutch, finnish, french, basque, ngerman, german, german-x-2009-06-1
-9, ngerman-x-2009-06-19, ibycus, monogreek, greek, ancientgreek, hungarian, san
-skrit, italian, latin, latvian, lithuanian, mongolian2a, mongolian, bokmal, nyn
-orsk, romanian, irish, coptic, serbian, turkish, welsh, esperanto, uppersorbian
-, estonian, indonesian, interlingua, icelandic, kurmanji, slovenian, polish, po
-rtuguese, spanish, galician, catalan, swedish, ukenglish, pinyin, loaded.
+LaTeX2e <2011/06/27>
+Babel and hyphenation patterns for english, dumylang, nohyphenation, et
+hiopic, farsi, arabic, pinyin, croatian, bulgarian, ukrainian, russian, slovak,
+ czech, danish, dutch, usenglishmax, ukenglish, finnish, french, basque, ngerma
+n, german, swissgerman, ngerman-x-2012-05-30, german-x-2012-05-30, monogreek, g
+reek, ibycus, ancientgreek, hungarian, bengali, tamil, hindi, telugu, gujarati,
+ sanskrit, malayalam, kannada, assamese, marathi, oriya, panjabi, italian, lati
+n, latvian, lithuanian, mongolian, mongolianlmc, nynorsk, bokmal, indonesian, e
+speranto, coptic, welsh, irish, interlingua, serbian, serbianc, slovenian, friu
+lan, romansh, estonian, romanian, armenian, uppersorbian, turkish, afrikaans, i
+celandic, kurmanji, polish, portuguese, galician, catalan, spanish, swedish, th
+ai, loaded.
 
 ! LaTeX Error: \usepackage before \documentclass.
 
@@ -25,9 +29,9 @@
 l.3 \usepackage{
                 \usepackage may only appear in the document preamble, i.e.,
                 between \documentclass and \begin{document}.
-(/usr/share/texmf-texlive/tex/latex/base/book.cls
+(/usr/share/texlive/texmf-dist/tex/latex/base/book.cls
 Document Class: book 2007/10/19 v1.4h Standard LaTeX document class
-(/usr/share/texmf-texlive/tex/latex/base/bk12.clo
+(/usr/share/texlive/texmf-dist/tex/latex/base/bk12.clo
 File: bk12.clo 2007/10/19 v1.4h Standard LaTeX file (size option)
 )
 \c@part=\count79
@@ -42,19 +46,19 @@ File: bk12.clo 2007/10/19 v1.4h Standard LaTeX file (size option)
 \abovecaptionskip=\skip41
 \belowcaptionskip=\skip42
 \bibindent=\dimen102
-) (/usr/share/texmf-texlive/tex/latex/base/ifthen.sty
+) (/usr/share/texlive/texmf-dist/tex/latex/base/ifthen.sty
 Package: ifthen 2001/05/26 v1.1c Standard LaTeX ifthen package (DPC)
-) (/usr/share/texmf-texlive/tex/latex/base/inputenc.sty
+) (/usr/share/texlive/texmf-dist/tex/latex/base/inputenc.sty
 Package: inputenc 2008/03/30 v1.1d Input encoding file
 \inpenc@prehook=\toks14
 \inpenc@posthook=\toks15
-(/usr/share/texmf-texlive/tex/latex/base/utf8.def
+(/usr/share/texlive/texmf-dist/tex/latex/base/utf8.def
 File: utf8.def 2008/04/05 v1.1m UTF-8 support for inputenc
 Now handling font encoding OML ...
 ... no UTF-8 mapping file for font encoding OML
 Now handling font encoding T1 ...
 ... processing UTF-8 mapping file for font encoding T1
-(/usr/share/texmf-texlive/tex/latex/base/t1enc.dfu
+(/usr/share/texlive/texmf-dist/tex/latex/base/t1enc.dfu
 File: t1enc.dfu 2008/04/05 v1.1m UTF-8 support for inputenc
 defining Unicode char U+00A1 (decimal 161)
 defining Unicode char U+00A3 (decimal 163)
@@ -203,7 +207,7 @@ File: t1enc.dfu 2008/04/05 v1.1m UTF-8 support for inputenc
 )
 Now handling font encoding OT1 ...
 ...
processing UTF-8 mapping file for font encoding OT1 -(/usr/share/texmf-texlive/tex/latex/base/ot1enc.dfu +(/usr/share/texlive/texmf-dist/tex/latex/base/ot1enc.dfu File: ot1enc.dfu 2008/04/05 v1.1m UTF-8 support for inputenc defining Unicode char U+00A1 (decimal 161) defining Unicode char U+00A3 (decimal 163) @@ -233,7 +237,7 @@ File: ot1enc.dfu 2008/04/05 v1.1m UTF-8 support for inputenc ) Now handling font encoding OMS ... ... processing UTF-8 mapping file for font encoding OMS -(/usr/share/texmf-texlive/tex/latex/base/omsenc.dfu +(/usr/share/texlive/texmf-dist/tex/latex/base/omsenc.dfu File: omsenc.dfu 2008/04/05 v1.1m UTF-8 support for inputenc defining Unicode char U+00A7 (decimal 167) defining Unicode char U+00B6 (decimal 182) @@ -256,7 +260,7 @@ Now handling font encoding U ... defining Unicode char U+2026 (decimal 8230) defining Unicode char U+2122 (decimal 8482) defining Unicode char U+2423 (decimal 9251) -)) (/usr/share/texmf-texlive/tex/latex/psnfss/mathptmx.sty +)) (/usr/share/texlive/texmf-dist/tex/latex/psnfss/mathptmx.sty Package: mathptmx 2005/04/12 PSNFSS-v9.2a Times w/ Math, improved (SPQR, WaS) LaTeX Font Info: Redeclaring symbol font `operators' on input line 28. LaTeX Font Info: Overwriting symbol font `operators' in version `normal' @@ -291,14 +295,14 @@ LaTeX Font Info: Overwriting math alphabet `\mathit' in version `normal' LaTeX Font Info: Overwriting math alphabet `\mathit' in version `bold' (Font) OT1/cmr/bx/it --> OT1/ptm/m/it on input line 35. LaTeX Info: Redefining \hbar on input line 50. -) (/usr/share/texmf-texlive/tex/latex/psnfss/helvet.sty +) (/usr/share/texlive/texmf-dist/tex/latex/psnfss/helvet.sty Package: helvet 2005/04/12 PSNFSS-v9.2a (WaS) -(/usr/share/texmf-texlive/tex/latex/graphics/keyval.sty +(/usr/share/texlive/texmf-dist/tex/latex/graphics/keyval.sty Package: keyval 1999/03/16 v1.13 key=value parser (DPC) \KV@toks@=\toks16 -)) (/usr/share/texmf-texlive/tex/latex/base/fontenc.sty +)) (/usr/share/texlive/texmf-dist/tex/latex/base/fontenc.sty Package: fontenc 2005/09/27 v1.99g Standard LaTeX package -(/usr/share/texmf-texlive/tex/latex/base/t1enc.def +(/usr/share/texlive/texmf-dist/tex/latex/base/t1enc.def File: t1enc.def 2005/09/27 v1.99g Standard LaTeX file LaTeX Font Info: Redeclaring font encoding T1 on input line 43. )) @@ -314,42 +318,42 @@ LaTeX Font Info: Redeclaring font encoding T1 on input line 43. defining Unicode char U+931 (decimal 2353) defining Unicode char U+916 (decimal 2326) defining Unicode char U+960 (decimal 2400) -(/usr/share/texmf-texlive/tex/latex/graphics/graphicx.sty +(/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty Package: graphicx 1999/02/16 v1.0f Enhanced LaTeX Graphics (DPC,SPQR) -(/usr/share/texmf-texlive/tex/latex/graphics/graphics.sty +(/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty Package: graphics 2009/02/05 v1.0o Standard LaTeX Graphics (DPC,SPQR) -(/usr/share/texmf-texlive/tex/latex/graphics/trig.sty +(/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty Package: trig 1999/03/16 v1.09 sin cos tan (DPC) -) (/etc/texmf/tex/latex/config/graphics.cfg -File: graphics.cfg 2009/08/28 v1.8 graphics configuration of TeX Live +) (/usr/share/texlive/texmf-dist/tex/latex/latexconfig/graphics.cfg +File: graphics.cfg 2010/04/23 v1.9 graphics configuration of TeX Live ) Package graphics Info: Driver file: dvips.def on input line 91. 
-(/usr/share/texmf-texlive/tex/latex/graphics/dvips.def +(/usr/share/texlive/texmf-dist/tex/latex/graphics/dvips.def File: dvips.def 1999/02/16 v3.0i Driver-dependant file (DPC,SPQR) )) \Gin@req@height=\dimen103 \Gin@req@width=\dimen104 -) (/usr/share/texmf-texlive/tex/latex/wrapfig/wrapfig.sty +) (/usr/share/texlive/texmf-dist/tex/latex/wrapfig/wrapfig.sty \wrapoverhang=\dimen105 \WF@size=\dimen106 \c@WF@wrappedlines=\count88 \WF@box=\box26 \WF@everypar=\toks17 Package: wrapfig 2003/01/31 v 3.6 -) (/usr/share/texmf-texlive/tex/latex/amsmath/amsmath.sty +) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsmath.sty Package: amsmath 2000/07/18 v2.13 AMS math features \@mathmargin=\skip43 For additional information on amsmath, use the `?' option. -(/usr/share/texmf-texlive/tex/latex/amsmath/amstext.sty +(/usr/share/texlive/texmf-dist/tex/latex/amsmath/amstext.sty Package: amstext 2000/06/29 v2.01 -(/usr/share/texmf-texlive/tex/latex/amsmath/amsgen.sty +(/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsgen.sty File: amsgen.sty 1999/11/30 v2.0 \@emptytoks=\toks18 \ex@=\dimen107 -)) (/usr/share/texmf-texlive/tex/latex/amsmath/amsbsy.sty +)) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsbsy.sty Package: amsbsy 1999/11/29 v1.2d \pmbraise@=\dimen108 -) (/usr/share/texmf-texlive/tex/latex/amsmath/amsopn.sty +) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsopn.sty Package: amsopn 1999/12/14 v2.01 operator names ) \inf@bad=\count89 @@ -389,36 +393,36 @@ LaTeX Font Info: Redeclaring font encoding OMS on input line 568. \mathdisplay@stack=\toks22 LaTeX Info: Redefining \[ on input line 2666. LaTeX Info: Redefining \] on input line 2667. -) (/usr/share/texmf-texlive/tex/latex/amsfonts/amssymb.sty +) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amssymb.sty Package: amssymb 2009/06/22 v3.00 -(/usr/share/texmf-texlive/tex/latex/amsfonts/amsfonts.sty +(/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amsfonts.sty Package: amsfonts 2009/06/22 v3.00 Basic AMSFonts support \symAMSa=\mathgroup6 \symAMSb=\mathgroup7 LaTeX Font Info: Overwriting math alphabet `\mathfrak' in version `bold' (Font) U/euf/m/n --> U/euf/b/n on input line 96. -)) (/usr/share/texmf-texlive/tex/latex/was/upgreek.sty +)) (/usr/share/texlive/texmf-dist/tex/latex/was/upgreek.sty Package: upgreek 2003/02/12 v2.0 (WaS) Package upgreek Info: Using Euler Roman for upright Greek on input line 31. \symugrf@m=\mathgroup8 LaTeX Font Info: Overwriting symbol font `ugrf@m' in version `bold' (Font) U/eur/m/n --> U/eur/b/n on input line 38. 
-) (/usr/share/texmf-texlive/tex/latex/units/units.sty +) (/usr/share/texlive/texmf-dist/tex/latex/units/units.sty Package: units 1998/08/04 v0.9b Typesetting units -(/usr/share/texmf-texlive/tex/latex/units/nicefrac.sty +(/usr/share/texlive/texmf-dist/tex/latex/units/nicefrac.sty Package: nicefrac 1998/08/04 v0.9b Nice fractions \L@UnitsRaiseDisplaystyle=\skip46 \L@UnitsRaiseTextstyle=\skip47 \L@UnitsRaiseScriptstyle=\skip48 -)) (/usr/share/texmf-texlive/tex/latex/tools/xspace.sty -Package: xspace 2006/05/08 v1.12 Space after command names (DPC,MH) -) (/usr/share/texmf-texlive/tex/latex/graphics/color.sty +)) (/usr/share/texlive/texmf-dist/tex/latex/tools/xspace.sty +Package: xspace 2009/10/20 v1.13 Space after command names (DPC,MH) +) (/usr/share/texlive/texmf-dist/tex/latex/graphics/color.sty Package: color 2005/11/14 v1.0j Standard LaTeX Color (DPC) -(/etc/texmf/tex/latex/config/color.cfg +(/usr/share/texlive/texmf-dist/tex/latex/latexconfig/color.cfg File: color.cfg 2007/01/18 v1.5 color configuration of teTeX/TeXLive ) Package color Info: Driver file: dvips.def on input line 130. -(/usr/share/texmf-texlive/tex/latex/graphics/dvipsnam.def +(/usr/share/texlive/texmf-dist/tex/latex/graphics/dvipsnam.def File: dvipsnam.def 1999/02/16 v3.0i Driver-dependant file (DPC,SPQR) )) @@ -465,7 +469,7 @@ LaTeX Font Info: ... okay on input line 181. LaTeX Font Info: Checking defaults for U/cmr/m/n on input line 181. LaTeX Font Info: ... okay on input line 181. LaTeX Font Info: Try loading font information for T1+ptm on input line 181. -(/usr/share/texmf-texlive/tex/latex/psnfss/t1ptm.fd +(/usr/share/texlive/texmf-dist/tex/latex/psnfss/t1ptm.fd File: t1ptm.fd 2001/06/04 font definitions for T1/ptm. ) @@ -493,27 +497,27 @@ latex2htmlLength evensidemargin=39.0pt LaTeX Font Info: Try loading font information for OT1+ztmcm on input line 20 5. -(/usr/share/texmf-texlive/tex/latex/psnfss/ot1ztmcm.fd +(/usr/share/texlive/texmf-dist/tex/latex/psnfss/ot1ztmcm.fd File: ot1ztmcm.fd 2000/01/03 Fontinst v1.801 font definitions for OT1/ztmcm. ) LaTeX Font Info: Try loading font information for OML+ztmcm on input line 20 5. -(/usr/share/texmf-texlive/tex/latex/psnfss/omlztmcm.fd +(/usr/share/texlive/texmf-dist/tex/latex/psnfss/omlztmcm.fd File: omlztmcm.fd 2000/01/03 Fontinst v1.801 font definitions for OML/ztmcm. ) LaTeX Font Info: Try loading font information for OMS+ztmcm on input line 20 5. -(/usr/share/texmf-texlive/tex/latex/psnfss/omsztmcm.fd +(/usr/share/texlive/texmf-dist/tex/latex/psnfss/omsztmcm.fd File: omsztmcm.fd 2000/01/03 Fontinst v1.801 font definitions for OMS/ztmcm. ) LaTeX Font Info: Try loading font information for OMX+ztmcm on input line 20 5. -(/usr/share/texmf-texlive/tex/latex/psnfss/omxztmcm.fd +(/usr/share/texlive/texmf-dist/tex/latex/psnfss/omxztmcm.fd File: omxztmcm.fd 2000/01/03 Fontinst v1.801 font definitions for OMX/ztmcm. ) LaTeX Font Info: Try loading font information for OT1+ptm on input line 205. -(/usr/share/texmf-texlive/tex/latex/psnfss/ot1ptm.fd +(/usr/share/texlive/texmf-dist/tex/latex/psnfss/ot1ptm.fd File: ot1ptm.fd 2001/06/04 font definitions for OT1/ptm. ) LaTeX Font Info: Font shape `OT1/ptm/bx/n' in size <12> not available @@ -522,63 +526,63 @@ LaTeX Font Info: Font shape `OT1/ptm/bx/n' in size <9> not available (Font) Font shape `OT1/ptm/b/n' tried instead on input line 205. LaTeX Font Info: Font shape `OT1/ptm/bx/n' in size <7> not available (Font) Font shape `OT1/ptm/b/n' tried instead on input line 205. 
-l2hSize :tex2html_wrap_inline1350:8.09999pt::0.0pt::8.50151pt. +l2hSize :tex2html_wrap_inline1582:8.09999pt::0.0pt::8.50151pt. [1 ] -l2hSize :tex2html_wrap_inline1352:8.79688pt::8.79688pt::8.55013pt. +l2hSize :tex2html_wrap_inline1584:8.79688pt::8.79688pt::8.55013pt. [2 ] -l2hSize :tex2html_wrap_inline1354:8.59999pt::8.59999pt::7.90193pt. +l2hSize :tex2html_wrap_inline1586:8.59999pt::8.59999pt::7.90193pt. [3 ] -l2hSize :tex2html_wrap_inline1356:8.29688pt::0.0pt::6.61203pt. +l2hSize :tex2html_wrap_inline1588:8.29688pt::0.0pt::6.61203pt. [4 ] -l2hSize :tex2html_wrap_inline1358:8.29688pt::0.0pt::7.10141pt. +l2hSize :tex2html_wrap_inline1590:8.29688pt::0.0pt::7.10141pt. [5 ] -l2hSize :tex2html_wrap_inline1360:8.09999pt::0.0pt::7.86304pt. +l2hSize :tex2html_wrap_inline1592:8.09999pt::0.0pt::7.86304pt. [6 ] -l2hSize :tex2html_wrap_inline1362:8.59999pt::8.59999pt::7.17595pt. +l2hSize :tex2html_wrap_inline1594:8.59999pt::8.59999pt::7.17595pt. [7 ] -l2hSize :tex2html_wrap_inline1364:8.29688pt::0.0pt::7.25049pt. +l2hSize :tex2html_wrap_inline1596:8.29688pt::0.0pt::7.25049pt. [8 ] -l2hSize :tex2html_wrap_inline1366:8.69586pt::8.69586pt::9.49191pt. +l2hSize :tex2html_wrap_inline1598:8.69586pt::8.69586pt::9.49191pt. [9 ] -l2hSize :tex2html_wrap_inline1368:8.09999pt::0.0pt::7.70393pt. +l2hSize :tex2html_wrap_inline1600:8.09999pt::0.0pt::7.70393pt. [10 ] -l2hSize :tex2html_wrap_inline1370:8.19586pt::0.0pt::7.94403pt. +l2hSize :tex2html_wrap_inline1602:8.19586pt::0.0pt::7.94403pt. [11 ] -l2hSize :tex2html_wrap_inline1372:8.09999pt::0.0pt::7.90842pt. +l2hSize :tex2html_wrap_inline1604:8.09999pt::0.0pt::7.90842pt. [12 @@ -618,7 +622,7 @@ If that doesn't work, type X to quit. File: ../../icons/logo.pdf Graphic file (type eps) <../../icons/logo.pdf>) -l2hSize :figure1373:487.15965pt::0.0pt::349.0pt. +l2hSize :figure1605:487.15965pt::0.0pt::349.0pt. [1 @@ -626,43 +630,43 @@ l2hSize :figure1373:487.15965pt::0.0pt::349.0pt. ] -l2hSize :tex2html_wrap_inline1448:8.59999pt::8.59999pt::13.2649pt. +l2hSize :tex2html_wrap_inline1609:8.59999pt::8.59999pt::13.2649pt. [2 ] -l2hSize :tex2html_wrap_inline1455:10.8768pt::10.8768pt::13.48398pt. +l2hSize :tex2html_wrap_inline1699:10.8768pt::10.8768pt::13.48398pt. [3 ] -l2hSize :tex2html_wrap_indisplay1458:11.47679pt::11.47679pt::40.12003pt. +l2hSize :tex2html_wrap_indisplay1702:11.47679pt::11.47679pt::40.12003pt. [4 ] -l2hSize :tex2html_wrap_indisplay1460:19.23732pt::19.23732pt::256.65932pt. +l2hSize :tex2html_wrap_indisplay1704:19.23732pt::19.23732pt::256.65932pt. [5 ] -l2hSize :tex2html_wrap_indisplay1462:9.5pt::9.5pt::37.46619pt. +l2hSize :tex2html_wrap_indisplay1706:9.5pt::9.5pt::37.46619pt. [6 ] -l2hSize :tex2html_wrap_indisplay1464:11.47679pt::11.47679pt::78.88448pt. +l2hSize :tex2html_wrap_indisplay1708:11.47679pt::11.47679pt::78.88448pt. [7 ] (./images.aux (./titlepage.aux)) ) Here is how much of TeX's memory you used: - 2511 strings out of 493849 - 29748 string characters out of 1152846 - 75805 words of memory out of 3000000 - 5721 multiletter control sequences out of 15000+50000 + 2517 strings out of 493486 + 30335 string characters out of 3143550 + 76945 words of memory out of 3000000 + 5759 multiletter control sequences out of 15000+200000 23177 words of font info for 48 fonts, out of 3000000 for 9000 - 714 hyphenation exceptions out of 8191 + 957 hyphenation exceptions out of 8191 27i,5n,24p,224b,372s stack positions out of 5000i,500n,10000p,200000b,50000s Output written on images.dvi (19 pages, 5280 bytes). 
diff --git a/quick_nxs/htmldoc/images.pl b/quick_nxs/htmldoc/images.pl index 76f42b1..e52c201 100644 --- a/quick_nxs/htmldoc/images.pl +++ b/quick_nxs/htmldoc/images.pl @@ -4,49 +4,49 @@ $key = q/upalpha;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \upalpha$|; $key = q/upTheta;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \upTheta$|; $key = q/upgamma;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \upgamma$|; $key = q/upbeta;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \upbeta$|; $key = q/uppi;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \uppi$|; $key = q/displaystylep(Q)=;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$\displaystyle p(Q) =$|; $key = q/uplambda;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \uplambda$|; $key = q/uprho;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \uprho$|; @@ -58,7 +58,7 @@ $key = q/chi^2;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \chi^2$|; @@ -70,38 +70,38 @@ $key = q/upDelta;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \upDelta$|; $key = q/displaystyleunderset{DS1}{sum}frac{(log(I_i)-p(Q_i))^2}{(delta{}I_islashI_i)^2}+um}frac{(log(I_jcdotscale)-p(Q_j))^2}{(delta{}I_jslashI_j)^2};MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$\displaystyle \underset{DS1}{\sum} \frac{(\log(I_i)-p(Q_i))^2}{(\delta{}I_i/I_i...
 ...underset{DS2}{\sum} \frac{(\log(I_j\cdot scale)-p(Q_j))^2}{(\delta{}I_j/I_j)^2}$|; $key = q/displaystyleacdotQ^2+bcdotQ+c;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$\displaystyle a\cdot Q^2 + b\cdot Q +c$|; $key = q/displaystylechi_{stitch}^2=;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$\displaystyle \chi_{stitch}^2 =$|; $key = q/Q_z;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ Q_z$|; $key = q/uptheta;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \uptheta$|; @@ -114,7 +114,7 @@ $key = q/updelta;MSF=1.6;LFS=12;AAT/; $cached_env_img{$key} = q|$ \updelta$|; diff --git a/quick_nxs/htmldoc/images.tex b/quick_nxs/htmldoc/images.tex index 54681d6..4019f6a 100644 --- a/quick_nxs/htmldoc/images.tex +++ b/quick_nxs/htmldoc/images.tex @@ -201,79 +201,79 @@ % !!! IMAGES START HERE !!! {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1350}% +\lthtmlinlinemathA{tex2html_wrap_inline1582}% $ \upalpha$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1352}% +\lthtmlinlinemathA{tex2html_wrap_inline1584}% $ \upbeta$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1354}% +\lthtmlinlinemathA{tex2html_wrap_inline1586}% $ \upgamma$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1356}% +\lthtmlinlinemathA{tex2html_wrap_inline1588}% $ \updelta$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1358}% +\lthtmlinlinemathA{tex2html_wrap_inline1590}% $ \uplambda$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1360}% +\lthtmlinlinemathA{tex2html_wrap_inline1592}% $ \upsigma$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1362}% +\lthtmlinlinemathA{tex2html_wrap_inline1594}% $ \uprho$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1364}% +\lthtmlinlinemathA{tex2html_wrap_inline1596}% $ \uptheta$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1366}% +\lthtmlinlinemathA{tex2html_wrap_inline1598}% $ \upTheta$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1368}% +\lthtmlinlinemathA{tex2html_wrap_inline1600}% $ \upSigma$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1370}% +\lthtmlinlinemathA{tex2html_wrap_inline1602}% $ \upDelta$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1372}% +\lthtmlinlinemathA{tex2html_wrap_inline1604}% $ \uppi$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlfigureA{figure1373}% +\lthtmlfigureA{figure1605}% \begin{figure}\vbox{\include{titlepage} }\end{figure}% \lthtmlfigureZ @@ -281,6 +281,12 @@ \stepcounter{chapter} \stepcounter{section} +{\newpage\clearpage +\lthtmlinlinemathA{tex2html_wrap_inline1609}% +$ Q_z$% +\lthtmlinlinemathZ +\lthtmlcheckvsize\clearpage} + \stepcounter{section} \stepcounter{chapter} \stepcounter{section} @@ -297,39 +303,33 @@ \stepcounter{subsection} \stepcounter{subsection} \stepcounter{subsection} -{\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1448}% -$ Q_z$% -\lthtmlinlinemathZ -\lthtmlcheckvsize\clearpage} - 
\stepcounter{subsection} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_inline1455}% +\lthtmlinlinemathA{tex2html_wrap_inline1699}% $ \chi^2$% \lthtmlinlinemathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_indisplay1458}% +\lthtmlinlinemathA{tex2html_wrap_indisplay1702}% $\displaystyle \chi_{stitch}^2 =$% \lthtmlindisplaymathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_indisplay1460}% +\lthtmlinlinemathA{tex2html_wrap_indisplay1704}% $\displaystyle \underset{DS1}{\sum} \frac{(\log(I_i)-p(Q_i))^2}{(\delta{}I_i/I_i)^2} + \underset{DS2}{\sum} \frac{(\log(I_j\cdot scale)-p(Q_j))^2}{(\delta{}I_j/I_j)^2}$% \lthtmlindisplaymathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_indisplay1462}% +\lthtmlinlinemathA{tex2html_wrap_indisplay1706}% $\displaystyle p(Q) =$% \lthtmlindisplaymathZ \lthtmlcheckvsize\clearpage} {\newpage\clearpage -\lthtmlinlinemathA{tex2html_wrap_indisplay1464}% +\lthtmlinlinemathA{tex2html_wrap_indisplay1708}% $\displaystyle a\cdot Q^2 + b\cdot Q +c$% \lthtmlindisplaymathZ \lthtmlcheckvsize\clearpage} diff --git a/quick_nxs/htmldoc/img1.png b/quick_nxs/htmldoc/img1.png index 63b6f48..c1f5776 100644 Binary files a/quick_nxs/htmldoc/img1.png and b/quick_nxs/htmldoc/img1.png differ diff --git a/quick_nxs/htmldoc/img11.png b/quick_nxs/htmldoc/img11.png index 3a500ad..1e0754b 100644 Binary files a/quick_nxs/htmldoc/img11.png and b/quick_nxs/htmldoc/img11.png differ diff --git a/quick_nxs/htmldoc/img12.png b/quick_nxs/htmldoc/img12.png index 65b09c9..93871be 100644 Binary files a/quick_nxs/htmldoc/img12.png and b/quick_nxs/htmldoc/img12.png differ diff --git a/quick_nxs/htmldoc/img13.png b/quick_nxs/htmldoc/img13.png index 4a4a221..3343644 100644 Binary files a/quick_nxs/htmldoc/img13.png and b/quick_nxs/htmldoc/img13.png differ diff --git a/quick_nxs/htmldoc/img14.png b/quick_nxs/htmldoc/img14.png index 24751e8..18d4c4e 100644 Binary files a/quick_nxs/htmldoc/img14.png and b/quick_nxs/htmldoc/img14.png differ diff --git a/quick_nxs/htmldoc/img15.png b/quick_nxs/htmldoc/img15.png index 730b907..78e0708 100644 Binary files a/quick_nxs/htmldoc/img15.png and b/quick_nxs/htmldoc/img15.png differ diff --git a/quick_nxs/htmldoc/img16.png b/quick_nxs/htmldoc/img16.png index 0be3216..6742b0f 100644 Binary files a/quick_nxs/htmldoc/img16.png and b/quick_nxs/htmldoc/img16.png differ diff --git a/quick_nxs/htmldoc/img17.png b/quick_nxs/htmldoc/img17.png index deee8eb..a01a326 100644 Binary files a/quick_nxs/htmldoc/img17.png and b/quick_nxs/htmldoc/img17.png differ diff --git a/quick_nxs/htmldoc/img18.png b/quick_nxs/htmldoc/img18.png index f58d2da..b35d4ae 100644 Binary files a/quick_nxs/htmldoc/img18.png and b/quick_nxs/htmldoc/img18.png differ diff --git a/quick_nxs/htmldoc/img19.png b/quick_nxs/htmldoc/img19.png index 860942c..c2e2251 100644 Binary files a/quick_nxs/htmldoc/img19.png and b/quick_nxs/htmldoc/img19.png differ diff --git a/quick_nxs/htmldoc/img2.png b/quick_nxs/htmldoc/img2.png index 14bfa85..623e49c 100644 Binary files a/quick_nxs/htmldoc/img2.png and b/quick_nxs/htmldoc/img2.png differ diff --git a/quick_nxs/htmldoc/img3.png b/quick_nxs/htmldoc/img3.png index 0e75b69..cc7a73c 100644 Binary files a/quick_nxs/htmldoc/img3.png and b/quick_nxs/htmldoc/img3.png differ diff --git a/quick_nxs/htmldoc/img4.png b/quick_nxs/htmldoc/img4.png index ea98b93..e10b439 100644 Binary files a/quick_nxs/htmldoc/img4.png and b/quick_nxs/htmldoc/img4.png differ diff --git 
a/quick_nxs/htmldoc/img5.png b/quick_nxs/htmldoc/img5.png index 9cf6630..2498923 100644 Binary files a/quick_nxs/htmldoc/img5.png and b/quick_nxs/htmldoc/img5.png differ diff --git a/quick_nxs/htmldoc/img6.png b/quick_nxs/htmldoc/img6.png index b933e52..e05fa31 100644 Binary files a/quick_nxs/htmldoc/img6.png and b/quick_nxs/htmldoc/img6.png differ diff --git a/quick_nxs/htmldoc/img7.png b/quick_nxs/htmldoc/img7.png index 503e3b0..e81cc43 100644 Binary files a/quick_nxs/htmldoc/img7.png and b/quick_nxs/htmldoc/img7.png differ diff --git a/quick_nxs/htmldoc/img8.png b/quick_nxs/htmldoc/img8.png index adfeab7..2bf72ac 100644 Binary files a/quick_nxs/htmldoc/img8.png and b/quick_nxs/htmldoc/img8.png differ diff --git a/quick_nxs/htmldoc/img9.png b/quick_nxs/htmldoc/img9.png index d7cc301..2f7e6c7 100644 Binary files a/quick_nxs/htmldoc/img9.png and b/quick_nxs/htmldoc/img9.png differ diff --git a/quick_nxs/htmldoc/index.html b/quick_nxs/htmldoc/index.html index 7f78186..4b6d66f 100644 --- a/quick_nxs/htmldoc/index.html +++ b/quick_nxs/htmldoc/index.html @@ -90,7 +90,7 @@ HREF="node3.html#SECTION00330000000000000000">3.3 Quick start: Step-by-step standard reduction
  • 1.1 The data recorded at Beamline 4A

+ The raw data recorded at the magnetism reflectometer stores each detected neutron as one event, which includes information on + the position on the detector, the time elapsed since the neutron pulse was created at the target, the absolute time of the + corresponding pulse and some instrument flags such as the flipper ON/OFF state. + After one run is finished this data is translated to the NeXus (.nxs) file format (which is based on the HDF5 standard) as two separate files, + one with event information (flexible) and one with 3D histograms in X, Y and time of flight coordinates (fast). + +
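As an illustration of this event structure, here is a minimal sketch of reading such an event-mode file with h5py; the file name, entry name and field paths below are assumptions modeled on common SNS NeXus layouts, not confirmed QuickNXS internals:

import h5py

# Hypothetical layout: one entry per spin state, events stored per detector bank.
with h5py.File('REF_M_1234_event.nxs', 'r') as nxs:
    entry = nxs['entry-Off_Off']                          # assumed entry name
    pixel = entry['bank1_events/event_id'][:]             # detector pixel of each neutron
    tof = entry['bank1_events/event_time_offset'][:]      # time relative to the pulse (us)
    pulse = entry['bank1_events/event_time_zero'][:]      # absolute time of each pulse (s)

Being able to bin these events freely (e.g. in time or with custom ToF steps) is what makes the event file "flexible", while the histogram file trades that flexibility for speed.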

    +While the X and Y position can be used in conjunction with the instrument motor positions to gain information about the scattering + angle the time of flight (ToF) together with the moderator to detector distance allows to deduce the neutron wavelength. + Combining these three degrees of freedom allows to transform the coordinates into reciprocal space coordinates (for reflectivity + only the $ Q_z$ coordinate is relevant. + +
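The two conversions mentioned here are simple kinematics; a short sketch (the constants and function names are ours for illustration, not QuickNXS's API):

import numpy as np

H_PLANCK = 6.62607e-34   # Planck constant in J*s
M_NEUTRON = 1.67493e-27  # neutron mass in kg

def wavelength_A(tof_s, flightpath_m):
    # de Broglie: lambda = h*t / (m_n * L), converted from m to Angstrom
    return H_PLANCK*tof_s/(M_NEUTRON*flightpath_m)*1e10

def q_z(theta_rad, lam_A):
    # specular momentum transfer perpendicular to the surface
    return 4.*np.pi*np.sin(theta_rad)/lam_A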

+In order to derive the correct $ Q_z$-dependent intensities, a normalization to the incident beam is necessary. + For this the direct beam intensity of the same wavelength band and with the same instrumental setup is measured + prior to the actual data. This measurement is later used as a reference when calculating the reflectivity. + When different incident angles are measured (as is most often the case), each incident angle takes a different part out of + the direct beam and generally won't have the same scaling factor as the earlier measurement. For the extracted reflectivity + these different runs need to be combined.
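Conceptually the normalization is a channel-by-channel division of the reflected intensity by the direct beam intensity; a minimal sketch, assuming both have already been binned to the same wavelength channels:

import numpy as np

def normalize(reflected, direct_beam):
    # divide wavelength channel by wavelength channel;
    # channels without direct beam intensity are set to zero
    safe = np.maximum(direct_beam, 1e-30)
    return np.where(direct_beam > 0., reflected/safe, 0.)

The per-angle scaling factors needed to combine the runs afterwards are what the Set Scaling action described later determines.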

    1.2 What does QuickNXS data reduction do?

    -

+ QuickNXS is a comprehensive tool to carry out the operations described above, starting from the NeXus files, and to export the + data in a form usable for plotting and fitting with reflectivity modeling software. + The program includes real-time plots of several projections of the raw file data and previews of the exported data. + Several automated algorithms aid the process to flatten the learning curve for non-expert users and to speed up + the extraction of normal datasets while keeping the flexibility to apply special treatments in more + complicated cases. + For a quick-start reflectivity extraction guide see section 3.3. + + + diff --git a/quick_nxs/htmldoc/node2.html b/quick_nxs/htmldoc/node2.html index 166ed64..d7d8d9f 100644 --- a/quick_nxs/htmldoc/node2.html +++ b/quick_nxs/htmldoc/node2.html @@ -93,7 +93,7 @@

In addition to the default elements of a typical graphical user interface (GUI) like menu-, tool- and statusbar the QuickNXS program has a central area, which can be switched using the tab bar above, and several plot and option areas on the left and right side, always visible. Most users will work only with the "Overview" tab visible in the image above as it contains all valuable information about the currently loaded dataset and the reduction parameters. The areas on the left and right can be customized in size and hold parameter entries for the reduction as well as the projection and reflectivity plots, most important for the extraction. -The other tabs, shown below, allow a parallel view on the 2D maps of all channels in the active dataset as well as an preview of off-specular scattering and motor/controller logs from the current file. +The other tabs, shown below, allow a parallel view on the 2D maps of all spin states in the active dataset as well as a preview of off-specular and GISANS scattering, motor/controller logs from the current file and a large plot area to compare different reflectivities extracted earlier.

    @@ -122,8 +122,9 @@

-Each of the plots has an own toolbar described in the plots section and show one specific aspect of your dataset. Often it is important to look not only on one of these plots to analyze the data, so it is good to familiarize with what you see there. - All options available in the toolbar are duplicated somewhere in the menus, to make it easier to find, what you need. +Each of the plots has its own toolbar described in the plots section and shows one specific aspect of your dataset. + It is often important to look at more than one of these plots to analyze the data, so it is good to familiarize yourself with what you see there. + All options available in the toolbar are duplicated somewhere in the menus, to make it easier to find what you need. The most important actions have keystrokes assigned to them for convenience. The keys have been chosen to be accessible only with the right hand, so you can use them together with the mouse. Most GUI elements have a tool tip assigned, so you can always position the mouse cursor over any element to get a more detailed description of what it is used for. @@ -140,7 +141,7 @@

    WIDTH="16" HEIGHT="16" ALIGN="BOTTOM" BORDER="0" SRC="./document-open.png" ALT="Image document-open"> File -> Open... (CTRL+O) -
    Shows a dialog to select a file to be loaded. The filename filter depends if "Histogram" or "Event" mode are selected in the Files area. +
    Shows a dialog to select a file to be loaded. The filename filter depends on the selection of "Histogram" or "Event" mode in the Files area.
    File -> Open Sum... (CTRL+SHIFT+O)
    @@ -159,6 +160,10 @@

    SRC="./listUp.png" ALT="Image listUp"> File -> Previous File (CTRL+SHIFT+D)
    Opens the file above the active selection in the Files area. + +
    +
    File -> Load Extraction... (CTRL+ALT+O)
    +
    Reads the header of an exported dataset (.dat) to reconstruct the options and load the data files used for the reduction. This can be used to improve an already exported dataset.

    @@ -166,14 +171,14 @@

    WIDTH="16" HEIGHT="16" ALIGN="BOTTOM" BORDER="0" SRC="./extractNormalization.png" ALT="Image extractNormalization"> Reduction -> Set Normalization (CTRL+W) -
    Use the data extracted from the current file as a normalization dataset. You can add as many of these datasets as you like or remove them by activating this action again. The appropriate normalization file is selected using the number of time of flight channels in a file and the central wavelength, if this is ambiguous a dialog is shown to the user to select one dataset. +
Use the data extracted from the current file as a direct beam dataset. You can add as many of these datasets as you like or remove them by activating this action again. The appropriate direct beam run is selected using the number of time of flight channels in the file and the central wavelength; if this is ambiguous, a dialog is shown to select one dataset.
    Image clearNorm Reduction -> Clear Normalization (CTRL+SHIFT+W)
    -
    Use the data extracted from the current file as a normalization dataset. You can add as many of these datasets as you like or remove them by activating this action again. The appropriate normalization file is selected using the number of time of flight channels in a file and the central wavelength, if this is ambiguous a dialog is shown to the user to select one dataset. +
    Empty the list of direct beam datasets.

    @@ -181,21 +186,21 @@

    WIDTH="16" HEIGHT="16" ALIGN="BOTTOM" BORDER="0" SRC="./totalReflection.png" ALT="Image totalReflection"> Reduction -> Set Scaling (CTRL+S) -
    For the first dataset it tries to find the edge of total reflection and fit a constant to all points before it to normalize it to one. For the second dataset it fits a polynomial to the overlapping region of the active dataset and the closest one found in the reduction table to stitch them together. It is helpful to first define a suitable range of cut points to improve the results. +
For the first dataset this tries to find the edge of total reflection and fits a constant to all points before the edge, normalizing the total reflection to one. For subsequent datasets it fits a polynomial to the overlapping region of the active dataset and the closest one found in the reduction table to stitch them together. It is helpful to first define a suitable range of cut points to improve the results.
    Image cutPoints Reduction -> Cut Points (L/R) (CTRL+SHIFT+C)
    -
    Tries to select good cut points for the given wavelength band based on the corresponding direct beam measurement +
    Tries to select good cut points for the given wavelength band based on the corresponding direct beam measurement.
    Image addRef Reduction -> Keep Item in List (CTRL+Q)
    -
    Use the reflectivity from the current dataset and add it the the reduction list. Only works for already normalized dataset. The options in the reduction list can still be changed later. +
Use the reflectivity from the current dataset and add it to the reduction list. Only works for datasets with a matching direct beam run. The options in the reduction list can still be changed later.
    Use the items and options in the reduction table to export a dataset. Shows a dialog to select how the export should be done.

    +

    +
    Advanced -> Advanced Background ... (CTRL+B)
    +
    Open a dialog with additional options for the background subtraction. +
    Image findXauto Advanced -> Automatic Peak Finder
    -
    If checked, the program runs a peak finder and peak fitting algorithm on the X-projection of the data each time a new dataset is loaded and sets the X-center parameter accordingly. + SRC="./tthZero.png" + ALT="Image tthZero"> Advanced -> Adjust Direct Beam +
For datasets where the direct pixel and/or DANGLE0 values are not correctly defined, this action can take the current X-position of a direct beam measurement to set the overwrite parameters accordingly. + +
    +
    Advanced -> Clear Overwrite
    +
    This clears the overwrite parameters defined with "Adjust Direct Beam". + +

    +

    +
    Advanced -> Polarization...
    +
    Open a tool to plot polarization parameters of dedicated measurements. Not intended for users. + +
    +
    Advanced -> Raw Data Comparison...
    +
    Show a dialog to compare the raw data of the current run (not divided by direct beam) with the direct beam measurement and background. Can be helpful to identify problems with the direct beam normalization or the extraction window. +
    +
    Advanced -> Open Compare Window...
    +
Open a dialog which allows the direct comparison of different reflectivity measurements; it can be used several times and is equivalent to the Compare tab of the main window. + +

    Image limitYauto Advanced -> Automatic Y Limits
    -
    If checked, the program detects the region, where the intensity in the Y-projection drops below a certain threshold and sets the Y-center and Y-width parameters accordingly. After adding the first dataset to the reduction table the option is switched off automatically. + SRC="./findXauto.png" + ALT="Image findXauto"> Automatics -> Automatic Peak Finder +
If checked, the program runs a peak finder and peak fitting algorithm on the X-projection of the data each time a new dataset is loaded and sets the X-center parameter accordingly. This works very reliably in general and is activated by default (a sketch of this kind of peak search is shown after this menu list).
    Image fitXPos Advanced -> Refine X
    -
    If checked, each time the user clicks on the X-projection plot to select another X-center position, a Gaussian fit is executed to refine the position. + SRC="./limitYauto.png" + ALT="Image limitYauto"> Automatics -> Automatic Y Limits +
If checked, the program detects the region where the intensity in the Y-projection drops below a certain threshold and sets the Y-center and Y-width parameters accordingly. After adding the first dataset to the reduction table the option is switched off automatically to prevent issues at higher incident angles, where the intensity is comparable to the background. Clearing the reduction list will reactivate the option.
    -
    Advanced -> Advanced Background ...
    -
    Open a dialog with additional options for the background subtraction. +
    Image autoRef Automatics -> Auto Reflectivity
    +
When extracting several reflectivities with a similar experimental setup the extraction parameters will likely be similar as well. Activate this action at the total reflection run and it will use the current settings to subsequently scale and add all following datasets as long as the incident angle increases. Afterwards the Strip Overlap action is invoked. For good-quality data this is a convenient way to reduce the full reflectivity very quickly.
    Image tthZero Advanced -> Adjust Direct Beam
    -
    For datasets where the direct pixel and or DANGLE0 values are not correctly defined, this action can take the current X-position of a direct beam measurement to set overwrite parameters accordingly. - + SRC="./stripOverlap.png" + ALT="Image stripOverlap"> Automatics -> Strip Overlap +
This option can be invoked directly before reducing the data to remove overlapping points between subsequent runs. The points are removed from the lower $ Q_z$ run, as these normally have much lower statistics. + +

    -
    Advanced -> Clear Overwrite
    -
    This clears the overwrite parameters defined with "Adjust Direct Beam". - +
    Help Menu
    +
    Online access to this manual and an about dialog. + +

    -
    Advanced -> Open Compare Window...
    -
    Open a dialog which allows the direct comparison of different reflectivity measurements, can be used several times and is equivalent to the Compare tab of the main window. - +
    Debug Menu
    +
    Useful things for program debugging. +
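To make the Automatic Peak Finder entry above more concrete, here is a minimal sketch of a CWT-based peak search with Gaussian refinement in the spirit of [PDu2006]; it is our illustration under assumed widths and starting values, not the code QuickNXS actually runs:

import numpy as np
from scipy.signal import find_peaks_cwt
from scipy.optimize import curve_fit

def gauss(x, x0, sigma, amp, bg):
    return amp*np.exp(-0.5*((x-x0)/sigma)**2) + bg

def find_x_center(x_proj):
    # candidate peaks from a continuous wavelet transform based search
    candidates = find_peaks_cwt(x_proj, widths=np.arange(2, 10))
    x0 = max(candidates, key=lambda idx: x_proj[idx])  # strongest candidate
    x = np.arange(len(x_proj), dtype=float)
    p0 = [float(x0), 3., x_proj[x0], x_proj.min()]     # assumed start values
    popt, _ = curve_fit(gauss, x, x_proj, p0=p0)       # refine with a Gaussian fit
    return popt[0]                                     # X-center in pixels

The same Gaussian refinement is the natural fit for the Refine X behavior, where a click on the X-projection is snapped to the fitted peak position.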
    @@ -268,17 +303,16 @@

    2.3 The Overview Tab

- This central tab shows information on the current dataset and the data reduction. A label at the top indicates the current file number, experiment ID, measurement type and the currently selected channel. + This central tab shows information on the current dataset and the data reduction. A label at the top indicates the current file number, experiment ID, measurement type and the currently selected spin state. The two map plots below the label show the projected intensities on the horizontal and vertical detector axes (left) and on the time of flight and horizontal detector axes (right), in the same way as it is shown during data acquisition. - In the center some important parameters, extracted from the datafile header, are displayed. The αi, 2Θ and Counts ROI parameter also depends on the selected X- and Y- region and is thus not directly read from the file. + In the center some important parameters, extracted from the datafile header, are displayed. The SANGLE-calc and Counts ROI parameters also depend on the selected X- and Y-region and are thus not directly read from the file. The mouse can be used to define the X- and Y-region in these plots, similarly to the projection areas described below.

-At the bottom you can find the reduction table and an additional tab with a list of defined normalization datasets. +At the bottom you can find the reduction table and an additional tab with a list of defined direct beam datasets. These tables show the parameters used for the respective intensity extractions. The parameters in the Data tab can be edited afterwards and will be applied directly to the reflectivity curve shown in the Reflectivity area. - Directly above the table is a label showing the numbers of all defined normalization files and a drop-down selection for the current dataset channel shown in the Overview, projections and reflectivity plots. - Selecting a channel not present in the current file will result in a fallback to the first channel. + Directly above the table is a label showing the numbers of all defined direct beam runs and radio buttons to select the current dataset's spin state shown in the Overview, projections and reflectivity plots.

    @@ -291,12 +325,12 @@

    Files
    -
    A list of all datafile in the current directory together with an entry to search for a file by number and select to extract either histogram or event mode data. In the event mode setting additional options will be displayed. +
A list of all datafiles in the current directory, together with an entry to search for a file by number and a selector to extract either histogram or event mode data. In the event mode setting additional options will be displayed.

    Reflectivity Extraction (Basic)
    -
    The parameters used to extract the active reflectivity. When adding a dataset to the reduction list, these parameters are stored. +
    The parameters used to extract the active reflectivity. When adding a dataset to the reduction list, these parameters are stored in the Data table.
    Reflectivity Extraction (Advanced)
    @@ -308,21 +342,23 @@

    Plot Options
    -
    Global settings for the shown plots, does not effect the data reduction in any way. Here you can also chose to show the 2D datasets in wavelength and angle instead of time of flight and pixel. +
Global settings for the shown plots; these do not affect the data reduction in any way. Here you can also choose to show the 2D datasets in wavelength and angle coordinates instead of time of flight and pixel. + +

    X-Projection
    -
    A plot with the data of the loaded file projected on the detector X-axis. Green lines indicate the background region defined at the moment. The X-position is marked with a black line and the X-width with two red lines. The mouse can be used to change the background region and X-center using the left mouse button and set the X-width using the right mouse button. +
    A plot with the data of the loaded file projected on the detector X-axis. Black lines indicate the background region defined at the moment. The X-position is marked with a black line and the X-width with two red lines. The mouse can be used to change the background region and X-center using the left mouse button and set the X-width using the right mouse button.
    Y-Projection
    -
    An equivalent projection on the detector Y-axis, showing the selected Y-region with red lines. The mouse can be used to change the Y-region using left clicks. +
    An equivalent projection on the detector Y-axis, showing the selected Y-region with green lines. The mouse can be used to change the Y-region using left clicks.

    Reflectivity
    -
    Show all datasets already added to the reduction list and the currently selected one. For unnormalized datasets it show intensity and background vs. wavelength. Datasets in the reduction list can be scaled with the mouse wheel when at the right x-coordinates (faster scaling when CTRL is pressed while scrolling). +
Shows all datasets already added to the reduction list (colored lines) and the currently selected one (black line labeled 'Active'). For datasets without a matching direct beam run it shows intensity and background vs. wavelength. + Datasets in the reduction list can be scaled with the mouse wheel when the cursor is at the right x-coordinates (faster scaling when CTRL is pressed while scrolling). + Note that a dataset can be shown twice (as "Active" and with its run number) when it is already in the reduction list and selected.

    @@ -331,7 +367,7 @@

    2.5 Convenient Parameter Alteration with the Mouse Wheel

- You can use the mouse wheel when you cursor is on top of a value entry to increase or decrease the according parameter. + You can use the mouse wheel when your cursor is on top of a value entry to increase or decrease the corresponding parameter. This can be very convenient to see the result of e.g. changing the scaling factor for the current reflectivity. Holding down the CTRL key while scrolling increases the speed of the parameter changes. The same method can also be used to scale datasets in the reduction table, simply by moving the mouse onto the curve in the reflectivity plot and scrolling with the mouse wheel. @@ -342,14 +378,16 @@

Each of the plots described above is created with the same framework and has a toolbar below it:
    Image plottoolbar
The first 5 items allow the navigation on the plot, like zooming in and out or moving the current view position. - The third icon from the left opens a dialog, which can be used to change the amount of freespace around the plot to fit the current window scaling. - The last two icons can be used to save or print the plot. - Keep in mind that the X- and Y-projection as well as the overview maps can be used to select the extraction parameters. + The fourth icon from the left opens a dialog, which can be used to change the amount of free space around the plot to fit the current window scaling. + The following two icons can be used to save or print the plot. The last button toggles between logarithmic and linear plotting (this is not persistent after changing the dataset). + +

    +Keep in mind that the X- and Y-projections as well as the overview maps can be used to select the extraction parameters. This will only work when no scaling tool is selected from the plot toolbar.

    diff --git a/quick_nxs/htmldoc/node3.html b/quick_nxs/htmldoc/node3.html index ebc9407..1b77987 100644 --- a/quick_nxs/htmldoc/node3.html +++ b/quick_nxs/htmldoc/node3.html @@ -62,7 +62,7 @@ HREF="node3.html#SECTION00330000000000000000">3.3 Quick start: Step-by-step standard reduction


    @@ -96,17 +96,17 @@


    3.1 Open and view a dataset

- The first step to start with is to enter the number of a normalization dataset in the "Open Number:" entry of the Files area and press enter. The program will now locate the file and open it. + The first step is to enter the number of a direct beam dataset in the "Open Number:" entry of the Files area and press enter. The program will now locate the file and open it. The Files area list will be populated with all files in your current proposal data folder and the plot windows should look similar to this:

    @@ -116,15 +116,15 @@

    @@ -141,16 +141,16 @@

For good quality data (enough intensity and narrow reflection) the program supports a fully automated mode, where all reduction parameters are automatically calculated. This mode will be applied automatically when more than one dataset is selected at the File Open Dialog. - The direct beam measurement have to have lower scan numbers than the actual measurements or need to be set in advance for this method to work. + The direct beam measurements have to have lower scan numbers than the actual measurements or need to be set in advance for this method to work.

The automatic algorithm performs the same steps as described in section 3.3, while trying to guess the best parameters. - The datasets are read one-by-one and, depending on the reflection angle, they are either set as normalization or reflectivity data in the reduction list. + The datasets are read one-by-one and, depending on the reflection angle, they are either set as direct beam or reflectivity data in the reduction list. Here is an example of how the interface might look after the algorithm has finished:

    Image overview @@ -165,13 +165,14 @@


    3.3 Quick start: Step-by-step standard reduction

- For most datasets the reduction is done very similar to the fully automatized method but with more control of the user. - Every dataset is examined by the operator to select the best extraction parameters. + For most datasets the reduction is done very similarly to the fully automated method but with more control by the user. + Every dataset is examined by the operator to select the best extraction parameters. This description should work in + almost all circumstances.

    -3.3.1 Step 1: Set wavelength normalization from direct beam +3.3.1 Step 1: Set direct beam runs

    @@ -181,11 +182,11 @@

    @@ -193,13 +194,16 @@

-Open your normalization file as described in section 3.1. - Make sure the SANGLE-calc value shown in the overview tab is close to zero and that the X- and Y-projections show the correct regions with the red indicators. +Open your direct beam file as described in section 3.1. + Make sure the SANGLE-calc value shown in the overview tab is close to zero and that the X- and Y-projections show the correct regions indicated with vertical lines. Activate the Set Normalization action Image extractNormalization, this will add the current dataset to the "Normalization" list, the "Direct Beam Runs:" label will show the number of the dataset and the reflectivity will show the normalized intensities, which should all be one. - Repeat this step for each direct beam measurement needed for your dataset. + ALT="Image extractNormalization">; this will add the current dataset to the "Direct Beam" list, the "Direct Beam Runs:" label will show the number of the dataset, and the reflectivity will show the normalized intensities, which should all be one. You can use the Cut Points (L/R) action Image cutPoints here to already set reasonable parameters for the Cut Pts entries. + Repeat this step for each direct beam measurement needed for your datasets.

    @@ -208,13 +212,13 @@

    Image normalizemap1 Image normalizemap2   
    Image normalize1 Image normalize2 Image normalize3
    Reflectivity after
    Image normalize3 Image normalize_after
    @@ -222,11 +226,11 @@

    @@ -235,11 +239,11 @@

Although it is in principle possible to define the extraction and background region for each dataset separately, it is recommended to use the same parameters for all files. - From this perspective it is often a good idea to start with the dataset with the highest incident angle, as there the signal to background ration is the lowest. - To produce the best results you should select a large region (statistics), keeping enough distance from the reflected beam (especially when off-specular Bragg-sheets are present) and to not include regions where the background drops (shadowed by the right detector slit for example). + From this perspective it is often a good idea to start with the dataset with the highest incident angle, as the signal to background ratio is the lowest there. + To produce the best results you should select a large region (statistics), keep enough distance from the reflected beam (especially when off-specular Bragg-sheets are present) and not include regions where the background drops (shadowed by the left detector slit for example). The Y-region, shown in the Y-projection of the first (low Q) dataset, is often detected very well automatically. Just check that it fits to the reflected intensity area. - For very small samples it can sometimes make sense to manually restrain the area to the sample reflection. + For very small samples it can sometimes make sense to manually restrict the area to the sample reflection using the right mouse button on the X-Y map.

    @@ -250,25 +254,25 @@

    r0.4 Image totalreflection
Go to your dataset starting at the lowest $ Q_z$ value and remove points from the low $ Q_z$ region which are not reasonable with the Cut Pts parameters: Image cutpoints (can be done automatically with the Cut Points (L/R) Image cutPoints action). Then activate the Set Scaling action Image cutPoints action if not already performed after direct beam selection). Then activate the Set Scaling action Image totalReflection to normalize the total reflection to one. @@ -276,13 +280,13 @@

    Next add the dataset to the refinement list using the Keep Item in List action Image addRef to add the dataset with the current parameters in the list. + ALT="Image addRef">, copying the current parameters to the list. This will automatically switch off the Automatic Y Limits Image limitYauto, so all datasets will be reduced with the same Y-range. This is important for the high $ Q_z$ region as the background often inhibits a good automatic detection of the Y-region. @@ -303,14 +307,14 @@

Now you can continue adding each subsequent dataset one after another. - If nothing goes wrong, the only thing that needs to be changed from dataset are the Cut Pts and Scaling values. + If nothing goes wrong, the only thing that needs to be changed from dataset to dataset is the Scaling parameter. If the scaling of subsequent datasets does not fit, activate the Set Scaling action Image totalReflection again. This fits a polynomial to the logarithmic data of both adjacent datasets including a scaling factor for the second, which is then used for the scaling after the fit. The error-weighted $ \chi^2$ used for this refinement is:
    @@ -323,24 +327,24 @@

    -->

    X-Y map
    Image yregion Image yregionmap
    $\displaystyle \chi_{stitch}^2 =$   $\displaystyle \underset{DS1}{\sum} \frac{(\log(I_i)-p(Q_i))^2}{(\delta{}I_i/I_i...
 ...underset{DS2}{\sum} \frac{(\log(I_j\cdot scale)-p(Q_j))^2}{(\delta{}I_j/I_j)^2}$  
with $\displaystyle p(Q) =$   $\displaystyle a\cdot Q^2 + b\cdot Q +c$    for polynomial order 3 @@ -349,8 +353,8 @@


The resulting fit function is shown in the reflectivity plot together with the scaled data as can be seen in the figure on the right. - For some datasets with very sharp features like multilayer Bragg-peaks this method will not work, in those cases you need to change the Scale 10ˆ parameter manually until the datasets fit together nicely. - For polarized measurements it can sometimes be helpful to switch back and forth between different polarization channels as the variation in contrast can lead to smooth transitions, where the other channel has a sharp feature. + For some datasets with very sharp features like multilayer Bragg-peaks this method will probably not work; in those cases you need to change the Scale 10ˆ parameter manually until the datasets fit together nicely. + For polarized measurements it can sometimes be helpful to switch back and forth between different polarization states as the variation in contrast can lead to smooth transitions, where the other state has a sharp feature. Now add the dataset to the reduction list with Keep Item in List action
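A compact sketch of this stitching fit, implementing the $\chi_{stitch}^2$ given above with numpy/scipy (our illustration under the stated formula, not QuickNXS's exact routine):

import numpy as np
from scipy.optimize import leastsq

def stitch_scale(Q1, I1, dI1, Q2, I2, dI2, order=3):
    # fit the polynomial coefficients and the scaling factor of the second
    # dataset simultaneously on logarithmic, error-weighted intensities
    def residuals(p):
        scale, coeffs = p[0], p[1:]
        r1 = (np.log(I1) - np.polyval(coeffs, Q1))/(dI1/I1)
        r2 = (np.log(I2*scale) - np.polyval(coeffs, Q2))/(dI2/I2)
        return np.concatenate([r1, r2])
    p0 = np.concatenate([[1.], np.zeros(order)])
    popt, _ = leastsq(residuals, p0)
    return popt[0]   # scaling factor for the second dataset

With order=3 the polynomial has three coefficients, i.e. p(Q) = a*Q^2 + b*Q + c as shown above.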

    -r0.6 +r0.33
    - - - - +
As added to reduction list    With changed cut points
    Image stitching2Image cleanpoints 
When all datasets of one measurement have been added, as can be seen in the image on the right, you can try to improve the scaling of the different parts, if needed, and change the cutting parameters. To change the scaling of one dataset you can either change the value of the I0 column entry in the reduction list or move the mouse cursor on top of the curve you want to scale and move the mouse wheel. To remove unwanted points you need to change the values of the NL and NR column entries as they define the number of points cut from the low- and high-Q side respectively. - If the number of time of flight channels in the histogram dataset is larger than the wavelength window used for the measurement it is possible that large values are needed (<=60) to see changes in the dataset. + If the number of time of flight channels in the dataset is larger than the wavelength window it is possible that large values are needed (<=60) to see changes in the dataset. + (Removal of overlapping points with low statistics can be done with Strip Overlap Image stripOverlap.)

    @@ -412,9 +414,9 @@

    SRC="./reduce.png" ALT="Image reduce"> from the menu, toolbar or the button below the reduction list. The reduce dialog has several options for the export of the dataset. - You can select which reductions should be stored, choose the channels to export and define which data formats should be created. + You can select which reductions should be stored, choose the spin states to export and define which data formats should be created. As a default, the specular reflectivity of all available channels will be exported to separate ASCII files and a dialog with a plot of the resulting data will be shown afterwards. - Additional output options are a combined ASCII file containing all channels, a matlab or numpy datafile for later processing, a Gnuplot script and image file to plot the ASCII data and a GenX reflectivity modeling template already containing the measured data. + Additional output options are a combined ASCII file containing all states, a matlab or numpy datafile for later processing, a Gnuplot script and image file to plot the ASCII data and a GenX reflectivity modeling template already containing the measured data. If you want to send the resulting data to your email address you can use the Email Results tab to enter your address and select which and if the data should be send after the export.

    @@ -427,8 +429,22 @@

    -3.5 Common problems to be aware of +3.5 Common complications to be aware of

    +
    +
Bent/Twisted samples
    +
    Samples which do not have a flat surface can produce "fan"-like or split reflections. In these cases the peak fitting can result in different extraction areas for different incident angles. Use a small X window and/or the "Fan" Reflection option (see section 4.5) from the Reflectivity Extraction (Advanced) tab and define the reflection position manually. + +

    +

    +
    Runs don't fit together
    +
This often is caused by one of two reasons: One possibility is that there are very sharp features in the reflectivity which are measured differently for the different angles, as the resolution changes with the angle. In this case the only option is to merge the datasets manually as well as possible. If the measurement is polarized, take a look at the other spin state; sometimes the feature won't be as pronounced there, making it easier to find the scaling factor. The other possibility is that the direct beam position, which should be measured before the experiment and written to the datafile, is not correct. This can be checked with the corresponding direct beam run. If the SANGLE-calc value for the direct beam run is not close to zero you can use the Adjust Direct Beam action Image tthZero to overwrite the values read from the datafile. Afterwards it should be possible to extract the reflectivity normally. Don't forget to reset this overwrite afterwards using Clear Overwrite (see section 4.3). + +
    +