A third approach is to manage data as separate files which contain similar or compatible
information. For example, one file would contain in situ data collected from the lakes, another would
contain stream data, and yet another would contain meteorological or operational data. This third
approach tends to evolve as new sources of data begin to be collected during existing studies. In
this approach the management tools can include statistics packages, database management software, or
popular spreadsheet programs such as Excel. The advantage is that the management tool can be
chosen to optimally fit the data needs. The disadvantages are that the data files exist in different formats
that may not be compatible and that multiple files increase the possibility of management errors. The
first disadvantage is easily overcome by maintaining files in an ASCII (text) format or by exporting them
as needed in compatible formats. The second disadvantage is unavoidable.
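Keeping the separate files in a plain-text format makes it straightforward to combine them when an analysis needs information from more than one source. As a minimal sketch, assuming two hypothetical files of lake and meteorological data that share a date column (shown here as inline strings rather than files on disk):

```python
import csv
import io

# Hypothetical example data: two separate "files" sharing a date column.
# A real study would read these from disk with open() instead of io.StringIO.
lake_csv = "date,lake_temp_c\n2024-06-01,18.2\n2024-06-02,18.9\n"
met_csv = "date,air_temp_c\n2024-06-01,21.5\n2024-06-02,23.1\n"

def read_records(text):
    """Parse an ASCII/CSV file into a dict of rows keyed by date."""
    return {row["date"]: row for row in csv.DictReader(io.StringIO(text))}

lake = read_records(lake_csv)
met = read_records(met_csv)

# Merge on the shared date key so the separate files can be analyzed together.
merged = []
for date in sorted(lake.keys() & met.keys()):
    merged.append({"date": date,
                   "lake_temp_c": lake[date]["lake_temp_c"],
                   "air_temp_c": met[date]["air_temp_c"]})

for row in merged:
    print(row["date"], row["lake_temp_c"], row["air_temp_c"])
```

Because the files stay in ASCII, the same merge could equally be done in a spreadsheet, a statistics package, or a database import, which is exactly the flexibility this third approach offers.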
One recent development deserves final mention: Geographical Information Systems (GIS). These systems
are important for the management and analysis of spatially dependent data of many types. They are
powerful tools, but their applications are specialized. As methods such as remote sensing become
more prevalent and the datasets produced become more common, GIS will be a dominant means of
managing and understanding the results. However, for smaller datasets and those which are not
spatially complex, GIS is not necessary and analysis can be completed with some attention to detail
using other more conventional methods.
This step is important, and similar considerations apply to the transition from the laboratory to the
database. If field data are recorded electronically, the transition is relatively simple: the manager need
only verify that the database contains all of the intended data and nothing else. If data are recorded
manually, however, the paper datasheets must be transcribed into digital form, which is labor- and
time-intensive. Once transcribed, a third party must proofread a printout of the digital data against the
original sheets to detect transcription errors. Detected errors must be corrected and the new printout
proofread again; this process is repeated until no errors are found. Even this does not ensure detection
of every error, and the analysis stage serves as a final check.
Identical precautions must be taken for laboratory or other sources of data such as old printed reports.
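The manual comparison of printouts can be supplemented with an automated check when the data are entered twice by independent transcribers, a technique commonly called double entry. A minimal sketch, with hypothetical transcriptions shown as inline lists:

```python
# Hypothetical sketch: compare two independent transcriptions of the same
# datasheet line by line and flag disagreements for resolution against
# the original paper record.
transcription_a = ["2024-06-01,18.2", "2024-06-02,18.9", "2024-06-03,19.4"]
transcription_b = ["2024-06-01,18.2", "2024-06-02,18.8", "2024-06-03,19.4"]

def find_mismatches(a, b):
    """Return (line_number, value_a, value_b) for every line where the
    two transcriptions disagree."""
    return [(i + 1, x, y) for i, (x, y) in enumerate(zip(a, b)) if x != y]

mismatches = find_mismatches(transcription_a, transcription_b)
for line_no, x, y in mismatches:
    print(f"line {line_no}: {x!r} vs {y!r}")  # resolve against the datasheet
```

Any flagged line must still be resolved by a person consulting the original datasheet; the script only localizes where the two transcriptions disagree.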
There are many approaches to data analysis. However, all persons engaged in data analysis
need to recognize that we live in exciting times, with new and greater computational ability each year.
Everyone must remember, however, that computational tools are intended to aid in the analysis and are
not ends in themselves. People create the ideas and people must judge value and results. There is still
good work that can be done with modest computational means as long as there is sound thought.
If studies are designed according to strict statistical guidelines, then the first analyses are
predetermined statistical routines. In reality, many assessment efforts do not have this luxury and this