I need to time correlate four arrays. The arrays are stored in Final Storage Area 1 as follows:
100 - once/second
101 - once/60 seconds
102 - once/5 minutes
103 - once/60 minutes
All arrays use the following first five elements for time stamping:
e1 - array number
e2 - year
e3 - julian date
e4 - hour/minute
e5 - second
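For reference, a record's first five elements would look something like this (values hypothetical):

100,2010,109,0412,37,...
(e1 = array 100, e2 = year 2010, e3 = day 109, e4 = 04:12, e5 = 37 seconds)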
I want to time correlate the data on a daily basis only: when I run a report, I care only about the data stored for that day. The start of data recording varies each day. For example, during one test we recorded data from around 0400 to 1230; the next day we recorded from 0930 to 1600; and the last day we recorded from 0500 to 0900. All times are approximate (data did not start at exactly 0400; it may have been 04:12:37).
I'd like to write a single statement that will time correlate one day's worth of data regardless of when data starts being recorded. I have a starting condition of:
3[-0]:4[0001]:
We will run the reports daily for each set of data recorded each day, and data will never cross midnight.
This works fine for individual reports, but I have not been able to combine even two of the arrays correctly, much less all four. My problems seem to be in the copy line. Any help would be greatly appreciated.
* Last updated by: MichaelM on 4/18/2010 @ 9:03 AM *
--- Next post ---
Update.
I've been able to correlate by minute, more or less.
For the start condition, I've entered:
3[-0]:4[0001]:
in all four inputs.
In the copy field, I've entered the following for each array:
array   copy field
100     1[100]
101     1[101]
102     1[102]
103     1[103]and::4[1]:
My outputs look like this:
---
big blank space
---
04/19/2010 07:57:15 100
04/19/2010 07:58:00 100 07:58:10 101
04/19/2010 07:59:00 100 07:59:10 101 07:59:00 102 07:59:00 103
04/19/2010 08:00:00 100 08:00:10 101
04/19/2010 08:01:00 100 08:01:10 101
04/19/2010 08:02:00 100 08:02:10 101
04/19/2010 08:03:00 100 08:03:10 101
04/19/2010 08:04:00 100 08:04:10 101 08:04:00 102
04/19/2010 08:05:00 100 08:05:10 101
04/19/2010 08:06:00 100 08:06:10 101
No matter what I put in the start conditions or the copy conditions regarding element 5 (seconds), I cannot get this data to sync on a per-second basis. Also, noticeably missing is all of my 1-second data from array 100.
* Last updated by: MichaelM on 4/20/2010 @ 7:15 AM *
--- Next post ---
Hello Michael,
I don't have any files to test it with, and I am working off-site until Thursday... but I think you need to be trying to sync on element 5, which is where the seconds are held. Syncing on element 4 (using the 4[1] in the copy field) will, I think, only sync to 1 minute. Also, you do not need the colons in the copy condition.
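In other words, for the 60-second array I would try something like:

1[101]and5[1]

rather than 1[101]and::4[1]:.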
Sorry I can't be more helpful. In the Split manual there is a phrase "Split lends itself to experimentation." I've always disliked that statement, but it is true :)
If you don't get it resolved (and can wait until Thursday or Friday), send your data files to me at dana at campbellsci dot com and I will "experiment" with them to see if I can get it working.
Regards,
Dana
--- Next post ---
Thanks Dana.
I have tried entering this in the copy field:
1[103]and5[1]
and it acts as if there were no conditional at all, except that whichever array contains the statement displays no data at all.
For example, if all start conditions are the same, the copy conditions below give these results:
1[100]and5[1] - no 1-second data displayed
1[101]and5[1] - no 60-second data displayed
1[102]and5[1] - no 5-minute data displayed
1[103]and5[1] - no 60-minute data displayed
I've tried various permutations and have discovered that adding the colons in the copy field, in certain situations, seems to help include some data. I can't find a rhyme or reason for it.
What seems like it should work is:
Start condition:
::5[01]
Copy condition:
1[103]and5[1]
but all I get is empty data.
I'll send you the files later.
--- Next post ---
Okay, after working with this on and off today, I have something that works if there is only one day's worth of data collected. However, this won't work for appended input files that span multiple days:
Start Condition:
2:3:4:5
Copy Condition:
1[100]
1[101]
1[102]
1[103]and5[1]
This gives the desired result, but the output is for all data in the input file. Since we only want to process the data collected on any given day, I would think that the start condition should be:
2:3[-0]:4:5
but curiously, this does not work. This is so frustrating.
* Last updated by: MichaelM on 4/20/2010 @ 1:15 PM *
--- Next post ---
In another update, I have now collected data for a few days and have tried, without success, to separate out one day's worth of data from the output file.
If I enter a start condition of either
2:3:4:5
or
3:4:5
and a copy condition of:
1[100]
1[101]
1[102]
1[103]and5[1]
I get a report that is synced to the second, going back to the very first record of data. If there are gaps in the data, I get large pockets of empty space in my reports. For example, if I stop logging data on day 109 at 1345 and start logging again on day 110 at 0800, I get empty data (I assume synced to the second) for all times between 1345 and 0800 the next day.
If I change any part of the start condition to any of the following:
2:3[-0]:4:5
:3[-0]:4:5
3[-0]:4:5
or any of the above with anything else in [] after element 3, such as:
3[109]
3[110]
3[-1]
I get thousands of pages of completely empty data in my generated report.
If I don't include the :4:5 in the start condition, my reports do not include a record for each second of recorded data.
I'll wait to hear back from you before I continue to play with it anymore. I will send you an updated copy of my input file so you can see what I have been playing with.
--- Next post ---
Hi Dana,
Regarding the Split function, I have tried to split hourly files from TS tables. However, during the process the timestamp is reported as bad data, resulting in an output without the timestamp column. Do you have any suggestions for how I can work around this?
Thank you!!!
--- Next post ---
Dear LATR,
On the output file tab, set the column width for the timestamp large enough to accommodate the string you are using for the date/time.
Also, you may want to search the Split help file for Bad Data. The help provides information on the different problems that would cause this message to appear.
Regards,
Dana
--- Next post ---
EDITED on 5-10-2010
The solution below DOES NOT WORK for collecting data. If you select the overwrite function in the Setup option of the LoggerNet Toolbar, data is overwritten at your collection interval; in my case, that is every second! I'm working on a new batch file that implements the ideas from below but will require user input to determine whether new data will be collected or previously collected data will be processed. In either case, the overwrite option will NOT be used; the APPEND TO FILE option must be selected.
/EDIT
For anyone who can use the answer, it turns out it is twofold.
First (in an email from Dana): Part of the problem is that the data in the 1-second file is not on an even interval (it's on a .25-second interval). Split is very literal, so it could never find a record that matched the copy condition. This is resolved by using a "slop factor" (for lack of a better term) on the 1-second file's copy condition (e.g., 1[103]and5[1,0.5]).
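To illustrate with hypothetical values: if the seconds element in the 1-second file reads 1.25, 2.25, 3.25, and so on, a literal condition of 5[1] never matches, but 5[1,0.5] accepts any seconds value within 0.5 of 1:

seconds value: 1.25
condition 5[1]:     no match (1.25 is not exactly 1)
condition 5[1,0.5]: match (|1.25 - 1| = 0.25, which is within 0.5)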
Second, I had to write a batch file and place a shortcut to it in the Windows XP 'Startup' directory to copy and protect the daily data, then select the overwrite option in the LoggerNet Setup utility in order to process only daily data. My batch file is below if anyone finds it useful.
This file could also be run from Scheduled Tasks.
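For example, a command like this (task name and script path are hypothetical, and the exact schtasks syntax varies by Windows version; XP expects HH:MM:SS) would register it to run daily:

schtasks /create /tn "WS-5 Daily Backup" /tr "C:\Campbellsci\LoggerNet\WS-5_backup.bat" /sc daily /st 00:01:00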
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
::
:: The following script is used in conjunction with the Campbell Scientific
:: Loggernet Software version 3.4.1. Its purpose is to make a copy of the latest
:: data output file from the datalogger so that daily data can be collected and
:: processed.
::
:: For this script to work as intended and to allow daily output logs to be post-
:: processed by the Split function of the Loggernet software, the following steps
:: must be taken.
::
:: 1. Open the Loggernet toolbar, located at:
:: C:\Program Files\Campbellsci\LoggerNet\ToolBar.exe
::
:: 2. Click on the 'Setup' icon on the Toolbar, and navigate to the correct
:: datalogger. In the case of Weather Station 5 (WS-5), this is a CR10X.
::
:: 3. Click on the FS Area 1 tab, and enter the following path in the Output File
:: Name block (also check the 'Enabled for Collection' checkbox):
:: C:\Campbellsci\LoggerNet\WS-5_raw_data.dat
::
:: 4. Ensure that 'Use Default File Name' is NOT checked.
::
:: 5. Select 'Overwrite Existing File' in the File Output Option dropdown box, and
:: make sure Output Format is set to 'ASCII, Comma Separated'.
::
:: 6. Click on the Apply button, and close the Setup program.
::
:: Now, we will ensure that the data collection stream is correct for the Loggernet
:: Split data post-processor.
::
:: 1. Click on the 'Split' icon on the Toolbar.
::
:: 2. Click File --> Open, and navigate to the WS-5_parameter_file.PAR located at:
:: C:\Campbellsci\SplitW\WS-5_parameter_file.PAR
::
:: 3. Click 'Open'.
::
:: 4. Verify that there are four tabs at the bottom. On each tab, click on the
:: 'Browse' button in the Input Data File section, and ensure the following
:: input data file is loaded for all four tabs:
:: C:\Campbellsci\LoggerNet\WS-5_RAW_DATA.DAT
::
:: These steps, along with the running of this script at machine startup, should
:: ensure reliable and easy collection/post-processing of data for any given day
:: on the weather station.
::
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
@echo off
setlocal
cls
echo.
:: Set default Raw Data file
set RAWDATA=WS-5_RAW_DATA.DAT
:: Set default source directory and destination directory and log file
set SRC=C:\Campbellsci\LoggerNet
set DEST=C:\Campbellsci\LoggerNet\Backup
set LOGNAME=RAW_DATA_Daily_Backup.log
set LOGFILE=%SRC%\%LOGNAME%
:: Use command-line settings if given
if not (%1)==() set RAWDATA=%1
if not (%2)==() set SRC=%2
if not (%3)==() set DEST=%3
:: Check to see if log file exists
If NOT Exist "%LOGFILE%" goto:noseparator
Echo.>>"%LOGFILE%"
Echo.==========================================>>"%LOGFILE%"
:noseparator
echo.%Date% >>"%LOGFILE%"
echo.%Time% >>"%LOGFILE%"
:: Check to see if raw data file exists
if not exist %SRC%\NUL goto E_FOLDER
if not exist %SRC%\%RAWDATA% (
    echo.%RAWDATA% does not exist>>"%LOGFILE%"
    goto EOF
)
echo.Source: %SRC%\%RAWDATA%>>"%LOGFILE%"
echo. Source Raw Data File is:
echo.
echo. [ %RAWDATA% ]
echo.
echo. Source Directory:
echo.
echo. [ %SRC%\ ]
echo.
echo. Backup Directory:
echo.
echo. [ %DEST%\ ]
echo.
:GETDATE
:: Gets the date from the first line of the data file and
:: sets the date from the data file to DATADATE
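:: e.g., a first line of "100,2010,110,0800,0,..." (values hypothetical)
:: yields DATADATE=2010-110-0800-0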
for /f "tokens=2-5 delims=,." %%D in (%SRC%\%RAWDATA%) do (
set DATADATE=%%D-%%E-%%F-%%G
goto JUMPOUT
)
echo. Houston, we have a problem!
goto EOF
:JUMPOUT
:: Out of the for loop after parsing only the first line of text
echo. Data was last recorded on %DATADATE%
echo. (DDD is Julian) YYYY-DDD-HHMM-SS
echo.
for /f "tokens=1 delims=." %%N in ("%RAWDATA%") do (
set NEWFILE=%%N-%DATADATE%.DAT
)
echo. The backup file for the current data set is:
echo.
echo. [ %NEWFILE% ]
echo.
:: Check to see if a copy of the backup data file exists
if NOT exist %DEST%\%NEWFILE% goto MAKECOPY
echo. Backup exists, copy not made...
echo.Backup: %DEST%\%NEWFILE%>>"%LOGFILE%"
goto COMPARE
:MAKECOPY
:: Make a copy of the original data file
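:: "echo.f|" answers xcopy's "File or Directory?" prompt with F (file);
:: /y suppresses the overwrite prompt, /v verifies the copy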
echo.f|xcopy %SRC%\%RAWDATA% %DEST%\%NEWFILE% /y /v>NUL
attrib +R %DEST%\%NEWFILE%
echo. File copied and set to "read only".
echo.%DEST%\%NEWFILE% created and set to read only.>>"%LOGFILE%"
echo.
:COMPARE
:: Compare the backup to the raw data file
echo. Comparing backup to Raw Data File...
echo.
:: do a file compare and ensure that the backup matches the datafile
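:: (/A shows only the first and last lines of each difference,
:: /L compares as ASCII, /W compresses white space)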
fc /A/L/W %SRC%\%RAWDATA% %DEST%\%NEWFILE%>>"%LOGFILE%"
if ERRORLEVEL 1 goto E_BACKUP
echo. Files match, launching Loggernet software...
echo.
:CLEANUP
:: Clean up, log file, and error handling
echo. Log file of this batch script saved at:
echo.
echo. [ %LOGFILE% ]
echo.
echo. Launching LOGGERNET software now...
echo.
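:: Crude ~5-second delay: ping sends 6 echoes to localhost, one per second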
ping -n 6 127.0.0.1>nul
FOR %%V IN (SRC DEST RAWDATA LOGNAME LOGFILE DATADATE NEWFILE BACKERR DATADAT2) DO SET %%V=
goto DATALOGGER
:E_FOLDER
:: Error: source folder not found
echo. %SRC% does not exist. Batch script will not run.
echo.%SRC% not found.>>"%LOGFILE%"
echo.
goto EOF
:E_BACKUP
:: Error: backup and Raw Data File do not match
for /f "tokens=2-5 delims=,." %%W in (%DEST%\%NEWFILE%) do (
set DATADAT2=%%W-%%X-%%Y-%%Z
goto EJUMPOUT
)
:EJUMPOUT
for /f "tokens=1 delims=." %%R in ("%NEWFILE%") do (
set BACKERR=%%R-%DATADAT2%.ERR
)
echo. Raw Data File and Backup File do not match.
echo. Renaming Backup File to:
echo.
echo. %BACKERR%
echo.
echo. Creating new backup for Raw Data File...
echo.
echo.%BACKERR% created.>>"%LOGFILE%"
ren %DEST%\%NEWFILE% %BACKERR%
goto GETDATE
:DATALOGGER
:: Datalogger call
start /d "C:\Program Files\Campbellsci\LoggerNet" ToolBar.exe
:EOF
:: End of file
endlocal
exit
* Last updated by: MichaelM on 5/10/2010 @ 9:35 AM *