
file format convert to split files by month


tjwilli_58 Apr 3, 2017 02:37 PM

Hi,

I found some posts about using Split and File Format Convert to split files. I'd like to be able to split CR3000 *.dat files by month. I'm currently doing this in Python with the datetime and csv modules, outputting to CSV.
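Roughly, my script does something like this (a simplified sketch; the real file name, header length, and timestamp format may differ from your data):

```python
# Split a CR3000 TOA5-style .dat file into one CSV per month,
# repeating the header block in every output file.
# Assumes a 4-line TOA5 header and timestamps like
# "2017-03-01 00:05:00" in the first column (illustrative).
import csv
from datetime import datetime

def split_by_month(path, header_lines=4):
    headers = []      # the TOA5 header rows, copied into every output
    writers = {}      # (year, month) -> (file handle, csv writer)
    with open(path, newline="") as f:
        reader = csv.reader(f)
        for i, row in enumerate(reader):
            if i < header_lines:
                headers.append(row)
                continue
            ts = datetime.strptime(row[0], "%Y-%m-%d %H:%M:%S")
            key = (ts.year, ts.month)
            if key not in writers:
                out = open("data_%04d-%02d.csv" % key, "w", newline="")
                w = csv.writer(out)
                w.writerows(headers)   # keep the headers in each file
                writers[key] = (out, w)
            writers[key][1].writerow(row)
    for out, _ in writers.values():
        out.close()
```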

I figured out how to use Split to get the timestamp range right, e.g.:

Start Condition = 1:[2017]:1[3%1]::1

Stop Condition = 1:[2017]:1[4%1]::1

to get March 2017. The problem I have with this is that it's very slow (my data is logged every 5 minutes), and the headers are thrown away.

It seems that File Format Convert would be the way to go, but not all months have the same number of days. It does seem to be faster than Split.

Is there a way to do this with FFC, or should I just stick with my python script?

Thanks.

 


Dana Apr 3, 2017 11:29 PM

There are a couple of options you can consider. 

  • Make sure that on the Output File tab for Split's PAR file, the "Screen Display" checkbox is cleared. Split runs faster if it doesn't have to display the lines of data scrolling by.
  • Try processing the PAR file using the Splitr.exe (runtime version of Split). You can launch it from a Task within LoggerNet's Task Master. The runtime might be a little faster. Make sure to use the /r switch to close the run-time or a bunch of run-times will stack up and eventually cause memory issues on the PC. There are other command line switches you might want to review in the Split help file. 
  • Split has an option under the Offsets/Options button on the Input File tab to start at "last count". Split basically seeks into the file to the point where it last left off, and then begins processing the file. The problem with this is that you can only start processing on new data, which means it wouldn't work for your purposes if you ran the process more than once per month.
  • An alternative to Split might be a Calendar-based Task in LoggerNet's Task Master that processes the file using the "32-last" selection under Days of Month. This runs a task on the last day of every month, regardless of whether that day is the 28th, 30th, or 31st. If the only thing you are trying to accomplish with Split is getting "month-sized" files, you could forgo Split entirely and use this interval to run a batch file at the end of every month that moves the file to a new directory or renames it. At the next data collection, LoggerNet will simply create a new file with any new data (it won't go back and collect prior data; it doesn't care that the file is "missing", as it knows that data collection was successful). This would also resolve the issue with losing headers that you have in Split.
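Since you're already comfortable with Python, the month-end move/rename step could be a small script launched by the Task Master instead of a batch file. A rough sketch (the path and naming scheme here are just examples):

```python
# Month-end file rotation: move the current .dat file to an archive
# name stamped with the month that just ended. Intended to be run by
# a LoggerNet Task Master task on the "32-last" day of the month;
# LoggerNet recreates the source file at the next data collection.
import os
import shutil
from datetime import date

def rotate_monthly(src):
    today = date.today()  # last day of the month when the task fires
    base, ext = os.path.splitext(src)
    dst = "%s_%04d-%02d%s" % (base, today.year, today.month, ext)
    shutil.move(src, dst)
    return dst
```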

I hope one of these will help!

Best, Dana


tjwilli_58 Apr 4, 2017 12:34 PM

Dana,

Thanks for the suggestions. I think the last suggestion, using the Task Master, sounds like the best option for me, since I want to keep the headers in each file. Now, if I'm collecting data every 5 minutes, should I schedule this at, say, 2357 at the end of each month, so I get the last collection of the month at 2355?

I'll also test out the other options, but I'm definitely leaning toward the last one.

Regards, Tim


Dana Apr 4, 2017 03:08 PM

Yes, I think collecting at some point between 2355 and the top of the hour should work (I also think this is your best option if the only reason you were using Split was to get a monthly file). Keep in mind that the downside to this option is that if you have a failed data collection attempt, you could end up with two files that have more or less data than you expect. You'll still end up with all the data, but you might have to occasionally edit the files to get the right month into the right file.

Unless you have a problematic communication link, however, the likelihood of this happening is fairly low.

Dana
