feature: raw_timestamps support for writer and defragment #320
Conversation
resolves #319
Codecov Report

All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##           master     #320      +/-   ##
==========================================
+ Coverage   96.90%   96.92%   +0.01%
==========================================
  Files          19       19
  Lines        2621     2632      +11
==========================================
+ Hits         2540     2551      +11
  Misses         81       81
```

☔ View full report in Codecov by Sentry.
Thanks for making a PR for this @jimkring! The approach looks good, but I've got a few suggestions.
nptdms/test/test_example_files.py (outdated)
"""Test defragmenting a file with raw timestamps""" | ||
test_file_path = DATA_DIR + '/raw_timestamps.tdms' | ||
output_file = BytesIO() | ||
TdmsWriter.defragment(test_file_path, output_file, raw_timestamps=True) |
It would be good to have a check that, when you read the defragmented file back in, there hasn't been any loss of precision in the timestamp properties.
@adamreeve great idea. I've added assertions that channel.data and channel.time_track() are the same in both the input and output data (using np.testing.assert_equal). I'm not sure if this is checking for loss of precision of the timestamps and could use your input on that.
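Roughly, the checks described above amount to something like the following sketch (the file names and the read-back step here are illustrative, not the exact test code):

```python
import numpy as np
from nptdms import TdmsFile

# Illustrative paths; the real test uses DATA_DIR and a BytesIO output buffer
input_tdms = TdmsFile.read('raw_timestamps.tdms')
output_tdms = TdmsFile.read('defragmented_output.tdms')

for group in input_tdms.groups():
    for channel in group.channels():
        output_channel = output_tdms[group.name][channel.name]
        # Compare the raw channel data and the derived time track of each channel
        np.testing.assert_equal(output_channel.data, channel.data)
        np.testing.assert_equal(output_channel.time_track(), channel.time_track())
```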
Yeah, I don't think that's enough, as the time track will be cast down to a numpy datetime64. To check that this test is actually testing the fix, it should fail if we remove raw_timestamps=True from TdmsWriter.defragment.

I think what you'll want to do is pass raw_timestamps=True when opening both output_tdms and input_tdms, and then add a check like:
```python
# verify the written timestamp matches the input
output_channel = output_tdms[group.name][channel.name]
assert output_channel.properties['wf_start_time'] == channel.properties['wf_start_time']
```
To make that work correctly you'll also need to implement __eq__ for TdmsTimestamp.
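For what it's worth, a minimal sketch of such an __eq__, assuming TdmsTimestamp keeps its raw seconds and second_fractions fields (the actual class has more to it):

```python
class TdmsTimestamp:
    # ... existing fields: seconds (int64) and second_fractions (uint64) ...

    def __eq__(self, other):
        # Raw timestamps compare equal only when both raw fields match exactly,
        # so the comparison itself cannot lose precision
        return (
            isinstance(other, TdmsTimestamp)
            and self.seconds == other.seconds
            and self.second_fractions == other.second_fractions
        )
```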
@adamreeve BTW, I also just refactored test_defragment to test more example files. In doing so, I noticed that one of the example files raises an error due to scalar channel data. However, that error should probably be addressed in a separate issue and PR.
- added a tdms_files_assert_equal() support function that compares two TdmsFile instances (a sketch of such a helper is shown below)
- parametrized test_defragment() with multiple example files, for more test coverage
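A sketch of what such a comparison helper could look like (the function name comes from the commit message above; the body is an assumption, not the exact PR code):

```python
import numpy as np

def tdms_files_assert_equal(input_tdms, output_tdms):
    """Assert that two TdmsFile instances contain the same channel data and properties."""
    for group in input_tdms.groups():
        for channel in group.channels():
            output_channel = output_tdms[group.name][channel.name]
            np.testing.assert_equal(output_channel.data, channel.data)
            for name, value in channel.properties.items():
                # With raw_timestamps=True this relies on TdmsTimestamp.__eq__
                assert output_channel.properties[name] == value
```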
```python
@pytest.mark.parametrize("tdms_file", [
    'raw_timestamps.tdms',  # Test defragmentation of a file with raw timestamps
    # 'raw1.tdms',  # <- cannot defragment this file (ValueError: Channel data must be a 1d array)
```
@adamreeve note that trying to defragment this file fails due to it having scalar channel.data.
Right, yes, we don't yet support writing or defragmenting files with DaqMxRawData.
This looks good now, thanks for fixing this @jimkring
Thanks @adamreeve. Some of my teammates were interested in using the new version, once it's ready. Will there be a 1.7.2 release?
Sure, I've released version 1.8.0 with these changes (new features should increment the minor version according to semver).
This PR adds support for writing raw timestamps (thus preserving their full precision) when defragmenting a TDMS file, for example one created with LabVIEW.
Notable changes:
- The TdmsTimestamp type now behaves like TdmsType when writing bytes to file:
  - enum_value=0x44 class attribute
  - bytes() property that returns the struct.pack'ed data: bytes = _struct_pack('<Qq', second_fractions, seconds)
- nptdms.writer now handles the TdmsTimestamp type (in the _to_tdms_value() function)
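Finally, a minimal usage sketch of the new option (the input file name here is an assumption; reading back with raw_timestamps=True yields TdmsTimestamp values instead of numpy datetime64):

```python
from io import BytesIO
from nptdms import TdmsFile, TdmsWriter

output_file = BytesIO()
# Write the defragmented file with full-precision raw timestamps
TdmsWriter.defragment('raw_timestamps.tdms', output_file, raw_timestamps=True)

output_file.seek(0)
# Read back with raw timestamps to compare full-precision timestamp properties
defragmented = TdmsFile.read(output_file, raw_timestamps=True)
```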