
Update NDAttribute datasets to increase in dimension as required (Fix #74) #81

Merged: ulrikpedersen merged 1 commit into hdf5-issues from issue-74 on Mar 12, 2015
Conversation

@ajgdls (Contributor) commented Mar 11, 2015

Introduced a new parameter, NDAttributeChunk, for setting chunking. Updated
NDAttribute datasets to use an unlimited maximum length and then increase the
size as required. If the user stops the collection early, only the
collected data is stored in the datasets.
This resolves issue #74.


@ulrikpedersen (Member) commented:

I've tested this and it works for me.

Note that the default value of NDAttributeChunk=0 is probably the best option for the vast majority of use-cases; the exception is writing really huge datasets (100,000+ frames). Because of that, I think it is OK not to add this parameter to the screens: if someone sets a small value accidentally, or without understanding the parameter, the performance of writing the attribute datasets may suffer significantly.
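The pattern the PR describes can be sketched in a few lines. This is not the PR's actual C++ code; it is a minimal illustration using h5py, with a hypothetical chunk value standing in for the NDAttributeChunk parameter: create an attribute dataset with an unlimited first dimension and explicit chunking, grow it as each frame's value arrives, and note that an early stop leaves the dataset holding exactly the collected values.

```python
# Sketch (assumes h5py; not the plugin's real implementation) of the
# unlimited-dimension approach: start empty, extend as data arrives.
import h5py
import numpy as np

CHUNK = 1024  # stand-in for the NDAttributeChunk parameter (hypothetical value)

with h5py.File("attrs.h5", "w") as f:
    dset = f.create_dataset(
        "temperature",         # hypothetical NDAttribute name
        shape=(0,),            # start with zero elements
        maxshape=(None,),      # unlimited maximum length
        chunks=(CHUNK,),       # chunked storage is required for resizable datasets
        dtype="f8",
    )
    n_written = 0
    for value in np.linspace(0.0, 1.0, 10):  # pretend 10 frames arrive
        dset.resize((n_written + 1,))        # grow the dataset as required
        dset[n_written] = value
        n_written += 1
    # If acquisition stops early, the dataset already contains exactly
    # n_written elements -- no trailing fill values to trim afterwards.

with h5py.File("attrs.h5", "r") as f:
    print(f["temperature"].shape)  # (10,)
```

Because HDF5 allocates storage per chunk, a very small chunk size forces many small writes and B-tree updates, which is why a carelessly chosen NDAttributeChunk value can hurt write performance as described above.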

ulrikpedersen added a commit that referenced this pull request Mar 12, 2015
Update NDAttribute datasets to increase in dimension as required Fix #74
@ulrikpedersen ulrikpedersen merged commit dfb40e6 into hdf5-issues Mar 12, 2015
@ulrikpedersen ulrikpedersen deleted the issue-74 branch May 1, 2015 14:01