TurTLE · Issue #17

Closed · Created Aug 18, 2017 by Cristian Lalescu (@clalescu), Maintainer
Optimize sampling HDF5 access

Each call to sample_from_particles_system opens the particle file, writes, and then closes it again. As far as I know, closing the file flushes the buffer, so we lose any optimization HDF5 can apply to small data writes.
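
A minimal sketch of the alternative, assuming a hypothetical sample_writer wrapper around the HDF5 C API (this is not the actual TurTLE interface): open the particle file once, reuse the open handle for all sampling calls, and flush/close only after the sampling loop is done.

```cpp
// Hypothetical sketch: keep the particle file open across many small
// sample writes instead of opening and closing it on every call.
#include <hdf5.h>
#include <string>

class sample_writer
{
    private:
        hid_t file_id;
    public:
        explicit sample_writer(const std::string &file_name)
        {
            // open once; all subsequent writes reuse this handle
            this->file_id = H5Fopen(
                    file_name.c_str(),
                    H5F_ACC_RDWR,
                    H5P_DEFAULT);
        }
        hid_t handle() const
        {
            return this->file_id;
        }
        ~sample_writer()
        {
            // single flush + close at the end of the sampling loop,
            // so HDF5 is free to buffer the small writes in between
            H5Fflush(this->file_id, H5F_SCOPE_GLOBAL);
            H5Fclose(this->file_id);
        }
};
```

sample_from_particles_system could then take the already open hid_t instead of a file name, so that a loop of repeated sampling calls shares a single handle and HDF5 can buffer the small writes.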

Related: see https://gitlab.mpcdf.mpg.de/clalescu/bfps_addons/blob/feature/new-multiscale-particles/bfps_addons/cpp/full_code/multi_scale_particles.cpp. That file shows the common usage pattern for the sampling functionality, which is what needs to be optimized.

All of this applies to sample_particle_system_position as well, obviously.
