Benchmarking problems for DEM on 5/9/2024

Discussion leader(s)

  • Retief Lubbe

  • Daniel Barreto

Key Achievements

  1. Enhanced/extended industrial engagement

  2. Agreement on definitions for benchmarking, validation, verification and DEM challenges to be adopted across all Working Groups and ON-DEM deliverables.

Focus for next period

  1. Categorisation of problems into benchmarking, verification, validation and/or challenge.

  2. Precise definition of problems for external (and internal) promotion.

Brief Summary

Extensive discussion took place in order to agree on a common terminology. The following definitions were agreed and will be adopted within the ON-DEM context:

Benchmarking: Comparison between two or more codes, e.g. in terms of performance or accuracy.

Verification: Confirmation that a given code does what it is supposed to do. Verification is provided by means of an analytical solution (if available) or by demonstrating the code's ability to replicate the relevant physics.

Validation: Confirmation that a given code agrees with observed reality. Validation is usually provided by comparison against physical experiments/measurements.

Challenge: An exercise in which experimental measurements (e.g. particle properties) are made independently and then simulated using a given DEM code. The simulation results are subsequently verified against experimental results that were not known to those running the simulations. Standards of benchmarking may change and evolve over time.

Recommendations: Develop guidelines for verifying basic sub-modules and algorithms (e.g. via unit tests), while prioritising accurate representation of the physics. Consider both performance and accuracy when evaluating results (e.g. floating-point precision); a minimal sketch of such a unit test is given below.
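As an illustration only (not part of the meeting discussion), the sketch below shows what a verification unit test of a basic sub-module could look like in Python. All names and numerical values are hypothetical: a semi-implicit Euler integrator for a single particle in free fall is checked against the analytical solution, with a tolerance chosen to match the first-order accuracy of the scheme rather than machine precision.

import unittest

GRAVITY = -9.81  # gravitational acceleration, m/s^2 (hypothetical constant for this sketch)

def integrate_free_fall(z0, v0, dt, n_steps):
    # Semi-implicit (symplectic) Euler: update velocity first, then position.
    z, v = z0, v0
    for _ in range(n_steps):
        v += GRAVITY * dt
        z += v * dt
    return z

def analytical_free_fall(z0, v0, t):
    # Closed-form solution: z(t) = z0 + v0*t + 0.5*g*t^2
    return z0 + v0 * t + 0.5 * GRAVITY * t * t

class TestFreeFallIntegrator(unittest.TestCase):
    def test_matches_analytical_solution(self):
        z0, v0, dt, n_steps = 10.0, 0.0, 1.0e-4, 10_000
        z_num = integrate_free_fall(z0, v0, dt, n_steps)
        z_ref = analytical_free_fall(z0, v0, dt * n_steps)
        # Tolerance reflects the first-order accuracy of the scheme,
        # not machine (floating-point) precision.
        self.assertAlmostEqual(z_num, z_ref, delta=1.0e-2)

if __name__ == "__main__":
    unittest.main()

Note the deliberate distinction in the tolerance: as recommended above, the acceptance criterion separates the accuracy of the numerical scheme from floating-point precision, so the test documents which of the two it is actually checking.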

 

Important Links

Relevant papers to be added here.

Related content

  • WG4 meeting on 5/9/2024

  • CHoPS (satellite) ON-DEM Working Group Meeting

  • WG4: Review of meeting in Helsinki, Finland on 16th May 2024

  • Outreach: Communication, dissemination and valorisation on 5/9/2024

  • WG1 Meeting/Hackathon Plan and Outcomes (30th Jan, Thursday 1400 - 1730)

  • WG4: Review of Workgroup Meeting at Vilnius on 1st August 2024