Discussion leader(s)

  • Retief Lubbe

  • Daniel Barreto

Key Achievements

  1. Enhanced/extended industrial engagement

  2. Agreement on definitions for benchmarking, validation, verification and DEM challenges to be adopted across all Working Groups and ON-DEM deliverables.

Focus for next period

  1. Categorisation of problems into benchmarking, verification, validation and/or challenge.

  2. Precise definition of problems for external (and internal) promotion.

Input needed from other WGs

  1. Help from WG6 in dissemination to get further engagement on benchmarking, validation and verification problems.

  2. Communication from WG1 and WG5 on relevant problems which require joint work (e.g. industrially relevant problems).

  3. Details of common DEM output file format from WG3 to be adopted in benchmarking, validation and verification campaigns, where possible.

Key contributors (in alphabetical order)

  1. Karol Brzeziński

  2. Hongyang Cheng

  3. Yannick Descantes

  4. Nicolin Govender

  5. Prashant Gupta

  6. Kevin Hanley

  7. Radan Ivanov

  8. Vanessa Magnanimo

  9. Luisa Orozco

  10. Catherine O’Sullivan

  11. Marco Previtali

  12. Dingena Schott

  13. Hao Shi

  14. Paal Skjetne

  15. Anthony Thornton

  16. Deepak Tunuguntla

Brief Summary

There was extensive discussion in order to agree on a common terminology. The following definitions were agreed and are to be adopted within the ON-DEM context:

Benchmarking: Comparison between two or more codes (e.g. in terms of performance or accuracy).

Verification: Confirmation that a given code does what it is supposed to do. Verification is carried out by comparison against an analytical solution (if available) or by demonstrating the ability of the code to replicate the relevant physics.
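
As a minimal illustration of verification against an analytical solution (a hypothetical toy example, not an ON-DEM deliverable): a single particle under a linear spring contact law, integrated with velocity Verlet as commonly done in DEM codes, can be checked against the closed-form harmonic-oscillator solution. All parameter values below are illustrative.

```python
import math

def verlet_spring(x0, v0, k, m, dt, steps):
    """Velocity-Verlet integration of a linear spring force F = -k*x
    acting on a particle of mass m (a toy stand-in for a DEM contact model)."""
    x, v = x0, v0
    a = -k * x / m
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt     # position update
        a_new = -k * x / m                  # force at new position
        v += 0.5 * (a + a_new) * dt         # velocity update
        a = a_new
    return x

# Illustrative parameters: stiffness k, mass m, initial overlap x0
k, m, x0 = 100.0, 1.0, 0.01
omega = math.sqrt(k / m)
dt, steps = 1e-4, 10000                     # simulate t = 1 s

x_num = verlet_spring(x0, 0.0, k, m, dt, steps)
x_exact = x0 * math.cos(omega * dt * steps)  # analytical solution x(t) = x0*cos(w*t)

# Verification check: numerical result matches the analytical solution
# within a stated tolerance.
assert math.isclose(x_num, x_exact, rel_tol=1e-3)
```

The same pattern (run the code on a problem with a known solution, compare within an explicit tolerance) applies to any sub-module for which an analytical solution exists.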

Validation: Confirmation that a given code agrees with observed reality. Validation is usually carried out by comparison against physical experiments/measurements.

Challenge: Experimental measurements are made independently (e.g. of particle properties), and the experiment is simulated using a given DEM code. The simulation results are then independently checked against experimental results which were not known to the people running the simulations. Standards of benchmarking may change and evolve over time.

Recommendations: Develop guidelines for verifying basic sub-modules and algorithms (e.g. via unit tests), while prioritising accurate representation of the physics. Consider both performance and accuracy when evaluating results (e.g. floating-point precision).
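
A sketch of what such a sub-module unit test could look like (the `sphere_overlap` function is a hypothetical example, not from any specific code): comparisons use an explicit floating-point tolerance rather than exact equality.

```python
import math

def sphere_overlap(p1, r1, p2, r2):
    """Normal overlap between two spheres at positions p1, p2 with radii
    r1, r2; zero if the spheres are not in contact. Hypothetical DEM sub-module."""
    dist = math.dist(p1, p2)
    return max(0.0, r1 + r2 - dist)

# Unit tests against hand-computed values, with a tolerance to account
# for floating-point round-off rather than testing exact equality.
assert math.isclose(sphere_overlap((0, 0, 0), 1.0, (1.5, 0, 0), 1.0), 0.5)
assert sphere_overlap((0, 0, 0), 1.0, (3.0, 0, 0), 1.0) == 0.0  # no contact
```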

Important Links

Relevant papers to be added here.