...
Brief Summary | Extensive discussion to agree on a common terminology. Agreement reached, to be adopted within the ON-DEM context as follows. Benchmarking: comparison between two or more codes (e.g., in terms of performance, accuracy). Verification: confirmation that a given code does what it is supposed to do; verification is provided by means of an analytical solution (if available) or by the code's ability to replicate the relevant physics. Validation: confirmation that a given code agrees with observed reality; validation is usually provided by comparison against physical experiments/measurements. Challenge: experimental measurements are made independently (e.g., of particle properties), and these are simulated using a given DEM code; the simulation results are then independently verified against the experimental results, which were not known to the people running the simulations. Standards of benchmarking may change and evolve over time. Recommendations: develop guidelines for verifying basic sub-modules and algorithms (e.g., unit tests), while prioritizing accurate physics representation; consider both performance and accuracy when evaluating results (e.g., floating-point precision). |
---|---|
Important Links | Relevant papers to be added here. |
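The recommendation to verify basic sub-modules via unit tests against analytical solutions can be sketched as follows. This is a minimal, hypothetical example (the function name `simulate_free_fall` and the semi-implicit Euler integrator are illustrative assumptions, not part of any specific DEM code): a single particle in free fall is integrated numerically and the result is checked against the closed-form solution z(t) = z0 − g·t²/2, with a tolerance chosen to cover the integrator's truncation error.

```python
def simulate_free_fall(z0, v0, g, dt, steps):
    """Hypothetical stand-in for a DEM time-integration sub-module:
    semi-implicit (symplectic) Euler integration of one particle
    falling under constant gravity, no contacts."""
    z, v = z0, v0
    for _ in range(steps):
        v -= g * dt   # update velocity first (semi-implicit Euler)
        z += v * dt   # then advance position with the new velocity
    return z

def test_free_fall_matches_analytical():
    g, dt, steps = 9.81, 1e-4, 10_000   # integrates up to t = 1.0 s
    t = dt * steps
    z_num = simulate_free_fall(10.0, 0.0, g, dt, steps)
    z_ana = 10.0 - 0.5 * g * t**2       # analytical solution
    # Verification check: numerical result must agree with the
    # analytical solution to within the O(dt) truncation error
    # of the integrator (here about g*dt*t/2 ~ 5e-4 m).
    assert abs(z_num - z_ana) < 1e-2, (z_num, z_ana)

test_free_fall_matches_analytical()
```

Tying the tolerance to the integrator's known error order, rather than picking an arbitrary epsilon, also addresses the accuracy recommendation above: a tighter-than-justified tolerance makes the test flaky across floating-point environments, while a looser one hides real regressions.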