Master's Thesis - discussion -

This blog is for thoughts and discussions on my Master's Thesis.

Saturday, June 02, 2007

Results for «Verification and Validation in Scientific Simulations»

The aggregated results of the survey can be found here.
18 people filled in the questionnaire.
Thanks to everyone who took part.

Monday, May 21, 2007

16th Inter-Institute Seminar for Young Researchers from the Technical Universities of Budapest, Cracow, and Vienna, May 17-20, 2007, Vienna, Austria

Detailed information on the seminar can be found here.

My presentation: "The Survey of Verification and Validation Processes for Large Scale Simulations in Computational Mechanics".

ABSTRACT

KEYWORDS: computational mechanics, simulation, verification, validation, benchmark database

Verification and validation (V&V) of simulation models, codes, and solutions are prerequisites for the acceptance of scientific simulations.
These issues were first studied in the scientific literature [1, 2, 3, 4, 5, 6] and discussed with engineers. A survey was then conducted to examine how the situation really looks in practice.

The presentation shows the current state of V&V activities in the area of computational mechanics. The different approaches to model validation and code verification are presented. Various V&V techniques are defined and data validity is discussed. The presentation also attempts to answer the question: "How can the processes of V&V be improved and made easier?"

One of the conclusions is that there is a strong need for benchmark databases to help scientists verify and validate their numerical models, software, and simulation results.


REFERENCES
[1] W.L. Oberkampf, T.G. Trucano, and C. Hirsch. Verification, Validation, and Predictive Capability in Computational Engineering and Physics. Technical report SAND2003-3769, Sandia National Laboratories, 2003.
[2] W.L. Oberkampf and T.G. Trucano. Verification and Validation Benchmarks. Technical report SAND2007-0853, Sandia National Laboratories, USA, 2007.
[3] L. Hatton. The T-experiment: Errors in Scientific Software. Technical report, Oakwood Computing, 1997.
[4] D.E. Stevenson. A Critical Look at Design, Verification, and Validation of Large Scale Simulations. Clemson University, SC 29634-1906, 2001.
[5] G. Love and G. Back. Model Verification and Validation for Rapidly Developed Simulation Models: Balancing Cost and Theory. Project Performance Corporation. In Proceedings of the 18th International Conference of the System Dynamics Society, Norway, 2000.
[6] R.G. Sargent. Verification and Validation of Simulation Models. Simulation Research Group, Syracuse University. In Proceedings of the Winter Simulation Conference, 1998.

Saturday, May 12, 2007

The Survey of Verification and Validation Activities in Scientific Simulations

Verification and validation (V&V) issues in scientific work in the area of computational mechanics are an important part of this master's thesis discussion. To explore these problems, a survey has been carried out. The questionnaire can be found here (active until the end of May '07).

Although the questionnaire focuses on the area of computational mechanics, it is not limited to this group only. Similar processes, activities, and problems occur in other branches of scientific computer simulation.

So if you are involved in scientific simulations, please take part in this survey and fill in the form.

The aggregated results of the survey will be available on this blog by the end of May '07. Conclusions will be published in June '07.

Friday, January 19, 2007

Architecture of the system

This is a proposed deployment model.

What can a Benchmark Client be?
  • Web viewer (basic client)
  • Console (optional)
  • Custom GUI (optional)
What does the Benchmark Server do?

It is the engine (a set of scripts) that enables the user to add, edit, and search for benchmarks, and to upload/download data.

The Benchmark Server connects to the Benchmark Database.
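To make the server's role more concrete, here is a minimal sketch of how such a script engine could serve a search request against the database; the handler class, the /search route, and the one-directory-per-benchmark storage layout are my illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch of a Benchmark Server script: answers /search?q=<keyword>
# by matching benchmark directory names (assumed storage layout).
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

BENCHMARK_ROOT = "benchmarks"  # hypothetical directory, one subdirectory per benchmark

class BenchmarkServer(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path == "/search":
            # Match benchmark names against the ?q= keyword.
            query = parse_qs(url.query).get("q", [""])[0].lower()
            hits = [name for name in sorted(os.listdir(BENCHMARK_ROOT))
                    if query in name.lower()]
            body = json.dumps(hits).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)  # adding/editing/uploading would go here
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), BenchmarkServer).serve_forever()
```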

What does the Benchmark Database contain?

It stores each benchmark's data: an XML file that describes the benchmark, together with the other files connected with it (e.g. documents, figures, tables, images).
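For illustration, such an XML descriptor might look like the sketch below, read here with a short Python script; all element and attribute names are my assumptions, not a fixed schema from the thesis.

```python
# Sketch of a hypothetical benchmark descriptor and how a client could read it.
import xml.etree.ElementTree as ET

DESCRIPTOR = """\
<benchmark id="williams-frame-001">
  <title>Williams frame, geometrically non-linear FEM analysis</title>
  <field>computational mechanics</field>
  <input file="williams_frame_input.dat"/>
  <output file="reference_solution.dat"/>
  <attachment file="description.pdf"/>
</benchmark>
"""

root = ET.fromstring(DESCRIPTOR)
print(root.get("id"), "-", root.findtext("title"))
for node in root:
    if "file" in node.attrib:  # list the files attached to this benchmark
        print(f"  {node.tag}: {node.attrib['file']}")
```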

Sunday, May 21, 2006

Case - the example usage of the system

An engineer has written a program for the analysis of a Williams frame (a geometrically non-linear problem with large deflections, solved by FEM with an incremental-iterative procedure). He would now like to know whether his program computes correctly, and he wants to test it on a few data sets for which solutions to this problem are known.

This is where a database of benchmarks can be helpful. The engineer enters the web address of the database in the viewer.

Next he searches for the demonstration tests he needs, for which the solution and a description of the process are known. He downloads the required data and tests his program with it. By comparing his program's results with the benchmark results, he can judge whether his program works correctly (a sketch of this comparison step follows the list below).
  • If he gets, e.g., a different solution, he should verify his program/model and/or test it with more benchmark input data
  • If he gets, e.g., better results than the given benchmark's output data, he can add his own benchmark to the database
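As an illustration of the comparison step, here is a minimal sketch; the file names, the one-value-per-line file format, and the tolerance are my assumptions for illustration, not part of the benchmark format itself.

```python
# Minimal sketch: compare the engineer's results against a benchmark's
# reference solution, value by value, within a relative tolerance.
def read_values(path):
    """Read one floating-point value per non-empty line (assumed format)."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def matches_benchmark(own_path, reference_path, rel_tol=1e-3):
    own = read_values(own_path)        # results from the engineer's program
    ref = read_values(reference_path)  # reference output downloaded with the benchmark
    if len(own) != len(ref):
        return False
    return all(abs(a - b) <= rel_tol * max(abs(b), 1.0)
               for a, b in zip(own, ref))

# Hypothetical usage with files downloaded from the benchmark database:
# print(matches_benchmark("my_results.dat", "reference_solution.dat"))
```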
The concept of a benchmark database is, in my opinion, all the more interesting because it allows one's own programs to be tested against demonstration tests obtained from commercial programs. This solution indirectly improves the quality of programs written for individual, institutional, or commercial needs.

Detailed scope of my master's thesis

  1. Classification of data used in computer simulations
  2. Standardization of the benchmark format
  3. User interfaces for adding, modifying, searching for, and downloading benchmarks
  4. Structure of the database
  5. Presentations of benchmarks (generating reports)
  6. Rankings
  7. Example usage of the benchmarks' database
  8. Perspectives of improvement and development for the system
  9. The diploma paper includes issues such as:
    1. The state of scientific computer simulations and their problems (accuracy, errors, modelling, quality of applications, ...)
    2. Verification and validation (V&V) activities and problems
    3. The benchmark database: reasons for such a project and the help it offers
    4. Full description of the project: architecture, implementation, usage.

Topic of my Master's Thesis


Institute for Computational Mechanics
Civil Engineering Faculty
Cracow University of Technology


Supervisors:
dr hab. inż. Maria Radwańska prof. PK

mgr inż. Roman Putanowicz


Institute for Mechanics of Materials and Structures
Civil Engineering Faculty
Vienna University of Technology


Supervisor:
Univ.-Prof. Dipl.-Ing. Dr. techn. Josef Eberhardsteiner

TOPIC:

Design and implementation of a benchmark database for Computational Mechanics.