4 How can instructional simulations be used in meteorological training?

Thomas and Milligan (2004) claim that it is important for students to engage with the underlying simulation model, not just with the user interface. As Davies (2002) points out, interactivity is not synonymous with engagement. Pilkington and Parker-Brown (1996) noted the tendency for students to concentrate on manipulating objects without developing a deeper understanding of the model or the principles behind the observed behaviour. Laurillard (1993) draws a distinction between qualitative reasoning, which incorporates knowledge of real-world objects, and quantitative reasoning, which refers only to quantities and processes explicitly presented on the screen, and suggests that the more interpretive approach of qualitative reasoning must be encouraged. One method of doing this is to allow students to construct the models themselves. (...)

Alessi (2000) is of the opinion that for most educational purposes, such models should be enhanced with an instructional overlay. Alessi also points out that, depending on the instructional paradigm adopted, it may be advantageous to expose the model (glass-box), e.g. in expository learning, or to hide the model (black-box), e.g. in discovery learning, where students must discover it for themselves. Thus, educators may wish to control the degree of access the learner has to the internal model.
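To make the glass-box/black-box distinction concrete, here is a minimal, hypothetical Python sketch (not from Alessi or the cited paper). It wraps a simple Newton's-law-of-cooling model with a flag that controls whether the learner sees the governing equation (glass-box) or only its outputs (black-box); the class name, parameters, and values are illustrative assumptions.

import math

class CoolingSimulation:
    """Newton's law of cooling: T(t) = T_env + (T0 - T_env) * exp(-k * t)."""

    def __init__(self, t_env=20.0, t0=90.0, k=0.1, expose_model=False):
        self.t_env, self.t0, self.k = t_env, t0, k
        self.expose_model = expose_model  # glass-box if True, black-box if False

    def temperature(self, t):
        """Return the temperature (°C) after t minutes."""
        return self.t_env + (self.t0 - self.t_env) * math.exp(-self.k * t)

    def describe(self):
        if self.expose_model:  # glass-box: reveal the governing equation
            return (f"T(t) = {self.t_env} + ({self.t0} - {self.t_env})"
                    f" * exp(-{self.k} * t)")
        return "Internal model hidden: vary t and observe the output."  # black-box

# Expository use: expose the model. Discovery use: hide it.
sim = CoolingSimulation(expose_model=True)
print(sim.describe())
print(f"T(10) = {sim.temperature(10):.1f} °C")

In a real training tool, the same switch could show or hide a model-description panel, letting the trainer match the simulation to an expository or a discovery approach.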

Source: Putting Teachers in the Loop: Tools for Creating and Customising Simulations by Ruth Thomas & Colin Milligan in Journal of Interactive Media in Education (2004).
http://jime.open.ac.uk/articles/10.5334/2004-15/



[Video: "Scientist Examines Tornado Vortex", posted by NASA/Marshall Space Flight Center]

In the video above, a scientist examines what appears to be a tornado vortex (blue) emerging from a simulated thunderstorm. The scientist wears 3D glasses to see, in three dimensions, the different flows feeding into the vortex. In meteorological training, then, simulations can be used for observation and examination: this type of simulation helps to demonstrate phenomena and to illustrate what happens when parameters are changed.

However, if a simulation is interactive, users can be asked to participate in it, manipulate parameters, and observe the simulated effects. The three examples below demonstrate this type of simulation.

Example 1
Test the climate simulator: play the role of a world leader and use this tool to see how different emission levels affect global temperature (a toy sketch of this kind of model follows the link below).
http://www.oercommons.org/courses/climate-simulator/view
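The linked simulator has its own internal model; purely as a toy illustration of the kind of relationship such tools explore, the sketch below uses the common logarithmic approximation for CO2-driven warming, with an assumed climate sensitivity of about 3 °C per doubling of CO2 (values of roughly 2 to 4.5 °C per doubling are commonly cited).

import math

def warming(co2_ppm, co2_preindustrial=280.0, sensitivity=3.0):
    """Equilibrium warming (°C) estimated as sensitivity * log2(C / C0).

    sensitivity is the assumed warming per doubling of CO2; ~3 °C sits
    near the middle of commonly cited ranges.
    """
    return sensitivity * math.log2(co2_ppm / co2_preindustrial)

# Try different emission outcomes, as a world leader in the simulator might.
for level in (350, 420, 560, 700):
    print(f"CO2 = {level} ppm -> about {warming(level):+.1f} °C vs pre-industrial")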

Example 2
A volcano simulator helps learners understand the mechanism of magma eruption. Change the amount of SiO2 and simulate the eruption (a rough sketch of the underlying rule of thumb follows the link below).
http://www.alaskamuseum.org/education/volcano
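Again, the simulator has its own model; the sketch below only encodes the standard geological rule of thumb that higher silica (SiO2) content makes magma more viscous and eruptions more explosive. The classification thresholds are approximate.

def eruption_style(sio2_percent):
    """Classify magma behaviour by silica content (thresholds approximate)."""
    if sio2_percent < 52:
        return "basaltic: low viscosity, effusive lava flows"
    elif sio2_percent < 63:
        return "andesitic: moderate viscosity, mixed behaviour"
    else:
        return "rhyolitic/dacitic: high viscosity, explosive eruptions"

# Sweep the SiO2 parameter, as a learner would in the simulator.
for sio2 in (48, 57, 70):
    print(f"SiO2 = {sio2}% -> {eruption_style(sio2)}")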

Example 3
Learn about the blackbody spectrum of the Sun, a light bulb, an oven, and the Earth. Adjust the temperature to see the wavelength and intensity of the spectrum change. View the color of the peak of the spectral curve (a sketch of the underlying physics follows below).
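The physics behind this simulation is well established, independently of how PhET implements it: Planck's law gives the spectral radiance at each wavelength, and Wien's displacement law, lambda_max = b / T, locates the peak. A minimal sketch, using representative temperatures for the four objects:

import math

H = 6.626e-34      # Planck constant (J s)
C = 2.998e8        # speed of light (m/s)
KB = 1.381e-23     # Boltzmann constant (J/K)
WIEN_B = 2.898e-3  # Wien displacement constant (m K)

def planck(wavelength_m, temp_k):
    """Planck's law: spectral radiance B(lambda, T) in W sr^-1 m^-3."""
    return (2 * H * C ** 2 / wavelength_m ** 5) / (
        math.exp(H * C / (wavelength_m * KB * temp_k)) - 1)

def peak_wavelength_m(temp_k):
    """Wien's displacement law: lambda_max = b / T."""
    return WIEN_B / temp_k

# Adjust the temperature, as in the simulation, and watch the peak shift.
for name, t in (("Sun", 5778), ("light bulb", 3000), ("oven", 450), ("Earth", 288)):
    lam = peak_wavelength_m(t)
    print(f"{name}: T = {t} K, peak at ~{lam * 1e9:.0f} nm, "
          f"B(peak) ~ {planck(lam, t):.2e} W sr^-1 m^-3")

The Sun's peak lands near 500 nm (visible light), while the Earth's lands near 10000 nm (thermal infrared), which is exactly the shift the simulation makes visible.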


See other simulations shared through the PhET project of the University of Colorado:
http://phet.colorado.edu/en/simulations/category/new

Some simulations take the participatory element one step further by engaging participants in decision making. This type of simulation can be useful for practicing the forecasting process, issuing warnings, and communicating with customers. Such simulations can be used either for practice or as part of an assessment.

One powerful use of simulations, though less often applied in meteorological training, is to involve students in building and disassembling them. Trainers often prepare a ready-made interactive product in which the underlying simulation model is hidden (black-box). Nevertheless, providing opportunities to disassemble an existing simulation, or to produce a new one together with participants, can be an effective and engaging exercise.

Current technology allows simulations to come closer to reality than ever before. It is therefore important to remember that a simulation is not reality: it only imitates it. This is why basing assessment exclusively on simulations, without assessing actual job performance, is not enough. Still, let us consider for a moment how simulations can be useful as part of assessment.
Assessment and simulations

According to Quinn (2013), writing good assessments is hard. Principles of good practice include meaningful decisions, alternatives that represent reliable misconceptions, relevant contexts, believable dialog, and more. Assessments must be aligned to the objectives, and ideally have an increasing level of challenge.
There are some technical issues as well. High-value extensions include problem generators and randomness in the order of options (challenging attempts to ‘game’ the assessment). A greater variety of response options for novelty isn't bad either, and automarking is desirable for at least a subset of assessment. (...)

While SMEs can write content and even examples (if they get pedagogical principles and are in touch with the underlying thinking), writing good assessments is another area.

(...) Writing meaningful assessments, particularly leveraging interactive technology like immersive simulation games, is an area where skills are still going to be needed. Aligning and evaluating the assessment, and providing scrutable justification for the assessment attributes (e.g. accreditation) is going to continue to be a role (for instructional designers) for some time.

We may need to move accreditation from knowledge to skills (a current problem in many accreditation bodies), but I think we need and can have a better process for determining, developing, and assessing certain core skills, and particularly so-called 21st century skills. (...) There will continue to be a role for doing so, even if we make it possible to develop the necessary understanding in any way the learner chooses.
Source (noncommercial): Assessing Online Assessments by C. Quinn in Learnlets, 9 May 2013.
http://blog.learnlets.com/?p=3305
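To illustrate the technical points Quinn raises (problem generators, randomized option order, and automarking), here is a minimal, hypothetical Python sketch. The dew-point question, the distractor logic, and all names are illustrative assumptions, not anything from Quinn's post.

import random

def generate_dewpoint_item(rng):
    """Problem generator: a dew-point depression question with shuffled options."""
    temp = rng.randint(10, 30)           # air temperature (°C)
    dewpoint = temp - rng.randint(2, 8)  # dew point a few degrees lower
    correct = temp - dewpoint
    # Distractors built from plausible misconceptions: adding instead of
    # subtracting, and off-by-one slips.
    options = list({correct, temp + dewpoint, correct + 1, correct - 1})
    rng.shuffle(options)                 # randomized order resists 'gaming'
    stem = (f"Air temperature is {temp} °C and dew point is {dewpoint} °C. "
            "What is the dew-point depression (°C)?")
    return stem, options, correct

def automark(response, correct):
    """Automarking: compare the learner's response with the answer key."""
    return response == correct

rng = random.Random(42)  # seeded only to make the sketch reproducible
stem, options, correct = generate_dewpoint_item(rng)
print(stem)
for i, option in enumerate(options, 1):
    print(f"  {i}. {option}")
learner_answer = options[0]  # pretend the learner picked the first option
print("Correct!" if automark(learner_answer, correct)
      else f"Incorrect: the answer is {correct}.")

Note how the distractors come from plausible misconceptions, echoing Quinn's principle of "alternatives that represent reliable misconceptions".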

Further reading
Assessment in and of Serious Games: An Overview by Francesco Bellotti, Bill Kapralos, Kiju Lee, Pablo Moreno-Ger, and Riccardo Berta (2013)

Activity 4
Discuss using the comment box below: how do you evaluate performance in a simulation?
