Theoretical Framework




6. ICT Tools for assessment

In this chapter we present criteria for ICT assessment tools. Based on these criteria, an ICT tool will be chosen for the first prototype. Our method is as follows:




  1. First we compiled a large list of criteria, based on our theoretical framework and our own experiences;

  2. From this list we extracted some minimal requirements and categories of requirements. The categories were determined by collecting all possible criteria from the theoretical background, grouping them, and formulating a category for each group. The raw list of criteria can be found in appendix B;

  3. A large list of tools was compiled, based on experiences from earlier projects, the Special Interest Group Mathematics ‘toetsstandaarden’ working group {Bedaux, 2007 #160}, KLOO research {Jonker, #22}, the FI math wiki on digital assessment and math software (http://www.fi.uu.nl/wiki/index.php/Categorie:Ict), and Google searches. As there are hundreds of math tools, an initial selection was based on a tool having at least some characteristics of an assessment tool. Based on these requirements we:

  4. First selected tools that met the minimal requirements. Tools that did not were described but not considered any further. For this we used a template.

  5. Then the remaining tools were graded on the other categories by:

    1. Browsing the web for more information on the tool and its usage;

    2. Installing the tool locally;

    3. Using the tool with already existing content. We aim to use quadratic equations, as this subject tends to be catered for almost always;

    4. Authoring our own content from chapter 5. Here it is possible that not all the finer points of a tool become apparent. A minimal requirement is that authoring can be used, which for most tools means that they have to be installed. For every installed tool we kept a log of screenshots.

  6. This resulted in separate descriptions of the tools and a matrix, giving an overview of the strong and weak points of the tools.

Minimal requirements

  • Web-based: we find it essential that the tool can be used anytime, anyplace, using just a web browser.

  • Ease of authoring, configurable: it should be possible to add one's own content.

  • Actively developed: it is important that the tool is supported and has some sort of continuity.

  • Minimal math support: formulas should be displayed correctly and basic mathematical operations should be supported.

  • Storage of progress: it should be possible to store results, so students can come back later, if necessary.

Categories
For every item we answer these questions:

  • What does this item mean?

  • Why is it important?

  • How is it scored/weighted?

It is also important to note that we are scoring these tools in the light of assessment and algebra. Other aspects can therefore earn plus points, but they can never be the deciding factor.

Math specific

In this category we mention the variables that are, in our eyes, specific to mathematics. Here we asked ourselves the question: what aspects distinguish math from other subjects?



  1. Representational (sound representations, mathematical and cognitive fidelity)
    It is important that the way a student can work resembles the 'paper-and-pencil' way. This means being able to use graphs and tables, and retaining a certain resemblance to paper-and-pencil work. We assert that using multiple representations (Van Streun) and a connection to today's practices increase transfer.

  2. Randomization
    Assignments that are provided should not always be the same, but should have varying values. Randomization caters for this.

  3. Multiple steps (within one question, from one question to another)
    Here we distinguish two types of 'multiple step' exercises.

    1. Within a question: an open environment is provided and a student can choose what path to follow. This can be just one step, but also twenty steps.

    2. Between questions: some complex questions consist of several subquestions. Often one question builds on an earlier one.

  4. Integrated CAS
    For complex operations a CAS is necessary. We distinguish the possibility of using a CAS and the availability of a CAS, and also the type of CAS.

Ease of use

  1. Teacher, authoring (questions, text, links, graphs, multimedia)
    Teachers should be able to make their own questions, with text, links, graphs etc., in a user-friendly way. This aspect scores, based on our own usage, how well authoring can take place.

  2. Student, usage (e.g. input editor, learning curve)
    Students should be able to work with the tool. This means that the user interface and structure of the tool should be intuitive. The same holds for the input of mathematical formulae.

Registration

  1. Answers
    How much of the answers is stored? Are all answers, also the wrong ones, stored, or only the last one?

  2. Process
    And how much of the process is stored? How did a student come to a certain answer?
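The distinction above, storing every attempt versus only the last answer, can be made concrete with a minimal sketch (a hypothetical data structure, not the storage model of any reviewed tool):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Attempt:
    answer: str
    correct: bool

@dataclass
class QuestionRecord:
    """Keeps every attempt, not only the last answer.

    Storing the full attempt history preserves the process: wrong answers
    show how a student arrived at the final one.
    """
    attempts: List[Attempt] = field(default_factory=list)

    def record(self, answer: str, correct: bool) -> None:
        self.attempts.append(Attempt(answer, correct))

    def last_answer(self) -> Optional[str]:
        return self.attempts[-1].answer if self.attempts else None
```

A tool that only stores `last_answer()` loses exactly the process information this category asks about.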

Assessment

  1. Possibility of modes (practice, exam)
    Assessing formatively means that assessment is for learning. Therefore assessment takes place during a course, not only at the end in an exam setting. Providing several modes is a good facility.

  2. Feedback (process, answer, global)

    1. Process: giving functional feedback on the strategy used.

    2. Answer: providing feedback on the answer given. The more specific this is, the more a student can learn from it.

    3. Global: overall mastery of a subject. Giving insight into a student's progress on a more global scale.

  3. Hints
    The possibility of authoring hints, or of hints being provided by the tool itself.

  4. Review mode (what has he/she done wrong or right)
    The possibility of scrutinizing one's wrong or correct answers, so as to learn from them, including the process. The finer the granularity, the more information can be used when needed.

  5. Question types
    Sometimes open questions alone are not enough. Other question types could provide more flexibility.

  6. Scoring
    Being able to use game-like scoring could help motivate students to 'get the highest score'.
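The 'answer' level of feedback above (item 2.2) can be sketched as follows. This is a minimal illustration only: tools like STACK or Maple TA compare a student's input to a reference answer with a CAS, whereas here we approximate equivalence by evaluating both expressions at random sample points. All names are hypothetical.

```python
import math
import random

def equivalent(student_expr, reference_expr, samples=20):
    """Crude equivalence check: compare two expressions in x numerically.

    A stand-in for CAS-based checking; eval() is acceptable here only
    because this sketch controls both input strings.
    """
    rng = random.Random(0)
    env = {"__builtins__": {}, "sqrt": math.sqrt, "x": 0.0}
    for _ in range(samples):
        env["x"] = rng.uniform(-10, 10)
        try:
            a = eval(student_expr, env)
            b = eval(reference_expr, env)
        except Exception:
            return False
        if not math.isclose(a, b, rel_tol=1e-9, abs_tol=1e-9):
            return False
    return True

def feedback(student_expr, reference_expr):
    """The more specific the feedback, the more a student can learn from it."""
    if student_expr.strip() == reference_expr.strip():
        return "Correct (same form)."
    if equivalent(student_expr, reference_expr):
        return "Correct, but not in the expected form."
    return "Incorrect: check your expansion."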

Content management

  1. Question management
    How well can questions be managed? Can they be copied easily? Can modules be recycled easily?

  2. Use of standards
    A certain compliance with standards like QTI or SCORM is a plus, as it makes content portable between tools.

  3. Available content
    If teachers are reluctant to make their own questions, a large user base and a supply of ready-made questions could help.

Other

  1. Cost
    What does it cost to use the tool in secondary and higher education? We mean the bare licenses for using the tool. Other aspects of costs (support, training) are scored in aspects like 'ease of use' and 'continuity'.

  2. License and modifiability
    What type of license does this tool have? Does this license make it possible to modify the software to one's own wishes (open source)?

  3. Technical requirements (also own installation)
    How easy is it to install locally? How high are the technical requirements?

  4. Continuity
    Is there a community, firm or organisation that can provide help with a tool? Our experience is that tools with small user bases and little support provide less continuity.

  5. Languages
    How multilingual is the tool?

  6. Stability and performance
    Of course this item also depends on the computer the tool is installed on. However, using and installing a tool does give an impression of its overall performance and stability.

  7. Looks
    In education a tool should look good.

  8. Integration with a VLE
    Can the tool be integrated into a virtual learning environment (VLE)?

We propose to use the weights provided in the matrix in appendix E.

We would have liked to ask experts to validate the scores in the matrix, but this would imply that experts have detailed knowledge of all the tools. Although one could always argue that the scores presented are arbitrary, we contend that they provide a good picture of the weak and strong points of the tools.
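The weighted comparison could be computed as in the sketch below. The weights and per-tool scores shown here are purely illustrative placeholders; the actual values are those in the matrix in appendix E.

```python
# Illustrative weights per category (real values: appendix E).
weights = {"math": 3, "ease_of_use": 2, "assessment": 3,
           "content_mgmt": 1, "other": 1}

# Illustrative scores per tool and category (real values: appendix E).
scores = {
    "DME":   {"math": 4, "ease_of_use": 4, "assessment": 4,
              "content_mgmt": 3, "other": 4},
    "WIMS":  {"math": 5, "ease_of_use": 2, "assessment": 3,
              "content_mgmt": 4, "other": 3},
    "STACK": {"math": 4, "ease_of_use": 3, "assessment": 4,
              "content_mgmt": 3, "other": 2},
}

def weighted_total(tool):
    """Sum of category scores, each multiplied by its category weight."""
    return sum(weights[cat] * scores[tool][cat] for cat in weights)

# Tools ranked from strongest to weakest under these weights.
ranking = sorted(scores, key=weighted_total, reverse=True)
```

Because assessment and math support carry the highest weights, a tool that excels only in 'other' aspects cannot become the deciding winner, which matches the scoring policy stated above.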



Conclusion:

  • Wiris is an attractive tool for standalone use, for example within Moodle. However, the lack of assessment functions means it is not suitable enough for our research.

  • WIMS is probably the quickest and most 'complete' of the tools, with room for geometry, algebra, etc. There also is a fair amount of content available, also in Dutch. It is let down by its feedback and the fact that only one answer can be entered. Of course this can be programmed, as it is a very powerful package, but here we see a steep learning curve.

  • Digital Mathematical Environment (DME). Strong points are the performance, multiple steps within exercises plus feedback, authoring capability, and SCORM export. Disadvantages: emphasis on algebra (not extendable, dependent on the programmer), and the source code is not available.

  • STACK has a good philosophy with 'potential responses' and multistep questions. Also, the integration with a VLE, unfortunately only Moodle, is a plus. Installation, stability and performance are negatives (slow and cumbersome), as are its looks. It is also very experimental, providing almost no continuity.

  • Maple TA has many of the points that STACK has, but with better looks and no real support for 'potential responses' and 'multistep questions'. These can be programmed, but this means, like WIMS, coping with a steep learning curve. As it has its roots in assessment software, question types are well provided for. One could say that Maple TA started as assessment software and is moving towards software for learning, while STACK started with learning and is moving towards assessment software.

  • ActiveMath is more of an Intelligent Tutoring System than a tool for assessment. The question module (ecstasy) is powerful, providing transition diagrams. However, authoring and technical aspects make it less suitable for the key aspects we want to observe.

For summative assessment Maple TA is the best contender. For both summative (scores) and formative assessment the DME is best used, with STACK as runner-up. For algebra the DME is best used because the 'process' is central. For all other mathematical topics WIMS is most versatile. For an interactive book without assessment features ActiveMath could be used best. The best standalone mathematical environment (CAS) is provided by Wiris.

We conclude that DME is best suited for answering our research question, as it:



  • mixes formative and summative aspects of assessment into one tool;

  • provides an open algebra environment, as opposed to the more closed questions (one answer) that other tools provide;

  • is a stable and attractive tool (motivation).

The descriptions of the tools that were considered but did not meet the minimal requirements can be found in appendix C. Descriptions of the tools that did meet the minimal requirements can be found in appendix D. A comparative matrix assessing the strong and weak points of these tools can be found in appendix E. In the long term we aim to publish these assessments in a wiki type environment so they can be kept up to date.
Now we will use our tool to model the questions from chapter 5.
