The e-Tools (1) Report: Pedagogic, Assessment and Tutoring Tools


2 Work to be undertaken


Because of the limited (8-week) timescale, the project will run in four parallel strands.

  (a) Pedagogic and assessment tools – systems

  (b) Pedagogic and assessment tools – reference sites (including content)

  (c) Face-to-face tutoring issues

  (d) Literature search, concentrating on existing reports (of which we have many).

Strand (a) of the project will run according to the following approach:

  1. Identification of products and vendors to be considered (in consultation with sister e-University studies, JISC and other advisors, to ensure no overlaps or gaps in coverage).

  2. In parallel, creation of criteria for evaluation of these products and their usage (in consultation with sister studies in order to factor in business model and system size). Criteria will include system information (such as architecture, scalability and standards) and user information (such as “industrial-strength” reference sites). The criteria will be radically simplified compared with a full tender, since what is required is in the nature of a short list or pre-qualification, not a final selection. (See Annex 1 for a starter set.)

  3. Sending out of a questionnaire to vendors inviting them to respond to the criteria.

  4. Checking by our team of the responses for internal consistency, and benchmarking them against our literature database.
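The collation step above (checking vendor responses against the evaluation criteria) could be sketched as follows. This is a minimal illustrative sketch only: the criteria names and response format are hypothetical assumptions, not the study's actual Annex 1 set.

```python
# Hypothetical sketch: collate vendor questionnaire responses against a
# criteria checklist. Criteria names below are illustrative, not the
# study's actual Annex 1 criteria.

CRITERIA = ["architecture", "scalability", "standards", "reference_sites"]

def collate(responses):
    """For each vendor, list which criteria were and were not addressed."""
    summary = {}
    for vendor, answers in responses.items():
        met = [c for c in CRITERIA if answers.get(c)]
        missing = [c for c in CRITERIA if not answers.get(c)]
        summary[vendor] = {"met": met, "missing": missing}
    return summary

# Example responses (entirely invented, for illustration only)
responses = {
    "VendorA": {"architecture": "3-tier", "scalability": "10k users",
                "standards": "IMS", "reference_sites": ""},
    "VendorB": {"architecture": "client-server", "standards": "IMS/SCORM"},
}

result = collate(responses)
```

A summary of this shape makes gaps visible at a glance, so the team can follow up with vendors whose responses are internally inconsistent or incomplete before benchmarking against the literature database.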

Strand (b) will identify key “e-university/virtual university/virtual campus” reference sites of relevance to the UK e-University, and reality-check vendor claims against these sites.

Strand (c) will consist of the preparation of a report on face-to-face versus online issues.

Strand (d), the literature search, runs throughout the whole project, but takes as its starting point a comprehensive existing collection of material (see below).

Classification of systems


The team has considered the suggested classification of systems in survey (1) into (a) pedagogic tools and (b) approaches to student assessment online. Our view is that this does not make good pedagogic or system sense. There is also an apparent over-emphasis on computer conferencing: there are now many other system paradigms, including some traditional e-learning ones that have come of age in the era of the home desktop and faster Internet access, such as screen-sharing and multimedia. Finally, most Managed Learning Environments go beyond purely administrative functions: many (such as TopClass) include assessment tools, and several (such as Virtual-U) include industrial-strength computer conferencing. Conversely, several supposed computer conferencing systems (such as FirstClass) include assessment tools as “plug-in” components. Moves towards component-based architectures and standards-driven interoperability further blur these distinctions.

Consequently, the study team feels that for survey (1), all main pedagogic tools should be surveyed, even if they also offer administrative functions up to and including full Managed Learning Environments. However, we would agree that survey (3) may look at Managed Learning Environments from a somewhat different standpoint.

Finally, regarding survey (2), we note that certain electronic learning resources (SmartForce, NETg, Cisco Academy) come with an embedded pedagogic engine which, in at least two cases known to us, is being wholly or partially unbundled. We shall consider these only if the vendor is agreeable to the engine being evaluated on its own, separate from the content.

Relevant resources available to the study team


The study team has access to one of the world's best collections of material on e-learning in HE, FE and training, built up over 15 years of activity on behalf of many sponsors (commercial, EU and national governments) and several employers. As an example, in 1996 the Finnish government supplied Professor Bacsich with a research assistant for six months to work solely on virtual university exemplars and systems.

This corpus of material has recently been augmented by the activities of the Virtual Campus Programme, the EU Objective 4 Upgrade2000 project (Digital media for e-teaching of Basic Skills), the JISC Costs of Networked Learning project (which included seven case studies of UK HE institutions), the EU Leonardo Telelearn project (on the cost-effectiveness of e-training) and the FEFC-funded National Learning Network Evaluation (which is tracking English FE colleges’ use of e-learning).

The project will also benefit from interworking with other e-learning staff in the University, who overall have experience of a wide range of e-learning systems.

Time Schedule


Week  Ends     Tasks
1     19 May   Start work on 15 May; start literature search (continuous)
2     26 May   26 May: finalise questionnaire and list of vendors, and send out questionnaire by email
3     2 June   Receive draft final report on tutoring
4     9 June   5 June: start collating responses against criteria and benchmark sites
5     16 June  Continue collating responses; check standards/interoperability issues
6     23 June  23 June: submit draft final report
7     30 June  Receive and work on comments from HEFCE and Steering Group
8     7 July   7 July: submit final report

Project Management


On such a short project, the main issues are (1) good communications, (2) resilience of staffing and (3) quality of outcomes.

  1. The staff involved know and trust each other and have all worked together for over 10 years, since the era of the pilot studies for the DELTA Programme in the Third Framework. They all have good electronic communications, including email from home.

  2. In the case of illness, each staff member has backup, since there is an overlap of skills. Since the activity takes place outside the teaching period, academic and technical staff are more easily available for backup roles. In extreme situations, further people from the usual “consultant diaspora” used by the University could be called on.

  3. Robin Mason will be in charge of the Quality Committee. Her role will be to read and comment internally on the draft final and final reports.

