Literature search, concentrating on existing reports (of which we have many).
Strand (a) of the project will run according to the following approach:
Identification of products and vendors to be considered (in consultation with sister e-University studies, JISC and other advisors, to ensure neither overlap nor underlap).
In parallel, creation of criteria for evaluation of these products and their usage (in consultation with sister studies in order to factor in business model and system size). Criteria will include system information (such as architecture, scalability and standards) and user information (such as “industrial-strength” reference sites). The criteria will be radically simplified compared with a full tender, since what is required is in the nature of a short list or pre-qualification, not a final selection. (See Annex 1 for a starter set.)
The team has considered the suggested classification of systems in survey (1) into (a) pedagogic tools and (b) approaches to student assessment online. Our view is that this does not make good pedagogic or system sense. There is also an apparent over-emphasis on computer conferencing: there are now many other system paradigms, including some traditional e-learning ones that have come of age in the era of the home desktop and faster Internet, such as screen-sharing and multimedia. Finally, most Managed Learning Environments go beyond purely administrative functions; many (such as TopClass) include assessment tools, and several (such as Virtual-U) include industrial-strength computer conferencing. Conversely, several supposed computer conferencing systems (such as FirstClass) include assessment tools as “plug-in” components. Moves towards component-based architectures and standards-driven interoperability further blur these differences.
Consequently the study team feels that for survey (1), all main pedagogic tools should be surveyed, even if they also offer administrative functions up to and including Managed Learning Environments. However, we would agree that survey (3) may look at Managed Learning Environments from a somewhat different standpoint.
Finally, regarding survey (2), we note that certain electronic learning resources (SmartForce, NETg, Cisco Academy) come with an embedded pedagogic engine which, in at least two cases that we know of, is being wholly or partially unbundled. We shall consider these only if the vendor is agreeable to the engine being evaluated on its own, separate from the content.
Relevant resources available to the study team
The study team has access to one of the best collections in the world of material on e-learning in HE, FE and training, built up over 15 years of activity on behalf of many sponsors (commercial, EU and national governments) and several employers. As an example, in 1996 the Finnish government supplied Professor Bacsich with a research assistant for six months to work solely on virtual university exemplars and systems.
This corpus of material has been added to recently, particularly by the activities of the Virtual Campus Programme, the EU Objective 4 Upgrade2000 project (digital media for e-teaching of Basic Skills), the JISC Costs of Networked Learning project (which included seven case studies of UK HE institutions), the EU Leonardo Telelearn project (on cost-effectiveness of e-training) and the FEFC-funded National Learning Network Evaluation (which is tracking English FE colleges’ use of e-learning).
The project will also benefit from interworking with other e-learning staff in the University, who overall have experience of a wide range of e-learning systems.
Start work on 15 May
Start literature search (continuous)
26 May: finalise questionnaire and list of vendors and send out questionnaire by email
Receive and work on comments from HEFCE and Steering Group
7 July: submit final report.
On such a short project, the main issues are (1) good communications, (2) resilience of staffing and (3) quality of outcomes.
The staff involved know and trust each other and have all worked together for over 10 years, since the era of the pilot studies for the DELTA Programme in the Third Framework. They all have good electronic communications, including email from home.
In the case of illness, each staff member has backup, since there is an overlap of skills. Because the activity takes place outside the teaching period, academic and technical staff are more easily available for backup roles. Additional people from the usual “consultant diaspora” used by the University could be called on in extreme situations.
Robin Mason will be in charge of the Quality Committee. Her role will be to read and comment internally on the draft and final versions of the report.