Metadata Collections & QA
Digital Library for Earth System Education

Community Annotated Collection

1. Collection description and purpose

The purpose of the Community Annotated Collection is to use the DLESE Community Review System (CRS) to:

  • identify excellent educational resources from within DLESE and move them into the DLESE Reviewed Collection,
  • provide feedback from users to resource creators to allow creators to iteratively improve their resources, and
  • provide information for prospective users of DLESE resources, based on other users' experience, that will help them make informed decisions about whether to use the resource and will help them use the resource more effectively.

The collection considers two sources of review information: first, community reviews gathered via the World Wide Web from educators and learners who have used the resource for teaching or learning, and second, specialist reviews by reviewers selected by an Editorial Review Board. Collection review information is posted on the Web in aggregated form, with reviewers' identities removed. In addition to evaluating resources against the seven DLESE Reviewed Collection selection criteria, the collection gathers and disseminates teaching tips, comments, and users' assessments of a resource's effectiveness in challenging teaching and learning situations.
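
To make the posting step concrete, the following is a minimal sketch, not the actual CRS implementation, of how community reviews might be aggregated with reviewers' identities removed before being posted on the Web. The field names (reviewer, rating, comment) are hypothetical.

    # Minimal sketch of aggregating community reviews while dropping reviewer
    # identities. Field names are hypothetical, not the CRS schema.
    from statistics import mean

    def aggregate_reviews(reviews):
        """Return an anonymized, aggregated summary of a resource's reviews."""
        ratings = [r["rating"] for r in reviews if "rating" in r]
        return {
            "review_count": len(reviews),
            "mean_rating": round(mean(ratings), 2) if ratings else None,
            # Free-text feedback is kept; the 'reviewer' field is never copied.
            "comments": [r.get("comment", "") for r in reviews],
        }

    if __name__ == "__main__":
        sample = [
            {"reviewer": "a@example.edu", "rating": 4, "comment": "Worked well in an intro course."},
            {"reviewer": "b@example.edu", "rating": 5, "comment": "Clear teaching tips."},
        ]
        print(aggregate_reviews(sample))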

3. Collection policy

The collection encourages and initiates reviews of, and comments about, learning materials. This includes DLESE resources designed for the classroom and actual learning situations, such as activities, case studies, courses, curricula, field trips, lesson plans, modules, units, problem sets, projects, tutorials, and books.

This review information about resources is then presented as different types of annotations within the collection:

  • Teaching tip - information for using or modifying the resource within teaching or learning environments
  • Review - a formal or informal evaluation of a resource
  • Quantitative information - numerical information about a resource
  • Editor's summary - a summary of evaluative information (qualitative or quantitative) that may describe the strengths, weaknesses, pedagogical effectiveness, best uses, or general characteristics of a resource
  • Challenging audience - information about the effectiveness of a resource with a specific audience
  • Comment - a remark, explanation, advice, opinion, attitude, fact, proposed change, question or additional resource information
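
As an illustration only, the six annotation types above can be thought of as a small controlled vocabulary attached to each annotation record. The sketch below uses hypothetical class and field names rather than the collection's actual metadata framework.

    # Sketch of the annotation types as a controlled vocabulary, plus a minimal
    # annotation record. Names and the example identifier are illustrative.
    from dataclasses import dataclass
    from enum import Enum

    class AnnotationType(Enum):
        TEACHING_TIP = "Teaching tip"
        REVIEW = "Review"
        QUANTITATIVE_INFORMATION = "Quantitative information"
        EDITORS_SUMMARY = "Editor's summary"
        CHALLENGING_AUDIENCE = "Challenging audience"
        COMMENT = "Comment"

    @dataclass
    class Annotation:
        resource_id: str                  # identifier of the DLESE resource being annotated
        annotation_type: AnnotationType   # one of the six types listed above
        text: str                         # the vetted community-contributed content

    example = Annotation(
        resource_id="DLESE-000-000-000-001",  # hypothetical identifier
        annotation_type=AnnotationType.TEACHING_TIP,
        text="Pairs well with a short field observation exercise.",
    )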

For the quantitative and challenging-audience information, it is easiest to look at actual data to understand what is collected. Please see the following:

  • Quantitative information - a review of the resource against the DLESE Reviewed Collection criteria: http://www.dlese.org/reviews/crs/index23.php
  • Challenging audience - a review of the resource in terms of how it meets the needs of particular learners (adult, visually impaired, hearing impaired, limited mobility, urban dweller, economically disadvantaged, limited English proficiency, limited technology experience, low literacy level, learning disabled, emotional problems, or attention deficit): http://www.dlese.org/reviews/crs/index24.php
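
The challenging-audience categories above form a fixed list. As a rough sketch, submitted terms could be checked against that list before an annotation is accepted; the validation function itself is illustrative and not part of the CRS.

    # Sketch of validating challenging-audience terms against the categories
    # listed above. The checking behavior is illustrative only.
    CHALLENGING_AUDIENCES = {
        "adult", "visually impaired", "hearing impaired", "limited mobility",
        "urban dweller", "economically disadvantaged",
        "limited english proficiency", "limited technology experience",
        "low literacy level", "learning disabled", "emotional problems",
        "attention deficit",
    }

    def unknown_audiences(values):
        """Return any submitted terms that are not in the controlled list."""
        return [v for v in values if v.strip().lower() not in CHALLENGING_AUDIENCES]

    print(unknown_audiences(["Adult", "rural dweller"]))  # ['rural dweller']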


4. Contact information

Please direct questions about the collection to support@dlese.org.

5. Resource terms of use

Since the resources within the collection are annotations of third-party sites, the resource terms of use are dictated by the individual copyrights and terms of use found on each resource's website.

6. Metadata terms of use

Metadata is copyright 2003 Columbia University and 2007 University Corporation for Atmospheric Research (UCAR). DLESE may modify, reformat, and redistribute metadata to function within DLESE systems and services.

7. Resource quality assurance

The resources made accessible via this collection are access points into many different organizations, which are responsible for the quality of their own materials. The collection strives to be a high-quality collection based on its usefulness to educators and the selection methods described above under the collection policy.


8. Metadata quality assurance

Metadata is generated by automated means after the vetting of community-generated content. The identifier of the resource being annotated is verified via queries to DLESE at the time the annotation is created. The remaining fields either use controlled vocabularies (e.g. 'pathway', 'type', 'status'), are assigned automatically using information from the resource being annotated, or draw on the vetted community content that resides in a database.
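
A rough sketch of that workflow follows: verify the annotated resource's identifier, then generate the record. The lookup URL, parameters, and field names are placeholders for illustration, not DLESE's actual web service or annotation framework.

    # Sketch: verify the annotated resource's identifier against DLESE before
    # generating the metadata record. Service URL and fields are hypothetical.
    import urllib.error
    import urllib.parse
    import urllib.request

    def resource_exists(resource_id: str, service_url: str) -> bool:
        """Query a lookup service to confirm the identifier resolves."""
        query = urllib.parse.urlencode({"id": resource_id})
        try:
            with urllib.request.urlopen(f"{service_url}?{query}", timeout=10) as resp:
                return resp.status == 200
        except urllib.error.URLError:
            return False

    def build_annotation_record(resource_id, vetted_text, service_url):
        """Generate a record only if the target resource can be verified."""
        if not resource_exists(resource_id, service_url):
            raise ValueError(f"Unknown resource identifier: {resource_id}")
        return {
            "itemID": resource_id,    # verified via a query at creation time
            "status": "completed",    # controlled-vocabulary field (value illustrative)
            "content": vetted_text,   # vetted community content from the database
        }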

9. Persistence plan for the collection

The collection is expected to exist indefinitely, as long as funding is available to maintain it. Because all resources reviewed and annotated by the collection are already in DLESE, no resources would be orphaned if the collection ceased operations.


Last updated: 2007-11-01
Maintained by: Katy Ginger & Shelley Olds (support@dlese.org), DLESE Metadata