FR Group A

Development and Implementation: As a group, create a plan to develop a functionality rubric. Make sure to include answers to the key questions regarding development.


 * **Functions for inclusion in rubric are identified:**
 * Use an existing rubric: review and adapt an existing rubric.
 * A small committee subgroup should work to revise the functionality rubric.
 * Will need technical expertise (teaching). We will need several levels on the committee: technical, network, and instructional.
 * Possible committee composition: 3 network, 3 instructional, and 1 from every district.
 * **A rating system/design model is created:**
 * The rating system should come from the data we collected from the survey.
 * Look at weights: everything that is 60 is a 3.
 * Capture the data, but keep in mind that you don't want to reward a product that does everything yet does not do everything well.
 * There should be some way to standardize the rubric items.
 * Take workarounds into consideration: should we capture workarounds and add-ons when considering a rating?
 * Consider core functionality of the product as well as plug-in/add-on options (example: rate an item 0, but note add-ons or workarounds).
 * Possibly set up a test environment with add-ons in place.
 * Focus on the "lollipop" items, but keep portfolios in mind, because we definitely see a future need for them.
 * Develop a requirements document to share with each company.
 * Consider the possible cost of tracking down add-ons and workarounds.


 * **Possible rubric score/rating system:**
 * Start at 4.
 * 3/2: nice to have.
 * 1: there is an add-on.
 * 0: no workaround or add-on.
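The weight-to-rating mapping and the 0-4 scale above could be sketched as a small lookup. This is a minimal illustration only: the support-level category names, the second weight threshold (40), and the idea of multiplying rating by priority are placeholder assumptions, not decisions the group has made.

```python
# Illustrative sketch: map an observed support level to the 0-4 rubric
# rating above, and a survey weight to a priority ("everything that is
# 60 is a 3"). Category names and the 40 threshold are assumptions.

def rating_for_item(supported: str) -> int:
    """Return a 0-4 rubric rating for one functionality item."""
    scale = {
        "native": 4,   # full core functionality (start at 4)
        "partial": 2,  # 3/2: nice-to-have level of support
        "add_on": 1,   # 1: there is an add-on
        "none": 0,     # 0: no workaround or add-on
    }
    return scale[supported]

def weight_to_priority(weight: float) -> int:
    """Map a survey weight to a priority level."""
    if weight >= 60:
        return 3
    if weight >= 40:  # placeholder threshold, not from the survey
        return 2
    return 1

def product_score(items):
    """Weighted score: each item's rating times its survey priority."""
    return sum(rating_for_item(s) * weight_to_priority(w) for s, w in items)

print(product_score([("native", 75), ("add_on", 60), ("none", 30)]))  # 15
```

A weighted sum like this is one way to avoid rewarding a product that "does everything but not well": a low rating on a heavily weighted item costs more than on a lightly weighted one.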

 * **Testing Concerns:**
 * Should evaluators see all products or focus on one? YES
 * Should evaluators review together? (Possibly have the group come together with the rubric and do one type of item at a time; give teams specific roles for testing.)
 * Consider testing costs and sustainability costs (one-time costs and ongoing costs for creating the testing environment, as well as if purchased).
 * Possibly do a pilot test before the large-scale testing to make sure items are valid.
 * Double-check our process before having evaluators move forward with scoring LMSs with the rubric.
 * Remember that the quality of what we get will depend on the quality of the training/guidance that we give to evaluators.
 * Possible subcommittee for developing training for the evaluation process.


 * **Open-source LMSs to be tested:**
 * The ones that are posted on the wiki

 * **User Profiles/Needs:**
 * Admin (10)
 * Teacher/Course Creator (10)
 * Student (10)
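If a shared test environment is set up, the profile counts above could be seeded with a small script. This is a sketch under assumptions: the role names, username pattern, and generic CSV import format are hypothetical, not tied to any particular LMS.

```python
# Illustrative sketch: generate the 10 admin / 10 teacher-course-creator /
# 10 student test accounts listed above as rows for a generic CSV user
# import. Username patterns and role labels are placeholder assumptions.
import csv
import io

PROFILES = {"admin": 10, "teacher": 10, "student": 10}

def test_accounts():
    """Build one account row per profile slot, e.g. admin01..admin10."""
    rows = []
    for role, count in PROFILES.items():
        for i in range(1, count + 1):
            rows.append({"username": f"{role}{i:02d}", "role": role})
    return rows

# Write the accounts out as CSV text for a bulk-import tool.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["username", "role"])
writer.writeheader()
writer.writerows(test_accounts())
print(len(test_accounts()))  # 30 accounts total
```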
 * **Logistics:** Go with the window in the plan.

Concerns: Mac vs. Windows platform; testing out all multimedia types and file formats is important.
 * **Who would do testing?** Subsets in each county.