Thanks to everyone who provided input!
I apologize that I ran out of time to get it all onto the blog, but your comments were discussed at the criteria writing-fest Wednesday night at the CMU.
I also want to thank the folks on the Exec Board who did or will do most of the heavy lifting in getting this stuff done quickly.
Wednesday, February 4, 2009
Criteria 5 Comments
Comments are separated by dashes.
-------
We have been using assessment for our programs. Why not use some of that information we collect every year? I don’t know if we have national test comparisons (we don’t request students to take those). So clarifications need to be given for the laundry list provided under this criterion. What, specifically, are we looking for? E.g., faculty awards, service on editorial boards, student awards, or professional involvement?
--------
• Question A: offering examples of exemplary performance is great except it is in the ‘testimonial’ category and usually exemplary performance is due more to the individual’s innate abilities and drive than the educational institution. We assist, of course, but I’m not sure we can take full credit (IMHO). I LIKE the ‘congruence between intended/actual outcomes’ but it is not something one can decide to measure today if you’ve not been doing good data keeping….
• Satisfaction—this is a standard measure but the alumni data is hard to get and employer information is very very difficult to get. And Nursing knows where our grads are working—I have NO idea how some depts. (English, math, psychology, etc.) would address that.
• ‘Professional’ admissions? Does this mean, did the pre-engineering students get into engineering programs?
• Question B: I think this is irrelevant.
• Question C: how does one address ‘external validation of quality’ of faculty (certifications?) or ‘outcomes mirror best practices’ (too vague); ‘recognition to institution’ ?? We seem to be missing student evaluations of individual’s teaching effectiveness???
• Needed: external accreditations or recognitions.
-------
• This is another place where I would suggest student research and/or scholarly and creative activities are deliberately identified. Perhaps using SAC as a measure?
• National test comparisons: costs for these can be prohibitive, and for other test scores (GRE, MCATs) we are reliant on students to send data to the school; we can't request it.
• Grade inflation? How will average GPAs be compared when courses may be vastly different and teaching styles have been deliberately tailored to help students succeed?
• Professional productivity - isn't this already addressed in Criterion 4?
-------
This entire section can be condensed into “Demonstrate Student Success.”
B. You will be provided with average GPA for graduating seniors from 10 and 5 years ago vs. today. This is a misrepresentation of the intent of the book. Reviewing GPA trends as a means of determining grade inflation was just an example of how one single institution decided to take a quirky approach to measuring teaching effectiveness. Besides, a rising GPA trend does not prove the causality of grade inflation. What of the program that spends a decade carefully refining its first-year foundations program to better prepare its students for success in the program? Should it be lumped in with grade inflation?
C. Faculty
o Peer reviewed or juried competitions
o External validation of quality
o Outcomes mirror best practices
Faculty effectiveness has already been covered under Criterion 4. Only in a research institution is faculty a measure of program outcome. In our type of institution it is better suited to the aforementioned Criterion 4, Quality Inputs.
o Recognition to institution
B. Has the program brought beneficial recognition to the institution?
(This should be its own item, not a bullet.)
-------
Criteria 5 Original
A. Students
o Examples of exemplary performance
o National test comparisons
o Congruence between intended and actual outcomes
o Degrees of student, employer and alumni satisfaction
o Arts: client outcomes, Alumni working in the field
o Licensure
o Professional or graduate admissions
o Demonstrated effectiveness in preparing students for the future
B. You will be provided with average GPA for graduating seniors from 10 and 5 years ago vs today
C. Faculty
o Peer reviewed or juried competitions
o External validation of quality
o Outcomes mirror best practices
o Recognition to institution
Criteria 4 Comments
Comments are below, as usual, between the dashed lines.
-----
I think this needs to be re-named to at least be consistent with the book: Quality of Program Inputs and PROCESSES. We will look at ‘outputs’ in Criterion 5.
Faculty/ Question A:
• How do we measure total experience of faculty? Especially in the service areas (i.e., teaching, nursing, SLHS), the faculty MUST have clinical experience. In fact, the stronger the clinical experience, the stronger the faculty as a group. So, do we get to count ‘years clinical exp. + yrs teaching exp.’? And then, is it ‘teaching’ at MSUM only; at any 4-year institution; at any MnSCU institution; at a CC and/or a 4-year school, etc.?
• What in the world is ‘intellectually current’???
• Scholarly contributions only ‘fit’ for some areas and probably should not be weighted too heavily, considering MSUM’s current emphasis on being a ‘teaching institution’.
• Describing ‘how our faculty stack up against other institutions in the state and nationally’—so do we use the sum of the individuals’ body weight? (JUST KIDDING) I don’t understand how we can really measure this—it is very very subjective.
• Missing: faculty flexibility, initiative, leadership, passion for teaching and working with students. Not sure how to measure this.
• Suggestion: # of different courses taught by each member; # of new courses developed by each member. Might show something about the faculty’s initiative and flexibility—or it might just indicate which are ADHD.
Students/ Question C:
For nursing at least, we should include the proportion of transfer students to new HS students. We have a great majority of transfers, and their HS GPA, HS rank, and ACT are just not helpful at all. As a matter of fact, for many of them, their incoming cumulative college GPA is not a good reflection of their ability/capability. Their incoming NURSING (for the RN to BSN and grad students) GPA is very informative, however. I’m sure we’re not the only program on campus with this.
Curriculum/ Question D:
• What do the first two mean? Breadth, depth, level; coherence of curriculum?
• The third one also confuses me—examination of the design….
• The final one—how the program shifted delivery to meet current student needs—will vary greatly between programs. Nursing has gone increasingly on-line; some of the performance programs have no on-line offerings.
• Need to ADD: something related to how well students are meeting the program’s own stated outcomes. (this may be in Criterion 5)
Criterion 5: Quality of Program outcomes
• Question A: offering examples of exemplary performance is great except it is in the ‘testimonial’ category and usually exemplary performance is due more to the individual’s innate abilities and drive than the educational institution. We assist, of course, but I’m not sure we can take full credit (IMHO). I LIKE the ‘congruence between intended/actual outcomes’ but it is not something one can decide to measure today if you’ve not been doing good data keeping….
• Satisfaction—this is a standard measure but the alumni data is hard to get and employer information is very very difficult to get. And Nursing knows where our grads are working—I have NO idea how some depts. (English, math, psychology, etc.) would address that.
• ‘Professional’ admissions? Does this mean, did the pre-engineering students get into engineering programs?
• Question B: I think this is irrelevant.
• Question C: how does one address ‘external validation of quality’ of faculty (certifications?) or ‘outcomes mirror best practices’ (too vague); ‘recognition to institution’ ?? We seem to be missing student evaluations of individual’s teaching effectiveness???
• Needed: external accreditations or recognitions.
-----
Some of the information listed (e.g. scholarly contributions) covers 3 years; why 3 when other items cover 5 years? Again, certain items need clarification. How are we going to show that faculty are intellectually current (with conference attendance, workshops taken, classes taught)? What data are we supposed to use to show how our faculty “stack up against other institutions?” Why is there no place to comment on teaching or involvement of students in research/creative activity? The rating matrix needs to be adjusted to correspond to the information requested. (Why not use Dickeson’s categories, and request information better suited to who we are, for this criterion?)
------
• Clarify total years of experience better. Experience doing anything related to the program (teaching, research, outreach, external committees etc.). Do the years include time not at MSUM?
• A list of scholarly contributions should demonstrate intellectual currency - why have these two items separate?
• Here is one place where student research is important. Faculty may not be publishing at a phenomenal rate due to teaching loads along with involving students in research...but should activities of students (conf. attendance etc.) count as an indication of faculty's intellectual currency (I say yes!). Faculty at our institution don't necessarily engage in their professional pursuits independent of what they do with students.
• What exactly IS the measure for attracting and retaining qualified faculty? Failed searches? Faculty leaving? Again, many factors can play into this, not the least of which are the many budget crunches we have faced in the past, national hiring trends, our location, etc. Perhaps this could get wrapped into the quality-of-faculty assessment somehow. If you have high-quality faculty now, you would anticipate being able to recruit/retain the same?
• Quality of students is also related to scholarships. Availability of scholarships (or lack of) has already been mentioned as something the campus needs to do better at.
• This whole issue of quality of students and persistence relative to the quality of a program is not clear to me. Don't we say that we provide students with an opportunity to shine? That we help students who might not have thought they could, do well and find their strengths? I'm not sure the scoring rubric will reflect this spirit - "to what extent does the congruence of the quality of students ...". The term congruence and how that reflects what we do here at MSUM doesn't seem to work. We may see incongruence - should that always result in a low score?
• How do we report on learning styles of the students?
• Internationalized curriculum? What evidence of this is needed - a topic or two within X number of courses? Courses with specific SLOs regarding globalization?
• Describe how the program shifted delivery to meet current student needs - is this not already addressed in answers to Criterion 1?
• There should be an assessment of opportunities for students to participate in faculty-mentored activities (research, scholarly, creative) outside of the classroom, along with a measure of outcomes (presentations at SAC, conferences, art shows, performances, etc.). These opportunities can also be addressed as part of Criterion 5, but they also complement the curricula of many of the programs.
--------
A. Please provide the following information regarding the quality of faculty and staff
o % of faculty with terminal degrees
o Total years of experience of faculty
Does this include related professional experience or just teaching experience? Does it include adjunct and fixed term or just full time or tenure track years? What about community college experience? Is this averaged across the number of faculty or provided as a single number?
o Please list all scholarly or creative contributions of the faculty for the past three years (include these in an attachment)
o Please provide data showing that the faculty are intellectually current
It is impossible to show data that proves faculty are intellectually current. A faculty member who spends endless hours exhaustively reading materials published in their field would be intellectually current but have no evidence to prove it. Another faculty member may attend conferences and make presentations regularly; this only shows them to be professionally active, not necessarily intellectually current.
o Please comment on the availability of future faculty
o Please comment on the program’s current ability to attract and retain future faculty
o Please comment on how our faculty stack up against other comparable institutions in the state and nationally
We are a liberal arts and sciences institution that prides itself on student-centered teaching. Perhaps we should be giving programs credit for quality teaching, student contact, and faculty-mentored research and creative activities, rather than focusing on scholarly publications as a Research I institution would. Faculty publication records are a very poor measure of the strength of a program. They measure how much time a person has put into their research and resume, not how strong a program they have built. Should we punish a program whose faculty have put their efforts into its students and curriculum development rather than personal research?
B. You will be provided with the % of instruction offered by full-time faculty
Do we mean permanent faculty, or do fixed-term faculty count? The book acknowledges only permanent and part-time faculty. We also have the category of full-time non-permanent faculty (fixed term). This question unfairly punishes programs that may have already been unfairly treated by the past administration’s inequities in deciding faculty distribution.
C. You will be provided with the following information regarding your students for the past three years:
o High school GPA, average and range
o High school rank, average and range
o ACT composite, English and math sub-scores; average and range
o GPA for transfer students, average and range
o This data along with major persistence data will help determine the congruence of students and likely persistence in the program. Please include a statement.
The entire section “C” above needs to be removed as it is out of sync with our vision/mission. This type of thinking is predominant among institutions that seek to gain prestige through exclusivity! If we seek to continue to be an institution of access, as President Szymanski has indicated, we cannot punish the programs that are willing to take on the challenge of taking in below-average students and producing above-average graduates. Rural and economically disadvantaged students frequently correlate to lower-scoring high school students. Providing the opportunity for these students to better themselves is part of who we are as an institution.
D. Please provide the following information regarding curriculum
o The breadth, depth, and level
o The coherence of the curriculum This is the same question as the bullet point below; they should be combined.
o Examination of the design with special attention to whether the integration of the content of the curriculum depends on the student or is designed into the curriculum (senior capstone course or experience)
o Examination of how the current design meets the needs and learning styles of the students First, the answer to this question would be equivalent to approximately one half of a typical accreditation report and would take considerable volume to report. Second, undergirding this question may be an assumption that our students are overwhelmingly “digital native” students, which I have already indicated may not be an accurate description of our students and is likely out of line with our vision/mission.
o Date of last update of the curriculum and what was done
o An explanation of how the curriculum prepares graduates for a global world
o Analysis done on the program in the last three years
o Date of last accreditation and results
o Please describe how the program shifted delivery to meet current student needs
E. Please provide a statement regarding ways the program has adapted to technology including how the program prepares students for a high-tech world and attracts high-tech support from external sources Preparing students for a high-tech world means much more than teaching them to become proficient at using computer software and various digital devices. It means being prepared to make responsible and ethical decisions regarding the use of all forms of technology in our lives (not just digital/computer technology). I’m not convinced this question, as written, will evoke sufficiently broad responses. Second, attracting support to an institution is covered later in this document (Criterion 7) and so should be removed from this section. High-tech support is a pressing concern for Research I institutions with high research demands and less relevant to us as a whole. Are programs like Dance, Theater, Literature, Poetry, Philosophy, and languages less valuable if they don’t garner high-tech support?
F. Please make a statement regarding equipment, facilities, and other resources including capital capacity, currency of equipment and materials, library holdings and databases and updated facilities supporting the program.
------
Criteria 4 Original
A. Please provide the following information regarding the quality of faculty and staff
o % of faculty with terminal degrees
o Total years of experience of faculty
o Please list all scholarly contributions of the faculty for the past three years (include these in an attachment)
o Please provide data showing that the faculty are intellectually current
o Please comment on the availability of future faculty
o Please comment on the program’s current ability to attract and retain future faculty
o Please comment on how our faculty stack up against other institutions in the state and nationally
B. You will be provided with the % of instruction offered by full-time faculty
C. You will be provided with the following information regarding your students for the past three years:
o High school GPA, average and range
o High school rank, average and range
o ACT composite, English and math sub-scores; average and range
o GPA for transfer students, average and range
o This data along with major persistence data will help determine the congruence of students and likely persistence in the program. Please include a statement.
D. Please provide the following information regarding curriculum
o The breadth, depth, and level
o The coherence of the curriculum
o Examination of the design with special attention to whether the integration of the content of the curriculum depends on the student or is designed into the curriculum (senior capstone course or experience)
o Examination of how the current design meets the needs and learning styles of the students
o Date of last update of the curriculum and what was done
o An explanation of how the curriculum prepares graduates for a global world
o Analysis done on the program in the last three years
o Date of last accreditation and results
o Please describe how the program shifted delivery to meet current student needs
E. Please provide a statement regarding ways the program has adapted to technology including how it prepares students for a high-tech world and attracts high tech support from external sources
F. Please make a statement regarding equipment, facilities, and other resources including capital capacity, currency of equipment and materials, library holdings and databases and updated facilities supporting the program.
Criteria 3 Comments
Comments are below, separated by dashes.
-----
• How dependent is the campus on this program?
-----
Data provided: Ratio of total courses to major/minor, ratio of total courses in service to other majors, ratio of total courses in service to general education
Dickeson suggests providing enrollment data (which to me makes more sense), but that’s not the information we’ll be given.
Item B (list all majors directly dependent on the program) is not clear. Are we expected to provide a number or a list of names? What are the questions we are expected to address here? The scoring items are vague and not directly linked to specific questions.
Criteria 3 Original
A. You will be provided with:
o Ratio of total courses to Major/Minor
o Ratio of total courses in service to other majors
o Ratio of total courses in service to general education
B. Please list all majors that are directly dependent on this program
Criteria 2 comments
Comments received:
-----
• Perhaps just some clarity here on how the “to what extent” items will be determined. Is this determined at each level of prioritization in a relative manner? So all CSNS programs are scored relative to each other first, not compared with all other programs on campus as the scoring rubric says.
-----
The commenter rewrote this criterion as:
A. You will be provided with:
o Quintiles of entering majors
o Quintiles of grad majors
o 5 year trend lines of majors
B. Please make a statement regarding your perceptions of the external demands to continue the program.
C. To what extent is this program unique in its contribution to our state and region?
-----
Data provided: quintiles of entering majors and grad majors, and 5-year trend lines of majors.
Here it’s easier to see how the data will be used, but the matrix needs to be cleaned up to fit the data requested.
I don’t know if one can answer item E. Statewide data comparisons? Are such data even available?
------
• Question B is asking for opinion, and of course all programs will have hundreds of students just waiting in the wings to attend, if only the program had more faculty or equipment or scholarships or… I think we need to stress some data-driven ‘opinion’ here.
• Question E asks for statewide data and, especially since we are a border city, I think we should also look at ‘regional’ comparisons IF students come here from ND, SD, WI, IA. As we move into the ‘online’ formats, the geographic location of the student will only be an issue as it relates to ‘out of state’ tuition and/or licensure issues (for example, nursing faculty must be licensed in the state where they are supervising students). The ‘market’ for a particular major may be significantly outside of MN but still be very viable (i.e., we have literally hundreds of international students currently).
-------
Criteria 2: External Demand for Program
A. You will be provided with:
o Quintiles of entering majors
o Quintiles of grad majors
o 5 year trend lines of majors
B. Please make a statement regarding your opinion of potential for future enrollments
C. Please make a statement regarding the resources needed to continue support (majors/minors)
D. Please make a statement regarding your perceptions of the external demands to continue the program.
E. Please include any statewide data comparisons you feel are pertinent
My suggestion for Criteria 1
My summary of comments:
Points A and B will be very hard to write accurately and, in any event, it is unclear how to score.
Points C and F are redundant.
Point D, which asks for data, has a couple problems:
- The five-year time period seems arbitrary. It also isn't clear what the desirable outcome is here. For some programs growth is desirable, but for others flat enrollment means the program is operating at capacity.
- This should be dropped because it is a very incomplete attempt at measuring program success in retaining students and because MSUM goes out of its way to educate students of all abilities.
Point H should not simply include total courses. Number of sections or enrollment is more relevant.
Suggested draft as of 11AM, 2/4/09:
A. Put your department in the context of the university. This brief statement might include information about your department's history, evolution, or the extent to which your department meets the expectations of students. Put differently, this is the place to introduce yourself to the president and those on the review committees who are not familiar with your department.
B. Please make a statement regarding the maturity of the program. Could it be expected to grow with existing resources or is your department at capacity? The purpose is to set expectations and provide a context for interpreting enrollment data.
C. The data provided should include enrollments in courses. Number of courses alone does not measure teaching load.
Criteria 1: Comments from campus
Suggestions are below, separated by dashed lines.
-------
Data provided: Enrollment and major trends over a 5 year period, persistence by program year one to year two (is this supposed to look at retention of students from 1st to 2nd year? Why not include infusion of students into a program, e.g. transfer students joining a program), ratio of total courses to major courses, ratio of total courses to courses in service to other majors, ratio of total courses to Dragon Core (what issues are we supposed to address with these data?)
It makes sense to examine whether or not the program is in congruence with MSUM’s expectations (item G) but it’s not clear to me what information will support that.
As for adaptability of the program (items C and F, which should be combined), it makes sense to examine if a program evolves to meet changes in its field; if it does, then student needs (if one assumes that students expect to receive current training) should be met. Otherwise it’s not clear what “student needs” means, unless these items reflect a program’s maturity, and adaptability relates to flexibility in mode of delivery, the hours at which courses are offered, etc. These issues need clarification.
How is one supposed to find information about MSUM’s original expectations of a program? (It’s easier to describe changes over a recent 5 or 10 year period, but the origins? Does anyone on campus keep data like that?)
-------
Using 5 year trends is not justified (actually, there is very little evidence at all provided in this book for any of the author's claims). I don't see the value in 5-year trends, especially in institutions like universities. When the trends turn around, you may need to retrench elsewhere and re-institute programs that were previously dropped. 5 year plans are generally short-sighted. (by the way, if you had looked at 5-year trends in housing values at the peak of the bubble and invested all your money in high priced houses, where would you be now?).
--------
On item C:
This statement makes implications about our students that may in fact not be true. For example, our student population in most programs is predominantly composed of traditional students. Our tuition and fee structure has actively discouraged non-traditional students from attending our campus. Programs should not be scored down for not offering classes outside of prime time that would have remained two thirds empty and been fiscally irresponsible. Capacity to offer non-prime time classes in the future might be a more accurate indicator.
The number of on-line courses is also a dubious indicator on our campus. Our university has a tradition of being an institution of access for individuals of lower economic status. Indeed, President Szymanski has indicated this in a town hall meeting. We have prided ourselves on helping people and the community to better themselves. People of lower economic classes are generally not well equipped to succeed at on-line coursework, particularly early in their educational careers. On-line courses are most appropriate for wealthy suburban students who have a culture of digital access at home and in well-funded schools. Low-income and rural students do not typically have access at the same levels. Also, on-line education is more successful for students involved in graduate and upper-level coursework, where students are more familiar with and independent in accessing information and building knowledge. A far more appropriate indicator would be programs providing students with appropriate digital instruction that helps them to bridge the digital divide, rather than burying them behind it through failure in on-line courses. The “Digital Native” student much talked about nationally has not yet reached our campus in large numbers. Widespread use of cell phones and social networking pages does not necessarily translate into computer-savvy learning skills. The situation may be different in years to come, but our current students should not have their needs neglected.
On item E:
This is largely the same question as subheading “C”; it should be combined with it.
On item G:
The original wording of this statement assumes that the institution had defined and communicated expectations of its programs. I believe this is not true on a broad scale.
On item H:
Is the number of courses a useful measure? Or might it be more useful to look at the number of sections or the number of seats offered?
-------
The commenter adapted this criterion to be:
A. Please make a brief statement regarding the establishment of the program and its evolution.
B. Describe recent changes to the program and their rationale. How has the program adapted to changing institutional expectations and/or the changing needs and expectations of today’s students?
C. You will be provided with:
a. Enrollment and major trends over a 5 year period
b. Persistence by program year one to year two
D. You will be provided with the ratio of total courses to major courses, ratio of total courses to courses in service to other majors, and ratio of total courses to Dragon Core.
-------
• Clarification is needed on how the expectations of students today would be determined. Anecdotally? NESSE data (from what years)? What is the time frame? Between now and 5 yrs ago, 10 yrs ago, etc.? Depending on how far back one looks, program adaptation to change may be marked or not at all.
• In the scoring rubric, what is the difference between "program adapted to meet change" and "program adapted to meet needs of current students"? Perhaps these could be merged into one.
• Persistence from year 1 to year 2 can be strongly influenced by many factors outside of a student's major (preparation, housing, etc.). How do we account for this?
• Will we have a definition of broad institutional outcomes? It is clear that we will have a draft vision to work with, but what are our BIOs? Are they found in our current mission statement (which will be revised)? Again, perhaps these two items on the scoring rubric could be combined.
-------
- Again—which vision and mission are we addressing? How does the MnSCU strategic plan, etc., fit in here?
- Each dept should have a ‘mission’ statement and ‘outcomes’ for their graduates –check with the Student Learning Assessment committee (or whatever it’s called now). They should be listed in the Bulletin I think. Do we put this here too?
- I think Item G is way too vague—‘congruence of the program to institutional expectations’??? The ‘vision’ issue arises here again (see above).
Criteria 1: History, Development and Expectations of the Program
A. Please make a brief statement regarding the establishment of the program and its evolution.
B. Please make a brief statement regarding MSUM’s original expectations of the program and how the program has evolved to meet current expectations.
C. Please describe how the program has adapted to meet the expectations of students today. You may wish to include the number of courses outside prime time, the number of on-line courses, and the number of developmental courses offered (or allowed for in course planning).
D. You will be provided with:
a. Enrollment and major trends over a 5 year period
b. Persistence by program year one to year two
E. Please make a statement regarding the maturity of the program. To what extent does the program show potential for growth? What is the comparative advantage regionally?
F. Please make a statement regarding the adaptability of the program especially addressing: to what extent is the program adapted to the needs and expectations of today’s students?
G. Please make a statement regarding the congruence of the program to institutional expectations. To what extent does the program support broad institutional outcomes? Is the program an area that supports the vision of the institution?
H. You will be provided with the ratio of total courses to major courses, ratio of total courses to courses in service to other majors, and ratio of total courses to Dragon Core.