LIS Program Administration

ALA Accreditation

  • Lead author of LIS’s Plan for the Program Presentation, including the Introduction; Process for the Preparation of the Program Presentation; Accreditation Committees and Document Development; Constituent Group Involvement and Communication Strategies; Timeline for Preparing the Program Presentation (2013-2016); Proposed Layout of the Program Presentation; and Documentation sections, as well as the evidence lists for Standard I (Mission, Goals, and Objectives) and Standard V (Administration and Finance).
  • Lead author of LIS’s Program Presentation detailed chapter outlines and evidence lists for Standard I (Mission, Goals, and Objectives) and Standard V (Administration and Finance). The development of these evidence lists served as an informal analysis of LIS’s compliance with the 2008 ALA Accreditation Standards.
  • Draft author of LIS’s 2014 Biennial Narrative Report. This draft ensured LIS gathered formative feedback from the ALA’s Committee on Accreditation on LIS’s systematic planning and outcomes assessment plans prior to its comprehensive accreditation review.

Systematic Planning and Outcomes Assessment

  • Co-developer of LIS’s Annual Assessment/Planning Cycle, including its required Annual Report Template and Assessment/Planning Cycle Schedule. This program-level systematic planning framework aligns LIS’s decision-making processes with the ALA Accreditation Standards’ requirements to: assess its progress towards its goals and achievement of its objectives, engage all constituency groups, apply the results of assessment to program decision-making processes, communicate assessment results to program constituents, and ultimately facilitate continuous improvements to the program.
  • Co-developer of LIS’s Annual Outcomes Assessment Plan, including its Outcomes Assessment Map, Core Course Outcomes Assessment Report Template, and required Key Assessment Report and Curricular Improvement Report Template. This plan aligns LIS’s student learning outcomes assessment activities with the ALA Accreditation Standards’ requirements to: employ direct and indirect assessments of student learning, gather assessment results from all constituency groups, apply the results of assessment to program decision-making processes, communicate assessment results to program constituents, and ultimately facilitate continuous improvements to student learning.
  • Lead organizer of 2 student engagement sessions¹ between LIS students and program administration/faculty in summer 2014. Embedded mechanisms for broad-based, continuous engagement between LIS students and program administration to ensure this engagement is sustained under LIS’s new administration.
  • Co-reestablished the LIS Advisory Board² to provide strategic feedback and input from representatives of all of LIS’s constituency groups. Identified 5 of the 8 newly recruited Board members through professional connections and knowledge of leaders in the DC region. Co-developed the Board’s meeting agendas and presentations, its Charge and 2015-2016 Goals, and its Member Recruitment Plan.
  • Revised and aligned survey questions from LIS’s Current Student, Exit, Employer, Practicum Student, and Practicum Supervisor surveys³ to gather program-level indirect assessment data on LIS’s program objectives. Data collection from the Alumni, Exit, and Employer surveys was scheduled for AY2012-13 and 2013-14 (pp. 8-9).
  • Embedded open-ended questions in LIS’s Current Student, Exit, and Employer surveys³ to gather qualitative data on LIS’s program objectives and goals. These responses provide LIS with assessment results it can apply, as appropriate, to revisions of its program goals and objectives statements⁴.

Data Analysis

  • Lead author of LIS’s AY2013-14 Exit Survey (n = 29) and Fall 2013 and Spring 2014 Blended/OWL Surveys (n = 131) data analysis reports. This Exit Survey was the first to gather data on graduates’ perceptions of the comprehensive exam. These results were subsequently triangulated with results from the 2014 Alumni Survey to increase the reliability of the findings on graduates’ perceptions of the exam.
  • Author of LIS’s 2014 Alumni Survey³ (n = 294) preliminary data analysis report. Results pertaining to the comprehensive exam contributed to the formation of the Capstone Review Committee. The Committee’s goals included improving the exam’s assessment capabilities and student preparation, learning, and pass rates.
  • Led the analysis of multiple enrollment, graduation, and employment data sets, including the ALA Committee on Accreditation’s Trend Data; data from CUA’s Office of Financial Planning, Institutional Research and Assessment; the National Center for Education Statistics’ Digest of Education Statistics; and the Bureau of Labor Statistics’ Occupational Outlook Handbook. A targeted analysis of market-share enrollment trends based on degree specializations served as a catalyst for LIS’s decision-makers (composed exclusively of full-time faculty) to introduce two new courses of study, Community and Embedded Information Services and Information Analysis. These are the first courses of study introduced at LIS since 2009.

Survey Design and Coordination

  • Organized the submission of 2013 graduate responses to LIS’s 2014 Alumni Survey³ for publication in Library Journal’s 2014 Placement and Salaries Survey. Successfully co-advocated for the publication of program-level response rates in Library Journal to provide a clearer and more accurate assessment of graduate placement rates.
  • Co-coordinated the redesign, dissemination, collection, and archiving of data from LIS’s primary survey assessment tools, including the Annual Exit (n for AY2014-15 = 29), 2014 Alumni (n = 294), 2014 Current Student (n = 64), and 2015 Employer (n = 243) surveys³.
  • Embedded opportunities for open-ended responses throughout LIS’s Alumni, Blended/OWL, Current Student, Exit, Employer, Practicum Student, and Practicum Supervisor surveys³ to amplify constituent voices and gather targeted qualitative data on specific programmatic elements, easing data coding, analysis, and interpretation and the application of assessment results to program decision-making processes. Prior to these revisions, survey respondents were given a single opportunity to provide open-ended feedback, which required additional data coding and analysis and limited the number and length of comments provided.
  • Revised LIS’s Course Evaluation⁵ supplementary questions to gather targeted data on curriculum and instruction quality.

¹ In 2009, meetings between students and administration were scheduled to occur on a semiannual basis (pp. 48, 207). Prior to the summer 2014 engagement sessions, no such meetings had been held since 2009.

² The LIS Advisory Board was inactive from March 2012 to November 2014.

³ In 2009, the Alumni, Current Student, and Employer surveys were scheduled to be disseminated triennially, with the next round due in spring 2011 (pp. 48, 207). These surveys were not disseminated until the 2014-15 academic year.

⁴ Constituent engagement in revisions to the program’s vision, mission, goals, objectives, and competencies statements was scheduled to occur on a yearly basis, with the first round of substantive revisions scheduled for 2011 (pp. 6-7, 45, 50-53, 100). Since their adoption in 2007/2008, these statements have not been substantively revised (only 7 spelling corrections have been made).

⁵ Prior to these revisions, LIS’s supplementary questions had gathered redundant data for 5 semesters following the University’s Spring 2013 revision of its standard course evaluation questions.

