Data Archives - NWCCU

V7I1: Letter from the President

Student Learning Outcomes and Assessment-Based Education  

Sonny Ramaswamy, President, NWCCU

“Success consists of going from failure to failure without loss of enthusiasm.” – Winston Churchill

The mantra we read about in scholarly journals, newspapers, books, and other media, and hear about in panel discussions and keynotes at conferences, in hearings at the United States Congress, and during discussions at the United States Department of Education, is higher education’s critical role in promoting student success and closing equity gaps. In recent years, state funding has been tied to student outcomes via performance-based budget models.

In my experience, the recipe for promoting student success includes a suite of approaches that advance community and group interactions to ensure student achievement and outcomes. This includes focusing on the fundamentals of education and high-impact practices – intrusive advising, experiential learning opportunities, and the inculcation of technical and cognitive skills along with non-cognitive core competencies such as critical thinking, problem solving, and communication – combined with the single-minded use of data-informed predictive analytics and other digital tools. In addition to academic support, advancing student success will require that students, particularly those from marginalized and underserved communities, receive help with financial aid, just-in-time grants, supportive social networks, food, housing, child- and health-care support, and mental health counseling. These supportive efforts should be focused not just on students from particular racial or ethnic groups but on ALL students who need them, so as to promote access, belonging, and success, which can help further democratize higher education and contribute to the well-being of our democracy.

Postsecondary institutions that have made demonstrable and significant progress in promoting student success, particularly amongst students from marginalized and disadvantaged communities, take a purposeful approach that includes many, if not all, of the practices listed above.

We have learned in recent years that highly successful institutions use explicit, well-defined student learning outcomes, rubrics, and assessments, and that they treat learning outcomes and assessment as part of a continuous improvement process, i.e., the assessment cycle. This multistep, cyclical process includes, for example, identifying goals, designing processes, assessing, implementing, collecting and analyzing data, evaluating, synthesizing and sharing reports, and using the resulting knowledge to promote continuous improvement. These learning outcomes and assessments are undertaken at the course, programmatic, and institutional levels.

Student learning outcomes are the specified knowledge, skills, abilities, or attitudes that students are expected to attain by the end of a learning experience or program of study.

Assessments are a series of activities designed to test student competence in the learning outcomes of a course or program, while supporting students to improve their learning. Assessments are also used to inform instructors of the effectiveness of their teaching practices. There are three types of assessment: diagnostic (use of pre- and post-tests to evaluate student knowledge before and after going through a course or program), formative (use of ungraded or low-stakes activities), and summative (use of graded assignments).

Rubrics are statements that articulate expectations by listing criteria for what is particularly important and by describing levels of quality on a defined scale.
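
To make this definition concrete, here is a minimal sketch of a rubric represented as a small data structure, with two hypothetical criteria for a written communication outcome; the criteria, four-point scale, and descriptors are invented purely for illustration.

```python
# A rubric maps each criterion to descriptors of quality on a defined scale.
# The criteria and descriptors below are hypothetical examples.
rubric = {
    "Organization": {
        4: "Ideas follow a clear, logical structure throughout.",
        3: "Structure is mostly clear, with minor lapses.",
        2: "Structure is inconsistent and sometimes hard to follow.",
        1: "Little discernible organization.",
    },
    "Evidence": {
        4: "Claims are consistently supported with relevant evidence.",
        3: "Most claims are supported with relevant evidence.",
        2: "Evidence is sparse or only loosely relevant.",
        1: "Claims are largely unsupported.",
    },
}

def score_report(scores: dict[str, int]) -> None:
    """Print the descriptor that matches each awarded score."""
    for criterion, level in scores.items():
        print(f"{criterion} ({level}/4): {rubric[criterion][level]}")

score_report({"Organization": 3, "Evidence": 2})
```

However a rubric is represented, the essential elements are the same: named criteria and described levels of quality on a defined scale.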

Defining and using learning outcomes and assessment in support of student success efforts is becoming more common as institutions collect and use student learning outcomes data and assessment instruments for continuous improvement. Externally, these efforts have been driven by state and federal governments, accreditors, and other third-party interests; internally, by college and university administrators. What is needed, however, is more faculty engagement and involvement in establishing and assessing student learning outcomes.

The suboptimal faculty involvement in the use of learning outcomes and assessment reminded me of an experience I had a few years ago. This is the gist of my conversation, recreated from memory and with license, with a professor at an unnamed institution of higher learning about student learning outcomes and assessment:

Me: Do you define student learning outcomes and assess achievement of the same in the courses you teach?

Prof: Oh yeah, of course. During the very first class at the beginning of the term, in the course syllabus handout, I specify the course objectives, term paper and other requirements, and also provide the dates for quizzes, midterms, and final exams. I also inform the students I may include surprise quizzes.

Me: Hmm. How do you know the students have developed the knowledge required in your course and the discipline and that they’re able to think critically and solve problems, rather than rote-learn facts and technical content?

Prof: Hmph. I don’t worry if the students have gained the knowledge and can think critically. To me what’s important is they know the facts. Without the facts, they can’t go further.

Me: But …, aren’t you concerned they may not develop the depth of disciplinary knowledge that would allow them to be successful? And, how do you know if student learning is improving?

Prof: No, that’s not for me to be concerned about in my course. It’s an administrative requirement, which may be for the program coordinator, department chair, or dean to be concerned about. I am just too busy.

Me: Shouldn’t you as a professor be concerned about student learning and be able to compare year to year and with other sections of the course being taught by other professors?

Prof: Like I said, it’s for the administrators to be concerned with. I am here to teach my courses and do my scholarly research. I just do not have the time to do what you are talking about.

The above exchange is in many ways reflective of the disconnect between the boots-on-the-ground course instructor and institution-level efforts, driven by external requirements from accreditors and government agencies, to promote learning outcomes and assessment at the course and program levels.

Indeed, in querying data for our NWCCU member institutions that went through our Year-Seven comprehensive Evaluation of Institutional Effectiveness during the Fall 2022, Spring 2023, Fall 2023, and Spring 2024 peer evaluation seasons, approximately 75 percent received Recommendations related to Standards 1.C.5 and 1.C.7.

Specifically, NWCCU’s Standards for Accreditation related to Student Learning, which focus on student learning outcomes and assessment, include:

  • 1.C.1 The institution offers programs with appropriate content and rigor that are consistent with its mission, culminate in achievement of clearly identified student learning outcomes (emphasis added here and below in italics) that lead to collegiate-level degrees, certificates, or credentials and include designators consistent with program content in recognized fields of study.
  • 1.C.2 The institution awards credit, degrees, certificates, or credentials for programs that are based upon student learning and learning outcomes that offer an appropriate breadth, depth, sequencing, and synthesis of learning.
  • 1.C.3 The institution identifies and publishes expected program and degree learning outcomes for all degrees, certificates, and credentials. Information on expected student learning outcomes for all courses is provided to enrolled students.
  • 1.C.4 The institution’s admission and completion or graduation requirements are clearly defined, widely published, and easily accessible to students and the public.
  • 1.C.5 The institution engages in an effective system of assessment to evaluate the quality of learning in its programs. The institution recognizes the central role of faculty to establish curricula, assess student learning, and improve instructional programs.
  • 1.C.6 Consistent with its mission, the institution establishes and assesses, across all associate and bachelor level programs or within a General Education curriculum, institutional learning outcomes and/or core competencies. Examples of such learning outcomes and competencies include, but are not limited to, effective communication skills, global awareness, cultural sensitivity, scientific and quantitative reasoning, critical analysis and logical thinking, problem solving, and/or information literacy.
  • 1.C.7 The institution uses the results of its assessment efforts to inform academic and learning-support planning and practices to continuously improve student learning outcomes.

During the last few years, NWCCU has offered a series of workshops and webinars to help our institutions develop and deploy effective learning outcomes and assessment, such as the Spring 2023 webinar entitled Best Practices in Assessment and the Mission Fulfilment Fellowship, which is in its sixth year and has “graduated” over 150 Fellows. In addition, we have provided resources on demystifying our Standards related to assessment.

Going forward, NWCCU will create a series of coordinated efforts to help our member institutions meet or exceed the Standards related to student learning cited above, including workshops, webinars, training, feedback, mentoring, and other resources and materials.

For example, in the first of a series of such offerings planned over the next few years, Jillian Kinzie, Associate Director of the National Survey of Student Engagement at the Indiana University Center for Postsecondary Research, will present a day-long workshop for faculty and institutional leaders at NWCCU’s upcoming 2024 Annual Conference, November 20-22. Dr. Kinzie will focus on actionable approaches to program-level learning outcomes assessment, with an emphasis on equitable, student-centered assessment practices. The intent is to connect program-level assessment with both course-level and institution-level learning outcomes assessment (Standards 1.C.3, 1.C.5, 1.C.6) and to help teams design processes to meaningfully use assessment results for the improvement of student learning (Standard 1.C.7).

In addition, starting with this issue of The Beacon, our intent is to spotlight articles on student learning outcomes and assessment by national experts, along with best-practice and how-to articles written by representatives of NWCCU member institutions.

For example, in this issue of The Beacon, three institutions that participated in NWCCU’s Mission Fulfilment Fellowship (Shoreline Community College, College of Southern Nevada, and College of Southern Idaho) share their experiences in developing and deploying learning outcomes and assessment to improve student achievement and success on their respective campuses.

We hope that this combination of articles and workshops helps our member institutions develop learning outcomes- and assessment-informed approaches to promoting student success.

Postscript:

After I wrote my essay on student learning outcomes and assessment, I decided to query OpenAI’s GPT-4, a generative artificial intelligence (AI) application.

GPT-4’s response to my query “What should I take into consideration as I develop student learning outcomes, rubrics, and assessments?” is copied and pasted below verbatim:

When developing student learning outcomes (SLOs), rubrics, and assessments, there are several key considerations to ensure they are effective, meaningful, and aligned with both educational goals and student needs. Here’s what you should take into account:

  1. Alignment with Program Goals and Institutional Mission
  • Coherence: Ensure that SLOs align with the broader goals of the program, department, and institution. Each outcome should contribute to the overall educational objectives.
  • Curriculum Integration: Consider how each SLO fits within the course sequence or curriculum map. Outcomes should be integrated and scaffolded across courses to build skills progressively.
  2. Clarity and Precision
  • Specificity: SLOs should be clearly articulated, using precise language that leaves no room for ambiguity. Avoid vague terms like “understand” or “learn” and use action verbs that are observable and measurable, such as “analyze,” “design,” or “evaluate.”
  • Student-Centered: Frame outcomes from the student’s perspective, focusing on what the student will be able to do, rather than what the instructor will cover.
  3. Measurability
  • Assessable: Ensure that each SLO can be effectively measured through assessments. The outcomes should lend themselves to evaluation through exams, projects, presentations, etc.
  • Quantitative and Qualitative Measures: Consider using a mix of both quantitative (e.g., test scores) and qualitative (e.g., portfolio reviews) measures to capture the full range of student learning.
  4. Cognitive Complexity (Bloom’s Taxonomy)
  • Diverse Cognitive Levels: Use Bloom’s Taxonomy as a guide to ensure that SLOs cover a range of cognitive levels—from basic knowledge and comprehension to higher-order thinking skills like analysis, synthesis, and evaluation.
  • Appropriate Challenge: Ensure that outcomes are appropriately challenging for the level of the course or program, encouraging students to advance in their learning.
  5. Inclusivity and Equity
  • Accessibility: Ensure that SLOs and assessments are designed with all students in mind, considering different learning styles, abilities, and backgrounds. Avoid language or expectations that might exclude or disadvantage certain groups.
  • Universal Design for Learning (UDL): Incorporate UDL principles to create multiple means of engagement, representation, and expression, allowing all students to demonstrate their learning effectively.
  6. Rubric Development
  • Clear Criteria: Develop rubrics with clear, specific criteria that describe varying levels of performance (e.g., excellent, proficient, needs improvement). Each criterion should be directly tied to the SLOs.
  • Consistency and Fairness: Ensure that rubrics provide a consistent and fair method for evaluating student work. Rubrics should be applied uniformly across all students and assignments.
  • Transparency: Share rubrics with students in advance so they understand the expectations and can use them as a guide for their work.
  7. Variety of Assessment Methods
  • Multiple Modes: Use a variety of assessment methods (e.g., exams, projects, presentations, peer reviews) to capture different aspects of student learning and accommodate diverse learning styles.
  • Formative and Summative Assessments: Incorporate both formative assessments (ongoing, low-stakes assessments like quizzes or draft reviews) to monitor progress and summative assessments (final projects or exams) to evaluate overall achievement of SLOs.
  • Authentic Assessments: Whenever possible, use assessments that simulate real-world tasks or challenges relevant to the discipline, providing meaningful context for students.
  8. Feedback and Improvement
  • Actionable Feedback: Design assessments that provide students with meaningful, actionable feedback that they can use to improve their learning and performance.
  • Continuous Improvement: Use assessment results to refine and improve SLOs, rubrics, and teaching practices. Assessment should be part of a continuous feedback loop that enhances the educational experience.
  9. Scalability and Manageability
  • Feasibility: Ensure that the number of SLOs and the associated assessments are manageable for both instructors and students. Overly complex or numerous outcomes can lead to assessment fatigue and diminish the quality of evaluation.
  • Resource Availability: Consider the resources available, such as time, technology, and support services, when developing assessments. Ensure that the chosen methods are sustainable over time.
  10. Stakeholder Involvement
  • Faculty Collaboration: Engage faculty in the development and review process to ensure that SLOs and assessments reflect collective expertise and are consistent across the program.
  • Student Input: Consider gathering feedback from students on assessments and rubrics to understand their perspective and make adjustments as needed.
  • Accreditation and Compliance: Align SLOs and assessments with accreditation standards and institutional requirements, ensuring that your program meets external expectations.

By taking these factors into consideration, you can develop robust and effective student learning outcomes, rubrics, and assessments that enhance student learning and contribute to continuous improvement in your educational program.

In going through this exercise, I was impressed with GPT-4’s response, which is quite similar to what human experts would recommend.

Our higher education institutions have a critical role to play in helping students realize their aspirations through inclusive curricula and opportunities that, instead of drilling and killing, help students thrive.

In the 1995 film Apollo 13, Ed Harris, the actor playing NASA flight director Gene Kranz, says, “Failure is not an option.” To me, that line is apt in the present fraught context, where the public is divided on the value of education. By being laser-focused on promoting student success, facilitated by the use of effective learning outcomes and assessment instruments, our colleges and universities can ensure that failure is not an option.

V6I3: Peer Comparators and Student Success

Ed Harri, Senior Vice President, NWCCU  

As a staff liaison to a number of NWCCU member institutions, I have the opportunity to regularly observe the impressive work our member institutions undertake to advance student success. Many have been inspired by peer institutions that are effectively tackling similar challenges and demonstrating improvements in their student outcomes. NWCCU strongly believes in the power of peer learning for institutional continuous improvement. It is a major contributor to developing clear and coherent strategies for improving student outcomes and closing gaps in achievement between different student populations.

NWCCU’s 2020 Standards for Accreditation 1.D.2 and 1.D.3 reflect these commitments to the use of disaggregated student achievement data, benchmarked against institutionally identified peers, to support continuous improvement.

Challenges  

Since adoption of the Standards, approximately half of the institutions (39 of 80) that completed their Year-Seven Evaluation of Institutional Effectiveness received a Recommendation on either Standard 1.D.2 or 1.D.3. A Recommendation refers either to the evaluation team’s final statements or to its confidential advice to NWCCU regarding an institution’s accreditation or candidacy status.

Recommendations related specifically to Standards 1.D.2 or 1.D.3 are the result of a number of factors, including but not limited to lack of evidence for:  

  • Use of disaggregated data for regular planning and review processes. 
  • Benchmarking against regional and national peers. 
  • Public posting of student achievement outcomes. 

While some of these deficiencies become evident through the institutional self-study and the peer evaluation visit and report, institutions and peer evaluators have indicated that they struggle to interpret elements of Standards 1.D.2 and 1.D.3, specifically the selection of peers, the requirements for disaggregation of data, and the presentation and use of data. This makes it challenging for institutions to effectively demonstrate and report on the practices they use to address the Standards. Consequently, it becomes difficult for peer evaluation teams to evaluate these practices in the report and visit.

Guidance  

In an effort to provide institutions with the support and clarity that enable them to move forward productively on what we consider critical Standards for demonstrating continuous improvement, NWCCU has created guidance for Standards 1.D.2 and 1.D.3. The guidance describes the purpose of these specific Standards. This includes ensuring that institutions:

  • Define and use relevant student achievement indicators for planning, resource allocation, and improvement.  
  • Disaggregate their own student achievement data to identify equity gaps and inform strategies for closing gaps.  
  • Identify peer institutions and benchmark student achievement data against them to gauge the effectiveness of student achievement efforts. 
  • Publicize their student achievement indicators on their website for prospective students, faculty and staff, members of the public, and other stakeholders. 

Frequently asked questions related to these Standards are also addressed in the guidance document, including the number of peer institutions to be identified, best practices in selecting peers, requirements for disaggregation of data, and presenting and using data.  
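
To illustrate what disaggregation and gap identification can look like in practice, here is a minimal pandas sketch, assuming a hypothetical student-level table; the file name, column names, and groupings are illustrative and are not prescribed by Standards 1.D.2 or 1.D.3.

```python
import pandas as pd

# Hypothetical student-level records; the columns are illustrative only.
students = pd.read_csv("student_records.csv")  # race_ethnicity, pell_eligible,
                                               # completed_in_150pct (0/1)

# Disaggregate a completion indicator by race/ethnicity and Pell eligibility.
rates = (students
         .groupby(["race_ethnicity", "pell_eligible"])["completed_in_150pct"]
         .mean()
         .mul(100)
         .round(1)
         .rename("completion_rate_pct"))

# A simple equity-gap view: each group's rate minus the overall rate.
overall = students["completed_in_150pct"].mean() * 100
gaps = (rates - overall).round(1).rename("gap_vs_overall_pct")
print(pd.concat([rates, gaps], axis=1))
```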

The NWCCU Board of Commissioners discussed and provided input on the guidance during the Commission meeting in January 2024. The guidance was then emailed in February 2024 to the chief executive officers, academic officers, and accreditation liaison officers (ALOs) of NWCCU member institutions. In addition to discussing the guidance during the ALO Townhall in February 2024, we will incorporate it into peer evaluator refresher training sessions to be held during Summer 2024.

Our goal is for evaluators to be prepared to ask institutions questions related to this guidance starting with the Fall 2024 peer evaluation cycle, as envisioned in NWCCU’s Mission and Vision to promote the access, belonging, and success of all students. In doing so, our intent is to advance data-informed peer evaluation, which incorporates the use of institutional data as a formative, continuous improvement approach to promoting student achievement and success.

Please note that these efforts will be further supported by the institutional data dashboards being developed and deployed by NWCCU’s Vice President of Data Science and Analytics, Jess Stahl. Dr. Stahl’s article describes these dashboards in more detail. 

Institutional Examples 

A number of institutions are already demonstrating strengths in their use and presentation of disaggregated data. Examples include:  

  • Saint Martin’s University: SMU has publicly available dashboards comparing their student achievement outcomes to their identified peer institutions.  
  • University of Western States: UWS took the lead in developing a data-sharing agreement with other graduate health professions institutions and uses these data to compare student success and progress.  
  • Whatcom Community College: WCC has an extensive history of comprehensive college, program, and course-level disaggregated student achievement and outcomes assessment data publicly available in various forms. 
  • Western Washington University: WWU has recently updated its public-facing dashboards to provide years of data on enrollment, graduation, and retention compared to different peer groups, including aspirational peers. 

We anticipate that institutions will be able to further demonstrate their efforts on these Standards in the near future, leading to more examples to share with our members.  

Please continue to share feedback and your own examples of data presentation and peer comparisons with your staff liaison. We have been using your feedback to provide this improved guidance, which should only help to strengthen our commitment to these Standards and institutional demonstrations of improved student outcomes.

V6I3: Update on NWCCU’s Data Dashboard and Decision Support System

Jess Stahl, Vice President for Data Science and Analytics, NWCCU  

As the Spring 2024 accreditation cycle begins, I want to share some exciting updates about NWCCU’s decision support system and how it supports NWCCU’s mission to apply data- and evidence-informed standards and processes in support of continuous improvement and equitable student achievement and success.

Over the past several years, we have been continuously improving our data systems and processes. Since 2020, we have developed and implemented a robust, cloud-based data warehouse and created a core set of dashboards with key indicators related to enrollment, outcomes, and basic financial data to support our staff liaisons, peer evaluators, and Board of Commissioners. In Fall 2023, we began to publish the dashboards on our website, along with an annual report based on data from the National Advisory Committee on Institutional Quality and Integrity (NACIQI) that provides a holistic overview of our member institutions.

Data dashboards found on NWCCU’s website.

You may have noticed that we refer to our data system as a “decision support system,” which naturally raises the question: what kinds of decisions is this data system used to support? I want to reassure you that we do not use our dashboards or other data tools to make automated or solely “data-driven” decisions regarding accreditation. Rather, we use our data system to gather relevant data from multiple sources across the postsecondary data ecosystem, synthesize those data so that we can use them in ways that are meaningful for accreditation purposes, and present the information to our stakeholders in ways that are useful and productive. In other words, our decision support system integrates the wealth of available data and transforms it into “user-friendly” visual and written formats so that our stakeholders can make use of the relevant information in support of our mission to be data- and evidence-informed.

The most essential data sources that we draw upon are (1) primarily, the data provided directly by member institutions (i.e., your institution’s reports and Annual Report) and (2) secondarily, the Integrated Postsecondary Education Data System (IPEDS) from the U.S. National Center for Education Statistics (NCES), which is part of the Institute of Education Sciences (IES) within the U.S. Department of Education (ED).

The dashboards presented on our website are based on the following IPEDS data:  

  • Enrollment: total enrollment with five-year % change; part-time, full-time, graduate, and undergraduate FTE Fall enrollment with five-year % change. 
  • Graduation Rate: total cohort; bachelor’s degree within 100%, 150%, and 200% of normal time; degree/certificate within 100%, 150%, and 200% of normal time; by Pell status and gender (within six years); and transfer-out rate.

Starting in Spring 2024, we are beginning to include, on an experimental basis for institutions undergoing the Year-Seven Evaluation of Institutional Effectiveness (EIE), benchmarking (median) data based on Carnegie Classification (i.e., “C18Basic”) over a five-year period (2017-2021) for Total Enrollment, FTE Fall Enrollment, Graduation Rate Total Cohort, Bachelor’s Degree within 200% of normal time, and Degree/Certificate within 200% of normal time. The data, based solely on Carnegie Classification, are categorized as “National Peers” (all institutions nationally with the same Carnegie Classification) or “NWCCU Peers” (all NWCCU-accredited institutions with the same Carnegie Classification). In the future, benchmarking data provided will include the peer sets identified in the IPEDS Data Feedback Report (i.e., “Comparison Group,” which is either defined by the institution as a custom comparison group or, if none is provided by the institution, selected by NCES) and the institution’s self-identified (national and regional) peers.
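
To make the benchmarking computation concrete, here is a minimal sketch, assuming a hypothetical IPEDS extract loaded into a pandas DataFrame; the file name, column names, and the Carnegie code in the example call are illustrative, not NWCCU’s actual pipeline.

```python
import pandas as pd

# Hypothetical IPEDS extract: one row per institution per year (2017-2021).
# Column names are illustrative, not NWCCU's actual schema.
ipeds = pd.read_csv("ipeds_extract.csv")  # unitid, year, c18basic, nwccu_member,
                                          # total_enrollment, fte_fall_enrollment,
                                          # grad_rate_total_cohort

def peer_medians(df: pd.DataFrame, carnegie_class: int) -> pd.DataFrame:
    """Median benchmarks for one Carnegie Classification (C18Basic) group."""
    in_window = df["year"].between(2017, 2021)
    same_class = df["c18basic"] == carnegie_class
    indicators = ["total_enrollment", "fte_fall_enrollment",
                  "grad_rate_total_cohort"]

    national = df[in_window & same_class][indicators].median()  # National Peers
    nwccu = df[in_window & same_class & df["nwccu_member"]][indicators].median()
    return pd.DataFrame({"National Peers": national, "NWCCU Peers": nwccu})

print(peer_medians(ipeds, carnegie_class=15))  # 15 is an illustrative C18Basic code
```

One plausible reason for reporting medians rather than means is that a handful of very large institutions would otherwise dominate the benchmark.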

Also, recognizing that many students undertake postsecondary education specifically as a means to attain upward economic mobility, we will consider additional data, such as post-college earnings, provided via the Equitable Value Explorer (EVE) developed by the Postsecondary Value Commission. Specifically, we will provide institutions undergoing an EIE and their peer evaluators with the following EVE data: completion rate, time to credential, earnings 10 years after initial enrollment, cumulative net price, price-to-earnings premium, and default rate. The price-to-earnings premium (when available) aims to measure the number of years it takes students to recoup the net cost of earning a credential at the institution, using a formula that divides “total average net price” by (“post-enrollment earnings” minus “typical salary of a high school graduate in the institution’s state”).
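
To make the price-to-earnings premium formula concrete, here is a minimal sketch with invented numbers; the function name and example figures are purely for demonstration.

```python
def price_to_earnings_premium(total_avg_net_price: float,
                              post_enrollment_earnings: float,
                              hs_grad_salary_in_state: float) -> float:
    """Years needed to recoup the net cost of a credential.

    Divides total average net price by the earnings premium, i.e.,
    post-enrollment earnings minus the typical salary of a high school
    graduate in the institution's state.
    """
    earnings_premium = post_enrollment_earnings - hs_grad_salary_in_state
    return total_avg_net_price / earnings_premium

# Invented example: $40,000 net price; $50,000 earnings; $32,000 typical
# in-state high school graduate salary -> about 2.2 years to recoup.
print(round(price_to_earnings_premium(40_000, 50_000, 32_000), 1))  # 2.2
```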

In addition to receiving access to the online dashboards, each institution undergoing an EIE in the Spring 2024 cycle and their peer evaluators have been provided a workshop, access to a custom data analysis of key indicators as a starting point for data-informed dialogue, and data consultation. We are also pleased to provide our stakeholders, including the public, access to meaningful institutional data. 

V6I3: NWCCU Vice President Receives $1.25M Grant

Jess Stahl, NWCCU Vice President of Data Science and Analytics, received a $1.25M grant from the Gates Foundation to improve access to postsecondary outcomes data.   

This grant will enable NWCCU to continue supporting our member institutions to engage in data-informed approaches to improve equitable outcomes and facilitate exploration of the Postsecondary Data Partnership (PDP), a national student-level data system, among interested NWCCU member institutions.  

Dr. Stahl’s work will focus on enriching the value of the PDP for our member institutions and providing greater access to important research insights in partnership with the National Student Clearinghouse via modern privacy-enhancing technology and methods.   

As part of this effort, she will continue to grow the Data Equity Fellowship, which launched last year with an inaugural cohort of ~200 fellows across 80 institutions nationally. As a complement to the Data Equity Fellowship, she will be launching, in partnership with SACSCOC, a “data-informed decision-making simulation app” (tentatively code-named DataVersity) along with a national “friendly competition” within the app to improve equitable outcomes through simulated evidence-based actions based on relevant data.

V6I1: NWCCU’s Data Equity Fellowship

Jess Stahl, Vice President, NWCCU

In August 2022, a pioneering group of 136 professionals from 55 institutions across 17 U.S. states and British Columbia embarked on a 12-month professional development journey as the inaugural cohort of the Data Equity Fellowship. The Data Equity Fellowship was offered in partnership with the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC), so Data Equity Fellows were able to learn from national (and international) colleagues. The Fellowship is designed as a virtual, 100% online experience so that geographic location does not present a barrier to participation, particularly for institutions in rural locations or with limited travel funding or limited time available for travel. Fellowship teams therefore represented diverse institutions: R1 universities, community and technical colleges, health professions-focused institutions, HBCUs, TCUs, HSIs, and faith-based, rural, and urban institutions.

The Data Equity Fellowship was inspired by recommendations from the national experts convened by NWCCU in 2021 as the Data Council (chaired by Dr. Timothy Renick, Executive Director of the National Institute for Student Success at Georgia State University) and its counterpart, the Equity Council (chaired by Dr. Tia Brown McNair, Vice President in the Office of Diversity, Equity, and Student Success and Executive Director for the Truth, Racial Healing, and Transformation Campus Centers at the American Association of Colleges and Universities). These recommendations centered on the need for facilitated, collaborative experiences bridging the domains of “data” and “equity.” The Data Equity Fellowship provides exactly that type of professional development experience: Fellowship teams, each composed of at least one professional whose role centers on “data” and at least one whose role centers on “equity,” come together over the course of 12 months to:

  • Learn about “data equity” – not to be confused with “using data to support equity initiatives,” which is a closely related but different concept (also covered in the Data Equity Fellowship).
  • Evaluate data to identify the area of focus for a data-informed project to improve equitable outcomes that is well-aligned with their institution’s unique mission and institutional goals.
  • Engage in a facilitated planning process, rooted in evidence-based data equity principles, that uses a data equity framework to plan the team project.
  • Implement a team project, including planning to sustain and scale the impact, and report on the outcomes and lessons learned.

Data Equity Fellowship teams varied in size with the largest team size being 7 team members and an average team size of 3-4 members. The professional roles held by Data Equity Fellows at their institution were extremely diverse and included Assistant Provost, Associate Vice Chancellor, Chief Diversity Officer, Vice President, Executive Director, Dean, Business Intelligence Analyst, Research & Data Analyst, Accreditation Liaison, Program Manager, Research Associate, and Professor across areas like Institutional Research/Assessment/Effectiveness, Administration & Finance, Diversity, Equity, & Inclusion, Equity & Compliance, Curriculum Programs, Process Improvement, Student Engagement & Success, Academic Instruction, Analytics & Decision Support, Alaska Native Education & Outreach, Workforce Program Research & Development, HR & Strategic Engagement, Academic Affairs, and Student Affairs.

Data Equity Fellows engaged in Communities of Practice, which were self-selected, peer-led groups with a specialized focus based directly on the Fellows’ interests, including best practices for addressing equity gaps, collaborating with faculty to address equity gaps, collecting and using qualitative data, DEI in community and technical colleges, DEI in health professions-focused institutions, DEI in minority serving institutions, DEI in predominantly white institutions, disaggregating data, engaging in courageous conversations, and methodologies for identifying equity gaps. Communities of Practice came together on a monthly basis to share insights, resources, support, project ideas, and feedback based on their common goals and interests. A key learning from the Communities of Practice was that everyone has expertise and insights that can benefit the whole community, regardless of job title or the number of years of experience in a specific role, because each person has an important perspective to share that can increase the field of possibilities for creativity and novel ideas for problem-solving in challenging areas.

The Fellowship projects centered on a wide variety of evidence-based and high impact areas like targeted initiatives that address equity gaps (identified using disaggregated institutional data); inclusion of equitable data in institutional mission, vision, and strategy; collaborative improvements to student advising to improve student retention and completion; initiatives developed in collaboration with faculty using data to identify and address equity gaps; and equity in employee recruitment and hiring, including data-informed policies, processes, and training. Based on their institution’s level of data maturity and culture of using data effectively to support equitable outcomes, some Fellowship teams focused on building institutional data capacity through initiatives centered on developing key institutional data systems and processes; providing or expanding appropriate access to disaggregated data; and building institutional capacity for analysis of disaggregated data to identify areas of inequitable outcomes.

The Data Equity Fellows also ventured into the metaverse in the last months of the Fellowship! Taking advantage of the Fellowship’s virtual structure, Fellowship teams presented their data equity projects in the metaverse, which almost none of the Fellows had previously experienced. It was exciting to explore a new online environment and everyone quickly learned how to move through the space, chat, and connect via livestream video with each other and the Fellowship teams hosting virtual “poster sessions.”

You may have seen colleagues sharing their (well-earned) badges and/or certificates from the Data Equity Fellowship. As graduates of the 2022/23 Data Equity Fellowship, the Fellows have joined the largest data equity community for postsecondary professionals, the Data Equity Fellowship Alumni Network, where they have the opportunity to continue sharing knowledge, ideas and insights, questions and feedback, events, career growth and transitions, professional opportunities, resources, success stories, and support, and to maintain long-term connections while making a positive impact for students.

The 2023/24 cohort of the Data Equity Fellowship is being offered in partnership with SACSCOC, UNCF, and Growing Inland Achievement. For more information, please visit the Data Equity Fellowship page or contact the Data Equity Fellowship Director, Dr. Jess Stahl (Vice President, Data Science & Analytics).

V6I1: Strategic Data Dashboard Use to Enhance System and Institutional Effectiveness

Rob Garrett and Nathan Lindsay; Brigham Young University-Idaho and Church Educational System of The Church of Jesus Christ of Latter-day Saints

For decades, institutions and systems within higher education have been working to make more data-informed decisions. This is true of the Church Educational System (CES) for The Church of Jesus Christ of Latter-day Saints, which consists of Brigham Young University, BYU-Idaho, BYU-Hawaii, BYU-Pathway Worldwide, Ensign College, and Seminaries and Institutes of Religion. Taken together, these CES entities enroll nearly 1 million students. We have a unified mission and are striving to work as a system. Given the size and complexity of the system, it has been imperative that data be organized and used strategically and effectively. This article outlines the principles that guide data use in CES and then highlights the data dashboard for the system, including an institutional example from BYU-Idaho.

Principles Guiding Data Use

Four principles have been important in strengthening each institution’s capacity to integrate research, data, and an information infrastructure to assist in decision-making. These are as follows:

1) Anchor on your purpose:

Use metrics that align with and support the mission and strategy. Some data are crucial, while other data matter little. Thus, it is essential to distinguish up front which data points are key to the institution’s or system’s purpose. In order to use data this way, the mission and strategy of an organization must be clear and actionable.

2) Identify key drivers and constraints:

Identify metrics that amplify purpose and constrain tradeoffs. Understanding whether the mission and strategy are realized is key to using data. Therefore, using data that elevates mission and constrains tradeoffs helps prevent organizational drift.

3) Do not overanalyze or over-index:

Like a car dashboard, an institutional dashboard should provide simple points of data at a glance. It is easiest to keep the dashboard on one page. If the high-level data pointing to mission and strategy signal the need for more analysis, engage in deeper data activities using other tools.

4) Adapt your culture:

Using dashboards and data tied to mission and purpose needs to become part of the organizational culture. The data are not the decision makers, but the data must inform decision makers. Therefore, in the organizational culture, find the appropriate place or processes by which to use data to make decisions.

CES Dashboard

Over the course of this past year, these principles have provided a strong foundation for the development of a system-wide dashboard. A CES Institutional Research committee, consisting of representatives from each of the institutions, met monthly to align on which metrics to use and how to define each of these data points. As shown in the table below, the metrics focus on three key areas: Raise the Quality, Lower the Cost, and Increase the Reach. As one example, Lower the Cost metrics include tracking institutional credits, total credits, cost per FTE, and cost per graduate.
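
As a minimal illustration of the “Lower the Cost” metrics named above, the sketch below computes cost per FTE and cost per graduate; the inputs are invented for demonstration and do not reflect actual CES data.

```python
def cost_per_fte(total_cost: float, fte_enrollment: float) -> float:
    """Total instructional cost divided by full-time-equivalent enrollment."""
    return total_cost / fte_enrollment

def cost_per_graduate(total_cost: float, graduates: int) -> float:
    """Total instructional cost divided by the number of graduates."""
    return total_cost / graduates

# Invented inputs, purely for demonstration.
print(cost_per_fte(120_000_000, 20_000))      # -> 6000.0 per FTE
print(cost_per_graduate(120_000_000, 4_000))  # -> 30000.0 per graduate
```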

The specific data for each of these metrics are not provided in this article, given that they are primarily intended for internal purposes. The primary intention of the dashboard is not to compare schools, because each has differentiated strategies, but rather to use the data for triangulation and discussion on these topics. Toward that end, it should be noted that the data from these areas are being used by campus presidents and their leadership teams to guide institutional strategy and decisions. In addition to this first page of the dashboard, there is a second page in which each school can share institution-specific metrics, particular to their campus. Each fall the CES dashboard is updated. 

BYU-Idaho’s Data Dashboard

At a more local level, the BYU-Idaho Institutional Dashboard aligns directly with the Mission & Core Themes of the university. The university’s mission is to develop disciples of Jesus Christ who are leaders in their homes, the Church, and their communities. The supporting core themes are to build testimonies of the restored gospel of Jesus Christ, provide high-quality education, serve as many students as possible, and deliver education that is affordable. The Mission & Core Themes state the purpose of BYU-Idaho.

Using data that point to the Mission & Core Themes provides a window into the areas that matter most at BYU-Idaho. For example, the dashboard includes measures of spirituality, learning, student achievement, enrollment, and cost.

As shown in the table below, specific metrics were selected for the dashboard that help the university navigate the natural tensions of delivering high-quality education to as many students as possible while keeping the offerings affordable. The dashboard also provides important insights that not only support the purpose but also bring to light possible strategic drift. To illustrate, presenting data on teaching and learning activities or student access signals what is important to the university and shows whether desired strategic outcomes are being met.

This focus on high-level institutional purposes and the related selected data allows the university to avoid overanalyzing or over-indexing. Instead, the data provide a window into what really matters for the university. If further investigation is needed, other data tools are used to analyze deeper issues. Disaggregated data are utilized in this approach for analysis and decision-making. This method then connects data analysis efforts to the dashboard and institutional purpose. Using data in this manner allows for simple dashboards that signal whether further investigation is needed, just like a car dashboard.

The dashboard has helped with facilitating needed change. To illustrate, data points related to student achievement caused the university to focus on and understand more about students’ experiences and chances for completion. After viewing the data and associated analysis, we recognized that at-risk students did not have a strong path to graduation. For example, the data showed that our retake policy was hurting students. To respond to these issues, practices and policies were changed to better support students. These changes included an updated retake policy that uses the most recent grade in the student’s GPA calculation for all retakes. These changes also included academic renewal and support for the most academically at-risk students. In this example, the data from the dashboard and reports brought many people together from across the university to make changes that will better serve students.
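
As a minimal sketch of the updated retake policy described above, the GPA calculation below keeps only the most recent grade for each retaken course; the records, grade points, and function are hypothetical illustrations, not BYU-Idaho’s actual system.

```python
# Each record: (course_id, term, letter_grade); a later term means a more
# recent attempt. Grade points follow a standard 4.0 scale for illustration.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa_most_recent_attempt(records: list[tuple[str, int, str]]) -> float:
    """GPA that counts only the most recent grade for each course."""
    latest: dict[str, tuple[int, str]] = {}
    for course, term, grade in records:
        if course not in latest or term > latest[course][0]:
            latest[course] = (term, grade)
    points = [GRADE_POINTS[grade] for _, grade in latest.values()]
    return sum(points) / len(points)

# A student who failed MATH110, then retook it: the F no longer counts.
records = [("MATH110", 1, "F"), ("ENG101", 1, "B"), ("MATH110", 2, "A")]
print(gpa_most_recent_attempt(records))  # -> 3.5 rather than 2.33 with the F
```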

These changes also provided another point of disaggregation for at-risk students. More importantly, those policy changes created a realistic path to completion without undermining graduation requirements. The visibility of students’ circumstances allows for ongoing dialogue and support measures. For instance, an academic department recently changed its curriculum once data made the students within the department visible to academic leaders and others.

Dashboard efforts are never finished and are constantly evaluated. With the improvement of data analysis tools, the university is looking to update the dashboard framework. The metrics used for the dashboard are also constantly reviewed. Just recently, the academic office requested a change to include the actual development period, instead of only approved plans, to bring better insight into executing faculty development plans. That request was approved, and the new data replaced an outdated and less accurate data point. Dashboarding efforts have made a difference at BYU-Idaho and will continue to be refined as the university moves forward to accomplish its mission.

Conclusion

As outlined in this article, the development of a system-wide dashboard (based on foundational guiding principles) has helped CES and its individual institutions prioritize their strategies and shape decision making. We are working to strengthen institutional research through the sharing of common metrics and best practices across the system. Having strong dashboards certainly provides support for accreditation efforts at each institution. When institutional strategy is clear and data are used effectively, institutions are much more likely to achieve mission fulfillment.

In addition, our hope is that this article provides input to others on the use of disaggregated data dashboards for continuous improvement to inform planning, decision making, and allocation of resources, aligned with the Northwest Commission on Colleges and Universities Standards for Accreditation. Such data and indicators can also be used to benchmark against peer institutions at the regional and national levels and be used to promote student success.

V4I2: NWCCU Institutional Data Capacity Survey Part II


By: Dr. Jess Stahl, Vice President, Data Science & Analytics (NWCCU)

NWCCU Institutional Data Capacity Survey (May 2020)
Part II: Data Staffing, Governance, and Ethics

[Part I: Institutional Profiles and Technical Resources appeared in Issue V3I4 of the Beacon]

In May 2020, NWCCU member institutions responded to a survey from the NWCCU entitled “Institutional Data Capacity Survey” (IDCS), which was designed to provide insight into the capacity (e.g., data, technology, and resources) of institutions throughout our region to improve equitable outcomes through evidence-based approaches to meet the NWCCU 2020 Standards for Accreditation. Specifically, the standards (Standard One, Student Success, and Institutional Mission and Effectiveness: 1.D.2, 1.D.3, and 1.D.4) require institutions to report disaggregated indicators of student achievement and engage in regional and national benchmarking using transparent methods for data collection and analysis to mitigate gaps in achievement and equity.

Thus, the IDCS covered a wide range of domains, including technical resources, data staffing, data governance, and data reporting, to inform our strategic planning about how best to support our member institutions. This article is the second in a three-part series discussing the key highlights of the survey results and our relevant initiatives. It presents a summary of data staffing, governance, and ethics at NWCCU institutions as reported in the IDCS survey.

Data Staffing

The most prevalent data role is “Data Analyst” by a large margin. It is notable that 23% of institutions reported having “None” of the data roles presented within the survey, whereas 21% reported having three or more data roles, which suggests an institutional data capacity disparity among our institutions. Similar findings across multiple sections of the IDCS inspired our “PDP Accelerator” initiative to actively facilitate increased data capacity for NWCCU institutions through access to high-quality data dashboards for exploring intersectional, disaggregated data in support of data-informed approaches to improving outcomes and eliminating equity gaps.


* Totals sum to more than 100% because many institutions reported having more than one role.

The majority (75%) of institutions reported that they would benefit from having greater FTE dedicated to working with data.

When asked what additional resources would be ideal, the most widely cited resource was additional Data Analysts. Several institutions indicated that they currently lack a single full-time Institutional Research professional. Institutions also expressed a need for additional staff with specific expertise, such as qualitative research analysts, statisticians, data scientists, SQL programmers, predictive modeling experts, application developers, data visualization experts, database administrators, data coaches, assessment specialists, big data experts, data engineers, and data literacy advocates. Responses indicated that data staffing would ideally increase by at least one additional FTE to meet institutions’ stated goals of being data-informed.

The majority of institutions (66%) indicated that their institution would benefit from additional expertise in working with data. The most desired expertise (by a wide margin) was predictive modeling. Other technical skills mentioned were data science, data engineering, data visualization, and SQL programming. However, institutions also mentioned non-technical skills, like promoting the meaningful use of data and effective communication. Many noted that it is challenging to remain current with technical skills given a lack of resources and busy workloads that leave little time for professional development.

Institutional Research is the role with the most responsibility (90%) for interpreting data and the results of analyses. Information Technology staff are also involved in interpreting data and results of analyses at 25% of institutions, in collaboration with Institutional Research or other staff. At nearly 40% of institutions, a wide variety of other roles were associated with interpreting data, such as Institutional Effectiveness, Provost, Human Resources, Finance, Enrollment Services, Assessment, Data Science, Student Services, Accreditation Liaison Officers, Deans, Registrar, President, Vice President, academic centers and units, and Faculty, which suggests dedication and progress towards building a culture of meaningful data use and promoting data literacy. However, it is important to note that 47% of institutions reported having two or more roles associated with interpreting data and results of analyses, while 44% reported that Institutional Research alone bears this responsibility. Furthermore, 3% of institutions reported having no roles associated with interpreting data and results of analyses.

* Totals sum to more than 100% because many institutions reported having more than one role.

Information Technology (IT) staff control access (permissions) to databases and data warehouse(s) at most (91%) institutions. At about 30% of institutions, this is the only role IT plays related to institutional research and assessment. However, at more than half of institutions, IT staff also provide Excel files for reports or design custom SQL queries to generate reports. At nearly 20% of institutions, IT staff create data dashboards, analyze data, and generate reports for data stakeholders. At about 40% of institutions, IT staff fulfill three or more of these roles. Many responses emphasized a close working relationship between IT staff and Institutional Research.

Data Governance and Ethics

There are data governance structures in place at 63% of our institutions, but very few (14%) have implemented data ethics structures. While 14% of institutions have both, 34% reported having neither. Only 32% of institutions reported having a data catalog, which suggests varying levels of data maturity among our institutions.

Data governance is most often directed by a committee (52%), a department/office (6%), or a working group (3%), and is most often managed by IT, Institutional Research, or both (co-managed). At two institutions, data governance is directed by an individual.

Data ethics is most often directed by a committee or a department/office (3% each). Other structures include university policies or data stewards (i.e., individuals who direct data ethics).

In our next issue, we will discuss IDCS results regarding investment in data analytics, data tools and methods (e.g., data visualization, predictive modeling, big data, and artificial intelligence).

Bonus Section:

In Your Own Words (What additional resources would be ideal?)…

“We would benefit from extra personnel with knowledge/skill in increasing data literacy, programming, data engineering, and data visualization and analytics.”

“More people are just part of the solution. We also need training and improvements to internal processes and structures at the University (not one particular unit).”

“Data analyst dedicated to data governance. An application developer in IT to refine applications and improve data collection.”

“At a minimum, one full-time staff member with a highly flexible skill set. Perhaps an analyst with the ability to parse data and validate theories for how best to improve our business processes; a data manager who can design queries and perform other data extraction and transformation; or an assistant who reports results to format tables, make plots, and perform basic tasks of uploading and downloading files. In addition, we would benefit from having additional FTE dedicated to creating and maintaining tools that allow college personnel to perform a number of queries that could be pre-set in order to save IR time for genuinely complicated problems. Additional FTE could also be dedicated to the overhaul, storage, and cleaning of data at all levels. As stated previously, the data submission process would benefit from software that takes [SIS] data and converts it for external reports.

The current model is that IR will provide for the analysis needs of a diverse group of clients, internal and external, and across all programs and college units. The challenge is that clients must be more informed users of data because it is not enough to rely on the knowledge of IR staff. To create a working culture of evidence and data-informed decisions, the client side must have sufficient understanding to actively participate in the research/analysis process because this allows them to learn from the data in their program or area and have more ability to engage and interpret it deeply. The client role should not be viewed one-dimensionally as a question-forming process, with the expectation that IR provide answers that conform to client expectations. While IR works within that client-centered framework to create accessible reports, that approach is insufficient in many research contexts where questions are not well-suited to automated, self-service dashboards, and often the results are more complex and require a more active analytical process. This suggests the need for more training in research for people outside of IR, if not FTE/capacity across the college.”

“Yes. IR is understaffed, needing to replace one worker who has moved to part-time in preparation for retirement, and in need of a senior analyst. Marketing and our unit responsible for international education need at least one analyst each.”

“2-3 Personnel; An IT person to build and strengthen the technology infrastructure, an additional IR person to do predictive modeling, and a data coach.”

“Another analyst would enhance the IR office’s capacity to not only provide routine data but to expand data literacy and advance data analytics and modeling.”

“Staff with background/degrees in statistical analyses, management of Big Data. Broader circle of staff with access to the data warehouse resources we currently have.”

“At least one (and preferably two) additional IR/IT staff focused on data automation and professional development/staff support on effective data use.”

“There is a desire to move from descriptive to prescriptive analyses. In addition, there is a need to address data literacy challenges at our college.”

“Ideally, a part- to full-time database analyst would be useful to query data and build reports and/or files for export to tools such as Tableau. This would allow the Director of Institutional Effectiveness more time for teaching data literacy, developing data governance, and building data visualizations for use across campus and for the public. Alternatively, a single IR position with more technical background than the incumbent might allow for a leaner approach to fulfilling these roles.”

“One qualified person, with the sole purpose of working with departments to collect and analyze data as well as ensuring that the data is used for decision-making across the institution.”

“Data integrity auditing to ensure data is updated over time.”

“There are a variety of ways to build additional capacity: add a data manager; add someone who can focus on assessment alone; add someone who can focus on qualitative data; add a data coach to ‘market’ the use of data and present the data across the institution; add an analyst to allow the other person to be a data scientist. Or, possibly, build the staff to develop and maintain a data warehouse and/or data lake that would make data analytics more accessible.”

“A more robust team in IR; someone whose job description focuses on streamlining data, including a clear plan, and a cross-departmental process for working with data.”

“While more FTE devoted to working with data would be potentially beneficial, additional training and increased data literacy of current staff would also be beneficial.”

“We need more data analysts looking for data quality issues; more IT staff to build and maintain tools used for reporting and analysis. Data scientist to build predictive models and incorporate artificial intelligence. Current office has two people serving the entire institution.”

In Your Own Words (What additional expertise would be ideal?)…

“Predictive analytics, programming, data engineering.”

“Use of data for planning and budgeting.”

“Learning from a partner or having a department at a similar institution would be beneficial. If there are no other [similar] colleges that have the expertise, then being matched with an institution that has experienced building data collection and analysis from the bottom up.”

“The best use of data to improve student success and efficiencies across the college.”

“Training in review, analysis, interpretation and presentation of data.”

“Data analysis capability. Report generation facilitation. Outside experts to teach new or updated methods.”

“Technical assistance for college leaders regarding interpretation of data. Data coaching for leaders who are seeking to use data to inform decision making.”

“Additional 1.0 FTE with 10+ years’ experience in Higher Education analytics at an R1 institution.”

“We have insufficient capacity for analysis and interpretation of data.”

“We need a data architect to help us design a comprehensive system overhaul and aid in the creation of more effective data architecture, including stable datasets, definitions, data modeling, and predictive analytics. Tableau requires some coding and data analytics, so an additional analyst for program review, which requires individual data analytics, a trainer to help staff to become more literate data users, and the skills mentioned in the previous questions.”

“People who understand ingestion, storage, and reporting of data within the IT department so that algorithms and other efficiencies can be built to support IR and assessment work.”

“Statistical analysis, predictive modeling, survey design, unit level or program level assessment.”

“HTML coding and web authoring, Python, PowerShell.”

“Guidance in how to set up, integrate, and effectively manage a data warehouse. Also, how to set up a centralized repository for data and create effective institutional dashboards and data visualizations.”

“Information on how other small colleges manage data and data requests. Expertise on using data (we collect a lot of information but fall down when it comes to connecting it to actions/decisions).”

“More training in data analytics and visualization software. The learning curve for these is steep, which can lead to lost time in producing valuable analytics.”

“Use of a data warehouse and hands-on opportunities to experience the potential with data technology tools.”

“Dedicated professional and staff development time each month to stay current with the rapidly changing AI environment and other reporting requirements. Time to ‘document’ processes and procedures including updates, as these tasks take a lot of time given the constantly changing nature of data and reporting.”

“The concepts surrounding the use of data to inform decision-making are new to many on our campus. There has been remarkable progress in learning about these concepts over the last year, and additional training would improve our ability to create measurable goals/outcomes for all of our mission fulfillment tasks.”

“Data governance expertise.”

“High-level statistics and modeling would enhance our ability to produce insights and inform decision-making. In general, staff have uneven fluency with data and levels of expertise in analysis.”

“Artificial Intelligence and Predictive Analytics.”

“Actually, well-trained individuals in data issues, collection, analysis, and reporting are at a premium.”

 

V3I4: NWCCU Institutional Data Capacity Survey

By: Dr. Jess Stahl, Vice-President Data Science & Analytics (NWCCU)

NWCCU Institutional Data Capacity Survey (May 2020)
Part I: Institutional Profiles and Technical Resources

 

In May 2020, NWCCU member institutions responded to a survey from the NWCCU entitled “Institutional Data Capacity Survey” (IDCS), which was designed to provide insight into the capacity (e.g. data, technology, and resources) of institutions throughout our region to improve equitable outcomes through evidence-based approaches to meet the NWCCU 2020 Standards for Accreditation. Specifically, the standards (Standard One: Student Success, and Institutional Mission and Effectiveness; sections 1.D.2, 1.D.3, and 1.D.4) require institutions to report disaggregated indicators of student achievement and engage in regional and national benchmarking using transparent methods for data collection and analysis to mitigate gaps in achievement and equity.

Thus, the IDCS covered a wide range of domains including technical resources, data staffing, data governance, and data reporting to inform our strategic planning about how best to support our member institutions. This article is the first in a three-part series discussing the key highlights of the survey results and our relevant initiatives. This article will present a summary profile of our member institutions and the technical resources used to manage institutional data.

Regional “Snapshot”: Summary of Institutional Profiles

All 159 member institutions responded to the IDCS, a participation rate of 100%.

 

Nearly 60% of our member institutions are located in Washington or Oregon. Two institutions reported geographic location in multiple states. (Note: “BC” = British Columbia, Canada)

Based on self-report using size designations from the College Scorecard, “Large” institutions represent 14% of our membership whereas 86% of our institutions are “Medium” or “Small”.

Four-year institutions comprise 60% of our membership, and 20% of those are designated “Large” in size, compared with only 5% of two-year institutions. The majority (59%) of our two-year institutions are “Medium” and 37% are “Small” in size.

A clear majority (70%) of our member institutions are “Public”, but there are also a significant number of “Private Non-profit” institutions. The size distribution for our Private Non-profit institutions is nearly identical to the overall distribution: 48% Medium, 39% Small, 13% Large.

Our largest institutions are 73% Public, 27% Private Non-profit, and 86% are four-year institutions. At the other end of the spectrum, our smallest institutions are 71% Public, 29% Private Non-profit, and 63% are four-year institutions.

The survey provided an opportunity for institutions to self-identify as any of the following (“unofficial”) designations: Hispanic Serving Institution, Tribal College or University, Native American Non-Tribal Institution, or Faith Based Institution. Our intention for these categories was to identify and understand any challenges and evidence-based approaches undertaken by institutions serving specific student populations. In total, 48 institutions identified as follows:

(One institution identified as both Native American Non-Tribal and Hispanic Serving.)

Faith-based institutions were asked:

As a Faith Based institution, how do you evaluate (‘evidence-based’ approach) fulfillment of the faith-based aspects of your mission (e.g. faith, service, leadership, ethics, personal growth, reflection)?

Institutions identified the following key dimensions as faith-based aspects of their mission that are reflected in their programs and courses: pastoral charity (e.g. concern for the poor, marginalized, and suffering), active engagement with persons of other religions, openness to persons of other cultures, integration of theology and pastoral practice, faith and values based decision-making, student behavior, service, leadership, ethical reflection, faith commitment, spiritual formation, and critical thinking regarding moral and ethical practices. Many indicated that their mission relates to lifelong demonstration, growth, and service across these dimensions.

Institutions use a variety of methods to measure the key dimensions that support mission fulfillment as a faith-based institution, including: evaluation (direct assessment) by formators and supervisors; opportunities to practice and develop in pastoral ministry placements; surveys (those specifically mentioned were the National Survey of Student Engagement (NSSE) and the Christian Life Survey); learning outcomes in core curriculum courses specifically tied to theological and ethical education; national benchmarking with similar faith-based institutions; participation in staff and faculty development related to specific faith-based pedagogical traditions; service requirements; course-based rubrics; student feedback; course evaluations; student reflections and testimonials; mission fulfillment discussions between each employee and their supervisor as part of the annual performance review; capstone courses; engagement in chapel services; a faith integration rubric for program assessment; faculty portfolios; faculty growth plans; and peer-reviewed portfolios.

Hispanic Serving institutions were asked:

As a Hispanic Serving Institution, what challenges have you had (or anticipate having) with regard to the Interim or Annual Report?

Many institutions reported that they did not anticipate any challenges or were uncertain (as newly designated Hispanic Serving Institutions) about whether they would face any specific challenges. However, institutions reported that the most challenging issues are: interpreting federal data definitions (e.g. various FTEs), lack of institutional data capacity, difficulty measuring some of the performance measures identified in the initial grant proposal, technical problems with the Annual Performance Report (APR) website, lack of regular communication from the program officer, the belief that the number of Hispanic students is under-reported due to students identifying as ‘unknown’ race/ethnicity, difficulty obtaining information about “Deferred Action for Childhood Arrivals” (DACA) and undocumented students because identification is voluntary, and a need to engage in more workforce partnerships and establish continuity in transfer to other universities or colleges.

Tribal Colleges and Universities (TCU) were asked:

As a Tribal College or University, how do you evaluate (‘evidence-based’ approach) fulfillment of your mission (e.g. foster tribal values, culture, languages, self-determination, knowledge, and traditions)?

Institutions identified the following key dimensions as important aspects of their mission as a TCU that are reflected in their programs and courses: knowledge of culture and language, traditional knowledge and traditions, history, cultural dignity, and Native lifeways.

Institutions use a variety of methods to measure the key dimensions that support mission fulfillment as a TCU, including: integrating Indigenous Assessment Metaphor into the overall Institutional Assessment plan (grounding assessment efforts in place-based values and beliefs); participation in cultural programming; culture/language requirements within each degree; AIMS Key Indicator System (AKIS); course-level assignments with evidence-based rubrics detailing how culture and language are integrated and graded within each assignment; requiring departments to articulate culture and language within student learning outcomes; “support cultural perpetuation, including language, culture and history” as a core theme; outcome data from language classes; evidence of Native language learning among faculty and staff and its contribution to their work; faculty and staff participation in campus cultural events; evidence of cultural exchange among students, faculty, staff, and community; historical and cultural library holdings and their usage.

Native American, Non-Tribal institutions were asked:

As a Native American Non-Tribal Institution, what challenges have you had (or anticipate having) with regard to the Interim or Annual report?

These institutions reported that they did not anticipate any challenges in reporting.

The IDCS survey utilized categories aligned with the purpose of the survey to examine data capacity. The (U.S.) federally designated categories for minority serving institutions are[i]:

Acronym | Description | NWCCU members
AANAPISI | Asian American, Native American, Pacific Islander Serving Institution | 15*
ANNH | Alaskan Native, Native Hawaiian Serving Institution | 2
HSI | Hispanic Serving Institution | 12*
NASNTI | Native American Serving, Non-Tribal Institution | 2
TCU | Tribal Colleges and Universities | 9
AAPI | Asian and Pacific Islander Serving Institution | 0
HBCU | Historically Black Colleges and Universities | 0
PBI | Predominantly Black Institutions | 0
MSEIP | Minority Science & Engineering Improvement Programs | 0

There are currently 39 NWCCU institutions federally designated as minority serving institutions and 44% of those are in Washington. There are no federally designated minority serving institutions in Idaho or Utah. One institution holds designation as both AANAPISI and HSI.

The IDCS also provided an opportunity for institutions to identify whether they have implemented, or have key initiatives underway for, specific evidence-based practices. The most widely implemented initiative is “guided pathways” (36%), followed by “co-remediation”/corequisite models (21%)[ii]. “Competency-based” programs (9%) are the least commonly implemented initiative.

Institutions implementing “guided pathways” were asked:

“How does your institution evaluate the success of its Guided Pathways (evidence-based approach)?”

Methods for evaluating the success of guided pathways initiatives include: statewide dashboards with common metrics (notably, the Washington State Board for Community & Technical Colleges dashboard); the Voluntary Framework of Accountability (VFA); benchmarking with institutions with similar demographics; collaboration between institutional research and student affairs offices to identify and track key indicators such as advising appointments and registration in recommended courses; tracking the Community College Research Center’s (CCRC) Early Momentum Metrics; analysis of declaration of a pathway upon entry and of a program by the second quarter; institutional scorecards; rubrics; signature assignments from selected classes; the Association of American Colleges & Universities (AAC&U) multi-state collaborative; student perception feedback (e.g. quarterly College 110 student surveys, the Community College Survey of Student Engagement (CCSSE), the Survey of Entering Student Engagement (SENSE), and student focus groups); the College Spark Guided Pathways Initiative; program maps evaluated annually during a faculty workday focused on assessment of learning; student self-reflection at each benchmark (developed by academic counselors); and e-portfolio systems.

Specific metrics that institutions use to evaluate guided pathways initiatives include: credits to degree; first-year retention; graduation; completion; course success; credit accumulation (both making consistent progress and fewer total credits accumulated to complete); post-graduate employment; transfer; continuous enrollment in math and English until general education requirements have been completed; shorter time to degree; completion of college-level English and math within the first year of enrollment; admissions conversion rates; and an increase in the number and percentage of students who have chosen a program pathway within two quarters or 30 credits.
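
To make one of these metrics concrete, here is a small, hypothetical Python sketch computing the share of an entering cohort that completes college-level English and math within the first year, in the spirit of the CCRC Early Momentum Metrics mentioned above. The record layout is invented; a real calculation would start from transcript data.

```python
# Hypothetical sketch: share of an entering cohort completing college-level
# English AND math in year one (an early-momentum-style metric).
cohort = [
    # (student_id, completed_english_yr1, completed_math_yr1)
    ("s1", True,  True),
    ("s2", True,  False),
    ("s3", False, False),
    ("s4", True,  True),
]

completers = sum(1 for _sid, eng, math in cohort if eng and math)
rate = completers / len(cohort)
print(f"Gateway English + math completion in year one: {rate:.0%}")  # 50%
```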

Institutions implementing “co-remediation” were asked:

“How does your institution evaluate the success of Co-Remediation (evidence-based approach)?”

Methods for evaluating the success of co-remediation initiatives include evaluating transcript data, Integrated Basic Education and Skills Training (I-BEST), and enrolling students in a co-requisite support class.

Specific metrics that institutions use to evaluate co-remediation initiatives include: pass rates in key gateway courses, increased percentage of first-year students who successfully complete math requirements, time to graduation, student performance in subsequent courses, DFW rates, enrollment trends, grades earned by those enrolled in co-remediation courses compared with those in traditional math and English courses, grades earned by those in co-remediation courses compared with pre-university course grades, and year-to-year success rates.
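
As a sketch of one comparison named above, the hypothetical Python snippet below contrasts gateway-course outcomes for students in a corequisite (co-remediation) section against a traditional prerequisite sequence, using a DFW rate (grades of D, F, or withdrawal). The grade data is invented for illustration.

```python
# Hypothetical sketch: pass and DFW rates for corequisite vs. traditional
# students in a gateway course. Grade lists are invented.
grades = {
    "corequisite": ["A", "B", "C", "F", "B", "A"],
    "traditional": ["C", "D", "F", "W", "B", "F"],
}
DFW = {"D", "F", "W"}  # grades counted against the pass rate

for model, results in grades.items():
    passed = sum(1 for g in results if g not in DFW)
    print(f"{model}: pass rate {passed / len(results):.0%}, "
          f"DFW rate {(len(results) - passed) / len(results):.0%}")
```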

Institutions implementing “competency-based” programs were asked:

“How does your institution evaluate the success of its Competency Based approach? (evidence)”

Methods for evaluating the success of competency-based programs include: student feedback within the class and course evaluations, institutional accountability plan, surveys, in-person and phone interviews, and evaluating portfolios.

Specific metrics that institutions use to evaluate competency-based programs include: graduation, licensure/certification exams, placements, retention, credentials awarded, student satisfaction measures, and enrollment and final grades over 3 semesters.

All institutions were asked:

“What are the most challenging aspects of data (collection, analysis, reporting) for your institution?”

We received strikingly similar descriptions of common data challenges from most institutions: data efforts are excessively time-consuming; data is siloed, decentralized, and has multiple ‘owners’; fragmented data efforts; limited data is available; lack of best practices for analyzing both qualitative and quantitative data; questionable data quality or accuracy; outdated data systems and tools; insufficient data staffing; unclear data collection (e.g. what, how, sample size, definitions, process, tools); lack of resources for data preparation; data definitions (e.g. different definitions of ‘success’); outdated IT infrastructure; integrating different data systems and multiple platforms; lack of data literacy; unclear data governance; balancing internal data requests with external reporting requirements; missing data; inconsistent (or non-existent) data workflows and processes; keeping up with industry technology, infrastructure, skills, and talent; lack of resources to pay for data skillsets; difficulty retaining data staff; and lack of formal roles for data analysts. These concerns were consistently cited across institutions of all sizes and types.

We appreciated the candid and extremely informative responses to this question, which reflected the real-world challenges in working with data faced by nearly all institutions. You will likely find your own experience(s) reflected in the Bonus Section[iii] at the end of this article, which describes challenges commonly experienced by most institutions and also informs our work.

Regional “Snapshot”: Summary of Technical Resources

The “Technical Resources” section of the IDCS focused on the specific technical resources (data tools) used by our member institutions to manage and work with institutional data. The purpose of this section was to understand the range of technical resources and identify areas for support.

Nearly all (99%) institutions are using a Student Information System (SIS):

The most widely used student information systems are Banner (31%) and PeopleSoft (28%). Almost an equal number of institutions use Jenzabar (11%) and Colleague (10%). It is interesting to note that one institution reported maintaining a 45-year-old mainframe computer, while another reported not using a student information system at all.

Nearly all (98%) institutions are using a Learning Management System (LMS):

Canvas (53%) is the most widely used learning management system followed by Moodle (25%), Blackboard Learn (10%), and Brightspace D2L (6%). One institution reported using a “home grown LMS” and three institutions reported not using any learning management system at all.

Among those maintaining an on-premises database, the majority are using Microsoft SQL Server (63%) followed by Oracle Database (31%) and MySQL (15%).

There is an almost even split between the institutions that reported using a data warehouse (48%) and the slight majority (52%) that do not. The most widely used data warehouse tool is MS SQL Parallel Data Warehouse (8%), closely followed by Microsoft SQL Server (7%) and Oracle Autonomous Data Warehouse (6%).

Most (67%) institutions reported not maintaining a cloud database. Among the institutions that are using one, there is a relatively even distribution among Amazon Relational Database Service (9%), Microsoft Azure SQL Database (8%), and Oracle Cloud Database (8%).

Survey results regarding the technical resources used by our member institutions are helpful for planning future programming, annual conference topics/speakers, and other initiatives. They also help us understand the technical capacity of our member institutions to implement high-quality data analytics to support evidence-based approaches to achieving equitable outcomes.

For example, a “modern data stack” in most industries consists of an application database or other data sources, a customer relationship management system, and an advertising platform, from which data flows into a data warehouse/lake. “Extract-load-transform” (ELT) is then performed to prepare the data for use with business intelligence (BI) tools for data analytics/visualization, ad-hoc analyses, and (increasingly, within higher education) artificial intelligence/machine learning.
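
To make the EL-then-T division of labor concrete, here is a minimal, hypothetical Python sketch that uses an in-memory SQLite database as a stand-in for a warehouse: source rows are landed untouched in a raw table, and the transformation happens afterward inside the warehouse as a SQL view. The table, view, and field names are invented for illustration.

```python
# Hypothetical ELT sketch: load raw rows into a stand-in "warehouse" (SQLite),
# then transform inside the warehouse with SQL, per the pattern described above.
import sqlite3

warehouse = sqlite3.connect(":memory:")  # stand-in for a real warehouse

# Extract + Load: land source rows as-is in a raw staging table.
warehouse.execute(
    "CREATE TABLE raw_enrollments (student_id TEXT, term TEXT, credits REAL)")
raw_rows = [("s1", "FA20", 12.0), ("s1", "WI21", 15.0), ("s2", "FA20", 6.0)]
warehouse.executemany("INSERT INTO raw_enrollments VALUES (?, ?, ?)", raw_rows)

# Transform: shape the raw data for BI and dashboard use inside the warehouse.
warehouse.execute("""
    CREATE VIEW student_credit_totals AS
    SELECT student_id, SUM(credits) AS total_credits
    FROM raw_enrollments
    GROUP BY student_id
""")

for row in warehouse.execute("SELECT * FROM student_credit_totals"):
    print(row)  # e.g. ('s1', 27.0) and ('s2', 6.0)
warehouse.close()
```

In a production stack the load would target one of the warehouse products listed above and the transform would live in a dedicated SQL layer, but the division of labor is the same.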

This data also provides insight into potential institutional data capacity gaps among our member institutions for which we may be able to provide resources through grant-funded initiatives such as the PDP Accelerator, which is currently serving a cohort of about 50 member institutions that are working toward onboarding to the Postsecondary Data Partnership.[iv]

We encouraged participation from relevant stakeholders across the institution in completing the IDCS, including staff from institutional research, information technology, human resources, assessment/effectiveness, accreditation, data governance, data ethics, and data analytics teams.

In our next issue, we will discuss IDCS results regarding data analytics and data staffing (HR).

Bonus Section:
In Your Own Words (data challenges)…

“We are a data centered institution. Our challenges are similar to many institutions our size. Data quality is our largest challenge. We have a central information system that has been in place many years and we are generations from the original implementation so we have evolved in ways that the system was not designed and therefore data quality is challenged. Many users are narrowly trained and have little understanding of the interfunctionality of the system. This delays reporting and analysis because data needs to be cleaned and contextualized prior to higher level review.”

“We are currently transitioning from a legacy ERP to a more modern data system. At the same time we are also doing rapid work to develop a culture around utilizing data to inform our decision-making process. This is both an opportunity and a challenge in that staff are being asked to configure a new system in ways that impact what data will be collected and how it will be available while simultaneously learning how data can be utilized.”

“One challenge area has been the collection of student stories. In an effort to strengthen our assessment efforts, the collection of student stories has been added to our strategic plan. We are actively working on best practices of how to collect these stories.”

“Agreeing on the right questions to ask and setting the priority to persist in asking those questions; having the right data for complex questions like ‘what did your students learn’; integration of data from multiple systems and data sets across the institution, some siloed, some very old, some very new (enterprise-level, department-level, office-level); data management (getting clean and clear data suitable for analytical use); reconciling data definitions (every external requestor has their own ‘spin’; every internal requestor has their own ‘spin’); reconciling changes in definitions over time; maintaining continuity in the face of institutional change; integration and liability around data that comes from outside sources (like state government unemployment records); security infrastructure to ensure that data can be shared only as is appropriate.”

“Data definitions, consistency, analysis, collection, organization, and use are the most challenging aspects. This is so for multiple reasons, but in part because the IR team lacks resources such as the software tools, training, sufficient staffing, and time to provide routine, clear access to standard data points in a stable location. We need dashboards to review standard data points and to communicate effectively about how to identify and close information gaps. Using data to close equity gaps, improve the institution and for decision making is another challenge and growth opportunity.

Generally, we have sufficient data, but it is often incomplete, inconsistent, or unavailable and is usually under the direct control of units across the college, who may not follow consistent procedures or document their methods. Some data is not collected for the purposes of research but instead to support ongoing business practices. Data collection is also challenging because it relies on self-reporting.

For internal reporting we need access, and clear direction from leadership. External reporting requires staff time, which is at capacity. Broad-based data literacy is an issue in order to organize, analyze and use data meaningfully. We need to align expectations and assumptions with the reality of institutional research.”

“We struggle the most with collection, due to the many different avenues of data collection (e.g., different systems, individuals’ personal spreadsheets). We do not have a single comprehensive system or a set of systems that easily talk to each other.”

“We have a good collection, analysis, and reporting structure, but education is our next challenge: ‘What does the data mean?’ ‘Why does this data not match that data?’ “

“1. The student information system is different than the human resources and finance system. This makes it difficult to look at the business side of instructors to classes with our disparate information systems.

2. The non-credit side of our institution is housed in a different platform than the credit side, making it hard to track our entire student body.

3. With respect to HSI data, the current issues surrounding DACA students’ fears around participating in the Census are all challenging aspects of collecting data, especially concerning our undocumented and Hispanic student populations.”

“Being a community college, we have students entering our system at the pre-college, GED/ESL, community education, and collegiate levels. Data needs vary at each entry point and become difficult to manage when a student transitions from one entry point to another. This creates incomplete data sets as a student, for instance, transitions from community education to collegiate.”

“Some data collection is difficult because the college’s legacy data systems do not have fields for all information that the college would like to collect. Some of that data is needed for better decision making and predictive analytics.

Data quality is also an issue because the college’s legacy system does not have field-level controls; definitions have changed over the years without documentation; and processes have not been consistently followed across the organization. As a result, some longitudinal data is almost impossible to read or use because codes have changed; data is inconsistent and messy; and some data elements have been used differently, over time, for different purposes.

Data reporting is also fairly comprehensive, but our college has been challenged with data literacy. Another challenge is forming evaluation and assessment plans before implementing student success initiatives. As expected, it is difficult to assess the impact of something if proper evaluation steps were not first considered and implemented.”

“We don’t have an Institutional Research Office or a designated full-time IR person (one mid-level administrator currently has 1/6 of his load allocated for institutional research). While the university realizes IR is an important need, intentions to create a staff level position to assist with this have been put on hold the past couple of years due to budget constraints. For the most part, data collection is decentralized on our campus, although in the last several years we have made efforts to make data more widely available and easily accessible to decision-makers on campus. In addition to personnel, perhaps the most challenging aspect of data collection, analysis, and reporting is the limitations of our ERP. We are currently using … (a system no longer being actively developed by our vendor) and we do not have a data warehouse, so our ability to extract live data and create data reports and visualizations is limited. We also have many different data repositories and they don’t all talk to each other and frequently contain conflicting and different information, which at times leads to a lack of trust in our data.”

“Limited human resource capacity; lack of real-time data availability; lack of data warehouse and working without data governance and standards. Multiple reporting options and constantly changing approaches to data views.”

“Limited collection of data on student demographics/profile (e.g. race, ethnicity, socio-economic indicators), which is an issue in all [similar] institutions.

– Having several independent data sources to meet the needs of individual business units creates a challenge of consolidating information and ensuring consistency across the institution.

– Data privacy creates a challenge around how much and what types of information can be published.

– Inconsistent business practices across different units is also a big challenge, e.g. departments use course waitlists in different ways, so it is impossible to gauge student demand.

– Understanding the business practice of how data enters the information system. Without a complete understanding of a department’s business practice, it is easy to misinterpret data.

– Inconsistent data definitions across business units, e.g. a part-time student is defined differently for different purposes or by different departments. Some of this may be reduced by implementing a standard data dictionary.”

“Disaggregation of data is challenging because of our small size. Many of our demographic and other groupings yield small numbers of students, and as a result, several years’ worth of data is required in order to have meaningful information.”

“The growing volume of data; matching up data resources with institutional growth; keeping up with high demand for data analysis, reporting, and visualization.”

“1) Lack of institutionalized Data Governance and Data Management to maintain consistent data entry and information system definitions. Currently working on implementing and addressing college-wide, establishing a Data Governance Committee, and creating a Data Catalog.

2) A history of bottom-up and silo-ed approaches to data usage that do not align with strategic planning and strategic institutional priorities and initiatives.

3) Prior to 2019, limited resources to provide data reports and insights. Now working on increasing data literacy across different levels of the organization and improving data management and usage for decision support.”

“Niche data storage by departments in the institution and data quality concerns when new systems are implemented with insufficient planning.”

“Most of our institution’s challenges relating to data are connected to our student information system (SIS). Our SIS is predominately designed for for-profit proprietary schools, and we are one of the few traditional undergraduate institutions utilizing the product. In conjunction with that, there has been a lack of training with the SIS for new users, which leads to a long and steep learning curve, and, at times, an over-reliance on more adept users of the system. Reports are exported to Excel and often must be combined with other database exports (few reports are customizable), and as such, users with little experience or technical knowledge of Excel are often frustrated and then rely heavily upon a few experienced users.

In addition, at times there appears to be a lack of vision/communication between departments regarding data requests and usage. We often receive multiple requests for information that are very similar but just different enough that the same data cannot be used for both requests; hence, a doubling of work often occurs.”

“We have limited [human resources] (i.e., just one research analyst) to analyze and report the data for our college. Also, it will be more challenging and time-consuming, and less meaningful (e.g., colleges will have inconsistencies in coding and underlying institutional processes), to compare and benchmark our indicators with regional and national peer institutions.”

“The most challenging has been our institutional technology’s infrastructure and ability to provide the data in a timely manner with limited resources. It would be a better system for the institutional research department to have direct access to the data sources on campus but all sources are currently managed by the IT department.”

[i] Current data regarding NWCCU member institutions federally designated as minority serving institutions was provided by Dr. Selena Grace.
[ii] Complete College America (CCA) has released a new report, “No Room for Doubt: Moving Corequisite from Idea to Imperative (A decade of data on the effectiveness of the corequisite model and new insights on how to implement and scale)”.  Experts from CCA will present on these findings at the 2021 NWCCU Annual Conference.
[iii] Minimally edited to maintain privacy and grammatical clarity.
[iv] If your institution is interested in the PDP Accelerator, please contact Dr. Jess Stahl.

 

V3I3: Survey Results on 2020 Pandemic Actions and Concerns

 

Just before the Winter 2020 holiday break, we sent out a survey seeking the innovative ways our member institutions have adapted, transitioned, and transformed academic offerings and student services in light of the COVID-19 pandemic.

Below are some of the responses from the more than 70 institutions that reported on their actions taken in light of the pandemic and on their ongoing concerns and areas of focus:

 

This information will be used to identify areas where the Northwest Commission can provide additional assistance, topics to target in ongoing webinars, and issues to focus on when creating practical application trainings and workshops for our member institutions. NWCCU is committed to continuing to provide support, resources, and programming designed to meet current needs. Please connect with your NWCCU staff liaison if your institution has any questions.

 

V3I3: Postsecondary Data Partnership (PDP) Accelerator Launches

NWCCU has been awarded a grant from the Bill & Melinda Gates Foundation entitled “Supporting Accrediting Bodies: Postsecondary Data Partnership Accelerator” (PDP Accelerator). The Postsecondary Data Partnership (PDP) is a nationwide effort from the National Student Clearinghouse to help colleges and universities more easily understand student educational progress and outcomes. The objective is to provide easier access to better data in order to enhance knowledge and facilitate more informed decisions to support student success. The PDP is supported by partners including Achieving the Dream (ATD), the American Association of State Colleges and Universities (AASCU), the American Indian College Fund, the Association of Public & Land-Grant Universities (APLU), Complete College America (CCA), Excelencia in Education, JFF, the State Higher Education Executive Officers Association (SHEEO), the Frontier Set, and other high-impact organizations. In our region, the state of Nevada has joined the PDP as a statewide system.

NWCCU has partnered with the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) for the PDP Accelerator, and both accreditors will facilitate onboarding selected member institutions to the PDP. We are excited to announce that more than 50 NWCCU member institutions have been selected to join our first cohort, along with at least as many SACSCOC institutions. The first cohort launched in January and will receive extensive support from the PDP Accelerator, its peer network, and other resources through August 2023. A second cohort is planned for 2021/2022.

The PDP Accelerator is also launching the Data Equity Academy in February, where you will find tools, information, and resources online to demystify data for better use and understanding. In December 2020, NWCCU hosted the Foundations of Data Equity workshop with more than 400 attendees. We are offering the same foundational data equity workshop again in March 2021. Through the Data Equity Academy, we will continue to host webinars and other events related to data equity. Stay tuned!


If you have questions or want to get involved with the PDP (including joining future cohorts), please contact Dr. Jess Stahl (jstahl@nwccu.org), Vice President for Data Science & Analytics.

 
