Annual Assessment Reports 2018-2019

UAA Student Affairs departments demonstrate a commitment to a culture of evidence and continuous improvement through intentional assessment planning, the measurement of learning outcomes, and the publication of insightful key findings to stakeholders. Organized by department, the annual assessment reports below include:

  • Assessment Inventory: List and description of all assessment projects conducted during the year.
  • Learning Outcomes: Primary learning outcomes prioritized for the academic year.

Contents

Admissions and Recruitment

Assessment Inventory

Recruitment Funnel Analysis

A detailed analysis of the Fall 2018 admissions funnel was carried out using Banner and Salesforce data.

Assessment Type

Operational/Program Outcomes

Key Findings
  • 12,308 inquiries were generated for Fall 2018.
  • Only 4% of out-of-state inquiries enrolled, even though 44% of all inquiries came from out of state.
  • Only 3% of inquiries enrolled at UAF or UAS (vs. 23% that enrolled at UAA and 35% that enrolled at a non-UA institution).
  • Communities with UAA campuses and communities on the road system generally have higher matriculation rates.
  • 29% of all inquiries filed a FAFSA for UAA.
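To make the funnel percentages above concrete, the following is a minimal sketch of a stage-over-stage conversion calculation. Only the inquiry total comes from the report; the other stage counts are illustrative placeholders, and the actual analysis was run against Banner and Salesforce data.

```python
# Minimal sketch of a recruitment funnel rate calculation.
# Only the inquiry count is from the report; the other stage counts
# are illustrative placeholders, not actual Fall 2018 figures.

funnel = {
    "inquiries": 12_308,   # reported Fall 2018 total
    "applicants": 5_000,   # placeholder
    "admits": 3_500,       # placeholder
    "enrolled": 1_200,     # placeholder
}

# Stage-over-stage conversion rates
stages = list(funnel)
for prev, curr in zip(stages, stages[1:]):
    print(f"{prev} -> {curr}: {funnel[curr] / funnel[prev]:.1%}")

# Overall inquiry-to-enrollment yield
print(f"overall yield: {funnel['enrolled'] / funnel['inquiries']:.1%}")
```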

Weekly Enrollment Briefings

The Office of Admissions leadership team, Institutional Research, and the AVC for Enrollment Services meet weekly to review application, admissions, and enrollment data. Each week, comparisons are made to the same point in the cycle from the previous three years, and the current point in the cycle is benchmarked against recruitment cycle goals. Opportunities for course corrections are discussed, as well as key findings to share with leadership. The data used is a combination of the attached enrollment funnel and the IR dashboard for daily admission applications.

Assessment Type

Comparative Benchmarking, Usage/Tracking Data

Key Findings

By meeting each week to discuss the data, the group is able to benchmark year-over-year trends, make course corrections when necessary, and catch any data errors well in advance of final reporting for a given term. These weekly analysis meetings have also allowed us to identify processing and communication pinch points and to adjust accordingly. For example, if applications in one of the targeted recruitment populations start to trend down, the admissions counselor for that school or group can do additional outreach. Without a regularly scheduled meeting and collaboration across the two units, we would not be able to make these mid-cycle corrections with faith in the data.

YouVisit Virtual Tour

The YouVisit virtual tour launched in August 2018. Analytics are reviewed quarterly.

Assessment Type

Comparative Benchmarking, Usage/Tracking Data

Key Findings

The UAA virtual campus tour was developed by YouVisit and the Communications Team over the summer of 2018 and serves as both a marketing and lead generation tool for the Office of Admissions. Monitoring the analytics over the course of FY19 has shown a number of important findings, most notably a total of 3,691 visits from the tour's launch in August 2018 through June 2019, and a conversion rate of 23%. Conversions are actions taken from inside the tour to request more information, schedule a campus visit, or apply online. Weekly activity reports are generated and new prospects are loaded into Salesforce. Other notable data captured by the tour analytics is a breakdown of visitors from over 78 countries and 137 states and territories.

Top 5 countries:

  1. United States: 3,043 (89.9%)
  2. Canada: 39 (1.2%)
  3. Russia: 19 (0.6%)
  4. Germany: 19 (0.6%)
  5. United Kingdom: 16 (0.5%)

Top 5 states and territories:

  1. 熊猫在线视频: 1,056 (31.2%)
  2. California: 254 (7.5%)
  3. Texas: 137 (4%)
  4. New York: 84 (2.5%)
  5. Washington: 82 (2.4%)

Learning Outcomes

Preparing for College Presentation - Junior Day

Outcomes Statements

By participating in the Preparing for College presentation session, high school juniors who attend Junior Day at UAA will gain a better understanding of the college search process.

Learning Intervention

A presentation and question/answer session for all attendees at the event.

Measure

Attendees were surveyed about their satisfaction with the day and each session.

Data Collected

No

Student Ambassador Program

Outcomes Statements

UAA Student Ambassadors will develop and hone customer service skills by serving as the front line of the admissions phone queue, and responding to inquiry emails.

Ambassadors will develop and strengthen their public speaking skills by leading daily campus tours.

As a result of the training and opportunities to apply the training through daily work, ambassadors will be able to gain customer service and public speaking skills.

Learning Intervention

Regular training and review sessions. Feedback and follow up from surveys of tour attendees.

Measure

Through continual training and mock tours, feedback and guidance will be provided to refine delivery of campus tours. Surveys of tour participants requesting their feedback were used to identify areas needing improvement.

Data Collected

No

Career Exploration and Services

Assessment Inventory

Survey of Student Employees and Supervisors

Student employees and student employee supervisors were surveyed via Qualtrics at the end of Spring 2019. 

Assessment Type

Operational/Program Outcomes, Stakeholder Needs

Key Findings
  • Student employees are very positive regarding their employment experience. Over 93% of student employees recommend on-campus employment to other students.
  • Student employee supervisors use their experience to form strong connections with their employees but may need more training on student professional development and learning outcomes.
  • Student employees rated their employment experience especially high in areas related to recognition and appreciation for the contributions they made to their teams, and in gaining skills and experiences that could be showcased on a future resume.
  • Supervisors strongly agreed they were able to form strong connections with their employees, but less than 60% said they understood how they could support the professional development of their student employees by connecting what the students learned in the classroom to their work in the office.

Learning Outcomes

Outcomes Statements

As a result of on-campus student employment at UAA, students will have the opportunity to develop foundational career competencies desired by employers. Build your resume with these career competencies:

  • Leadership
    Leverage the strengths of others to achieve common goals, and use interpersonal skills to coach and develop others. The individual is able to assess and manage his/her emotions and those of others; use empathetic skills to guide and motivate; and organize, prioritize, and delegate work.
  • Oral and Written Communication
    Ability to articulate thoughts and ideas clearly and effectively in written and oral forms to persons inside and outside the organization. The individual has public speaking skills; and can write/edit memos, letters, and complex technical reports clearly and effectively.
  • Professionalism and Work Ethic
    Demonstrate personal accountability and effective work habits, e.g., punctuality, working productively with others, and time/workload management, and understand the impact of non-verbal communication on professional work image. The individual demonstrates integrity and ethical behavior, acts responsibly with the interests of the larger institution in mind, and is able to learn from his/her mistakes.
  • Teamwork and Collaboration
    Build collaborative relationships with colleagues and customers representing diverse cultures, races, ages, genders, religions, lifestyles, and viewpoints. The individual is able to work within a team structure, and can negotiate and manage conflict.
  • Information Technology Application
    Leverage existing digital technologies ethically and efficiently to solve problems, complete tasks, and accomplish goals. The individual demonstrates effective adaptability to new and emerging technologies.
  • Critical Thinking and Problem Solving
    Exercise sound reasoning to analyze issues, make decisions, and overcome problems. The individual is able to obtain, interpret, and use knowledge, facts, and data in this process, and may demonstrate originality and inventiveness.
  • Global and Intercultural Fluency
    Value, respect, and learn from diverse cultures, races, ages, genders, sexual orientations, and religions. The individual demonstrates openness, inclusiveness, sensitivity, and the ability to interact respectfully with all people and understand individuals' differences.
  • Career Management
    Identify and articulate skills, strengths, knowledge, and experiences relevant to the desired position and career goals, and identify areas necessary for professional growth. The individual is able to navigate and explore job options, understands and can take the steps necessary to pursue and gain new skills to achieve goals, and understands how to self-advocate for opportunities in the workplace (career-driven mindset).

Learning Intervention

Engaging in various on-campus student employment positions.

Measure

The annual student employee outcomes survey uses indirect measures, asking students to indicate their agreement as to whether their student employment experience helped them gain in the learning outcome areas.

Data Collected

Yes

Dean of Students Office

Assessment Inventory

Behavior Trends of UAA Students: A Student Conduct and Care Team Analysis

This report, titled Behavior Trends of UAA Students: A Student Conduct and Care Team Analysis, is published biennially by the UAA Dean of Students Office and serves to inform the UAA community about trends in student behavior and the programs that are in place to support those students. This report focuses primarily on statistical information from Fiscal Years 2017 and 2018, with the fiscal year running from July 1 to June 30. While this report analyzes specific trends in student behavior related to academic integrity and the misuse of alcohol (the two most frequently violated areas of the Student Code of Conduct), it also analyzes student behaviors that have caused concern among others in the UAA community. In this regard, trends related to student well-being and student struggles are also documented within this report's pages.

Assessment Type

Comprehensive Program Review

Key Findings

None provided.

CORE AOD Survey

The Core Survey is administered to UAA students every four years to track trends in alcohol and drug usage by UAA students.

Assessment Type

Stakeholder Needs

Key Findings

None provided.

DOS/DRL Care/Conduct Retention Study

The Fall 2018 to Spring 2019 retention study was administered to determine the barriers to students enrolling in classes for the Spring 2019 semester and how the DOSO and DRL could help retain them.

Assessment Type

Stakeholder Needs

Key Findings

None provided.

Students of Concern Risk Levels

The Care Team assessed the initial and resolved risk levels of students referred to the Care Team, using the NaBITA Threat Assessment Tool.

Assessment Type

Operational/Program Outcomes

Key Findings

None provided.

Substance Abuse Screening Tools

The Alcohol, Drug, and Wellness Educator (ADWE) assessed students' use of alcohol, marijuana, and other drugs. The ADWE used the Alcohol Use Dependency Identification Test (AUDIT) to assess students referred for an alcohol screening through the Student Conduct process. Similarly, the ADWE used the Cannabis Use Disorder Identification Test (CUDIT) to assess students referred for a marijuana screening and the Drug Abuse Screening Tool (DAST) to assess students referred for other drugs.

Assessment Type

Stakeholder Needs

Key Findings

None provided.

Learning Outcomes

Alcohol (AUDIT) Screening with Alcohol, Drug and Wellness Educator (ADWE)

Outcomes Statements

As a result of participating in a meeting with the ADWE, students will be able to comprehend the difference between high-risk and low-risk alcohol choices.

As a result of participating in a meeting with the ADWE, students will be able to identify at least one harm reduction strategy.

As a result of participating in a meeting with the ADWE, students will be able to identify a standard drink size, BAC, tolerance, and effects of alcohol.

As a result of participating in a meeting with the ADWE, students will be able to identify physical and social risks of excessive drinking.

Learning Intervention

None provided.

Measure

None provided.

Data Collected

None provided.

Cannabis (CUDIT) Screening with Alcohol, Drug and Wellness Educator (ADWE)

Outcomes Statements

As a result of completing the CUDIT with the ADWE, students will be able to explore if they are having problems with cannabis.

As a result of completing the CUDIT with the ADWE, students will be able to understand high-risk choices of cannabis use (e.g., smoking 5-6 days per week).

As a result of completing the CUDIT with the ADWE, students will be able to understand that cannabis use may lead to health problems.

As a result of completing the CUDIT with the ADWE, students will be able to understand the effects and various levels of THC.

As a result of completing the CUDIT with the ADWE, students will be able to understand strategies to reduce use, if the results are high.

As a result of completing the CUDIT with the ADWE, students will be able to identify at least one strategy to reduce risks associated with use.

Learning Intervention

None provided.

Measure

None provided.

Data Collected

None provided.

CARE Learning Outcome

Outcomes Statements

Students will identify that they are more familiar with UAA resources and services applicable to their needs due to interaction with the UAA CARE Team.

Students will have an increased understanding of the level of impact their behaviors and choices have on the UAA community and campus.

Due to interaction with the CARE Team, students will increase their confidence in effective coping and help-seeking behavior to address distressing situations.

Students will increase their ability to continue their educational progress toward a degree, certificate, or academic enrichment due to interaction with the CARE Team.

Learning Intervention

None provided.

Measure

None provided.

Data Collected

None provided.

Student Conduct Learning Outcomes

Outcomes Statements

After engaging in the Student Conduct process, students will be able to articulate why they were referred to Student Conduct.

After engaging in the Student Conduct process, students will be able to demonstrate, verbally or in writing, their role in the broader UAA community, including information about their rights and responsibilities as a student.

After engaging in the Student Conduct process, students will be able to evaluate their experience with the incident and the process to determine what they learned/got out of the experience (positive or negative).

After engaging in the Student Conduct process, students will be able to discuss the impact of their actions on themselves, others, and/or the community, and formulate a plan for instituting change.

Learning Intervention

None provided.

Measure

None provided.

Data Collected

None provided.

Disability Support Services

Assessment Inventory

Analysis of Students Served Fall 2010 through Summer 2018

Since at least Fall 2010, DSS has tracked data on students and the support they receive. Data analysis examined the attributes of these students and demonstrated how students served by DSS succeed at UAA.

Assessment Type

Operational/Program Outcomes, Usage/Tracking Data

Key Findings
  • Over 1,700 unique students were served over that 7-year period.
  • Alternative testing accommodations made up 40% of all accommodations provided.
  • 5% of undergraduate degree-seeking students at UAA received an accommodation through DSS.
  • 35.7% (n=66) of undergraduate bachelor's degree-seeking students who started full-time as freshmen in a fall semester between 2010 and 2012, and who received accommodations or support from DSS at any time during their academic journey, graduated with a bachelor's degree within 6 years.

Online Check-in for Students Experiencing Disabilities

Since its inception in AY17, every student checks in through the system upon entering the DSS office and Adaptive Testing Lab. The check-in system allows DSS team members to track the total number of in-person student encounters.

Assessment Type

Comparative Benchmarking, Cost Effectiveness, Operational/Program Outcomes, Usage/Tracking Data

Key Findings

A total of 3,036 student encounters were logged into the DSS check-in system from July 2018 through June 2019. At least 59% of student encounters were related to testing accommodations and needs.

Walk-ins accounted for 17% of the total encounters. Walk-ins were typically information-seeking (handled by front-line staff) or specific to an acute problem (handled by the Director or Senior Disability Accommodations Coordinator).

Satisfaction Surveys

During late Fall, DSS launched two separate surveys (one to DSS-registered students and one to faculty and staff), both asking questions exploring overall satisfaction with UAA's DSS accommodations and the campus' overall accessibility.

The goal of the study was to develop a better understanding of how to improve accommodation and accessibility at UAA. Survey responses totaled 351 from faculty and staff at the University of 熊猫在线视频 Anchorage and its Community Campuses and 165 from students at the University of 熊猫在线视频 Anchorage. The data from both surveys provided in-depth feedback on services, satisfaction, and ideas for future improvements for DSS at UAA and its Community Campuses.

Assessment Type

Student Satisfaction/Perception

Key Findings
  • 94% of students reported they were satisfied with campus accessibility and DSS services.
  • 73% of faculty responded that they had not received any accessibility training.

Several pages of comments were collected with ideas for future trainings for students, faculty, and staff.

Learning Outcomes

None provided.

Military and Veteran Student Services

Assessment Inventory

Certification Request Processing Times

During the Spring 2019 semester, our department started using TOAD queries to find students who made changes that must be reported to the VA. The TOAD queries are much more efficient than the old reports, and this, along with continued improvement to our certification request form, has led to a decrease in our average processing time for certification requests.
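The report does not include the queries themselves; as a rough illustration of the general pattern such a query can follow, here is a sketch. The table and column names are hypothetical placeholders, not actual Banner objects or the department's queries.

```python
# Illustrative sketch only: flags students whose enrollment changed
# since the last VA report run. Table and column names are hypothetical
# placeholders, not actual Banner objects or the department's queries.
import sqlite3

SQL = """
SELECT student_id, term, old_credits, new_credits
FROM enrollment_changes
WHERE change_date > :last_run
  AND va_benefit_user = 1
"""

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE enrollment_changes (
    student_id TEXT, term TEXT, old_credits INT, new_credits INT,
    change_date TEXT, va_benefit_user INT)""")
conn.execute("INSERT INTO enrollment_changes VALUES "
             "('S1', '201903', 12, 9, '2019-02-01', 1)")

# Each returned row is a change that must be reported to the VA
for row in conn.execute(SQL, {"last_run": "2019-01-15"}):
    print(row)
```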

Assessment Type

Operational/Program Outcomes, Stakeholder Needs, Usage/Tracking Data

Key Findings

These process improvements enabled us to decrease our average processing time for certification requests from the Fall 2018 semester to the Spring 2019 semester. During the two weeks prior to the beginning of the Fall 2018 semester, our average processing time was 11 days. For the following Spring 2019 semester, it had dropped to 7 days. This change is very beneficial to our students: they receive their benefits sooner, and they learn of any issues with their classes with more time to fix their schedules before the add/drop deadline.

Learning Outcomes

VA Work Study One-on-One Training

Outcomes Statements

After VA work study students complete one-on-one training, they will have a working understanding of the VA educational benefits process at UAA, which will allow them to provide excellent service to students who come into the Resource Center for assistance.

Learning Intervention

Professional staff will provide training to new and returning VA work study students before the start of each semester. This training includes going over general office procedures and showing them where to find the answers to our most common questions. Returning work studies will be briefed on any changes since the last semester they worked. New VA work studies will be paired with returning work studies for further on-the-job training.

Measure

Professional staff will observe work study interactions with students to confirm proper instruction is being given. The training should lead to a decrease in the number of questions work studies bring to professional staff, and they will gain the knowledge and ability to answer increasingly complex questions on their own. Work studies will also be asked to give professional staff feedback on any areas of concern so that further training can address them.

Data Collected

No

Veteran Howl Days

Outcomes Statements

Students attend a Howl Days session geared specifically toward students planning to use VA educational benefits while attending UAA. Attendees participate in a VA Benefits workshop that explains in detail the process for using educational benefits at UAA. By attending the presentation, participants become more educated consumers of their VA educational benefits.

Learning Intervention

Students attend an overview presentation of using VA educational benefits at UAA. They also are able to stop by our office during the resource fair for individual assistance.

Measure

37 students and 8 guests attended the Howl Days session, for a total of 45 participants. A pre- and post-survey was administered to attendees asking them to rate their level of agreement with the following statement: "I am knowledgeable about my VA educational benefits."

Data Collected

Yes

Multicultural Center

Assessment Inventory

AHAINA Mentor Evaluation Form

An initial form used to assess the qualifications of prospective AHAINA Peer Mentors.

Assessment Type

Comparative Benchmarking, Stakeholder Needs

Key Findings

None provided.

AHAINA Peer Mentor Questionnaire

Initial Peer Mentor Interview Questionnaire

Assessment Type

Comparative Benchmarking, Stakeholder Needs, Student Satisfaction/Perception

Key Findings

None provided.

ePortfolio for AHAINA Student of Excellence

This is an example of the scoring rubric created to assess student applicants for the AHAINA Student of Excellence Award Program. It is housed in Digication, the software platform used for UAA eWolf portfolios.

Key Findings

Demonstrates a High Impact Practice

Initial Qualtrics Survey for Seawolf Success Participants

Initial Qualtrics assessment for Seawolf Success Participants

Assessment Type

Comparative Benchmarking, Stakeholder Needs, Student Learning Outcomes

Key Findings

None provided.

International Student Thanksgiving Dinner Participation

Participation counts over time for the International Student Thanksgiving Dinner Celebration.

Assessment Type

Comparative Benchmarking, Usage/Tracking Data

Key Findings

None provided.

ITVS Indie Pop-up Film Screenings

Assessments that ITVS requests we complete following each film screening.

Assessment Type

Student Learning Outcomes, Student Satisfaction/Perception, Usage/Tracking Data

Key Findings

None provided.

Seawolf Success Intake Questionnaire

Initial assessment to be used as an intake for students applying for the Seawolf Success Program

Assessment Type

Comparative Benchmarking, Stakeholder Needs

Key Findings

None provided.

True Colors Assessment

A brief activity used with both mentors and mentees to determine various personality types (similar to the MBTI).

Assessment Type

Comparative Benchmarking, Stakeholder Needs

Key Findings

None provided.

Learning Outcomes

None provided.

Native Student Services

Assessment Inventory

None provided.

Learning Outcomes

None provided.

Office of the Registrar

Assessment Inventory

Brown Bag Training Sessions

The Office of the Registrar hosted 16 brown bag training sessions during FY19. Three focused on DegreeWorks, two on registration, 10 on academic scheduling/CLSS, and one on transfer credit evaluation. Participants were invited to complete a feedback form following each session, rating the relevancy of the material presented to their work and how well the topic was covered, as well as sharing takeaways from the session and feedback for future trainings.

Assessment Type

Operational/Program Outcomes, Usage/Tracking Data

Key Findings

Of the participants who provided feedback (23 of 103 total participants), 22 indicated the sessions were useful and relevant to their work (a rating of 4 or 5 on a 5-point scale). Our analysis of the sessions held over the past fiscal year highlighted the need to emphasize the importance of receiving feedback from all session participants to guide and enhance future offerings.

Enrollment Impact of Schedule Planner Use

The Office of the Registrar analyzed data specific to the use of Schedule Planner among UAA undergraduate students and the average number of credits in which students register, both Schedule Planner users and non-users.

Assessment Type

Operational/Program Outcomes, Usage/Tracking Data

Key Findings

Approximately 35 percent of UAA's undergraduate students used Schedule Planner during AY19 and, on average, registered for 2-3 more credits than students who did not use Schedule Planner.
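A minimal sketch of the user-versus-non-user comparison described above, assuming a hypothetical per-student table with used_schedule_planner and credits columns (the actual analysis ran against university data obtained from Statewide):

```python
# Sketch of the user vs. non-user credit-load comparison. The column
# names and sample rows are hypothetical placeholders; the actual
# analysis used university data obtained from Statewide.
import pandas as pd

students = pd.DataFrame({
    "student_id":            [1, 2, 3, 4, 5, 6],
    "used_schedule_planner": [True, True, True, False, False, False],
    "credits":               [14, 15, 13, 11, 12, 9],
})

# Average registered credits for users vs. non-users
avg = students.groupby("used_schedule_planner")["credits"].mean()
print(avg)

# A positive difference means Schedule Planner users carried more credits
print(f"difference: {avg.loc[True] - avg.loc[False]:+.1f} credits")
```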

Reevaluation of Request for Exception Letters

The Registrar's Office reimagined our templated letters sent to students following review of a request for exception. We removed unnecessary information and presented information in layman's terms with less university lingo. We also divided the letter into sections to promote clarity and improve comprehension, in hopes of reducing the number of follow-up questions received in response. At most, a letter has three sections: our response to your request, university policies, and your next step.

Assessment Type

Operational/Program Outcomes, Student Satisfaction/Perception

Key Findings

The Office of the Registrar received 112 requests for exception from August through December 2018. Approximately 8% of these students contacted our Assistant Registrar asking for clarification of the response letter they received. We updated our templates following the fall semester. From January through May 2019, we received 96 requests for exception, and approximately 1% of those students contacted our Assistant Registrar requesting clarification of the letter they received.

Learning Outcomes

DegreeWorks

Outcomes Statements

Students who use DegreeWorks will register for courses that fulfill their degree requirements and be aware of their progress toward degree completion.

Learning Intervention

DegreeWorks is featured on the Office of the Registrar website as a tool to help students stay on track and efficiently progress toward degree completion. Office of the Registrar staff create transparency for students and encourage greater use of the tool by tracking academic and policy petitions in DegreeWorks and highlighting the 'what if' feature for students interested in exploring another major or adding a minor. DegreeWorks is an important part of every academic advisor's tool chest and facilitates efficient advising appointments by reducing the amount of time advisors spend mapping out semester course plans and evaluating students' progress toward degree completion.

Measure

The Office of the Registrar regularly receives unsolicited, qualitative feedback from students and advisors who value DegreeWorks. They report that it creates transparency by allowing students and advisors to easily track progress toward degree completion anytime, day or night. Advisors have shared that DegreeWorks changed the way they structure their advising appointments, freeing up valuable time to discuss career interests, adding a minor or internship opportunities instead of spending the entire appointment manually reviewing a student's transcript to determine the courses they should register for next semester and how they are progressing toward graduation.

The Office of the Registrar could consider embedding a short survey link within DegreeWorks for students to complete voluntarily. The survey could ask students to rate DegreeWorks' helpfulness and relevance to their role as a student. It could also solicit qualitative feedback regarding students' favorite DegreeWorks features and improvements they would like to see in the future.

Data Collected

No

Schedule Planner

Outcomes Statements

Undergraduate students who use Schedule Planner will register for more credits in a given semester than students who do not use it.

Learning Intervention

The Office of the Registrar highlights Schedule Planner when communicating with students about registration in person, by phone or email, and during events hosted around campus (Howl Days, Campus Kickoff, etc.). Schedule Planner is also featured on the Office of the Registrar's website and the student registration page in UAOnline as a tool that helps students create their perfect schedule. The Office of the Registrar's website includes a brief instructional video and FAQ to introduce students to the tool and provide answers to the most commonly asked questions.

Measure

The Office of the Registrar analyzed university-specific data obtained from Statewide regarding Schedule Planner use among undergraduate students at UAA.

Data Collected

Yes

On-Campus Living

Assessment Inventory

Campus-wide Dining Survey

Annual survey of campus community on dining quality, hours, offerings, and prices.

Assessment Type

Comparative Benchmarking, Comprehensive Program Review, Cost Effectiveness, Operational/Program Outcomes, Stakeholder Needs, Usage/Tracking Data

Key Findings

None provided.

Catering Satisfaction Survey

Qualitative follow-up with each catering customer within several business days of the catered event.

Assessment Type

Comparative Benchmarking, Cost Effectiveness, Operational/Program Outcomes, Stakeholder Needs

Key Findings

None provided.

Fall 2018 Quality of Life Survey

Survey is distributed to all students living on campus and asks about satisfaction with services, program events, and staff. It also asks about demographics and looks to gauge student learning in some key areas.

Assessment Type

Operational/Program Outcomes, Student Learning Outcomes, Student Satisfaction/Perception

Key Findings

None provided.

Live-in Student Staff Job Satisfaction Fall 2018

Survey distributed to Fall 2018 live-in student staff members (RAs, PAWs, IRLs) asking general questions about job satisfaction and the most rewarding and challenging parts of the position.

Assessment Type

Operational/Program Outcomes, Student Satisfaction/Perception

Key Findings

None provided.

Seawolf Dining Advisory Board

Student-led monthly focus groups that gather feedback on campus dining quality, hours, offerings, and pricing.

Assessment Type

Comprehensive Program Review, Cost Effectiveness, Stakeholder Needs, Student Satisfaction/Perception

Key Findings

None provided.

Spring 2019 Quality of Life Survey

Survey is distributed to all students living on campus and asks about satisfaction with services, program events, and staff. It also asks about demographics and looks to gauge student learning in some key areas.

Assessment Type

Operational/Program Outcomes, Student Learning Outcomes, Student Satisfaction/Perception

Key Findings

None provided.

Learning Outcomes

Incorporation of ePortfolio into the Student Staff Rehire Process

Outcomes Statements

Through the process of creating an ePortfolio, Residence Life student staff members will reflect on and demonstrate the growth they experienced over the course of the year in their positions.

Learning Intervention

Student staff members will be asked to include the following information in their ePortfolio:

  • Examples of bulletin boards
  • Quotes from students
  • Pictures from programs
  • Professional resume
  • Certificates
  • Awards
  • Hall decorations
  • Door decorations
  • Advertisements for programs
  • Personal statement
  • Campus partner collaborations
  • Training presentations
  • Values Clusters programming

These are all activities the staff will participate in. Compiling and reflecting on these items as they create the ePortfolio is also essential to the learning.

Measure

The measure will be the quality of the ePortfolio and retention of student staff members.

Data Collected

No

Ohm-Campus Living Yoga and Wellness Program

Outcomes Statements

As a result of participating in the Ohm-Campus Living Yoga and Wellness Program, residents will develop a stronger understanding of the connections between their physical, mental, and spiritual health and wellness.

Learning Intervention

The Ohm-Campus Living Yoga and Wellness Program includes weekly, free, mixed-level yoga classes from nationally certified instructors. The program includes health and wellness lectures, as well as recurring physical exercise programs.

Measure

Assessment of the success of this program will be incorporated into the Quality of Life Surveys in the fall and spring semesters.

Data Collected

No

Student Financial Aid

Assessment Inventory

FATV Usage Tracking

The Office of Financial Aid requests a monthly report showing FATV usage to identify what videos students are watching and when. The data breaks down viewing by day and hour, desktop versus mobile viewing, and the search criteria students use to find videos. Reports also show the type of video watched, broken down by Future Students, Current Students, Parents, and Alumni.

Assessment Type

Comparative Benchmarking, Stakeholder Needs, Usage/Tracking Data

Key Findings

The monthly usage reports show the Office of Financial Aid that the FATV subscription frequently provides video content during times when financial aid advisors are not available. The data consistently shows viewers watching videos between 5 p.m. and 1 a.m., and also during our peak call center times between 1 and 3 p.m.
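A minimal sketch of the hour-of-day aggregation behind a finding like that one (the timestamps below are hypothetical placeholders; the real data comes from the vendor's monthly usage report):

```python
# Sketch of bucketing video views by hour of day to spot after-hours
# usage. The timestamps are hypothetical placeholders; the real data
# comes from the vendor's monthly usage report.
from collections import Counter
from datetime import datetime

views = [
    datetime(2019, 3, 4, 22, 15),  # evening, advisors unavailable
    datetime(2019, 3, 4, 23, 40),
    datetime(2019, 3, 5, 0, 5),
    datetime(2019, 3, 5, 13, 30),  # peak call-center window
    datetime(2019, 3, 5, 14, 10),
]

by_hour = Counter(v.hour for v in views)

# Count views in the 5 p.m. - 1 a.m. window noted in the findings
after_hours = sum(n for hour, n in by_hour.items() if hour >= 17 or hour < 1)
print(f"after-hours views: {after_hours} of {len(views)}")
```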

The report provides feedback on the search criteria students use to find answers, so the Office of Financial Aid can be more responsive in our communications and social media usage, pushing out the answers students most want to know at any particular time of year. This year the most popular topic has been Cost of Attendance and the Net Price Calculator, which prompted an evaluation of those pages on the website and improved content to meet stakeholder needs.

Staff Training and Development

Historically, staff were trained by shadowing other staff and completing the modules in FSA Coach. We tracked the content covered on a spreadsheet and noted the date completed. While useful, this method left some content gaps, and it was difficult to gauge where staff were in the training process in terms of content understanding.

Our staff underwent substantial transition in the past year, which forced us to find a different way of addressing our training needs. We decided to keep the tracking spreadsheet and incorporate a mentoring piece, assigning a seasoned staff member to each new hire based on the tasks we have determined that new hire will do, as opposed to an overall training expectation. This has allowed for a more in-depth understanding of key tasks, and we can then evaluate understanding based on the work they produce.

Overall training topics for general content are presented in our staff meetings, and a test is given the following week to assess how well each staff member understands the content. Staff sign and date the test so we have documentation, which provides accountability should we need it later.

Assessment Type

Operational/Program Outcomes

Key Findings

Mentorship has fostered a bond with staff and created a trust that was not previously there. Staff feel more comfortable asking questions, and are less afraid of making a mistake and admitting to it, because they feel we are partners in their training. The staff we have hired recently bring a different skill set than our prior hires: instead of looking specifically for accounting and business experience, we have branched out to include staff with stronger customer service skills and some technical background. This appears to be serving us well; they are more flexible with concepts, provide good customer service, and are eager to experience new things.

Learning Outcomes

Federal Work Study Processing

Outcomes Statements

The Office of Financial Aid will be able to better manage and estimate Federal Work Study allocation use by manually creating offsets between offered and paid contracts, resulting in a more streamlined and efficient process leading up to year-end reconciliation for FISAP.

Learning Intervention

The Office of Financial Aid has created a manual process to manage Federal Work Study fund allocation. Instead of fielding contracts, approving them, placing them in one category in RPAAWRD, and trying to pay through Banner during a narrow 2-hour window each month, OFA has created an offered/paid offset arrangement. For each student, we can see not only how much has been encumbered via contract, but also how much has actually been paid, letting us know what each student has left and how much we can reallocate to others once the initial encumbrances are established.

Once a contract is approved, it is sent to HR; then OFA places the encumbrance amount in RPAAWRD under AFFWS with a HIRE status. As the student is paid each pay period, OFA records the offset: the paid amount is placed under the AFFWSP code, and the initial encumbrance is reduced by that amount, allowing us to see what each student has left and what we have spent.

A report is pulled after each pay period is paid so we can update and resolve any overaward that may have occurred since encumbrance due to additional need-based resources.

At the end of the award year, the reconciliation process is much easier because all of our encumbered students match the actual payroll. Issues with timesheets are resolved along the way when we can; if they were not caught by Accounting Services or us, we do a labor redistribution to fix the affected pay periods.

This process allows more control over our fund, helping us estimate any excess resulting from students not working the hours they were initially encumbered for, so we can offer funds to other eligible students more quickly instead of waiting for the HR window to pay through Banner. It has also eliminated issues with non-matching contract sequence codes, pay raises, and other discrepancies that, if not disclosed to the financial aid office during the year, would result in hours of reconciliation to match payroll. Our report, pulled directly from payroll, now affords us a much better outlook without copious last-minute corrections. This has also resulted in FWS balancing in July, once our last June pay period has been paid, as opposed to August.
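To make the offered/paid offset concrete, here is a minimal sketch of the bookkeeping as plain Python. This models the logic only; the real process lives in Banner (encumbrances under AFFWS in RPAAWRD, paid amounts under AFFWSP), and the amounts shown are hypothetical.

```python
# Minimal sketch of the offered/paid offset described above. This
# models the bookkeeping only; the real process lives in Banner
# (encumbrances under AFFWS in RPAAWRD, paid amounts under AFFWSP),
# and the amounts shown are hypothetical.
from dataclasses import dataclass

@dataclass
class WorkStudyAward:
    student: str
    encumbered: float  # contract amount (the "offered" side)
    paid: float = 0.0  # running total of actual payroll (the offset)

    def record_pay_period(self, amount: float) -> None:
        """Offset the encumbrance by the amount actually paid."""
        self.paid += amount

    @property
    def remaining(self) -> float:
        """What the student has left on the contract."""
        return self.encumbered - self.paid

awards = [WorkStudyAward("student A", 3000.00),
          WorkStudyAward("student B", 2500.00)]

awards[0].record_pay_period(400.00)
awards[1].record_pay_period(350.00)

# Unspent balances are what can be reallocated to other eligible
# students before year-end FISAP reconciliation.
print(f"remaining to reallocate: ${sum(a.remaining for a in awards):,.2f}")
```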

Measure

The measure is in the individual RPAAWRD statuses and amounts: we no longer must wait for a report from Accounting Services, but instead can balance our RPIFAWD overall encumbered amounts against our contracts, and balance our RPIFAWD paid amounts against the paid student amounts in the payroll report we pull from TOAD.

Data Collected

No

Financing Your Education

Outcomes Statements

Through varied communications with the Office of Financial Aid, prospective students and their families will understand that financial aid provides "access" and "choice" to students who need assistance to attend UAA, and that it is only a part of the financing picture.

Learning Intervention

The financial aid office has already set in motion a plan to show students how to pay for college on the website, using the framework of how to pay, how to apply, understanding your offer, and how to maintain your eligibility. Due to a delay in the website update, we have not gone much beyond that content. We are also reevaluating the financial literacy content to work better as a whole with paying for college, as the two are not mutually exclusive. The Financial Aid Counselor position was used to provide this content in person at student outreach events such as orientation and high school visits, and by working directly with students by appointment. Unfortunately, we were unable to conduct a survey of the impact before the position's term expired; however, survey results from students indicated the position was valuable and the content provided was useful. As we expand this outcome, we will incorporate a more comprehensive survey to single out the value of specific content.

Financing an education is not a one-size-fits-all exercise. While many believe financial aid is the answer to funding an education, the broader goal is to show how financial aid fits into the whole and how to consider other options alongside it to meet financial needs. Education is a series of choices, and it is important to show students and families that there are choices in how they finance an education and how that education is pursued (full time, part time, augmented with transfer credits or testing options such as CLEP or DANTES).

The assumption that financial aid should pay for everything is a relatively new concept; it was initially considered an equalizer, or access tool, to supplement funding for an education. Yearly aggregate limits on funding were not designed to cover all costs, particularly indirect costs such as housing, food, and transportation. This is where the financial aid office, as well as other campus partners, can do a better job introducing all of the options, such as student employment, payment plans, flexibility of credit load to spread out costs per semester, testing out of courses for credit, and ways to save money (bus pass, resident advisor status, living at home).

The Office of Financial Aid is working on content to show students how the cost of education is established, what direct versus indirect costs mean, how different forms of aid stack up against those direct costs, and how students and families can save money through specific behaviors while ensuring successful completion of their goals. This will require a partnership with advisors, recruiters, accounting services, and career services to ensure a holistic approach.

Measure

A long-term measure is to be determined. A survey was conducted to determine content relevancy and delivery; actual survey response was poor, most likely due to the short time the position had been in place.

Data Collected

Yes

Student Health and Counseling Center

Assessment Inventory

Bystander Intervention Trainings

Bringing in the Bystander is a sexual assault prevention program funded through a grant provided by the CDC via the State of 熊猫在线视频. The program is provided by the Health Care Promotion team of the SHCC, which includes Health Promotion Specialists and Peer Health Educators.

Assessment Type

Operational/Program Outcomes, Student Learning Outcomes

Key Findings

During FY19, the Bringing in the Bystander training was conducted 46 times with 593 participants.

500 people responded to surveys about the BIB training:

  • 67% of students stated they had a clear understanding of what an active bystander does prior to training. After training, this increased to 100%.
  • 70% of participants said they could look back on their experiences and recognize opportunities for bystander intervention prior to training. This rose to 96% after the training.
  • 94% of students stated they could identify a range of unacceptable behaviors that contribute to sexual violence. After training, this increased to 99%.
  • 75% of participants said they were very knowledgeable about the prevalence of sexual violence in 熊猫在线视频. After training, this rose to 98%.
  • 96% of participants said they have empathy for victims of sexual violence. After the training, 99% of participants stated they have empathy.
  • 83% of students stated they had the skills to evaluate the benefits and risks of intervening as a bystander prior to the course, whereas after the class 99% felt they had these skills.
  • 61% of participants stated they could identify resources that are available to support bystanders and victims/survivors. After training 97% felt they could identify such resources.
  • 88% of participants planned to be an active bystander prior to the training, whereas 96% thought they would be an active bystander after the training.

Depression Screening

Based on the recommendation of the U.S. Preventive Services Task Force that all patients seen in primary care be screened for depression, the Patient Health Questionnaire (PHQ-2), a sensitive two-question depression screen, was initiated for every patient scheduled for primary care.

Assessment Type

Comparative Benchmarking, Comprehensive Program Review, Operational/Program Outcomes

Key Findings

Tally of depression screenings from 6/21/18 to 12/21/18:

  • 968 PHQ-2 screens were completed
  • 107 (11%) received positive scores
  • 71 (66%) of positives were already receiving mental health services
  • 27 (75%) were screened further with the PHQ-9
    • 7 of these were already receiving mental health services
    • 20 had a mental health referral discussed, with 10 being referred and 10 declining referral
  • 9 (25%) were not screened further with the PHQ-9
    • 1 was sent to the ER
    • 2 were seen one week later and the repeat PHQ-2 was not positive
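The percentages above use shifting denominators: 107/968 is about 11% and 71/107 is about 66%, while the 75%/25% split appears to be taken over the 36 positives not already receiving services (27 + 9 = 36 = 107 − 71; the report does not state this explicitly). A short sketch of the arithmetic under that reading:

```python
# Reproduces the tally arithmetic above. The 75%/25% denominator
# (positives not already in services) is an inference, not stated
# explicitly in the report.
total_screens = 968
positives = 107
already_in_services = 71
phq9_screened, phq9_not_screened = 27, 9

remaining = positives - already_in_services  # 36

print(f"positive rate:        {positives / total_screens:.0%}")        # 11%
print(f"already in services:  {already_in_services / positives:.0%}")  # 66%
print(f"screened with PHQ-9:  {phq9_screened / remaining:.0%}")        # 75%
print(f"not screened further: {phq9_not_screened / remaining:.0%}")    # 25%
```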

Mental Health Quality Assurance Record Review

Each mental health provider reviewed two records per month of their peers' documentation of mental health appointments.

Assessment Type

Comparative Benchmarking, Comprehensive Program Review, Quality Assurance

Key Findings

Of the charts reviewed:

  • 100% had medications reconciled
  • 100% had medication response/side effects documented
  • 100% had an initial mental health evaluation completed
  • 100% had subjective and objective information documented
  • 100% had safety issues addressed
  • 100% had a plan of intervention appropriate to the supporting data
  • 100% had appropriate referrals
  • 100% had relevant labs ordered

Physical Health Quality Assurance Record Review

Each physical health provider reviewed two records per month of their peers' documentation of physical health visits with students.

Assessment Type

Comparative Benchmarking, Comprehensive Program Review, Quality Assurance

Key Findings

Of the 72 charts reviewed:

  • 94% of charts had allergies documented as reviewed
  • 97% of charts had medications documented as reviewed
  • 100% of charts had subjective information recorded that was relevant to the chief complaint.
  • 100% of charts reviewed had a physical exam appropriate to the history and subjective data.
  • 100% of charts reviewed had a diagnosis that was consistent with the subjective and objective data.
  • 100% of charts reviewed had an appropriate treatment plan.

Student Satisfaction Survey

Students are emailed a satisfaction survey one week after receiving SHCC services. Between 7/1/2018 and 6/30/2019, 430 students responded to this survey.

Assessment Type

Student Satisfaction/Perception

Key Findings
  • 91.9% of students stated that they were satisfied or very satisfied with the services they received at the SHCC
  • 94.6% of students reported being satisfied or very satisfied with the physical health services they received at the SHCC.
  • 91.1% of students were satisfied or very satisfied with the ease of scheduling an appointment at the SHCC.
  • 96.5% of students felt their healthcare provider answered their questions very well or moderately well.
  • 94.9% of students felt their healthcare provider explained their follow-up care very well or moderately well.
  • 76.7% of students were satisfied or very satisfied with the waiting area options.
  • 96.7% of students felt the location of the SHCC is very convenient or somewhat convenient.
  • 74.8% of students agreed that the health-related services they received at the SHCC were helpful to their academic success.

Learning Outcomes

Behavioral Strategies to Improve Healthy Habits in Students who are Overweight or Obese

Outcomes Statements

Students being seen for primary care visits at the SHCC will be screened for overweight or obesity. Those with an elevated BMI will be offered counseling regarding healthy diet and exercise habits and further screening if needed.

Learning Intervention

Students being seen for primary care physicals will complete a nutrition assessment and indicate their willingness to change. Those willing to change will be offered a follow-up appointment for nutrition and exercise counseling and further screening if needed.

Measure

A chart audit of patients seen from January through June 2019 will be conducted, recording the number of patients who were overweight or obese, the number who completed a nutrition assessment, the number who indicated interest in counseling for diet and exercise, and the number who made a follow-up appointment.

Data Collected

No

Behavioral Techniques Related to Treatment of Anxiety and Depression

Outcomes Statements

Students engaged in SHCC Mental Health Services, who have a diagnosis of depression or anxiety, will be able to identify three behavioral techniques aimed at improving target symptoms at their two week follow-up visit.

Learning Intervention

SHCC clinicians will meet with students individually to evaluate target symptoms related to depression and/or anxiety. These symptoms will be discussed, reviewing behavioral interventions that have worked for the student in the past or suggesting new ones.

This information will be documented in the Medicat EHR system.

Measure

A chart audit will be conducted at the end of the Spring 2020 semester, reviewing 40 records of students who received a diagnosis of depression or anxiety. The audit will focus on noting the student's ability to name behavioral interventions that have helped them have fewer symptoms of depression and/or anxiety.

Data Collected

No

Student Life and Leadership

Assessment Inventory

Daily Den Usage

The Daily Den works to have every student swipe or enter their UAA ID when getting food. Demographic data and usage are compiled.

Understand the population and usage of the Daily Den.

Assessment Type

Usage/Tracking Data

Key Findings

The Daily Den served 8,141 meals in 2018-2019 to 1,358 unique students; 90% of students participating in the Daily Den are commuter students.

Haunted Halloween Fun Night

The Haunted Halloween Fun Night assessment focused on organization members who provided a game or booth at the event.

The assessment works to understand booth participants' satisfaction with the setup and how to improve in the future.

Assessment Type

Comprehensive Program Review, Student Satisfaction/Perception

Key Findings

Overall, the feedback on the event, from registration to booth management, was very positive.

The main areas for improvement are communicating the floor plan earlier in the week and better organizing volunteers before the event.

Overall, the event was a large success for both the student organizations and the kids who attended.

KRUA Listenership Assessment

KRUA is sending out a listenership survey to assess student listeners' preferences and knowledge of the station.

Determine statistics on student listenership and awareness of KRUA, and assess student listening habits.

Assessment Type

Comprehensive Program Review, Stakeholder Needs, Student Satisfaction/Perception

Key Findings

According to KRUA's 2018 annual listenership assessment, more than 78% of survey respondents were aware that UAA has a student-run radio station, up from 70% last year; over 30% of survey respondents listen to KRUA more than once a week; more than 65% of survey respondents are likely to attend an event they heard about on the radio; and 79% of listeners agree that KRUA introduces them to diverse music and perspectives.

Leadership Conference

Assessment to understand how students found out about the conference, why they attended, and something they learned.

Understand student motivation to attend leadership conferences and how to increase awareness and attendance.

Assessment Type

Comprehensive Program Review, Student Satisfaction/Perception

Key Findings

Students were motivated to attend the conference for the following reasons: to gain, improve, or strengthen leadership skills, and to become more involved in the university.

In the assessment survey, students named their favorite sessions: Psychology of Diversity and Dream Boards.

Lessons learned throughout the conference: setting attainable goals, diversity, and the importance of self-care.

Student Life and Leadership Alumni Assessment Inventory

Assessment given to Student Life and Leadership alumni to better understand the effects of working in SLL after a student has graduated.

Understand the benefits and skills students learned and were able to carry into a professional career.

Assessment Type

Stakeholder Needs, Student Satisfaction/Perception

Key Findings

None provided.

TNL Readership Assessment

The Northern Light hopes to gain a better understanding of how students get their news, what they're reading, and what they want more of.

Understand how students are consuming their news and whether TNL can cater to that. News is quickly moving to online platforms, and we hope this survey can help identify how and what the UAA demographic reads.

Assessment Type

Comprehensive Program Review, Stakeholder Needs, Student Satisfaction/Perception

Key Findings

According to The Northern Light's 2019 annual readership assessment, more than 75% of readers agree that reading The Northern Light connects them to the campus community; more than 65% of survey respondents read The Northern Light, and of those readers, more than 50% read online.

UAA Campus Kickoff

Assessment to gather student feedback on all aspects of Campus Kick-Off.

Understand the success of all Campus Kick-Off events.

Assessment Type

Comprehensive Program Review, Student Satisfaction/Perception

Key Findings

According to the 2018 Campus Kick-Off assessment, the top three reasons students participate in Campus Kick-Off are to be part of a UAA tradition, to learn more about UAA, and to meet new people. 66% of students were participating in Kick-Off for the first time.

UAA Campus Sustainability Assessment

Assessment to gauge awareness of UAA sustainability programs and generate new sustainability ideas.

Sustainability awareness, sustainability promotion, and student feedback on UAA's sustainability efforts.

Assessment Type

Comprehensive Program Review

Key Findings

According to the UAA Green Fee Board's 2019 campus sustainability assessment, 41% of students are aware that UAA has grant funding available for student-led on-campus sustainability initiatives; 90% of students assessed recycle on campus, with responses ranging from always to sometimes; and 75% of students would support banning plastic straws.

Learning Outcomes

Emerging Leaders Program

Outcomes Statements

Illustrate an awareness of and commitment to social and civic responsibility.

Learning Intervention

During the Emerging Leaders Program in the spring semester, students participated in a service project with the Emergency Food Cache. We partnered with Student Health and Counseling Services to have students assist in putting together 100 bags for the Emergency Food Cache. Betty Bang gave an introduction to the Food Cache and its impact on hunger in the campus community.

Measure

We facilitated a reflection exercise with the student participants, as well as collected feedback through a participation survey at the end of the retreat.

Data Collected

No

Student Outreach and Transition

Assessment Inventory

None provided.

Learning Outcomes

None provided.

TRIO Student Support Services

Assessment Inventory

SSS Bridge Program Pre- and Post-Tests

The SSS Bridge Program pre-test examined the level of institutional knowledge participants had prior to the Bridge Program, as well as their level of preparedness for beginning their first semester of college at UAA. The post-test measured the same knowledge and preparedness after participants attended the Bridge Program, as a benchmark for evaluating student learning outcomes, satisfaction, and program outcomes. Participants also had the opportunity on the post-test to suggest ways to improve the program to meet their needs.

Assessment Type

Comparative Benchmarking, Operational/Program Outcomes, Student Learning Outcomes, Student Satisfaction/Perception

Key Findings
  • 80% of participants reported being able to identify resources on campus that could help them academically after attending the SSS Bridge program
  • 88% of participants agreed or strongly agreed that all of their questions about starting freshman year were adequately answered
  • 70% of participants agreed or strongly agreed that they felt more prepared to start college after attending the SSS Bridge program; 25% of participants responded neutrally

SSS Participant Cohort Banner Reports

Banner reports are used to monitor SSS participants' performance and success throughout the academic year and to track our program's progress in achieving our grant objectives regarding persistence/retention, good academic standing, and baccalaureate degree attainment. We track participants' fall/spring/summer enrollment, mid-term grades and GPA, academic standing, SAP and financial aid requirements and awards, semester bills/balances, and holds.

Assessment Type

Comparative Benchmarking, Operational/Program Outcomes, Stakeholder Needs, Student Learning Outcomes

Key Findings

None provided.

Learning Outcomes

SSS Bridge Program

Outcomes Statements

As a result of participating in the SSS Bridge Program, first-generation, low-income, and/or students who experience a disability will be able to identify people and departments at UAA that they can access as resources to support academic and personal success.

Learning Intervention

The SSS Bridge Program is a two-day transition/orientation program for incoming freshmen and transfer students who have been accepted into the SSS program. Participants of the Bridge Program connect with SSS staff, peers, Peer Mentors, Student Affairs departments (Financial Aid, Career Services, Student Health and Counseling), administrators and staff, and academic advisors and first-generation faculty. Participants engage in presentations, Q&A panels, and sessions designed to increase their ability to remember and understand the resources and supports available on campus.

Measure

SSS participants' institutional knowledge and understanding of campus resources were surveyed at the beginning and end of the Bridge Program using pre- and post-tests.

Data Collected

Yes