U-M Office of Student Conflict Resolution (OSCR)
The Office of Student Conflict Resolution promotes a safe and scholarly community in which students navigate conflict. This project aimed to uncover issues with sharing and finding information on OSCR’s website through a needs assessment and a usability evaluation.
Collaboration with OSCR took place through the University of Michigan course Needs Assessment & Usability Evaluation.
OSCR’s Problem
Users of OSCR’s services struggle to navigate its website successfully due to information overload and the amount of jargon used on the site.
Interaction Mapping
Interaction mapping was used to gain a high-level understanding of the information architecture of OSCR’s site, concentrating on 3 major user interactions identified by OSCR’s staff: Background Checks, MIP information, and conflict resolution resources.
User Interviews
Conducted 4 interviews with past users, likely users, and staff members about how they navigate and search for information, to better understand and empathize with users.
Key Questions:
Q1: What are the needs of current users and potential users of OSCR’s website?
Q2: What are these users’ current perceptions of this website?
Q3: What are the most valuable resources that OSCR provides to users?
Methods
We recruited participants for our study through social media solicitation and by distributing promotional material, and used snowball sampling to recruit additional participants after conducting in-person interviews. The interviews were audio-recorded, transcribed, and thematically analyzed. Through this thematic analysis, we identified emerging themes about the underlying issues with OSCR’s website.
Findings
F1: There is a lack of knowledge around what OSCR is and what they offer.
F2: Preferred conflict resolution styles likely dictate how one seeks out information and resources.
F3: Power dynamics and trust are a primary concern for potential users.
Comparative Analysis
Our team researched 9 “competitors” to OSCR (direct, indirect, partial, parallel, and analogous) to identify what OSCR’s site does well and where it can improve in terms of information retrieval and the variety of resources each respective competitor offers.
Key Questions:
Q1: Which organizations offer services similar to OSCR?
Q2: How are competitors presenting information on their pages?
Q3: What features make competitors’ web pages easier or more difficult to navigate?
Methods:
We compared competitors with OSCR using comparative tables, judging whether each criterion was implemented well. The criteria were split into two categories and scored using a scoring table:
Easy Information Retrieval
Variety of Resources
For each criterion, a “yes” (implemented well) was assigned 1 point, a “no” (not implemented well) -1 point, and a “maybe” (partially implemented) 0.5 points.
A 2-by-2 matrix was then used to visualize each competitor’s scores across the two categories.
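To make the point scheme concrete, here is a minimal sketch of how the scoring could be computed; the competitor names and criterion labels are hypothetical placeholders, not the actual rubric items used in the study.

# Hypothetical scoring sketch: competitor and criterion names are illustrative.
RATING_POINTS = {"yes": 1.0, "no": -1.0, "maybe": 0.5}

# Each competitor is rated per criterion, grouped by the two categories.
ratings = {
    "Competitor A": {
        "Easy Information Retrieval": {"site search": "yes", "clear labels": "maybe"},
        "Variety of Resources": {"self-help guides": "no", "scheduling": "yes"},
    },
    "Competitor B": {
        "Easy Information Retrieval": {"site search": "no", "clear labels": "yes"},
        "Variety of Resources": {"self-help guides": "yes", "scheduling": "maybe"},
    },
}

def score_competitors(ratings):
    """Sum points per category so each competitor can be placed on the 2-by-2 matrix."""
    return {
        competitor: {
            category: sum(RATING_POINTS[r] for r in criteria.values())
            for category, criteria in categories.items()
        }
        for competitor, categories in ratings.items()
    }

print(score_competitors(ratings))
# e.g. {'Competitor A': {'Easy Information Retrieval': 1.5, 'Variety of Resources': 0.0}, ...}

Each competitor’s pair of category totals then gives its position on the 2-by-2 matrix.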
Findings:
F1: The OSCR website includes ambiguous information about what OSCR does and who should use its services.
F2: The OSCR website’s navigation is difficult to use.
F3: OSCR lacks an easy scheduling process.
F4: OSCR’s website is complex.
Surveys
In this survey, which had around 36 responses, our team aimed to explore the relationship between conflict resolution styles, the severity of the conflict, and how people resolve conflict. Overall, these quantitative findings helped us to better understand user motivations for using (or not using) OSCR.
Key Questions:
Q1: Is there a correlation between a user's conflict resolution style and the resources that people seek out to resolve conflict?
Q2: To what extent does the severity of conflict affect the ways users seek out conflict resolution resources?
Q3: How does the awareness of university resources influence the likelihood of use?
Q4: What features should be prioritized on the OSCR website to create a better user experience?
Methods:
We created a pilot of our survey in a Microsoft Word document. Due to the sensitivity of the information obtained, we emailed the pilot survey to OSCR to ensure we had client approval. For the final version, we changed the answer style of several questions based on feedback from the instructional team and introduced situational questions. We also changed the order of questions so that our results would not be affected by recency bias. The final survey was built in Qualtrics, and before sending it we added more frequent page breaks as well as clearer instructions (e.g., drag and drop to re-order) to improve its flow. We sent the survey link to listservs at the University of Michigan, which served as our sampling frame.
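One way to examine Q1 and Q2 is to cross-tabulate responses by conflict resolution style, conflict severity, and the resource respondents would seek out. The sketch below assumes a simplified, hypothetical export format rather than the actual Qualtrics schema.

# Illustrative analysis sketch only: column names and values are hypothetical.
from collections import Counter

# Each response records a conflict resolution style, a conflict severity level,
# and the resource the respondent would seek out.
responses = [
    {"style": "collaborating", "severity": "high", "resource": "university office"},
    {"style": "avoiding", "severity": "low", "resource": "friends/family"},
    {"style": "compromising", "severity": "high", "resource": "university office"},
]

def cross_tab(responses, row_key, col_key):
    """Count how often each (row, column) pair occurs, e.g. style vs. resource sought."""
    return Counter((r[row_key], r[col_key]) for r in responses)

print(cross_tab(responses, "style", "resource"))
print(cross_tab(responses, "severity", "resource"))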
Findings:
F1: Users generally seek out the same resources to resolve conflict, regardless of their conflict resolution style.
F2: Respondents tend to seek more external help and guidance when they encounter a severe conflict.
F3: Being familiar with a resource on campus does not always equate to seeking out that resource’s services.
F4: When looking for information on a website, people are more likely to utilize the search bar and navigation.
F5: In general, people are more comfortable scheduling appointments through an online form or calendar.
Heuristic Evaluations
Our team developed more specific items to check under Nielsen’s usability heuristics to gain a deeper understanding of where OSCR’s site falls short from a usability standpoint.
Key Questions:
Q1: What kind of usability issues does the OSCR website currently have?
Q2: How can we sort the usability issues by severity?
Q3: What practical recommendations can we make for OSCR to resolve the usability issues?
Methods:
After deciding on our heuristics, we compiled a list of questions used to evaluate the site, as well as a severity rating scale to determine the prioritization of findings. Each member of our team performed a heuristic evaluation of the site individually.
Once the individual evaluations were complete, we came together to aggregate our findings and collectively decided on a final priority score for each heuristic component. Throughout this process, some items had ratings and comments that aligned across the whole group. On items where ratings differed, we took turns explaining our rationale for the ratings and comments we had provided. In doing so, we were able to discuss each item thoroughly and reach a consensus on its priority rating.
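As a rough illustration of that aggregation step, the sketch below pools hypothetical individual severity ratings and flags items where evaluators diverge enough to warrant discussion; the item names, rating scale, and divergence threshold are assumptions, not the team’s actual rubric.

# Hypothetical sketch: heuristic items, ratings, and the 0-4 severity scale are illustrative.
from statistics import mean

individual_ratings = {
    "Error prevention: form validation": [3, 4, 3, 3],
    "Help and documentation: FAQ visibility": [2, 4, 1, 3],
}

for item, ratings in individual_ratings.items():
    spread = max(ratings) - min(ratings)
    flag = "discuss" if spread >= 2 else "aligned"
    print(f"{item}: mean={mean(ratings):.1f}, spread={spread} -> {flag}")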
Findings:
F1: Forms on the OSCR website could perform better.
F2: Technical terminology contributes to comprehension barriers, especially for new users.
F3: The site offers minimal help and documentation.
(Remote) Usability Test
Conducted 5 remote usability tests informed by the heuristics we identified previously, asking users to complete specific tasks on the OSCR site. Although we were constrained by limitations such as remote-only testing due to COVID-19 and desktop-only testing, many of our findings echoed what we uncovered through previous research (interviews, surveys, and the heuristic evaluation).
Key Questions:
Q1: What do users think of the OSCR website?
Q2: Is the process of requesting services from OSCR intuitive?
Q3: Is there anything that hinders the users’ ability to find the information they’re looking for?
Q4: Are users able to find specific content using the navigation and search functions?
Methods:
Our team identified 4 tasks in detail and categorized them into four main areas: forms, legal resources/navigation, search/navigation, and self-help resources/navigation. In determining success criteria for these tasks, we considered a baseline number of interactions and whether users were able to successfully navigate to specific pages to complete the tasks.
We compiled several forms and questionnaires to supplement the usability test itself. A pre-test demographics questionnaire and a post-test questionnaire, which included open-ended and Likert-scale questions, captured users’ thoughts about their experience on the OSCR website.
A data-logging form was created for observers to record data during each test; it included columns for time to completion for each subtask and a Yes/No/Partial completion status.
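A minimal sketch of how the logged data could be summarized per task (completion rate and average time) follows; the task rows, timings, and the partial-credit weighting are assumptions made for illustration, not the actual test results.

# Hypothetical data-log summary: task names, times, and completion values are illustrative.
from statistics import mean

COMPLETION_CREDIT = {"Yes": 1.0, "Partial": 0.5, "No": 0.0}

# One row per participant per task: time to completion (seconds) and completion status.
log = [
    {"task": "forms", "time_s": 95, "completion": "Yes"},
    {"task": "forms", "time_s": 140, "completion": "Partial"},
    {"task": "search/navigation", "time_s": 60, "completion": "Yes"},
    {"task": "search/navigation", "time_s": 210, "completion": "No"},
]

for task in sorted({row["task"] for row in log}):
    rows = [row for row in log if row["task"] == task]
    rate = mean(COMPLETION_CREDIT[row["completion"]] for row in rows)
    avg_time = mean(row["time_s"] for row in rows)
    print(f"{task}: completion rate={rate:.0%}, avg time={avg_time:.0f}s")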
Findings
F1: Information overload caused users to have difficulty finding specific information.
F2: External and recycled links led to backtracking and uncertainty.
F3: Search functionality was integral to helping users find information.
F4: Language used on the site was not understood by all users.
Overarching Findings and Recommendations
Findings and recommendations were presented to OSCR over two presentations during the project. Presentation 1 covered Interaction Mapping, User Interviews, and Comparative Analysis; Presentation 2 covered Surveys, Heuristic Evaluations, and Usability Testing.
Findings:
F1: Conflict resolution styles do not impact which resources (e.g., OSCR) users seek out to navigate conflict; conflict severity does.
F2: The complexity of OSCR’s site negatively impacts visibility, making it challenging for users to navigate and find the information they need.
F3: Information and jargon overload cause issues for users trying to narrow the scope of their information inquiry.
Recommendations
R1: Increase the visibility of frequently asked questions on the homepage to reduce user interactions within the website.
R2: Minimize the number of duplicate links and inform users of external links.
R3: Consider introducing a more robust search functionality, especially for mobile users.
R4: Decrease technical language used on the website.