CoA eDiscovery and Redaction Tool Replacement — A Case Study
(Note to reader: This contract has not yet been finalized and is considered confidential. The company name, logo, interactive prototype, and security details are not included.)
In keeping with its commitment to a transparent and accessible city government, the City of Austin sought a solution for one particular responsibility: the management of public information requests. This project presents an important opportunity to demonstrate that commitment. A digital solution is needed to support the members of the public and City staff who participate in the Public Information Request (PIR) process. When it comes to PIRs, the City of Austin seeks to provide a customer experience that is efficient, transparent, and compliant with public information laws.
UX Researcher and UI Consultant for the Law Department
Tools: Teams, Trello, Miro
research | planning | strategy | workflows | affinity mapping | user testing
Aug 2017 — Contract has not yet been finalized
For this project, I was provided with some background information and tasked with determining stakeholder needs. To gather additional information, I conducted primary research and usability testing. I was also responsible for providing user interface consultation.
The programs currently in use do not promote efficiency, automated redaction and review, or collaborative communication. A major business goal is a platform that brings all of these elements together in one place: a one-stop shop.
- Satisfy a City Council resolution and respond to PIRs efficiently and on time
- Enhance efficiency in meeting regulatory requirements and deadlines
- Support secure internal and external communication and collaboration
- Automate review and redaction process pursuant to council resolution
User Research — Understanding the User
All of the programs we currently use to complete and close a public information request get the job done, but how might we improve the process? I conducted user research with a departmental survey and learned that it takes nearly 10 business days to complete a request, which cuts it dangerously close to the mandated deadlines.
The following was distilled from the survey results and listening to stakeholders’ pain points:
Proficiency Dictates Efficiency
Proficiency levels vary among departments and users, and there is an obvious struggle to stay on top of all of these programs with respect to training, licensing, and consistency. Unfortunately, that struggle can put mandated deadlines at risk.
“we really need to have clearer instructions on how to access posted documents”
Because correspondence can come from many different places, communication is missed between departments and between the City and the public. This also hurts efficiency and creates a negative perception of governmental transparency.
“most of the messages go straight to the junk/spam folder in their inbox”
Having to review hundreds or even thousands of pages, page by page, for confidential information is simply too tedious. It takes staff hours, multiple programs, and many cups of coffee. Because the process is so inefficient, requestors pay the price in money and time.
“it’s the last thing on my list of stuff to do”
With all this in mind, everything boiled down to one point made by the City Council: the need to respond to PIRs efficiently and in a timely manner.
Mapping — Synthesizing Research
To deduce the main business goals, I organized my insights with an affinity map. After sorting the insights into categories, I determined which features were essential for a minimum viable product. Among the findings:
- Efficiency Issues: “if we could just filter selections prior to creating a request, it would cut down on time loss” and having to ask so many follow up questions.
- Miscommunication: “Most of the messages go straight to the junk/spam folder”
- Too tedious of a task: “It’s the last thing on my list of stuff to do”
Ideation — Navigating the PIR Process
I created a user flow of what I determined to be the red routes of the PIR process. I phrased the red routes as “How Might We” questions so that I could better understand the business’s needs:
- How might we send correspondence to the requestor?
- How might we collaborate with departments and attorneys?
- How might we make necessary redactions?
- How might we send cost estimates?
- How might we upload responsive documents?
- How might we review responsive documents?
Usability Testing — Iterating
The team has been testing for nearly a year, with over ten work sessions. The point of these stakeholder test sessions is to determine whether the new system will enhance our standard operating procedures, increase efficiency, work easily across multidisciplinary teams, and decrease the time it takes to respond to a PIR. These sessions led to many concerns and UI considerations:
1. Can a preview window be added for responsive data outside of ecommunications to reduce the iterations of back-and-forth correspondence?
2. Can we add a filter to the communication search to reduce reviewing time?
3. How can you export or view an error report to mitigate future issues? (No definitive answer yet.)
4. Can you add a filter to search people-only distribution lists, or does one already exist?
5. Can the auto-redaction action offer a drop-down menu or list to choose from instead of a query function?
The platform allows for privileged and restricted viewing to control who gets to see what data, and for how long, preventing the release of sensitive data. Users can leave virtual sticky notes or annotations for peers, or leave reviewer remarks. Communication lives in one place rather than being scattered across phone, email, portal notes, and Teams messages. Miscommunication is a thing of the past.
It performs complex searches using keywords, dates, or other advanced criteria across multiple data locations and users, including distribution lists. You can also filter searched data because the tool manages metadata and classifies documents. This increases search speed and reduces the amount of unnecessary review.
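The kind of metadata-driven search and filtering described above can be illustrated with a minimal sketch. This is not the product's actual API; the `Document` fields and the `search` function here are hypothetical, chosen only to show how keyword, date-range, and sender filters combine to narrow a review set:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Document:
    """Hypothetical record with the metadata the tool might track."""
    title: str
    sent: date
    sender: str
    body: str

def search(docs, keyword=None, start=None, end=None, sender=None):
    """Return only the documents matching every supplied filter.

    Each filter is optional; skipping all of them returns everything.
    """
    results = []
    for d in docs:
        if keyword and keyword.lower() not in d.body.lower():
            continue  # keyword filter: body must contain the term
        if start and d.sent < start:
            continue  # date filter: sent on or after the start date
        if end and d.sent > end:
            continue  # date filter: sent on or before the end date
        if sender and d.sender != sender:
            continue  # metadata filter: exact sender match
        results.append(d)
    return results
```

Stacking filters this way is what shrinks a review set from thousands of pages to only the plausibly responsive ones before any human looks at a page.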
Auto-classification and Auto-redaction
The tool auto-classifies searched data by placing the information in virtual buckets. Rule sets can be created to filter what is relevant and what is not. Users can auto-redact sensitive data with a query search pattern or choose from a pre-programmed list of auto-redaction rules (until the update rolls out, you create that list yourself). This cuts down on redundancy and the time lost to manual review.
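To make the idea of pattern-based auto-redaction concrete, here is a minimal sketch. The rule names and regular expressions are assumptions for illustration only, not the tool's actual rule list or query syntax; they simply show how a pre-programmed set of "query search patterns" can strip sensitive values in one pass:

```python
import re

# Hypothetical pre-programmed rules: a label and a regex pattern
# for each category of sensitive data to be auto-redacted.
REDACTION_RULES = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def auto_redact(text, rules=REDACTION_RULES):
    """Replace every match of every rule with a labeled marker."""
    for label, pattern in rules.items():
        text = pattern.sub(f"[REDACTED:{label}]", text)
    return text
```

The labeled markers matter for review: a human checking the output can see at a glance which rule fired where, instead of re-reading every page for missed values.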
One key lesson I learned, and an assumption I had to throw out the window, was that everyone was operating at the same level of proficiency with the PIR process. Not everyone was proficient with technology, and not everyone had the same tools to do their job. It was extremely important that I surveyed and spoke to every single point of contact across all of the departments, and got hands-on with stakeholders in work sessions to see firsthand the pain points they dealt with daily. That allowed me to truly empathize with and understand the problems that desperately needed to be solved.
To reiterate, this contract is not yet finalized and the project still has several final security testing sessions scheduled. However, my role in the project is coming to an end and I am excited to set up training processes for the new tool when rollout begins. As for next steps, I hope to make my way as the system lead trainer for the City at large and continue to collaborate on projects like these in the future.