Court Filing Ingestion/Extraction Revamp
Description:
Revamp Reorg’s legacy Court Filing Ingestion tools and processes supporting $100M in revenue
Business Objective:
Update the internal tools managing the scraping of court cases to reduce wasted manual effort, and improve the scrapers to reduce PACER fees and outages without affecting downstream workflows or the client experience
My Role:
Senior Product Manager
Company:
Reorg
Launch Date:
August 2024
Strategy and Background
Reorg’s founding products were all built on ingesting court filings, extracting structured data from them, and reporting on filings and court proceedings. Dozens of products, and the majority of Reorg’s revenue, have some connection to documents scraped from the PACER court filing system. The scrapers that retrieved this information and the internal tools for managing court dockets and filings were hard to maintain, slow, costly to run, and had been extended haphazardly over time.
As part of a value creation plan agreed upon with Reorg’s Executive Leadership Team, a revamp of this system was identified as a top priority. I was brought in to lead the business design of the UI/UX and to ensure overall project success.
I carried out extensive discovery with the Reorg operations team and the developers maintaining the system to understand its existing features and the major pain points in current workflows.
The Plan
- Create an all-new UI/UX for managing court cases and filings in order to drastically simplify setup and management.
- Update the data-input tools used to enter extracted case data
- Update all client and internal notifications to provide deep links into the platform, improving the experience and saving time
- Revamp the PACER scrapers and scraping logic to reduce scraping costs and eliminate logic gaps
Execution
- Conducted extensive discovery and user research to understand the current workflows of the business operations team, application support, and developers
- Detailed requirements for a new UI/UX that halved the number of pages required to complete work while maintaining all existing required features
- Collaborated with a designer to create mockups and prototypes in Figma and gathered feedback from the operations team
- Worked closely with the data engineering team as the voice of the user to help improve the business logic around scraping
- Worked with QA to define and sign off on test plans
- Coordinated between Dev and QA to review work in progress, write bug tickets, and ensure work followed the established requirements and mockups
- Coordinated with product strategy to lead two rounds of UAT that brought business users into test environments for sign-off on the new tooling
- Used SQL to run extensive tests on data migrated through ETL flows and to validate the new scraping logic
- Established a roadmap for future improvements to the UI/UX and scraping logic to further reduce scraping costs and manual workflows
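The ETL validation work above can be sketched with a pair of reconciliation checks: comparing row counts between the legacy source and the migrated target, and finding records that exist in the source but are missing post-migration. This is a minimal illustration only, using SQLite for a self-contained example; the table and column names (legacy_filings, migrated_filings, docket_id) are hypothetical stand-ins, not Reorg's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in tables for the legacy source and the migrated target.
# One filing (id 3) is deliberately dropped from the migration.
cur.executescript("""
CREATE TABLE legacy_filings   (filing_id INTEGER PRIMARY KEY, docket_id TEXT);
CREATE TABLE migrated_filings (filing_id INTEGER PRIMARY KEY, docket_id TEXT);
INSERT INTO legacy_filings   VALUES (1, 'D-100'), (2, 'D-100'), (3, 'D-200');
INSERT INTO migrated_filings VALUES (1, 'D-100'), (2, 'D-100');
""")

# Check 1: overall row counts should match after migration.
legacy_count = cur.execute("SELECT COUNT(*) FROM legacy_filings").fetchone()[0]
migrated_count = cur.execute("SELECT COUNT(*) FROM migrated_filings").fetchone()[0]

# Check 2: filings present in the legacy system but missing post-ETL.
missing = cur.execute("""
    SELECT l.filing_id, l.docket_id
    FROM legacy_filings l
    LEFT JOIN migrated_filings m ON m.filing_id = l.filing_id
    WHERE m.filing_id IS NULL
""").fetchall()

print(legacy_count, migrated_count, missing)  # 3 2 [(3, 'D-200')]
```

In practice, queries like these would be run per docket and per filing type so that any gap introduced by the ETL flow surfaces before sign-off rather than in a client-facing workflow.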