Evolving the Schedule Visualization Experience in Portal3
Overview
Scheduling content to screens is at the core of managing a digital signage network. While our scheduling flows undergo regular updates, it’s equally critical to ensure users can easily review and validate scheduled content. This case study details the evolution of our scheduling visualization tool through four key phases: the MVP, Phase 2, Phase 3, and Phase 4.
My Role
As the primary Product Designer, I led the UX strategy for the MVP and Phase 4, conducting user research, competitive analysis, and iterative design improvements. In Phase 2, I mentored a new designer, guiding them through user-centered design principles and customer feedback analysis. Throughout all phases, I collaborated closely with engineers, product managers, and customers.
Problems
For the MVP, I began by auditing how users viewed schedules in the legacy platform, uncovering the following pain points:
The “slots” concept in our legacy platform was confusing and largely irrelevant for non-DOOH workflows, which is where most of our customers fell.
The timeline was inaccurate, making it difficult to effectively validate events scheduled with hour parting.
Selecting a date was cumbersome.
It was not possible to schedule content while viewing a schedule, so users had to keep multiple tabs open.
There was no way to reorder events on a schedule without removing and re-scheduling them.
I also conducted a competitive analysis of how competitors let users view scheduled content in their products. While some were decidedly better than ours, I noticed a recurring pain point: users could only view one schedule at a time. There simply wasn’t a way to easily compare what was scheduled on multiple devices simultaneously, something imperative for an impactful content strategy when working with digital screens in close physical proximity to each other.
Opportunity
How might we create an intuitive, flexible scheduling visualization tool that simplifies content validation, supports multi-device comparison, and scales with evolving user needs?
Goals
Simplify the UI
Create an effortless, intuitive scheduling interface.
Enable Multi-Device Comparison
Provide visibility into multiple schedules to facilitate content synchronization and curation.
Improve Device and Event Management
Streamline device selection and event reordering for greater efficiency.
Empower Direct Content Management
Allow users to schedule and modify content directly from the visualization tool.
Phase 1: The MVP
Objective: Launch a simplified, competitive scheduling tool for Portal3.
For the MVP, I focused on simple and clear solutions to directly address the pain points I discovered.
Key actions:
Audited the legacy platform to identify critical UX issues.
Conducted competitive analysis to uncover industry gaps.
Designed the Canvas Comparison Tool for multi-device viewing.
Removed the confusing “slots” concept and introduced drag-and-drop event reordering (see the sketch after this list).
Enabled direct scheduling from the visualization tool.
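To make the reordering concrete, below is a minimal sketch of the drop handler’s core logic in TypeScript. The ScheduleEvent shape and the reorderEvents helper are hypothetical illustrations, not Portal3’s actual implementation:

```typescript
// Hypothetical sketch: reordering scheduled events after a drag-and-drop.
// The ScheduleEvent shape below is illustrative, not Portal3's data model.
interface ScheduleEvent {
  id: string;
  name: string;
}

// Pure reorder: move the dragged event from one index to another,
// returning a new array so UI state updates stay predictable.
function reorderEvents(
  events: ScheduleEvent[],
  fromIndex: number,
  toIndex: number,
): ScheduleEvent[] {
  const next = [...events];
  const [moved] = next.splice(fromIndex, 1);
  next.splice(toIndex, 0, moved);
  return next;
}

// Example: dropping the third event into the first position of the day.
const reordered = reorderEvents(
  [
    { id: "a", name: "Morning Promo" },
    { id: "b", name: "Lunch Menu" },
    { id: "c", name: "Evening Event" },
  ],
  2,
  0,
);
console.log(reordered.map((e) => e.name)); // ["Evening Event", "Morning Promo", "Lunch Menu"]
```

Returning a new array rather than mutating in place keeps the change easy to render and to undo, which matters when a reorder also needs to be persisted back to the scheduling backend.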
Impact: Delivered a streamlined scheduling experience, differentiating Portal3 with multi-device comparison capabilities.
By the time we were ready to revamp schedule visualization again, Portal3 had grown. New capabilities had been added, including mass scheduling, scheduling to video walls, and advanced options like hour parting. The revamped design of the schedule visualization page needed to accommodate all of these new features. At this stage, I served as a consultant, educating a new designer and grounding them in the voice of the customer.
Phase 2: Expanding Features
Objective: Support new features like hour parting, priority events, playlist scheduling, and scheduling to video walls.
Key actions:
Mentored a new staff designer, providing guidance around use cases and championing the voice of the customer.
Worked with developers to ensure the accuracy of the updated timeline.
Added additional event types to the visualization key (e.g., playlists, priority events, orchestrated events).
Updated hardware selection to accommodate both single devices (“Canvases”) and video walls (“Walls”).
Revised all copy and prepared the design file for handoff to engineers.
Impact: Enhanced schedule visualization flexibility, accommodating complex scheduling needs while maintaining usability.
After Phase 2, it was imperative to capture user feedback on the schedule visualization tool in order to mature it. Distilling the feedback uncovered several pain points, not all of them directly related to the visualization itself; some concerned the workflows possible on the page.
Phase 3: Refining the Experience
Objective: Address user feedback to improve content selection and schedule clarity.
Impact: Boosted efficiency and user satisfaction by reducing friction in content scheduling and validation workflows.
Reworking Content Selection to Empower Users
Pain point: When scheduling content from the schedule visualization tool, users reported that the content selection modal was too small and that they could not see enough details about their assets, like type, resolution, and orientation. They were also frustrated by the lack of filtering tools within the modal.
Solution: We expanded the content selection modal to bring the content library directly into the schedule visualization tool. Users could now toggle between thumbnail and list views, see as much metadata as they wanted, and search and filter for assets and playlists.
Maximizing Screen Real Estate
Pain point: Users felt that they couldn’t see enough of the schedules when using the tool: simply put, screen real estate was not being used efficiently.
Solution: We minimized hardware selection into a modal to make better use of that all-too-important screen real estate. By doing so, we also enabled users to see more information about their devices (rather than just the device name) and to search and filter for devices.
Recognition Over Recall
Pain point: Users wanted to see more information about events on the schedule, whose labels would become truncated when hour parting was employed or many schedules were open.
Solution: We added thumbnails to the event bars reflecting the orientation of the scheduled content, improving visibility at a glance. We also added a card that appears on hover with additional details about the event (sketched below).
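As a rough illustration of the hover card mechanic, here is a minimal TypeScript sketch. The .event-bar selector, data attributes, and card styling are assumptions for the example, not Portal3’s actual markup:

```typescript
// Hypothetical sketch of the hover card: show event details on mouseenter,
// hide them on mouseleave. Selectors and fields are illustrative only.
const card = document.createElement("div");
card.className = "event-hover-card";
card.hidden = true;
document.body.appendChild(card);

document.querySelectorAll<HTMLElement>(".event-bar").forEach((bar) => {
  bar.addEventListener("mouseenter", () => {
    // Details are assumed to be stored as data attributes when the bar renders.
    const { eventName = "", start = "", end = "", orientation = "" } = bar.dataset;
    card.textContent = `${eventName} | ${start}-${end} | ${orientation}`;

    // Pin the card just below the hovered event bar.
    const rect = bar.getBoundingClientRect();
    card.style.position = "fixed";
    card.style.left = `${rect.left}px`;
    card.style.top = `${rect.bottom + 4}px`;
    card.hidden = false;
  });
  bar.addEventListener("mouseleave", () => {
    card.hidden = true;
  });
});
```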
After deploying Phase 3, I again audited the newly released implementation and monitored user feedback. After organizing both internal and external feedback, some problems remained to be fixed:
Phase 4: Optimizing for Scalability
Objective: Prepare the tool for enterprise-scale deployments with features like a 7-day view.
Key actions:
Increased device comparison limit to 10.
Reverted from grid to vertical layout for easier schedule comparison.
Improved device selection feedback.
Reworked empty state to better inform users of the first step.
Added icons to distinguish Canvases from Walls.
Introduced event mapping on hover for better cross-day tracking.
Enhanced scrollbar usability with hover-activated growth behavior (see the sketch after this list).
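For the scrollbar behavior, one lightweight approach is to toggle a class while the pointer is over the scroll container and let CSS handle the width change. This is a sketch under assumed class names, not the shipped implementation:

```typescript
// Hypothetical sketch: grow the scrollbar while hovering a scroll pane.
// Assumes CSS that renders a wider scrollbar when .scrollbar-grown is set,
// e.g. via scrollbar-width or ::-webkit-scrollbar rules (names illustrative).
document.querySelectorAll<HTMLElement>(".schedule-scroll").forEach((pane) => {
  pane.addEventListener("mouseenter", () => pane.classList.add("scrollbar-grown"));
  pane.addEventListener("mouseleave", () => pane.classList.remove("scrollbar-grown"));
});
```

Keeping the growth rule itself in CSS means the transition stays smooth and the script only manages hover state.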
Impact: Supported the migration of our largest customer to Portal3, significantly improving multi-day scheduling workflows and reducing user confusion.
Reflection
This project highlighted the importance of proactive user research, design iteration, and strategic thinking. I evolved from executing design solutions to leading design strategy, mentoring peers, and influencing product direction. Working through each phase, I learned that:
Iterative design grounded in user feedback ensures continuous product improvement.
Balancing simplicity with scalability is critical in enterprise applications.
Cross-functional collaboration and mentorship amplify design impact.