Context
A digital platform to enhance our web managed services experience.
At Dusted (a Central London-based branding agency with digital offerings), we provide various types of managed services.
One particular service involves hosting, monitoring, and maintaining marketing websites. To enhance it, we built a 0-1 digital platform that helps teams get the most from their web marketing channel by:
- Making it easier to track and understand web performance
- Helping them understand what their competitors are doing online
- Justifying resources for improvement
User research
Understanding the needs and pain points we could address
1. Quick secondary research
I built an understanding of client behavior by speaking with managed services account managers, who hold regular client catch-ups. This gave deeper insight into which types of clients and users we could best target.
2. 1-on-1 interviews
Without direct client access at this stage, I interviewed an in-house marketer and an external marketer who matched our target profile to understand the competitive landscape.
Through this research, I learned that small marketing teams have high potential for getting more value from our managed services.
I identified 2 types of teams along with their pain points and needs:
- Marketing teams that know what they could do more of but are too busy to execute
- Marketing teams that are unaware of their web channel's potential and rely on other channels
I consolidated these insights into How Might We (HMW) statements around reporting, communicating ROI, tool fragmentation, and data overload.
Existing marketing tools and feasibility research
What solutions are currently available in the market
We compiled a list of software that marketers use and examined cutting-edge AI tools to understand current capabilities. This helped us sharpen our answer to the "Why us?" question.
We created an offerings-level map to visualize our findings.
Product direction
We prioritized 4 directions to explore and prototype
After several rounds of internal discussion examining the problems, the market landscape, and what we could realistically solve, we selected 4 key focus areas:
- Analytics
- Email Reporting
- SEO
- Competitor Website Tracking
Explorations & collaborative iterations
We created mockups & prototypes to quickly explore different approaches
AI Recommendations
We tested different system prompts and data inputs, including screenshots, but LLMs at the time could not produce recommendations that met the quality bar our team would typically deliver. As AI models improve and the platform and its data sources mature, we plan to run new rounds of testing to determine whether the recommendations could become valuable for clients.
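For illustration, here is a minimal sketch of the kind of test harness this involved, assuming the OpenAI Node SDK. The model name, system prompt, and function are hypothetical stand-ins rather than our production setup:

```typescript
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Hypothetical system prompt; the prompts we actually tested varied widely.
const SYSTEM_PROMPT = `You are a web marketing consultant. Given analytics
data and a screenshot of a page, suggest up to three concrete improvements.`;

// Illustrative harness: combine structured metrics with a page screenshot.
async function getRecommendations(metricsJson: string, screenshotUrl: string) {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // illustrative; models were swapped as they improved
    messages: [
      { role: "system", content: SYSTEM_PROMPT },
      {
        role: "user",
        content: [
          { type: "text", text: `Analytics data:\n${metricsJson}` },
          { type: "image_url", image_url: { url: screenshotUrl } },
        ],
      },
    ],
  });
  return response.choices[0].message.content;
}
```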
AI Chatbots
We implemented a chatbot version but found that users often didn't know what to ask, and it suffered from the same issues as the recommendations feature.
MVP
A focused tool with room to grow
Through multiple rounds of iteration, we consolidated our designs into an approach the whole team agreed on.
Feature 1: Track website performance
We decided to present clear, essential metrics like total traffic, conversion rates, channel performance, and basic competitor comparisons for context.
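As a sketch of the data shape this implies (all type and field names here are hypothetical, not the platform's actual schema):

```typescript
// Hypothetical dashboard payload; field names are illustrative only.
interface SitePerformance {
  totalTraffic: number;          // sessions in the selected period
  conversionRate: number;        // e.g. 0.031 for 3.1%
  channels: ChannelBreakdown[];  // performance per acquisition channel
  competitorBenchmark?: {        // optional context against tracked competitors
    medianTraffic: number;
    medianConversionRate: number;
  };
}

interface ChannelBreakdown {
  channel: "organic" | "paid" | "social" | "email" | "referral" | "direct";
  sessions: number;
  conversions: number;
}
```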
Feature 2: Track competitor website performance
We previously explored a grading system but found that a percentage-based approach provides better detail, helping clients more accurately compare themselves to competitors.
For deeper insights, users can explore individual categories like traffic, conversion, and SEO.
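A minimal sketch of the percentage-based idea, with made-up numbers and a hypothetical scoring function:

```typescript
// Express each category as a percentage of the competitor's value, rather
// than collapsing everything into a single letter grade.
type Category = "traffic" | "conversion" | "seo";

function relativeScore(own: number, competitor: number): number {
  if (competitor === 0) return own === 0 ? 100 : Infinity; // guard divide-by-zero
  return Math.round((own / competitor) * 100); // 100 = parity, >100 = ahead
}

const scores: Record<Category, number> = {
  traffic: relativeScore(12_400, 15_500),  // 80: behind on traffic
  conversion: relativeScore(0.034, 0.028), // 121: ahead on conversion
  seo: relativeScore(62, 71),              // 87: e.g. comparing audit scores
};
```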
Feature 3: Email reporting
With scheduled email reporting, clients can customize emails for different stakeholder groups, choosing the timing, frequency, and content. For example, C-suite executives can receive a simplified version showing only top-level metrics.
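To make the customization model concrete, here is a hypothetical configuration sketch (the types and field names are illustrative, not the platform's real schema):

```typescript
// Hypothetical per-audience report configuration.
interface ReportConfig {
  audience: string;                             // e.g. "C-suite", "Marketing team"
  frequency: "weekly" | "monthly";
  sendAt: { day: "Mon" | "Fri"; hour: number }; // preferred local send time
  sections: ("topline" | "channels" | "competitors" | "seo")[];
}

const cSuiteReport: ReportConfig = {
  audience: "C-suite",
  frequency: "weekly",
  sendAt: { day: "Mon", hour: 9 },
  sections: ["topline"], // simplified version: top-level metrics only
};
```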
Feature 4: Competitor updates summary
Introducing competitor website tracking to the platform allows users to quickly identify and understand strategic or product-related changes by competitors. This feature requires further fine-tuning, as the AI must accurately classify the nature of each update.
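One way such a classifier could be sketched, again assuming the OpenAI Node SDK; the categories, prompt, and function are illustrative:

```typescript
import OpenAI from "openai";

const client = new OpenAI();

// Hypothetical label set for detected competitor-site changes.
type UpdateKind = "product" | "pricing" | "messaging" | "design" | "other";

// Ask the model for exactly one label so noisy diffs can be filtered
// before they reach the client-facing summary.
async function classifyUpdate(pageDiff: string): Promise<UpdateKind> {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // illustrative
    messages: [
      {
        role: "system",
        content:
          "Classify the following website change as exactly one of: " +
          "product, pricing, messaging, design, other. Reply with the label only.",
      },
      { role: "user", content: pageDiff },
    ],
  });
  const label = response.choices[0].message.content?.trim().toLowerCase();
  return (["product", "pricing", "messaging", "design"].includes(label ?? "")
    ? label
    : "other") as UpdateKind;
}
```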
Outcome & Learnings
An evolving platform, with valuable insights gained
Since this project is being built alongside client work, the MVP is still in development and will continue to evolve based on user feedback and market needs. This 0-1 build presented unique challenges: designing software that enhances rather than replaces our service offering, navigating diverse stakeholder interests through visualization and experimentation, and maintaining a rapid prototyping workflow in close collaboration with developers, often sketching solutions live when code ran ahead of design.
Key Learnings
Navigating stakeholder interests
Working with multiple stakeholders required careful balance and clear communication. Key strategies included asking targeted questions to understand underlying concerns, visualizing concepts and trade-offs whenever possible, promoting experimentation over lengthy debates, and giving stakeholders time to process complex decisions.
AI tool applications and limitations
Testing different AI implementations taught us about current capabilities and limitations. While AI shows promise for data analysis and pattern recognition, human oversight remains crucial for quality recommendations and contextual understanding.