Summary: After the release of a new interface led to a decline in usage rates, I led a campaign to diagnose the problem, develop solutions, and implement them. My redesign not only brought usage back to par but pushed it significantly past prior levels.
Background
Versatile PhD (VPhD) is a digital content and services provider for research universities across North America. VPhD’s products prepare graduate students and PhDs for non-academic careers.
I led a reinvention of the experience around the company’s marquee products. This included a wide-ranging discovery campaign that led to a redesign of the interface for our revenue-generating products.
At the start of the year, the company had added a new product line to complement an existing one. Both product lines were combined in a new interface and used a new information architecture.
But after a few months, high-level summary metrics showed that this new combined experience was not taking off in the way we had imagined it would. I needed to figure out why this was happening and come up with a plan to correct it.
Existing Design
We called this interface the PhD Career Finder. It was the gateway to our revenue-generating products: a range of case studies and other content documenting how individual PhDs had successfully transitioned out of academia and started new careers. Its architecture was designed to let users narrow down the whole spectrum of our offerings to those most relevant to them.
It also had features that supported using it as a career exploration tool. After choosing the product line that matched their academic training (qualitative or quantitative), users could view a range of non-academic sectors where other PhDs like them had succeeded, learn baseline information about those sectors and how to prepare for them, and see more specific matches of academic training and non-academic jobs: how a psychology PhD became a market researcher, how a social scientist became an instructional designer, how a materials scientist became a financial analyst, and so on.
Users accessed all of this through a series of levels:

After reaching the 4th step, clicking any of the items in the right sidebar would load the appropriate product. For us, these were the end goal. Accessions of these content products were a priority metric that we reported to decision-makers at client universities.
Discovery
The summary metrics that brought our attention to the underperformance of these core products were useful indicators. But they were simple counters that revealed very little about the bigger picture.
I initiated a series of discussions among the three of us who made up the company: myself, the founder, and our developer. Several possible hypotheses for the decline emerged from these conversations:
- The lack of usage was the result of bad signposting. Adding and/or rewriting labels at key points in the user’s progress through the interface would correct the issues.
- Parts of the interface design were overly complex, and we should reconsider their design.
- The low usage was the result of users’ lack of interest in or lack of commitment to owning up to their problems, and so we should try to figure out how to better reach those users who did have interest or commitment.
The first idea would be easy to test. Success in working out how to appeal to users at this stage would also likely help us improve our user acquisition efforts.
The second was more complex. The interface was itself complex, so understanding it from the users’ point of view would require careful analysis. The team had also invested quite a lot of time, effort, and other resources in creating it. It embodied the company’s new visual identity. It was the vehicle for several business goals, including revenue generation. However, if we ended up diagnosing interface issues with sufficient precision we could make equally specific interventions.
I saw the third idea as an invitation to validate aspects of our business model and product strategy. But it was also sufficiently different in scope and potential implications that I decided to separate it from the investigation into the website itself. I spun this aspect off into its own project, where I looked more closely at those areas through the eyes of our users.
Approaches
The next phase of the project involved several lines of inquiry:
User journey analysis: I had previously built an R and Python workflow to extract, transform, and analyze data from our analytics (an open source solution called Piwik) and from our databases (MySQL). This opened up our view of how users navigated our website, especially the new product interface. (A sketch of this kind of workflow appears after this list.)
Semantic analysis: I audited the interface’s labels and other copy to uncover ambiguity or outright gaps between language, affordances, and a range of user jobs-to-be-done.
Survey data: I reviewed user satisfaction data and other findings from a store of data that we had collected some months earlier.
Usability studies: Beginning around the midpoint of this phase, I conducted a few usability studies with users.
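For illustration, here is a minimal Python sketch of the extract-and-combine idea behind the user journey analysis. The original workflow was built in R and Python against Piwik and MySQL; the hostnames, credentials, site ID, table names, and the “career-finder” URL fragment below are all placeholders, not the originals.

```python
# Sketch: pull page-level usage from the Piwik (now Matomo) Reporting API
# and registered-user records from MySQL, then combine them in pandas.
import pandas as pd
import requests
from sqlalchemy import create_engine

PIWIK_URL = "https://analytics.example.com/index.php"
PARAMS = {
    "module": "API",
    "method": "Actions.getPageUrls",   # page-level visit and hit counts
    "idSite": 1,
    "period": "month",
    "date": "2015-04-01",
    "format": "JSON",
    "flat": 1,                         # flatten the page hierarchy into one table
    "token_auth": "PLACEHOLDER_TOKEN",
}

# 1. Page-level traffic, filtered down to the product interface pages
pages = pd.DataFrame(requests.get(PIWIK_URL, params=PARAMS).json())
finder_pages = pages[pages["label"].str.contains("career-finder", na=False)]

# 2. Registered users and their institutions from the application database
engine = create_engine("mysql+pymysql://readonly:secret@db.example.com/vphd")
users = pd.read_sql("SELECT id, institution, registered_at FROM users", engine)

# 3. A simple roll-up: traffic on finder pages vs. the size of the registered base
print(finder_pages[["label", "nb_visits", "nb_hits"]].head())
print(users.groupby("institution").size().sort_values(ascending=False).head())
```

In practice the interesting work happened downstream of a pull like this, in joining visit paths to user records and looking at how deep into the interface each cohort went.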
Conclusions
The user journey data showed that the relatively small number of users taking advantage of our core products were the survivors of a much larger number of users whose depth of visit in the interface was far shallower.
The other approaches collectively demonstrated that we had caused that drop-off by presenting users with an escalating number of choices as they progressed through the states and levels of the product experience.
Although each individual step and state of the interface made sense, the cumulative effect of its architecture required users to make sense of ever more surface area and affordances. More calls to action grouped together meant more claims on users’ attention - and competition among those claims.
The Career Area page design was particularly troublesome. It was the final step before a user accessed a core product, but it made it difficult for users to take that step.

Navigational elements persisted, taking up the top of the view even though the user was no longer interacting with them. We gave lower-value information (the short paragraphs of descriptive content) a prime location. Occupying the left side of the viewport, it tended to monopolize users’ attention.
This diverted them from what we saw as the higher-value (i.e., revenue-related) information in the right column. But those links had a confusing “PREMIUM CONTENT” label and less space. This section also had to share space with infrastructure features like login/logout and indicators of whether premium access upgrades were in effect.
My colleagues and I agreed that our goal would be to reduce the number of actions and choices required to yield a high-impact reportable event (like a core product accession) once a user was inside the product interface. This would, we believed, improve activation and retention metrics.
Given the results, it appeared that reducing the number of choices available to users at key points in their journey through the interface would help them use the interface more effectively.
Delivery
In this stage, my contributions were both social and technical. I built consensus on our team around the design plan, working with the founder to make sure she didn’t end up feeling that the core of her vision had been lost. I iterated through a series of wireframes, removing or remaking elements of the existing design. Once we landed on a final design, I worked closely with our developer to roll it out.
Design and Implementation
An end-to-end reimagining of the website product experience was not feasible. While there was ample evidence that we had not successfully aligned what we had to offer with how our users preferred to use it, we lacked the resources to start over entirely.
Given that, I decided to focus on these two pieces of the puzzle:
- Reduce the number of user choices required to move from the first level of the interface to the Career Area level
- Remake the Career Area page design to foreground the links to the core products and reduce or eliminate features that competed for users’ attention
The founder and I worked together on the architecture of the first three states of the interface. We decided to flatten it, un-nesting the successive stages required to pick a career area.
BEFORE

AFTER

I redesigned the fourth state (the Career Area page) myself. I isolated four main changes:
- Abandon the two-column design in favor of a stacked/tabbed presentation
- Redesign the header area to take up less vertical space and be less redundant
- Foreground the core products by making users encounter them first and with little other competition
- Remove the informational content from the first user encounter by shifting it below the core products
I worked through a series of wireframes in Balsamiq until the elements communicated my vision, and then worked with our developer to translate them into HTML and a set of jQuery behaviors.

BEFORE

AFTER

Outcomes
In the months after we brought the new design online, usage rates increased by about 12% across our user base. The increase was even higher among cohorts of new users, approaching 25% compared to the aggregate baseline rate prior to the new interface.
More importantly, compared to equivalent cohorts from the period of the old interface, not only did a higher proportion of these new users try out these content products, their rate of repeat use and depth of visit were both notably higher. Taken together, these insights suggested that the changes had made it easier for users to find what they wanted in our products.
Although we had reduced the steps required to access these products, it still took several clicks and decisions. I planned for more work going forward.
Summary: After several years working within a monolithic development and release cycle, I led a change at my organization to a more agile and product-focused approach.
Through an intensive discovery process, I developed a set of principles for product development that allowed us to focus and prioritize our efforts, and save time and resources. I also improved our qualification and sales process, since we were better able to discern clusters of unmet needs in our target market.
This work helped us set and execute on a roadmap that ultimately led to the company’s acquisition.
Background
Versatile PhD is a digital content and services provider for research universities across North America. VPhD’s products prepare graduate students and PhDs for non-academic careers. After a new product launch resulted in unexpectedly weak adoption by our users, I led an initiative to validate our product strategy.
VPhD’s problem space is U.S. and Canadian higher education, where full-time, permanent faculty positions have become increasingly scarce over the last few decades. The economic downturn beginning in 2008 accelerated the trend. At the same time, research universities have maintained or increased their production of doctorates. These two trends have led to a massive imbalance in the market for hiring academic faculty: tremendous supply, but thin demand.
With an often limited awareness of the wider workforce and weak professional networks outside of academia, doctoral students and early-career PhDs struggled to re-invent themselves when they did not land a permanent faculty position.
While the company had established itself as a leading resource provider across top-ranked and emerging institutions, our product strategy had been conceived primarily with our clients in mind, not our users.
We knew a lot about them. They were university administrators who were concerned about trends driving doctoral career prospects into ever more precarious territory. But even so, they didn’t occupy the problem space. People enrolled in doctoral programs did. People who had earned a doctorate did. Those were our users.
I believed we could create a virtuous relationship between serving our clients and serving our users. If we provided users from client universities with the kind of experience that made us a trusted long-term resource, the decision-makers at those universities would feel that their money was well spent.
Objectives
I set out with three goals:
- Validate product strategy and value proposition for our end users
- Improve our understanding of our user base by replacing anecdote with a data- and segment-driven view
- Collect possible product and feature ideas through direct engagement with users and their behaviors
My primary methods were:
- Carrying out a deep analysis of our entire user base, and doing so for the first time in the history of the company
- A series of interviews with representative users from the cohorts and segments I discerned in the user base analysis
Discovery
Working with our developer to understand our user information database and the relationships among its various tables, I isolated a standard set of demographic factors relevant to our business model and services. I further defined a set of metrics relating to users’ relationship with the company and the actions they’d taken since registering. Matching the demographic factors to the metrics required systems for extracting and transforming data from disparate sources across our systems. I built those in R, using direct links to our MySQL databases and the APIs of our analytics suite.
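The core move in that analysis was a join of demographic factors onto per-user activity metrics, followed by a segment-level roll-up. A minimal pandas sketch of the idea (the file, table, and column names here are hypothetical stand-ins, not the originals, and the original scripts were in R):

```python
import pandas as pd

# Hypothetical inputs: one row per user with demographic factors,
# and one row per user with derived activity metrics.
demographics = pd.read_csv("users_demographics.csv")  # user_id, institution, field, program_year
activity = pd.read_csv("users_activity.csv")          # user_id, visits, accessions, last_seen

cohort = (demographics.merge(activity, on="user_id", how="left")
                      .fillna({"visits": 0, "accessions": 0}))

# Segment-level view: product accessions per user, by institution and field
summary = (cohort.groupby(["institution", "field"])
                 .agg(users=("user_id", "count"),
                      accessions_per_user=("accessions", "mean"))
                 .sort_values("accessions_per_user", ascending=False))
print(summary.head(10))
```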
Exploratory analysis confirmed some of our assumptions about our users. We saw the highest levels of usage of our products in the late stages of doctoral programs. At that point in their progress to the PhD, students would typically have a much clearer sense of their actual prospects in the academic hiring market than in the first years of their programs. They would then seek resources like our products to start to figure out what they could do instead.
I also noted there was often a long period of inactivity between registration and a user’s first meaningful utilization of our products. (I defined this as two or more visits in close succession in which a user took action across our offerings, or dove deep into a single offering.) In those cases, it appeared that users signed up relatively early in their program and then returned to us once they reached the stage where active career planning became necessary, typically three to five years later. (Our registration process included a request for a new user’s anticipated year of graduation, so we were able to make this judgment with reasonably high certainty.)
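That definition of “meaningful utilization” lends itself to a small classifier over a user’s visit log. Here is a sketch, assuming a per-visit table with a date and an action count; the 14-day window and the action threshold are my placeholder values, not the ones used at the time:

```python
from datetime import timedelta
from typing import Optional

import pandas as pd

CLOSE_SUCCESSION = timedelta(days=14)  # assumed "close succession" window
MIN_ACTIONS = 3                        # assumed proxy for "took action" / "dove deep"

def first_meaningful_use(visits: pd.DataFrame) -> Optional[pd.Timestamp]:
    """Return the date of the first pair of action-rich visits in close succession.

    `visits` is one user's log with columns: visit_date (datetime), n_actions (int).
    """
    v = visits.sort_values("visit_date")
    rich = v[v["n_actions"] >= MIN_ACTIONS].reset_index(drop=True)
    for i in range(len(rich) - 1):
        if rich.loc[i + 1, "visit_date"] - rich.loc[i, "visit_date"] <= CLOSE_SUCCESSION:
            return rich.loc[i, "visit_date"]
    return None
```

Applied per user and compared with the registration date, this yields the registration-to-activation gap described above.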
These patterns confirmed our assumption that users engaged in the demanding process of doctoral study would probably not make regular use of career-focused products and resources that supported non-academic job prospects. However, certain strong counter-trends emerged once I began to segment the data:
- Substantial numbers of users did use our products steadily over the time between registration and graduation, even over the course of several years.
- Many of these users came from a cluster of universities that shared key traits: public universities with strong regional reputations, emerging aspirations to national status, and diverse local economies.
- Whether connected to that cluster or not, users with steady usage patterns tended to use a wide cross-section of our features and affordances: newsletters, job feed emails, more complete profiles, and so on.
Interviews with a sample of these users were revealing. Despite typically high aspirations to faculty jobs, those roles did not crowd out other possible careers as successful outcomes to their time in graduate school. Through what seemed like a combination of institutional and personal factors, they had not fallen into a very common and increasingly high-risk mindset where a faculty career was the only option worth imagining.
This mindset led them to avoid patterns common among our lower-usage users. They did not sign up, tell themselves “I’ll come back later when I really need this” – and then not come back. They also did not let their academic aspirations and training dominate their time. Instead of telling themselves that they had no time for “extra” preparation, especially for what felt like a hypothetical future problem, they tried to learn and do things unrelated to the academic faculty career path.
Delivery
I considered what level of effort would be acceptable to the rest of our users. They were interested enough to sign up. But their sense of their future led them to put off relatively simple actions that would have helped them diversify their options. What would that level of effort look like, and how far were our current product strategy and value proposition from supporting it?

To test this idea in the context of our products and the affordances of our platform, I created a taxonomy of actions that users could carry out. I also grouped them under different possible goals and jobs-to-be-done that would be relevant to our users. I used segmented email campaigns to suggest actions and combinations of actions to users.
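The taxonomy itself is long gone, but structurally it was a mapping from jobs-to-be-done to concrete, low-effort site actions, which could then be matched to segments for an email campaign. A toy sketch of that structure (every goal, action, and segment name here is hypothetical):

```python
# Hypothetical taxonomy: jobs-to-be-done -> concrete, low-effort site actions
TAXONOMY = {
    "explore careers":  ["browse a career area", "read one case study"],
    "build a network":  ["complete your profile", "join a forum thread"],
    "watch the market": ["subscribe to the job feed", "save a search"],
}

# Hypothetical segments discovered in the user-base analysis
SEGMENT_GOALS = {
    "early-program":   ["watch the market"],
    "late-program":    ["explore careers", "build a network"],
    "recent-graduate": ["explore careers"],
}

def actions_for(segment: str, limit: int = 2) -> list:
    """Pick a small number of suggested actions for an email to one segment."""
    goals = SEGMENT_GOALS.get(segment, [])
    actions = [a for g in goals for a in TAXONOMY[g]]
    return actions[:limit]

print(actions_for("late-program"))  # e.g. ['browse a career area', 'read one case study']
```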
As certain actions stood out as more effective than others, I began to add them to our site’s functionality. Working with our developer, I designed and helped implement a number of small features that generally surfaced a single available action on the company’s site in a place where users would not have ordinarily seen it.
Outcomes
From this experience, I isolated four takeaways that guided product development going forward:
- Align: Continue to narrow and clarify the value proposition.
- Drip: In the time between registration and recognition, deliver value in small bites relevant to the users at their stage of progress through graduate school.
- Synchronize: Use our understanding of user segmentation and progress stages to better time our interventions in users’ lives.
- Convert: Use the principles above to turn the “saving VPhD for later” users into active users.
These principles helped us focus and prioritize our product development efforts, saving time and resources. They also led to improvements in our qualification and sales process, since we were better able to discern clusters of needs particular to different types of university. Grounding our work in these principles also led us to create and begin to execute on a roadmap that ultimately led to the company’s acquisition.
The work also showed me the virtuous, iterative circle between analyzing data and interacting with users.
Summary: It was a classic product management moment: the team had reached a shared understanding of the problem, but not the solution.
Even more classic was that the problem was us. Our process for product and feature development wasn't cutting it anymore.
I made the case for a hypothesis-driven Lean Startup approach - and was promptly challenged to prove the hypothesis that an experiment-based approach could work for us.
I led us to the next phase by figuring out how to demonstrate the validity of core Lean methods without making production changes to our users' experience, and with little to no developer time. I changed the way we thought about users' choices.
Background
The company's founder was concerned about budget, and also about how testing might affect users' experience. For example, would A/B testing in production create too great a divergence in experience? Even though we would never go as far as “showing some users a totally different website,” she felt that too great a difference could be counter-productive.
I also had to consider the preferences of our developer, who was competent but also very cautious. I didn’t have much time to get started, and my window for showing results was not endless. His meticulous work process had been a great asset. But I saw how it could clash with a faster and more iterative approach.
Given these constraints, I decided that the first round of experiments would focus on simplifying the user journey to our core products. I would design and implement the testing myself, and find an approach that required neither changing our production experience nor using developer time.
Approaches
We had been using email as a way to build relationships with users for some time. Our campaigns had been limited to promoting occasional online events or reminding users about features of our site that we thought they would find useful or interesting. It occurred to me that an email might be similar enough to a web page to make it a good canvas for experimentation.
So my first hypothesis turned out to be relatively simple: choices presented to users in an email message were a viable proxy for choices presented to them in website interfaces.
I based this hypothesis on the assumption that every stage in a user’s journey through the interface is essentially the same: presented with information and options, the user makes a choice and then expresses that choice through a click.
Reduced to its essentials, a digital experience is what unfolds as users navigate and negotiate the options and choices that you provide them. Conceptually, it is how a user passes through the OODA loop in the environment you created for them:
- O: Observe
- O: Orient
- D: Decide
- A: Act
A user Observes the state of your offering, Orients herself to the affordances available in the context of her current needs, Decides which (if any) meet those needs, and then Acts accordingly: click, tap, abandon.
Operationally, this had the advantage of requiring zero developer hours to carry out. Using local client software or third-party email automation tools, I could easily build and segment mailing lists, deliver messages to users, and then watch how the clicks played out. Having already become proficient in getting relevant information out of our MySQL databases and the APIs of our site analytics, I rounded out my workflow with email analytics.
Scripting my analysis in R (and occasionally Python) made the process easily reproducible. When experiments led to actionable results, it was possible to quickly visualize the effects I was seeing and express them as guides to changes on our site, to be worked out with our developer.
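The measurement side of this was mostly careful counting. A Python sketch of the kind of per-segment read-out involved (the original analysis was scripted in R; the export file and its column layout here are hypothetical):

```python
import pandas as pd

# Hypothetical export from the email tool: one row per recipient, with
# columns: segment, variant, delivered (0/1), clicked (0/1)
sends = pd.read_csv("campaign_export.csv")

ctr = (sends[sends["delivered"] == 1]
       .groupby(["variant", "segment"])
       .agg(recipients=("clicked", "size"), clicks=("clicked", "sum")))
ctr["click_through"] = ctr["clicks"] / ctr["recipients"]

# Rank variant/segment combinations by click-through rate
print(ctr.sort_values("click_through", ascending=False))
```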
Exploratory Hypotheses
With the overall goal of simplifying the user journey to and through our core products in mind, I designed an email campaign that began by recreating existing choices in the interface, and then iterated through variations.
I decided to take these high-level approaches:
- If something seemed hidden, I would surface it.
- If something seemed to be losing the competition for users’ attention, I would present it in isolation.
- If something seemed distracting, I would delete it.
Experimentation
An early message established a baseline by simply including a link to the product interface. Just as with users who encountered it on our website, depth of visit was relatively low, with few users making it all the way through to a piece of revenue-generating content.
By contrast, messages that were more like “deep links” into the product generated very high interest, especially if we segmented and customized the messages so that they presented users with an option to access something that was tied to their own experience.
A second set of experiments gave users the ability to skip the first few choices required to navigate the interface. I enabled this by segmenting users into specific interest groups myself - work which those first few interface choices would otherwise have put on the users. Instead, I pointed them directly to a subset of specific content areas that available information suggested would be of interest.
These messages elicited relatively high click-through rates compared to most of our typical messages to users. After clicking and exploring what they found in a content area page, users often appeared to return to their email in the same browsing session and make more clicks. Interestingly, tracking links from this experiment continued to appear in our analytics for some weeks after users had received the messages, suggesting that they adopted the email itself as a map to this stage of the product interface.
A third set of experiments took the “deep link” approach. These used the same labels that described the content products within a given content area page, but presented them in an email without the competition for attention found on those pages. Users were also required to sign into the website after clicking. A majority of them did so, suggesting that their desire to reach these products was high.
Conclusions
For me, the biggest takeaway was that the current architecture of the site placed too many choices between users and our core products. When users had the opportunity to skip ahead in the queue of actions we had created, they did so. It also appeared that my efforts to segment users and use that segmentation to more carefully curate the content products they encountered were received positively. That was especially true for new users: those who received a curated starting point were more likely to browse through the existing architecture than those who did not.
Outcomes
As we started to build features incorporating these insights into the production site, users took advantage of them. One simple widget which surfaced a selection of our case study products on the first page after login proved especially effective in driving accessions of those products.
Internally, this experimentation tool enabled one of the most productive periods of collaboration between our developer and me. The ability to offer options to users and get quick feedback to validate feature ideas gave him the certainty he needed to execute once they made it to the roadmap. Our velocity increased considerably.
These “email the interface” experiments fed into a project to redesign the core product interface that ultimately led to a significant increase in accessions of those products.
The experience also gave us an improved understanding of how to use email as a marketing tool - and also how to train our client teams to more effectively market our services to their users. An email had to be tied to a business goal, and there needed to be a way to measure the adoption of the information or options it contained.
Personally, this experience affirmed for me the value of being thoughtful and specific in designing digital experiences.
Summary: Managing your career is something you do on your own. Friends and mentors help from time to time, and often in big ways – but that’s usually not central to the relationship, let alone something you want to spend sustained time on with them. And then there’s job hunting. Totally isolating.
But occasionally people do join together to help each other with career choices - and stay together. Unlike hiring a coach, in a group like this you help each other make plans, make connections, and make headway. You help each other set goals and stay accountable to them.
They’re called job clubs. I’ve been in them, and helped others form them.
Usually they come together because the members want to make a change, or have had it thrust upon them by a layoff or a firing. Either way, job club members help each other through.
It’s a tough experience to scale. It requires things like patience and vulnerability that are scarce or risky when you try to recreate them in digital spaces. I created a demonstration site to show one vision of what it could look like.
Objectives
The main things that a job club does for its members are things that digital tools are good for.
- Coordinating everyone’s combined effort
- Gathering and sharing information
- Multiplying the power and networks of individuals so they become something greater
- Testing and developing original ways to solve problems
What I wanted to explore with this idea was what it could look like if there were an online community where job postings were a lot less like advertisements stapled to a public bulletin board and more like the apps and software on Product Hunt, or the questions on Stack Overflow. They would be invitations to discussion and problem-solving.
Approaches
As illustrated in the two screenshots below, this idea combines several areas from my past:
- Facilitating job search training courses
- Teaching writing workshop methods and leading workshops
- Building and supporting online communities where the members help each other
It’s also a snapshot or artifact of my career path. Having worked in an organization where developer time was relatively scarce, I developed some very lean approaches to prototype and demonstrate my ideas.
My favorite approach was something of a short-cut into front-end technologies. I’d find a website with a compatible feature set and architecture, or at least one that overlapped with most of my idea. Then I’d scrape it (using either a command-line tool like HTTrack or the hard-to-beat Python pairing of Requests and Beautiful Soup) and archive it locally. After that, I’d rewrite the HTML, the CSS, and the JavaScript as necessary until I had my demo environment.
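As a flavor of that short-cut, here is a minimal sketch of the Requests + Beautiful Soup step: fetch a page, rewrite a few elements, and save a local copy for hand-editing. The URL, the replacement copy, and the output filename are placeholders, and a real run would also pull down the page’s CSS and JavaScript assets.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder source page with a layout close to the idea being demonstrated
resp = requests.get("https://example.com/jobs", timeout=30)
soup = BeautifulSoup(resp.text, "html.parser")

# Rewrite the pieces that matter for the demo: title and section headings
if soup.title:
    soup.title.string = "Job Club - Product Scientist"
for h2 in soup.find_all("h2"):
    h2.string = "Discussion: what would you ask this team?"

# Save a local copy to keep editing the HTML, CSS, and JavaScript by hand
with open("demo_index.html", "w", encoding="utf-8") as f:
    f.write(str(soup))
```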
Outcomes
The first page below shows what I would envision a user encountering at the start of a visit. It has many of the features that a careers-and-jobs site would need: a list of current opportunities, a call-out box for events, different forms of relevant news and information, and so on. But it would also put job seekers and those on the hiring side in the same sandbox.
But it’s in the individual job listings where the real work would get done. And the sample “Product Scientist” position shows a little of what that could look like. There are users interacting with people at the hiring organization. Other users with relevant information are empowered to share it. Everyone is getting something out of it.
There would need to be a lot more to it, of course. But when you’re trying to bring a product idea to life, the quicker you can get the vision out of your head and in front of the world, the better off you will be. I’ve used these visuals and others like them to do just that.


Summary: The team had reached a point in the growth of the company where we were ready to move to a more agile and lean footing. We were going to try building in smaller units of work, developing new ideas and features incrementally and iteratively. We were also moving out of an era in the growth of the company where the use of data to measure progress had been limited to reporting out to clients.
To succeed, I needed more and better data than we had collected in the past. I needed to analyze it effectively, and make sure the analysis served business goals. I also needed to consider all of this in light of our user relationships and client relationships.
My efforts transformed our operations.
Objectives
To move to the next level, I needed to accomplish the following:
- Maintain our existing commitments to privacy and information security
- Stand up a suite of new analytics
- Develop and measure new metrics
- Find and integrate external data sources relevant to our market
Discovery
The majority of our users were graduate students enrolled in doctoral programs, training to be university and college faculty. Because of an incredibly strong aversion in academic professional culture to being seen considering “other options” outside of the professoriate, we allowed users great latitude in both the information they shared with us and what was shared with the wider user community.
Additionally, our contacts with client universities (especially public institutions) asserted a wide range of guidelines and strictures around data collection and use.
With these privacy and security requirements in mind, I chose an open-source web analytics package called Piwik (now Matomo). Unlike Google Analytics, this solution could be hosted on our own infrastructure, letting us maintain control over the data.
These analytics capabilities would let us immediately tie our marketing efforts to users’ actions on our site. They also opened up, for the first time, our ability to map and understand the user journey through our site, its features, and its products. They would provide a record of user choices that we could analyze and build on. Finally, they would give me the ability to contextualize trends and patterns in site use with user data.
Beyond these requirements for better understanding our users, we also had a pressing need to collect external data that would allow us to better understand our market. Anecdotally, we knew that the University of Michigan was many times larger than the University of Texas at El Paso. But how much bigger? How did doctoral programs at our client universities differ from each other? What was the overall balance of major fields at each? With which of their student segments were our products succeeding? When I spoke with a prospective client, how could I know which of our existing clients their institution most resembled?
The most feasible option (and likely the most useful one) proved to be the many data sets on US higher education collected and maintained by the federal government, through the work of the Department of Education, the National Science Foundation, and other agencies.
Delivery
With all of those needs defined, I set out to implement them.
Because we chose an open source option, I had to become the SME for its features and API, so that I could work effectively with our developer on requirements and implementation.
For example, I had to dig deeply into its customization options and resolve a key technical question before implementation could go forward: we needed to be able to segment traffic for each client university. I dug into the documentation for custom variables and clarified their capabilities for our developer so we could fulfill this need.
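The shape of the eventual solution was a visit-scoped custom variable holding the client institution, which could then be used as a segment when pulling reports. A hedged Python sketch of the reporting side (the site ID, token, custom-variable slot, and institution name are assumptions for illustration, not the production configuration):

```python
import requests

PIWIK_URL = "https://analytics.example.com/index.php"

def visits_for_client(client_name: str, month: str) -> dict:
    """Pull a month of visit-summary metrics for one client university.

    Assumes the tracker stores the institution in visit-scoped custom
    variable slot 1, so it can be used as a Reporting API segment.
    """
    params = {
        "module": "API",
        "method": "VisitsSummary.get",
        "idSite": 1,
        "period": "month",
        "date": month,
        "format": "JSON",
        "segment": f"customVariableValue1=={client_name}",
        "token_auth": "PLACEHOLDER_TOKEN",
    }
    return requests.get(PIWIK_URL, params=params).json()

print(visits_for_client("Example State University", "2015-04-01"))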

After learning the data model that the analytics software used, I wrote a variety of analysis scripts in R to extract, prepare, and visualize raw data, and then to analyze and present it. I produced a similar suite of tools to download, transform, and analyze data from relevant government data sets such as the NSF’s Survey of Earned Doctorates.
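The government-data side of the workflow followed the same pattern: download, reshape, compare. A small Python sketch of the kind of transformation involved (the original tooling was in R, and the file name and column layout below are placeholders rather than the actual SED or IPEDS release formats):

```python
import pandas as pd

# Placeholder: a locally downloaded table from a federal higher-education
# data set (e.g., an SED or IPEDS extract), already saved as CSV with
# columns: institution, field, year, doctorates
sed = pd.read_csv("sed_by_institution.csv")

# Reshape to compare the mix of doctoral fields across client institutions
mix = (sed[sed["year"] == 2014]
       .pivot_table(index="institution", columns="field",
                    values="doctorates", aggfunc="sum", fill_value=0))

# Share of each institution's doctorates by broad field
shares = mix.div(mix.sum(axis=1), axis=0)
print(shares.round(2).head())
```

Tables like this were what let me answer the "which existing client does this prospect most resemble?" question with data rather than anecdote.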

Outcomes
For the first time, we could now do more than ship features and wait to see if they had any effect on the handful of metrics that we were recording. We could make a change and then track users’ reactions to it in real-time.
We were also able to expand the metrics that we reported, showing in greater detail and depth the value that users from client universities were getting from our products. This led to higher retention rates from year to year.
Contextualizing the business within the wider market allowed us to take meaningful action based on an understanding of the differences among our clients and our users. This guided both me and the founder/CEO in our strategic decision-making about the growth and future of the company.
