An Adventure in Behavioral Design

Turning a Static Website into a Tool for Health Behavior Change

Jeff Brodscholl, Ph.D.
Greymatter Behavioral Sciences

How can we make something as stodgy as a static, general health and wellness website a more potent vehicle for promoting health behavior change among its users – and how can we do it with an extremely limited budget and timeline, diffuse goals, and little room for maneuvering?

I had the opportunity to be the behavioral science subject matter expert on a project that looked exactly as described. The website was an asset owned by a global life sciences company, intended to provide nonpromotional education and support to people suffering from various chronic conditions. At the time, motivational and tip-based messaging streams, as well as tools for personal health data tracking, had already been added to try to make the asset more meaningful in supporting the user in their daily health practices. But the heart of the asset remained the website with its article-heavy orientation, much of which was seeing only modest traffic despite repeated efforts to promote it; the notable exception was articles with purely practical content (e.g., recipes), which had become a user favorite. The lack of engagement had become a sticking point for the company, and it was one they now wanted to tackle, provided it could be done in a very conservative way with little to no opportunity for disruption. It was at this point that I became involved.

Now it might seem like the project would have had little to do with health behavior change given the client’s remit – but that’s what it became about. The reason was simple: Fixing the engagement problem in this case really meant taking steps that would amplify the content’s value to the user, which meant making it better at helping the user adopt and maintain behaviors that would be beneficial to their health. This was, in fact, the most sensible problem-solving strategy for the company to pursue here, given that their own interest in the asset lay almost exclusively in its ability to fulfill on its customer value proposition, and not with any opportunity to monetize user data, sell additional content, or acquire revenue through digital ad placements – the usual motives for boosting traffic and attention.

But how were we to accomplish the project aims? What content and design modifications could we pursue that could have at least some chance of making a difference in the user’s offline behavior with their health? What kind of thinking would we need to utilize to forge the connection between content engagement, the user’s health behavior support needs, and the types of maneuvers that could do right by both within the project’s narrow remit? And how could we measure the impact of anything we implemented, whether it be in terms of offline user behavior and perceptions or, if not, then at least in terms of website usage behavior consistent with increased interest in the content?

I’ll use this post to talk about the way behavioral science was used to address each of these challenges. I’ll discuss how the very aspects of the project that constrained it also became the motivator for behavioral science innovation, giving impetus to the style of behavioral science thinking I’ve argued for elsewhere, and making it a good example of how it can be implemented at a team level within even the tightest of timelines and budgets. I’ll also discuss how the project created significant obstacles for post-design assessment activities, setting the stage for additional behavioral science-based problem-solving that made it possible to detect some positive signals of increased engagement attributable to the implemented modifications. I’ll then close with some reflections on the project’s lessons for the ways we can unlock the value of behavioral science in behavioral design generally, focusing on the payoff we can achieve in narrow projects such as this, but also looking at it more broadly.

The Website from a Behavioral Point of View

To provide further background, it’s worth talking about an aspect of the client’s website that only exacerbated the challenge. As noted, steps had already been taken to move the asset away from the digital equivalent of traditional health education toward something more like a practical tool for adopting and sustaining good health behaviors. Normally, the decision to turn a digital product into a vehicle for solving problems in health behavior change would motivate taking a laser-focused approach to design and objective-setting to ensure that the product would have the effect on behavior that was intended – but, in this case, the asset’s target audience and objectives remained broad and diffuse:

  • The website addressed itself to people coming in from any one of several chronic conditions;
  • Within any one condition, it contained content and tools that could be relevant to any one of several health self-management needs (e.g., lifestyle modifications, monitoring and measurement, medication adherence, etc.);
  • And, within any given need, there were materials that could conceivably address a multitude of health behavior drivers and barriers, though usually not in any systematic fashion.

Moreover, the website lacked an overarching behavioral model to give it cohesion – a function of its tendency to rely on a birdshot approach to health engagement that could serve many different audiences with heterogeneous health self-management needs and positions in their journey toward adopting and sustaining key health behavior practices.

The website's common denominator, though, was the clear intent to support people with content that could help them adopt and maintain behaviors that were understood at their core to be fundamentally volitional. This presumption was, in fact, baked into the website itself, as its efficacy depended on users wanting to make the effort to peruse it to see what it might offer, read articles and interact with other content, and, if they liked what they saw, take steps to register so that they could unlock personal trackers and email or text streams containing motivational messages and tips – all goal-directed activities that would have depended on the users understanding that their health behaviors were something requiring their own personal intervention to sustain or change.

Looking at the website as a tool for supporting volitional behavior made sense from a health behavior perspective, and it had several key advantages for the project itself:

  • It provided the grounds for developing a behavioral angle to the strategy for solving the content engagement problem. By connecting the website’s promise to its ability to help people in their willful efforts to adopt and maintain good health practices, it made clear that the best pathway forward lay with enhancements that could speak more effectively to the psychological processes involved in those efforts – matters that were now about giving the user what they needed as they carried through with goal-setting, planning, habit formation, behavioral feedback monitoring, and coping, to name just a few. And that meant changes that could potentially increase engagement by making the website a better fit to the user as a behavioral problem-solver – key to making the website feel more natural and meaningful, and increasing the user’s appetite for engaging with its content.
  • It then pointed the way toward the material we needed to analyze the website for its strengths and weaknesses and, from there, design improvements. If the key to understanding the website’s value lay with what it could do to support behavior requiring personal investment and effort, then there were specific behavioral principles we could reach for to understand those behaviors. Specifically, there were published models and findings we could leverage to picture the processes involved in people’s goal-directed activities, interpret the current website content for its relationship to those processes, and then drive thinking about modifications to increase the likelihood of the website impacting those processes to more effectively support the user’s behavior change and maintenance efforts.

Turning the Science Into Action: From Behavioral Strategy to Design

We used these insights to develop one of the key tools for the project – a custom behavioral lens, anchored in findings from motivational and cognitive science, that used a model similar to the “action phases” model I’ve described elsewhere as the backbone for its conceptual components. This lens gave us the material we needed to break the user’s health behavior down to its underlying controlled and automatic processes, and forged connections to simple, website-friendly maneuvers that had been shown in published work to have meaningful impacts on behavior through those processes. Like COM-B, the lens divided its content into intuitive categories, which made it possible for a team composed of content strategists, designers, and copywriters with limited behavioral science background to use that content to inform their thinking under expert guidance. Unlike COM-B, the lens eschewed intentionally vague constructs in favor of the types of models and representations that are more commonly found in experimental psychology and quantitative social science. The lens then contributed to every step we took to diagnose the website’s weaknesses and find the improvements that could have some chance of being impactful while staying within the project’s parameters:

  • At audit, it drove decisions about what website features would be examined both locally and in terms of broader structure, and supported the development of a scorecard that could be used to code each item for its relationship to the lens’ key constructs;
  • At analysis, it helped us weigh the significance of the patterns uncovered by the completed grids, including whether there were certain behavioral needs that were being given too much attention or being addressed in an inadequate way, or were simply being overlooked;
  • And at design, it guided thinking from gaps and weaknesses to published, evidence-based maneuvers that could be a model for specific implementable changes and enhancements, while providing the logic for face-valid modifications that would emerge from scratch.
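To make the scorecard idea concrete, here is a minimal sketch of how coded audit results might be tallied for coverage gaps. The construct names and data shapes below are invented for illustration – they are not the lens's actual categories or the project's actual coding scheme:

```python
# Hypothetical sketch of an audit scorecard tally: each content item is
# coded against the lens's key constructs, and the tallies reveal which
# behavioral needs are over- or under-served across the site.
from collections import Counter

# Illustrative construct names only (the real lens used its own categories).
LENS_CONSTRUCTS = [
    "goal_setting", "planning", "habit_formation",
    "feedback_monitoring", "coping",
]

def coverage_tally(scorecard: dict[str, list[str]]) -> Counter:
    """Count how often each lens construct is addressed across content items.

    `scorecard` maps a content item (e.g., an article slug) to the list of
    constructs the auditor coded it as addressing."""
    tally = Counter({c: 0 for c in LENS_CONSTRUCTS})  # start all at zero
    for constructs in scorecard.values():
        tally.update(c for c in constructs if c in LENS_CONSTRUCTS)
    return tally
```

A tally like this makes the analysis step mechanical: constructs with zero or near-zero counts flag the overlooked needs, while outsized counts flag over-indexing.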

The result was three different types of modifications, which we were able to arrive at after just a few weeks of work:

#1: Website Structure

At the heart of our lens was the recognition that a complex, high-involvement behavior like adopting and sticking with a strict diet entails a journey that unfolds in phases – often nonlinearly, but usually quite predictably, with each phase having its own behavioral jobs to be done and, thus, its own challenges. By bringing an understanding of these phases to the audit, we were able to see where content structure failed to follow what we know about how these phases unfold. We were then able to use simple content reordering, along with changes in content recommendation timing, to create a more obvious clustering of content by phase, produce a flow that would feel more natural, and make phase-relevant content salient to allow the user to select it based on their own intuitive understanding of where they were in their journey with a given behavior the content addressed.

#2: Current Article Content

The audit naturally surfaced areas where the content over-indexed on one behavioral need (e.g., goal commitment) at the expense of others (e.g., planning, habit formation), but it also helped us to see instances where a single piece of content such as an article addressed multiple needs from different behavioral phases in a way that was disorderly and muddled. Though the project parameters placed limits on how much new content could be created from scratch, our approach allowed us to “get the most from the least” by helping us zero in on the places where adding some small amount of content could yield the highest return – for instance, by addressing an important behavioral need, often in the "intention-action gap", that had otherwise been completely overlooked. And as with the website structure, we were, again, able to capitalize on our understanding of behavioral phases to recommend even easier maneuvers, such as simply breaking muddled articles apart or, alternately, finding straightforward ways to rebalance or deprioritize them.

#3: The Addition of Enrichment Modules

Finally, there were what we called “enrichment modules”, which proved to be the place where we had the most opportunity to introduce new content aligned with the behavioral angle we were bringing to the project. These were inline boxes that were designed to appear midway through articles with the goal of presenting material that would enhance the content without requiring significant copywriting or creative work. They were intended to be a backdoor way of dealing with the constraints imposed on new content generation – but they also provided the perfect real estate for maneuvers that could address behavioral needs through simple nudge-like messages and interactive experiences. Using our lens, we recruited evidence-based tactics that could be models for the design of these modules, aligning them with the behavioral processes that we believed were associated with the needs a given article addressed. We made decisions about where to add modules strategically, using them to rebalance muddled articles or make them better at addressing a need we had identified as being relatively neglected, while keeping their distribution over the types of needs they might address close to uniform. We then designed the modules to harmonize well with surrounding content, which had the effect of enriching it by tackling the user’s needs from a very different angle, often with visuals and tools that could create variety with a light, unobtrusive touch.


Fig 1: The Rubicon Model of action phases [1-5]. Note that our backbone model was an amalgamation of several self-regulation models which converged to something similar to the above. The rest of our lens then used findings from the behavioral science literature to elaborate on the components of the backbone model to highlight potential weak points for support or levers for intervention.

These were necessarily small changes in the larger scheme of things, but their value wasn’t to be overlooked. They made sections of the website that had seen low engagement feel much more coherent and richer – an expected consequence of basing design modifications on an evidence-based picture of the psychology of the behavior the website was to help people adopt and sustain. More broadly, the changes moved the website’s design closer to having a behavioral logic consistent with the expected drivers of the behaviors it was intended to support – key to creating the foundation for future embellishments that could further align the website with its evolving objectives. And all this proved achievable despite the limited time, resources, and rules of engagement within which the project needed to be executed.

What Did We Accomplish? From Behavioral Design to Assessment

Still, there's always a lingering question in a project like this about whether the changes we had implemented would eventually come to have the intended impact on the user. We had based design decisions on insights and intervention examples that could be evaluated against a pre-existing evidence base, and had seen how the website changes resulted in an immediate, obvious improvement in content quality – but neither guaranteed that the incorporated enhancements would have the desired effects on health behavior or content engagement given the specific context, and in the specific form, in which they were implemented. To be clear, we needed a way to measure changes in user behavior that could be attributed to the website changes, focusing, at a minimum, on the behaviors the client deemed most relevant to their business objectives – a matter of interest to both the internal team and the client themselves.

Yet, the nature of the project and the website itself created stumbling blocks that made achieving these objectives harder than one might expect:

  • As noted, the website wasn’t targeted toward just one audience or problem in health behavior change or health self-management. It was more of a general resource, designed with certain specific condition categories and patient health practices in mind, but still covering a lot of terrain in terms of the types of behavioral challenges that the content was built to address. No one health behavior or practice had ever been identified as particularly important to focus on, and thus no design decisions had ever been made that were oriented toward optimizing the website’s impact on any one specific behavior or practice. Yet, this also meant that, when it came to these behaviors, the project hadn’t included space for developing and executing a well-articulated assessment plan, either – an activity that would have required clear, upfront decisions about which specific health behaviors to measure, how to define success with them, and how to conduct the assessment beyond something that would have been purely post hoc, the latter being of little value given limited opportunity to properly plan for it or to aim the design work toward just one or two behavioral targets.
  • That said, the website modifications had been made for the express purpose of increasing engagement with the poorly performing content, and measuring this effect was both clearly desired by the client and something that seemed easy and appropriate enough to attempt. Yet, even here, the project limitations proved problematic, as they encouraged quick hits with aggregate SEO metrics that lacked careful consideration of what was needed to make findings attributable to design changes or how engagement improvements might be reflected in user behavior, setting the stage for assessment efforts that would have been misaligned with behavioral hypotheses, and would even have failed to control for critical confounds (e.g., website promotional activity that was still ongoing).

Fortunately, there existed a way to tackle the engagement assessment from another angle, thanks to a stream of work that involved custom modeling of the website usage data for a broad range of assessment purposes. Looking at it as a "be-sci" evidence problem, I decided to try the assessment using a spin on what is known as "event history analysis" – a statistical method that, like the survival analyses found in epidemiological work, models the probability of an event such as death as a function of time divided into discrete intervals, along with other predictors. I liked the idea of thinking about the user’s website interaction as a page-to-page walk through time in which “death” equaled a decision to exit the website – a decision that might be forestalled if the user encountered content that made them want to continue wandering in search of more. Reading pleasure and satisfaction into website user data is always a bit tricky, but when it came to the modifications involving the enrichment modules, there were reasons to believe that any drop in exit probabilities following module implementation would be a good indicator of it:

  • The modules didn’t entail taking anything away from the page they appeared on, so there was still an opportunity in principle for the user to achieve the same level of satiety with a modified page as they would have had prior to its modification;
  • The modules were also carefully crafted to not be obtrusive or distracting, so they shouldn’t have interfered with pre-existing opportunities for satiety in practice, either.

Thus, with 12 months of user data on tap, I set forth to examine the effects of the enrichment modules on the exit rates, fitting a model with parameters for order of page in session, whether a given page encountered in the order was one that had been targeted to include an enrichment module, and whether the session occurred in the 6 months preceding, or the 6 months after, the enrichment modules had been added. The model was specified so that we could see not only how the likelihood of exiting the website at a given page varied by whether it had been targeted to include an enrichment module, but also how this variability, in turn, differed by whether the observations occurred before versus after the implementation of the website modifications. I also did some follow-up work to see how long the effect of encountering a page targeted for including an enrichment module might have lasted, using exponential decay curves to estimate the effect on the exit probabilities over subsequent pages. All of this was quite imperfect in the absence of randomized tests or methods such as propensity weighting to control for user-level effects, but the steps at least dealt with factors, such as website promotional activity or differences in other aspects of page content, that could have accounted for whatever the analysis yielded.
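For readers curious about the mechanics, the core of a discrete-time event history model of this kind can be sketched in a few lines. The column names, dataset shape, and choice of library below are my assumptions for illustration – the real analysis also had to account for the nested (user and session) structure of the data, which this sketch omits:

```python
# Minimal sketch of a discrete-time event history ("exit") model:
# each row is one page view; the outcome is whether the session ended there.
import pandas as pd
import statsmodels.formula.api as smf

def fit_exit_model(pageviews: pd.DataFrame):
    """Model P(exit at this page) via logistic regression on:
      page_order - position of the page within the session (1, 2, 3, ...)
      targeted   - 1 if the page was targeted for an enrichment module
      post       - 1 if the session occurred after the modules went live
      exited     - 1 if the session ended at this page, else 0
    The targeted:post interaction is the key term - it captures how the
    exit probability on targeted pages changed after implementation,
    over and above any pre-existing difference on those pages."""
    model = smf.logit("exited ~ page_order + targeted * post", data=pageviews)
    return model.fit(disp=False)
```

A negative, reliable `targeted:post` coefficient is what would correspond to the pattern described in the results below: lower exit odds on module pages, confined to the post-modification period.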

And so how did our enrichment modules perform? The answer was pleasantly surprising:

  • For users who had navigated their way to a page targeted for enrichment module inclusion, the actual inclusion of a module cut the odds of exiting the website at that page by anywhere from one-tenth to one-half – statistically reliable even after accounting for the complex, nested nature of the data;
  • The effect was isolated to the post-modification period, ruling out the possibility that there was something special about the targeted pages, beyond module inclusion, that could have accounted for the differences;
  • Decay curve modeling further demonstrated that encountering an enrichment module had the effect of lowering exit probabilities for up to two pages following the encounter.

Not a bad finding, however incomplete, and certainly commensurate with what had been possible under the circumstances.
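The decay-curve follow-up can be sketched in a similar spirit. The function below fits an exponential decay to the drop in exit probability at pages following a module encounter; the lag-effect numbers any such fit would consume are illustrative inputs, not the project's actual estimates:

```python
# Sketch of the decay-curve follow-up: fit effect(k) = a * exp(-b * k),
# where k is the number of pages since a module encounter and effect(k)
# is the estimated drop in exit probability at that lag.
import numpy as np
from scipy.optimize import curve_fit

def fit_decay(lags: np.ndarray, effects: np.ndarray) -> tuple[float, float]:
    """Return (a, b) for effect(k) = a * exp(-b * k) over lags k = 0, 1, 2, ..."""
    def decay(k, a, b):
        return a * np.exp(-b * k)
    (a, b), _ = curve_fit(decay, lags, effects, p0=(effects[0], 1.0))
    return a, b
```

With the fitted parameters in hand, one can read off how many pages it takes for the effect to shrink below a practical threshold – the basis for a claim like "the effect persisted for up to two pages after the encounter."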

The Project in Retrospect: Some Lessons and Reflections

I like this project for what it says about how far we can go in unlocking the value of behavioral science for even the narrowest, most constrained behavior change design efforts when we go about it a certain way. Given the project's limits, it would have been tempting to go for a few quick hits using fragmentary, generalized, off-the-shelf ideas about what’s behind people’s behavior and can work to influence it, taking little account of the particulars of the behavior the website was meant to address in the hope that the result would satisfy the project's aims. Yet, there’s nothing in the most popular material circulating in the be-sci enthusiasts’ market that could have helped us tune into some of the features of people's behavior that proved to be particularly valuable to know about here – the most notable being the phases of volitional action which, with the model we used to explicate them, gave us the foundation for accessing material about yet other relevant behavioral processes that could ladder up to the methods we selected for helping users adopt and sustain desired health practices. It was our custom lens that allowed us to look at the website globally, which expanded the improvement opportunity by making it possible to explore changes in structure and content balance that would be a better fit for the user’s health behavior journey. That not only increased the website’s cohesiveness from a psychological perspective; it also empowered making the most of local nudge-based maneuvers by ensuring their proper location, and by finding the highest number of places where different nudges could be introduced to allow for a broader net effect. The resulting changes proved immediately apparent when it came to the experience with the content, and, with the enrichment modules, they seem to have had at least some impact on user behavior consistent with increased engagement. It was the care we put into developing our own lens for the project that made all this possible.

This points to a larger lesson about what it means to design with behavioral science in a doing-more-with-less world, and why the effort it requires is worth it. To be clear, most products and services that exist to help people with some aspect of their offline behavior don’t get developed within a straitjacket. Usually, they’re given the resources they need to include dynamic, tailored content and experiences delivered in many different modes and channels, all carefully orchestrated to serve the singular pursuit of succeeding at the behavioral outcome while creating an experience that makes the customer see the value in the product or service itself. Pursuing that ideal is de rigueur for companies whose businesses exist precisely to develop and deliver those behavior-modifying products and services (think Noom, for instance), but it isn’t for companies that may pursue them for only secondary business aims – the life sciences being an example. The decision to default to the simple in that case can seem like the best thing to do when the goals are tertiary and the alternative feels far out and complex – but it doesn’t have to be this way. As this case demonstrates, we can get a tremendous amount of value from behavioral science even when we’re applying it to highly constrained behavioral design challenges – essential to ensuring the development of something that is likely to succeed because it meets users where they are, provides proper support for their behavioral needs, and avoids the fate of falling flat. And it shows that the very approach that makes this possible might seem esoteric to anyone looking under the hood, but it is actually quite feasible even if it requires some extra sleuthing to pull it off.

That said, the project also surfaces some consequences that can follow when a behavioral design project becomes a bit too constrained – the most notable in this case being the limits we encountered to assessing the impact of the changes that we implemented. Again, to be clear, this wasn’t a project that had been motivated by a desire to improve the website’s impact on user health behavior and practices. It had been motivated by a need to increase engagement with some of its content, the improvement of its ability to help users solve problems in adopting and maintaining desired health behaviors being a means to achieve that aim. With engagement being the top concern and the project restricting the opportunities to enhance the website as a behavior change or support tool, it made sense that there might not have been much motive to assess the impact of the website modifications on health behavior itself. Yet, as I noted, the website had already been developed to address many different health practices appearing across multiple condition-specific audiences that were heterogeneous in their behavioral needs – and it was interesting to see how the disinterest in assessing the website’s impact on offline behavior had an analogue in the lack of attention paid to how best to measure the effect of the modifications on website engagement. It’s as if the common thread were the lack of crisp objectives which then manifested both in the website’s basic design and in an insufficient attention to the finer points of measurement – a core issue that may even have set the stage for the somewhat reactive nature of the project despite its connection to a broader evolving vision (i.e., to make the website more a pragmatic behavior change tool than a repository of static health education material).

Still, there's a way in which these messy realities serve to further amplify the value of the applied science approach I've described in this post, as it was our approach that allowed us to deliver for the project in additional ways to make up for these challenges:

  • It gave us the goods to develop a well-founded set of prior beliefs about the user’s health behavior from which we could make educated decisions about website enhancements – key to maximizing the payoff odds on those targets in the absence of a relevant testing regime;
  • It empowered a post-launch engagement assessment method that was more rigorous and behaviorally informed than what was originally considered, allowing a hidden signal of engagement impact to come to the surface thanks to a tool in the quantitative scientist's toolkit;
  • And it set the stage for a more strategically oriented thinking style in which applying the science made us think more deeply about the website’s behavioral targets, target identification drove the specific conceptual material we leveraged for insight and design work, the desire to be evidence-based increased the appetite for assessing the impact of design changes, and the focus on behavior made possible a crisper way to conceive of assessment.

That last bullet is the essence of applied behavioral science in a nutshell – and one of the best things you can make room for to get the most out of the science for these types of efforts.

References (Where We Got Some of This)

  1. Gollwitzer, P.M. (1990). Action phases and mind-sets. In E.T. Higgins & R.M. Sorrentino (Eds.), Handbook of motivation and cognition: Foundations of social behavior (Vol. 2, pp. 53-92). New York: Guilford.
  2. Gollwitzer, P.M., & Bayer, U. (1999). Deliberative versus implemental mindsets in the control of action. In S. Chaiken & Y. Trope (Eds.), Dual-process theories in social psychology (pp. 403-422). The Guilford Press.
  3. Gollwitzer, P.M. (2012). Mindset theory of action phases. In P.A. van Lange (Ed.), Theories of social psychology (pp. 526-545). Los Angeles: Sage.
  4. Keller, L., Bieleke, M., & Gollwitzer, P.M. (2019). Mindset theory of action phases and if-then planning. In K. Sassenberg & M.L.W. Vliek (Eds.), Social psychology in action (pp. 23-37). Springer, Cham. https://doi.org/10.1007/978-3-030-13788-5_2
  5. Keller, L., Gollwitzer, P.M., & Sheeran, P. (2020). Changing behavior using the model of action phases. In M.S. Hagger, L.D. Cameron, K. Hamilton, N. Hankonen, & T. Lintunen (Eds.), The handbook of behavior change (pp. 77-88). Cambridge University Press. https://doi.org/10.1017/9781108677318.006

© 2026 Grey Matter Behavioral Sciences, LLC. All rights reserved.