Creating Behaviour-Based Customer Segments in ODP – Post 2 of 4

In Blog Post 1, I talked about the foundations of delivering a super personalised experience to customers using Optimizely products.

In this post we’ll discuss the first pillar – Segmentation. Specifically, the technical implementation that lays the foundations for super powerful Segmentation of your customer base.

Use Optimizely’s Data Platform (ODP) to harmonise data, creating a 360 view of the customer from all channels (online, in-store, historical, real-time, etc.) alongside your extended Product data.

With ODP you can create Segments of customers who have similarities with regard to their characteristics, preferences, site engagement, behaviours and brand interactions.

When you reconcile your Customer data with your Product Catalog to indicate what interests your visitors have based on their behaviour, you have a very powerful Segmentation tool.

Integrate ODP with Optimizely B2C Commerce

There are two core ODP data entities you should enhance in ODP with data from your B2C Commerce system:

  • Products
  • Customers

Products

ODP contains a number of pre-built product connectors, one of which is the Commerce Cloud connector. It is a no-code connector that communicates with your Commerce Cloud website via the Service API.

The Commerce Cloud connector will synchronise your Core Product Data between Commerce Cloud and Product entities stored in ODP. By core product data I am referring to Product Code, Name, Image, Variants.

To enable more advanced Segmentation capabilities based on the type of products your customers are interacting with, you should enhance your product data with additional attributes stored against the products in your Catalog.

Consider creating a scheduled job to push your enhanced attributes to ODP via batch requests to the Products API.

https://docs.developers.optimizely.com/digital-experience-platform/v1.5.0-optimizely-data-platform/reference/batch-requests

https://docs.developers.optimizely.com/digital-experience-platform/v1.5.0-optimizely-data-platform/reference/update-products
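As a sketch of what such a scheduled job might send, the example below builds a batch payload and posts it to the Products endpoint. The endpoint URL, `x-api-key` header, chunk size and field names are assumptions based on the linked documentation – verify them against your own ODP account before relying on this.

```javascript
// Sketch of a batch update to the ODP Products API.
// The endpoint and x-api-key header are assumptions based on the
// linked docs -- verify against your ODP account before use.
const ODP_PRODUCTS_API = 'https://api.zaius.com/v3/objects/products';

// Build one batch payload from enhanced catalog rows.
// `team` is our custom attribute pulled from the Commerce catalog.
function buildProductBatch(rows) {
  return rows.map((row) => ({
    product_id: row.code, // must match the connector's product_id
    name: row.name,
    team: row.team || null, // enhanced attribute for segmentation
  }));
}

// Push the rows in chunks to keep each request a manageable size.
async function pushProducts(rows, apiKey, chunkSize = 500) {
  for (let i = 0; i < rows.length; i += chunkSize) {
    const batch = buildProductBatch(rows.slice(i, i + chunkSize));
    await fetch(ODP_PRODUCTS_API, {
      method: 'POST',
      headers: { 'x-api-key': apiKey, 'Content-Type': 'application/json' },
      body: JSON.stringify(batch),
    });
  }
}
```

Your scheduled job in Commerce Cloud would gather the enhanced catalog rows and hand them to `pushProducts` with your ODP API key.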

Customers

I have covered an approach to integrating Customers in a previous blog post so won’t go into detail here but feel free to check this article out:

https://johnnymullaney.com/2022/06/14/integrating-optimizely-data-platform-for-gdpr-compliance/

ODP Set Up

This section assumes your technical integration is in place; we will now create a Segment based on our customers’ interactions with the enhanced Product Catalog.

In our example, we will create a Segment of Customers who have viewed Products where the product has a custom attribute called “Team” and the value of that attribute is equal to “Manchester United”.

Add “Team” Product Attribute Field

Log into ODP as an Administrator and proceed to

Settings -> (Data Management) Objects & Fields -> Products

Click Create New Field and add your Team field to the Product object.

Segment Set Up

Go to the Segments interface and create a new Segment.

First, select customers who have performed the Product Detail View event.

Then add a Where Filter on the Product associated with this event so only products with “Manchester United” team values are associated with the segment.

As long as there are products in your catalog whose Team is “Manchester United” – auto complete will suggest this value. Super Easy! 🙂

Conclusion

The ODP integration explained above gives you a super flexible system to enhance Segmentation capabilities based on the type of products your customers are interacting with.

In the next post in the series, I’ll talk about using a customer’s Segment profile to provide a personalised experience across Web and Email channels.

Pillars of Delivering a Tailored Customer Experience on Optimizely DXP – Post 1 of 4

New Era Cap has been manufacturing baseball caps for American sports teams since the early 1930s. They are a heritage brand with their roots firmly laid in baseball, one of America’s most popular sports. Over their considerable history they have expanded into new sporting domains and evolved into a popular-culture lifestyle brand, with their products worn by some of the world’s most famous celebrities.

A brand like New Era Cap can mean a lot of things to a lot of different people, whether you are a passionate New York Yankees baseball fan or somebody who likes to keep up with the latest fashion trends. Because of that, it is so important that the brand’s E-Commerce experience and communications talk to customers in a personalised and familiar way.

This blog series will demonstrate how brands like New Era Cap are using Optimizely DXP technologies to provide their customers with consistent familiar omnichannel experiences.

The Three Pillars of Tailored Customer Experiences

Working in collaboration with New Era, we used Optimizely DXP technology to implement the three pillars of delivering a Tailored Customer Experience.

Pillar 1 – Segmentation

To effectively personalise a customer experience, you need to understand who individual customers are – their profile, their interests and their behaviours while interacting with your brand.

ODP (Optimizely Data Platform) captures unified customer profiles and behaviours. Using ODP you can create Segments grouping customers based on engagement with your brand or any other profile/behavioural criteria you define.

Pillar 2 – Personalisation

ODP Segments can be used to personalise web and email content for your customers using Optimizely Content Cloud Visitor Groups. There are a number of ways ODP Segments can trigger personalised Email campaigns, with the most appropriate depending on your Email Marketing solution provider.

Pillar 3 – Experimentation

How do you know the personalised content you are serving your visitors is converting? The next step is to Experiment with variations of your content.

This will inform you strategically about what is working best within customer Segments and allow you to iterate effectively. Optimizely Experimentation has pre-built integrations with Content Cloud to test exactly what is converting with your real-world customers.

Next Post

In the next post, I will delve into Pillar 1 – Segmentation using Optimizely technologies. We’ll look at how to build effective customer profiles and use them to segment your audience.

Integrating Optimizely Data Platform (ODP) for GDPR compliance

The most common concern I’ve seen raised in Europe is how best to integrate ODP with your website in accordance with GDPR data protection regulations.

Some ODP Background

ODP consists of the following key data entities that will be synced between your website and ODP.

Products – Product catalog structure and master data

Customers – The visitor’s profile data telling us who this customer is, such as Name, DOB, Language, etc.

Events – A visitor’s various interactions with the website, such as page views, login, add to cart and orders

Orders – Orders that the visitor has placed in your E-Commerce system

Of these four entities, all but the Product feed are typically subject to GDPR compliance regulations.

Commerce Cloud Connector

Optimizely’s Commerce Cloud connector is a no-code add-on that uses Optimizely Commerce Cloud’s Service API to synchronise the Product, Customer Contact and Order data entities.

In Europe you would be best advised to enable Product imports but disable Contacts and Orders, so that you can first check that your customers have agreed to the relevant cookie policies.

Note: ODP’s Commerce Cloud Connector is due for release in EU H1 2022

GDPR Friendly Integration

My advice is to separate your ODP tracking implementation from your website’s code base by using a Tag Manager that forwards data to ODP only for requests that adhere to the relevant cookie policies.

I’ll assume you are using Google Tag Manager for the rest of this post.

Use your Cookie Opt In Trigger

You will likely already have an Opt-In performance cookie trigger set up in GTM to manage the execution of your Tags. The set-up of your trigger will depend on the product you use to manage cookies, but it should look something like the below.

All ODP Tags should use this trigger to make sure that only customers who agree to share their data are synced to ODP.

Load the ODP JS Tag

As you are tracking a session, you need to load the ODP JS tag on each page. The JS tag will include the PageView event by default.

Log into your instance of ODP and copy the JavaScript tag from Integrations -> JavaScript Tag.

Then in Google Tag Manager you can simply add a Custom HTML Tag using the Opt-In Performance trigger.

This will load the ODP JS script and fire the “pageview” event for customers who have accepted the configured cookie policies.
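A simplified sketch of what that Custom HTML Tag does is below. The loader is a stand-in for the snippet you copy from ODP (your real snippet will differ, and 'TRACKER_ID' is a placeholder): it queues any tracking calls made before the full script has loaded.

```javascript
// Simplified stand-in for the ODP JS loader snippet copied from
// ODP -> Integrations -> JavaScript Tag. 'TRACKER_ID' is a placeholder.
function installOdpStub(win) {
  var zaius = win.zaius || (win.zaius = []);
  // Queue any calls made before the real script arrives.
  ['initialize', 'customer', 'event'].forEach(function (method) {
    zaius[method] = zaius[method] || function () {
      zaius.push([method].concat(Array.prototype.slice.call(arguments)));
    };
  });
  return zaius;
}

var fakeWindow = {}; // stands in for the browser `window` here
var zaius = installOdpStub(fakeWindow);
zaius.initialize('TRACKER_ID'); // placeholder tracker ID
zaius.event('pageview');        // the default page view event
```

Because the tag only runs on the Opt-In Performance trigger, the queue is never even created for visitors who decline the cookie policy.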

Using the Tag Manager Data Layer

Using this approach you can push any Tag Manager data layer events you are already firing to ODP and push new data layer events for any other data you want to track.

Customer Profile

As you learn new profile information about your customer, push a “CustomerProfile” event to the data layer containing the customer profile object.

Then create a Tag to push that data onto ODP as demonstrated below.

Note the timeout wrapping the ODP API push. This works around an intermittent race condition where the event could fire before the JS script had loaded, causing the event to fail.
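As a sketch of that Tag (the `zaius.customer` call, the data layer field names and the delay are assumptions for illustration – adapt them to your own snippet):

```javascript
// Sketch of the "CustomerProfile" GTM tag body. The field names on the
// data layer event are assumptions for illustration.
function buildCustomerPayload(dataLayerEvent) {
  return {
    email: dataLayerEvent.email,
    first_name: dataLayerEvent.firstName,
    last_name: dataLayerEvent.lastName,
  };
}

// Wrap the push in a timeout so it fires after the ODP JS tag has
// loaded, working around the race condition described above.
function sendCustomerProfile(zaius, dataLayerEvent, delayMs) {
  setTimeout(function () {
    zaius.customer(buildCustomerPayload(dataLayerEvent));
  }, delayMs);
}
```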

Event Tracking

Events are actions the customer takes on your website. They can be anything from page views to keyword searches or completing a checkout.

The below example triggers the Tag when a “Search” event is pushed to the data layer and the customer has accepted the relevant cookie policies.
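A minimal sketch of the payload such a tag might send – the event type, call shape and field names here are assumptions for illustration:

```javascript
// Sketch of a "Search" event payload for ODP. In the GTM tag body you
// would pass this to the ODP JS API, e.g. zaius.event('navigation', payload).
// The event/field names are assumptions for illustration.
function buildSearchEvent(dataLayerEvent) {
  return {
    action: 'search',
    search_term: dataLayerEvent.searchTerm,
  };
}
```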

Conclusion

Using a Tag Manager is a great low code option for integrating ODP into your website while maintaining your GDPR compliance.

Manually Importing Products from Commerce Cloud to Optimizely Data Platform (ODP)

ODP has a turnkey integration app called the Commerce Cloud Connector, which can synchronise Contact, Order and Product data between your Optimizely Commerce Cloud instance and ODP.

However in Europe the Commerce Cloud Connector is not due to be released until the end of June 2022.

In Europe, due to GDPR compliance you may never be able to turn on the connector for Customer Contact or Order data. I can cover that in a separate blog post, but you will need to make sure that the relevant cookie policies have been accepted by the customer.

While waiting for the connector’s release in Europe, a manual product data export and import to ODP is a low-cost interim solution. The key is to make sure that the schema you choose for saving products in ODP will be maintained when you turn on the connector’s product import.

Key Product Data Schema Import Considerations

ODP currently supports a single currency, which displays in the UI as USD. Support for additional currencies is on the product roadmap, but for the moment you will have to choose whatever you determine to be the default currency on your website for the product export.

ODP is flexible with regard to your product data structure. However, if you intend to use the connector, it’s best to follow a Product -> Variant model matching your Commerce catalog so that the consistency of your data is maintained.

Exporting Product data

Assuming your default market is GB, the following script will export your products from the catalog in GBP£.

SELECT
       CN.[Name] AS category,
       CE.[Name] AS [name],
       CE.Code AS sku,
       MIN(PD.UnitPrice) AS price,
       -- Variants report their parent product's code; top-level products report their own
       CASE
              WHEN CEParent.Code IS NOT NULL THEN CEParent.Code
              ELSE CE.Code
       END AS parent_product_id
FROM [dbo].[CatalogEntry] CE
       INNER JOIN NodeEntryRelation NR ON NR.CatalogEntryId = CE.CatalogEntryId
       INNER JOIN CatalogNode CN ON CN.CatalogNodeId = NR.CatalogNodeId
       LEFT OUTER JOIN [dbo].[CatalogEntryRelation] ER ON ER.ChildEntryId = CE.CatalogEntryId
       LEFT OUTER JOIN [dbo].[CatalogEntry] CEParent ON ER.ParentEntryId = CEParent.CatalogEntryId
       LEFT OUTER JOIN [PriceDetail] PD ON PD.CatalogEntryCode = CE.Code
WHERE (PD.MarketId = 'GB' OR PD.MarketId IS NULL)
GROUP BY CE.CatalogEntryId, CN.[Name], CE.Code, CE.[Name], CEParent.Code
ORDER BY CE.CatalogEntryId

The output of the script can be copied to a CSV file for import to ODP.

Row 1 of the above export is the product, with its variants linked via parent_product_id.

Price in this example is the GBP price for the GB market. You can update the WHERE clause to isolate the prices for your default market of choice. Support for multi-market and multi-currency pricing is on the roadmap for ODP, so I look forward to hearing more on that in the coming months.

Importing to ODP

Copy the output of this script to Excel to convert to a CSV file before doing the following:

  • Do a Find/Replace to convert any “NULL” values in the price column to empty
  • Update price values to 2 decimal places

Then name the file odp_products.csv and drop it into the ODP Integration CSV Upload interface.
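If you’d rather script those clean-up steps than do them in Excel, here is a sketch (assuming a simple comma-separated export with the price in the fourth column and no embedded commas):

```javascript
// Normalise exported rows before upload: blank out "NULL" prices and
// format remaining prices to two decimal places.
// Assumes columns: category,name,sku,price,parent_product_id
function normaliseRow(line) {
  const cols = line.split(',');
  const price = cols[3];
  cols[3] = (!price || price === 'NULL') ? '' : Number(price).toFixed(2);
  return cols.join(',');
}

// Apply to every data row, keeping the header line untouched.
function normaliseCsv(csv) {
  const [header, ...rows] = csv.trim().split('\n');
  return [header, ...rows.map(normaliseRow)].join('\n');
}
```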

Conclusion

While you’re waiting for the ODP Commerce Cloud connector to be released in Europe, this simple Export/Import process will help you migrate product data into ODP in a way that will be consistent with the connector schema.

Experimenting with your Keyword Search Algorithm – 2 of 2

In post 1 in the series we set up a CMS manageable Search Algorithm Settings Page and plugged that into our Search & Navigation query.

We’ll now build on that to create an Experiment that will help us determine the optimal algorithm configuration.

Extend the Search Algorithm Settings Page

Firstly, we will extend our Algorithm Settings Page with the following properties:

        [CultureSpecific]
        [Display(Name = "Is Default", Description = "Indicates that this content represents the default Search Algorithm configuration", GroupName = TabNames.Experimentation, Order = 200)]
        public virtual bool IsDefault { get; set; }

        [CultureSpecific]
        [Display(Name = "Experiment Key", Description = "Matches the variation key of the Optimizely Experiment", GroupName = TabNames.Experimentation, Order = 210)]
        public virtual string ExperimentKey { get; set; }

When we set up our experiment these culture specific properties will allow us to set a default configuration and experiment with algorithms across language sites.

Setting up the Experiment

In my previous Optimizely Experiments series, I talked about setting up and running an Experiment, which covers the basics of the Optimizely Full Stack product. Rather than revisit them, I’m going to assume the foundations and concepts explained in that series are understood.

Experiment Event Metrics

Log into your Optimizely Full Stack account to begin configuration.

Define a new metric for this Experiment called “Keyword Search Click Thru”. This event will be registered when a customer clicks a search result from a keyword search. Higher numbers of these events will indicate that the algorithm is returning relevant results to our customers.

If you are using Optimizely Search and Navigation typed search, you will most likely have Click Tracking implemented in your solution. Read more on that here:

Assuming you are using Optimizely Search and Navigation you could push the event to the Experiments API at the same point that you register the search click tracking.

Experiment Set Up

Let’s now publish variations of our Search Algorithm Settings page in the CMS, setting the Experiment Key to a unique value for each instance of the page.

Then log into Optimizely Full Stack and configure your Experiment Variations noting that we will use the “experiment_key” variable to map to the settings page instance.

Finally, we can configure our experiment to A/B test instances of our Algorithm Settings page.

Executing the Experiment

To retrieve the Experiment Key in code using the Experiment Service detailed in my previous blog series, you simply need a few lines of code:

    var decision = _experimentationService.Decide(HttpContext, "keyword_search_algorithm_experiment");
    if (decision != null && decision.Enabled)
    {
        return _experimentationService.GetFeatureVariableString(HttpContext, decision.FlagKey, "experiment_key");
    }

Now that you have the key, you can use this to load the appropriate Algorithm Settings page and AB test the effectiveness of your search configuration.

Conclusion

Search algorithms are tricky, and the optimal configuration for each Commerce application depends on too many moving parts to predict, such as catalog size, data, user intent and market-specific preferences.

Using Experimentation to measure your algorithm’s success, tweak, learn and measure again is the best possible way to move towards the sweet spot and grow your conversions.

Experimenting with your Keyword Search Algorithm – 1 of 2

This series will discuss an approach to experimenting with your Keyword Search Algorithm using the Optimizely Search & Navigation, Commerce Cloud and Experimentation products.

Search & Navigation

At a basic level the following happens on a keyword search:

  • Customer attempts a keyword search on your Commerce Cloud website
  • Commerce Cloud code builds a Search & Navigation query and posts to API
  • Search & Navigation executes the query on its Elasticsearch index, taking into account factors such as statistics gathered through Click Tracking, to assign a relevancy score to the products determined to be the best search results

Assuming click tracking is already implemented, the variable we have control over to tweak in optimising search performance is the search query sent to Search & Navigation.

Search Query Optimisation

To CMS manage the query, we create an Algorithm Settings Page that can integrate with the search query sent to the API.

Algorithm Settings Page

The following is an example of a Search Algorithm Settings page PoC from my Github:

https://github.com/johnnymullaney/Foundation/blob/search-algorithm-experiment/src/Foundation/Features/Search/Search/SearchAlgorithmSettingsPage.cs

I’ve divided the content type into three tabs representing important aspects of the query.

1 Property Weightings

Property weightings specify the perceived value we assign to a keyword match in various content properties.

For example, we may specify in the query that a keyword match in a Product Title should be twice as relevant as a match in the Description. This will be a key factor when Search & Navigation assigns relevancy scores to search results for that request.

2 Boosting

Your query can instruct Search & Navigation to further boost the relevancy scores of search results that meet pre-defined conditions. For example, a query may include a relevancy boost for products that are marked in the CMS as “Popular”.

3 Query Optimisation

I am categorising query optimisations as anything else that may improve the quality of search results such as Synonyms, Fuzzy Search, And/Or operators etc.

There is a very useful Labs Github repository which contains a number of extensions that can be integrated with your Search & Navigation solution to improve search performance.

https://github.com/episerver/EPiServer.Labs.Find.Toolbox

Integrate Settings Page with Search Query

We can integrate our Search Algorithm Settings Page into the query that we build in code.

    var query = _findClient.Search<GenericProduct>()
        .For(filterOptions.Q)
        .MinimumShouldMatch(searchAlgorithmSettings.MinimumShouldMatch)
        .InField(x => x.DisplayName, searchAlgorithmSettings.DisplayNameWeighting)
        .InField(x => x.Brand, searchAlgorithmSettings.BrandWeighting)
        .InField(x => x.Description, searchAlgorithmSettings.DescriptionWeighting);

    if (searchAlgorithmSettings.EnableSynonymsImproved)
    {
        query = query.UsingSynonymsImproved();
    }
    else
    {
        query = query.UsingSynonyms();
    }

    if (searchAlgorithmSettings.EnableDisplayNameRelevanceImproved)
    {
        query = query.UsingRelevanceImproved(x => x.DisplayName);
    }

    if (searchAlgorithmSettings.EnableDisplayNameFuzzyMatch)
    {
        query = query.FuzzyMatch(x => x.DisplayName);
    }

    if (searchAlgorithmSettings.EnableDisplayNameWildcardMatch)
    {
        query = query.WildcardMatch(x => x.DisplayName);
    }

    if (searchAlgorithmSettings.PopularProductBoostingValue > 0)
    {
        query = query.BoostMatching(x => x.PopularProduct.Match(true), searchAlgorithmSettings.PopularProductBoostingValue);
    }

Next Post

In the next post, we’ll create an Optimizely Experiment to AB test the success of search query variations.

Foundations of “Good SEO” with Optimizely

Maximising traffic acquisition through good SEO practices is a key strategic goal of any commercial website.

Recently I’ve been working with some brands on SEO optimisation from a technical perspective and have spent some time trying to map out what exactly “Good SEO” means.

So what is Good SEO?

That’s a tough question!

In my opinion Good SEO goes far beyond any particular discipline. It means great health and performance across a number of overlapping broad areas from technical to the quality of content and beyond. To execute effectively in each area entails strong process, execution and ongoing health checks.

The broad areas I look at SEO through are as follows:

  1. Technical Best Practices
  2. Crawlability / Indexation optimisations so crawlers can efficiently discover the entire site
  3. On Page HTML optimisations so crawlers can understand the intent and content of your page displaying relevant information in Search Engine Results Pages
  4. International domain strategy linking content across languages and markets
  5. Off Page brand health contributing to a brands authority

I attempted to map SEO considerations in these areas out in the diagram below. For diagramming simplicity each point is allocated to a particular area but I acknowledge that many of the points can overlap multiple areas. Please feel free to leave a comment below if you see something I have omitted.

SEO with Optimizely

Optimizely CMS and Commerce Cloud provide the framework to enable really strong SEO practices.

By practices I mean both high quality performant technical implementations coupled with the tooling to enable good content planning & management processes across your organisation.

There are great community add-ons that are extendable to provide really strong SEO foundations.

Site Health Checks

Websites we work with often have tens of thousands of pages or more. A stray piece of code released to production or a content management mistake can cause real SEO issues. These issues will often be invisible… until they are discovered. So the challenge is discovering issues before they cause a commercial impact.

I’ve worked with and recommend SEMRush as a tool. You can automate SEO Site Audits to execute regularly and email metric-driven reports detailing areas like HTML Markup errors, International SEO errors, Crawlability, Site Performance and a number of other metrics. It also plugs into Google Search Console and Google Analytics so you can view current performance within the reports.

Regularly tracking these metrics means you can identify issues early as you improve your site health scores over time.

Next Post

In this post I’m simply sharing details about something I have been working on in the hope that someone will find it useful.

This is a really big topic, so I am happy to extend it into posts on more specific areas of good technical practice. If it’s useful, let me know 🙂

Optimizely Block Output Caching

There are lots of good quality Blog and Forum posts about different ways to implement Block-level, Donut-style Output caching in Optimizely.

The below is the approach I prefer to use. It keeps things simple while making sure that the cache is unique per visitor group and language. Using this method, the cache will also be invalidated when a new version of a block is published.

Block controller

Add the Output Cache attribute to the Block Controller index method.

    [OutputCache(VaryByParam = "currentBlock", VaryByCustom = "language", Duration = 600)]
    public ActionResult Index(ProductListBlock currentBlock)

VaryByParam: Setting this to “currentBlock” caches based on the BlockData model’s ToString() method, which returns the cache key. We add our logic there to make sure the cache is refreshed appropriately.

VaryByCustom: Used to extend output caching with custom requirements. You must handle logic for the custom string by overriding the GetVaryByCustomString method in the Global.asax. We use this to cache language specific versions of our block.

Block Model

Override the Block model ToString() method to return unique cache keys for published versions of the block.

    public override string ToString()
    {
        var content = this as IContent;
        var changeTrackable = this as IChangeTrackable;

        if (content != null && changeTrackable != null)
            return $"{content.ContentGuid}-{changeTrackable.Saved.Ticks}-{EPiServer.Editor.PageEditing.PageIsInEditMode}";

        return base.ToString();
    }

Global.asax

Override the GetVaryByCustomString method in the Global.asax for the custom string passed into the OutputCache declaration. In our case this makes sure that language versions of the block are uniquely cached.

    public override string GetVaryByCustomString(HttpContext context, string custom)
    {
        if (custom == "language")
        {
            // Return the language cookie value so each language version is cached separately
            return context.Request.Cookies["Language"]?.Value;
        }
        return base.GetVaryByCustomString(context, custom);
    }

Personalising Headless Content across Channels

Personalising Headless CMS content across channels can be achieved with Optimizely Visitor Groups when a property in the http request can identify the source.

For example a mobile app request sent to a Headless CMS could include the following Http Header key value: “mobile-app-request”:true
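For example, a hypothetical mobile app client could attach that header when requesting content. The endpoint URL below is a placeholder, and the request shape is just a sketch:

```javascript
// Sketch of a mobile app calling a headless CMS content endpoint,
// identifying itself via a custom header the Visitor Group can match on.
// The URL passed in is a placeholder for your own content delivery endpoint.
function buildContentRequest(contentUrl) {
  return {
    url: contentUrl,
    options: {
      method: 'GET',
      headers: {
        'Accept': 'application/json',
        'mobile-app-request': 'true', // matched by the Visitor Group criterion
      },
    },
  };
}

// Usage (in the app):
//   const { url, options } = buildContentRequest('https://example.com/api/content/123');
//   fetch(url, options).then(...);
```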

Request Header Visitor Group Criterion

A visitor group criterion is required that can match a request when it sees a header with a specified value.

We have published the following GitHub repo containing source code for a Request Header Visitor Group Criterion, packaged so that it can be added to a NuGet feed.

https://github.com/made-to-engage/EPiServer-Request-Header-Visitor-Group

Once installed, CMS Editors have the ability to personalise content per channel.

My Experience with Optimizely Fullstack (via Rollouts) – 4 of 4

In this final post of the series, we will investigate the result of a real world experiment and discuss what the results meant to the roadmap of this project.

So far we have talked about working with Optimizely Fullstack from a development perspective and integrating with a Commerce application. Then we configured and coded our first experiment in a few lines of code.

A Real World Experiment

Just like the Experiment we have talked through setting up in this series, we ran an experiment for an E-Commerce website to determine the optimal Sort Order for category pages.

This website’s category pages defaulted to displaying Newest products first. The theory was that customers interact with this website to purchase new-release products.

However there was a hypothesis that:

If customers view products on category pages with Best Selling products first, the following success criteria will be proven:

  • Customers will click to view more products
  • Average Order Value will increase
  • Revenue will increase

Real World Results

With Event Tracking enabled as discussed in the previous posts in the series, Optimizely provided us with the following results on each of our success criteria.

Overall Results

We rolled the experiment out to just 10% of total sessions. Even with a low percentage we began to see interesting trends.

At this early point in the Experiment, Customers who were served Best Sellers by default:

  • Were as likely to click through to view a product details page on either New or Best selling products. No notable difference
  • Had an Average Order Value 10% behind New Products
  • Were 19% behind on generating revenue

Of course you need many more sessions and a longer experiment period for the results to achieve significance, but these early findings were interesting.

It showed the original hypothesis for the Experiment was likely wrong. The Experiment was failing – and that in itself was a game changer.

What else did we find out?

Using Optimizely Reports, you can refine your Results by any Audience Attribute.

Refining the reports per language and/or market showed that some Markets had a huge preference for New products. It didn’t seem to matter for other Markets, where the difference was insignificant.

Another game changer. We need to refine and improve our algorithms to best serve products to users of different markets.

And how do we do that?

  1. We are going to create our hypothesis for what customers prefer in each market based on analytics and experimentation results
  2. We are going to run experiments for each market as we work through a program of experimentation, analysis and continued optimisation.

And this is just for the Category page sort order!

Conclusions

Starting is Low Cost & Easy!

Sign up for a free Rollouts account.

Use the code bases discussed in this series to get started really quickly.

Start with a simple experiment.

1 Experiment leads to many more

We picked a simple Experiment – Sort Order on Category pages. From that experiment came data on customer behaviour.

From that data came questions. A big finding was that customers seem to have quite different expectations across markets.

Let’s run more targeted experiments to find out more on this…

We should also run experiments on all key functionality as we continue to optimise conversions.

Strategic Process Changes

Well designed Experiments are a low cost way of validating assumptions around how your customers interact either with the current site or may interact with new features being thought up.

Run simple Experiments with MVP implementations to prove value before proceeding with full delivery of an expensive feature. Failed experiments are valuable. This ensures budget is used effectively.

Also Experiment to optimise existing user journeys and functionality to hit the sweet spot across all segments of users. And then continue to measure. It is guaranteed that sweet spots will move over time.