In my day job, I’ve recently been spending a lot of time building heatmaps, and admittedly it wasn’t the easiest thing I’ve ever done. What was amazing about this project is that it really made me think outside the box, as there were so many cases where I considered using alternative technologies. Unfortunately, there also aren’t many good, real-world examples of how to build these heatmaps.
We wanted to build something computationally lightweight; after all, we had several factors to think about here, including the overall user experience, the overall cost of the project, some financial projections, etc. I did wonder about implementing the backend as a service that would consume the data & essentially build videos out of what is essentially frame-by-frame content. But then I thought that if the service were ever under an immense amount of load, we could realistically see issues there: it would either require some serious scaling, or it could slow down the rest of the application.
Instead, our team at Quote On Site decided to go with a much simpler solution. I won’t give away too much of the secret recipe, but what I can tell you is that the backend is essentially a CRUD service. The front end is where things got pretty complicated, not only in ensuring that we’re recording the right information: on top of the mouse tracking, we also record how much real-world, valuable time the user spends looking at each page within a document. This involved many variables, from the positioning of the page within the viewport, to whether or not the user is inactive, e.g. they could’ve walked off to make a cuppa. ☕
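To make the idea of “valuable time” a bit more concrete, here’s a minimal sketch of how a page-level timer like that could work: only count time while the page is in the viewport *and* the user has been active recently. The names, the 30-second idle threshold, and the overall shape are all my own illustrative assumptions, not the actual Quote On Site implementation.

```typescript
// Illustrative sketch only — not the real implementation.
// Counts "valuable" viewing time: spans where the page is visible
// in the viewport AND the user has produced input recently.

const IDLE_THRESHOLD_MS = 30_000; // assumed cutoff: user walked off for a cuppa

class PageTimer {
  private visible = false;
  private lastEventAt = 0;    // timestamp of the last state change we processed
  private lastActivityAt = 0; // timestamp of the last user input
  private accumulatedMs = 0;

  // Call from scroll handlers / an IntersectionObserver callback.
  setVisible(visible: boolean, now: number): void {
    this.flush(now);
    this.visible = visible;
  }

  // Call from mousemove / keydown / touch handlers.
  recordActivity(now: number): void {
    this.flush(now);
    this.lastActivityAt = now;
  }

  // Total "valuable" milliseconds accumulated so far.
  total(now: number): number {
    this.flush(now);
    return this.accumulatedMs;
  }

  private flush(now: number): void {
    if (this.visible) {
      // Stop counting once the user has been idle past the threshold.
      const idleCutoff = this.lastActivityAt + IDLE_THRESHOLD_MS;
      const countedUntil = Math.min(now, idleCutoff);
      this.accumulatedMs += Math.max(0, countedUntil - this.lastEventAt);
    }
    this.lastEventAt = now;
  }
}
```

Driving it from timestamps rather than timers keeps the logic pure & easy to unit-test, which matters a lot when the data it produces ends up posted to the server.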
All in all, it works great. I thought the recording might actually perform a lot worse than it does, so I’m pleasantly surprised. Given all the data that’s being requested & posted to the server, I expected some serious performance hits, but so far, it’s performing like a champ! 🏆
On top of all of the above, in another iteration we decided that we wanted a session replay feature, essentially allowing our users to see what the customer did with the document. We decided, by design, to replay at a rate of +25%. This was quite tricky, especially as previous playbacks & previous sessions had slightly different values stored in the data layer, causing conflicts, so we essentially needed to start versioning bits of the software from day one. 😅 … After all, if we played it back at real speed, it would be like watching paint dry!
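As a rough sketch of what those two concerns look like together — replaying at +25% while coping with older data-layer shapes — here’s one way it could be structured. Every field name, the version numbers, and the v1 → v2 migration are hypothetical; the real schema is part of the secret recipe.

```typescript
// Illustrative sketch only — field names & versions are assumptions.

const PLAYBACK_RATE = 1.25; // +25% replay speed

interface ReplayEvent {
  version: number;
  at: number;   // ms offset from session start
  kind: string;
  x?: number;
  y?: number;
}

// Older sessions stored values under different keys; normalise them to
// the current shape before replay (a hypothetical v1 -> v2 migration).
function migrate(raw: Record<string, unknown>): ReplayEvent {
  if (raw.version === 1) {
    return {
      version: 2,
      at: raw.timestamp as number,
      kind: raw.type as string,
      x: raw.clientX as number | undefined,
      y: raw.clientY as number | undefined,
    };
  }
  return raw as unknown as ReplayEvent;
}

// How long to wait before dispatching the next event at 1.25x speed.
function replayDelay(prev: ReplayEvent, next: ReplayEvent): number {
  return (next.at - prev.at) / PLAYBACK_RATE;
}
```

The nice property of migrating on read is that old sessions stay replayable forever, without ever rewriting what’s stored in the backend.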
But I am certainly happy to report that even with the different iterations, it all still works great. For me, the most challenging part of this feature has to be synchronising the application state, as we’re using an iframe to replay the user sessions. If you’ve ever tried to manage state in a relatively large & complex web application, you’ll know that this task in itself can be quite complex, hence the likes of reducers coming into existence. Well, imagine trying to do that on two different levels, in parallel; that’s the kind of fun I’ve had trying to synchronise the state. I had a moment where I spent quite some time just brain farting… 🥴 … In all honesty, trying to synchronise the state within the iframe & the state in the parent DOM got me quite a few times. As I wanted to produce a pretty solid MVP in as little time as possible, I was pushing myself, & in all honesty I think part of the problem was that because I was racing the clock, I kept forgetting that the state is shared between what are essentially two different DOMs. 😂
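One general pattern for this two-DOM problem — and I stress this is a sketch of the pattern, not our actual code — is to give both the parent & the iframe the same pure reducer, and have each side forward its actions to the other via `postMessage`. If both sides apply the same action stream in the same order, they stay identical replicas. All the state & action shapes below are made up for illustration.

```typescript
// Illustrative sketch only — state/action shapes are assumptions.

interface ReplayState {
  page: number;
  cursor: { x: number; y: number };
}

type Action =
  | { kind: "goto-page"; page: number }
  | { kind: "move-cursor"; x: number; y: number };

const initialState: ReplayState = { page: 1, cursor: { x: 0, y: 0 } };

// One pure reducer shared by both DOMs: same actions in, same state out,
// on either side of the iframe boundary.
function reduce(state: ReplayState, action: Action): ReplayState {
  switch (action.kind) {
    case "goto-page":
      return { ...state, page: action.page };
    case "move-cursor":
      return { ...state, cursor: { x: action.x, y: action.y } };
  }
}

// Browser-only wiring: forward every local action to the other context,
// e.g. broadcast(iframeEl.contentWindow, action) from the parent.
function broadcast(target: Window, action: Action): void {
  // In production you'd pin targetOrigin to your own origin, not "*".
  target.postMessage({ source: "replay-sync", action }, "*");
}
```

Keeping the reducer pure means the synchronisation problem shrinks to “deliver the same actions to both sides”, which is a much easier thing to hold in your head when you’re racing the clock.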
Not convinced that the heatmaps work pretty well? Then by all means, take a look at this nice little piece of marketing material that someone in my team put together! 😀