Our worst data mistakes
👤 Featuring Pierre Perez, Success Director & Angie Judge, CEO of Dexibit
Reflecting on 10 years of experience transforming visitor attractions with data, Pierre and Angie dissect the lessons which have shaped Dexibit’s approach over the years – what works and more importantly, what doesn’t.
Transcript (generated with AI)
If you want to go from gut feel to insight inspired, this is the Data Diaries with your hosts from Dexibit, Pierre Perez and Angie Judge. The best podcast for visitor attraction leaders passionate about data and AI. This episode is brought to you by Dexibit. We provide data analytics and AI software specifically for visitor attractions so you can reduce total time to insight and total cost of ownership while democratizing data and improving your team's agility. Here comes the show!
Angie: Okay, we're gonna talk about failure and mistakes and tell you all of the things that we've learned over the years, either doing data transformations ourselves and the missteps we've made, or when we talk to visitor attractions who turn up to Dexibit having tried to build something themselves, and the lessons they've learned that they share with us as well. And here with me today is Pierre. Pierre, what do you reckon is on my list? Top, top mistakes.
Pierre: First of all, I like what you say about mistakes we have made, as the point of learning. When you do data work, you will continue to make mistakes, but because you've made some in the past, it's about how quickly you react to them and how quickly you apply a fix, right? So making mistakes is a natural step in the process. It's never gonna go away. But it's about how quickly you react and what fix you put in place. So there is a massive learning here. In terms of what the most common mistakes are: a lot of the time, people undervalue the amount of work implied in a specific integration. A very good example would be someone going with a new vendor thinking it would be very easy to extract data: oh yes, they have an API, oh yes, they have all these fancy things. And then we turn up to integrate and realize it's not as straightforward as it could be. So ensuring that there is good data availability and assortment from the sources is one of them.
Angie: It's so true. And it's funny, I was reading a stat the other day that 85% of data projects fail. This is a Gartner finding, and the number one thing they cite is integration difficulties, followed by things like management resistance and internal politics, but integration difficulties are right up front. And I think in our industry, visitor attractions in particular, that's a really difficult one because a lot of the systems are quite old. A lot of the ticketing companies have been around for a long time. They're pre-cloud. Maybe they've made attempts to get on the cloud. Maybe the box says there's an API, but under the hood, not really. Or maybe there is one, but it's very suboptimal: doesn't include the right attributes, segregates data in the wrong way, all of these sorts of things. Sometimes you don't know these things until you get into it. Sometimes the vendor coming in can give the customer the heads up on what the realities are going to be like.
But yeah, for sure, this is one of the things that internal teams, when they're trying to build something themselves, really underestimate: the level of difficulty, not only to build, but also to maintain longer term, and what that's like over generations of data teams, over generations of vendors, over all of the different maintenance things.
Like, what if the vendor goes down? What if they retire part of their API and issue an end of life notice on it? What if they change how they address some attributes? There's a lot of maintenance involved after the fact. Did I have that one on my list? I'm trying to think. I've got an integration one on here. One of mine would be trying to integrate all the things without actually making any forward progress on insights and operationalizing data. It gets to the point where it's like, oh, we've got this e-commerce system. How much of your revenue does it comprise? Less than 1%. Or this extra ticketing system, what do you use it for?
Oh, this one event that we run each year. And it's like, we wanna integrate this and this and this and this, just adding a huge pile of complexity rather than getting going with the basics: getting some dashboards out the door, getting some users trained up, getting some reports automated, just making forward progress. My big lesson learned is to focus on the big rocks and get momentum going, and for the rest, upload some extracts if you really need the data, but focus on getting data out to the people. Then come back for all of those extras later on over time.
Pierre: This is such a good point, and so timely, because you and I had a conversation with a stakeholder this week regarding that topic. That stakeholder had limited resources in their team and their budget to build from scratch, and when they talk to folks in the organization, everyone wants everything, but they don't really know how they want it. And that puts a lot of pressure on the technical team: they try to build what they are being told to build without necessarily understanding what the end user or the team is after. And that's certainly a mistake that we've made in the past ourselves, with people telling us, hey, we want this, and hey, we want this.
And we're like, oh yeah, we really wanna make them happy, we're gonna get this connected, without taking a step back and saying, hold up a second. What is it that we're trying to achieve by bringing in this data set, and how often are we gonna be looking at it to make it relevant, right?
And sometimes, as you mentioned, we think of taking a very complex, real time route to bring data in front of them, when there may be a simpler alternative that gives them the same results but is a lot lighter on their team and on our team as well.
So I agree with you. Wanting everything at once is sort of a utopia: we can have everything ready really quickly. And it's not always the case.
Angie: Yeah. And it's just over investment really, isn't it? The reality is it just pushes out the timeframes of getting all the data integrity accepted and all the dashboards designed, for very little gain. Like you say, there's often that one data set that somebody looks at once a year, and it's a bit deflating to go through getting the data all perfected,
only to find that nobody looks at it. Speaking of this one, I think next on my list, and this is gonna be popular with you: pursuing a hundred percent data integrity when there's little business value in doing so. People can't see Pierre's face right now, but he's got that 'oh, yes' look.
So often we find, oh, there's this refund of five tickets from a few months ago and now the data's out of sync and it missed the reconciliation window, or whatever the case might be. I mean, data integrity is really important. People need to have confidence in their data. And oftentimes we integrate and, voila, a hundred percent accuracy, everybody's happy. But if that's not the case, 99% is good enough. You can spend endless wasted hours chasing a few tickets down here, or a wholesale retail transaction down there, or trying to work out this random thing that happened once in 2023. And the reality is, to make good decisions and plans, it doesn't need to be perfect. Perfect really is the enemy of progress in this case.
Pierre: I mean, by default, I am from the school that done is better than perfect from the get go, right? And perfect has a very subjective definition for every single team in the organization. Your finance folks will always be a lot more focused on accuracy than your visitor experience or operations folks, et cetera.
So it really depends on who you speak to in terms of expectation of accuracy. What we usually do is focus on insights and trends. The fact that your visitation may be up 25.01% against 25.02% is irrelevant in terms of accuracy.
The relevancy is: is it up or down, and by how much? And when you look at your other data in line with that, in parallel with that, can you see trends and insights? That, in my eyes, is more important than a hundred percent accuracy, along with making sure that the rules you have around each specific data point are loud and clear to everyone.
When do we recognize someone as a visitor? When do we recognize someone as an exhibition attendee? When do we recognize revenue for a specific ticket? Et cetera, et cetera. These are really important rules to put in place. But having a discrepancy of 1 or 2% in an overall report or an annual count is probably gonna have very limited impact in the short, medium or long run.
Angie: Yeah. And a big hack that you'll find... I'm giving away the house secrets here, so if we ever do this to you, you know what we're doing now. If you ever find yourself in this situation, one of the things we do is, somebody will say, oh, this data set is out by 42 tickets, for example. We'll immediately convert that into a percentage, because that puts things into perspective for people making a decision about the decreasing returns of chasing those last few points of a percentage on data accuracy. And sometimes the other thing to think about is the reality of how good the data is in the first place. A typical one we see with this is Google Analytics. Google Analytics comes out of Google slightly differently depending on how you dimension it and how you pull that data out. But we also have to realize that, particularly in this changing world of cookies, and now of generative AI appearing in search results or even just straight into conversational search or conversational content, it's not that accurate to start off with. Sorry, Google. So worrying about a couple of percent on Google Analytics, depending on how that data has been dimensioned, can be a bit of a lost cause as well.
I mean, the big thing is to see: is it going up or down? How does it relate to visitation trends? What's performing and what's not? What are the conversion rates and how do they change over time? That's the stuff that really matters, rather than, was it 1,337 or 1,339?
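The percentage hack Angie mentions is simple arithmetic. Here is a minimal sketch; the function name and the numbers are ours, purely illustrative:

```python
def discrepancy_pct(reported: int, expected: int) -> float:
    """Express an absolute record discrepancy as a percentage of the expected total."""
    if expected == 0:
        raise ValueError("expected total must be non-zero")
    return abs(reported - expected) / expected * 100

# "Out by 42 tickets" sounds alarming against a 100,000 ticket total...
gap = discrepancy_pct(reported=99_958, expected=100_000)
# ...but it is 0.042% of the total, well inside a 1% tolerance.
```

Framing the gap as a fraction of the whole makes the diminishing returns of chasing it obvious.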
Pierre: Absolutely. And with that, when we talk about this and when we talk about impact, there are also different updates, right, that may be made either at the source or by the vendor, that may affect your accuracy along the way. For example, a specific vendor pushing an update on a specific day, and you're thinking, oh, everything's gonna run smoothly.
We don't touch any of that part of the product. And then the day after comes and you're like, whoa, hold up a second. They actually touched a lot more than that specific part of the API, that specific part of the product. And there is now a bigger discrepancy, or there are things that we are missing after that update.
They may have changed an attribute name, they may have added an attribute. There are ample things that can go wrong, or go differently, when an update is pushed. And I think that's one that would be on my list as well: if someone mentions that they're making an update, you want to follow up very quickly in the product to understand, has this changed my numbers, yes or no? Because things can go wrong from there very, very quickly if you're not aware that it has changed the results you're getting from that integration.
Angie: Yeah, that's so true. Data integrity is not a one time thing. It's a thing that you need to address periodically, and like Pierre says, especially if something's changed, go back and check. Just do those health checks every now and then. One of my favorite areas to do this with is physical hardware, particularly people counting cameras. There have been some very famous cases out of the UK, a couple over there, where the footfall camera counting visitors coming through the perimeter broke, or worse, broke quietly, where it started undercounting. These are very large national museums and galleries.
One case where the light bulb dimmed, or they replaced the light bulb with a slightly dimmer one, didn't recalibrate the footfall counter, and undercounted. We are talking about half a million to a million visitors a year affected. Public funding was involved and it was a really big deal in the media.
These are the sorts of things that periodic data integrity checks catch, in this case by doing manual clicker count tests. My suggestion is always to do them at daylight savings. We are always reminded to check our smoke alarm batteries at daylight savings; also check your people counter accuracy at daylight savings.
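A back-of-envelope version of that health check could look like this; the hourly tallies and the function name are invented for illustration:

```python
def counter_accuracy(sensor_counts, manual_counts):
    """Compare sensor tallies against manual clicker tallies for the same window.

    Returns accuracy as a percentage: 100 means the sensor matched the manual
    count exactly; a low figure is a prompt to recalibrate the hardware.
    """
    sensor_total = sum(sensor_counts)
    manual_total = sum(manual_counts)
    if manual_total == 0:
        raise ValueError("manual count window is empty")
    return 100 - abs(sensor_total - manual_total) / manual_total * 100

# Half a day of hourly tallies: the sensor is quietly undercounting.
sensor = [120, 180, 240, 260, 210, 150]
manual = [125, 190, 250, 270, 220, 160]
accuracy = counter_accuracy(sensor, manual)
```

Run against a half day of clicker data, a result in the mid-90s like this one is exactly the kind of quiet drift worth investigating.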
So just do a manual count for, you know, half a day or something, and compare it back. You can load your manual count in Dexibit alongside your people count, so you can compare the two data sets and check the accuracy of the actual hardware as well as the data that's coming out of the system. Next on the list. Pierre hasn't seen this list, so this is completely fresh. We're getting the live reactions. I've got: keeping data in the corner of the IT department. What I mean by this is, only the tech team are involved in onboarding and no other stakeholders are invited to business reviews.
They put off user training, and the internal message becomes something like, let us know what you want and we'll go and build that for you. This becomes a real agility trap where we don't achieve that democratized data vision. Users don't have the confidence to self-serve their own needs, data literacy and culture slow, and the end result is just an IT department with a long list of requests for reports and data going out the door, and that vision of data transformation really hasn't been met. These aren't in any order, but this would be one of the top ones on my list, I think, in terms of impacts down the road on whether we've achieved the objective.
Pierre: I like how you mentioned that. There are two metaphors that come to mind when you mention it. First, there is no perfect timing to present the data, right? Waiting for the perfect timing, for everything to be lined up and perfect, is almost like waiting for the perfect timing to have your first child, right?
There's no such thing: oh, I need a house, or I need more money, or I need a car. You are always gonna need more. You're always going to need to do something better before you do that. So do it now and then get feedback, right? Be pragmatic about it. The other thing is that composing a data structure is almost like composing a beautiful piece of music, right? You never know how good it's gonna be until you put it in front of people and they give you their feedback. You can write the most beautiful music in the world, but if no one listens to it, you're never gonna know how good it truly is. And you need people's feedback, right?
And how they feel. Same thing with data, same thing with structuring these dashboards. Bring it to the team when it's good enough to be put in front of the team, get people's feedback: yes, we like this; no, we don't like that; this we want to see in a different way. And then we can tailor to their needs.
Almost reverse engineering the process. Instead of saying, we're gonna do all this work here, put it in front of the team and happy days, it's more about, let's put a draft together. I agree it needs to be good enough to be put in front of people, but then we need to hear the feedback, right?
You almost need to sell the data structure to the team, and to sell, you need to listen, and to listen, you need to have some feedback.
Angie: Yeah, we have a release philosophy here at Dexibit for when we're putting out product, developing software, which is three words, and I think that makes it nice and simple and powerful: 'little, early and often'. So release something small, release it as soon as you can, and then iterate on it constantly once you get that feedback. People talk all sorts of things about agility and there are all sorts of frameworks in the world for mastering it, but 'little, early and often' I think describes the best way to really achieve it. And in data that can be, you know, why we'd make a big push for visitation.
Like, let's get a visitation dashboard out the door. Let's get it automated on a daily basis. Let's get it out to a big audience before we go integrating all the things and doing all the fancy stuff. Even small wins like that make big strides when it comes to bringing everybody on the journey along with you.
Pierre: What else is on the list, Angie? Tell me.
Angie: Um, spending six months on a data audit and assessment without making any forward progress. This is often something people do before getting started, and it turns into a largely theoretical exercise. Priorities and realities always end up being different once you get into it, and basically we're just delaying value while increasing cost. Realistically, very few people are going to really read the report they spent thousands of dollars on a consultant to write. Usually we find stakeholders actually need to be sleeves up in the data to know what they want and what their data needs, that idea that you only know what you need to know when you need to know it. Doing a brief audit is cool. That for us is often just a conversation: what systems have we got and what are you trying to accomplish? Maybe a workshop. But don't overanalyze, don't get trapped in analysis paralysis before getting going. Literally, let's just get into some data and get started, because it's through these conversations and through the work that everything is revealed, and people start to work out what they want and what they need.
Pierre: I feel like this is something that we've become really good at with the folks that we have in our team. We get thrown new systems very often, right, systems that we've never dealt with before. But the folks here that we work with are really good at contextualizing that data, picking up a field and saying, yes, this specific field is the same as that specific field in that system.
And we should call it, for example, date scheduled, right? Or date redeemed. Or this specific status means that the ticket was canceled, that it's not valid, or that it was bought as a gift, et cetera. There's a lot of this exercise of trying to understand and contextualize data from a source that an internal team may not have been exposed to before, or where they may make assumptions which are not in line with best practice or with what the data was first put in place for.
So I feel like there's that as well, which can take a lot of time going back and forth with teams in an internal organization. Without being pretentious about it, we've been exposed to so many different systems for such a long time that we know how things should be structured and looked at going forward. And that's how we save a huge amount of time when we integrate.
Angie: Yeah. We've got a thing at Dexibit, which is part of our software product, called a semantic layer. Basically, when we hook up to one of these vendors or source systems, we pick up every single attribute and give it a proper name, because often the field name in the source database is something really wonky. We give it a nice description to make it easy for people to work with that data, to understand what they're working with, plus any notes they need to know about it. We might give it some transformations: maybe it's in the wrong time zone, or the data just needs a bit of a cleanup, or needs some tags put into it, or maybe it needs to be enriched to really unlock some value, or converted and classified, things like that. Then we give that system some derived metrics to calculate dynamically the stuff it's missing. We'll give it some visualizations. And then we might map it into a domain and say, well, this is ticketing data, so it should look like this and get all these extra visualizations and templates. We make it conform to industry best practices as we see them in this data model. But it's essentially a huge pattern matching exercise, because even if it's something completely off the wall different, we can say, actually this is transactional data and therefore it should look like this and have these things.
Or, this is location data, so it should have these things and look like this. And we've got the benefit of having done this over and over again with hundreds of vendors and lots of different data sets. It's very, very difficult to do that with a data audit on a piece of paper.
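A semantic-layer entry of the kind Angie describes might be sketched like this; the source field names, labels and status codes below are hypothetical, not Dexibit's actual schema:

```python
# Hypothetical semantic-layer entries: rename wonky source fields,
# describe them, and normalise their values into a shared vocabulary.
SEMANTIC_LAYER = {
    "tkt_dt_sched": {
        "name": "Date scheduled",
        "description": "Date the ticket was scheduled for, local venue time.",
        "transform": lambda v: v,  # e.g. a timezone shift would go here
    },
    "tkt_stat": {
        "name": "Ticket status",
        "description": "Lifecycle status mapped to a common vocabulary.",
        "transform": lambda v: {"C": "cancelled", "R": "redeemed", "G": "gifted"}.get(v, "unknown"),
    },
}

def apply_semantic_layer(row: dict) -> dict:
    """Map a raw source record onto the named, normalised attributes."""
    out = {}
    for field, value in row.items():
        entry = SEMANTIC_LAYER.get(field)
        if entry:
            out[entry["name"]] = entry["transform"](value)
    return out

clean = apply_semantic_layer({"tkt_dt_sched": "2024-06-12", "tkt_stat": "C"})
```

The value of the pattern is that every downstream dashboard sees "Ticket status", never `tkt_stat`, no matter which vendor the record came from.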
And I don't want to be accused of being a hypocrite. We do have a data audit and assessment template on our website, but it's literally there to support a very brief conversation, not to spend months and months worrying about, because once you get into these things, everything will be different anyway. Speaking of money wasting, my pet peeve is data sovereignty, which is a whole other conversation in itself, but it's an example of where you can spend a ton of money on something that delivers very little value and you probably don't need. FedRAMP would be another: very expensive, won't actually change anything for anyone.
And it's really just compliance for compliance's sake in most cases, because we're not talking about PII or PCI data here, personally identifiable or credit card information; we're talking about business data, transaction data. Some of these things can be very expensive and very time consuming unless they are an absolute necessity, so just getting really pragmatic about them would be something I'd pop on that list too.
Pierre: Absolutely. Keeping the right context around the data that you are gathering and analyzing is one of the key things. And I agree with you, it can be quite a massive roadblock sometimes when we have teams or individuals that don't really follow or understand that context and what we're trying to do. Yeah, 100%.
Angie: And then if we zoom out a little, I've got here: making data a project, not a product. When you implement, say, a point of sale system, you set it up, get it configured, switch over, train everybody up, do your migration, and off you go. Data is totally different. And one of the things that we do at Dexibit that's quite unique is what we call a data concierge service.
We do usually monthly business reviews with our customers, where our success team will sit down and run through the numbers, talk about trends and patterns and insights and benchmarks and new features and new suggestions. And somebody from the stakeholder group attending those calls will say, well, I've got this question, or I've got this problem, or we're switching out this vendor, or this system, or this partner, or changing our pricing, or whatever the case might be. And we talk through the data ramifications of how to service that decision or that action. It's a moving beast and you have to move with it. It's not something that's one and done, where you just dust your hands and walk away. So treat it like a product that evolves, that's living and breathing and constantly learning and improving and will always have a roadmap.
And that doesn't mean that you've not finished onboarding. It means that this is something that's moving with your organization as it grows and changes.
Pierre: I think one of the sentences that still gives me nightmares is, 'we are integrating this new vendor.' And the question is, great, do they have an API? And you know what the answer may be: 'we don't know.' That sort of answer gives you nightmares, right? You don't know what you're in for when that happens.
So, yeah, 100%. I used to work with a lot of talented folks in a previous organization, and the motto was '1% done', right? We are always 1% done. And that's the same thing with your data: it's always 1% done. It is always evolving. It is always a moving project that you need to be agile around. And that's really, really important.
Angie: And probably related to that is the mistake of ignoring organizational change management. This is the silent killer, I think, of any kind of transformation, but particularly data, because there's a lot of fear around it. A common spot where this arises is qualitative evaluation, particularly in the age of AI, where people can have some very real fears about whether AI is going to replace their job of analyzing voice of the visitor.
So helping them reposition as an insight leader, champion and educator within the team and the wider organization is really important, and talking about quantitative and qualitative merging rather than the two being enemies of each other. Then there's gatekeeping, particularly around system administrators that have been there for a long time, that know all the rules and are the only person people can go to to get data from, and who keep the spreadsheet with all of the things. It comes up very occasionally, but it does come up.
There can be financial risk involved with that, in terms of fraud and theft, so it's a very real problem to address. And then there are other, more general areas where usage can otherwise be low: people just don't engage and keep making decisions the hard way, without data. So get into lots and lots of training. We don't cap our user training, and that's one of the reasons why: helping people with the storytelling aspect, helping them with the translation from business problem through to data analysis, and just consistently reinforcing great habits with the team. These sorts of things are really important for leaders. And if I go back to that Gartner finding on why data projects fail, lack of leadership support was one of their top reasons, right behind the integration one. That's the end of my list, Pierre.
Pierre: Yeah, no, that was great. And I absolutely resonate with that. A while back now, I had a conversation with a stakeholder. We were looking at a specific part of the product, and that person said to me, oh no, you know, you're looking at this wrong.
'But you wouldn't know that unless you were me.' And that sentence is really striking, right? One person in that organization was the only one who knew a rule that really should have been known by more than one person. And it's almost dangerous to have that knowledge isolated to one specific person.
Because anything could happen to that person, and then you lose that knowledge, right? So I agree: support of leadership, being transparent, documenting the changes, documenting the structure is really important to make it successful.
Angie: Is there anything missing from this list that you’d include in your top? I’m really putting Pierre on the spot here.
Pierre: When we purely speak about data and its availability, for me, my biggest one is: how granular is the data available to us? We've seen it many times, where the data is not granular enough when it's being sent to you, and then you're extremely limited in how you can use that data with other data sets that have a finer granularity.
An example would be footfall, right? We've been in a situation before where we had tickets being bought onsite flowing at an hourly granularity, and retail revenue flowing at an hourly granularity, and it's great. And then you have footfall data at a daily granularity, and that mismatch of granularity throws a lot of your insights out of the window, because you cannot match them together. You're missing a piece of the puzzle. You have all the data in, happy days, but because that data is not coming at the same granularity, you're actually missing insights. So that's one of the big ones for me.
You know, go as granular as you can, for two reasons. One is that, yes, on the insight side it's really important to be as granular as you can and have the same granularity across all the different sources that you deal with. The second is that if something goes wrong, or if you notice a problem, it is way easier to go down to a transactional data type than to data aggregated at a daily level, for example. We talked about accuracy very early on. If you have an accuracy that's off for whatever reason, you want to be able to go down to a transaction or an order, for example when it comes to retail and tickets, and break it down by order or transaction.
Because if you don't do that, you really start guessing at what is going wrong. So that would be a really big one for me: the accessibility and availability of that data at, we call it transactional here at Dexibit, but really as fine a grain as you can have.
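The asymmetry Pierre is pointing at is easy to demonstrate: transactional records can be rolled up to any coarser granularity, but a daily feed can never be broken back down to hourly. A minimal sketch with invented sales records:

```python
from collections import defaultdict
from datetime import datetime

def aggregate(transactions, granularity="hour"):
    """Roll transactional (timestamp, amount) records up to a chosen granularity."""
    fmt = {"hour": "%Y-%m-%d %H:00", "day": "%Y-%m-%d"}[granularity]
    buckets = defaultdict(float)
    for ts, amount in transactions:
        buckets[datetime.fromisoformat(ts).strftime(fmt)] += amount
    return dict(buckets)

# Invented transactional sales; one source, two possible granularities.
sales = [
    ("2024-06-12T09:15:00", 5.40),
    ("2024-06-12T09:45:00", 4.20),
    ("2024-06-12T14:05:00", 12.00),
]
hourly = aggregate(sales, "hour")  # still possible to match hourly footfall
daily = aggregate(sales, "day")    # once only this exists, the hourly view is gone
```

Keeping the transactional grain means every coarser view stays one `aggregate` call away; accepting a daily feed throws that option away permanently.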
Angie: Yeah. When Pierre is talking about transactional data, he's talking about, like: on Monday the 12th of June at 12:47 PM somebody bought a coffee, and it was a latte, and it was large, and it was $5.40, and the tax was this much, and they bought it together with a muffin, and it was blueberry flavor, and they chose to have it to go. It's that level of detail, versus: on Monday we made $5,347. So the granularity of that data is, yeah, I would agree, super, super important. And probably related to that: forecasting accuracy. I know that seems like a really left field leap, to go from coffees to forecasting accuracy, but judging forecasting accuracy on different granularities, you know, looking at a target for a month and going, oh, well, that's 94% accurate, and then looking at a daily forecast and saying, oh, that's only 87% accurate and therefore it's worse. Always keep in mind the level of granularity you're thinking about when you're talking about data or when you're talking about predictions.
And particularly when you're judging accuracy, an apples for apples comparison. That's a bonus one.
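Why accuracy figures at different granularities aren't comparable: daily forecast errors partly cancel in the period total, so the same forecast looks far more accurate at the monthly level than day by day. A sketch with invented numbers, using mean absolute percentage error:

```python
def mape(actual, forecast):
    """Mean absolute percentage error across paired observations."""
    return sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual) * 100

# A week of invented visitation: daily errors partly cancel in the total.
daily_actual   = [1000, 1200,  900, 1100, 1300,  800, 1000]
daily_forecast = [1100, 1100, 1000, 1000, 1400,  700, 1050]

daily_mape = mape(daily_actual, daily_forecast)              # roughly 9%
period_mape = mape([sum(daily_actual)], [sum(daily_forecast)])  # under 1%
```

The same forecast, judged daily versus judged on the period total, yields wildly different accuracy figures, which is exactly why the 94% monthly number and the 87% daily number above are not an apples for apples comparison.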
Pierre: Yeah, absolutely.
One last one for me, which is validation: data integrity needs to be acceptable before transformation, if that makes sense. We do wanna make sure that we are dealing with the right data before we start transforming it and bringing insights into play.
For example, when we talk about footfall, a common metric that we track is how many visitors came to an exhibition, either comparing footfall visitation against footfall entrances to a specific gallery, or against the number of tickets that were scheduled for that exhibition.
Or, for example, out of X amount of people who visited the museum, how many entered a retail store. But you wanna validate the first two data points that you are dealing with before transforming and having the insights. So you want to make sure that, one, you're counting people correctly within opening hours, and on both sides: opening hours at the museum and opening hours at the store.
Then once you have that validated, you can transform it into a ratio, for example: X amount of people that entered the museum also visited the store. But don't skip that process of validating the data before you transform it or do anything with it. So that's one of mine.
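The validate-then-transform step Pierre describes could be sketched like this; the opening hours, counts and checks are invented for illustration:

```python
def capture_ratio(museum_entries, store_entries, open_hours):
    """Ratio of museum visitors who also entered the store.

    Both inputs are validated first: only entries inside shared opening
    hours are kept, and the store count can never exceed footfall.
    """
    valid_museum = sum(c for h, c in museum_entries if h in open_hours)
    valid_store = sum(c for h, c in store_entries if h in open_hours)
    if valid_museum == 0:
        raise ValueError("no footfall inside opening hours; check the counter")
    if valid_store > valid_museum:
        raise ValueError("store count exceeds footfall; validate the sources")
    return valid_store / valid_museum

open_hours = set(range(9, 17))                      # 9am to 5pm
museum = [(8, 12), (9, 300), (10, 450), (16, 200)]  # (hour, count); 8am is pre-opening noise
store  = [(9, 90), (10, 120), (16, 50), (18, 7)]    # 6pm entry is outside shared hours
ratio = capture_ratio(museum, store, open_hours)
```

The point is the ordering: the out-of-hours rows are rejected and the sanity checks run before the ratio is ever computed, so a broken counter surfaces as an error rather than as a quietly wrong insight.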
Angie: Yeah, the same is true for forecasting as well, to come back to that topic, not to harp on about forecasting, but I was in a customer meeting earlier this week where we pulled out the beautiful new forecast we'd just spun up, and then went, oops. The visitation data has come out incorrectly on this, this and this.
We actually need to go back and reevaluate some of our business rules. And of course, all of the forecasts were then thrown off, and we had to constantly reinforce the point that, by the way, these were trained on data that was apparently missing these things. Now we know, and we'll go and address that. But yeah, again, another example of Pierre's point about validating the input that's going into a machine learning model when training it up, rather than expecting it to magically know where the input data is wrong. Very important. We could probably come up with a whole other list just for AI as well, I imagine. Speaking of machine learning, top of my list there would be: don't spend several years writing an AI policy before you dip your toe in the water. But there are probably another 7, 8, 9, 10 lessons learned from that as well.
So we'd love to come back for those in another episode.
Pierre: Yeah, one hundred percent. I think the moral of the story is: is it gonna be perfect? Yes, one day. Let's start now. Let's be pragmatic and enjoy the journey. We may make mistakes along the way; we'll fix them. And it's the journey that counts, right? We'll have what we want at the end, but don't be in too much of a hurry. Essentially, get the basics right and then build from that.
Angie: Well, if you're just setting out on your data journey, I hope these help you in some way prepare for some of the realities and avoid the common traps for new players. If you've been through data transformation yourself, we'd love to hear about your own experiences, or data disasters to add to ours so we don't feel quite so alone. Share them with us. Share this episode with your colleagues and your friends, and we'll see you next time.
If your goal is to get more visitors through the door, engaging and spending more, leaving happy and loyally returning – check out Dexibit’s data analytics and AI software at dexibit.com. We work with visitor attractions, cultural and commercial, integrating with over a hundred industry source systems across visitor experience and venue operations, providing dashboards, reports, insights, forecasts, data management and a unique data concierge.
Until next time, this is Dexibit!