Build vs buy: Calculating the total cost of ownership for big data analytics

If you’re planning a big data analytics program for your visitor attraction, there are many aspects of a holistic solution and approaches to consider for success. Without the right planning up front, data projects can quickly become unwieldy – or simply fail. In the wider technology sector, 85% of big data analytics projects in corporations fail due to integration complexity, data governance challenges, a lack of expertise and the tricky business of leading cultural change.

Setting out to develop a big data analytics solution in-house – building a complete data technology stack and internal data capability yourself – is a big undertaking. Watch out for the complexity under the surface, as a business intelligence tool is only the tip of the iceberg. Be prepared to set aside a significant budget and, most importantly, a lot of time.



If you’re comparing Dexibit’s product against building a custom data solution in-house, here’s a shopping list of what you’ll need to consider:


1. The data solution

Building and maintaining integrations

Getting to good data is a big chunk of the data challenge and a huge amount of the work happens under the hood of the data stack. Expect to incur most of your initial costs here, and know it will take longer and cost more than the team planned. For all the various systems you source data from (and a few new contextual data sources you’ll need to buy, like weather), each will need an integration developed, tested and, of course, constantly monitored and maintained. Going without data automation puts success at risk – the more effort your team have to put into getting data, the less energy they’ll spend acting upon it.

You’ll also need some means of managing those integrations – scheduling when to tap what system. You’ll need to come up with a way to handle things like refunds, which change the underlying data since it was last ingested. And a way to track failures (or worse, partial failures) and fix these quickly.
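To make the refund problem concrete, here’s a minimal sketch of one common approach: re-pull a window of recent transactions on each scheduled run and merge them by transaction id, so an amended record simply overwrites the stale copy. All field names and the function itself are illustrative, not any particular ticketing system’s API.

```python
# Idempotent "upsert" of ticketing records, keyed by transaction id.
# Re-pulling a recent window (rather than only brand-new records) means
# refunds and amendments are picked up automatically on the next run.

def upsert_transactions(store: dict, incoming: list[dict]) -> dict:
    """Merge freshly pulled records into the store; last write wins."""
    for record in incoming:
        store[record["transaction_id"]] = record
    return store

# Day 1: a ticket sale is ingested
store = upsert_transactions(
    {}, [{"transaction_id": "T100", "amount": 25.0, "status": "paid"}]
)

# Day 3: the same transaction reappears as a refund and replaces the stale row
store = upsert_transactions(
    store, [{"transaction_id": "T100", "amount": 25.0, "status": "refunded"}]
)

print(store["T100"]["status"])  # refunded
```

The same idea scales up to a proper merge (or “upsert”) statement in a warehouse; the hard part in practice is choosing the re-pull window and detecting when a source system changes history outside it.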


Managing a data lake or warehouse and data pipelines

All that data will need to go somewhere. You’ll likely want a combination of a data lake and a warehouse, to get the best of both worlds: provenance and performance. Somewhere in between, you’ll need a data pipeline to clean and transform data ready for use. For example, getting any kind of numbers out of raw ticketing data is never a simple affair. Location data from WiFi is worse – it will contain all sorts of noise, such as staff presence, fixed equipment and passersby. Rather than beautiful trails, it will be indecipherable in its raw format (presence data is a complex domain that will require a lot of research and testing – at least 12–18 months’ worth, even with staff dedicated just to this source). Even something as simple as footfall data will need business rules applied and managed over time. Preferably all this will sit in the cloud, which means hosting too (along with advanced data governance to manage security and more).
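As an illustration of the kind of business rule involved, here’s a minimal sketch of one common presence-cleaning heuristic, assuming raw WiFi pings arrive as (device id, timestamp) pairs: devices seen for only a moment are likely passersby, while devices present nearly all day are likely staff or fixed equipment. The thresholds and field names are purely illustrative; real presence cleaning involves many more rules than this.

```python
# Filter raw WiFi pings down to likely visitors using dwell time.
# Pings are (device_id, timestamp_in_minutes) pairs, ordered by time.

def filter_visitors(pings, min_dwell_mins=5, max_dwell_mins=600):
    first_seen, last_seen = {}, {}
    for device_id, ts_minutes in pings:
        first_seen.setdefault(device_id, ts_minutes)  # first ping wins
        last_seen[device_id] = ts_minutes              # latest ping wins
    visitors = set()
    for device_id in first_seen:
        dwell = last_seen[device_id] - first_seen[device_id]
        if min_dwell_mins <= dwell <= max_dwell_mins:
            visitors.add(device_id)
    return visitors

pings = [
    ("phone-a", 600), ("phone-a", 645),   # 45 min dwell: a visitor
    ("phone-b", 540), ("phone-b", 1200),  # 11 hours: likely staff or fixed kit
    ("phone-c", 610), ("phone-c", 611),   # 1 min: a passerby
]
print(filter_visitors(pings))  # {'phone-a'}
```

Even this toy version shows why the research takes so long: every threshold is a judgment call that needs validating against ground truth for each venue.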


Adding value to data and presenting it 

Once you’ve got good data, you’ll want to add value to it rather than just report on history, which means utilizing artificial intelligence – machine learning models, natural language processing and advanced analytics to predict and analyze your visitors’ behavior. Each will need to be researched, developed, tested and continuously improved over time (expect this to take at least 6–12 months for each). Plus, you’ll want to keep up with the latest innovations so your data solution finds new ways to enable your visitor attraction; otherwise it will become stagnant.
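To give a sense of where predictive work starts, here’s a minimal sketch of the simplest possible visitation forecast: average footfall by day of week. Real models layer in weather, holidays, events, trend and much more, but a seasonal baseline like this is the benchmark any machine learning model has to beat. All numbers and names are illustrative.

```python
# Seasonal-naive visitation baseline: forecast each weekday as the
# historical mean of that weekday's footfall.

from statistics import mean

def weekday_baseline(history):
    """history: list of (weekday_index, visitors). Returns per-weekday mean."""
    by_day = {}
    for weekday, visitors in history:
        by_day.setdefault(weekday, []).append(visitors)
    return {day: mean(values) for day, values in by_day.items()}

# Illustrative history: weekday 5 = Saturday, 6 = Sunday, 0 = Monday
history = [(5, 1200), (5, 1300), (6, 1500), (0, 400), (0, 360)]
model = weekday_baseline(history)
print(round(model[5]))  # 1250, the forecast for next Saturday
```

The gap between this baseline and a model that accounts for weather, school holidays and programming is exactly the 6–12 months of research, development and testing described above.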

Then, you’ll finally be ready for the last mile – getting all that goodness to your users. This means a tool for dashboards, reports and visualizations (or more than one, because your technical team’s requirements will differ from those of business users). The tooling itself might not be tricky, but you’ll spend lots of time customizing it for staff who are less confident with data and working out what’s best for different departments or use cases. Especially when it comes to visualizations – assume the vast majority of your users will not be comfortable creating a graph and will instead need these built for them. Experiment to see what works and keep iterating as requests come in (so make sure your build phase continues through at least the first year of operation).


2. The program office

So far, we’ve only covered the technical solution. To implement all this, you’ll need a technical project manager, a business analyst to capture data and user requirements, and someone managing organizational change (such as leading education and training) – ideally experienced in both data projects and the visitor attractions business model. You’ll need to balance your resourcing across the roadmap of new requests you receive from the business, the needs of your technology upkeep and important investments such as security. On the technical side, you’ll need a combination of data engineering for integration, ingestion and transformation; data science for modeling; and infrastructure and development or implementation skill sets, depending on how much customization your visualization tool requires. You’ll also want someone to document everything. Straying from technical skill specialization will compromise the result. Aside from the fact these resources are expensive, they’re also difficult to recruit.


3. Ongoing operations

Data isn’t a one-off project that you’ll do once and be done with. It’s a change to how your team make strategic and operational decisions. Integrations will need to be maintained, ingestions monitored, integrity checked, security protected, models watched and improved. You’ll want to make sure your team have support on hand – everything from ‘how do I change my password?’ to ‘how can we grow visitation?’ – in a way where knowledge is not at risk when a key person goes on vacation or leaves. You’ll also need new features and functionality delivered to keep enabling your team’s innovation, so your data program doesn’t get left behind within a few years, given the rapid pace of technology. Remember: for every dollar you spend building technology, you’ll spend many more keeping it current.

The one thing that’s hardest to put a price on is time. If you’re starting from scratch, getting everything in place to enable your team with insight will soak up months and drag out delivery. Learning pitfalls and mistakes the hard way takes longer and, ultimately, those unknowns pose the greatest risk to the project’s success. This will likely be a multi-year journey with a dedicated project team to get right.

If it sounds hard, that’s because it is. Big data analytics and the application of artificial intelligence are complex if you’re starting from scratch. If your head and budget are hurting at the thought, don’t go it alone! At Dexibit, we’ve worked with visitor attractions big and small, in every segment of the industry, the world over. Designed specifically for visitor attractions, our full stack, turnkey cloud solution predicts and analyzes visitor behavior, backed by a unique data concierge. We know what to watch out for, when to pull what levers and how to get to success.

Learn more about Dexibit’s features (which include forecasts, insights, dashboards and reports), or book a demo with our team today.