As the world rolls into another year, many of its cultural institutions will find themselves reaching an important checkpoint: the final stretch of the strategic plan to 2020. Aspirations such as growing diverse, omnichannel engagement in reach, depth and conversion; achieving operational efficiency, governance transparency and funding retention; or pushing innovation and digital enablement will all point towards embracing data in the years ahead. But what will that look like for museums?
At Dexibit, predicting ‘the future of history’ is our business, using big data analytics and artificial intelligence. To find out more about the pivotal year of 2018 for museum analytics, we talk to the Dexibit ‘A’ team – Adam McNabb, Lead Developer; Alex Garkavenko, Head of Product and Adnaan Velji, Data Engineer – about what the new year will bring.
What’s the vision for Dexibit, a year ahead?
@Adam: Our strategy at Dexibit right now is titled ‘Ka mura, Ka muri’, a proverb which speaks to ‘walking into the future facing backwards’. It’s a fitting phrase for the museum sector and particularly relevant for Dexibit in our realm of leveraging machine learning to learn from a museum’s past to predict where it’s headed next. In our product, that means using big data analytics for historic reporting and in-the-moment dashboards, plus forecasting too. We see Dexibit a bit like the nervous system of a museum – integrating signals, anticipating change, transmitting recommendations. Our vision is to bring together the museum team to provide just the right insight – at just the right time – for strategic and operational decisions alike. Our mission is to achieve this seamlessly, intuitively and dynamically for cultural institutions the world over, no matter their size or level of data confidence.
What sorts of trends are you seeing in musedata?
@Alex: In 2017, we saw broad agreement amongst museum executives that the industry’s leadership style would benefit substantially from being more data informed – at a technology level, that means having faster, easier and deeper access to insight. Our challenge ahead for 2018 is to make sure that’s on everyone’s radar for museum technology investment and to provide museums large and small with tangible steps to get there. What’s fascinating to me right now is that big data analytics in museums is ‘crossing the chasm’ from early adopters through to a mainstream initiative. Big data is no longer a buzzword, and analytics is no longer a scary word. With that, we’re also seeing a lot more comfort around data governance (like security, privacy and sovereignty) after good quality debate in the sector recently.
What’s happening in the wider big data analytics industry?
@Adnaan: With the wider recognition and adoption of big data analytics, ‘Insight as a Service’ is becoming more important. In the past few years, we’ve seen a lot of companies across industries going out and making large investments in data teams and infrastructure, which is great. At the end of the day though, what companies and organizations really need are actionable, insight-based recommendations, which can be delivered through the cloud, without having to deal with the cost and complexity of an in-house environment.
@Alex: Sure, artificial intelligence is a deeply complex area of technology, but in my opinion, it should be exceptionally easy for the end user in how we design it. This means point-and-click analytics, or using natural language technology, plus providing proactive recommendations. It’s important if we’re to move from data being one person’s job to being a natural part of everyone’s role.
How will machine learning play an increasingly important role in Dexibit?
@Adnaan: In 2017, we launched the first of our machine learning models in a new Forecasting module for predictive analytics, and there are more of these models on their way to predict all sorts of indicators of visitor experience and museum performance. As one model became two, two became five, and five became more, we’ve been thinking a lot about how to standardize and scale the underlying machine learning infrastructure within Dexibit, enabling all our data science projects while decreasing their time to market – the result is something we’ve nicknamed ‘Leonardo’. The data team at Dexibit is also working on using machine learning to manage data integrity and quality, and ultimately, build user trust – the key to success for any application of machine learning.
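Dexibit’s actual forecasting models aren’t described here, but to make the idea concrete, a visitor-attendance forecast can be sketched with a simple seasonal-naive baseline: predict each weekday from its historical average. The function name and the sample figures below are illustrative assumptions, not Dexibit’s implementation.

```python
from collections import defaultdict
from statistics import mean

def forecast_next_week(daily_visits):
    """Seasonal-naive forecast: predict each weekday's attendance as the
    historical mean for that weekday. `daily_visits` is a list of
    (weekday_index, visit_count) pairs, weekday_index 0-6 from Monday."""
    by_weekday = defaultdict(list)
    for weekday, count in daily_visits:
        by_weekday[weekday].append(count)
    return {day: round(mean(counts)) for day, counts in sorted(by_weekday.items())}

# Two weeks of hypothetical observed attendance for a small venue
history = [
    (0, 310), (1, 290), (2, 305), (3, 330), (4, 400), (5, 620), (6, 580),
    (0, 330), (1, 310), (2, 295), (3, 350), (4, 420), (5, 640), (6, 600),
]
print(forecast_next_week(history))  # weekend days forecast highest
```

A production model would layer on holidays, weather, exhibitions and marketing signals, but a baseline like this is the yardstick any richer model has to beat.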
What are you really looking forward to in Dexibit’s 2018 roadmap?
@Adam: My personal favorite for the 2018 plan is getting more proactive in how we help users navigate insight and recommendations. I think for a lot of users coming into an analytics platform, the sheer amount of data available can be overwhelming; it leads to a ‘where do I start?’ conundrum. Our approach to this problem is to pick out what will be of most interest to each user and when, then to learn from what they like.
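One minimal way to “learn from what they like”, sketched here as an assumption rather than Dexibit’s method, is to keep a per-category preference score updated by an exponential moving average of the user’s feedback, and surface candidate insights in preference order:

```python
def update_preference(score, liked, alpha=0.3):
    """Exponential moving average of feedback on an insight category:
    the score drifts toward 1.0 when liked, toward 0.0 when skipped."""
    return (1 - alpha) * score + alpha * (1.0 if liked else 0.0)

def rank_insights(insights, preferences):
    """Order candidate insights by learned category preference, highest
    first; unseen categories get a neutral 0.5 prior."""
    return sorted(insights,
                  key=lambda i: preferences.get(i["category"], 0.5),
                  reverse=True)

prefs = {"attendance": 0.5, "revenue": 0.5}
# The user liked two attendance insights and skipped a revenue one
prefs["attendance"] = update_preference(prefs["attendance"], liked=True)
prefs["attendance"] = update_preference(prefs["attendance"], liked=True)
prefs["revenue"] = update_preference(prefs["revenue"], liked=False)

candidates = [
    {"title": "Weekend revenue dipped 8%", "category": "revenue"},
    {"title": "Attendance up 12% vs. forecast", "category": "attendance"},
]
print([i["title"] for i in rank_insights(candidates, prefs)])
```

The smoothing factor `alpha` controls how quickly the ranking adapts; a small value keeps one dismissed insight from burying a whole category.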
How do you plan out R&D for the year ahead?
@Alex: From a product perspective, my job is to listen to all the conversations we’re having with museums and then juggle short-term priorities around data visualization and integration requirements; mid-term objectives for user experience and extensibility enhancements; and long-term strategy around artificial intelligence innovation. For Dexibit, the latter is all about being able to predict the future for cultural institutions, and increasingly provide personalized recommendations to help cultural executives harness this. In 2017, we took a big step forward with the launch of our new Forecasting module. You can also already see the seeds of where we’re taking this next, such as the concierge messaging feature, to which we’ll be adding natural language and bot automation for proactive recommendations and Q&A-based enquiry. This is representative of how we introduce new innovation into Dexibit – what we call ‘Minimum Viable Products’ (MVPs) to test out ideas incrementally in a lean and agile manner as we build new functionality.
@Adnaan: Day to day, as various museums come onboard, we have to integrate with their systems and pull in their data, which sometimes involves building new integration packs if their vendors are new to our ecosystem. The exciting part though is stepping back from day to day operations and thinking strategically about how we can use data science to really power up what we’re doing, from ingestion to insight. For the most part, this is more organic than a features roadmap, largely driven by the growing amount of data with which we work; we keep an eye on the next tipping point as we scale, so that any evolution in our infrastructure addresses challenges before they become consequential. At the same time, we also have to keep an eye on what innovations the product team has on their roadmap, so we can make sure that we have the infrastructural and data capabilities to enable them to keep pushing.
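The “integration pack” idea above can be pictured as an adapter per vendor that maps that vendor’s raw export into one common visit-record schema. The vendor names and field layouts below are hypothetical, purely to show the pattern:

```python
from datetime import datetime

def normalize_vendor_a(raw):
    # Hypothetical vendor A: ISO dates plus a quantity field per row
    return {"date": raw["visit_date"], "visits": int(raw["qty"])}

def normalize_vendor_b(raw):
    # Hypothetical vendor B: US-style dates, one row per ticket
    date = datetime.strptime(raw["Date"], "%m/%d/%Y").date().isoformat()
    return {"date": date, "visits": 1}

# The "integration pack" registry: one adapter per ticketing system
ADAPTERS = {"vendor_a": normalize_vendor_a, "vendor_b": normalize_vendor_b}

def ingest(vendor, raw_records):
    """Run every raw record through the vendor's adapter; an unknown
    vendor raises a KeyError rather than silently passing bad data."""
    adapter = ADAPTERS[vendor]
    return [adapter(r) for r in raw_records]

print(ingest("vendor_b", [{"Date": "01/15/2018"}, {"Date": "01/16/2018"}]))
```

Onboarding a new ticketing, membership or point-of-sale system then means writing one new adapter and registering it, leaving everything downstream of the common schema untouched.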
@Adam: Collaborating with our product team, who have their finger on the pulse of what our clients and the museum industry need, I take a hard look at what the challenges are in delivering features and functionality. This usually involves researching and architecting technical solutions well ahead of us encountering those challenges, so we can be proactive in identifying them. As we look ahead to 2018, we’re evaluating priorities with product owners and applying this process to help ensure we delight our museums and build trustworthy insight. To deliver this, we take our entire roadmap and break it into quarterly plans, which then divide into six ‘sprints’, each lasting two weeks. As Alex said, this agile method allows us to get new work into our users’ hands as quickly as possible. We learn what works, what doesn’t and continually improve at pace – from ingestion to visualization and everything in between.
For more on Dexibit’s 2018 roadmap, tune in for our preview webinar.