Connected devices offer additional opportunities to interact with guests before, during and after their stay, but at the same time they create opportunities for big disconnects. Let’s take the recent example of a colleague: “Tom.” Tom made his reservation online using his loyalty card number and his name, Tom A. Jones. When he went to use his mobile device to check in to his room on arrival at the property, the reservation was missing in action. After several interactions with guest support, it was discovered that his reservation was missing from the mobile app because the name on his loyalty card (Tomas A. Jones) did not exactly match the name on his reservation (Tom A. Jones). My friend Tom, who is passionate about data quality, was astounded that this was even an issue. If our goal is to take actions based on connected device data that improve the guest experience and our own profitability, how can we be successful if we don’t really know who that guest is?
While there are many innovative uses for connected technology in hospitality and gaming, we have to stop and think – are we ready to capitalize on the data coming from connected devices? First, it is no small feat to understand the multiple sources of connected data and bring together this data in a way that helps you make sense of it. Where should you start? Can you link data coming in from your keycards or mobile keys with your customer relationship management system? Second, even traditional channels such as guest profile and loyalty program data can be fraught with problems in the form of missing and duplicated data – just as we saw with Tom’s challenge earlier.
No hospitality or gaming company wants to base their marketing strategies on unsound data, but to overcome these challenges, you need a way to pull all of the data related to the guests into one ecosystem, whether it is streaming, virtualized or an actual data warehouse. Using data integration and data quality capabilities is a great place to start. Data integration helps you consume the connected device data that you have coming in. Data quality helps you match that data with your guest profiles. You can use the same approach with your at-rest data sources, bringing together data from disparate sources across your portfolio regardless of the kinds of systems they are held in.
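Matching incoming data against guest profiles is where mismatches like Tom’s get resolved. As a minimal sketch (not any particular vendor’s matching engine), a fuzzy string comparison can link records that an exact-match lookup would reject; the names and the 0.85 threshold here are purely illustrative:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two normalized name strings."""
    norm = lambda s: " ".join(s.lower().replace(".", "").split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def is_same_guest(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two name strings as the same guest above a similarity threshold."""
    return name_similarity(a, b) >= threshold

# An exact-match system rejects this pair; a fuzzy comparison links them.
print(is_same_guest("Tom A. Jones", "Tomas A. Jones"))
```

A production matching routine would also weigh loyalty numbers, email addresses and other identifiers, but the principle is the same: score similarity rather than demand equality.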
Robust data quality routines can ensure that you are never in the position of having a database full of guests you do not know or understand. Once you have solid data management in place, you can start to bring in new sources of data and analytics. For example, you can match what you know about a customer from social media with offline data and blend this with location data. The common denominator is to have the data quality and data matching in place to gain and maintain an accurate picture of your guests.
At the convergence of data quality, data integration and data management is data governance. Data governance is a set of processes that manage data assets across the enterprise. Whether you consider online and offline data solely for marketing or use it to benefit your entire organization – since the data gathered is touched by so many departments – it is critical to appoint a cross-functional data governance team to manage your data as an asset.
Connected guests are generating petabytes of data every day and at the same time, hospitality and gaming companies are discovering new ways to engage with this connected guest. In order to maximize your marketing impact, the best way forward is to synchronize marketing and service interactions with your guests based on a comprehensive understanding of the connected guest data. All of this data is of no use unless you can turn it into information and insight. For this you need analytics.
Analytics can be broadly categorized into two main groups, descriptive and predictive. Descriptive analytics are what’s commonly known as business intelligence. These analytics help describe what is happening in the data through metrics like percentages or averages, and are typically displayed in static reports or dashboards. Descriptive analytics answer questions like “how many?” “where?” or “what happened?” If your organization struggles to access data, the capability to access descriptive analytics – slice and dice and drill down – can seem like a revelation, but it still provides only a snapshot of what happened in the past.
Predictive analytics help to anticipate trends and foresee opportunities. Predictive analytics use historical data to predict the future, answering questions like, “what if these trends continue?” “why is this happening?” and “what’s the best that can happen?” These analytics allow managers to proactively plan for the future, hedging against risk and taking advantage of opportunities. You can think of descriptive analytics as supporting reactive decision-making, whereas predictive analytics moves you to proactive decision-making.
There are several categories of predictive analytics, including the following:
• Statistical modeling: Statistical modeling helps you understand “why” trends in your data are happening by identifying which factors have a relationship with each other and how much they influence key measures. Correlation, hypothesis testing and regression are common statistical techniques. Statistical modeling is concerned with identifying and understanding relationships.
• Predictive modeling: With predictive modeling, historical data, current conditions and key demographic variables are used to predict behavior or outcomes. Predictive modeling is typically used when you already have some idea of the relationships or have a well-defined goal or usage for the results. This category also includes regression, segmentation modeling and customer lifetime value calculations. These predictive models are frequently used in the production environment, meaning that they are used for routine decision-making (predicting campaign response rates or assigning new guests to a segment) as opposed to exploratory analysis.
• Data mining: This set of techniques is becoming more common as data sets grow. Data mining is useful to detect patterns in large data sets and either describe those patterns or use the patterns to predict outcomes. Data mining is used when you do not know what you are looking for or what outcome you are expecting and can be either descriptive or predictive. Common data mining techniques include decision tree and clustering algorithms. Frequently, data mining is used to uncover patterns or trends that are later combined to build a formal predictive model to be used in a production environment.
• Forecasting: These models use historical patterns and current conditions to predict a future state or trend (revenue, demand, guest counts, etc.). There are many different kinds of forecasting models designed to suit different kinds of problems. For example, a different method might be used if there are trends or seasonality in your data than for data that is relatively stable.
• Optimization: An optimization problem is a mathematical problem that calculates the best possible answer or outcome, considering operating conditions and business constraints. An optimization problem has a defined goal (maximize revenue, minimize labor costs), which is subject to some constraints (for example, the capacity of the hotel, expected demand, a budget), and the output is a set of decisions that will best solve the goal while respecting the constraints (the price to charge by room type or the number of employees to schedule per shift).
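The statistical modeling category above mentions correlation. As a minimal sketch with made-up data, a Pearson correlation coefficient quantifies how strongly two guest measures move together (here, a hypothetical relationship between nights stayed and total spend):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: nights stayed vs. total spend for six guests.
nights = [1, 2, 3, 4, 5, 6]
spend = [120, 260, 310, 450, 520, 700]
print(pearson_r(nights, spend))  # close to 1.0: strong positive relationship
```

A value near +1 or -1 signals a strong linear relationship worth investigating; a value near 0 suggests the two measures are unrelated.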
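The data mining category above mentions clustering algorithms. A toy one-dimensional k-means, run on hypothetical guest spend figures, shows how segments can emerge from the data without being specified in advance:

```python
def kmeans_1d(values, k, iters=20):
    """Minimal 1-D k-means: iteratively assign values to the nearest center,
    then move each center to the mean of its cluster."""
    # Spread initial centers across the sorted values.
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical nightly spend: two segments emerge without being predefined.
spend_values = [20, 25, 30, 400, 410, 420]
centers, clusters = kmeans_1d(spend_values, k=2)
```

Production clustering would use more features and a library implementation, but the pattern-discovery idea is the same.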
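The forecasting category above notes that different methods suit trending versus stable data. Simple exponential smoothing is one of the most basic methods for relatively stable series; this sketch uses invented daily guest counts:

```python
def exponential_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: the forecast is a running blend of the
    latest observation (weight alpha) and the previous smoothed level."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level  # one-step-ahead forecast

# Hypothetical daily guest counts for the past week.
guest_counts = [100, 110, 120, 130]
forecast = exponential_smoothing(guest_counts)
```

Series with strong trend or seasonality would call for richer models (e.g. Holt-Winters), exactly the method-selection point made above.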
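The optimization category above describes a goal maximized subject to constraints. A deliberately tiny sketch: choose the revenue-maximizing room price from a set of candidates, with hotel capacity as the constraint and an invented linear demand curve standing in for expected demand:

```python
def best_price(prices, demand_fn, capacity):
    """Pick the candidate price that maximizes revenue, where rooms sold
    are capped by hotel capacity (the constraint)."""
    def revenue(p):
        return p * min(demand_fn(p), capacity)
    return max(prices, key=revenue)

# Hypothetical linear demand: 200 rooms demanded at $0, one fewer per $1.
demand = lambda p: max(0, 200 - p)

optimal = best_price([60, 80, 100, 120], demand, capacity=120)
```

Real revenue management optimizers solve far larger problems (prices by room type, date and segment at once), but the structure of goal, constraints and decision output is the same.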
When we factor in data flowing from the connected guest, whether it is mobile data such as location or data from devices such as iBeacons or NFC, we need to think about applying analytics differently. Data collected by sensors and devices is now available on a real-time streaming basis. This provides the opportunity for us to learn from trends in this data quickly and act on those trends within moments. In order to take advantage of streaming data, we need to reconsider how we use the analytics I have just described. In most cases we can use the same analytics, but we need to think about where we apply these analytics: on the edge, at rest or in the middle. Let’s explore:
• Analytics on the edge: the application of analytics at the specific device or sensor
• Analytics at rest: data pulled out of the stream and used for high-performance analytic model development
• Analytics in the middle: the application of analytics taking place on the data as it is streaming. You may have also heard this referred to as “the fog,” and it can be a combination of the streaming data itself enriched with stored data such that we can detect more complex events sooner.
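The “analytics in the middle” idea above can be sketched as a continuous computation applied to readings as they arrive. This toy example (hypothetical beacon dwell times in seconds, not a real streaming engine) keeps a sliding window of recent values and flags outliers in-stream, the moment they occur:

```python
from collections import deque

def streaming_alerts(stream, window=5, threshold=1.5):
    """Sketch of in-stream analytics: maintain a sliding window of recent
    readings and flag any value far above the window average on arrival."""
    recent = deque(maxlen=window)
    alerts = []
    for value in stream:
        if len(recent) == window and value > threshold * (sum(recent) / window):
            alerts.append(value)  # act within moments, not after batch load
        recent.append(value)
    return alerts

# Hypothetical beacon dwell times (seconds); the 300 is the anomaly.
readings = [30, 32, 29, 31, 30, 300, 28, 31]
print(streaming_alerts(readings))
```

The same logic enriched with stored guest-profile data is what lets “the fog” detect more complex events sooner, as described above.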
When you are accommodating data from the connected guest, the optimal analytics experience becomes a multi-stage analytics experience. It includes continuous queries on data in motion and at the edge, with results updated in increments. This new process moves analytics from centralized data warehouses to the data stream itself using edge analytics, giving us analytics that are closer to the occurrence of events. Multi-stage analytics is the application of the right analytics at the right time in the right place to the right data, which is what you need to explore and capture the value of the connected guest.
Natalie Osborne is senior industry consultant for SAS Institute’s Hospitality and Travel practice, and an 18+ year veteran of hospitality and hospitality technology solutions development, specializing in analytics and revenue management. Prior to joining SAS, Natalie was the director, product marketing for Minneapolis-based IDeaS Revenue Solutions, where she worked from 2000 to 2011. She is a frequent contributor to industry publications, speaker at industry conferences and is co-author of the SAS and Cornell Center for Hospitality Research blog, “The Analytic Hospitality Executive.”