Big Data In The Hotel Industry

Big Data is making its way into every industry imaginable, and the hotel industry is another perfect example of how effective use of analytics can dramatically change how a business is run. Hotels collect many types of data, including video, audio, and web data, and the volumes of each are huge. Yet many hotels sit on all of this data without acting on most of it, if any.

Many hotels gather loyalty information about their guests but fail to take it a step further. This, to me, is a perfect opportunity for any and all forms of predictive analytics.

Hoteliers could exploit their data and deepen their knowledge of guests in order to develop a more granular understanding of segment behavior, needs, and expectations; identify profitable customer segments and their buying preferences; and identify opportunities to attract new guests.

In this industry, understanding guest preferences, purchase behavior, and profit potential can dramatically increase brand loyalty and a hotel's share of wallet among its most valuable guests.

One key factor hotels must keep in mind throughout this process is which guests they are targeting. Marketing to frequent guests who are likely to take advantage of other hotel amenities generates much more overall profit than marketing to guests who are just passing through. By catering to their most loyal guests' preferences, hotels will find themselves staying ahead of trends, setting strategies, and achieving their goals, and that leads to happier guests and more money in the bank.
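As an illustration, a simple recency-frequency-monetary (RFM) score is one common way to rank guests by loyalty and value, in the spirit of the targeting described above. A minimal sketch, with hypothetical guest records and thresholds:

```python
# A minimal RFM sketch for ranking hotel guests by loyalty and value.
# All guest records, dates, and thresholds are hypothetical illustrations.
from dataclasses import dataclass
from datetime import date

@dataclass
class Guest:
    name: str
    last_stay: date
    stays_per_year: int
    total_spend: float  # room plus amenities (spa, dining, etc.)

def rfm_score(g: Guest, today: date = date(2016, 1, 1)) -> int:
    """Score recency, frequency, and monetary value 1-3 each and sum them."""
    days_since = (today - g.last_stay).days
    recency = 3 if days_since <= 90 else (2 if days_since <= 365 else 1)
    frequency = 3 if g.stays_per_year >= 6 else (2 if g.stays_per_year >= 2 else 1)
    monetary = 3 if g.total_spend >= 5000 else (2 if g.total_spend >= 1000 else 1)
    return recency + frequency + monetary

guests = [
    Guest("frequent business traveler", date(2015, 12, 10), 12, 9200.0),
    Guest("one-time stopover", date(2014, 3, 2), 1, 180.0),
]
# Target the top-scoring guests first; they are the likeliest to use
# amenities and respond to loyalty offers.
for g in sorted(guests, key=rfm_score, reverse=True):
    print(g.name, rfm_score(g))
```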

And look at that…all of that can be done with the right analysis of data!

Is Data Management The Key?

Many organizations today struggle to get their data in order and turn it into dollars.

“The more data you have, the more crucial it is to better manage your master data and improve the maturity of your master data management (MDM) program,” said Saul Judah, research director at Gartner. “Existing approaches to data management are, in many cases, insufficient to accommodate big data sources on an enterprise scale. Collecting data without managing it properly also creates ongoing costs as well as regulatory and compliance risks.”

In order to save money, CIOs and Chief Data Officers who oversee big data initiatives need to consider the following steps:

Update Information Strategy and Architecture
Many organizations have had success leveraging big data insight around specific business operations, but typically it’s limited to a single business unit or use case. Few firms have explored how to make big data insights actionable across the entire organization, by linking big data sources with trusted master data.

For example, many marketing organizations use data from social sources — such as Twitter and Facebook — to inform their campaigns, but they don’t reconcile this with trusted data in customer/prospect repositories that are used by customer services or sales. This can lead to incoherent customer communication that can actually undermine the sales or customer service process.
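To make the reconciliation point concrete, here is a minimal sketch of matching social profiles against a trusted master customer record. The matching rules, field names, and records are hypothetical illustrations, not any particular MDM product's API:

```python
# A minimal sketch of reconciling social-media profiles with trusted master
# data. The two-tier rule (email match, then name match) is illustrative.
def normalize(s: str) -> str:
    """Lowercase and strip punctuation so superficial differences don't block matches."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

master = [
    {"customer_id": 101, "name": "Jane Doe", "email": "jane.doe@example.com"},
]
social = [
    {"handle": "@janedoe", "display_name": "Jane Doe", "email": "Jane.Doe@example.com"},
]

def match(profile, master_records):
    """Return the master record a social profile most likely refers to."""
    for rec in master_records:
        if normalize(profile.get("email", "")) == normalize(rec["email"]):
            return rec  # exact email match: high confidence
    for rec in master_records:
        if normalize(profile.get("display_name", "")) == normalize(rec["name"]):
            return rec  # name-only match: would be flagged for steward review
    return None

for p in social:
    m = match(p, master)
    print(p["handle"], "->", m["customer_id"] if m else "unmatched")
```

With a link like this in place, a campaign triggered by a tweet can be checked against what customer service already knows about the same person, avoiding the incoherent communication described above.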

Become More Agile
Effective use of big data requires a mixture of old and new technologies and practices. This necessitates an agile approach that applies a bimodal IT framework to information governance (see “Why Digital Business Needs Bimodal IT”). MDM traditionally uses a Mode 1 approach which is policy-driven and approval-based. Big data typically uses a Mode 2 approach with little or no predefined processes or controls. Tactical and exploratory initiatives are much better suited to the “faster” Mode 2.

Move to Limit Risk Exposure
When an organization executes actions based on information sources outside the curation of MDM — as is the case in many big data implementations — exposure to certain types of business risk increases. Factors such as poor data quality, loss of critical information, and access to unauthorized information become more likely. Gartner recommends appointing a lead information steward in relevant business units to assist in creating and executing risk controls with regard to data use in business operations.
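As one concrete example of such a risk control, an information steward might run automated checks for data quality and unauthorized fields before a big data source feeds business operations. A minimal sketch, with hypothetical thresholds and field names:

```python
# A minimal data-risk audit sketch: flag poor data quality (null rates) and
# unauthorized fields. Field names and the threshold are hypothetical.
REQUIRED_FIELDS = {"customer_id", "email"}
RESTRICTED_FIELDS = {"ssn"}  # fields this process is not authorized to use
MAX_NULL_RATE = 0.05

def audit(records):
    issues = []
    # Quality control: required fields should rarely be missing.
    for field in REQUIRED_FIELDS:
        nulls = sum(1 for r in records if not r.get(field))
        if records and nulls / len(records) > MAX_NULL_RATE:
            issues.append(f"poor data quality: {field} null rate {nulls / len(records):.0%}")
    # Access control: restricted fields should not appear in this feed at all.
    seen = set().union(*(r.keys() for r in records)) if records else set()
    for field in seen & RESTRICTED_FIELDS:
        issues.append(f"unauthorized field present: {field}")
    return issues

sample = [{"customer_id": 1, "email": ""},
          {"customer_id": 2, "email": "a@b.com", "ssn": "..."}]
print(audit(sample))
```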


All of the above steps to help manage your data can quickly save or make your firm money. You have the data; now unlock its value with master data management.

Is All-Flash Storage the Next Step for Government Data Centers?

It’s no secret that the amount of data government agencies must store is increasing at a rapid rate, and this growth shows no signs of slowing. But government agencies can’t just build more data centers to house this data; there must be a more efficient and cost-effective solution. In this Q&A, Melanie Stevens, Director, State and Local Government and Education, discusses why all-flash is an important next step in the storage evolution for government agencies.

Q: What is all-flash, and why is it important for government agencies to consider?

Melanie Stevens: Many government agencies are facing a storage dilemma in their data centers. Their need for storage is growing, but there is insufficient funding to purchase new equipment and limited budget for IT staff, space, power and cooling. Across the data center, we see increasing speed and falling cost in devices such as servers and switches. Storage, however, has failed to keep up because of the way mechanical disk works: capacity has grown while performance has stayed flat, so the performance per gigabyte of disk keeps declining. Flash memory is faster, stores more data in less space and is much more power efficient than disk. The Pure Storage all-flash storage array meets the availability, reliability and scalability requirements of government agencies, and it brings flash storage down to a price point that makes it cost effective for distributed deployments. Our all-flash storage array is cost competitive with disk and more efficient to administer, so it’s a logical next step for government data centers.
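A rough back-of-the-envelope calculation illustrates the performance-per-gigabyte point. The drive figures below are commonly cited ballpark numbers, not vendor specifications:

```python
# Illustrating "performance per gigabyte": a mechanical disk delivers roughly
# the same IOPS however large it gets, so IOPS per GB falls as capacity grows.
# All figures are rough ballpark numbers for illustration only.
drives = [
    ("300 GB 10K disk", 300, 150),   # (name, capacity_gb, approx_iops)
    ("4 TB 7.2K disk", 4000, 80),
    ("1 TB flash SSD", 1000, 50000),
]
for name, gb, iops in drives:
    print(f"{name}: ~{iops / gb:.2f} IOPS per GB")
# Disk IOPS/GB shrinks as drives get bigger, while flash stays orders of
# magnitude higher.
```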

Q: What are some of the major challenges all-flash can help government agencies overcome?

Melanie Stevens: In the past decade, we’ve seen the widespread adoption of virtualization in the data center, and it has had an amazing impact on the efficiency of how state agencies do business. Server consolidation now supports ratios around 20-to-1 per physical server, and that number climbs with every new release of a processor. For government agencies, this means being able to do more with less, and faster. At the same time, virtualization creates its own challenges for storage. Virtualization is only as efficient as the storage on which it runs, and it requires more storage resources than the pre-virtualization era. The issue is compounded by applications such as virtual desktop infrastructure (VDI), which is commonly used to support kiosks, mobile workforces and online services offered by state agencies. So while government agencies have realized ROI from server consolidation, they have had to turn around and spend those savings on additional storage. As end users continue to virtualize more applications, disk arrays will only get more expensive and put government agencies further behind in the budget battle. Pure Storage provides the technology that allows government to maximize the benefits of virtualization without inflated storage costs. Whether the application is to manage database requirements, virtual desktop or server infrastructure, our all-flash storage array allows for maximum performance without the usual backend cost.
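A quick sketch of the storage math behind that challenge: the 20-to-1 consolidation ratio funnels many machines' I/O onto one array, and VDI makes the spikes worse. The per-desktop figures below are typical planning assumptions, not measured values:

```python
# Why virtualization multiplies storage demand: consolidation concentrates
# many VMs' I/O onto one array. Per-desktop IOPS figures are common planning
# estimates, and the host count is hypothetical.
consolidation_ratio = 20      # VMs per physical server (from the text)
servers = 10                  # hypothetical host count
steady_iops_per_vdi = 25      # typical planning estimate per virtual desktop
boot_multiplier = 4           # "boot storms" spike I/O well above steady state

desktops = consolidation_ratio * servers
print(f"steady state: {desktops * steady_iops_per_vdi:,} IOPS")
print(f"boot storm:   {desktops * steady_iops_per_vdi * boot_multiplier:,} IOPS")
# 200 desktops -> 5,000 IOPS steady and 20,000 in a boot storm: dozens of
# mechanical disks, or a small fraction of one flash array.
```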


Streamlined Reports Save Time and Money – Save With Data

Reducing Reporting Redundancies, Targeting Inefficiencies and Consolidating Reports Save 250 Hours per Month

Broadband operators rely on their billing systems to provide accurate and timely reports. These reports are crucial to day-to-day operations and increasingly important for both strategic planning and the execution of business goals.

The Client:

A region of one of America’s biggest Multi-System Operators (MSOs), responsible for 250,000 customers.

The Business Issue:

The client was using time-intensive manual processes to develop the reports required to support ongoing business. Various queries were used to retrieve data from the billing systems, often delivering inconsistent results across different reports. The standard practice was to run and merge multiple reports from these queries, then copy the results, along with data from other sources, into spreadsheets for further manipulation, all to arrive at questionable data. Coordination of reports between departments and groups was non-existent, and there were no standardized Methods and Procedures (M&Ps) for developing these reports. The reporting group was stretched beyond capacity, resulting in a multi-week backlog for the generation of new reports. The client retained Cliintel to help streamline and document the reporting process.

The Approach:

Cliintel takes a holistic approach to every business issue presented for resolution. Our project professionals evaluate the situation, design a solution that fits, gain adoption and optimize performance. Cliintel’s focus was on increasing productivity and reducing costs by examining and evaluating reports for each department, independently and thoroughly. Using this holistic approach, Cliintel interviewed and met with end users in 8 departments to assess their reporting needs and identify redundancy. Cliintel performed a gap analysis and inventoried and audited many of the crucial reports needed for day-to-day business operations. Labor-intensive and inefficient manual processes were targeted and re-engineered, and consolidation opportunities were identified and redesigned.

The Solution:

Cliintel designed and developed new queries that extract only the data relevant to each report. Database clean-up tools were created to ensure the accuracy of the data, and work processes were streamlined to consolidate redundant reports.
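As an illustration of the consolidation idea, the sketch below replaces several near-duplicate departmental queries with one shared extraction that every report draws from. The schema and data are hypothetical, not the client's actual billing system:

```python
# A minimal consolidation sketch: one shared query pulls only the fields the
# reports need, and each report is derived from that single trusted result.
# Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE billing (account_id INTEGER, region TEXT, amount REAL, billed_on TEXT);
    INSERT INTO billing VALUES (1, 'north', 79.99, '2015-06-01'),
                               (2, 'south', 59.99, '2015-06-01');
""")

# One shared extraction instead of N near-duplicate departmental queries.
rows = conn.execute(
    "SELECT region, COUNT(*) AS accounts, SUM(amount) AS revenue "
    "FROM billing GROUP BY region"
).fetchall()

# Each department's report becomes a view over the same result, so numbers
# can no longer diverge between reports.
for region, accounts, revenue in rows:
    print(f"{region}: {accounts} accounts, ${revenue:,.2f}")
```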

Additionally, Cliintel:

  • Identified and redesigned current reports and procedures
  • Implemented a strategic plan that included process re-engineering and system enhancements
  • Implemented a KPI plan and provided an ongoing KPI analysis framework
  • Provided recommendations for further enhancements
  • Provided a base for future reporting development

The Results:

By identifying redundancies and targeting inefficient processes, Cliintel reduced internal resource workload by 250 hours per month, improved report accuracy, increased customer satisfaction and delivered a return on the investment in less than 4 months.
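A hedged back-of-the-envelope reading of those numbers: at an assumed fully loaded labor rate, 250 hours per month translates directly into dollars. The rate and the implied project cost below are illustrative assumptions, not client figures:

```python
# Back-of-the-envelope ROI from the stated results: 250 staff hours saved per
# month, payback in under four months. The hourly rate is an assumption.
hours_saved_per_month = 250          # from the case study
loaded_hourly_rate = 60.0            # assumed fully loaded labor cost, USD

monthly_savings = hours_saved_per_month * loaded_hourly_rate
print(f"monthly savings: ${monthly_savings:,.0f}")

# For payback in under 4 months, the engagement would have cost less than:
print(f"implied max project cost: ${monthly_savings * 4:,.0f}")
```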

We’re proud to help our clients solve tough problems and achieve stunning results. To see what kind of results Cliintel can deliver for you, please visit www.cliintel.com or e-mail askcliintel@cliintel.com.

Big Data Book - First 3 Chapters Free!

Ready for painless change?

For a limited time, we are offering our book "Change is Great, Be First" by Richard Batenburg.

Please fill out the form to join our mailing list and receive the first 3 chapters of the book for free.
