Many organizations today struggle to get their house in order and turn data into dollars.
“The more data you have, the more crucial it is to better manage your master data and improve the maturity of your master data management (MDM) program,” said Saul Judah, research director at Gartner. “Existing approaches to data management are, in many cases, insufficient to accommodate big data sources on an enterprise scale. Collecting data without managing it properly also creates ongoing costs as well as regulatory and compliance risks.”
To save money, CIOs and chief data officers who oversee big data initiatives should consider the following steps:
Update Information Strategy and Architecture
Many organizations have had success leveraging big data insight around specific business operations, but typically it’s limited to a single business unit or use case. Few firms have explored how to make big data insights actionable across the entire organization, by linking big data sources with trusted master data.
For example, many marketing organizations use data from social sources — such as Twitter and Facebook — to inform their campaigns, but they don’t reconcile this with trusted data in customer/prospect repositories that are used by customer services or sales. This can lead to incoherent customer communication that can actually undermine the sales or customer service process.
Become More Agile
Effective use of big data requires a mixture of old and new technologies and practices. This necessitates an agile approach that applies a bimodal IT framework to information governance (see “Why Digital Business Needs Bimodal IT”). MDM traditionally uses a Mode 1 approach which is policy-driven and approval-based. Big data typically uses a Mode 2 approach with little or no predefined processes or controls. Tactical and exploratory initiatives are much better suited to the “faster” Mode 2.
Move to Limit Risk Exposure
When an organization executes actions based on information sources outside the curation of MDM — as is the case in many big data implementations — exposure to certain types of business risk increases. Factors such as poor data quality, loss of critical information, and access to unauthorized information become more likely. Gartner recommends appointing a lead information steward role in relevant business units to assist in creating and executing risk controls with regards to data use in business operations.
All of the above steps to manage your data can quickly save or make your firm money. You have the data; now unlock its value with master data management.
It’s no secret that the amount of data government agencies must store is increasing at a rapid rate, and this growth shows no signs of slowing. But government agencies can’t simply build more data centers to house it; there must be a more efficient and cost-effective solution. In this Q&A, Melanie Stevens, Director, State and Local Government and Education, discusses why all-flash is an important next step in the storage evolution for government agencies.
Q: What is all-flash and why is it important for government agencies to consider?

Melanie Stevens: Many government agencies face a storage dilemma in their data centers. Their need for storage is growing, but there is insufficient funding to purchase new equipment and limited budget for IT staff, space, power and cooling. Across the data center, we see increasing speed and falling cost in devices such as servers and switches. Storage, however, has failed to keep up because of the way mechanical disk works: capacity has grown and cost per gigabyte has fallen, but performance has stayed flat, so the performance per gigabyte of disk keeps shrinking. Flash memory is faster, denser and much more power efficient than disk. The Pure Storage all-flash storage array meets the availability, reliability and scalability requirements of government agencies, and it reduces flash storage to a price point that makes it cost effective for distributed deployments. Our all-flash storage array is cost competitive and more efficient to administer than disk, so it’s a logical next step for government data centers.
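The “performance per gigabyte” point above can be made concrete with some back-of-the-envelope arithmetic. The figures below are assumed, round illustrative numbers, not Pure Storage benchmarks: a mechanical drive delivers roughly the same random IOPS regardless of how large it grows, so bigger disks mean fewer IOPS available per gigabyte.

```python
# Illustrative arithmetic (all numbers are assumptions, not vendor data):
# a 7,200 RPM disk delivers roughly the same ~150 random IOPS whether it
# holds 1 TB or 10 TB, so IOPS available per GB shrinks as capacity grows.

def iops_per_gb(iops: float, capacity_gb: float) -> float:
    """Random IOPS available per gigabyte of capacity."""
    return iops / capacity_gb

hdd_1tb = iops_per_gb(150, 1_000)        # 0.15 IOPS/GB
hdd_10tb = iops_per_gb(150, 10_000)      # 0.015 IOPS/GB: 10x less per GB
ssd_1tb = iops_per_gb(100_000, 1_000)    # ~100 IOPS/GB for a typical SSD

print(hdd_1tb, hdd_10tb, ssd_1tb)
```

On these assumed numbers, the larger disk serves ten times less performance per gigabyte than the smaller one, while flash sits orders of magnitude higher, which is the trend the answer above describes.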
Q: What are some of the major challenges all-flash can help government agencies overcome?

Melanie Stevens: In the past decade, we’ve seen the widespread adoption of virtualization in the data center. This has had an amazing impact on the efficiency of how state agencies do business. Server consolidation now supports ratios around 20-to-1 per physical server, and that number climbs with every new release of a processor. For government agencies, this means being able to do more with less, and faster. At the same time, virtualization creates its own challenges for storage. Virtualization is only as efficient as the storage on which it runs, and it requires more resources than the pre-virtualization era. This issue is compounded by applications such as virtual desktop infrastructure (VDI), which is commonly used to support kiosks, mobile workforces and online services offered by state agencies. While government agencies have realized ROI from server consolidation, they have had to turn around and spend those savings on additional storage. As end users continue to virtualize more applications, disk arrays will only get more expensive and put government agencies further behind in the budget battle. Pure Storage provides the technology that allows government to maximize the benefits of virtualization without inflated storage costs. So, whether the application is to manage database requirements, virtual desktops or server infrastructure, our all-flash storage array allows for maximum performance without the usual backend cost.
Process Redesign and Software Enhancements Reduce Operating Expenses
Customer service is critical to the success of most companies, but it comes at a cost. For a leading provider of industrial-strength hardware, software and services, this cost had become inordinately high due to a business decision to “give service” to customers regardless of entitlement. Cliintel was asked to assess the situation, create new processes to correct the problems in software that had been developed for the Europe, Middle East and Africa (EMEA) markets, and then oversee its deployment in the Americas.
The Client:
The Client is a leading provider of industrial-strength hardware, software and services that power the internet with installations in more than 100 countries.
The Business Issue:
The client had reached an untenable situation. During a period of rapid expansion, both the database available to call center representatives and the tools needed to verify customer entitlement for services had accumulated significant inaccuracies. To work around these issues, the client adopted a policy of providing service regardless of entitlement in order to assuage customers and maintain retention. Customers soon became accustomed to contacting the call center and simply demanding service, which often resulted in a technician being dispatched to their location to resolve issues. This practice was barely maintaining customer satisfaction levels while creating unnecessary costs in excess of $4 million per quarter.
The company needed a solution to the systemic problem of inaccurate customer information. In other words, the client needed a tool that would quickly and accurately determine the entitlement status of any given customer 24×7, 365 days a year, and provide an action plan for correcting, uplifting or abandoning customers who were outside their warranty or support period.
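The decision the tool had to make can be sketched as a simple rule over a customer’s support status. Everything below is hypothetical: the field names, the 90-day lapse window, and the action labels are invented for illustration and are not the client’s actual system.

```python
# Hypothetical sketch of an entitlement check producing an action plan.
# The Customer fields, the 90-day "uplift" window, and the action names
# are illustrative assumptions, not the client's real tool or policy.
from dataclasses import dataclass
from datetime import date


@dataclass
class Customer:
    name: str
    support_expires: date


def entitlement_action(customer: Customer, today: date) -> str:
    """Return an action plan: serve, uplift (renewal offer), or abandon."""
    days_left = (customer.support_expires - today).days
    if days_left >= 0:
        return "serve"      # in contract: dispatch support as normal
    if days_left > -90:
        return "uplift"     # recently lapsed: offer a paid renewal
    return "abandon"        # long out of support: decline free service


print(entitlement_action(Customer("Acme", date(2025, 6, 1)), date(2025, 1, 1)))
```

The value of such a check is less the logic itself than having it run against accurate data around the clock, which is why correcting the underlying database was part of the engagement.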
The Challenge:
A database mining algorithm for handling exactly this problem had been developed for the EMEA markets. The challenge was to deploy the tool in the Americas, where the customer base was 25 times the size of the EMEA markets then using it, with a deadline looming in 45 days.
The Solution:
The solution was to gather the stakeholders, agree upon a level of service, and negotiate an acceptable response time for corrective action with the Sales and Service Delivery organizations. Development could then proceed to rapidly enhance the EMEA product for the Americas and deploy it within 45 days.
Because of the incredibly short deadline, Cliintel broke the processes down into manageable units. A Rational Unified Process (RUP) model was used to complete development of the enhanced software, followed by a Change Acceptance Process (CAP) developed with the stakeholders to ensure a smooth and efficient transition to deployment.
The Project Results:
The project was completed under budget and within the 45-day schedule, with the client realizing savings of $4 million in the first quarter by eliminating the unnecessary costs previously being incurred.
We’re proud to help our clients solve tough problems and achieve stunning results. To see what kind of results Cliintel can deliver for you, please visit www.cliintel.com or e-mail askcliintel@cliintel.com.
Reliable Reports and Metrics Improve Customer Service
In the initial phases of telephony over Hybrid Fiber Coax, one of the largest Multi-System Operators (MSOs) in the broadband industry knew it had issues with both customer installations and customer satisfaction, but lacked the ability to identify the extent of the problems or the reasons behind them. The organization called upon Cliintel to develop a system to capture reliable, actionable data and to assist with interpreting that data.
The Business Issue:
The corporate telephony field operations group at the MSO needed a better understanding of the total installation process. The group was receiving conflicting information about daily installs from disparate markets across the United States, each of which was performing thousands of installs per day. Reporting was not standardized, and the accuracy of the information was very much in doubt.
To manage the situation, the MSO needed to:
Identify which data was important to capture
Design a reliable means to obtain the data
Determine how to interpret the data
Management wanted to know the reasons for procedural problems during an installation, how long installs were taking, and to establish a reasonable baseline for the true time per task. Additionally, they wanted to find the causes behind the number of “no shows” for service appointments.
The Solution:
Cliintel worked with the MSO to identify and source all of the critical data points. This included creating reason codes to describe why an appointment was missed, cancelled or rescheduled, and defining business rules for allocating technician drive time. Once the critical data elements were identified, an automated reporting system was developed and deployed. This system greatly reduced the human error that had confounded earlier attempts to gather the data manually.

Meetings were facilitated between the corporate and individual market groups to define the standards against which the data would be measured, such as how long each type of job should take. These meetings continued as reports were made available to the organization on a daily, weekly, monthly and quarterly basis. Charts, tables and graphs were used to help the organization understand the data, and the ongoing meetings with the corporate and market groups helped the organization interpret what it was seeing and develop strategies to deal with the problems identified.
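The core of reason-code reporting is a simple tally over appointment outcomes. The sketch below is hypothetical: the code names and sample records are invented for illustration and are not the MSO’s actual data or system.

```python
# Hypothetical sketch of reason-code reporting. The reason codes and the
# sample appointment records are invented for illustration only.
from collections import Counter

appointments = [
    {"status": "missed",      "reason": "NO_SHOW_CUSTOMER"},
    {"status": "rescheduled", "reason": "TECH_RAN_LATE"},
    {"status": "missed",      "reason": "NO_SHOW_CUSTOMER"},
    {"status": "completed",   "reason": None},
]


def reason_code_report(records):
    """Tally reason codes for every appointment that did not complete."""
    return Counter(r["reason"] for r in records if r["reason"] is not None)


print(reason_code_report(appointments))
```

Automating a tally like this is what removed the human-error element: once every market recorded the same codes, daily, weekly and monthly rollups could be produced mechanically and compared against the agreed standards.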
The Results:
Trends were uncovered highlighting problems with provisioning efficiency, installation times and customer satisfaction levels. The reports surfaced these trends and provided objective, actionable decision support at the various levels of the organization. Publication of the results enabled the global installation standards team to define new procedures to address the trends. As the new procedures were adopted, installation times and “no shows” were reduced, resulting in revenue generation in excess of $10M. A concurrent rise in customer satisfaction levels confirmed that the team had identified the important variables. The two-year project left the MSO with a reporting tool that helped the organization understand and improve the daily installation process for the telephony product line across the enterprise.