by Tara Buck | Nov 2, 2015 | Big Data, Predictive Analytics
The music industry has gone through huge changes over the last 10 years, and finding a “fair” way for artists to make money from their work has become something of a project. CD sales plummeted once digital downloads became popular and readily available. More recently, digital downloads themselves have been steadily declining as music-streaming platforms such as Pandora, Spotify, and Apple Music have rapidly gained followings.
Simply put, the industry is changing. So, how can we help artists receive the compensation and recognition they deserve? The answer to that lies within big data.
Pandora’s recent acquisition of Ticketfly gives us a glimpse at the future of the streaming industry: a future that is evolving to include live music. And that’s where the money is, if Pandora can manage the big data.
When streaming was introduced, it gave previously unheard artists a new opportunity to be heard and put the listeners in the driver’s seat. Streaming companies may now be able to drive those changes by using data and directing listeners straight to live events.
Traditionally, ticketing companies have used purchased data along with associated demographic data to target customers for upcoming events. That should be enough, right? Wrong. To target these music lovers accurately with predictive analytics, streaming companies need to draw on years of data from their millions of users. This data is rich with usable information: demographics from registration, which artists a listener skips, what time of day they prefer certain music. Combining this with fan data from ticketing companies benefits both the artist and the listener. So far, Pandora is on the right track.
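To make the idea concrete, here is a minimal sketch of that kind of targeting. Everything in it is hypothetical: the feature names, the skip penalty, and the threshold are illustrative assumptions, not any streaming company’s actual model.

```python
# Hypothetical listener records combining registration data (city) with
# listening behavior (play counts and skip counts per artist).
listeners = [
    {"id": 1, "city": "Denver", "plays": {"Band A": 120, "Band B": 4}, "skips": {"Band B": 30}},
    {"id": 2, "city": "Denver", "plays": {"Band A": 8}, "skips": {"Band A": 15}},
]

def affinity(listener, artist):
    """Crude affinity score: play count minus a penalty for skips (assumed weight)."""
    plays = listener["plays"].get(artist, 0)
    skips = listener["skips"].get(artist, 0)
    return plays - 2 * skips

def target_fans(listeners, artist, city, threshold=50):
    """Return ids of listeners in the event's city whose affinity clears a threshold."""
    return [l["id"] for l in listeners
            if l["city"] == city and affinity(l, artist) >= threshold]

print(target_fans(listeners, "Band A", "Denver"))  # [1]
```

A real system would of course use far richer features and a trained model, but the shape is the same: behavioral streaming data plus a location filter yields a concert-ready audience list.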
Selling concert tickets to fans based on the types of music they enjoy is not a breakthrough advancement in this industry. The radical change lies in which data is being used, and in combination with what.
With its recent acquisition, Pandora will not only be able to suggest artists similar to those you already like, but also notify you of upcoming concerts it predicts you will enjoy.
Using big data to grow and monetize these fan bases will sell more live-event tickets and give artists the opportunity to earn the credit they deserve.
by Tara Buck | Oct 18, 2015 | Big Data, Business Optimization, Measuring Results, Save With Data
Data can be found in every industry and profession out there, from healthcare to professional sports. Since the legal profession touches all of these as well, one question comes to the front of my mind: how can big data help the legal profession?
Let’s start off by stating that there is an enormous amount of data being collected around the world, and depending on how it is stored, catalogued, and indexed, it can take a person days or weeks to sort through. Thankfully, in today’s world we have the luxury of programs and computers that can go through this data almost instantly. Having these programs and applications accessible can dramatically bring down the cost of analyzing the data, provided you get it into the right format.
Let’s take an example of a case involving product liability. There is a product that is defective. Where does big data come in?
Now you have all of the records from the creation of that product: the memos, the mockups, and the drawings. You have everything that had to do with the end product. You may even have the reviews and testimonials, good or bad. What would have taken months to go through by hand may now take an hour with the right computer application, because the computer searches for certain words. If the attorney picks the right words and the data is in the right format, the attorney has everything they need at their fingertips, all in a matter of minutes.
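The keyword search described above can be sketched in a few lines. The documents and search terms here are invented for illustration; real discovery software adds indexing, stemming, and relevance ranking on top of this basic idea.

```python
# Hypothetical discovery documents; in practice these would be loaded
# from the formatted case records described above.
documents = [
    {"id": "memo-014", "text": "Hinge assembly cracked during stress test"},
    {"id": "rev-2201", "text": "Great product, works as advertised"},
    {"id": "memo-031", "text": "Supplier flagged a defect in the hinge batch"},
]

def search(documents, keywords):
    """Return ids of documents containing any of the chosen keywords (case-insensitive)."""
    keywords = [k.lower() for k in keywords]
    return [d["id"] for d in documents
            if any(k in d["text"].lower() for k in keywords)]

print(search(documents, ["defect", "cracked"]))  # ['memo-014', 'memo-031']
```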
One may think it would be enough for an attorney to go out there and say, ‘this product is defective and here are 500,000 complaints about it.’ In reality, you can go one step further with the tools now available. The attorney can go back into all 500,000 complaints and determine what the common component is, and maybe that common component is user error. Then the tables suddenly turn: it is not the product, it is the user!
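Finding the common component across complaints is, at its simplest, a tallying exercise. A hedged sketch, with made-up complaint records standing in for real discovery data:

```python
from collections import Counter

# Hypothetical complaint records; real data would come from discovery,
# with the "cause" field extracted or coded from complaint text.
complaints = [
    {"id": 1, "cause": "user error"},
    {"id": 2, "cause": "user error"},
    {"id": 3, "cause": "manufacturing defect"},
    {"id": 4, "cause": "user error"},
]

# Tally the cause of each complaint and surface the most common one.
cause_counts = Counter(c["cause"] for c in complaints)
most_common_cause, count = cause_counts.most_common(1)[0]
print(most_common_cause, count)  # user error 3
```

Across 500,000 real complaints the same aggregation runs in seconds, which is exactly the table-turning analysis the paragraph describes.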
Big data can now lead attorneys to answers they may never have reached under time and money constraints. Whichever side benefits, information is power, and big data is providing just that!
by Tara Buck | Aug 26, 2015 | Big Data, Project Management, Save With Data
Many organizations today struggle to get their house in order and turn data into dollars.
“The more data you have, the more crucial it is to better manage your master data and improve the maturity of your master data management (MDM) program,” said Saul Judah, research director at Gartner. “Existing approaches to data management are, in many cases, insufficient to accommodate big data sources on an enterprise scale. Collecting data without managing it properly also creates ongoing costs as well as regulatory and compliance risks.”
In order to save money, CIOs and Chief Data Officers who oversee big data initiatives need to consider the following steps:
Update Information Strategy and Architecture
Many organizations have had success leveraging big data insight around specific business operations, but typically it’s limited to a single business unit or use case. Few firms have explored how to make big data insights actionable across the entire organization, by linking big data sources with trusted master data.
For example, many marketing organizations use data from social sources — such as Twitter and Facebook — to inform their campaigns, but they don’t reconcile this with trusted data in customer/prospect repositories that are used by customer services or sales. This can lead to incoherent customer communication that can actually undermine the sales or customer service process.
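The reconciliation step above amounts to matching external records against trusted master records. A minimal sketch, keyed here on email address for simplicity; a real MDM match would use richer rules (name, address, fuzzy matching) and the field names below are assumptions:

```python
# Trusted master customer repository, keyed on email for this illustration.
master = {
    "ana@example.com": {"customer_id": "C-100", "segment": "premium"},
    "bo@example.com": {"customer_id": "C-101", "segment": "standard"},
}

# Incoming records from social sources; one has no master match.
social_mentions = [
    {"email": "ana@example.com", "sentiment": "positive"},
    {"email": "zed@example.com", "sentiment": "negative"},
]

def reconcile(mentions, master):
    """Attach the trusted customer_id where a match exists; flag the rest for stewardship."""
    matched, unmatched = [], []
    for m in mentions:
        record = master.get(m["email"])
        if record:
            matched.append({**m, "customer_id": record["customer_id"]})
        else:
            unmatched.append(m)
    return matched, unmatched

matched, unmatched = reconcile(social_mentions, master)
print(len(matched), len(unmatched))  # 1 1
```

The unmatched pile is where the governance questions live: without a steward deciding how to handle those records, campaigns run on them drift away from the trusted customer view.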
Become More Agile
Effective use of big data requires a mixture of old and new technologies and practices. This necessitates an agile approach that applies a bimodal IT framework to information governance (see “Why Digital Business Needs Bimodal IT”). MDM traditionally uses a Mode 1 approach which is policy-driven and approval-based. Big data typically uses a Mode 2 approach with little or no predefined processes or controls. Tactical and exploratory initiatives are much better suited to the “faster” Mode 2.
Move to Limit Risk Exposure
When an organization executes actions based on information sources outside the curation of MDM — as is the case in many big data implementations — exposure to certain types of business risk increases. Factors such as poor data quality, loss of critical information, and access to unauthorized information become more likely. Gartner recommends appointing a lead information steward role in relevant business units to assist in creating and executing risk controls with regards to data use in business operations.
All of the above steps to manage your data can quickly save or make your firm money. You have the data; now unlock its value with master data management.
by Tara Buck | Aug 7, 2015 | Business Intelligence, Business Optimization
It’s no secret that the amount of data government agencies must store is increasing at a rapid rate, and this growth shows no signs of slowing. But government agencies can’t just build more data centers to house this data; there must be a more efficient and cost-effective solution. In this Q&A, Melanie Stevens, Director, State and Local Government and Education, discusses why all-flash is an important next step in the storage evolution for government agencies.
Q: What is all-flash and why is it important for government agencies to consider?
Melanie Stevens: Many government agencies are facing a storage dilemma in their data centers. Their need for storage is growing, but there is insufficient funding to purchase new equipment and limited budget for IT staff, space, power and cooling. Across the data center, we see increasing speed and lower cost in many networking devices, such as servers and switches. However, storage has failed to keep up because of the way mechanical disk works. Capacity and cost have grown, but performance has stayed flat. Because of this, the performance per gigabyte of disk is getting slower. Flash memory is faster, has more space and is much more power efficient than disk. The Pure Storage all-flash storage array meets the availability, reliability and scalability requirements of government agencies. It reduces flash storage to a price point that makes it cost effective for distributed deployments. Our all-flash storage array is cost competitive and more efficient to administer than disk, so it’s a logical next step for government data centers.
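The claim that “performance per gigabyte of disk is getting slower” is simple arithmetic: a spinning disk delivers roughly the same IOPS whatever its capacity, so bigger drives mean fewer IOPS per gigabyte. The figures below are rough ballpark numbers for illustration, not vendor specifications.

```python
def iops_per_gb(iops, capacity_gb):
    """Random-I/O performance normalized by capacity."""
    return iops / capacity_gb

# A 7200 RPM disk delivers on the order of ~150 random IOPS regardless of size.
old_disk = iops_per_gb(150, 500)      # older 500 GB drive
new_disk = iops_per_gb(150, 4000)     # 4 TB drive: same IOPS, 8x the capacity
flash = iops_per_gb(50_000, 1000)     # a flash device (ballpark figure)

print(round(old_disk, 3), round(new_disk, 4), round(flash, 1))  # 0.3 0.0375 50.0
```

As capacity grows, the disk’s IOPS-per-gigabyte ratio falls, while flash sits orders of magnitude higher, which is the storage dilemma described above.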
Q: What are some of the major challenges all-flash can help government agencies overcome?
Melanie Stevens: In the past decade, we’ve seen the widespread adoption of virtualization in the data center. This has had an amazing impact on the efficiency of how state agencies do business. Server consolidation now supports ratios around 20-to-1 per physical server, and that number climbs with every new release of a processor. For government agencies, this means being able to do more with less, and faster. At the same time, virtualization creates its own challenges for storage. Virtualization is only as efficient as the storage on which it runs, and it requires more resources than the pre-virtualization era. This issue is compounded by applications such as virtual desktop infrastructure (VDI), which is commonly used to support kiosks, mobile workforces and online services offered by state agencies. While government agencies have realized ROI from server consolidation, they have to turn around and spend those savings on additional storage. As end users continue to virtualize more applications, disk arrays will only get more expensive and put government agencies further behind in the budget battle. Pure Storage provides the technology that allows government to maximize the benefits of virtualization, without inflated storage costs. So, whether the application is to manage database requirements, virtual desktop or server infrastructure, our all-flash storage array allows for maximum performance without the usual backend cost.
by Rich Benvin | Aug 4, 2015 | Case Studies, Creating Competitive Advantage
Through an aggressive merger-and-acquisition phase, one of the broadband industry’s largest Multi-System Operators (MSOs) found itself needing help systematically migrating its subscriber billing and workforce automation systems. To reach this goal, the client enlisted the experts at Cliintel both to maintain service levels and to improve efficiency in the field.
The Client:
The MSO was the world’s largest provider of cable television, high-speed internet and local telephone service to 14 markets across the United States.
The Business Issue:
The Atlanta market consisted of over 100 dispatchers and over 500 technicians, all serving 600,000+ subscribers. This market utilized multiple groups of contractors to assist in field installations, service, and audit functions. Atlanta was also the last of nine markets on an older, outgoing Workforce Management tool. Severe contractual penalties were associated with keeping an extensive data center online to support these tools: if the system was not shut off within 60 days of project initiation, the previously pro-rated cost to the market would exceed $1 million a month. The situation was particularly precarious because the Atlanta market division of the MSO was highly dependent on the outgoing system, which was actively used by routers, dispatchers, and field technicians.
The Approach:
The Atlanta market’s staff had grown accustomed to change and transition. The decision to utilize Cliintel was based on the team’s ability to execute the project plan using best practices from lessons learned in prior deployments, successfully overcoming the risks and issues that arose during implementation. The urgency to sunset the old system and avoid the associated costs was extreme, mandating that the new solution be deployed on time, on budget, and fully available in production. It also had to be deployed with performance reporting and cost/benefit trending analysis capabilities.
The Solution:
Discovery meetings were held by the project team to gather information from the market’s staff, vendors, and management. These meetings engaged upper management in making high-level configuration decisions and gaining buy-in. Meetings between upper and middle management were held to assess criteria for both levels and to achieve goal alignment between them. Finally, meetings were held with front-line users and supervisory staff.
The goal of the meetings was to properly size the level of effort required to deploy the work force automation platform into this system across the four key departments: routing, dispatch, field technicians, and administrative.
Normally, training was held in a phased format utilizing one team of trainers to work through each department. In this instance, training was required in parallel in a “just-in-time” fashion. This approach required exacting coordination between the project management staff, market staff from dispatch, routing and field supervisors and both the corporate and market training staffs. Coordination would also have to occur in parallel with the wireless hardware and service provider, as well as the vendor for the work force automation platform itself.
The field, including that day’s routing, dispatch, and all phone, radio, and internet communication, would have to make a hard cut from the old workforce automation system, cellular phone, and radio to a single handheld device in the same day. Technicians would hand in their old device on the way into training and, having been qualified as proficient, leave with a new handheld device to deliver their portion of the day’s production, all in the same shift.
The Results:
Cliintel successfully coordinated the entire span of this endeavor, moving from field service to implementation and utilization by the network maintenance and outage dispatch teams. The rapid implementation schedule across the affected departments and the fast adoption of the handheld devices were attributable to unwavering executive sponsorship and consistent reinforcement. Logical, focused processes helped achieve buy-in, goal alignment, and participation in training and meetings, where staff could see the commitment from upper management. When resistance was noted, issues could be addressed on the spot and immediately resolved.
The solution was deployed on time and under budget. All departments were successfully using the new system on day one. The outgoing product was smoothly replaced and the client was able to shut down the data center without incurring the $1 million per month penalty.
Due to the success of this migration, increases in the efficiency of in-house labor allowed for the reduction of contract labor, allowing the MSO to realize an immediate profit of over $5 million, with a gross savings of over $300,000 in the first 90 days.
We’re proud to help our clients solve tough problems and achieve stunning results. To see what kind of results Cliintel can deliver for you, please visit www.cliintel.com or e-mail askcliintel@cliintel.com.