Technology News

latest updates from easySERVICE™

Gartner Survey: More Companies Plan to Run Mission-Critical BI in the Cloud

Researchers at Gartner say that 2014 may be the tipping point for cloud BI. In each of the last four years, around 30% of respondents to a Gartner survey said they’d run their mission-critical BI in the cloud. This year, however, nearly half — 45% — said they would adopt cloud BI.

Historically, cloud BI products have been most appealing to smaller businesses, in part because those are less likely to have an IT department that can manage an on-premises product. However, analysts are starting to see larger companies adopting cloud BI, typically starting with individual groups or departments.

Shifting data analytics to the cloud doesn’t come without challenges, though. For example, it’s unlikely that all corporate data will move to the cloud, particularly in larger enterprises. That means many businesses will have to map data from both cloud and on-premises sources into the BI software, whether that software itself runs on-premises or in the cloud. Bandwidth constraints may also slow data transfers and can drive up costs if a business must upgrade its connectivity to speed them up.
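
To make the hybrid-source point concrete, here is a minimal sketch of what that mapping step can look like: rows pulled from an on-premises database and records fetched from a cloud service are reshaped into one shared schema before anything reaches the BI tool. All names, tables and fields below are illustrative, not any vendor's actual interface.

```python
# A hybrid extract step: pull rows from an on-premises database and a cloud
# service, map both onto one shared schema, and hand the combined set to
# whatever loads the BI tool. All names here are hypothetical.
import sqlite3
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    customer: str
    amount_usd: float
    source: str  # "on_prem" or "cloud"

def extract_on_prem(db_path: str) -> list[Order]:
    # On-premises side: a direct SQL query against a local transactional DB.
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT order_id, customer, amount FROM orders"
        ).fetchall()
    return [Order(r[0], r[1], float(r[2]), "on_prem") for r in rows]

def extract_cloud(api_records: list[dict]) -> list[Order]:
    # Cloud side: records already fetched from a SaaS API (e.g. a CRM export),
    # mapped onto the same schema so the BI layer sees one consistent shape.
    return [
        Order(r["Id"], r["AccountName"], float(r["Amount"]), "cloud")
        for r in api_records
    ]

def combine(on_prem: list[Order], cloud: list[Order]) -> list[Order]:
    # The BI tool only ever sees the unified list, regardless of where
    # each row originated.
    return on_prem + cloud
```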

Nevertheless, some businesses have already adopted cloud BI services, analysts report anecdotally, though specific figures aren’t available. Many companies that have made the move say that the benefits — including fast time to market, no need to maintain on-premises software and simplicity of use — outweigh any downsides.

Mixing up data sources

Take Millennial Media, which sells a mobile advertising platform. It needed to pull together data from disparate sources, both on site and in the cloud.

Around two and a half years ago, Bob Hammond, CTO for Millennial, began looking into BI as a way to marry data from Salesforce with transactional and financial information from in-house systems and then let decision makers at the company visualize it.

“No human I know of can . . . make business decisions based on data that hasn’t been brought together into a single source,” he says. The company needed BI, he says, because “we weren’t able to take data from multiple systems and connect that data logically and view that data in a UI so that we could understand what was going on.”

Making the move

Most traditional BI vendors will start shifting toward the cloud, if they haven’t already, says Carsten Bange, founder and CEO of Business Application Research Center, an analyst firm that specializes in enterprise software. “How these will look is the big question,” he says.

Some of the traditional vendors are likely to offer essentially hosted versions of their software, rather than full multi-tenant SaaS, he says. A multi-tenant SaaS app serves multiple customers from the same server. In a hosted scenario, one instance of the software serves only one customer.
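
The difference between those two models can be shown with a small, hypothetical sketch: in a multi-tenant design one shared deployment holds every customer's data and each request must be scoped by a tenant ID, while in a hosted design each customer gets a dedicated instance that the vendor must run and upgrade separately. This is an illustration of the general pattern, not any particular vendor's architecture.

```python
# Multi-tenant: one shared deployment, every query scoped by a tenant ID.
# Hosted (single-tenant): one dedicated instance per customer, no scoping needed.
# All names here are hypothetical.

class MultiTenantReportStore:
    def __init__(self):
        # One shared store holds every customer's data side by side.
        self._reports = {}  # (tenant_id, report_id) -> report

    def save(self, tenant_id: str, report_id: str, report: dict) -> None:
        self._reports[(tenant_id, report_id)] = report

    def get(self, tenant_id: str, report_id: str) -> dict:
        # The tenant ID is mandatory on every read; dropping it would leak data.
        return self._reports[(tenant_id, report_id)]


class HostedReportStore:
    def __init__(self, customer: str):
        # One instance per customer: the vendor runs and patches many of these.
        self.customer = customer
        self._reports = {}

    def save(self, report_id: str, report: dict) -> None:
        self._reports[report_id] = report

    def get(self, report_id: str) -> dict:
        return self._reports[report_id]
```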

A hosted environment isn’t necessarily bad for end users, but it’s not cost-effective for the vendor. Vendors are likely to go through the pain of rewriting their apps in order to deliver them as true SaaS, meaning at some point there will have to be a transition to a new service, Bange says. That could present challenges for users.

— Nancy Gohring

Hammond also wanted to let more people in the organization, such as data analysts, assemble reports, rather than limiting report creation to technologists who know how to code and interact with back-end databases. He also needed a flexible system, so the software would be easy to maintain and new use cases would be easy to create.

Hammond eliminated on-premises BI software options in part because he didn’t want to incur the costs associated with managing and maintaining it. Time to market was also important.

Millennial ended up choosing Good Data’s cloud BI offering and had its initial project in place in about three months. Subsequent projects have taken closer to a month to get up and running, Hammond says.

Sending on-premises data to Good Data didn’t turn out to be much of a problem for Millennial. Each day the company generates around 10TB of raw data but transfers only around 18MB of compressed data to Good Data. “We do all the transformation of raw data into only the specific data we want in our systems before we transfer it into the cloud,” he says.
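
The "transform before you transfer" pattern Hammond describes can be sketched in a few lines: raw event-level data is reduced to just the daily aggregates the BI tool needs, then compressed, and only that small payload crosses the wire. The field names, file format and sample figures below are illustrative, not Millennial's actual pipeline.

```python
# Reduce raw events to daily aggregates, compress, and upload only that.
import csv
import gzip
import io
from collections import defaultdict

def aggregate_daily(raw_events: list[dict]) -> list[dict]:
    # Collapse raw impression-level events into one row per (day, campaign).
    totals = defaultdict(int)
    for ev in raw_events:
        totals[(ev["day"], ev["campaign_id"])] += int(ev["impressions"])
    return [
        {"day": day, "campaign_id": cid, "impressions": n}
        for (day, cid), n in sorted(totals.items())
    ]

def to_compressed_csv(rows: list[dict]) -> bytes:
    # Serialize the small aggregate table and gzip it; this is what actually
    # gets sent to the cloud BI service.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["day", "campaign_id", "impressions"])
    writer.writeheader()
    writer.writerows(rows)
    return gzip.compress(buf.getvalue().encode("utf-8"))

if __name__ == "__main__":
    events = [
        {"day": "2014-03-01", "campaign_id": "c1", "impressions": "3"},
        {"day": "2014-03-01", "campaign_id": "c1", "impressions": "5"},
        {"day": "2014-03-01", "campaign_id": "c2", "impressions": "2"},
    ]
    payload = to_compressed_csv(aggregate_daily(events))
    print(f"{len(payload)} bytes ready to upload")
```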

Not all businesses do such a great job of managing that data transfer, though. “What we tend to see is it’s rather difficult to keep the amount of data moving between the database and the analytics tool small,” says Gartner’s Tapadinhas. In other words, keeping data transfers small is important in cloud BI to manage both costs and upload/download bandwidth issues.

At Millennial, engineers handle the job of extracting data from the various sources and uploading it to Good Data. In addition, two data analysts have now created 500 reports. Around 40 additional people at Millennial have access to those reports and can combine them, drill down into them and create portfolios of reports to share.

Building tiers of users, each with different permissions, allows more people in the organization to work with the data — but safely, Hammond says. That means business executives, who aren’t necessarily trained to be data scientists, have some latitude to combine and rework reports but are less likely to make mistakes because they don’t have the permission to, for instance, pull in new data from a back-end database, he says.
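
A minimal sketch of that tiering idea, assuming a simple role table: analysts may create reports, business users may only combine and view existing ones, and pulling fresh data from a back-end database stays restricted to engineers. The role names and actions are hypothetical, not Good Data's or Millennial's actual permission model.

```python
# Tiered permissions: which roles may do which report actions.
ROLE_PERMISSIONS = {
    "engineer":  {"extract_source_data", "create_report", "combine_reports", "view_report"},
    "analyst":   {"create_report", "combine_reports", "view_report"},
    "executive": {"combine_reports", "view_report"},
}

def can(role: str, action: str) -> bool:
    return action in ROLE_PERMISSIONS.get(role, set())

def combine_reports(role: str, report_ids: list[str]) -> str:
    # An executive can safely rework and bundle existing reports...
    if not can(role, "combine_reports"):
        raise PermissionError(f"{role} may not combine reports")
    return "portfolio:" + "+".join(report_ids)

def extract_source_data(role: str, table: str) -> None:
    # ...but pulling new data from a back-end database stays locked down.
    if not can(role, "extract_source_data"):
        raise PermissionError(f"{role} may not query back-end sources")
    print(f"extracting from {table}")
```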

Speed and flexibility drive cloud adoption

Athenahealth, a provider of Web-based software and services to medical practices, had most of the data it wanted to analyze in one place internally. About a year ago, the company set out to find a better way to track the hundreds of customer implementations it might be working on at any given time, says Adam Weinstein, director of core analytics at Athenahealth.

“Because we have a cloud-based platform, we have real-time access to see what’s going on,” he says. The biggest challenge: “Taking the data we have about what our clients are doing and how they’re progressing in the implementation process and turning that into what we call a nerve center, or a way we can actively monitor exceptions to the process.”

Athenahealth wanted a system that would collect information about every point in the implementation life cycle in order to easily find problem areas. For instance, clients route their fax machines to the Athenahealth system. If no faxes are coming in for a given customer, it could mean the customer hasn’t yet rerouted the fax number. Or, for a long-time customer, if the percentage of fax information coming in increases relative to electronic information, that could mean someone mistakenly changed a setting.
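
Those two checks can be expressed as simple rules over daily intake counts per client, as in the sketch below. The thresholds, field names and flag wording are assumptions made for illustration, not Athenahealth's actual nerve-center logic.

```python
# Two exception checks over daily document-intake counts per client.
from dataclasses import dataclass

@dataclass
class DailyIntake:
    client_id: str
    fax_docs: int
    electronic_docs: int
    is_new_implementation: bool

def exceptions(today: DailyIntake, fax_share_baseline: float) -> list[str]:
    flags = []
    # Rule 1: a client mid-implementation with zero inbound faxes may not
    # have rerouted its fax number yet.
    if today.is_new_implementation and today.fax_docs == 0:
        flags.append(f"{today.client_id}: no faxes received; fax number may not be rerouted")
    # Rule 2: for an established client, a fax share well above its baseline
    # may mean someone mistakenly changed a setting.
    total = today.fax_docs + today.electronic_docs
    if not today.is_new_implementation and total > 0:
        fax_share = today.fax_docs / total
        if fax_share > fax_share_baseline * 1.5:  # hypothetical threshold
            flags.append(
                f"{today.client_id}: fax share {fax_share:.0%} vs baseline {fax_share_baseline:.0%}"
            )
    return flags
```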

When Athenahealth started looking for a BI product that could meet its needs, it had a few additional requirements. The vendor “had to be able to move quickly because we had a fairly strict timeline, in the two- to three-month time frame, to deliver on this project,” Weinstein says.

The company also wanted a product that would meet its analytics needs going forward. “We wanted to invest in more of a platform, not just a one-time solution,” he says.

Source: Associated Press
