AtScale customer successes point to the critical capabilities required to make BI on Big Data work
San Mateo, California, June 6, 2017 — AtScale, the only company to provide enterprises with a universal semantic platform for BI on Big Data, today unveils AtScale 5.5, the latest release of its flagship, patented software solution.
This release is bound to set the standard for enterprise BI on Big Data. The companies deploying AtScale software are all large organizations with sophisticated requirements, and their work has been credited with advancing the field of Big Data across many industries: financial services, healthcare, insurance, telecommunications, media and entertainment, and retail. For instance, Macy’s was recognized with the Ventana Research Business Technology Leadership Award, and Yellow Pages received the Cloudera Data Platform Optimization award. Next week, Home Depot and GoDaddy will be featured in keynotes and sessions at the upcoming Hortonworks DataWorks show (June 13-16).
“AtScale was early to spot the opportunity to bring high performance interactive analytics to big data platforms,” says Matt Aslett, Research Director, Data Platforms and Analytics, 451 Research. “The company’s expanded vision, driven by the best practices learnt from customer successes, broadens its applicability and addressable audience.”
To get access to AtScale’s latest platform, go to atscaleincstg.wpengine.com/try
__Making Data Lakes Work__
The adoption of data lakes over the past few years shows that enterprises need a way to store vast amounts of raw data in its native format until it is needed for consumption. Data platforms like Hadoop and Google BigQuery have given enterprises affordable ways to store that data. However, when the data is not put to use, data lakes become data swamps.
According to Gartner’s August 2016 Magic Quadrant for Metadata Management Solutions report, “Through 2018, 80% of data lakes will not include effective metadata management capabilities, making them inefficient.” AtScale provides a modern approach to the enterprise data lake by adding the critical capabilities required to make it work.
“Historically, data stored in data lakes goes unused because the organization has not figured out a way to match the performance, security or business tool integration they created with their legacy system,” says Matt Baird, CTO and co-founder at AtScale. “When they deploy AtScale, customers not only get better performance than traditional systems, they can do it on unlimited data with a seamless end-user experience.”
Consider the story of a major healthcare provider that was spending tens of millions of dollars to run fraud analysis on data stored in a traditional MPP platform. When the time came to upgrade its hardware to match its data needs, the only option was to migrate to Hadoop. There was one problem: while the cost of the new data platform was significantly lower than the old MPP, performance was abysmal and the provider’s traditional BI tools couldn’t interface with Hadoop’s unstructured data model. AtScale’s universal layer was deployed between users and their data, improving query performance 35X over the old MPP option and integrating smoothly with employees’ existing BI tools.
The AtScale architecture is the industry’s only patented solution optimized to take advantage of any BI tool (MDX- or SQL-based) and to plug and play, non-intrusively, into any enterprise data lake, whether it is on premises or in the cloud.
__From “Lock-In” to “Lift and Shift”__
Beyond the need to make their data lakes perform like traditional MPP databases on limitless data, enterprises are also looking to mitigate their Big Data risks across on-premises and cloud deployments. Enterprises that have invested heavily in one particular data platform delivery mode are wary of “lock-in”: an over-investment in one vendor or one type of technology that can reduce an enterprise’s agility and competitive advantage.
AtScale’s unique architecture is the only solution that allows customers to create one semantic layer that works on an on-premises cluster and can be transferred as-is to a completely different environment such as Google BigQuery.
Consider the experience of another AtScale customer, a large US-based retailer. Its thousands of store managers live by their daily inventory reports. These reports, run in Microsoft Excel, contain sophisticated calculations that span a long historical period yet must be refreshed in seconds so managers can make timely decisions. The organization’s IT department had spent hundreds of millions of dollars on traditional data warehouse software and hardware to guarantee that performance, until it realized the potential of Hadoop and the cloud. In less than six months, the retailer migrated its data from a traditional MPP to Hadoop, and then to the cloud, saving millions of dollars. It future-proofed its data investment with a smart “lift and shift”, performed without store managers missing a beat.
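To make the “define once, move anywhere” idea concrete, here is a minimal Python sketch of a logical model being re-targeted at different SQL engines. It is purely illustrative: the model structure, backend names and build_query helper are hypothetical and do not represent AtScale’s actual API or configuration format.

```python
# Illustrative sketch only -- not AtScale's API or configuration format.
# One logical model (dimensions and measures) is defined once; only the
# physical table reference changes when the data moves to a new engine.

MODEL = {
    "dimensions": ["store_id", "sale_date"],
    "measures": {"total_revenue": "SUM(revenue)", "order_count": "COUNT(*)"},
}

# Hypothetical physical locations of the same data on different platforms.
BACKENDS = {
    "on_prem_hadoop": "warehouse.sales",
    "google_bigquery": "`retail-project.warehouse.sales`",
}

def build_query(measure: str, dimension: str, backend: str) -> str:
    """Generate the same business aggregate against the chosen backend."""
    return (
        f"SELECT {dimension}, {MODEL['measures'][measure]} AS {measure} "
        f"FROM {BACKENDS[backend]} GROUP BY {dimension}"
    )

# The business definition of "total revenue by store" is unchanged after a
# lift-and-shift from the on-premises cluster to BigQuery.
print(build_query("total_revenue", "store_id", "on_prem_hadoop"))
print(build_query("total_revenue", "store_id", "google_bigquery"))
```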
__Data’s Lingua Franca: The Universal Semantic Layer__
As enterprises evaluate their options for optimizing or offloading their existing data platforms, they quickly realize that performance is important, but it’s not the only thing that matters.
Consider the story of a large insurance provider with tens of millions of members that processes hundreds of millions of claims a year. When assessing the efficacy of the healthcare services it reimburses, the insurance industry uses a key metric called “PMPM”, or “Per Member Per Month”: the cost of services delivered in a month divided by the number of members in that month. This metric, though seemingly basic, is hard to get right when membership fluctuates, when service costs vary, and when the tools used to compute and analyze the numbers range from Excel spreadsheets to Tableau reports to custom-built applications.
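For clarity, the following is a minimal sketch of that arithmetic in Python; the figures are invented for illustration and the code does not represent any customer’s actual pipeline.

```python
# A minimal sketch of the PMPM calculation described above: the cost of
# reimbursed services in a month divided by the number of members in that
# month. All figures below are made up for illustration.

monthly_claims_cost = {"2017-01": 42_500_000.00, "2017-02": 39_800_000.00}
monthly_member_count = {"2017-01": 10_200_000, "2017-02": 10_050_000}  # membership fluctuates

def pmpm(month: str) -> float:
    """Per Member Per Month: total service cost / member count for a month."""
    return monthly_claims_cost[month] / monthly_member_count[month]

for month in sorted(monthly_claims_cost):
    print(f"{month}: PMPM = ${pmpm(month):,.2f}")
```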
“Without the ability to define key metrics in one place, secure them centrally yet make them accessible everywhere, enterprises can’t run their businesses the way they need to,” says Baird. “In the end, it’s not about technology. Our customers want to break beyond the limitations of their outdated infrastructure. They want to reduce cost but they also want to give power and freedom to their users. That’s exactly why we invented AtScale.”
__Learn More:__
Read AtScale’s blog post about the release here.
To find out more about how AtScale works, simply go to atscaleincstg.wpengine.com/try
__Gartner Disclaimer:__
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
__About AtScale__
AtScale makes BI work on Big Data. With AtScale, business users get interactive and multi-dimensional analysis capabilities, directly on Big Data, at maximum speed, using the tools they already know, own and love – from Microsoft Excel to Tableau Software to QlikView. Built by Big Data veterans from Yahoo!, Google and Oracle, AtScale is already enabling the BI on Big Data revolution at major corporations across healthcare, telecommunications, retail and online industries. To see how AtScale can help your company, go to atscaleincstg.wpengine.com/try