Tools

Published By: Databricks     Published Date: Sep 13, 2018
Learn how to get started with Apache Spark™. Apache Spark's ability to speed analytic applications by orders of magnitude, its versatility, and its ease of use are quickly winning the market. With Spark's appeal to developers, end users, and integrators solving complex data problems at scale, it is now the most active open source project in the big data community. With rapid adoption by enterprises across a wide range of industries, Spark has been deployed at massive scale, collectively processing multiple petabytes of data on clusters of over 8,000 nodes. If you are a developer or data scientist interested in big data, learn how Spark may be the tool for you. Databricks is happy to present this ebook as a practical introduction to Spark. Download this ebook to learn:
• Spark's basic architecture
• Why Spark is a popular choice for data analytics
• What tools and features are available
• How to get started right away through interactive sample code (a brief sketch follows this entry)
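To give a feel for that interactive sample code, here is a minimal PySpark sketch, assuming only a local Spark installation (pip install pyspark) and a hypothetical input file sample.txt:

    # Count word frequencies in a text file with PySpark.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("WordCount").getOrCreate()

    lines = spark.read.text("sample.txt")  # hypothetical input file
    words = lines.rdd.flatMap(lambda row: row.value.split())
    counts = words.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)

    for word, count in counts.takeOrdered(10, key=lambda pair: -pair[1]):
        print(word, count)

    spark.stop()

The same few lines run unchanged in a notebook against a large cluster, which is much of Spark's appeal to developers and data scientists alike.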
Published By: Syncsort     Published Date: Oct 30, 2018
In today's digital business world, accurate address and geolocation data is invaluable. Yet address capture is often overlooked from both a user and a business perspective. Clean address capture can enhance the user experience, not only while an address is being entered, but across the entire customer journey. Download this eBook to learn how global address validation tools can help (a small standardization sketch follows this entry):
• Simplify address capture on web-based forms
• Accurately pinpoint customer locations
• Improve efficiencies in shipping and logistics
• Define sales and marketing territories and enrich address information
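Commercial validation tools ship full postal dictionaries and geocoders, but the core standardization step can be sketched in a few lines of Python; the suffix table here is a hypothetical, abbreviated stand-in:

    import re

    # Hypothetical, abbreviated suffix table; real tools use complete postal reference data.
    SUFFIXES = {"street": "St", "avenue": "Ave", "boulevard": "Blvd", "road": "Rd"}

    def standardize(address: str) -> str:
        """Normalize whitespace and casing, then abbreviate common street suffixes."""
        tokens = re.sub(r"\s+", " ", address.strip()).title().split(" ")
        return " ".join(SUFFIXES.get(t.lower(), t) for t in tokens)

    print(standardize("  123  main STREET "))  # -> 123 Main St

A real service would go further, verifying the standardized address against postal reference data and attaching latitude/longitude for geolocation.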
Published By: Wave Computing     Published Date: Jul 06, 2018
This paper makes the case for using coarse grained reconfigurable array (CGRA) architectures for efficient acceleration of the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems, such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clock-less, CGRA-based systems-on-chip (SoCs), each containing 16,000 processing elements (PEs). This paper describes the tools needed for efficient compilation of data flow graphs to the CGRA architecture, and outlines Wave Computing's WaveFlow software (SW) framework for the online mapping of models from popular frameworks like TensorFlow, MXNet, and Caffe.
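The data flow graphs in question are compiled by Wave's toolchain, but the execution model itself is simple to illustrate: a node fires as soon as all of its inputs are available. A toy Python sketch (the graph and operations are invented for illustration):

    # Toy dataflow graph: each node lists its input nodes and an operation.
    graph = {
        "a":   ([], lambda: 2.0),
        "b":   ([], lambda: 3.0),
        "mul": (["a", "b"], lambda x, y: x * y),
        "add": (["mul", "b"], lambda x, y: x + y),
    }

    def execute(graph):
        """Fire each node once all of its inputs have produced values."""
        values, pending = {}, dict(graph)
        while pending:
            for name, (inputs, op) in list(pending.items()):
                if all(i in values for i in inputs):
                    values[name] = op(*(values[i] for i in inputs))
                    del pending[name]
        return values

    print(execute(graph)["add"])  # -> 9.0

A CGRA compiler performs this dependency analysis statically, assigning nodes to processing elements so the graph streams through the fabric without a central scheduler.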
Published By: Trillium Software     Published Date: Dec 17, 2015
Digital business and disruptive technologies continue to fuel solid growth in the data quality tools market, alongside traditional cost reduction and process optimization efforts. This Magic Quadrant will help CIOs, chief data officers and information leaders find the best vendor for their needs.
Published By: Information Asset, LLC     Published Date: Feb 11, 2014
An In-Depth Review of Data Governance Software Tools: Reference Architecture, Evaluation Criteria, and Vendor Landscape
Tags : white paper, data governance, data, data management, data management white paper, data governance white paper
Published By: Paxata     Published Date: Nov 29, 2016
Every organization looks for ways to reduce costs and run more efficiently. In fact, those are key drivers for the mainstream adoption of Hadoop and self-service BI tools. And while we can now collect and store more data than ever before, and we have turned every information worker into a data-hungry analyst, little consideration has been paid to the cost, including time and effort, of preparing data. Download this report to learn more about the hidden cost of data preparation.
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist, or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud (speed, scale, and agility) with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on premises and off
Tags : nosql, ibm, dashdb, database, cloud
Published By: Pentaho     Published Date: Nov 03, 2016
While a whole ecosystem of tools has sprung up around Hadoop to handle and analyze data, many of them specialize in just one part of a larger process. In order to fulfill the promise of Hadoop, organizations need to step back and take an end-to-end view of their analytic data pipelines.
Published By: AT&T     Published Date: Sep 11, 2014
The age of Big Data is upon us. Storage costs are going down, and data analytics is becoming more capable and more user-friendly. Even your auto mechanic will be storing a petabyte of data soon. Big Data will give businesses new insights and help improve operations. With these new tools come questions about how to use them. But your mechanic knows more about fixing a transmission than developing a Hadoop cluster, and similar concerns hold true for larger enterprises. Businesses everywhere are looking for guidance.
Published By: Melissa Data     Published Date: Mar 23, 2017
In this eBook published by Melissa, author David Loshin explores the challenges of determining when data values are or are not valid and correct, how those values can be corrected, and how data cleansing services can be integrated throughout the enterprise. This Data Quality Primer gives an overview of the five key aspects of data quality management (data cleansing, address data quality, address standardization, data enhancement, and record linkage/matching) and offers practical guidance for introducing proactive data quality management into your organization.
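Record linkage/matching, the last of the five aspects above, is easy to motivate with a toy example. Real matching engines use far more sophisticated phonetic and token-level scoring; a crude sketch with Python's standard library shows the idea:

    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        """Crude string similarity in [0, 1]; real engines use much richer rules."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    candidates = [("Jon Smith", "John Smith"), ("Acme Corp.", "ACME Corporation")]
    for a, b in candidates:
        verdict = "match" if similarity(a, b) > 0.8 else "review"
        print(f"{a!r} vs {b!r}: {similarity(a, b):.2f} -> {verdict}")

Linking records this way lets duplicate customer entries be merged before they pollute downstream analytics.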
Published By: Melissa Data     Published Date: Jan 18, 2018
Maintaining high-quality data is essential for operational efficiency, meaningful analytics, and good long-term customer relationships. But when dealing with multiple sources, data quality becomes complex, and you need to know when to build custom data quality tools rather than rely on canned solutions. Answering this question requires understanding the difference between rules-based data quality, where internal subject matter expertise is necessary, and active data quality, where different domain expertise and resources are required.
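Rules-based data quality encodes that internal subject matter expertise directly. A minimal sketch, with invented field names and rules:

    import re

    # Hypothetical rules written by internal subject matter experts.
    RULES = {
        "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
        "age":   lambda v: v.isdigit() and 0 < int(v) < 130,
    }

    def validate(record: dict) -> list:
        """Return the names of fields that violate their rule."""
        return [f for f, rule in RULES.items() if f in record and not rule(record[f])]

    print(validate({"email": "jane@example.com", "age": "34"}))  # -> []
    print(validate({"email": "not-an-email", "age": "200"}))     # -> ['email', 'age']

Active data quality, by contrast, validates against external reference data such as postal files or phone registries, which is why it demands different domain expertise and resources.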
Published By: Trillium Software     Published Date: Apr 10, 2017
For the 11th consecutive year, the Gartner Magic Quadrant for Data Quality Tools research report positions Trillium Software as a leader in the data quality software industry. Data quality is vital to ensuring trust in your data-driven decision-making processes. That confidence is the result of a well-thought-out and well-executed data quality management strategy, and it is critical to remaining competitive in a rapidly and constantly changing business world. The 2016 Gartner Magic Quadrant for Data Quality Tools report is a valuable reference, providing the latest insights into the strengths and cautions of leading vendors. Access the report to learn how a leading data quality solution can help you achieve your long-term strategic objectives.
Published By: Silwood Technology     Published Date: Mar 02, 2016
Ever since organisations started to implement packaged software solutions to solve business problems and streamline their processes, there has been a need to access the underlying data for reporting and analytics, integration, governance, master data, and more. Information management projects such as these rely on data professionals understanding the underlying data models of these packages so they can answer the critical question: "Where's the data?" Without this knowledge it is impossible to ensure data accuracy or timely project delivery. In addition, the lack of discovery tools designed to meet this challenge has made the task frustrating, time-consuming, and fraught with risk. This white paper offers insight into why traditional methods are not effective and how an innovative software product from Silwood Technology provides a faster and more effective approach to solving the problem.
Published By: Looker     Published Date: Mar 15, 2016
Data centralization merges different data streams into a common source through unified variables. This process can provide context to overly broad metrics and enable cross-platform analytics that guide better business decisions. Investments in analytics tools are now paying back a 13.01:1 return on investment (ROI), with increased returns when these tools integrate with three or more data sources. While the perks of centralization are obvious in theory, the quantity and variety of data available in today's landscape make it difficult to achieve. This report provides a roadmap for connecting systems, data stores, and institutions (both technological and human). Learn:
• How data centralization enables better analytics (a brief joining sketch follows this entry)
• How to redefine data as a vehicle for change
• How the right BI tool eliminates the data analyst bottleneck
• How to define single sources of truth for your organization
• How to build a data-driven (not just data-rich) organization
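Unified variables are the hinge of centralization: every stream must share a key. A minimal sketch with pandas, where the stream contents and column names are invented for illustration:

    import pandas as pd

    # Two hypothetical data streams sharing one unified variable: customer_id.
    crm = pd.DataFrame({"customer_id": [1, 2, 3], "segment": ["smb", "ent", "smb"]})
    web = pd.DataFrame({"customer_id": [1, 2, 2], "page_views": [5, 3, 7]})

    # Centralize: aggregate each stream to the shared key, then join.
    views = web.groupby("customer_id", as_index=False)["page_views"].sum()
    central = crm.merge(views, on="customer_id", how="left").fillna({"page_views": 0})

    print(central)  # one row per customer, combining fields from both streams

Once every stream is keyed the same way, a BI tool can define a single source of truth instead of reconciling ad hoc extracts.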
Published By: Alation     Published Date: Mar 15, 2016
curation (noun): The act of organizing and maintaining a collection (such as artworks, artifacts, or data). Data curation is emerging as a technique to support data governance, especially in data-driven organizations. As self-service data visualization tools have taken off, sharing the nuances and best practices of how to use data has become ever more critical. Analysts at companies from eBay to Safeway and Square are scaling their data knowledge through curation techniques. What are the 4 steps to successful data curation? Find out in this paper.
Tags : data stewardship, self-service analytics, data curation, data governance
Published By: WhereScape     Published Date: Mar 16, 2016
Industry expert Wayne Eckerson provides an overview of the emerging data warehouse automation market and outlines the value of using automation tools for developing data warehouses, data marts, analytical environments, and big data platforms. Eckerson details WhereScape's architecture, which enables a data-driven approach to automation, and discusses how agility and automation together encourage iterative development and closer collaboration between business and IT.
Published By: ROKITT     Published Date: Apr 11, 2016
Few things benefit an organization as much as information governance. Data is now one of the most valuable holdings of any business, but in many environments much of that data is ignored and its potential value lost. Ignored data is also inherently less secure than data that is tracked. Businesses need a way to bring hidden data out of the shadows and make it safe and useful again. Data discovery facilitates unearthing previously unknown data relationships. Mapping data flow and data lineage helps make data safe, compliant, and auditable. Good metadata makes a system more navigable. Together, these tools make data more accessible to staff and more useful for capitalizing on business opportunities.
Published By: Snowflake Computing     Published Date: Apr 19, 2016
Data warehouse as a service brings scalability and flexibility to organizations seeking to deliver data to all users and systems that need to analyze it. The ability to access and analyze data is the critical foundational element for competing in new and old industries alike. Yet, a recent survey of IT executives finds that most are still struggling, and frustrated, with widely used data analytics tools. Find out what your peers are saying, and see how your data analytics environment compares.
Published By: Alteryx     Published Date: May 24, 2017
Spreadsheets are a mainstay in almost every organization. They are a great way to calculate and manipulate numeric data to make decisions. Unfortunately, as organizations grow, so does the data, and relying on spreadsheet-based tools like Excel for heavy data preparation, blending, and analysis can be cumbersome and unreliable. Alteryx, Inc. is a leader in self-service data analytics and provides analysts with the ability to easily prep, blend, and analyze all data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. This paper highlights how transitioning from a spreadsheet-based environment to an Alteryx workflow approach can help analysts better understand their data, improve consistency, and operationalize analytics through a flexible deployment and consumption environment.
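The gap between ad hoc spreadsheet edits and a repeatable workflow is easy to see in code. Alteryx expresses workflows visually; the same idea is sketched below in pandas, with file names and columns invented for illustration:

    import pandas as pd

    def prep_sales(path: str) -> pd.DataFrame:
        """Repeatable prep: identical steps run on every data refresh."""
        df = pd.read_csv(path)  # hypothetical input file
        df.columns = [c.strip().lower() for c in df.columns]
        df = df.dropna(subset=["revenue"])
        df["revenue"] = df["revenue"].astype(float)
        return df

    def blend(sales: pd.DataFrame, regions: pd.DataFrame) -> pd.DataFrame:
        """Blend two sources on a shared key, then summarize by region."""
        merged = sales.merge(regions, on="region_id", how="left")
        return merged.groupby("region_name", as_index=False)["revenue"].sum()

    # report = blend(prep_sales("sales.csv"), pd.read_csv("regions.csv"))

Unlike a hand-edited spreadsheet, the workflow documents itself and produces the same output every time it runs, which is what makes analytics operational rather than one-off.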
Published By: Syncsort     Published Date: Jan 04, 2018
The term Big Data doesn’t seem quite “big enough” anymore to properly describe the vast over-abundance of data available to organizations today. As the volume and variety of Big Data sources continue to grow, the level of trust in that data remains troublingly low. Read on and discover how a strong focus on data quality spanning the people, processes and technology of your organization will help keep your data lake pristine.
Published By: Octopai     Published Date: Sep 01, 2018
For many BI professionals, every task can feel like MISSION IMPOSSIBLE. All the manual mapping required to sort out inconsistencies in data, and the lack of tools to simplify and shorten the process of finding and understanding data, leave BI groups frustrated and slow down the business. This whitepaper examines the revolutionary impact of automation on the cumbersome manual processes that have been dragging BI down for so long:
• Data correction vs. process correction
• Root-cause analysis with data lineage: reverse-tracing the data flow (a small sketch follows this entry)
• Data quality rules and data controls
• Automated data lineage mapping
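Reverse-tracing a data flow amounts to walking a lineage graph backwards from the artifact that looks wrong. A minimal sketch with a hypothetical lineage map, using only the Python standard library:

    # Hypothetical lineage map: each artifact lists what it was derived from.
    LINEAGE = {
        "dashboard.kpi": ["mart.sales"],
        "mart.sales": ["staging.orders", "staging.fx_rates"],
        "staging.orders": ["source.crm"],
        "staging.fx_rates": ["source.treasury_feed"],
    }

    def trace_back(artifact: str) -> set:
        """Return every upstream artifact that could explain a bad value."""
        upstream = set()
        stack = list(LINEAGE.get(artifact, []))
        while stack:
            node = stack.pop()
            if node not in upstream:
                upstream.add(node)
                stack.extend(LINEAGE.get(node, []))
        return upstream

    print(sorted(trace_back("dashboard.kpi")))

Automated lineage mapping builds this graph for you; root-cause analysis then becomes a traversal rather than days of manual archaeology.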
Published By: Ataccama     Published Date: Mar 16, 2018
Discover how a major healthcare provider and facility operator in the United States implemented Ataccama ONE data quality tools to scale their data quality across the enterprise.
Published By: Embarcadero     Published Date: Apr 02, 2014
IT professionals in organizations developing an enterprise data modeling program may feel overwhelmed at the scope and complexity of initiating new methods, tools, and techniques. Whether their organization is just starting out or experienced in enterprise data modeling efforts, there are certain pitfalls that can become obstacles to success. This paper looks at the benefits of an effective enterprise data modeling effort and discusses seven common mistakes that organizations can make in developing enterprise data models.
Published By: Google     Published Date: Oct 26, 2018
To remain competitive, executives must embrace change and equip employees to do the same. Conrad Electric, a 95-year-old online retail company, took just 4 months to migrate to G Suite, providing 4,000 of their employees with more transparent and agile collaboration capabilities. Read this Google Cloud report and get insights into creating a culture where people, combined with productivity tools, can reach their full potential.