
Results 1 - 25 of 2073
Published By: Melissa Data     Published Date: Jan 18, 2018
Maintaining high-quality data is essential for operational efficiency, meaningful analytics and good long-term customer relationships. But when dealing with multiple sources of data, data quality becomes complex, so you need to know when you should build custom data quality tools rather than rely on canned solutions. To answer this question, it is important to understand the difference between rules-based data quality, where internal subject matter expertise is necessary, and active data quality, where different domain expertise and resources are required.
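As a loose illustration of the rules-based style described above (the field names and rules here are hypothetical, not Melissa Data's), a minimal sketch in Python:

```python
# Rules-based data quality sketch: each rule encodes internal subject
# matter expertise about what a valid field value looks like.
# Field names and patterns are illustrative assumptions only.
import re

RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "postal_code": lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v or "")),
}

def validate(record):
    """Return the names of the fields that fail their rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

print(validate({"email": "a@b.com", "postal_code": "90210"}))       # []
print(validate({"email": "not-an-email", "postal_code": "90210"}))  # ['email']
```

Canned solutions ship such rules prebuilt; the build-vs-buy question in the paper is essentially about who maintains this rule set.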
Tags : 
    
Melissa Data
Published By: Ted Hills     Published Date: Mar 08, 2017
This paper explores the differences between three situations that appear on the surface to be very similar: a data attribute that may occur zero or one times, a data attribute that is optional, and a data attribute whose value may be unknown. It shows how each of these different situations is represented in Concept and Object Modeling Notation (COMN, pronounced “common”). The theory behind the analysis is explained in greater detail by three papers: Three-Valued Logic, A Systematic Solution to Handling Unknown Data in Databases, and An Approach to Representing Non-Applicable Data in Relational Databases.
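The three-valued logic those papers build on can be sketched in a few lines, with UNKNOWN modeled as Python's None (a toy illustration of Kleene/SQL-style logic, not COMN notation itself):

```python
# Three-valued logic (TRUE / FALSE / UNKNOWN), with UNKNOWN as None.
# FALSE dominates AND; TRUE dominates OR; otherwise UNKNOWN propagates.
def and3(a, b):
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def or3(a, b):
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

def not3(a):
    return None if a is None else not a

print(and3(True, None))   # None (UNKNOWN)
print(or3(True, None))    # True
```

This is why "a value that may be unknown" behaves differently from "an attribute that is optional": an unknown value still participates in logic, yielding UNKNOWN rather than FALSE.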
Tags : 
    
Ted Hills
Published By: Ted Hills     Published Date: Mar 08, 2017
Ever since Codd introduced so-called “null values” to the relational model, there have been debates about exactly what they mean and their proper handling in relational databases. In this paper I examine the meaning of tuples and relations containing “null values”. For the type of “null value” representing that data are not applicable, I propose an interpretation and a solution that is more rigorously defined than the SQL NULL or other similar solutions, and which can be implemented in a systematic and application-independent manner in database management systems.
Tags : 
    
Ted Hills
Published By: Converseon     Published Date: Apr 02, 2018
Separating signals from noisy social listening data has long been a problem for data scientists. Poor precision due to slang, sarcasm and implicit meaning has often made this data too challenging to model effectively. Today, however, new approaches that leverage active machine learning are rapidly overtaking aging rules-based techniques and opening up use of this data in new and important ways. This paper details the evolution of text analysis, including current best practices and how AI can help data scientists put this data to meaningful analytical use.
Tags : 
    
Converseon
Published By: Zoomdata     Published Date: Apr 11, 2018
How do you imagine data? If you’re thinking about it in terms of uniform records and databases, it’s time for an update. Back in the day, analytical engines were limited, so our perception of what could be considered data was, too. Today, the big data renaissance has begun: more data now exists outside of databases than inside them, and EVERYTHING is data. We’re going to help you discover how business intelligence and data sources have changed, and how our approach to analyzing data has changed as a result. An eBook on this very topic is waiting for you—all it takes is the click of a button.
Tags : 
    
Zoomdata
Published By: DATAVERSITY     Published Date: Nov 05, 2014
Ask any CEO if they want to better leverage their data assets to drive growth, revenues, and productivity, and their answer will most likely be “yes, of course.” Ask many of them what that means or how they will do it, and their answers will be as disparate as most enterprises’ data strategies. To successfully control, utilize, analyze, and store the vast amounts of data flowing through organizations today, an enterprise-wide approach is necessary. The Chief Data Officer (CDO) is the newest member of the executive suite in many organizations worldwide. Their task is to develop and implement the strategies needed to harness the value of an enterprise’s data, while working alongside the CEO, CIO, CTO, and other executives. They are the vital “data” bridge between business and IT. This paper is sponsored by: Paxata and CA Technologies
Tags : 
chief data officer, cdo, data, data management, research paper, dataversity
    
DATAVERSITY
Published By: First San Francisco Partners     Published Date: Oct 29, 2015
One of the biggest challenges in a data management initiative is aligning different and sometimes competing organizations to work towards the same long-term vision. That is why a proactive approach to aligning the organization around a common goal and plan is critical when launching a data management program.
Tags : 
    
First San Francisco Partners
Published By: Couchbase     Published Date: Jul 15, 2013
NoSQL database technology is increasingly chosen as a viable alternative to relational databases, particularly for interactive web applications. Developers accustomed to RDBMS structures and data models need to change their approach when transitioning to NoSQL. Download this white paper to learn about the main challenges that motivate the need for NoSQL, the differences between relational databases and distributed document-oriented databases, the key steps to perform document modeling in NoSQL databases, and how to handle concurrency, scaling and multiple-place updates in a non-relational database.
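As a hedged sketch of the relational-to-document modeling shift the paper describes (the order schema here is invented for illustration, not taken from Couchbase's examples):

```python
# Relational modeling: an order is normalized across two tables,
# and reassembling it requires a join on order_id.
order_rows = [
    {"order_id": 1, "customer_id": 42},
]
line_item_rows = [
    {"order_id": 1, "sku": "A-100", "qty": 2},
    {"order_id": 1, "sku": "B-200", "qty": 1},
]

# Document modeling: the order and its line items travel together
# as one self-contained document, read and written as a unit.
order_doc = {
    "_id": "order::1",
    "customer_id": 42,
    "line_items": [
        {"sku": "A-100", "qty": 2},
        {"sku": "B-200", "qty": 1},
    ],
}

def total_items(doc):
    """Sum quantities without any join: the data is already embedded."""
    return sum(item["qty"] for item in doc["line_items"])

print(total_items(order_doc))  # 3
```

The embed-vs-reference decision is the core of document modeling: embedding suits data read together, while referencing (as the relational rows do) suits data shared across documents or updated in many places.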
Tags : 
white paper, database, nosql, couchbase
    
Couchbase
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It’s increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved running on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
Tags : 
database, nosql, data, data management, white paper, why nosql, couchbase
    
Couchbase
Published By: Adaptive     Published Date: May 10, 2017
Enterprise metadata management and data quality management are two important pillars of successful enterprise data management for any organization. A well implemented enterprise metadata management platform can enable a successful data quality management at the enterprise level. This paper describes in detail an approach to integrate data quality and metadata management leveraging the Adaptive Metadata Manager platform. It explains the various levels of integrations and the benefits associated with each.
Tags : 
    
Adaptive
Published By: CA Technologies     Published Date: Apr 24, 2013
This white paper by industry expert Alec Sharp illustrates these points and provides specific guidelines and techniques for a business-oriented approach to data modeling, with examples demonstrating how business professionals can contribute directly to the modeling effort.
Tags : 
white paper, ca technologies, erwin, data, data management, data modeling, dataversity
    
CA Technologies
Published By: Cambridge Semantics     Published Date: Mar 13, 2015
As the quantity and diversity of relevant data grows within and outside the enterprise, how can IT easily deploy secure, governed solutions that allow business users to identify, extract, link together and derive value from the right data at the right time, at big data scale, while keeping up with ever-changing business needs? Smart Enterprise Data Management (Smart EDM) is a new, sensible paradigm for managing enterprise data. Anzo Smart Data solutions allow IT departments and their business users to quickly and flexibly access all of their diverse data. Based upon graph data models and Semantic data standards, Anzo enables users to easily perform advanced data management and analytics through the lens of their business at a fraction of the time and cost of traditional approaches, while adhering to the governance and security required by enterprise IT groups. Download this whitepaper to learn more.
Tags : 
enterprise data management, data governance, data integration, cambridge semantics
    
Cambridge Semantics
Published By: Cambridge Semantics     Published Date: Aug 17, 2015
As the quantity and diversity of relevant data grows within and outside of the enterprise, business users and IT are struggling to extract maximum value from this data. Current approaches, including the rigid relational data warehouse and the unwieldy Hadoop-only Data Lake, are limited in their ability to provide users and IT with the answers they need with the proper governance and security required. Read this whitepaper to learn how The Anzo Smart Data Lake from Cambridge Semantics solves these problems by disrupting the way IT and business alike manage and analyze data at enterprise scale with unprecedented flexibility, insight and speed.
Tags : 
    
Cambridge Semantics
Published By: Aerospike     Published Date: Jul 17, 2014
This whitepaper provides a brief technical overview of ACID support in Aerospike. It includes a definition of ACID (Atomicity, Consistency, Isolation, Durability) and an overview of the CAP Theorem, which postulates that only two of the three properties of consistency, availability, and partition tolerance can be guaranteed in a distributed system at a specific time. Although Aerospike is an AP system with a proven track record of 100% uptime, this paper describes Aerospike's unique approach to avoiding network partitioning to also ensure high consistency. In addition, the paper describes how Aerospike will give users the option to support a CP configuration with complete consistency in the presence of network partitioning by reducing availability.
Tags : 
whitepaper, data, data management, nosql, aerospike, acid
    
Aerospike
Published By: CMMI Institute     Published Date: Sep 03, 2014
To drive strategic insights that lead to competitive advantage, businesses must make the best and smartest use of today’s vast amount of data. To accomplish this, organizations need to apply a collaborative approach to optimizing their data assets. For organizations that seek to evaluate and improve their data management practices, CMMI® Institute has developed the Data Management Maturity (DMM) model to bridge the perspective gap between business and IT. Download the white paper Why is Measurement of Data Management Maturity Important? to enable you to:
- Empower your executives to make better and faster decisions using a strategic view of their data
- Achieve the elusive alignment and agreement between the business and IT
- Create a clear path to increasing capabilities
Tags : 
white paper, enterprise data management, data model, data modeling, data maturity model, cmmi institute
    
CMMI Institute
Published By: Melissa Data     Published Date: Oct 27, 2014
Noted SQL Server MVP and founder/editor of SSWUG.org Stephen Wynkoop shares his take on the challenge of achieving quality data and the importance of the “Golden Record” to an effective data quality regimen. Wynkoop explores the different approaches to achieving the Golden Record – which involves collapsing duplicate records into a single version of the truth, the single customer view (SCV) – and Melissa Data’s unique approach, which takes into consideration the actual quality of the contact data as the basis of survivorship.
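A greatly simplified sketch of survivorship: Melissa Data's approach weighs the actual quality of the contact data, but even a plain completeness score conveys the idea of collapsing duplicates into one surviving record (the records below are made up):

```python
# Golden-record survivorship sketch: among duplicate contact records,
# keep the one with the highest completeness score. A real regimen
# would score field-level quality, not just presence of a value.
duplicates = [
    {"name": "Pat Lee", "email": None,              "phone": "555-0100"},
    {"name": "Pat Lee", "email": "pat@example.com", "phone": None},
    {"name": "P. Lee",  "email": "pat@example.com", "phone": "555-0100"},
]

def completeness(record):
    """Count the populated fields in a record."""
    return sum(1 for v in record.values() if v)

golden = max(duplicates, key=completeness)
print(golden)  # the third record survives: all three fields populated
```

Field-level survivorship (taking the best email from one record and the best phone from another) is a common refinement of this record-level approach.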
Tags : 
data, data management, melissa data, data quality
    
Melissa Data
Published By: iCEDQ     Published Date: Feb 05, 2015
The demand for using data as an asset has grown to a level where data-centric applications are now the norm in enterprises. Yet data-centric applications fall short of user expectations at a high rate. Part of this is due to inadequate quality assurance. This in turn arises from trying to develop data-centric projects using the old paradigm of the SDLC, which came into existence during an age of process automation. SDLC does not fit with data-centric projects and cannot address the QA needs of these projects. Instead, a new approach is needed where analysts develop business rules to test atomic items of data quality. These rules have to be run in an automated fashion in a business rules engine. Additionally, QA has to be carried past the point of application implementation and support the running of the production environment.
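The atomic, rules-driven testing described above might be sketched like this (the rule names, rows, and engine are hypothetical illustrations, not the iCEDQ product):

```python
# Business-rules engine sketch: each rule tests one atomic aspect of
# data quality, and the engine runs all rules over all rows, so the
# same checks can run automatically in production, not just in QA.
rows = [
    {"order_id": 1, "amount": 250.0, "currency": "USD"},
    {"order_id": 2, "amount": -5.0,  "currency": "USD"},
]

rules = [
    ("amount_non_negative", lambda r: r["amount"] >= 0),
    ("currency_present",    lambda r: bool(r.get("currency"))),
]

failures = [
    (r["order_id"], name)
    for r in rows
    for name, rule in rules
    if not rule(r)
]
print(failures)  # [(2, 'amount_non_negative')]
```

The contrast with SDLC-style testing is that these rules are data assertions that keep running against live loads, rather than one-off tests of application code.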
Tags : 
data, data management, data warehousing, data quality, etl testing, malcolm chisholm
    
iCEDQ
Published By: VoltDB     Published Date: Jul 09, 2015
What is fast data? It's data in motion, and it creates Big Data. But handling it requires a radically different approach. Download the Fast Data Stack white paper from VoltDB. Learn how to build fast data applications with an in-memory solution that’s powerful enough for real-time stateful operations.
Tags : 
data, data management, data stack, big data, voltdb, database, nosql
    
VoltDB
Published By: VoltDB     Published Date: Feb 12, 2016
The need for fast data applications is growing rapidly, driven by the IoT, the surge in machine-to-machine (M2M) data, global mobile device proliferation, and the monetization of SaaS platforms. So how do you combine real-time, streaming analytics with real-time decisions in an architecture that’s reliable, scalable, and simple? In this report, Ryan Betts and John Hugg from VoltDB examine ways to develop apps for fast data, using pre-defined patterns. These patterns are general enough to suit both the do-it-yourself, hybrid batch/streaming approach, as well as the simpler, proven in-memory approach available with certain fast database offerings.
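One of the simplest fast-data patterns, per-key aggregation over an event stream held in memory, can be sketched as follows (the event shape and threshold logic are assumed for illustration, not taken from the report):

```python
# In-memory streaming sketch: maintain per-key running counts as
# events arrive, so a real-time decision can be made on each event
# instead of waiting for a batch job over the accumulated big data.
from collections import defaultdict

counts = defaultdict(int)

def ingest(event):
    """Update state for one event and return the new per-key count."""
    counts[event["device_id"]] += 1
    # a real-time decision could hang here, e.g. alert when a
    # device exceeds some threshold within the window
    return counts[event["device_id"]]

for e in [{"device_id": "d1"}, {"device_id": "d2"}, {"device_id": "d1"}]:
    ingest(e)

print(dict(counts))  # {'d1': 2, 'd2': 1}
```

The hybrid batch/streaming and in-memory approaches the report compares differ mainly in where this state lives and how durably it is kept.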
Tags : 
    
VoltDB
Published By: Silwood Technology     Published Date: Mar 02, 2016
Ever since organisations started to implement packaged software solutions to solve business problems and streamline their processes there has been a need to access their data for the purposes of reporting and analytics, integration, governance, master data and more. Information Management projects such as these rely on data professionals being able to understand the underlying data models for these packages in order to be able to answer the critical question “Where’s the data?”. Without this knowledge it is impossible to ensure accuracy of data or timely delivery of projects. In addition the lack of discovery tools designed to meet this challenge has meant that performing this task has commonly been frustrating, time-consuming and fraught with risk. This white paper offers insight into why the traditional methods are not effective and how an innovative software product from Silwood Technology provides a faster and more effective approach to solving the problem.
Tags : 
    
Silwood Technology
Published By: WhereScape     Published Date: Mar 16, 2016
Industry expert Wayne Eckerson provides an overview of the emerging data warehouse automation market and outlines the value of using automation tools for developing data warehouses, data marts, analytical environments and big data platforms. Eckerson details WhereScape’s architecture—which enables a data-driven approach to automation. Eckerson also discusses how agility and automation together encourage iterative development and closer collaboration between business and IT.
Tags : 
    
WhereScape
Published By: WhereScape     Published Date: Aug 18, 2016
Data Vault 2.0 leverages parallel database processing for large data sets and provides an extensible approach to design that enables agile development. WhereScape provides data warehouse automation software solutions that enable Data Vault agile project delivery through accelerated development, documentation and deployment without sacrificing quality or flexibility.
Tags : 
    
WhereScape
Published By: T4G     Published Date: Mar 15, 2017
About to embark on an advanced analytics project? Or have you already started, and things aren’t going as planned? This white paper will share our approach to ensure you set yourself up for success. We will discuss aspects of data strategy and data stewardship, and why they are so important. We will outline the benefits of having a solid data strategy and how to start the data strategy conversation within your organization. We will then outline how a project’s entry point (the initial impetus for the project) impacts the scope and approach for the project, and show you how to avoid missteps and gaps that can lead to less-than-stellar results or wasted effort. The white paper will touch on the importance of understanding your business drivers and how to use them as your guide to get the most out of your data-driven decision-making projects. Our proven approach will help ensure a successful start on your advanced analytics journey.
Tags : 
    
T4G
Published By: Alteryx     Published Date: May 24, 2017
Spreadsheets are a mainstay in almost every organization. They are a great way to calculate and manipulate numeric data to make decisions. Unfortunately, as organizations grow, so does their data, and relying on spreadsheet-based tools like Excel for heavy data preparation, blending and analysis can be cumbersome and unreliable. Alteryx, Inc. is a leader in self-service data analytics and provides analysts with the ability to easily prep, blend, and analyze all data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. This paper highlights how transitioning from a spreadsheet-based environment to an Alteryx workflow approach can help analysts better understand their data, improve consistency, and operationalize analytics through a flexible deployment and consumption environment.
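To illustrate the kind of repeatable prep-and-blend step that outgrows a spreadsheet (this is a generic sketch, not Alteryx's API; the datasets are invented):

```python
# Blend two small datasets by key in a scripted, repeatable step,
# the workflow equivalent of a spreadsheet VLOOKUP that must be
# redone by hand each time new data arrives.
customers = {1: "Acme", 2: "Globex"}
sales = [
    {"customer_id": 1, "amount": 100},
    {"customer_id": 2, "amount": 50},
]

blended = [
    {**s, "customer": customers[s["customer_id"]]}
    for s in sales
]
print(blended)
```

Because the step is code rather than cell formulas, it runs identically on ten rows or ten million, which is the consistency argument the paper makes.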
Tags : 
    
Alteryx
Published By: TopQuadrant     Published Date: Jul 18, 2016
With information streaming in from more varied sources and at a faster pace than ever before, organizations are having an increasingly difficult time deriving accurate meaning from their data. Data governance systems that were once able to organize and process enterprise information are becoming too slow and limited.   Semantic information management makes it easier to reconcile data from different sources by compiling and organizing information about that data, its metadata. By connecting all kinds of data and metadata in a more accessible way, semantic information systems empower users, data stewards and analysts to unlock and use the true meaning and value of their organization’s data.     Learn more about the challenges in the evolving data landscape and how a semantic approach can help.
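A toy sketch of the semantic idea: when metadata is expressed as triples, differently named source fields can be linked to one shared business meaning (the subjects and predicates below are illustrative, not TopQuadrant's model):

```python
# Semantic metadata sketch: triples connect physical data elements
# to a common business term, so users can find every source that
# carries the same meaning regardless of its local column name.
triples = [
    ("crm.cust_nm",     "mappedTo", "BusinessTerm:CustomerName"),
    ("erp.client_name", "mappedTo", "BusinessTerm:CustomerName"),
]

def sources_for(term):
    """Return all physical elements mapped to a business term."""
    return [s for s, p, o in triples if p == "mappedTo" and o == term]

print(sources_for("BusinessTerm:CustomerName"))
```

Real semantic systems express such links in RDF/OWL, which also lets them infer new connections rather than only look up asserted ones.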
Tags : 
    
TopQuadrant