
Results 1 - 25 of 37,345
Published By: Syncsort     Published Date: Feb 21, 2019
With data lakes offering the means to capture, store, and utilize a broad array of internal, third-party, and external data from a variety of sources, organizations of all types are positioned to gain deeper and more reliable insights. While this promise of Big Data and improved visibility is substantial, data is of little use if it can’t be trusted. The only way to be certain that your data governance policies are consistently followed and enforced is to ensure data quality across your IT systems. Download this white paper, produced with Information Management, Discover the Value of Data Quality for Data Governance Success, to learn more about empowering your data governance program with quality data. A short illustrative sketch of rule-based quality checks follows this listing.
Tags : 
    
Syncsort
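To make the data quality idea concrete, here is a minimal sketch, in Python, of the kind of rule-based checks (completeness, format validity) that underpin the governance-through-quality argument above. The records, field names, and rules are hypothetical illustrations, not taken from the Syncsort paper.

```python
import re

# Hypothetical customer records pulled from one of many source systems.
records = [
    {"id": 1, "email": "ana@example.com", "country": "US"},
    {"id": 2, "email": "not-an-email",    "country": ""},
    {"id": 3, "email": None,              "country": "DE"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_report(rows):
    """Score each record against two illustrative rules:
    completeness (no missing fields) and email validity."""
    report = []
    for row in rows:
        issues = []
        if any(v in (None, "") for v in row.values()):
            issues.append("incomplete")
        if not (row.get("email") and EMAIL_RE.match(row["email"])):
            issues.append("invalid_email")
        report.append({"id": row["id"], "issues": issues})
    return report

if __name__ == "__main__":
    for line in quality_report(records):
        print(line)
```

In practice, rules like these would run against each system feeding the data lake, with the per-record issues rolled up into governance reporting.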
Published By: Denodo     Published Date: Mar 01, 2019
With the advent of big data and the proliferation of multiple information channels, organizations must store, discover, access, and share massive volumes of traditional and new data sources. Data virtualization transcends the limitations of traditional data integration techniques such as ETL by delivering a simplified, unified, and integrated view of trusted business data. Learn how you can:
• Conquer siloed data in the enterprise
• Integrate all data sources and types
• Cope with regulatory requirements
• Deliver big data solutions that work
• Take the pain out of cloud adoption
• Drive digital transformation
A minimal sketch of the query-time integration idea behind data virtualization follows this listing.
Tags : 
    
Denodo
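The sketch referenced above: a unified logical view assembled on demand over two made-up source systems, rather than copied into a central store. This is not Denodo's product or API, only an illustration of the concept behind data virtualization.

```python
# Two "source systems" with different shapes: a relational-style table and a
# document-style store. Names and fields are hypothetical.
crm_rows = [
    {"cust_id": 1, "full_name": "Ana Ruiz", "city": "Madrid"},
]
orders_docs = [
    {"customer": {"id": 1}, "total": 120.0, "currency": "EUR"},
    {"customer": {"id": 1}, "total": 80.0,  "currency": "EUR"},
]

def unified_customer_view():
    """Build an integrated view at query time, without copying data
    into a central store (the core idea behind data virtualization)."""
    totals = {}
    for doc in orders_docs:
        cid = doc["customer"]["id"]
        totals[cid] = totals.get(cid, 0.0) + doc["total"]
    for row in crm_rows:
        yield {
            "customer_id": row["cust_id"],
            "name": row["full_name"],
            "city": row["city"],
            "lifetime_value": totals.get(row["cust_id"], 0.0),
        }

if __name__ == "__main__":
    for rec in unified_customer_view():
        print(rec)
```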
Published By: DATAVERSITY     Published Date: Jul 24, 2014
Will the “programmable era” of computers be replaced by Cognitive Computing systems that can learn from interactions and reason through dynamic experience, much as humans do? With rapidly increasing volumes of Big Data, there is a compelling need for smarter machines to organize data faster, make better sense of it, discover insights, then learn, adapt, and improve over time without direct programming. A simple online-learning sketch follows this listing. This paper is sponsored by: Cognitive Scale.
Tags : 
data, data management, cognitive computing, machine learning, artificial intelligence, research paper
    
DATAVERSITY
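The phrase "learn, adapt, and improve over time without direct programming" is, at its simplest, online learning. Below is a small, generic sketch of that idea using a perceptron updated one interaction at a time; it illustrates the general concept only, not the approach described in the sponsored paper.

```python
# A perceptron that updates after every interaction, so its behavior
# improves over time without its rules being reprogrammed.
def train_online(stream, lr=0.1, n_features=2):
    w = [0.0] * n_features
    b = 0.0
    for x, label in stream:                      # label is +1 or -1
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
        if pred != label:                        # learn only from mistakes
            w = [wi + lr * label * xi for wi, xi in zip(w, x)]
            b += lr * label
    return w, b

if __name__ == "__main__":
    interactions = [([1.0, 0.2], 1), ([0.1, 0.9], -1),
                    ([0.9, 0.3], 1), ([0.2, 0.8], -1)]
    print(train_online(interactions * 5))
```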
Published By: DATAVERSITY     Published Date: Nov 05, 2014
Ask any CEO if they want to better leverage their data assets to drive growth, revenues, and productivity, and the answer will most likely be “yes, of course.” Ask many of them what that means or how they will do it, and their answers will be as disparate as most enterprises’ data strategies. To successfully control, utilize, analyze, and store the vast amounts of data flowing through organizations today, an enterprise-wide approach is necessary. The Chief Data Officer (CDO) is the newest member of the executive suite in many organizations worldwide. Their task is to develop and implement the strategies needed to harness the value of an enterprise’s data, while working alongside the CEO, CIO, CTO, and other executives. They are the vital “data” bridge between business and IT. This paper is sponsored by: Paxata and CA Technologies
Tags : 
chief data officer, cdo, data, data management, research paper, dataversity
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Jul 06, 2015
The growth of NoSQL data storage solutions has revolutionized the way enterprises deal with their data. The older, relational platforms are still being utilized by most organizations, while the implementation of varying NoSQL platforms, including Key-Value, Wide Column, Document, Graph, and Hybrid data stores, is increasing at faster rates than ever seen before (a brief sketch of these models follows this listing). Such implementations are causing enterprises to revise their Data Management procedures across the board, from governance to analytics, metadata management to software development, data modeling to regulation and compliance. The time-honored techniques for data modeling are being rewritten, reworked, and modified in a multitude of different ways, often wholly dependent on the NoSQL platform under development. This research report analyzes a 2015 DATAVERSITY® survey titled “Modeling NoSQL.” The survey examined a number of crucial issues within the NoSQL world today, with a focus on data modeling in particular.
Tags : 
    
DATAVERSITY
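For readers unfamiliar with the storage models the survey covers, here is a toy sketch showing the same customer represented as a key-value entry, a document, and a graph. All identifiers and structures are hypothetical; real Key-Value, Document, and Graph stores each have their own APIs.

```python
# The same customer, modeled three ways. All structures are hypothetical.

# Key-value: an opaque value behind a single key.
kv_store = {"customer:42": '{"name": "Ana Ruiz", "city": "Madrid"}'}

# Document: a nested structure queried by its fields.
doc_store = [{"_id": 42, "name": "Ana Ruiz", "address": {"city": "Madrid"},
              "orders": [{"sku": "A-1", "qty": 2}]}]

# Graph: nodes plus explicit relationships (edges).
nodes = {42: {"label": "Customer", "name": "Ana Ruiz"},
         7:  {"label": "Product", "sku": "A-1"}}
edges = [(42, "PURCHASED", 7)]

def purchases(customer_id):
    """Traverse the toy graph to find one customer's purchases."""
    return [nodes[dst] for src, rel, dst in edges
            if src == customer_id and rel == "PURCHASED"]

if __name__ == "__main__":
    print(purchases(42))
```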
Published By: DATAVERSITY     Published Date: Nov 20, 2015
The competitive advantages realized from a dependable Business Intelligence and Analytics (BI/A) program are well documented. Everything from reduced business costs and increased customer retention to better decision making and the ability to forecast opportunities has been observed as an outcome of such programs. The implementation of such a program remains a necessity for any growing or mature enterprise. The establishment of a comprehensive BI/A program that includes traditional Descriptive Analytics along with next-generation categories such as Predictive or Prescriptive Analytics is indispensable for business success.
Tags : 
data, data management, analytics, business intelligence, data science
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Oct 04, 2016
This report evaluates each question posed in a recent survey and provides subsequent analysis in a detailed format that includes the most noteworthy statistics, direct comments from survey respondents, and the influence on the industry as a whole. It seeks to present readers with a thorough review of the state of Metadata Management as it exists today.
Tags : 
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Oct 11, 2018
The foundation of this report is a survey conducted by DATAVERSITY® that included a range of different question types and topics on the current state of Data Governance and Data Stewardship. The report evaluates the topic through a discussion and analysis of each presented survey question, as well as a deeper examination of the present and future trends.
Tags : 
    
DATAVERSITY
Published By: Melissa Data     Published Date: Jan 31, 2019
Noted SQL Server MVP and founder/editor of SSWUG.org Stephen Wynkoop shares his take on the challenge of achieving quality data, and the importance of the “Golden Record” to an effective data quality regimen. Achieving the Golden Record involves collapsing duplicate records into a single version of the truth: the single customer view (SCV). There are different approaches to achieving the Golden Record; Wynkoop explores Melissa’s unique approach, which takes into consideration the actual quality of the contact data as the basis of survivorship (a simplified sketch of score-based survivorship follows this listing). Learn:
• How poor data quality negatively affects your business
• Different data quality implementations in SQL Server
• Melissa’s unique approach to achieving the Golden Record based on a data quality score
Tags : 
    
Melissa Data
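The simplified sketch referenced above: duplicates are grouped and the most complete record "survives" as the Golden Record. The scoring here is a toy heuristic for illustration, not Melissa's actual data quality scoring.

```python
# Duplicate contact records for the same customer; the "survivor" is the
# record with the highest (hypothetical) quality score.
duplicates = [
    {"name": "A. Ruiz",  "email": "",                "phone": "555-0100"},
    {"name": "Ana Ruiz", "email": "ana@example.com", "phone": ""},
    {"name": "Ana Ruiz", "email": "ana@example.com", "phone": "555-0100"},
]

def quality_score(rec):
    """Toy score: one point per populated field."""
    return sum(1 for v in rec.values() if v)

def golden_record(group):
    """Pick the highest-scoring record as the single customer view."""
    return max(group, key=quality_score)

if __name__ == "__main__":
    print(golden_record(duplicates))
```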
Published By: Ted Hills     Published Date: Mar 08, 2017
NoSQL database management systems give us the opportunity to store our data according to more than one data storage model, but our entity-relationship data modeling notations are stuck in SQL land. Is there any need to model schema-less databases, and is it even possible? In this short white paper, Ted Hills examines these questions in light of a recent paper from MarkLogic on the hybrid data model.
Tags : 
    
Ted Hills
Published By: Ted Hills     Published Date: Mar 08, 2017
This document provides a complete reference for the Concept and Object Modeling Notation (COMN, pronounced “common”), release 1.1. The book NoSQL and SQL Data Modeling (Technics Publications, 2016) reflects release 1.0 of COMN. This is a reference, not a tutorial. This document is designed to support a quick check of how to draw or notate something in COMN.
Tags : 
    
Ted Hills
Published By: Ted Hills     Published Date: Mar 08, 2017
This paper explores the differences between three situations that appear on the surface to be very similar: a data attribute that may occur zero or one times, a data attribute that is optional, and a data attribute whose value may be unknown (a small programmatic analogy follows this listing). It shows how each of these different situations is represented in Concept and Object Modeling Notation (COMN, pronounced “common”). The theory behind the analysis is explained in greater detail by three papers: Three-Valued Logic, A Systematic Solution to Handling Unknown Data in Databases, and An Approach to Representing Non-Applicable Data in Relational Databases.
Tags : 
    
Ted Hills
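The programmatic analogy referenced above: the sketch below separates "the attribute does not occur" from "the attribute occurs but its value is unknown" using a hypothetical sentinel. COMN addresses this distinction at the modeling-notation level; this Python fragment only illustrates why the cases differ.

```python
# Distinguishing "no value occurs" from "a value exists but is unknown".
# UNKNOWN is a hypothetical sentinel used purely for illustration.
UNKNOWN = object()

person_no_middle_name = {"first": "Ana", "last": "Ruiz"}                      # attribute absent
person_middle_unknown = {"first": "Ana", "middle": UNKNOWN, "last": "Ruiz"}   # attribute present, value unknown

def describe(person, attr):
    if attr not in person:
        return f"{attr}: does not occur for this person"
    if person[attr] is UNKNOWN:
        return f"{attr}: exists, but its value is unknown"
    return f"{attr}: {person[attr]}"

if __name__ == "__main__":
    print(describe(person_no_middle_name, "middle"))
    print(describe(person_middle_unknown, "middle"))
```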
Published By: Ted Hills     Published Date: Mar 08, 2017
Much has been written and debated about the use of SQL NULLs to represent unknown values, and the possible use of three-valued logic. However, there has never been a systematic application of any three-valued logic to the logical expressions of computer programs. This paper lays the foundation for a systematic application of three-valued logic to one of the two problems inadequately addressed by SQL NULLs. A small sketch of three-valued truth functions follows this listing.
Tags : 
    
Ted Hills
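For reference, the truth functions of Kleene's strong three-valued logic (true, false, unknown) can be written out in a few lines. This shows the general idea of three-valued logic only; it is not necessarily the specific system the paper develops.

```python
# Kleene's strong three-valued logic (True, False, Unknown) as plain functions.
T, F, U = "T", "F", "U"

def and3(a, b):
    """False dominates; otherwise Unknown propagates."""
    if F in (a, b): return F
    if U in (a, b): return U
    return T

def or3(a, b):
    """True dominates; otherwise Unknown propagates."""
    if T in (a, b): return T
    if U in (a, b): return U
    return F

def not3(a):
    return {T: F, F: T, U: U}[a]

if __name__ == "__main__":
    for a in (T, F, U):
        for b in (T, F, U):
            print(f"{a} AND {b} = {and3(a, b)}   {a} OR {b} = {or3(a, b)}")
```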
Published By: Ted Hills     Published Date: Mar 08, 2017
Ever since Codd introduced so-called “null values” to the relational model, there have been debates about exactly what they mean and their proper handling in relational databases. In this paper I examine the meaning of tuples and relations containing “null values”. For the type of “null value” representing unknown data, I propose an interpretation and a solution that is more rigorously defined than the SQL NULL or other similar solutions, and which can be implemented in a systematic and application-independent manner in database management systems.
Tags : 
    
Ted Hills
Published By: Ted Hills     Published Date: Mar 08, 2017
Ever since Codd introduced so-called “null values” to the relational model, there have been debates about exactly what they mean and their proper handling in relational databases. In this paper I examine the meaning of tuples and relations containing “null values”. For the type of “null value” representing that data are not applicable, I propose an interpretation and a solution that is more rigorously defined than the SQL NULL or other similar solutions, and which can be implemented in a systematic and application-independent manner in database management systems.
Tags : 
    
Ted Hills
Published By: Innovative Systems     Published Date: Feb 21, 2019
From years of data quality initiatives, hundreds of case studies, and research by industry experts, a number of common data quality success factors have emerged. This paper discusses key characteristics of data quality initiatives and provides actionable guidelines to help make your project a success, from conception through implementation and tracking your ROI. Readers will learn how to:
• Quantify the effect of poor data quality on the organization
• Prioritize projects for faster ROI
• Gain buy-in, from employees through senior management
Tags : 
    
Innovative Systems
Published By: Ataccama     Published Date: Feb 26, 2019
Machine learning is a powerful tool for the transformation of data management, and with increasing amounts of data in an organization’s possession, it is more vital than ever. Learn how Ataccama leverages cutting-edge machine learning techniques to automate manual and time-consuming tasks and optimize performance. By developing AI-powered data cataloging and master data management solutions, we can help stem the tide of data chaos and ensure that your data is orderly, correct, and easy to find.
Tags : 
    
Ataccama
Published By: Stardog Union     Published Date: Mar 13, 2019
Enterprises must transition from merely collecting their data to contextualizing it in order to fully leverage data as a strategic asset. Existing data management solutions such as databases and data lakes encourage data sprawl and duplication. However, true data unification can be achieved with a Knowledge Graph, which seamlessly layers on top of your existing data infrastructure to reveal the interrelationships in your data, no matter its source or format. The Knowledge Graph is also a highly scalable solution, since it retains every analysis performed as a reusable asset, drastically reducing the need for data wrangling over time. Download Knowledge Graphs 101 to learn how this technology differs from a graph database, how it compares to MDM and data lake solutions, and how to leverage artificial intelligence and machine learning within a Knowledge Graph. A toy triple-store sketch follows this listing.
Tags : 
    
Stardog Union
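The toy triple-store sketch referenced above: facts from different source systems are expressed as subject-predicate-object triples and queried by pattern. The identifiers are invented and this is not Stardog's API; it only illustrates how a knowledge graph layers relationships over existing records.

```python
# A toy knowledge graph as subject-predicate-object triples drawn from two
# hypothetical sources, plus a simple pattern query.
triples = [
    ("acct:42", "is_a",        "Customer"),      # from the CRM
    ("acct:42", "purchased",   "sku:A-1"),       # from the order system
    ("sku:A-1", "is_a",        "Product"),
    ("sku:A-1", "supplied_by", "vendor:acme"),
]

def query(subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern (None acts as a wildcard)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

if __name__ == "__main__":
    # Everything known about account 42, regardless of source system.
    print(query(subject="acct:42"))
```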
Published By: CData     Published Date: Jan 04, 2019
The growth of NoSQL continues to accelerate as the industry is increasingly forced to develop new and more specialized data structures to deal with the explosion of application and device data. At the same time, new data products for BI, Analytics, Reporting, Data Warehousing, AI, and Machine Learning continue along a similar growth trajectory. Enabling interoperability between applications and data sources, each with a unique interface and value proposition, is a tremendous challenge. This paper discusses a variety of mapping and flattening techniques, and continues with examples that highlight performance and usability differences between approaches. A simplified flattening sketch follows this listing.
Tags : 
data architecture, data, data management, business intelligence, data warehousing
    
CData
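The simplified flattening sketch referenced above: a nested document is mapped to relational-style rows, one row per element of the repeated child array. Field names are hypothetical and the paper's actual techniques are more varied; this only illustrates the basic mapping problem.

```python
# Flattening one nested document into tabular rows, the kind of mapping a
# NoSQL-to-SQL bridge must perform. Field names are hypothetical.
order_doc = {
    "order_id": 1001,
    "customer": {"name": "Ana Ruiz", "city": "Madrid"},
    "items": [
        {"sku": "A-1", "qty": 2},
        {"sku": "B-7", "qty": 1},
    ],
}

def flatten(doc):
    """Emit one flat row per element of the nested array, repeating the
    scalar parent fields on every row."""
    base = {
        "order_id": doc["order_id"],
        "customer_name": doc["customer"]["name"],
        "customer_city": doc["customer"]["city"],
    }
    for item in doc["items"]:
        row = dict(base)
        row.update({"item_sku": item["sku"], "item_qty": item["qty"]})
        yield row

if __name__ == "__main__":
    for row in flatten(order_doc):
        print(row)
```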
Published By: Denodo     Published Date: Feb 27, 2019
Organizations continue to struggle with integrating data quickly enough to support the needs of business stakeholders, who need integrated data faster and faster with each passing day. Traditional data integration technologies have not been able to solve the fundamental problem, as they deliver data in scheduled batches, and cannot support many of today’s rich and complex data types. Data virtualization is a modern data integration approach that is already meeting today’s data integration challenges, providing the foundation for data integration in the future. Download this whitepaper to learn more about:
• The fundamental challenge for organizations today
• Why traditional solutions fall short
• Why data virtualization is the core solution
Tags : 
    
Denodo
Published By: Tamr, Inc.     Published Date: Feb 08, 2019
Traditional data management practices, such as master data management (MDM), have been around for decades – as have the approaches vendors take in developing these capabilities. And they were well-equipped for the problem at hand: managing data at modest size and complexity. However, as enterprises mature and start to view their data assets as a source of competitive advantage, new methods to managing enterprise data become desirable. Enterprises now need approaches to data management that can solve critical issues around speed and scale in an increasingly complex data environment. This paper explores how data curation technology can be used to solve data mastering challenges at scale.
Tags : 
    
Tamr, Inc.
Published By: MEGA International     Published Date: Mar 07, 2019
The effectiveness of agile processes is often jeopardized because the architecture and organizational prerequisites of agility are neglected. This White Paper proposes a new architecture framework, the Agile Architecture Framework (AAF), that meets the needs of the digital enterprise. It develops a vision that uniquely combines:
• Methods for decomposing the system into loosely coupled services and autonomous teams
• Alignment mechanisms rooted in business strategy
• Architecture patterns that leverage the latest software innovations
• Results from large enterprises that started their agile-at-scale journey several years ago
This proposed architecture approach will enable organizations at all scales to better realize the Boundaryless Information Flow™ vision, achieved through global interoperability in a secure, reliable, and timely manner.
Tags : 
    
MEGA International
Published By: TigerGraph     Published Date: Mar 13, 2019
As the world’s first and only Native Parallel Graph (NPG) system, TigerGraph is a complete, distributed, graph analytics platform supporting web-scale data analytics in real time. The TigerGraph NPG is built around both local storage and computation, supports real-time graph updates, and works like a parallel computation engine. Download our white paper to learn of TigerGraph's unique advantages.
Tags : 
    
TigerGraph
Published By: DATAVERSITY     Published Date: Feb 27, 2013
In its most basic definition, unstructured data simply means any form of data that does not easily fit into a relational model or a set of database tables.
Tags : 
white paper, dataversity, unstructured data, enterprise data management, data, data management
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes many challenges faced when beginning a new Data Governance program, and outlines the crucial elements in successfully executing such a program. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear. It is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting mechanisms, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by: ASG, DGPO and DebTech International.
Tags : 
data, data management, data governance, data steward, dataversity, research paper
    
DATAVERSITY