
Results 1 - 25 of 7388
Published By: Melissa Data     Published Date: Jan 31, 2019
Noted SQL Server MVP and founder/editor of SSWUG.org Stephen Wynkoop shares his take on the challenge of achieving quality data, and the importance of the "Golden Record" to an effective data quality regimen. Achieving the Golden Record involves collapsing duplicate records into a single version of the truth: the single customer view (SCV). There are different approaches to achieving the Golden Record. Wynkoop explores Melissa's unique approach, which takes the actual quality of the contact data as the basis of survivorship. Learn:
• How poor data quality negatively affects your business
• Different data quality implementations in SQL Server
• Melissa's unique approach to achieving the Golden Record, based on a data quality score
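The survivorship idea is easy to sketch in code. Below is a minimal, hypothetical Python illustration of quality-based survivorship (not Melissa's actual engine): duplicates are grouped on a match key, and the record with the highest completeness score survives.

```python
from itertools import groupby
from operator import itemgetter

def quality_score(record):
    """Score a contact record by completeness; a crude stand-in for a
    real data quality engine like the one the paper describes."""
    return sum(1 for v in record.values() if v not in (None, ""))

def golden_records(records, key="email"):
    """Collapse duplicates (grouped by a match key) into one survivor,
    keeping the record with the highest quality score."""
    records = sorted(records, key=itemgetter(key))
    return [max(dupes, key=quality_score)
            for _, dupes in groupby(records, key=itemgetter(key))]

contacts = [
    {"email": "a@x.com", "name": "Ann", "phone": None},
    {"email": "a@x.com", "name": "Ann Lee", "phone": "555-0100"},
]
print(golden_records(contacts))  # keeps the more complete duplicate
```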
Published By: Ted Hills     Published Date: Mar 08, 2017
NoSQL database management systems give us the opportunity to store our data according to more than one data storage model, but our entity-relationship data modeling notations are stuck in SQL land. Is there any need to model schema-less databases, and is it even possible? In this short white paper, Ted Hills examines these questions in light of a recent paper from MarkLogic on the hybrid data model.
Published By: Ted Hills     Published Date: Mar 08, 2017
This paper explores the differences between three situations that appear on the surface to be very similar: a data attribute that may occur zero or one times, a data attribute that is optional, and a data attribute whose value may be unknown. It shows how each of these different situations is represented in Concept and Object Modeling Notation (COMN, pronounced “common”). The theory behind the analysis is explained in greater detail by three papers: Three-Valued Logic, A Systematic Solution to Handling Unknown Data in Databases, and An Approach to Representing Non-Applicable Data in Relational Databases.
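To make the distinction concrete, here is a small, purely illustrative Python sketch (not COMN notation) contrasting an attribute that does not occur, one whose value exists but is unknown, and one with a known value:

```python
# Three superficially similar but distinct situations, modeled with dicts.
no_middle_name = {"first": "Ada", "last": "Lovelace"}                    # attribute occurs zero times
unknown_middle = {"first": "Ada", "middle": None, "last": "Lovelace"}    # value exists but is unknown
known_middle   = {"first": "Ada", "middle": "King", "last": "Lovelace"}  # attribute occurs once

for person in (no_middle_name, unknown_middle, known_middle):
    if "middle" not in person:
        print("attribute absent (occurs zero times)")
    elif person["middle"] is None:
        print("attribute present, value unknown")
    else:
        print("attribute present, value known:", person["middle"])
```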
Published By: Ted Hills     Published Date: Mar 08, 2017
Much has been written and debated about the use of SQL NULLs to represent unknown values, and about the possible use of three-valued logic. However, three-valued logic has never been systematically applied to the logical expressions of computer programs. This paper lays the foundation for a systematic application of three-valued logic to one of the two problems inadequately addressed by SQL NULLs.
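For readers unfamiliar with the underlying problem, the sketch below uses Python's standard sqlite3 module to show SQL's three-valued logic in action; the table and values are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(1,), (None,)])

# Comparisons with NULL evaluate to UNKNOWN, so the NULL row matches
# neither x = 1 nor x <> 1; only IS NULL finds it.
print(conn.execute("SELECT COUNT(*) FROM t WHERE x = 1").fetchone())      # (1,)
print(conn.execute("SELECT COUNT(*) FROM t WHERE x <> 1").fetchone())     # (0,)
print(conn.execute("SELECT COUNT(*) FROM t WHERE x IS NULL").fetchone())  # (1,)
```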
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes the challenges faced when beginning a new Data Governance program and outlines the crucial elements of executing such a program successfully. "Data Governance" is a term fraught with nuance, misunderstanding, myriad opinions, and fear; it is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by ASG, DGPO, and DebTech International.
Tags: data, data management, data governance, data steward, dataversity, research paper
Published By: First San Francisco Partners     Published Date: Oct 29, 2015
One of the biggest challenges in a data management initiative is aligning different and sometimes competing organizations to work towards the same long-term vision. That is why a proactive approach to aligning the organization around a common goal and plan is critical when launching a data management program.
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late '90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues became more and more difficult with relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn't exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It is increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
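As an illustration of the schema-less model described above, this short Python sketch (with hypothetical data) shows two records of the same logical type carrying different fields, something a rigid relational schema would force into nullable columns or side tables.

```python
import json

# Each document is self-describing; records need not share a fixed schema.
users = [
    {"id": 1, "name": "Kim", "email": "kim@example.com"},
    {"id": 2, "name": "Raj", "phones": ["555-0101", "555-0102"],
     "social": {"twitter": "@raj"}},  # nested, semi-structured data
]

for user in users:
    print(json.dumps(user))
```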
Tags: database, nosql, data, data management, white paper, why nosql, couchbase
Published By: ASG     Published Date: Apr 02, 2014
This case study focuses on a highly successful data lineage project between ASG Software Solutions and a major global financial institution. The project began in 2011 with the primary goal of achieving greater control, awareness, and ownership of the institution's data assets in response to new regulatory and federal audit controls. As the project progressed and the positive relationship between ASG and the Bank deepened, all stakeholders began to see much broader potential for the project than originally envisioned.
Tags: metadata, data, data management, white paper, case study
Published By: ASG     Published Date: May 08, 2017
One Chief Data Officer’s Story of Creating a Data-Centric Organization: ASG Enterprise Data Intelligence and American Fidelity Assurance
Published By: TopQuadrant     Published Date: Mar 21, 2015
Data management is becoming more and more central to the business model of enterprises. The time when data was looked at as little more than the byproduct of automation is long gone, and today we see enterprises vigorously engaged in trying to unlock maximum value from their data, even to the extent of directly monetizing it. Yet many of these efforts are hampered by immature data governance and management practices stemming from a legacy that did not pay much attention to data. Part of this problem is a failure to understand that there are different types of data, and that each type has its own special characteristics, challenges, and concerns. Reference data is one such special type: it is essentially codes whose basic job is to turn other data into meaningful business information and to provide an informational context for the wider world in which the enterprise functions. This paper discusses the challenges of implementing a reference data management solution and the essential components of any vision for the governance and management of reference data. It covers the following topics in some detail:
· What is reference data?
· Why is reference data management important?
· What are the challenges of reference data management?
· What are some best practices for the governance and management of reference data?
· What capabilities should you look for in a reference data solution?
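As a concrete illustration of reference data as codes that turn other data into meaningful information, consider this minimal Python sketch; the code table and the order record are hypothetical.

```python
# A governed reference table of country codes.
COUNTRY_CODES = {"US": "United States", "DE": "Germany", "JP": "Japan"}

order = {"order_id": 1042, "country": "DE"}

# Decoding against reference data supplies the business meaning.
print(f"Order {order['order_id']} ships to {COUNTRY_CODES[order['country']]}")
```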
Tags: data management, data, reference data, reference data management, top quadrant, malcolm chisholm
Published By: TopQuadrant     Published Date: Jun 01, 2017
This paper presents a practitioner-informed roadmap intended to assist enterprises in maturing their Enterprise Information Management (EIM) practices, with a specific focus on improving Reference Data Management (RDM). Reference data is found in every application an enterprise uses, including back-end systems, front-end commerce applications, data exchange formats, outsourced and hosted systems, big data platforms, and data warehouses. It can easily account for 20–50% of the tables in a data store, and its values are used throughout transactional and mastered data sets to make the system internally consistent.
Published By: CA Technologies     Published Date: Apr 24, 2013
Using ERwin Data Modeler & Microsoft SQL Azure to Move Data to the Cloud within the DaaS Lifecycle, by Nuccio Piscopo
Cloud computing is one of the major growth areas in the world of IT. This article analyzes how to apply the DaaS (Database as a Service) lifecycle when working with ERwin and the SQL Azure platform. It should help enterprises obtain the benefits of DaaS and take advantage of its potential for improving and transforming data models in the Cloud. The use case introduced identifies key actions, requirements, and practices that can help formulate a plan for successfully moving data to the Cloud.
Published By: CA Technologies     Published Date: Feb 25, 2016
Because combinations of internal and externally imposed business policies imply dependencies on managed data artifacts, organizations are increasingly instituting data governance programs to implement processes for ensuring compliance with business expectations. One fundamental aspect of data governance is the practical application of business rules to data assets, based on data elements and their assigned values. Yet despite the intent to harmonize data element definitions and to resolve data semantics and valid reference values, most organizations rarely have complete visibility into the metadata associated with their enterprise data assets.
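A minimal, hypothetical Python sketch of the kind of rule in question, a business rule that checks a data element against its set of valid reference values:

```python
# The governed set of valid values for the (hypothetical) status element.
VALID_STATUS = {"ACTIVE", "SUSPENDED", "CLOSED"}

def violates_rule(record):
    """True if the record's status code falls outside the valid value set."""
    return record.get("status") not in VALID_STATUS

accounts = [{"id": 1, "status": "ACTIVE"}, {"id": 2, "status": "PENDING"}]
print([a for a in accounts if violates_rule(a)])  # [{'id': 2, 'status': 'PENDING'}]
```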
Published By: MapR Technologies     Published Date: Mar 29, 2016
Add Big Data Technologies to Get More Value from Your Stack
Taking advantage of big data starts with understanding how to optimize and augment your existing infrastructure. Relational databases have endured for a reason: they fit well with the types of data that organizations use to run their business. These types of data, found in business applications such as ERP, CRM, and EPM, are not fundamentally changing, which suggests that relational databases will continue to play a foundational role in enterprise architectures for the foreseeable future. One area where emerging technologies can complement relational database technology is big data. With rapidly growing volumes of data and many new data sources, organizations look for ways to relieve pressure on their existing systems. That's where Hadoop and NoSQL come in.
Published By: Paxata     Published Date: Apr 02, 2014
Why Sift Through Data Landfills?
Better business insight comes from data, but data is often dirty, incomplete, and complicated. As any analyst would admit, much of what passes for data science is more like janitorial work. Find out why that is, and how you can avoid the painful, manual, and error-prone processes that have bogged down the analytics process for 30 years.
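As a tiny illustration of that janitorial work, this hypothetical Python sketch normalizes a dirty column so duplicate values line up; real data preparation involves far more than this.

```python
raw = [" Acme Corp ", "acme corp", "ACME CORP.", None]

def clean(value):
    """Trim, lowercase, and strip trailing punctuation so duplicates match."""
    if value is None:
        return None
    return value.strip().lower().rstrip(".")

cleaned = {clean(v) for v in raw if v is not None}
print(cleaned)  # {'acme corp'}
```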
Tags: data, data management, big data, white paper, paxata, analytics
Published By: Paxata     Published Date: Nov 29, 2016
Every organization looks for ways to reduce costs and run more efficiently. In fact, those are key drivers for the mainstream adoption of Hadoop and self-service BI tools. And while we can now collect and store more data than ever before, and every information worker has been turned into a data-hungry analyst, little consideration has been paid to the cost, including time and effort, of preparing data. Download this report to learn more about the hidden cost of data preparation.
Published By: Paxata     Published Date: Mar 21, 2017
Every organization looks for ways to reduce costs and run more efficiently. In fact, those are key drivers for the mainstream adoption of Hadoop and self-service BI tools. And while we can now collect and store more data than ever before, and every information worker has been turned into a data-hungry analyst, little consideration has been paid to the cost, including time and effort, of preparing data. Download this report to learn more about the hidden cost of data preparation.
Published By: Pentaho     Published Date: Nov 03, 2016
While a whole ecosystem of tools has sprung up around Hadoop to handle and analyze data, many of them are specialized to just one part of a larger process. In order to fulfill the promise of Hadoop, organizations need to step back and take an end-to-end view of their analytic data pipelines.
Published By: Melissa Data     Published Date: Oct 27, 2014
Noted SQL Server MVP and founder/editor of SSWUG.org, Stephen Wynkoop shares his take on the challenge of achieving quality data and the importance of the "Golden Record" to an effective data quality regimen. Wynkoop explores the different approaches to achieving the Golden Record, which involves collapsing duplicate records into a single version of the truth, the single customer view (SCV), as well as Melissa Data's unique approach, which takes the actual quality of the contact data as the basis of survivorship.
Tags: data, data management, melissa data, data quality
Published By: CapTech     Published Date: May 26, 2015
Big Data is the future of business. According to CloudTweaks.com, as much as 2.5 quintillion bytes of data are produced each day, most of it flowing into Big Data systems. By bringing all data sources together in one centralized place, Big Data creates opportunities: clearer visibility into customer conversations and transactions. However, the dazzling promise of Big Data comes with a potentially huge letdown: if this vast pool of information is not accessible or usable, it is useless. This paper examines strategies for building the most value into your Big Data system by enabling process controls to effectively mine, access, and secure Big Data.
Tags: big data, captech, data, data management, nosql
Published By: Neo Technology     Published Date: Jun 28, 2015
The future of Master Data Management lies in deriving value from data relationships, which reveal the data stories that matter more and more to competitive advantage as we enter the future of data and business analytics. MDM will be about supplying consistent, meaningful views of master data and unifying data into one location, especially to optimize for query performance and data fit. Graph databases offer exactly that type of data/performance fit. Use data relationships to unlock real business value in MDM:
- Graphs can easily model both hierarchical and non-hierarchical master data
- The logical model IS the physical model, making it easier for business users to visualize data relationships
- Deliver insights in real time from the data relationships in your master data
- Stay ahead of the business with faster development
Download and read the white paper Your Master Data Is a Graph: Are You Ready? to learn why your master data is a graph and why graph databases like Neo4j are the best technologies for MDM.
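To illustrate the idea, here is a minimal Python sketch of master data held as a graph, using a plain adjacency list rather than a real graph database such as Neo4j; the entities and relationship types are hypothetical.

```python
# Master data entities and their typed relationships.
edges = {
    "Acme Corp": [("SUBSIDIARY_OF", "Acme Holdings")],  # hierarchical
    "Jane Doe":  [("WORKS_FOR", "Acme Corp"),
                  ("COLLEAGUE_OF", "John Roe")],         # non-hierarchical
    "John Roe":  [("WORKS_FOR", "Acme Corp")],
}

def related(entity):
    """Traverse one hop of relationships from a master data entity."""
    return [f"{entity} -{rel}-> {target}" for rel, target in edges.get(entity, [])]

print(related("Jane Doe"))
```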
Tags: database, nosql, graph database, big data, master data management, mdm
Published By: GBG Loqate     Published Date: Jul 09, 2015
Businesses are vulnerable when they assume that their data is accurate, because they are almost always losing money without their knowledge. When it comes to data quality, the problems that you don’t suspect are often worse and more pervasive than the ones you are aware of. Addresses are subject to their own specific set of rules. Detecting and correcting address errors is a complex problem, and one that can only be solved with specialized software.
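As a toy illustration of rule-based address checking, the hypothetical Python sketch below validates a city against its ZIP code using a tiny reference table; real address verification software works against complete postal reference files.

```python
# A tiny, illustrative ZIP-to-city reference table.
VALID_ZIP_CITY = {"10001": "New York", "60601": "Chicago"}

def check_address(addr):
    """Detect a common address error: a city that does not match its ZIP."""
    expected = VALID_ZIP_CITY.get(addr["zip"])
    if expected is None:
        return "unknown ZIP"
    return "ok" if addr["city"] == expected else f"city should be {expected}"

print(check_address({"city": "Chicago", "zip": "10001"}))  # city should be New York
```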
Tags: data, data management, data quality, loqate
Published By: VoltDB     Published Date: Feb 12, 2016
The need for fast data applications is growing rapidly, driven by the IoT, the surge in machine-to-machine (M2M) data, global mobile device proliferation, and the monetization of SaaS platforms. So how do you combine real-time, streaming analytics with real-time decisions in an architecture that’s reliable, scalable, and simple? In this report, Ryan Betts and John Hugg from VoltDB examine ways to develop apps for fast data, using pre-defined patterns. These patterns are general enough to suit both the do-it-yourself, hybrid batch/streaming approach, as well as the simpler, proven in-memory approach available with certain fast database offerings.
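One such pattern, streaming aggregation feeding a real-time decision, can be sketched in a few lines of Python; the event stream, threshold, and in-memory store below are stand-ins for a real fast data architecture, not VoltDB's API.

```python
from collections import defaultdict

counts = defaultdict(int)
THRESHOLD = 3

def on_event(device_id):
    """Aggregate per-device events and decide in the same code path."""
    counts[device_id] += 1
    if counts[device_id] == THRESHOLD:
        print(f"alert: {device_id} reached {THRESHOLD} events")

for device in ["sensor-1", "sensor-1", "sensor-2", "sensor-1"]:
    on_event(device)  # fires an alert on the third sensor-1 event
```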
Published By: Experian     Published Date: May 17, 2016
Every year, Experian Data Quality conducts a study of global trends in data quality. This year's research reveals how data practitioners are leveraging and managing data to generate actionable insight, and how proper data management is becoming an organization-wide imperative. The study polled more than 1,400 people across eight countries, from a variety of roles and departments; respondents were chosen based on their visibility into their organization's customer data management practices. Read the research report to learn:
- The changes in channel usage over the last 12 months
- Expected changes in big data and data management initiatives
- Multi-industry benchmarks, comparisons, and challenges in data quality
- And more!
Our annual global benchmark report takes a close look at the data quality and data management initiatives driving today's businesses. See where you line up and where you can improve.
Published By: Reltio     Published Date: Jan 20, 2017
If you invested in master data management (MDM), you are part of an elite group: those who have been able to afford the time, effort, and resources to deploy what has traditionally been a tool and discipline reserved for only the largest enterprises. Feedback from top industry analysts and from companies that transitioned from legacy MDM to modern data management platforms led to this list of 10 warning signs, which you can use as a handy guide. If one or more of these signs gets your attention, it warrants a serious conversation with your current provider about these issues and about how their offering compares to the modern alternatives available today.