Data Management

Published By: TopQuadrant     Published Date: Mar 21, 2015
Data management is becoming more and more central to the business model of enterprises. The time when data was looked at as little more than the byproduct of automation is long gone, and today we see enterprises vigorously engaged in trying to unlock maximum value from their data, even to the extent of directly monetizing it. Yet, many of these efforts are hampered by immature data governance and management practices stemming from a legacy that did not pay much attention to data. Part of this problem is a failure to understand that there are different types of data, and each type of data has its own special characteristics, challenges and concerns. Reference data is a special type of data. It is essentially codes whose basic job is to turn other data into meaningful business information and to provide an informational context for the wider world in which the enterprise functions. This paper discusses the challenges associated with implementing a reference data management solution and the essential components of any vision for the governance and management of reference data. It covers the following topics in some detail:
· What is reference data?
· Why is reference data management important?
· What are the challenges of reference data management?
· What are some best practices for the governance and management of reference data?
· What capabilities should you look for in a reference data solution?
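The abstract's description of reference data as "codes whose basic job is to turn other data into meaningful business information" can be illustrated with a minimal sketch. The code tables below are hypothetical examples, not taken from the paper:

```python
# Minimal sketch: reference data (code tables) resolves the codes embedded
# in transactional records into meaningful business information.
# COUNTRY_CODES and ORDER_STATUS are illustrative, made-up tables.

COUNTRY_CODES = {"US": "United States", "DE": "Germany", "JP": "Japan"}
ORDER_STATUS = {"N": "New", "S": "Shipped", "X": "Cancelled"}

def decode(order):
    """Resolve reference-data codes in a raw order record."""
    return {
        "order_id": order["order_id"],
        "country": COUNTRY_CODES.get(order["country_code"], "UNKNOWN CODE"),
        "status": ORDER_STATUS.get(order["status_code"], "UNKNOWN CODE"),
    }

raw = {"order_id": 1001, "country_code": "DE", "status_code": "S"}
print(decode(raw))
# -> {'order_id': 1001, 'country': 'Germany', 'status': 'Shipped'}
```

Note how an unmanaged or stale code table would silently degrade every record that references it, which is why the paper treats reference data governance as a distinct discipline.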
Tags: data management, data, reference data, reference data management, top quadrant, malcolm chisholm
Published By: TopQuadrant     Published Date: Jun 01, 2017
This paper presents a practitioner informed roadmap intended to assist enterprises in maturing their Enterprise Information Management (EIM) practices, with a specific focus on improving Reference Data Management (RDM). Reference data is found in every application used by an enterprise including back-end systems, front-end commerce applications, data exchange formats, and in outsourced, hosted systems, big data platforms, and data warehouses. It can easily be 20–50% of the tables in a data store. And the values are used throughout the transactional and mastered data sets to make the system internally consistent.
Published By: TopQuadrant     Published Date: Jun 11, 2018
Data governance is a lifecycle-centric asset management activity. To understand and realize the value of data assets, it is necessary to capture information about them (their metadata) in a connected way. Capturing the meaning and context of diverse enterprise data in connection to all assets in the enterprise ecosystem is foundational to effective data governance. Therefore, a data governance environment must represent assets and their role in the enterprise using an open, extensible and “smart” approach. Knowledge graphs are the most viable and powerful way to do this. This short paper outlines how knowledge graphs are flexible, evolvable, semantic and intelligent. It is these characteristics that enable them to:
• capture the description of data as an interconnected set of information that meaningfully bridges enterprise metadata silos.
• deliver integrated data governance by addressing all three aspects of data governance — Executive Governance, Representative Governance, and App
Published By: CA Technologies     Published Date: Apr 24, 2013
This white paper by industry expert Alec Sharp illustrates these points and provides specific guidelines and techniques for a business-oriented approach to data modeling, with examples that demonstrate how business professionals can participate directly in the modeling process.
Tags: white paper, ca technologies, erwin, data, data management, data modeling, dataversity
Published By: CA Technologies     Published Date: Dec 03, 2015
This second paper in a three-part series by David Loshin explores some challenges in bootstrapping a data governance program, then considers key methods for using metadata to establish a starting point for data governance. The paper focuses on how metadata management facilitates progress along three facets of the data governance program: assessment, collaboration and operationalization.
Published By: MapR Technologies     Published Date: Aug 01, 2018
How do you get a machine learning system to deliver value from big data? Turns out that 90% of the effort required for success in machine learning is not the algorithm or the model or the learning - it's the logistics. Ted Dunning and Ellen Friedman identify what matters in machine learning logistics, what challenges arise, especially in a production setting, and they introduce an innovative solution: the rendezvous architecture. This new design for model management is based on a streaming approach in a microservices style. Rendezvous addresses the need to preserve and share raw data, to do effective model-to-model comparisons and to have new models on standby, ready for a hot hand-off when a production model needs to be replaced.
Published By: Semantic Arts     Published Date: Aug 01, 2013
This white paper explains how semantic technology, through the use of a semantic layer, can help organizations leverage their legacy investments in new solutions, improving IT productivity by reducing complexity and thereby lowering total cost of ownership.
Tags: white paper, semantic technology, data, data management
Published By: Information Asset, LLC     Published Date: Feb 11, 2014
An In-Depth Review of Data Governance Software Tools: Reference Architecture, Evaluation Criteria, and Vendor Landscape
Tags: white paper, data governance, data, data management, data management white paper, data governance white paper
Published By: Paxata     Published Date: Apr 02, 2014
Why Sift Through Data Landfills?
Better business insight comes from data - but data is often dirty, incomplete and complicated. As any analyst would admit, what passes for data science is more like janitorial work. Find out why that is - and how you can avoid the painful, manual and error-prone processes that have bogged down the analytics process for 30 years.
Tags: data, data management, big data, white paper, paxata, analytics
Published By: Cambridge Semantics     Published Date: Mar 13, 2015
As the quantity and diversity of relevant data grows within and outside the enterprise, how can IT easily deploy secure, governed solutions that allow business users to identify, extract, link together and derive value from the right data at the right time, at big data scale, while keeping up with ever-changing business needs? Smart Enterprise Data Management (Smart EDM) is a new, sensible paradigm for managing enterprise data. Anzo Smart Data solutions allow IT departments and their business users to quickly and flexibly access all of their diverse data. Based upon graph data models and semantic data standards, Anzo enables users to easily perform advanced data management and analytics through the lens of their business at a fraction of the time and cost of traditional approaches, while adhering to the governance and security required by enterprise IT groups. Download this whitepaper to learn more.
Tags: enterprise data management, data governance, data integration, cambridge semantics
Published By: Splice Machine     Published Date: Nov 16, 2014
Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. In this white paper, learn about the different options and benefits of scale-out solutions for Oracle database users.
Tags: splice machine, oracle, oracle database, database, hadoop, nosql, white paper, data, data management, dataversity
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud - speed, scale, and agility - with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on prem and off prem
Tags: nosql, ibm, dashdb, database, cloud
Published By: Aerospike     Published Date: Jul 17, 2014
This whitepaper provides a brief technical overview of ACID support in Aerospike. It includes a definition of ACID (Atomicity, Consistency, Isolation, Durability) and an overview of the CAP theorem, which postulates that only two of the three properties of consistency, availability, and partition tolerance can be guaranteed in a distributed system at a specific time. Although Aerospike is an AP system with a proven track record of 100% uptime, this paper describes Aerospike's unique approach to avoiding network partitioning to also ensure high consistency. In addition, the paper describes how Aerospike gives users the option of a CP configuration with complete consistency in the presence of network partitioning, at the cost of reduced availability.
Tags: whitepaper, data, data management, nosql, aerospike, acid
Published By: CMMI Institute     Published Date: Sep 03, 2014
To drive strategic insights that lead to competitive advantage, businesses must make the best and smartest use of today’s vast amount of data. To accomplish this, organizations need to apply a collaborative approach to optimizing their data assets. For organizations that seek to evaluate and improve their data management practices, CMMI® Institute has developed the Data Management Maturity (DMM)℠ model to bridge the perspective gap between business and IT. Download the white paper Why is Measurement of Data Management Maturity Important? to enable you to:
- Empower your executives to make better and faster decisions using a strategic view of their data
- Achieve the elusive alignment and agreement between the business and IT
- Create a clear path to increasing capabilities
Tags: white paper, enterprise data management, data model, data modeling, data maturity model, cmmi institute
Published By: CMMI Institute     Published Date: Mar 21, 2016
CMMI Institute®’s Data Management Maturity (DMM)℠ framework enables organizations to improve data management practices across the full spectrum of their business. It is a unique, comprehensive reference model that provides organizations with a standard set of best practices to assess their capabilities, strengthen their data management program, and develop a custom roadmap for improvements that align with their business goals.
Published By: Access Sciences     Published Date: Sep 07, 2014
Few organizations have fully integrated the role of the Data Steward due to concerns about additional project complexity, time away from other responsibilities or insufficient value in return. The principles of the Agile methodology (whether or not Agile is followed for projects) can offer guidance in making the commitment to designating and empowering the Data Steward role. By placing insightful people in a position to connect innovators, respond to change and spur development aligned with business activities, organizations can expect to see a more efficient and effective use of their information assets.
Published By: Melissa Data     Published Date: Oct 27, 2014
Noted SQL Server MVP and founder/editor of SSWUG.org, Stephen Wynkoop shares his take on the challenge of achieving quality data and the importance of the “Golden Record” to an effective data quality regimen. Wynkoop explores the different approaches to achieving the Golden Record - which involves collapsing duplicate records into a single version of the truth, the single customer view (SCV) - and Melissa Data’s unique approach, which takes into consideration the actual quality of the contact data as the basis of survivorship.
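The survivorship idea the abstract describes can be sketched in a few lines. This is a simplified illustration of field-level survivorship driven by a quality score, not Melissa Data's actual algorithm; the records and scores are made up:

```python
# Hedged sketch of "golden record" survivorship: duplicate contact records
# are collapsed into one, field by field, with values from higher-quality
# records winning. Quality scores here are assumed inputs; real tools
# derive them from verification of the contact data itself.

def golden_record(duplicates):
    """Merge duplicates; each item is (quality_score, field_dict)."""
    merged = {}
    # Iterate from lowest to highest score so better values overwrite.
    for score, record in sorted(duplicates, key=lambda pair: pair[0]):
        for field, value in record.items():
            if value:  # never let an empty value clobber a filled one
                merged[field] = value
    return merged

dupes = [
    (0.6, {"name": "J. Smith", "phone": "555-0100", "email": ""}),
    (0.9, {"name": "Jane Smith", "phone": "", "email": "jane@example.com"}),
]
print(golden_record(dupes))
# -> {'name': 'Jane Smith', 'phone': '555-0100', 'email': 'jane@example.com'}
```

Note that the merged record keeps the phone number only the lower-scored duplicate had: survivorship is decided per field, not per record, which is the point of the single-customer-view approach.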
Tags: data, data management, melissa data, data quality
Published By: Melissa Data     Published Date: Mar 23, 2017
In this eBook published by Melissa, author David Loshin explores the challenges of determining when data values are or are not valid and correct, how these values can be corrected, and how data cleansing services can be integrated throughout the enterprise. This Data Quality Primer eBook gives an overview of the five key aspects of data quality management (data cleansing, address data quality, address standardization, data enhancement, and record linkage/matching), as well as provides practical aspects to introduce proactive data quality management into your organization.
Published By: Basho     Published Date: Mar 08, 2015
Many companies still use relational databases as part of their technology stack. However, others are innovating by incorporating NoSQL solutions, and as a result have simplified their deployments, enhanced their availability and reduced their costs. In this whitepaper you will learn:
- Why companies choose Riak over a relational database
- How to analyze the decision points you should consider when choosing between relational and NoSQL databases
- Simple patterns for building common applications in Riak using its key/value design
Learn how you can lead your organization into this new frontier.
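The key/value design the whitepaper refers to can be sketched as follows. A plain dict stands in for a Riak bucket so the example is self-contained; a real deployment would use a Riak client library, and the key scheme shown is an illustrative assumption:

```python
# Sketch of the key/value pattern: store denormalized documents under
# composite keys built from natural identifiers, so common reads are a
# single lookup with no joins. The dict `store` stands in for a bucket.
import json

store = {}

def put(key, value):
    """Serialize and store a value under a key."""
    store[key] = json.dumps(value)

def get(key):
    """Fetch and deserialize the value for a key."""
    return json.loads(store[key])

# Composite key "<entity>:<id>:<facet>" locates a user's session directly.
put("user:42:session", {"cart": ["sku-123"], "logged_in": True})
print(get("user:42:session")["cart"])
# -> ['sku-123']
```

The design trade-off is that access paths must be known up front: you model keys around the queries you need, rather than normalizing tables and joining at read time.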
Tags: data, data management, basho, database, nosql, data models
Published By: iCEDQ     Published Date: Feb 05, 2015
The demand for using data as an asset has grown to a level where data-centric applications are now the norm in enterprises. Yet data-centric applications fall short of user expectations at a high rate. Part of this is due to inadequate quality assurance. This in turn arises from trying to develop data-centric projects using the old paradigm of the SDLC, which came into existence during an age of process automation. SDLC does not fit with data-centric projects and cannot address the QA needs of these projects. Instead, a new approach is needed where analysts develop business rules to test atomic items of data quality. These rules have to be run in an automated fashion in a business rules engine. Additionally, QA has to be carried past the point of application implementation and support the running of the production environment.
Tags: data, data management, data warehousing, data quality, etl testing, malcolm chisholm
Published By: Expert System     Published Date: Mar 19, 2015
Establishing context and knowledge capture
In today’s knowledge-infused world, it is vitally important for organizations of any size to deploy an intuitive knowledge platform that enables delivery of the right information at the right time, in a way that is useful and helpful. Semantic technology processes content for meaning, allowing for the ability to understand words in context: it allows for better content processing and interpretation, therefore enabling content organization and navigation, which in turn increases findability.
Tags: enterprise data management, unstructured data, semantic technology, expert system
Published By: CapTech     Published Date: May 26, 2015
Big Data is the future of business. According to CloudTweaks.com, as much as 2.5 quintillion bytes of data are produced each day, most of it flowing into Big Data systems. With its ability to bring all data sources together in one centralized place, Big Data opens up opportunities: clearer visibility, customer conversations and transactions. However, with the dazzling big promise of Big Data comes a potentially huge letdown: if this vast pool of information resources is not accessible or usable, it becomes useless. This paper examines strategies for building the most value into your Big Data system by enabling process controls to effectively mine, access and secure Big Data.
Tags: big data, captech, data, data management, nosql
Published By: Neo Technology     Published Date: Jun 28, 2015
The future of Master Data Management lies in deriving value from data relationships, which reveal the data stories that become ever more important to competitive advantage as we enter the future of data and business analytics. MDM will be about supplying consistent, meaningful views of master data and being able to unify data in one location, especially to optimize for query performance and data fit. Graph databases offer exactly that type of data/performance fit. Use data relationships to unlock real business value in MDM:
- Graphs can easily model both hierarchical and non-hierarchical master data
- The logical model IS the physical model, making it easier for business users to visualize data relationships
- Deliver insights in real time from data relationships in your master data
- Stay ahead of the business with faster development
Download and read the white paper Your Master Data Is a Graph: Are You Ready? to learn why your master data is a graph and how graph databases like Neo4j are the best technologies for MDM.
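The claim that one graph model covers both hierarchical and non-hierarchical master data can be illustrated with a toy in-memory graph. In Neo4j this would be expressed in Cypher against a running database; a plain adjacency structure is used here so the sketch runs standalone, and all node and relationship names are invented for illustration:

```python
# Toy graph: the same edge structure holds a hierarchy (organizational
# rollup via PART_OF) and a non-hierarchical link (SUPPLIED_BY), which is
# the modeling flexibility the abstract attributes to graph databases.
from collections import defaultdict

edges = defaultdict(list)

def relate(source, rel, target):
    edges[source].append((rel, target))

# Hierarchical master data: organizational rollup
relate("Store 17", "PART_OF", "Northeast Region")
relate("Northeast Region", "PART_OF", "Acme Retail")

# Non-hierarchical master data: a cross-domain relationship
relate("Store 17", "SUPPLIED_BY", "Fresh Farms Co.")

def rollup(node, rel="PART_OF"):
    """Walk a hierarchy upward from a node, returning the full path."""
    path = [node]
    while True:
        parents = [t for r, t in edges[path[-1]] if r == rel]
        if not parents:
            return path
        path.append(parents[0])

print(rollup("Store 17"))
# -> ['Store 17', 'Northeast Region', 'Acme Retail']
```

Because relationships are first-class data rather than join tables, adding the supplier link required no schema change, which is the kind of evolvability the MDM argument rests on.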
Tags: database, nosql, graph database, big data, master data management, mdm
Published By: GBG Loqate     Published Date: Jul 09, 2015
Businesses are vulnerable when they assume that their data is accurate, because they are almost always losing money without their knowledge. When it comes to data quality, the problems that you don’t suspect are often worse and more pervasive than the ones you are aware of. Addresses are subject to their own specific set of rules. Detecting and correcting address errors is a complex problem, and one that can only be solved with specialized software.
Tags: data, data management, data quality, loqate
Published By: VoltDB     Published Date: Jul 09, 2015
What is fast data? It's data in motion, and it creates Big Data. But handling it requires a radically different approach. Download the Fast Data Stack white paper from VoltDB. Learn how to build fast data applications with an in-memory solution that’s powerful enough for real-time stateful operations.
Tags: data, data management, data stack, big data, voltdb, database, nosql