IT Management

Published By: DATAVERSITY     Published Date: Jul 06, 2015
The growth of NoSQL data storage solutions has revolutionized the way enterprises deal with their data. Older relational platforms are still used by most organizations, while implementations of varying NoSQL platforms, including Key-Value, Wide Column, Document, Graph, and Hybrid data stores, are increasing at faster rates than ever before. Such implementations are causing enterprises to revise their Data Management procedures across the board, from governance to analytics, metadata management to software development, and data modeling to regulation and compliance. The time-honored techniques for data modeling are being rewritten, reworked, and modified in a multitude of ways, often wholly dependent on the NoSQL platform under development. This research report analyzes a 2015 DATAVERSITY® survey titled “Modeling NoSQL.” The survey examined a number of crucial issues within the NoSQL world today, with a particular focus on data modeling.
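To illustrate the kind of platform-dependent modeling choice the survey explores, here is a minimal, generic sketch (not drawn from the survey itself) contrasting a normalized relational layout with a nested document-store aggregate; the entity and field names are illustrative assumptions.

```python
# Illustrative only: the same "order" modeled relationally vs. as a document.
# Entity and field names are hypothetical, not taken from the survey.

# Relational-style: normalized rows linked by foreign keys.
customers = [{"customer_id": 1, "name": "Acme Corp"}]
orders    = [{"order_id": 10, "customer_id": 1, "date": "2015-07-06"}]
lines     = [{"order_id": 10, "sku": "A-100", "qty": 2},
             {"order_id": 10, "sku": "B-200", "qty": 1}]

# Document-style: one aggregate per order, nesting what the application
# reads together. The schema lives in the application, not the database.
order_doc = {
    "order_id": 10,
    "date": "2015-07-06",
    "customer": {"customer_id": 1, "name": "Acme Corp"},
    "lines": [
        {"sku": "A-100", "qty": 2},
        {"sku": "B-200", "qty": 1},
    ],
}
```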
Tags : 
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Oct 04, 2016
This report evaluates each question posed in a recent survey and provides detailed analysis that includes the most noteworthy statistics, direct comments from survey respondents, and the implications for the industry as a whole. It seeks to present readers with a thorough review of the state of Metadata Management as it exists today.
Tags : 
    
DATAVERSITY
Published By: Erwin     Published Date: Sep 13, 2018
Do you know what data you have, where it is and how to wring all the possible value from it? By harmonizing your data management and data governance efforts, you can accelerate your time to data preparation, data visibility and data-driven insights. Then you’ll know how to get the results you need.
Tags : 
    
Erwin
Published By: TD Bank Group     Published Date: Aug 10, 2018
This paper examines whether blockchain distributed ledger technology could improve the management of trusted information, specifically considering data quality. Improvement was determined by considering the impact of a distributed ledger as an authoritative source in TD Bank Group's Enterprise Data Quality Management Process versus the use of standard authoritative sources such as databases and files. Distributed ledger technology is not expected, or proven, to result in a change in the Data Quality Management process. Our analysis focused on execution advantages possible due to distributed ledger properties that make it an attractive resource for data quality management (DQM).
Tags : 
    
TD Bank Group
Published By: DATAVERSITY     Published Date: Feb 27, 2013
In its most basic definition, unstructured data simply means any form of data that does not easily fit into a relational model or a set of database tables.
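As a minimal illustration of this definition (not taken from the paper), the sketch below contrasts a record that fits neatly into columns with a free-text note that does not; the field names are hypothetical.

```python
# Illustrative contrast (not from the paper): a structured record fits a
# fixed set of columns; unstructured data does not map cleanly to any.
structured_row = {"customer_id": 42, "country": "US", "opened": "2013-02-27"}

unstructured_note = (
    "Spoke with the customer on Wednesday; they want to consolidate three "
    "legacy accounts and asked about fees. Follow up next quarter."
)
# The note carries real business meaning, but there is no obvious column
# structure to store it in beyond a single free-text field.
```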
Tags : 
white paper, dataversity, unstructured data, enterprise data management, data, data management
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Nov 11, 2013
This report investigates the level of Information Architecture (IA) implementation and usage at the enterprise level. The primary support for the report is an analysis of a 2013 DATAVERSITY™ survey on Data and Information Architecture. This paper is sponsored by: HP, Vertica, Denodo, Embarcadero and CA Technologies.
Tags : 
information architecture, data architecture, white paper, mdm, master data management, data, data management, enterprise information management, enterprise data management, data virtualization, metadata, data modeling, research paper, survey, data integration
    
DATAVERSITY
Published By: CA Technologies     Published Date: Oct 22, 2015
As interest in managing information and enforcing corporate data management policies increases, data governance programs to manage data sets are becoming more and more vital to business operations. However, in this rush toward data governance, the true utility and importance of metadata can sometimes be missed. In this white paper, David Loshin of Knowledge Integrity, Inc. discusses the importance of data governance and the role of metadata management as a way to empower data governance and enforce data policies.
Tags : 
white paper, metadata, data management, data modeling, david loshin, data governance, data governance strategy
    
CA Technologies
Published By: First San Francisco Partners     Published Date: Oct 29, 2015
One of the biggest challenges in a data management initiative is aligning different and sometimes competing organizations to work towards the same long-term vision. That is why a proactive approach to aligning the organization around a common goal and plan is critical when launching a data management program.
Tags : 
    
First San Francisco Partners
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It is increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved by running on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
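The following is a deliberately naive sketch of the scale-out idea described above: keys hashed across a cluster of commodity nodes. It is not Couchbase's actual partitioning scheme; the node names and hashing choice are assumptions.

```python
import hashlib

# Illustrative only: a naive hash-based sharding scheme to show the
# scale-out idea in general terms. This is not Couchbase's actual
# partitioning algorithm; node names are assumptions.
NODES = ["node-a", "node-b", "node-c", "node-d"]

def node_for_key(key: str) -> str:
    """Map a document key to one node in the cluster."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return NODES[int(digest, 16) % len(NODES)]

# Each document (a schema-less JSON-like value) lives on whichever
# commodity server its key hashes to; adding capacity means adding nodes
# and rebalancing, rather than buying a bigger machine.
print(node_for_key("user::1001"))
print(node_for_key("user::1002"))
```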
Tags : 
database, nosql, data, data management, white paper, why nosql, couchbase
    
Couchbase
Published By: Adaptive     Published Date: May 10, 2017
Enterprise metadata management and data quality management are two important pillars of successful enterprise data management for any organization. A well-implemented enterprise metadata management platform can enable successful data quality management at the enterprise level. This paper describes in detail an approach to integrating data quality and metadata management by leveraging the Adaptive Metadata Manager platform. It explains the various levels of integration and the benefits associated with each.
Tags : 
    
Adaptive
Published By: TopQuadrant     Published Date: Mar 21, 2015
Data management is becoming more and more central to the business model of enterprises. The time when data was looked at as little more than the byproduct of automation is long gone, and today we see enterprises vigorously engaged in trying to unlock maximum value from their data, even to the extent of directly monetizing it. Yet many of these efforts are hampered by immature data governance and management practices stemming from a legacy that did not pay much attention to data. Part of this problem is a failure to understand that there are different types of data, and that each type has its own special characteristics, challenges, and concerns. Reference data is a special type of data. It is essentially codes whose basic job is to turn other data into meaningful business information and to provide an informational context for the wider world in which the enterprise functions. This paper discusses the challenges associated with implementing a reference data management solution and the essential components of any vision for the governance and management of reference data. It covers the following topics in some detail:
· What is reference data?
· Why is reference data management important?
· What are the challenges of reference data management?
· What are some best practices for the governance and management of reference data?
· What capabilities should you look for in a reference data solution?
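As a minimal, generic sketch of the concept (not from the paper), the example below treats a small set of country codes as reference data that both resolves transactional values to business meaning and backs a simple validation rule.

```python
# Illustrative only: country codes as a small reference data set.
# The codes turn a bare value ("DE") in transactional records into
# meaningful business information ("Germany") and support validation.
COUNTRY_CODES = {
    "US": "United States",
    "DE": "Germany",
    "JP": "Japan",
}

def describe(code: str) -> str:
    """Resolve a reference code to its business meaning."""
    return COUNTRY_CODES.get(code, "UNKNOWN CODE")

def is_valid(code: str) -> bool:
    """A simple data quality rule: the code must exist in the reference set."""
    return code in COUNTRY_CODES

transactions = [{"id": 1, "country": "DE"}, {"id": 2, "country": "XX"}]
for t in transactions:
    status = "valid" if is_valid(t["country"]) else "invalid"
    print(t["id"], describe(t["country"]), status)
```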
Tags : 
data management, data, reference data, reference data management, top quadrant, malcolm chisholm
    
TopQuadrant
Published By: TopQuadrant     Published Date: Jun 01, 2017
This paper presents a practitioner-informed roadmap intended to assist enterprises in maturing their Enterprise Information Management (EIM) practices, with a specific focus on improving Reference Data Management (RDM). Reference data is found in every application used by an enterprise, including back-end systems, front-end commerce applications, data exchange formats, outsourced and hosted systems, big data platforms, and data warehouses. It can easily account for 20–50% of the tables in a data store, and its values are used throughout transactional and mastered data sets to make the system internally consistent.
Tags : 
    
TopQuadrant
Published By: CA Technologies     Published Date: Apr 24, 2013
This white paper by industry expert Alec Sharp provides specific guidelines and techniques for a business-oriented approach to data modeling, with examples that demonstrate how business professionals can participate directly in the modeling process.
Tags : 
white paper, ca technologies, erwin, data, data management, data modeling, dataversity
    
CA Technologies
Published By: MapR Technologies     Published Date: Aug 01, 2018
How do you get a machine learning system to deliver value from big data? It turns out that 90% of the effort required for success in machine learning is not the algorithm, the model, or the learning: it's the logistics. Ted Dunning and Ellen Friedman identify what matters in machine learning logistics and what challenges arise, especially in a production setting, and they introduce an innovative solution: the rendezvous architecture. This new design for model management is based on a streaming approach in a microservices style. Rendezvous addresses the need to preserve and share raw data, to do effective model-to-model comparisons, and to have new models on standby, ready for a hot hand-off when a production model needs to be replaced.
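The sketch below is a highly simplified, generic rendering of the rendezvous idea as summarized above, not the authors' implementation; the model names, scoring interface, and logging step are assumptions.

```python
# Highly simplified sketch of the rendezvous idea described above; this is
# not the authors' implementation. Model names and interfaces are assumed.
from typing import Callable, Dict

Model = Callable[[dict], float]

def log_for_comparison(request: dict, results: Dict[str, float]) -> None:
    # In a real rendezvous setup this would go to a message stream so that
    # model-to-model comparisons can be made and standby models kept warm.
    print({"request": request, "results": results})

def score_with_rendezvous(request: dict,
                          models: Dict[str, Model],
                          primary: str) -> float:
    """Send the same request to every model; return the primary model's
    answer and keep the challengers' answers for offline comparison."""
    results = {name: model(request) for name, model in models.items()}
    log_for_comparison(request, results)   # raw input plus all outputs
    return results[primary]                # only the primary model answers

models = {"current": lambda r: 0.71, "candidate": lambda r: 0.78}
print(score_with_rendezvous({"features": [1, 2, 3]}, models, primary="current"))
```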
Tags : 
    
MapR Technologies
Published By: Semantic Arts     Published Date: Aug 01, 2013
This white paper explains how semantic technology can help organizations leverage their legacy investments into new solutions through the use of a semantic layer, improving IT productivity by reducing complexity and thereby reducing total cost of ownership.
Tags : 
white paper, semantic technology, data, data management
    
Semantic Arts
Published By: Information Asset, LLC     Published Date: Feb 11, 2014
This white paper provides an in-depth review of data governance software tools, covering a reference architecture, evaluation criteria, and the vendor landscape.
Tags : 
white paper, data governance, data, data management, data management white paper, data governance white paper
    
Information Asset, LLC
Published By: Cambridge Semantics     Published Date: Mar 13, 2015
As the quantity and diversity of relevant data grows within and outside the enterprise, how can IT easily deploy secure, governed solutions that allow business users to identify, extract, link together, and derive value from the right data at the right time, at big data scale, while keeping up with ever-changing business needs? Smart Enterprise Data Management (Smart EDM) is a new, sensible paradigm for managing enterprise data. Anzo Smart Data solutions allow IT departments and their business users to quickly and flexibly access all of their diverse data. Based upon graph data models and Semantic data standards, Anzo enables users to easily perform advanced data management and analytics through the lens of their business at a fraction of the time and cost of traditional approaches, while adhering to the governance and security required by enterprise IT groups. Download this whitepaper to learn more.
Tags : 
enterprise data management, data governance, data integration, cambridge semantics
    
Cambridge Semantics
Published By: Splice Machine     Published Date: Nov 16, 2014
Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. In this white paper, learn about the different options and benefits of scale-out solutions for Oracle database users.
Tags : 
splice machine, oracle, oracle database, database, hadoop, nosql, white paper, data, data management, dataversity
    
Splice Machine
Published By: Aerospike     Published Date: Jul 17, 2014
This whitepaper provides a brief technical overview of ACID support in Aerospike. It includes a definition of ACID (Atomicity, Consistency, Isolation, Durability) and an overview of the CAP Theorem, which postulates that only two of the three properties of consistency, availability, and partition tolerance can be guaranteed in a distributed system at a specific time. Although Aerospike is an AP system with a proven track record of 100% uptime, this paper describes Aerospike's unique approach to avoiding network partitioning in order to also ensure high consistency. In addition, the paper describes how Aerospike will give users the option to support a CP configuration with complete consistency in the presence of network partitioning by reducing availability.
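As a generic illustration of the consistency/availability trade-off (not Aerospike's mechanism), the sketch below uses the classic quorum condition R + W > N, under which reads are guaranteed to overlap the latest write at the cost of rejecting requests when too few replicas respond.

```python
# Generic illustration of the consistency/availability trade-off, not
# Aerospike's mechanism: with N replicas, requiring R read and W write
# acknowledgements such that R + W > N guarantees reads see the latest
# write, at the cost of refusing requests when too few replicas respond.
def quorum_is_consistent(n_replicas: int, r: int, w: int) -> bool:
    return r + w > n_replicas

def can_serve(request_acks: int, required: int) -> bool:
    # During a partition, fewer replicas answer; a CP-style setting
    # sacrifices availability rather than return possibly stale data.
    return request_acks >= required

N, R, W = 3, 2, 2
print(quorum_is_consistent(N, R, W))          # True: strongly consistent setting
print(can_serve(request_acks=1, required=R))  # False: reject rather than risk staleness
```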
Tags : 
whitepaper, data, data management, nosql, aerospike, acid
    
Aerospike
Published By: CMMI Institute     Published Date: Sep 03, 2014
To drive strategic insights that lead to competitive advantage, businesses must make the best and smartest use of today’s vast amount of data. To accomplish this, organizations need to apply a collaborative approach to optimizing their data assets. For organizations that seek to evaluate and improve their data management practices, CMMI® Institute has developed the Data Management Maturity (DMM)℠ model to bridge the perspective gap between business and IT. Download the white paper Why is Measurement of Data Management Maturity Important? to enable you to:
- Empower your executives to make better and faster decisions using a strategic view of their data
- Achieve the elusive alignment and agreement between the business and IT
- Create a clear path to increasing capabilities
Tags : 
white paper, enterprise data management, data model, data modeling, data maturity model, cmmi institute
    
CMMI Institute
Published By: CMMI Institute     Published Date: Mar 21, 2016
CMMI Institute®’s Data Management Maturity (DMM)℠ framework enables organizations to improve data management practices across the full spectrum of their business. It is a unique, comprehensive reference model that provides organizations with a standard set of best practices to assess their capabilities, strengthen their data management program, and develop a custom roadmap for improvements that align with their business goals.
Tags : 
    
CMMI Institute
Published By: Melissa Data     Published Date: Mar 23, 2017
In this eBook published by Melissa, author David Loshin explores the challenges of determining when data values are or are not valid and correct, how these values can be corrected, and how data cleansing services can be integrated throughout the enterprise. This Data Quality Primer eBook gives an overview of the five key aspects of data quality management (data cleansing, address data quality, address standardization, data enhancement, and record linkage/matching) and provides practical guidance for introducing proactive data quality management into your organization.
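As a minimal, generic illustration of record linkage/matching, one of the five aspects listed above (and not Melissa's algorithm), the sketch below compares two customer records using normalized names and a similarity threshold; the threshold and field choices are assumptions.

```python
import difflib

# Generic illustration of record linkage/matching (one of the five aspects
# listed above); this is not Melissa's algorithm. The similarity threshold
# and field choices are illustrative assumptions.
def normalize(name: str) -> str:
    return " ".join(name.lower().replace(".", "").split())

def likely_same_customer(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Treat two records as a probable match if their normalized names are
    highly similar and their postal codes agree."""
    similarity = difflib.SequenceMatcher(
        None, normalize(a["name"]), normalize(b["name"])).ratio()
    return similarity >= threshold and a["postal"] == b["postal"]

rec1 = {"name": "Jon Smith",  "postal": "02139"}
rec2 = {"name": "John Smith", "postal": "02139"}
print(likely_same_customer(rec1, rec2))  # True: probable duplicate record
```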
Tags : 
    
Melissa Data
Published By: Trillium Software     Published Date: Apr 10, 2017
For the 11th consecutive year, the Gartner Magic Quadrant for Data Quality Tools research report positions Trillium Software as a leader in the data quality software industry. Data quality is vital to ensuring trust in your data-driven decision-making business processes. Confidence is the result of a well-thought-out and well-executed data quality management strategy and is critical to remaining competitive in a rapidly and ever-changing business world. The 2016 Gartner Magic Quadrant for Data Quality Tools report is a valuable reference, providing the latest insights into the strengths and cautions of leading vendors. Access the report to learn how a leading data quality solution can help you achieve your long-term strategic objectives.
Tags : 
    
Trillium Software
Published By: Experian     Published Date: May 17, 2016
Every year, Experian Data Quality conducts a study to look at global trends in data quality. This year, the research findings reveal how data practitioners are leveraging and managing data to generate actionable insight, and how proper data management is becoming an organization-wide imperative. The study polled more than 1,400 people across eight countries, from a variety of roles and departments. Respondents were chosen based on their visibility into their organization's customer data management practices. Read through our research report to learn:
- The changes in channel usage over the last 12 months
- Expected changes in big data and data management initiatives
- Multi-industry benchmarks, comparisons, and challenges in data quality
- And more!
Our annual global benchmark report takes a close look at the data quality and data management initiatives driving today's businesses. See where you line up and where you can improve.
Tags : 
    
Experian
Published By: Experian     Published Date: Mar 30, 2017
Businesses today recognize the importance of the data they hold, but a general lack of trust in the quality of their data prevents them from achieving strategic business objectives. About half of organizations globally say that a lack of trust in their data contributes to an increased risk of non-compliance and regulatory penalties (52%) and a downturn in customer loyalty (51%). To be of value to organizations, data needs to be trustworthy. In this report, you will read about the findings from this unique study, including:
· How data powers business opportunities
· Why trusted data is essential for performance
· Challenges that affect data quality
· The current state of data management practices
· Upcoming data-related projects in 2017
Tags : 
    
Experian