Published By: Denodo     Published Date: Mar 01, 2019
With the advent of big data and the proliferation of multiple information channels, organizations must store, discover, access, and share massive volumes of traditional and new data sources. Data virtualization transcends the limitations of traditional data integration techniques such as ETL by delivering a simplified, unified, and integrated view of trusted business data. Learn how you can:
• Conquer siloed data in the enterprise
• Integrate all data sources and types
• Cope with regulatory requirements
• Deliver big data solutions that work
• Take the pain out of cloud adoption
• Drive digital transformation
Published By: DATAVERSITY     Published Date: Nov 05, 2014
Ask any CEO whether they want to better leverage their data assets to drive growth, revenue, and productivity, and the answer will most likely be “yes, of course.” Ask many of them what that means or how they will do it, and the answers will be as disparate as most enterprises’ data strategies. To successfully control, utilize, analyze, and store the vast amounts of data flowing through organizations today, an enterprise-wide approach is necessary. The Chief Data Officer (CDO) is the newest member of the executive suite in many organizations worldwide. Their task is to develop and implement the strategies needed to harness the value of an enterprise’s data, while working alongside the CEO, CIO, CTO, and other executives. They are the vital “data” bridge between business and IT. This paper is sponsored by: Paxata and CA Technologies
Tags: chief data officer, cdo, data, data management, research paper, dataversity
Published By: DATAVERSITY     Published Date: Jul 06, 2015
The growth of NoSQL data storage solutions has revolutionized the way enterprises deal with their data. Older relational platforms are still used by most organizations, while implementations of various NoSQL platforms, including Key-Value, Wide Column, Document, Graph, and Hybrid data stores, are increasing at faster rates than ever seen before. These implementations are causing enterprises to revise their Data Management procedures across the board, from governance to analytics, metadata management to software development, and data modeling to regulation and compliance. The time-honored techniques for data modeling are being rewritten, reworked, and modified in a multitude of ways, often wholly dependent on the NoSQL platform under development. This research report analyzes a 2015 DATAVERSITY® survey titled “Modeling NoSQL.” The survey examined a number of crucial issues in the NoSQL world today, with a focus on data modeling in particular.
Published By: Stardog Union     Published Date: Mar 13, 2019
Enterprises must transition to contextualizing their data instead of just collecting it in order to fully leverage their data as a strategic asset. Existing data management solutions such as databases and data lakes encourage data sprawl and duplication. However, true data unification can be achieved with a Knowledge Graph, which seamlessly layers on top of your existing data infrastructure to reveal the interrelationships in your data, no matter its source or format. The Knowledge Graph is also a highly scalable solution, since it retains every analysis performed as a reusable asset, drastically reducing the need for data wrangling over time. Download Knowledge Graphs 101 to learn how this technology differs from a graph database, how it compares to MDM and data lake solutions, and how to leverage artificial intelligence and machine learning within a Knowledge Graph.
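The layering idea described above can be sketched with a toy triple store in plain Python: facts from two siloed sources become subject-predicate-object triples in one graph, so relationships can be traversed across sources. Everything here (the sample data and the `query` helper) is illustrative, not Stardog's API.

```python
# Two "silos": a CRM table and a support-ticket export, unified as triples
# so relationships can be followed regardless of source or format.
crm_rows = [{"id": "c1", "name": "Acme Corp", "region": "EMEA"}]
tickets = [{"ticket": "t9", "customer_id": "c1", "status": "open"}]

triples = set()

# Layer both sources into one graph; the originals are left in place.
for row in crm_rows:
    triples.add((row["id"], "name", row["name"]))
    triples.add((row["id"], "region", row["region"]))
for t in tickets:
    triples.add((t["ticket"], "raised_by", t["customer_id"]))
    triples.add((t["ticket"], "status", t["status"]))

def query(s=None, p=None, o=None):
    """Match triples by any combination of subject / predicate / object."""
    return [(a, b, c) for (a, b, c) in triples
            if (s is None or a == s)
            and (p is None or b == p)
            and (o is None or c == o)]

# Traverse across silos: which customers have open tickets?
open_tickets = [s for (s, _, _) in query(p="status", o="open")]
customers = [o for t in open_tickets for (_, _, o) in query(s=t, p="raised_by")]
names = [o for c in customers for (_, _, o) in query(s=c, p="name")]
print(names)  # ['Acme Corp']
```

The point of the sketch is the last three lines: once both silos share one graph, a cross-source question becomes a short traversal rather than a bespoke integration job.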
Published By: CData     Published Date: Jan 04, 2019
The growth of NoSQL continues to accelerate as the industry is increasingly forced to develop new and more specialized data structures to deal with the explosion of application and device data. At the same time, new data products for BI, Analytics, Reporting, Data Warehousing, AI, and Machine Learning continue along a similar growth trajectory. Enabling interoperability between applications and data sources, each with a unique interface and value proposition, is a tremendous challenge. This paper discusses a variety of mapping and flattening techniques, and continues with examples that highlight performance and usability differences between approaches.
Tags: data architecture, data, data management, business intelligence, data warehousing
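One common flattening technique of the kind the paper surveys can be sketched as follows: a nested document is collapsed into dotted column names so it can be exposed through a table-oriented interface. This is a minimal illustration under assumed naming conventions, not CData's implementation.

```python
# Collapse a nested document into flat, dotted column names; list elements
# are indexed into the column name (one of several possible strategies).
def flatten(doc, prefix=""):
    flat = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        elif isinstance(value, list):
            for i, item in enumerate(value):
                if isinstance(item, dict):
                    flat.update(flatten(item, f"{name}.{i}."))
                else:
                    flat[f"{name}.{i}"] = item
        else:
            flat[name] = value
    return flat

doc = {"order": {"id": 7, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]}}
row = flatten(doc)
print(row)
# {'order.id': 7, 'order.items.0.sku': 'A', 'order.items.0.qty': 2,
#  'order.items.1.sku': 'B', 'order.items.1.qty': 1}
```

Alternatives the paper's framing suggests (e.g. splitting repeated elements into child tables instead of indexed columns) trade wider rows for extra joins, which is exactly the performance/usability tension the examples highlight.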
Published By: Denodo     Published Date: Feb 27, 2019
Organizations continue to struggle with integrating data quickly enough to support the needs of business stakeholders, who need integrated data faster with each passing day. Traditional data integration technologies have not been able to solve the fundamental problem, as they deliver data in scheduled batches and cannot support many of today’s rich and complex data types. Data virtualization is a modern data integration approach that is already meeting today’s data integration challenges, providing the foundation for data integration in the future. Download this whitepaper to learn more about:
• The fundamental challenge for organizations today
• Why traditional solutions fall short
• Why data virtualization is the core solution
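The contrast with scheduled batches can be illustrated with a minimal sketch: a virtual view that joins two heterogeneous sources (a CSV extract and a JSON feed) at query time, without materializing a copy up front. The sources and class names are hypothetical, not Denodo's API.

```python
import csv
import io
import json

# Stand-ins for two live sources: a file extract and a REST response.
csv_source = "id,name\n1,Alice\n2,Bob\n"
json_source = '[{"id": 1, "plan": "gold"}, {"id": 2, "plan": "free"}]'

class VirtualCustomerView:
    """A logical view that joins both sources on demand; nothing is
    copied or staged ahead of time, unlike a batch ETL pipeline."""
    def rows(self):
        names = {int(r["id"]): r["name"]
                 for r in csv.DictReader(io.StringIO(csv_source))}
        plans = {r["id"]: r["plan"] for r in json.loads(json_source)}
        for cid in sorted(names):
            yield {"id": cid, "name": names[cid], "plan": plans.get(cid)}

view = VirtualCustomerView()
result = list(view.rows())
print(result)
# [{'id': 1, 'name': 'Alice', 'plan': 'gold'},
#  {'id': 2, 'name': 'Bob', 'plan': 'free'}]
```

Because the join runs when the view is queried, consumers always see current source data, which is the property batch delivery cannot provide.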
Published By: MEGA International     Published Date: Mar 07, 2019
The effectiveness of agile processes is often jeopardized because the architectural and organizational prerequisites of agility are neglected. This White Paper proposes a new architecture framework, the Agile Architecture Framework (AAF), that meets the needs of the digital enterprise. It develops a vision that uniquely combines:
• Methods for decomposing the system into loosely coupled services and autonomous teams
• Alignment mechanisms rooted in business strategy
• Architecture patterns that leverage the latest software innovations
• Results from large enterprises that started their agile-at-scale journey several years ago
This proposed architecture approach will enable organizations at all scales to better realize the Boundaryless Information Flow™ vision, achieved through global interoperability in a secure, reliable, and timely manner.
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes the challenges faced when beginning a new Data Governance program and outlines the crucial elements of executing such a program successfully. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear. It is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting mechanisms, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by: ASG, DGPO and DebTech International.
Tags: data, data management, data governance, data steward, dataversity, research paper
Published By: CA Technologies     Published Date: Oct 22, 2015
As the interest in managing information and enforcing corporate data management policies increases, data governance programs are becoming more and more vital to business operations. However, in the rush to establish such programs, the true utility and importance of metadata can be missed. In this white paper, David Loshin of Knowledge Integrity, Inc. discusses the importance of data governance and the role of metadata management as a way to empower data governance and enforce data policies.
Tags: white paper, metadata, data management, data modeling, david loshin, data governance, data governance strategy
Published By: SAP     Published Date: May 19, 2016
SAP® solutions for enterprise information management (EIM) support the critical abilities to architect, integrate, improve, manage, associate, and archive all information. By effectively managing enterprise information, your organization can improve its business outcomes. You can better understand and retain customers, work better with suppliers, achieve compliance while controlling risk, and provide internal transparency to drive operational and strategic decisions.
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It is increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
Tags: database, nosql, data, data management, white paper, why nosql, couchbase
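The schema-less document model contrasted above with rigid relational schemas can be illustrated in a few lines: records of different shapes live in one collection, and new fields require no schema migration. The in-memory dict stands in for a distributed document store; this is a sketch, not Couchbase's API.

```python
# A plain dict stands in for a key-value/document store ("bucket").
bucket = {}

def upsert(key, doc):
    """Store a document under a key; no schema is declared anywhere."""
    bucket[key] = doc

upsert("user::1", {"name": "Dana", "email": "dana@example.com"})
# A differently shaped record goes in with no ALTER TABLE or migration:
upsert("user::2", {"name": "Lee", "devices": ["phone", "tablet"], "plan": "pro"})

# Queries tolerate missing fields instead of failing against a rigid schema.
pro_users = [k for k, d in bucket.items() if d.get("plan") == "pro"]
print(pro_users)  # ['user::2']
```

The flip side of this flexibility is that shape-checking moves from the database into application code (the `.get()` call above), a trade-off the paper's relational-vs-NoSQL framing implies.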
Published By: Adaptive     Published Date: May 10, 2017
Enterprise metadata management and data quality management are two important pillars of successful enterprise data management for any organization. A well-implemented enterprise metadata management platform can enable successful data quality management at the enterprise level. This paper describes in detail an approach to integrating data quality and metadata management by leveraging the Adaptive Metadata Manager platform. It explains the various levels of integration and the benefits associated with each.
Published By: Embarcadero     Published Date: Oct 21, 2014
Metadata defines the structure of data in files and databases, providing detailed information about entities and objects. In this white paper, Dr. Robin Bloor and Rebecca Jozwiak of The Bloor Group discuss the value of metadata and the importance of organizing it well, which enables you to:
- Collaborate on metadata across your organization
- Manage disparate data sources and definitions
- Establish an enterprise glossary of business definitions and data elements
- Improve communication between teams
Tags: data, data management, enterprise data management, enterprise information management, metadata, robin bloor, rebecca jozwiak, embarcadero
Published By: Embarcadero     Published Date: Jan 23, 2015
There are multiple considerations for collaborating on metadata within an organization, and you need a good metadata strategy to define and manage the right processes for a successful implementation. In this white paper, David Loshin describes how to enhance enterprise knowledge sharing by using collaborative metadata for structure, content, and semantics.
Tags: data, data management, metadata, enterprise information management, data modeling, embarcadero
Published By: Embarcadero     Published Date: Apr 29, 2015
Everything about data has changed, which makes data models even more essential for understanding that data, so that businesses can know what it means. As development methodologies change to incorporate Agile workflows, data architects must adapt to ensure models stay relevant and accurate. This whitepaper describes key requirements for Agile data modeling and shows how ER/Studio supports this methodology.
Tags: data, data management, data modeling, agile, agile data modeling, it management
Published By: MarkLogic     Published Date: Aug 04, 2014
The Age of Information and the associated growth of the World Wide Web have brought with them a new problem: how to actually make sense of all the information available. The overarching goal of the Semantic Web is to change that. Semantic Web technologies accomplish this goal by providing a universal framework to describe and link data so that it can be better understood and searched holistically, allowing both people and computers to see and discover relationships in the data. Today, organizations are leveraging the power of the Semantic Web to aggregate and link disparate data, improve search navigation, provide holistic search and discovery, dynamically publish content, and complete ETL processes faster. Read this white paper to gain insight into why Semantics is important, understand how Semantics works, and see examples of Semantics in practice.
Tags: data, data management, whitepaper, marklogic, semantic, semantic technology, nosql, database, semantic web, big data
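The describe-and-link idea behind Semantics can be sketched with a pure-Python toy: facts are subject-predicate-object triples, and a SPARQL-style pattern match with variables joins them to discover indirect relationships. The sample facts and the `match` helper are illustrative only, not MarkLogic's API or real SPARQL syntax.

```python
# Facts expressed as subject-predicate-object triples.
triples = [
    ("ExampleCo", "headquartered_in", "San Carlos"),
    ("San Carlos", "located_in", "California"),
    ("California", "located_in", "USA"),
]

def match(pattern, bindings=None):
    """Yield variable bindings for one (s, p, o) pattern; '?x' is a variable."""
    bindings = bindings or {}
    for triple in triples:
        b = dict(bindings)
        ok = True
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if b.get(term, value) != value:  # respect earlier bindings
                    ok = False
                    break
                b[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            yield b

# Join two patterns, SPARQL-style: in which state is ExampleCo headquartered?
results = [b2["?state"]
           for b1 in match(("ExampleCo", "headquartered_in", "?city"))
           for b2 in match(("?city", "located_in", "?state"), b1)]
print(results)  # ['California']
```

The answer is never stated as a single fact; it emerges from linking two facts, which is the "discover relationships in the data" capability the abstract describes.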
Published By: TopQuadrant     Published Date: Jun 11, 2018
Data governance is a lifecycle-centric asset management activity. To understand and realize the value of data assets, it is necessary to capture information about them (their metadata) in a connected way. Capturing the meaning and context of diverse enterprise data in connection to all assets in the enterprise ecosystem is foundational to effective data governance. Therefore, a data governance environment must represent assets and their role in the enterprise using an open, extensible, and “smart” approach. Knowledge graphs are the most viable and powerful way to do this. This short paper outlines how knowledge graphs are flexible, evolvable, semantic, and intelligent. It is these characteristics that enable them to:
• capture the description of data as an interconnected set of information that meaningfully bridges enterprise metadata silos.
• deliver integrated data governance by addressing all three aspects of data governance — Executive Governance, Representative Governance, and App
Published By: CA Technologies     Published Date: Apr 24, 2013
This white paper by industry expert Alec Sharp illustrates these points and provides specific guidelines and techniques for a business-oriented approach to data modeling. Examples demonstrate how business professionals can apply these techniques.
Tags: white paper, ca technologies, erwin, data, data management, data modeling, dataversity
Published By: Cambridge Semantics     Published Date: May 11, 2016
With the explosive growth of Big Data, IT professionals find their time and resources squeezed between managing increasingly large and diverse siloed data stores and meeting increased user demands for timely, accurate data. The graph-based Anzo Smart Data Manager is built to relieve these burdens by automating the process of managing, cataloging, and governing data with enterprise-grade scale and security. Anzo Smart Data Manager allows companies to truly understand their data ecosystems and leverage the metadata within them.
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist, or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud - speed, scale, and agility - with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on prem and off prem
Tags: nosql, ibm, dashdb, database, cloud
Published By: CMMI Institute     Published Date: Sep 03, 2014
To drive strategic insights that lead to competitive advantage, businesses must make the best and smartest use of today’s vast amount of data. To accomplish this, organizations need to apply a collaborative approach to optimizing their data assets. For organizations that seek to evaluate and improve their data management practices, CMMI® Institute has developed the Data Management Maturity (DMM) model to bridge the perspective gap between business and IT. Download the white paper Why is Measurement of Data Management Maturity Important? to enable you to:
- Empower your executives to make better and faster decisions using a strategic view of their data
- Achieve the elusive alignment and agreement between the business and IT
- Create a clear path to increasing capabilities
Tags: white paper, enterprise data management, data model, data modeling, data maturity model, cmmi institute
Published By: Access Sciences     Published Date: Sep 07, 2014
Few organizations have fully integrated the role of the Data Steward due to concerns about additional project complexity, time away from other responsibilities or insufficient value in return. The principles of the Agile methodology (whether or not Agile is followed for projects) can offer guidance in making the commitment to designating and empowering the Data Steward role. By placing insightful people in a position to connect innovators, respond to change and spur development aligned with business activities, organizations can expect to see a more efficient and effective use of their information assets.
Published By: Melissa Data     Published Date: Mar 23, 2017
In this eBook published by Melissa, author David Loshin explores the challenges of determining when data values are or are not valid and correct, how these values can be corrected, and how data cleansing services can be integrated throughout the enterprise. This Data Quality Primer eBook gives an overview of the five key aspects of data quality management (data cleansing, address data quality, address standardization, data enhancement, and record linkage/matching), as well as practical guidance for introducing proactive data quality management into your organization.
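Two of the five aspects listed above, address standardization and record matching, can be illustrated with a toy normalizer: both records are mapped to a canonical form, and a match is declared when the forms agree. The abbreviation table is a small stand-in for the rich postal reference data real cleansing services use.

```python
# A tiny standardization table; real services use full postal reference data.
ABBREV = {"street": "St", "st.": "St", "avenue": "Ave", "ave.": "Ave"}

def standardize(address):
    """Map an address string to a canonical form (case and abbreviations)."""
    words = address.replace(",", " ").split()
    return " ".join(ABBREV.get(w.lower(), w.title()) for w in words)

def is_match(a, b):
    """Record linkage in miniature: compare canonical forms, not raw strings."""
    return standardize(a) == standardize(b)

a = "123 main street"
b = "123 Main St."
print(standardize(a))   # 123 Main St
print(is_match(a, b))   # True
```

The design point mirrors the eBook's framing: raw values that look different may be the same entity, so validation and matching should happen after standardization, not before.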
Published By: Basho     Published Date: Sep 30, 2016
The Internet of Things (IoT) or the Internet of Everything is changing the way companies interact with their customers and manage their data. These connected devices generate high volume time series data that can be created in milliseconds. This fast growth of IoT data and other time series data is producing challenges for enterprise applications where data must be collected, saved, and analyzed in the blink of an eye. Your application needs a database built to uniquely handle time series data to ensure your data is continuously available and accurate. Learn about the only NoSQL database optimized for IoT and Time Series data in this technical overview. Riak TS stores and analyzes massive amounts of data and is designed to be faster than Cassandra.
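The core time-series pattern the overview alludes to can be sketched as rows keyed by series and timestamp, kept sorted so range scans over a device's time window are cheap. This is an illustrative in-memory model, not Riak TS's actual API or storage layout.

```python
from bisect import bisect_left, bisect_right

class TimeSeriesTable:
    """Rows kept sorted per series so a time-window scan is a slice."""
    def __init__(self):
        self._data = {}  # series name -> sorted list of (timestamp, value)

    def insert(self, series, ts, value):
        rows = self._data.setdefault(series, [])
        rows.insert(bisect_left(rows, (ts,)), (ts, value))

    def range(self, series, start, end):
        """Return all rows with start <= timestamp <= end."""
        rows = self._data.get(series, [])
        lo = bisect_left(rows, (start,))
        hi = bisect_right(rows, (end, float("inf")))
        return rows[lo:hi]

ts = TimeSeriesTable()
for t, temp in [(1000, 21.5), (1005, 21.7), (1010, 22.0), (2000, 25.0)]:
    ts.insert("sensor-1/temperature", t, temp)

print(ts.range("sensor-1/temperature", 1000, 1010))
# [(1000, 21.5), (1005, 21.7), (1010, 22.0)]
```

Keying and ordering by (series, timestamp) is what makes millisecond-rate ingest queryable by window; a distributed store applies the same idea across a cluster rather than one sorted list.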
Published By: iCEDQ     Published Date: Feb 05, 2015
The demand for using data as an asset has grown to the point where data-centric applications are now the norm in enterprises. Yet data-centric applications fall short of user expectations at a high rate. Part of this is due to inadequate quality assurance, which in turn arises from trying to develop data-centric projects using the old paradigm of the SDLC, a paradigm that came into existence during an age of process automation. The SDLC does not fit data-centric projects and cannot address their QA needs. Instead, a new approach is needed in which analysts develop business rules that test atomic items of data quality. These rules have to be run in an automated fashion in a business rules engine. Additionally, QA has to be carried past the point of application implementation to support the running of the production environment.
Tags: data, data management, data warehousing, data quality, etl testing, malcolm chisholm
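The rule-based approach described above can be sketched in a few lines: analysts express atomic data-quality checks as named rules, and an engine runs every rule against every row and reports the failures. The rule names and engine shape are illustrative, not iCEDQ's API.

```python
# Atomic data-quality rules, each a (name, predicate) pair an analyst defines.
rules = [
    ("amount_non_negative", lambda row: row["amount"] >= 0),
    ("currency_known",      lambda row: row["currency"] in {"USD", "EUR", "GBP"}),
    ("id_present",          lambda row: bool(row.get("id"))),
]

def run_rules(dataset):
    """Run every rule against every row; return (row index, rule name) failures."""
    failures = []
    for i, row in enumerate(dataset):
        for name, check in rules:
            if not check(row):
                failures.append((i, name))
    return failures

data = [
    {"id": "a1", "amount": 10.0, "currency": "USD"},
    {"id": "",   "amount": -5.0, "currency": "JPY"},
]
print(run_rules(data))
# [(1, 'amount_non_negative'), (1, 'currency_known'), (1, 'id_present')]
```

Because the rules are data, not application code, the same engine can keep running them against production loads after go-live, which is the "carried past the point of application implementation" requirement the abstract raises.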