
Results 1 - 25 of 10623
Published By: DATAVERSITY     Published Date: Nov 05, 2014
Ask any CEO whether they want to better leverage their data assets to drive growth, revenues, and productivity, and the answer will most likely be “yes, of course.” Ask many of them what that means or how they will do it, and their answers will be as disparate as most enterprises’ data strategies. To successfully control, utilize, analyze, and store the vast amounts of data flowing through organizations today, an enterprise-wide approach is necessary. The Chief Data Officer (CDO) is the newest member of the executive suite in many organizations worldwide. Their task is to develop and implement the strategies needed to harness the value of an enterprise’s data, while working alongside the CEO, CIO, CTO, and other executives. They are the vital “data” bridge between business and IT. This paper is sponsored by: Paxata and CA Technologies
Tags : 
chief data officer, cdo, data, data management, research paper, dataversity
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Jul 06, 2015
The growth of NoSQL data storage solutions has revolutionized the way enterprises deal with their data. The older, relational platforms are still used by most organizations, while implementations of various NoSQL platforms, including Key-Value, Wide Column, Document, Graph, and Hybrid data stores, are increasing at faster rates than ever before. Such implementations are causing enterprises to revise their Data Management procedures across the board, from governance to analytics, metadata management to software development, and data modeling to regulation and compliance. The time-honored techniques for data modeling are being rewritten, reworked, and modified in a multitude of ways, often wholly dependent on the NoSQL platform under development. This research report analyzes a 2015 DATAVERSITY® survey titled “Modeling NoSQL.” The survey examined a number of crucial issues in the NoSQL world today, with a particular focus on data modeling.
Tags : 
    
DATAVERSITY
Published By: Denodo     Published Date: Feb 07, 2019
With the advent of big data and the proliferation of multiple information channels, organizations must store, discover, access, and share massive volumes of traditional and new data sources. Data virtualization transcends the limitations of traditional data integration techniques such as ETL by delivering a simplified, unified, and integrated view of trusted business data. Learn how you can:
• Conquer siloed data in the enterprise
• Integrate all data sources and types
• Cope with regulatory requirements
• Deliver big data solutions that work
• Take the pain out of cloud adoption
• Drive digital transformation
Tags : 
    
Denodo
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes the challenges faced when beginning a new Data Governance program and outlines the crucial elements of successfully executing one. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear. It is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting mechanisms, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by: ASG, DGPO and DebTech International.
Tags : 
data, data management, data governance, data steward, dataversity, research paper
    
DATAVERSITY
Published By: DATAVERSITY     Published Date: Dec 27, 2013
There are actually many elements of such a vision working together. ACID and NoSQL are not the antagonists they were once thought to be; NoSQL works well under a BASE model, but some innovative NoSQL systems also fully conform to ACID requirements. Database engineers have puzzled out how to get non-relational systems to work within an environment that demands high availability and scalability, with differing levels of recovery and partition tolerance. BASE is still a leading innovation wedded to the NoSQL model, and the evolution of the two together is harmonious. But that doesn’t mean they always have to be in partnership; there are several options. So while the opening anecdote is true in many cases, organizations that need more diverse possibilities can move into the commercial arena and get the specific option that works best for them. This paper is sponsored by: MarkLogic.
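Atomicity, the "A" in ACID, can be illustrated with a toy key-value store that commits a batch of updates all-or-nothing. This is a minimal sketch for intuition only, not how any of the systems discussed here implement transactions:

```python
import copy

class TinyStore:
    """Toy key-value store with all-or-nothing (atomic) multi-key updates."""
    def __init__(self):
        self.data = {}

    def transact(self, updates):
        # Apply every update to a shadow copy; commit only if all succeed.
        shadow = copy.deepcopy(self.data)
        for key, fn in updates:
            shadow[key] = fn(shadow.get(key))  # any exception aborts the batch
        self.data = shadow  # atomic commit: swap in the new state

store = TinyStore()
store.transact([("a", lambda v: 1), ("b", lambda v: 2)])
try:
    # The second update raises, so the first must not become visible.
    store.transact([("a", lambda v: 99), ("b", lambda v: v / 0)])
except ZeroDivisionError:
    pass
print(store.data)  # {'a': 1, 'b': 2} -- the failed batch left no trace
```

A BASE-style system would instead accept each update independently and reconcile later, trading this all-or-nothing guarantee for availability.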
Tags : 
nosql, database, acid v base, white paper
    
DATAVERSITY
Published By: Alation     Published Date: Jan 06, 2017
90% of the time that is spent creating new reports is recreating information that already exists. Without a way to effectively share prior work and identify verified data sources, analysts and other data consumers lack shared context on how to apply data to analytic inquiries and business decision making. Time is wasted tracking down subject matter experts and trying to unearth tribal knowledge. Leading analytic organizations in retail, healthcare, financial services and technology are using data catalogs to help their analysts find, understand and use data appropriately. What are the 5 critical capabilities of a data catalog? Learn more here:
Tags : 
    
Alation
Published By: Stardog Union     Published Date: Jul 27, 2018
When enterprises consider the benefits of data analysis, the challenge of data variety is often overlooked, even though most successful outcomes are driven by it. Businesses are still struggling with how to query distributed, heterogeneous data using a unified data model. Fortunately, Knowledge Graphs provide a schema-flexible solution based on modular, extensible data models that evolve over time to create a truly unified solution. How is this possible? Download and discover:
• Why businesses should organize information using nodes and edges instead of rows, columns, and tables
• Why schema-free and schema-rigid solutions eventually prove to be impractical
• The three categories of data diversity, including semantic and structural variety
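The nodes-and-edges idea in the first bullet can be sketched in a few lines of Python. This is a toy illustration with invented facts and helper names, not Stardog's actual data model or API:

```python
# Edges as (subject, relation, object): an extensible graph model where
# facts from new sources simply append, with no schema migration.
edges = [
    ("alice", "works_for", "acme"),    # e.g. from an HR system
    ("acme", "located_in", "berlin"),  # e.g. from a CRM export
]

def neighbors(node, relation):
    """All objects reachable from `node` via `relation`."""
    return [o for s, r, o in edges if s == node and r == relation]

# A two-hop query that crosses both sources: where is Alice's employer?
cities = [city for org in neighbors("alice", "works_for")
               for city in neighbors(org, "located_in")]
print(cities)  # ['berlin']
```

In a row-and-column model, the same question would require knowing in advance which tables to join; in the graph model, the query simply follows edges.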
Tags : 
    
Stardog Union
Published By: SAP     Published Date: May 19, 2016
SAP® solutions for enterprise information management (EIM) support the critical abilities to architect, integrate, improve, manage, associate, and archive all information. By effectively managing enterprise information, your organization can improve its business outcomes. You can better understand and retain customers, work better with suppliers, achieve compliance while controlling risk, and provide internal transparency to drive operational and strategic decisions.
Tags : 
    
SAP
Published By: First San Francisco Partners     Published Date: Oct 29, 2015
One of the biggest challenges in a data management initiative is aligning different and sometimes competing organizations to work towards the same long-term vision. That is why a proactive approach to aligning the organization around a common goal and plan is critical when launching a data management program.
Tags : 
    
First San Francisco Partners
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It’s increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved running on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
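The schema-less point can be illustrated with plain Python dictionaries standing in for JSON documents. The field names and records are invented for illustration; this is not Couchbase's API:

```python
# Schema-less model: records in the same collection need not share fields,
# and nested, semi-structured data is stored as-is.
users = [
    {"id": 1, "name": "Ada", "email": "ada@example.com"},
    {"id": 2, "name": "Lin",
     "phones": ["555-0101", "555-0102"],   # a field the first record lacks
     "prefs": {"newsletter": False}},      # nested structure, no join table
]

# A relational table would force one rigid column set on both records;
# here each document carries its own structure, so adding fields to new
# records never requires a schema migration.
for u in users:
    print(u["name"], "->", sorted(k for k in u if k != "name"))
```

The trade-off is that structural expectations move from the schema into application code, which must tolerate missing or extra fields.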
Tags : 
database, nosql, data, data management, white paper, why nosql, couchbase
    
Couchbase
Published By: Embarcadero     Published Date: Apr 29, 2015
Everything about data has changed, but that only means that data models are even more essential to understanding that data so that businesses can know what it means. As development methodologies change to incorporate Agile workflows, data architects must adapt to ensure models stay relevant and accurate. This whitepaper describes key requirements for Agile data modeling and shows how ER/Studio supports this methodology.
Tags : 
data, data management, data modeling, agile, agile data modeling, it management
    
Embarcadero
Published By: Embarcadero     Published Date: Jul 23, 2015
Whether you’re working with relational data, schema-less (NoSQL) data, or model metadata, you need a data architecture that can actively leverage information assets for business value. The most valuable data has high quality, business context, and visibility across the organization. Check out this must-read eBook for essential insights on important data architecture topics.
Tags : 
    
Embarcadero
Published By: MarkLogic     Published Date: Aug 04, 2014
The Age of Information and the associated growth of the World Wide Web have brought with them a new problem: how to actually make sense of all the information available. The overarching goal of the Semantic Web is to change that. Semantic Web technologies accomplish this goal by providing a universal framework to describe and link data so that it can be better understood and searched holistically, allowing both people and computers to see and discover relationships in the data. Today, organizations are leveraging the power of the Semantic Web to aggregate and link disparate data, improve search navigation, provide holistic search and discovery, dynamically publish content, and complete ETL processes faster. Read this white paper to gain insight into why Semantics is important, understand how Semantics works, and see examples of Semantics in practice.
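The core Semantic Web idea can be sketched in plain Python: facts become subject-predicate-object triples, and queries are patterns with wildcards. This is an illustrative toy with invented facts, not MarkLogic's semantics engine or the SPARQL query language:

```python
# A tiny triple store: every fact is a (subject, predicate, object) triple.
triples = {
    ("dickens", "wrote", "oliver_twist"),
    ("oliver_twist", "type", "novel"),
    ("dickens", "born_in", "portsmouth"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return {(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)}

# "Everything we know about Dickens" is a single pattern, with no need
# to know in advance which tables or columns hold the answer.
print(sorted(match(s="dickens")))
```

Because every fact shares this one shape, data from disparate sources can be merged by simply unioning triple sets, which is what makes linking data across silos tractable.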
Tags : 
data, data management, whitepaper, marklogic, semantic, semantic technology, nosql, database, semantic web, big data
    
MarkLogic
Published By: CA Technologies     Published Date: Apr 24, 2013
Using ERwin Data Modeler & Microsoft SQL Azure to Move Data to the Cloud within the DaaS Lifecycle, by Nuccio Piscopo. Cloud computing is one of the major growth areas in the world of IT. This article provides an analysis of how to apply the DaaS (Database as a Service) lifecycle when working with ERwin and the SQL Azure platform. It should help enterprises obtain the benefits of DaaS and take advantage of its potential for the improvement and transformation of data models in the Cloud. The use case introduced identifies key actions, requirements, and practices that can help formulate a plan for successfully moving data to the Cloud.
Tags : 
    
CA Technologies
Published By: MapR Technologies     Published Date: Aug 04, 2018
Legacy infrastructures simply cannot handle the workloads or power the applications that will drive business decisively forward in the years ahead. New infrastructure, new thinking, and new approaches are in the offing, all driven by the mantra 'transform or die.' This book is meant for IT architects; developers and development managers; platform architects; cloud specialists; and big data specialists. Its goal is to help you create a sense of urgency to present to your CXOs and others whose buy-in is needed to make essential infrastructure investments along the journey to digital transformation.
Tags : 
    
MapR Technologies
Published By: Paxata     Published Date: Apr 02, 2014
Why Sift Through Data Landfills? Better business insight comes from data - but data is often dirty, incomplete and complicated. As any analyst would admit, what passes for data science is more like janitorial work. Find out why that is - and how you can avoid the painful, manual and error-prone processes that have bogged down the analytics process for 30 years.
Tags : 
data, data management, big data, white paper, paxata, analytics
    
Paxata
Published By: Paxata     Published Date: Nov 29, 2016
Every organization looks for ways to reduce costs and run more efficiently. In fact, those are key drivers for the mainstream adoption of Hadoop and self-service BI tools. And while we can now collect and store more data than ever before, and we have turned every information worker into a data-hungry analyst, little consideration has been paid to the cost - including time and effort - of preparing data. Download this report to learn more about the hidden cost of data preparation.
Tags : 
    
Paxata
Published By: Aerospike     Published Date: Jul 17, 2014
This whitepaper provides a brief technical overview of ACID support in Aerospike. It includes a definition of ACID (Atomicity, Consistency, Isolation, Durability) and an overview of the CAP Theorem, which postulates that only two of the three properties of consistency, availability, and partition tolerance can be guaranteed in a distributed system at a specific time. Although Aerospike is an AP system with a proven track record of 100% uptime, this paper describes Aerospike's unique approach to avoiding network partitioning to also ensure high consistency. In addition, the paper describes how Aerospike will give users the option to support a CP configuration with complete consistency in the presence of network partitioning by reducing availability.
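The CP-versus-AP trade-off described above can be sketched with a toy three-replica simulation. This is purely illustrative; the majority rule and names here are assumptions for the sketch, not Aerospike's actual protocol:

```python
class Replica:
    def __init__(self):
        self.value = None

def write(replicas, reachable, value, mode):
    """Toy CAP trade-off: during a partition only `reachable` replicas can
    see the write. CP refuses unless a majority is reachable (consistency
    over availability); AP accepts on whatever is reachable."""
    if mode == "CP" and len(reachable) <= len(replicas) // 2:
        raise RuntimeError("unavailable: cannot reach a majority")
    for r in reachable:
        r.value = value

replicas = [Replica(), Replica(), Replica()]
partitioned = replicas[:1]  # network split: only one replica reachable

write(replicas, partitioned, "v1", mode="AP")  # accepted; replicas now disagree
try:
    write(replicas, partitioned, "v2", mode="CP")
except RuntimeError as e:
    print(e)                           # rejected to preserve consistency
print([r.value for r in replicas])    # ['v1', None, None]
```

The AP branch stays available but leaves replicas inconsistent until the partition heals; the CP branch sacrifices availability to avoid ever diverging.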
Tags : 
whitepaper, data, data management, nosql, aerospike, acid
    
Aerospike
Published By: CMMI Institute     Published Date: Mar 21, 2016
CMMI Institute®’s Data Management Maturity (DMM)℠ framework enables organizations to improve data management practices across the full spectrum of their business. It is a unique, comprehensive reference model that provides organizations with a standard set of best practices to assess their capabilities, strengthen their data management program, and develop a custom roadmap for improvements that align with their business goals.
Tags : 
    
CMMI Institute
Published By: Ted Hills     Published Date: Jul 02, 2015
Entity-relationship (E-R) modeling is a tried and true notation for use in designing Structured Query Language (SQL) databases, but the new data structures that Not-Only SQL (NOSQL) DBMSs make possible can’t be represented in E-R notation. Furthermore, E-R notation has some limitations even for SQL database design. This article shows how a new notation, the Conceptual and Objective Modeling (COM) notation, is able to represent NOSQL designs that are beyond the reach of E-R notation. At the end, it gives a peek into the tutorial workshop to be given at the 2015 NOSQL Conference in San Jose, CA, US, in August, which will provide opportunities to apply COM notation to practical problems.
Tags : 
nosql, sql, data modeling, data model, er modeling, entity relationship, database, relational, dbms, schema-less, xml, conceptual, logical, physical
    
Ted Hills
Published By: Ontotext     Published Date: Dec 21, 2015
Learn how semantic technologies make any content intelligent and turn it into revenue for your publishing business. There is a smarter, cost-effective way for publishers to create, maintain and reuse content assets with higher accuracy. It is called dynamic semantic publishing.
Putting Semantic Technologies at Work for the Publishing Industry
An efficient blend of semantic technologies, dynamic semantic publishing enables powerful experiences when it comes to publishers’ main stock of trade: processing and representing information.
Tags : 
    
Ontotext
Published By: Alation     Published Date: Mar 15, 2016
curation (noun): The act of organizing and maintaining a collection (such as artworks, artifacts, or data). Data curation is emerging as a technique to support data governance, especially in data-driven organizations. As self-service data visualization tools have taken off, sharing the nuances and best practices of how to use data becomes ever more critical. Analysts at companies from eBay to Safeway and Square are scaling their data knowledge through curation techniques. What are the 4 steps to successful data curation? Find out here:
Tags : 
data stewardship, self-service analytics, data curation, data governance
    
Alation
Published By: Amazon Web Services     Published Date: Apr 04, 2016
Amazon DynamoDB is a fully managed, NoSQL database service. Many workloads implemented using a traditional Relational Database Management System (RDBMS) are good candidates for a NoSQL database such as DynamoDB. This whitepaper details the process for identifying these candidate workloads and planning and executing a migration to DynamoDB.
Tags : 
    
Amazon Web Services