
Results 1 - 25 of 2148
Published By: TD Bank Group     Published Date: Aug 10, 2018
This paper examines whether blockchain distributed ledger technology could improve the management of trusted information, specifically with respect to data quality. Improvement was assessed by comparing the impact of a distributed ledger as an authoritative source in TD Bank Group's Enterprise Data Quality Management process against standard authoritative sources such as databases and files. Distributed ledger technology is neither expected nor proven to change the Data Quality Management process itself; our analysis focused on the execution advantages made possible by distributed ledger properties that make it an attractive resource for data quality management (DQM).
Tags : 
    
TD Bank Group
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes many challenges faced when beginning a new Data Governance program, and outlines many crucial elements in successfully executing such a program. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear. It is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting mechanisms, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by: ASG, DGPO and DebTech International.
Tags : 
data, data management, data governance, data steward, dataversity, research paper
    
DATAVERSITY
Published By: First San Francisco Partners     Published Date: Oct 29, 2015
One of the biggest challenges in a data management initiative is aligning different and sometimes competing organizations to work towards the same long-term vision. That is why a proactive approach to aligning the organization around a common goal and plan is critical when launching a data management program.
Tags : 
    
First San Francisco Partners
Published By: Melissa Data     Published Date: Mar 23, 2017
In this eBook published by Melissa, author David Loshin explores the challenges of determining when data values are or are not valid and correct, how these values can be corrected, and how data cleansing services can be integrated throughout the enterprise. This Data Quality Primer eBook gives an overview of the five key aspects of data quality management (data cleansing, address data quality, address standardization, data enhancement, and record linkage/matching), as well as practical guidance for introducing proactive data quality management into your organization.
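The cleansing, standardization, and matching aspects listed above can be made concrete with a toy sketch. The rules and function names here are hypothetical illustrations, not Melissa's API:

```python
# Toy sketch of three DQM aspects: cleansing, address standardization,
# and record linkage. The suffix table and rules are illustrative only.
SUFFIXES = {"st": "Street", "st.": "Street", "ave": "Avenue",
            "ave.": "Avenue", "rd": "Road", "rd.": "Road"}

def standardize_address(addr: str) -> str:
    """Cleanse whitespace and expand abbreviated street suffixes."""
    tokens = addr.strip().split()
    return " ".join(SUFFIXES.get(t.lower(), t) for t in tokens)

def records_match(a: str, b: str) -> bool:
    """Record linkage: two addresses link if their standardized forms agree."""
    return standardize_address(a).lower() == standardize_address(b).lower()

print(standardize_address("123  Main st."))            # -> 123 Main Street
print(records_match("123 Main St", "123 main street")) # -> True
```

Real DQM tools apply far richer reference data (postal dictionaries, fuzzy matching), but the shape of the pipeline is the same: standardize first, then link.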
Tags : 
    
Melissa Data
Published By: Melissa Data     Published Date: Jan 18, 2018
Maintaining high-quality data is essential for operational efficiency, meaningful analytics and good long-term customer relationships. But when dealing with multiple sources of data, data quality becomes complex, so you need to know when to build custom data quality tools rather than rely on canned solutions. To answer this question, it is important to understand the difference between rules-based data quality, where internal subject matter expertise is necessary, and active data quality, where different domain expertise and resources are required.
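The rules-based side of that distinction can be sketched as a set of declarative checks encoding internal expertise. This is a hypothetical illustration, not Melissa's product:

```python
# Hypothetical rules-based quality checks: each rule encodes internal
# subject-matter expertise as a named predicate over a record.
RULES = [
    ("age_in_range", lambda r: 0 <= r.get("age", -1) <= 120),
    ("email_has_at", lambda r: "@" in r.get("email", "")),
]

def violations(record: dict) -> list:
    """Return the names of every rule the record fails."""
    return [name for name, check in RULES if not check(record)]

print(violations({"age": 34, "email": "a@b.com"}))  # -> []
print(violations({"age": 150, "email": "oops"}))    # -> ['age_in_range', 'email_has_at']
```

Active data quality, by contrast, typically pulls in external reference data and domain services rather than in-house predicates like these.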
Tags : 
    
Melissa Data
Published By: Ted Hills     Published Date: Jul 02, 2015
Entity-relationship (E-R) modeling is a tried and true notation for use in designing Structured Query Language (SQL) databases, but the new data structures that Not-Only SQL (NOSQL) DBMSs make possible can’t be represented in E-R notation. Furthermore, E-R notation has some limitations even for SQL database design. This article shows how a new notation, the Conceptual and Objective Modeling (COM) notation, is able to represent NOSQL designs that are beyond the reach of E-R notation. At the end, it gives a peek into the tutorial workshop to be given at the 2015 NOSQL Conference in San Jose, CA, US, in August, which will provide opportunities to apply COM notation to practical problems.
Tags : 
nosql, sql, data modeling, data model, er modeling, entity relationship, database, relational, dbms, schema-less, xml, conceptual, logical, physical
    
Ted Hills
Published By: Trillium Software     Published Date: Apr 10, 2017
For the 11th consecutive year, the Gartner Magic Quadrant for Data Quality Tools research report positions Trillium Software as a leader in the Data Quality Software industry. Data Quality is vital to ensuring trust in your data-driven, decision-making business processes. Confidence is the result of a well thought out and executed data quality management strategy, and is critical to remaining competitive in a rapidly changing business world. The 2016 Gartner Magic Quadrant for Data Quality Tools report is a valuable reference, providing the latest insights into the strengths and cautions of leading vendors. Access the report to learn how a leading data quality solution can help you achieve your long-term strategic objectives.
Tags : 
    
Trillium Software
Published By: FairCom     Published Date: May 25, 2016
As companies embrace NoSQL as the “next big thing,” they are rightly cautious of abandoning their investment in SQL. The question a responsible developer or IT manager must investigate is: in which cases is each of these technologies, SQL and NoSQL, the appropriate solution? For example, cloud provider BigStep offered this assessment: “NoSQL is not the best model for OLTP, ad hoc queries, complicated relationships among the data, and situations when stability and reliability outweigh the importance of speed.” While that statement may be true of many NoSQL databases, c-treeACE is the exception. Its unique No+SQL architecture offers the advantages of SQL on top of a robust, high-performance NoSQL core engine. In this white paper, you'll read five ways c-treeACE breaks the NoSQL mold in terms of:
• Data Integrity
• Availability and Reliability
• Complex Data Relationships
• Flexible Queries
• Performance
Tags : 
    
FairCom
Published By: Infogix     Published Date: May 04, 2018
Over the last few years, the term “data governance” has grown in prominence alongside big data. While organizations understand the need for governance around big data, implementing a successful data governance solution remains elusive as organizations grapple with what exactly data governance is. This whitepaper provides a concise definition of data governance and offers some key considerations for a successful data governance solution.
Tags : 
    
Infogix
Published By: CloverETL     Published Date: Nov 24, 2017
The volume of data is increasing by 40% per year (source: IDC), and the structure and quality of data vary widely across a growing number of data sources, so more agile ways of working with data are required. This whitepaper discusses the range of data architectures available for managing and storing data, and offers use cases for each. It also explores the benefits, drawbacks and challenges of each architecture, along with commonly used practices for building them.
Tags : 
    
CloverETL
Published By: Syncsort     Published Date: Jan 04, 2018
The term Big Data doesn’t seem quite “big enough” anymore to properly describe the vast over-abundance of data available to organizations today. As the volume and variety of Big Data sources continue to grow, the level of trust in that data remains troublingly low. Read on and discover how a strong focus on data quality spanning the people, processes and technology of your organization will help keep your data lake pristine.
Tags : 
    
Syncsort
Published By: Syncsort     Published Date: Jul 17, 2018
In most applications we use today, data is retrieved by the source code of the application and is then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work and how the data is used. Today, in a world of AI and machine learning, data has a new role – becoming essentially the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good quality data related to the problem, it’s impossible to create a useful model. Download this Whitepaper to learn why the process of identifying biases present in the data is an essential step towards debugging the data that underlies machine learning predictions and improves data quality.
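One concrete instance of “debugging the data” is checking the label distribution for skew before training. This is an illustrative sketch, not Syncsort's method:

```python
from collections import Counter

def label_skew(labels):
    """Report each label's share of the data; a heavily skewed
    distribution is one simple, detectable form of bias."""
    counts = Counter(labels)
    total = len(labels)
    return {label: round(n / total, 2) for label, n in counts.items()}

training_labels = ["approve"] * 90 + ["deny"] * 10  # toy data
print(label_skew(training_labels))  # -> {'approve': 0.9, 'deny': 0.1}
```

Class imbalance is only one bias among many, but it is cheap to detect before any model is trained on the data.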
Tags : 
    
Syncsort
Published By: Syncsort     Published Date: Oct 25, 2018
In most applications we use today, data is retrieved by the source code of the application and is then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work and how the data is used. Today, in a world of AI and machine learning, data has a new role – becoming essentially the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good quality data related to the problem, it’s impossible to create a useful model. Download this Whitepaper to learn why the process of identifying biases present in the data is an essential step towards debugging the data that underlies machine learning predictions and improves data quality.
Tags : 
    
Syncsort
Published By: Datawatch     Published Date: Apr 06, 2018
Enterprises are focusing on becoming ever more data-driven, meaning that it is simply unacceptable to allow data to go to waste. Yet, as the amount of data businesses collect and control continues to increase exponentially, many organizations are failing to derive enough business value from their data. Companies are feeling the pressure to extract maximum value from all of their data through both defensive and offensive analytics. Defensive analytics are the “plumbing aspects” of data management that must be captured to mitigate risk and establish a basic understanding of business performance. Offensive analytics build on defensive analytics and support overarching business objectives, strategic initiatives and long-term goals using predictive models. In this whitepaper, you will learn how to address many challenges, including streamlining operational reporting, delivering insight and providing a single, unified platform for everyone.
Tags : 
    
Datawatch
Published By: Zoomdata     Published Date: Apr 11, 2018
How do you imagine data? If you’re thinking about it in terms of uniform records and databases, it’s time for an update. Back in the day, analytical engines were limited, so our perception of what could be considered data was, too. Today, the big data renaissance has begun: more data now exists outside of databases than inside them, and EVERYTHING is data. We’re going to help you discover how business intelligence and data sources have changed, and how, as a result, so has our approach to analyzing data. An eBook on this very topic is waiting for you—all it takes is the click of a button.
Tags : 
    
Zoomdata
Published By: birst     Published Date: Jan 21, 2013
This Dive Deep analyst report looks at the process of building an environment for what can be aptly termed Agile Business Analytics.
Tags : 
data, data management, data governance, big data, cloud, business intelligence, semantic technology, nosql, information quality, data quality, metadata, enterprise information management, master data management, mdm, analytics, database
    
birst
Published By: First San Francisco Partners     Published Date: Nov 20, 2013
One of the biggest challenges in a data management initiative is aligning different and sometimes competing organizations to work towards the same long-term vision. It is very difficult to execute a data management program all at once, as a “big bang” approach. Rather, the program should be deployed in phases, starting in one area and incrementally building out and adding value to the rest of the organization over time.
Tags : 
data, data management, enterprise information management, enterprise data management, white paper
    
First San Francisco Partners
Published By: MapR Technologies     Published Date: Jul 26, 2013
Enterprises are faced with new requirements for data. We now have big data that is different from the structured, cleansed corporate data repositories of the past. Before, we had to plan out structured queries. In the Hadoop world, we don’t have to sort data according to a predetermined schema when we collect it. We can store data as it arrives and decide what to do with it later. Today, there are different ways to analyze data collected in Hadoop—but which one is the best way forward?
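The store-first, decide-later approach described above is often called schema-on-read. A toy illustration (not MapR's API) might look like:

```python
import json

# Raw events stored exactly as they arrived -- no upfront schema.
raw_lines = [
    '{"user": "ana", "clicks": 3}',
    '{"user": "raj", "clicks": 7, "referrer": "search"}',
]

def project(lines, fields):
    """Apply a schema at read time: keep only the fields this analysis needs."""
    for line in lines:
        record = json.loads(line)
        yield {f: record.get(f) for f in fields}

print(list(project(raw_lines, ["user", "clicks"])))
# -> [{'user': 'ana', 'clicks': 3}, {'user': 'raj', 'clicks': 7}]
```

Nothing about the stored lines had to be decided in advance; a different analysis could project a different set of fields from the same raw data.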
Tags : 
white paper, hadoop, nosql, mapr, mapr technologies
    
MapR Technologies
Published By: Semarchy     Published Date: Aug 18, 2016
David Loshin reexamines the way we ingest, manage, consume, and transform data into actionable information and intelligence. Read how this industry expert makes the case for data governance with an unconventional business-first focus. The conventional wisdom on data governance proposes hierarchies, operating models, and processes for data policy definition and implementation. Unfortunately, poorly-designed and minimally-planned data governance processes are ineffective because they are bureaucratic and overwhelming. This is especially true when processes are imposed by fiat, take a long time, and don't result in any short-term improvement in information value. But proper data governance is a critical success factor for master data management! In this paper, we examine the motivations for coupling data governance with master data management and consider how to evolve data policies and processes to position master data management for success.
Tags : 
    
Semarchy
Published By: DATAVERSITY     Published Date: Jun 17, 2013
This report analyzes many challenges faced when beginning a new Data Governance program, and outlines many crucial elements in successfully executing such a program. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear. It is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting mechanisms, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success.
Tags : 
research paper, data, data management, data governance, data steward
    
DATAVERSITY
Published By: Spectrum Enterprise     Published Date: Oct 29, 2018
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that measure, respectively, capacity, the time it takes to get from one point to the next, and the actual amount of data you’re receiving. When you buy an Internet connection from Spectrum Enterprise, you’re buying a pipe between your office and the Internet with a set capacity, whether it is 25 Mbps, 10 Gbps, or any increment in between. However, the bandwidth we provide does not tell the whole story; it is the throughput of the entire system that matters. Throughput is affected by obstacles, overhead and latency, meaning the throughput of the system will never equal the bandwidth of your Internet connection. The good news is that an Internet connection from Spectrum Enterprise is engineered to ensure you receive the capacity you purchase; we proactively monitor your bandwidth to ensure problems are dealt with promptly, and we are your advocates across the Internet.
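Those relationships can be made concrete with a back-of-envelope model (a simplified sketch, not Spectrum's engineering formula): throughput is bounded both by the usable bandwidth left after protocol overhead and by the classic TCP window-size/round-trip-time limit.

```python
def effective_throughput_mbps(bandwidth_mbps, overhead_fraction,
                              window_bytes, rtt_seconds):
    """Throughput is capped both by usable bandwidth after protocol
    overhead and by the TCP window/RTT limit, whichever is lower."""
    after_overhead = bandwidth_mbps * (1 - overhead_fraction)
    window_limit_mbps = (window_bytes * 8) / rtt_seconds / 1_000_000
    return round(min(after_overhead, window_limit_mbps), 2)

# A 100 Mbps link with ~5% protocol overhead, a 64 KB TCP window,
# and 40 ms of latency is window-limited far below its line rate:
print(effective_throughput_mbps(100, 0.05, 64 * 1024, 0.040))  # -> 13.11
```

In this example the 64 KB window, not the 100 Mbps pipe, is the bottleneck, which is why throughput can trail bandwidth even on a healthy link.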
Tags : 
    
Spectrum Enterprise
Published By: Dell     Published Date: Nov 02, 2018
For many organizations, digital transformation (DX) is the most strategically important initiative for the organization and may determine its ability to compete in the coming decade. IDC estimates that 60% of organizations will have created and begun implementation of a digital transformation strategy by 2020. These DX initiatives are designed to take the organization forward as a proactive, data-driven company that uses and monetizes data to gain competitive advantage in the marketplace.
Tags : 
    
Dell
Published By: Zendesk Ltd     Published Date: Sep 11, 2018
Even if companies understand that an omnichannel approach performs better in terms of operational metrics and meeting customer expectations, there’s still the question of how to do it. How should companies go about adopting an omnichannel support solution?
Tags : 
    
Zendesk Ltd
Published By: Dell     Published Date: Nov 12, 2018
Today, IT leaders address the PC lifecycle across a continuum from control to transformation. Control is geared to optimization, while transformation focuses on the business impact of technology. Though the two approaches differ, they are not in opposition. They strive for the same goals and face similar challenges. As IT leaders provide their workforce with the tools to carry out the corporate mission, they should develop a PC lifecycle strategy that encompasses the key organizational needs of systems management, end-user productivity, business innovation and data-centric security. Read this Dell whitepaper to learn more about the findings of a recent Forrester Consulting study, “Digital Transformers Innovate, Digital Controllers Optimize”. This paper will help clarify the PC lifecycle continuum, from the basics of control to the advanced levels of transformation, so you will be better equipped to determine the needs of your organization on that spectrum.
Tags : 
    
Dell
Published By: Dell     Published Date: Nov 12, 2018
In December 2017, Dell commissioned Forrester Consulting to conduct a study refresh to determine how enterprise organizations are structured from an IT departmental perspective. The study explored two types of IT, digital controllers and digital transformers, and the trends and challenges seen in PC provisioning. Digital controllers are often associated with a top-down approach and a linear structure, and emphasize security and accuracy. In contrast, digital transformers focus on innovation and employee- and customer-centricity, and prioritize speed and flexibility. By understanding the two groups, enterprises can overcome challenges that arise from PC life-cycle management. By investing in existing PC management tools and partnering with a company that specializes in PC deployment and management, firms can empower employees to better serve customers. Download this Forrester report to learn more about the differences in approach and strategy between these two groups as they address dynamic digital demands.
Tags : 
    
Dell