
Results 1 - 25 of 5002
Published By: DATAVERSITY     Published Date: Oct 04, 2016
This report evaluates each question posed in a recent survey and provides subsequent analysis in a detailed format that includes the most noteworthy statistics, direct comments from survey respondents, and the influence on the industry as a whole. It seeks to present readers with a thorough review of the state of Metadata Management as it exists today.
Tags : 
    
DATAVERSITY
Published By: Alation     Published Date: Jan 06, 2017
90% of the time spent creating new reports goes into recreating information that already exists. Without a way to effectively share prior work and identify verified data sources, analysts and other data consumers lack shared context on how to apply data to analytic inquiries and business decisions. Time is wasted tracking down subject matter experts and trying to unearth tribal knowledge. Leading analytics organizations in retail, healthcare, financial services, and technology are using data catalogs to help their analysts find, understand, and use data appropriately. What are the 5 critical capabilities of a data catalog? Learn more here:
Tags : 
    
Alation
Published By: SAP     Published Date: May 19, 2016
SAP® solutions for enterprise information management (EIM) support the critical abilities to architect, integrate, improve, manage, associate, and archive all information. By effectively managing enterprise information, your organization can improve its business outcomes. You can better understand and retain customers, work better with suppliers, achieve compliance while controlling risk, and provide internal transparency to drive operational and strategic decisions.
Tags : 
    
SAP
Published By: Embarcadero     Published Date: Jul 23, 2015
Whether you’re working with relational data, schema-less (NoSQL) data, or model metadata, you need a data architecture that can actively leverage information assets for business value. The most valuable data has high quality, business context, and visibility across the organization. Check out this must-read eBook for essential insights on important data architecture topics.
Tags : 
    
Embarcadero
Published By: MarkLogic     Published Date: Aug 04, 2014
The Age of Information and the associated growth of the World Wide Web have brought with them a new problem: how to actually make sense of all the information available. The overarching goal of the Semantic Web is to change that. Semantic Web technologies accomplish this goal by providing a universal framework to describe and link data so that it can be better understood and searched holistically, allowing both people and computers to see and discover relationships in the data. Today, organizations are leveraging the power of the Semantic Web to aggregate and link disparate data, improve search navigation, provide holistic search and discovery, dynamically publish content, and complete ETL processes faster. Read this white paper to gain insight into why Semantics is important, understand how Semantics works, and see examples of Semantics in practice.
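The describe-and-link model the abstract refers to can be illustrated with a toy subject-predicate-object triple store. This is a minimal, self-contained sketch in plain Python, not MarkLogic's actual API, and the entities and predicates are invented for illustration:

```python
# Toy triple store: facts as (subject, predicate, object) tuples.
# Linking happens because any object can be the subject of another triple.
triples = {
    ("Alice", "worksFor", "Acme"),
    ("Acme", "locatedIn", "Boston"),
    ("Alice", "knows", "Bob"),
    ("Bob", "worksFor", "Initech"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Traverse the links: in which city does Alice's employer sit?
employers = [o for (_, _, o) in match(s="Alice", p="worksFor")]
cities = [o for e in employers for (_, _, o) in match(s=e, p="locatedIn")]
print(cities)  # ['Boston']
```

The point of the sketch is that relationships are first-class data: the "Alice to Boston" answer is discovered by following links, not by joining tables whose schema anticipated the question.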
Tags : 
data, data management, whitepaper, marklogic, semantic, semantic technology, nosql, database, semantic web, big data
    
MarkLogic
Published By: MapR Technologies     Published Date: Aug 01, 2018
How do you get a machine learning system to deliver value from big data? It turns out that 90% of the effort required for success in machine learning is not the algorithm, the model, or the learning: it's the logistics. Ted Dunning and Ellen Friedman identify what matters in machine learning logistics and what challenges arise, especially in a production setting, and they introduce an innovative solution: the rendezvous architecture. This new design for model management is based on a streaming approach in a microservices style. Rendezvous addresses the need to preserve and share raw data, to make effective model-to-model comparisons, and to have new models on standby, ready for a hot hand-off when a production model needs to be replaced.
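The model-to-model comparison idea described above can be sketched in a few lines: every request is dispatched to all candidate models, the production model's answer is returned to the caller, and the challengers' answers are logged for offline comparison on identical live inputs. This is a simplified, hypothetical illustration in plain Python, not the book's streaming implementation; the model functions and names are invented:

```python
# Toy rendezvous-style dispatcher: all models see every request,
# but only the production model's answer is served.
comparison_log = []  # raw inputs and all outputs, preserved for analysis

def primary_model(x):
    return x * 2          # current production model

def challenger_model(x):
    return x * 2 + 1      # candidate on standby, evaluated but not served

MODELS = {"primary": primary_model, "challenger": challenger_model}

def rendezvous(request):
    results = {name: model(request) for name, model in MODELS.items()}
    comparison_log.append((request, results))
    return results["primary"]

answer = rendezvous(21)
print(answer)  # 42
```

Because the challenger has already been scored on real traffic, promoting it to production becomes a hot hand-off rather than a risky cutover.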
Tags : 
    
MapR Technologies
Published By: Cambridge Semantics     Published Date: Aug 17, 2015
As the quantity and diversity of relevant data grows within and outside of the enterprise, business users and IT are struggling to extract maximum value from this data. Current approaches, including the rigid relational data warehouse and the unwieldy Hadoop-only Data Lake, are limited in their ability to provide users and IT with the answers they need with the proper governance and security required. Read this whitepaper to learn how The Anzo Smart Data Lake from Cambridge Semantics solves these problems by disrupting the way IT and business alike manage and analyze data at enterprise scale with unprecedented flexibility, insight and speed.
Tags : 
    
Cambridge Semantics
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist, or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud (speed, scale, and agility) with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on premises and off
Tags : 
nosql, ibm, dashdb, database, cloud
    
Cloudant - an IBM Company
Published By: Cloudant - an IBM Company     Published Date: Aug 01, 2015
The database you pick for your next web or mobile application matters now more than ever. Today’s applications are expected to run non-stop and must efficiently manage continuously growing amounts of transactional and multi-structured data in order to do so. This has caused NoSQL to grow from a buzzword to a serious consideration for every database, from small shops to the enterprise. Read this whitepaper to learn why NoSQL databases have become such a popular option, explore the various types available, and assess whether you should consider implementing a NoSQL solution for your next application.
Tags : 
    
Cloudant - an IBM Company
Published By: AnalytixDS     Published Date: Mar 21, 2017
AnalytiX DS Mapping Manager is an award-winning platform that has enabled organizations worldwide to meet challenges, foster widespread collaboration, and put structure and governance in place, regardless of the size of their architecture, data, or user base. Scalable, adaptable, and value-adding through its feature-rich modules, it can make the difference between seeing BASEL III as a challenge and seeing it as an opportunity.
Tags : 
    
AnalytixDS
Published By: AnalytixDS     Published Date: May 04, 2018
The General Data Protection Regulation (GDPR) is a regulation by which the European Parliament, the Council of the European Union, and the European Commission intend to strengthen and unify data protection for all individuals within the European Union (EU). It also addresses the export of personal data outside the EU. The GDPR goes into effect on May 25, 2018, making organizations accountable for personal data protection, including how and where data is stored and how it is processed within the organization. Get ready with the most comprehensive governance and automation platform in the industry.
Tags : 
    
AnalytixDS
Published By: Neo Technology     Published Date: Jun 28, 2015
The future of Master Data Management lies in deriving value from data relationships, which reveal data stories that become more and more important to competitive advantage as we enter the future of data and business analytics. MDM will be about supplying consistent, meaningful views of master data and unifying data into one location, especially to optimize query performance and data fit. Graph databases offer exactly that type of data/performance fit. Use data relationships to unlock real business value in MDM:
- Graphs can easily model both hierarchical and non-hierarchical master data
- The logical model IS the physical model, making it easier for business users to visualize data relationships
- Deliver insights in real time from data relationships in your master data
- Stay ahead of the business with faster development
Download and read the white paper Your Master Data Is a Graph: Are You Ready? to learn why your master data is a graph and how graph databases like Neo4j are the best technologies for MDM.
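The claim that one graph can hold both hierarchical and non-hierarchical master data is easy to see in miniature. The sketch below uses plain Python rather than Neo4j or Cypher, and the entities and relationship types are invented for illustration:

```python
# Toy property graph: master data as nodes plus typed edges.
# A hierarchical relationship (PART_OF) and a non-hierarchical one
# (SUPPLIES) coexist in the same structure.
edges = [
    ("Store-12", "PART_OF", "Region-East"),
    ("Region-East", "PART_OF", "RetailCo"),
    ("SupplierA", "SUPPLIES", "Store-12"),
]

def neighbors(node, rel):
    """Nodes reachable from `node` over edges of type `rel`."""
    return [dst for (src, r, dst) in edges if src == node and r == rel]

def ancestors(node):
    """Walk the PART_OF hierarchy up to the root."""
    chain = []
    while True:
        parents = neighbors(node, "PART_OF")
        if not parents:
            return chain
        node = parents[0]
        chain.append(node)

print(ancestors("Store-12"))               # ['Region-East', 'RetailCo']
print(neighbors("SupplierA", "SUPPLIES"))  # ['Store-12']
```

In a graph database the same traversal is a short query over stored relationships, which is why the logical model and the physical model coincide.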
Tags : 
database, nosql, graph database, big data, master data management, mdm
    
Neo Technology
Published By: Experian     Published Date: May 17, 2016
Every year, Experian Data Quality conducts a study of global trends in data quality. This year's findings reveal how data practitioners are leveraging and managing data to generate actionable insight, and how proper data management is becoming an organization-wide imperative. The study polled more than 1,400 people across eight countries, from a variety of roles and departments, chosen for their visibility into their organization's customer data management practices. Read the research report to learn:
- The changes in channel usage over the last 12 months
- Expected changes in big data and data management initiatives
- Multi-industry benchmarks, comparisons, and challenges in data quality
- And more!
Our annual global benchmark report takes a close look at the data quality and data management initiatives driving today's businesses. See where you line up and where you can improve.
Tags : 
    
Experian
Published By: Experian     Published Date: Mar 30, 2017
Businesses today recognize the importance of the data they hold, but a general lack of trust in the quality of that data prevents them from achieving strategic business objectives. Roughly half of organizations globally say that a lack of trust in their data contributes to increased risk of non-compliance and regulatory penalties (52%) and a downturn in customer loyalty (51%). To be of value to organizations, data needs to be trustworthy. In this report, you will read about the findings from this unique study, including:
- How data powers business opportunities
- Why trusted data is essential for performance
- Challenges that affect data quality
- The current state of data management practices
- Upcoming data-related projects in 2017
Tags : 
    
Experian
Published By: Reltio     Published Date: May 21, 2018
Effective May 25, 2018, the General Data Protection Regulation (GDPR) represents the most rigorous data protection regulation ever. Complying is not optional, and the penalties are very high. As companies scurry for total compliance, it makes sense to pause, assess, and use this opportunity not only for compliance but for managing customer data efficiently and gainfully. Reltio proposes ten simple steps to ensure your data management strategy is ready for GDPR: not just assured compliance, but going beyond it to enable better customer experiences and build business competence.
Tags : 
    
Reltio
Published By: Alation     Published Date: Mar 26, 2018
When analysts and data scientists have access to data governance policies and best practices directly within the flow of their analysis, the result is both more consistent compliance and more broadly adopted best practices. With the combination of Tableau and Alation, organizations can not only balance the demands of agility and governance, they can actually optimize for both at the same time. We call this Governance for Insight. Learn more about how Tableau users can have the best of both worlds, agility and governance. Read Enabling Governance for Insight: Trust in Data with Tableau and Alation's Data Catalogs. Register on the right to access a complimentary copy of this joint white paper from Tableau and Alation.
Tags : 
    
Alation
Published By: FairCom     Published Date: May 25, 2016
As companies embrace NoSQL as the "next big thing," they are rightly cautious of abandoning their investment in SQL. The question a responsible developer or IT manager must investigate is: in which cases is each of these technologies, SQL and NoSQL, the appropriate solution? For example, cloud provider BigStep offered this assessment: "NoSQL is not the best model for OLTP, ad hoc queries, complicated relationships among the data, and situations when stability and reliability outweigh the importance of speed." While that statement may be true of many NoSQL databases, c-treeACE is the exception. Its unique No+SQL architecture offers the advantages of SQL on top of a robust, high-performance NoSQL core engine. In this white paper, you'll read five ways c-treeACE breaks the NoSQL mold in terms of:
• Data integrity
• Availability and reliability
• Complex data relationships
• Flexible queries
• Performance
Tags : 
    
FairCom
Published By: T4G     Published Date: Mar 15, 2017
About to embark on an advanced analytics project? Or already started, with things not going as planned? This white paper shares our approach to setting up for success. We discuss aspects of data strategy and data stewardship and why they are so important, outline the benefits of having a solid data strategy, and show how to start the data strategy conversation within your organization. We then explain how a project's entry point (the initial impetus for the project) shapes its scope and approach, and how to avoid the missteps and gaps that lead to less-than-stellar results or wasted effort. The white paper also touches on the importance of understanding your business drivers and using them as your guide to get the most out of your data-driven decision-making projects. Our proven approach will help ensure a successful start on your advanced analytics journey.
Tags : 
    
T4G
Published By: Infogix     Published Date: Apr 21, 2017
Data governance and GDPR go together like peanut butter and jelly. May 2018 might sound far away, but any organization that does business in the EU must be prepared on day one to comply with the new General Data Protection Regulation (GDPR). This regulation carries stiff penalties for non-compliance: first-time violators should expect to pay up to the greater of 4% of global annual revenue or 20 million EUR in fines, so it behooves organizations to dot their i's and cross their t's in regard to their GDPR plan. One vital component is instituting data governance to understand the organization's data from a business perspective. Learn more about "What is considered Personally Identifiable Information?", "What are the GDPR compliance obligations?", and "Why is data governance vital?" in an easy-to-read white paper titled General Data Protection Regulation (GDPR) and the Vital Role of Data Governance.
Tags : 
    
Infogix
Published By: Alteryx     Published Date: May 24, 2017
Spreadsheets are a mainstay in almost every organization. They are a great way to calculate and manipulate numeric data to make decisions. Unfortunately, as organizations grow, so does the data, and relying on spreadsheet-based tools like Excel for heavy data preparation, blending, and analysis can be cumbersome and unreliable. Alteryx, Inc. is a leader in self-service data analytics and provides analysts with the ability to easily prep, blend, and analyze all data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. This paper highlights how transitioning from a spreadsheet-based environment to an Alteryx workflow approach can help analysts better understand their data, improve consistency, and operationalize analytics through a flexible deployment and consumption environment.
Tags : 
    
Alteryx
Published By: Syncsort     Published Date: Jan 04, 2018
The term Big Data doesn’t seem quite “big enough” anymore to properly describe the vast over-abundance of data available to organizations today. As the volume and variety of Big Data sources continue to grow, the level of trust in that data remains troublingly low. Read on and discover how a strong focus on data quality spanning the people, processes and technology of your organization will help keep your data lake pristine.
Tags : 
    
Syncsort
Published By: Syncsort     Published Date: Oct 31, 2018
Assessing data quality on an ongoing basis is necessary to know how well the organization is doing at maximizing data quality. Otherwise, you'll be investing time and money in a data quality strategy that may or may not be paying off. To measure data quality and track the effectiveness of data quality improvement efforts, you need, well... data. What does a data quality assessment look like in practice? Read this eBook for a closer look at four ways to measure data quality.
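Measuring data quality in practice usually means scoring records against quality dimensions. The sketch below computes three commonly used metrics (completeness, validity, uniqueness) over toy records; it is a generic illustration with invented data, not the eBook's specific four measures:

```python
import re

# Toy customer records with deliberate gaps, a malformed email,
# and a duplicate id, so each metric has something to catch.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "not-an-email"},
    {"id": 1, "email": "a@example.com"},  # duplicate id
]

def completeness(rows, field):
    """Share of rows where the field is non-empty."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def validity(rows, field, pattern):
    """Share of non-empty values matching a format rule."""
    vals = [r[field] for r in rows if r.get(field)]
    return sum(1 for v in vals if re.fullmatch(pattern, v)) / len(vals)

def uniqueness(rows, field):
    """Share of values that are distinct."""
    vals = [r[field] for r in rows]
    return len(set(vals)) / len(vals)

print(completeness(records, "email"))  # 0.75
print(uniqueness(records, "id"))       # 0.75
```

Tracking scores like these over time is what turns data quality from a one-off cleanup into something an improvement effort can be measured against.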
Tags : 
    
Syncsort
Published By: Dataiku     Published Date: Feb 01, 2018
A proof of concept (POC) is a popular way for businesses to evaluate the viability of a system, product, or service to ensure it meets specific needs or sets of predefined requirements. But what does running a POC mean in practice specifically for data science? POCs should prove not just that a solution solves one particular, specific problem, but that the solution in question will provide widespread value to the company: that it's capable of bringing a data-driven perspective to a range of the business's strategic objectives. Get the 7 steps to running an efficient POC in this white paper.
Tags : 
    
Dataiku