Published By: DATAVERSITY     Published Date: Jul 24, 2014
Will the “programmable era” of computers be replaced by Cognitive Computing systems that can learn from interactions and reason through dynamic experience, just as humans do? With rapidly increasing volumes of Big Data, there is a compelling need for smarter machines that can organize data faster, make better sense of it, discover insights, and then learn, adapt, and improve over time without direct programming. This paper is sponsored by: Cognitive Scale.
Tags: data, data management, cognitive computing, machine learning, artificial intelligence, research paper
DATAVERSITY
Published By: Ted Hills     Published Date: Mar 08, 2017
This paper explores the differences between three situations that appear on the surface to be very similar: a data attribute that may occur zero or one times, a data attribute that is optional, and a data attribute whose value may be unknown. It shows how each of these different situations is represented in Concept and Object Modeling Notation (COMN, pronounced “common”). The theory behind the analysis is explained in greater detail by three papers: Three-Valued Logic, A Systematic Solution to Handling Unknown Data in Databases, and An Approach to Representing Non-Applicable Data in Relational Databases.
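To make the distinction concrete (this encoding is our own illustration, not COMN notation), one way to keep the three situations separate in code is to distinguish an attribute that does not occur, an attribute known to have no value, and an attribute whose value is unknown:

```python
# Illustrative encoding only, not COMN: keeping "absent", "no value",
# and "value unknown" distinct requires more than None alone.
UNKNOWN = object()  # sentinel: the attribute applies, but its value is not known

person_a = {"name": "Ada"}                          # attribute does not occur at all
person_b = {"name": "Ben", "middle_name": None}     # occurs, known to have no value
person_c = {"name": "Cal", "middle_name": UNKNOWN}  # occurs, but value is unknown

def describe(person: dict) -> str:
    if "middle_name" not in person:
        return "no middle_name attribute"
    if person["middle_name"] is UNKNOWN:
        return "middle_name value is unknown"
    if person["middle_name"] is None:
        return "known to have no middle name"
    return f"middle name is {person['middle_name']}"

for p in (person_a, person_b, person_c):
    print(describe(p))
```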
Ted Hills
Published By: Ted Hills     Published Date: Mar 08, 2017
Much has been written and debated about the use of SQL NULLs to represent unknown values, and about the possible use of three-valued logic. However, there has never been a systematic application of a three-valued logic in the logical expressions of computer programs. This paper lays the foundation for a systematic application of three-valued logic to one of the two problems inadequately addressed by SQL NULLs.
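As a minimal sketch of how such a logic behaves (Kleene's strong three-valued logic, which SQL's treatment of NULL approximates; this is our illustration, not the paper's formulation):

```python
# Kleene's strong three-valued logic: FALSE < UNKNOWN < TRUE.
# SQL NULL comparisons behave similarly: any comparison with NULL
# yields UNKNOWN, not TRUE or FALSE.
from enum import Enum

class TV(Enum):
    FALSE = 0
    UNKNOWN = 1
    TRUE = 2

def and3(a: TV, b: TV) -> TV:
    # Conjunction takes the minimum truth value: FALSE dominates.
    return TV(min(a.value, b.value))

def or3(a: TV, b: TV) -> TV:
    # Disjunction takes the maximum truth value: TRUE dominates.
    return TV(max(a.value, b.value))

def not3(a: TV) -> TV:
    # Negation flips TRUE and FALSE but leaves UNKNOWN unchanged.
    return TV(2 - a.value)

# UNKNOWN AND FALSE is FALSE, but UNKNOWN OR FALSE stays UNKNOWN --
# exactly the behavior that trips up naive WHERE-clause reasoning.
assert and3(TV.UNKNOWN, TV.FALSE) == TV.FALSE
assert or3(TV.UNKNOWN, TV.FALSE) == TV.UNKNOWN
assert not3(TV.UNKNOWN) == TV.UNKNOWN
```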
Ted Hills
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes the challenges faced when beginning a new Data Governance program and outlines the crucial elements of executing such a program successfully. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear; it is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by: ASG, DGPO and DebTech International.
Tags: data, data management, data governance, data steward, dataversity, research paper
DATAVERSITY
Published By: TopQuadrant     Published Date: Jun 01, 2017
This paper presents a practitioner-informed roadmap intended to assist enterprises in maturing their Enterprise Information Management (EIM) practices, with a specific focus on improving Reference Data Management (RDM). Reference data is found in every application an enterprise uses, including back-end systems, front-end commerce applications, data exchange formats, outsourced and hosted systems, big data platforms, and data warehouses. It can easily account for 20–50% of the tables in a data store, and its values are used throughout transactional and mastered data sets to keep a system internally consistent.
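As a hedged illustration of what reference data is (names invented, not from the paper): a small code/description lookup whose values recur throughout transactional records:

```python
# Reference data is typically a small, slowly changing set of codes
# and descriptions used pervasively by transactional data.
country_codes = {"US": "United States", "DE": "Germany", "JP": "Japan"}

orders = [
    {"order_id": 1001, "ship_country": "DE"},
    {"order_id": 1002, "ship_country": "JP"},
]

# Transactional rows stay internally consistent by referencing the
# shared codes rather than embedding free-text country names.
for order in orders:
    print(order["order_id"], country_codes[order["ship_country"]])
```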
TopQuadrant
Published By: TopQuadrant     Published Date: Jun 11, 2018
Data governance is a lifecycle-centric asset management activity. To understand and realize the value of data assets, it is necessary to capture information about them (their metadata) in a connected way. Capturing the meaning and context of diverse enterprise data in connection to all assets in the enterprise ecosystem is foundational to effective data governance. Therefore, a data governance environment must represent assets and their role in the enterprise using an open, extensible, and “smart” approach. Knowledge graphs are the most viable and powerful way to do this. This short paper outlines how knowledge graphs are flexible, evolvable, semantic, and intelligent. It is these characteristics that enable them to:
• capture the description of data as an interconnected set of information that meaningfully bridges enterprise metadata silos.
• deliver integrated data governance by addressing all three aspects of data governance — Executive Governance, Representative Governance, and App…
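For a flavor of the knowledge-graph approach (a generic sketch using the open-source rdflib library, not TopQuadrant's product; all names are invented):

```python
# A generic knowledge-graph sketch with rdflib (pip install rdflib).
# Metadata about assets is stored as interconnected triples, so one
# query can bridge what would otherwise be separate metadata silos.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/governance/")  # invented namespace
g = Graph()
g.bind("ex", EX)

# Describe a dataset, its steward, and its source system as linked assets.
g.add((EX.CustomerTable, RDF.type, EX.Dataset))
g.add((EX.CustomerTable, RDFS.label, Literal("Customer master table")))
g.add((EX.CustomerTable, EX.stewardedBy, EX.JaneDoe))
g.add((EX.CustomerTable, EX.sourcedFrom, EX.CRMSystem))

# Ask a governance question across the connected metadata.
query = "SELECT ?asset ?steward WHERE { ?asset ex:stewardedBy ?steward . }"
for asset, steward in g.query(query, initNs={"ex": EX}):
    print(asset, "is stewarded by", steward)
```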
TopQuadrant
Published By: MapR Technologies     Published Date: Mar 29, 2016
Add Big Data Technologies to Get More Value from Your Stack
Taking advantage of big data starts with understanding how to optimize and augment your existing infrastructure. Relational databases have endured for a reason – they fit well with the types of data that organizations use to run their business. These types of data, in business applications such as ERP, CRM, EPM, etc., are not fundamentally changing, which suggests that relational databases will continue to play a foundational role in enterprise architectures for the foreseeable future. One area where emerging technologies can complement relational database technologies is big data. With the rapidly growing volumes of data, along with the many new sources of data, organizations look for ways to relieve pressure from their existing systems. That’s where Hadoop and NoSQL come in.
MapR Technologies
Published By: Cambridge Semantics     Published Date: May 11, 2016
With the explosive growth of Big Data, IT professionals find their time and resources squeezed between managing increasingly large and diverse siloed data stores and meeting user demands for timely, accurate data. The graph-based Anzo Smart Data Manager is built to relieve these burdens by automating the process of managing, cataloging, and governing data at enterprise scale and with enterprise-grade security. Anzo Smart Data Manager allows companies to truly understand their data ecosystems and leverage the metadata within them.
Cambridge Semantics
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud - speed, scale, and agility - with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on prem and off prem
Tags: nosql, ibm, dashdb, database, cloud
Cloudant - an IBM Company
Published By: Aerospike     Published Date: Jul 17, 2014
This whitepaper provides a brief technical overview of ACID support in Aerospike. It includes a definition of ACID (Atomicity, Consistency, Isolation, Durability) and an overview of the CAP Theorem, which postulates that only two of the three properties of consistency, availability, and partition tolerance can be guaranteed in a distributed system at a specific time. Although Aerospike is an AP system with a proven track record of 100% uptime, this paper describes Aerospike's unique approach to avoiding network partitions in order to also ensure high consistency. In addition, the paper describes how Aerospike will give users the option of a CP configuration that maintains complete consistency in the presence of network partitioning by reducing availability.
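The CP trade-off described here can be illustrated with quorum arithmetic (a toy model of the CAP choice, not Aerospike's implementation):

```python
# Toy illustration of the consistency/availability trade-off from the
# CAP theorem. With N replicas, requiring R + W > N yields a CP-style
# system: during a partition, operations that cannot reach a quorum
# fail (sacrificing availability) rather than return stale data.
N, W, R = 3, 2, 2  # replicas, write quorum, read quorum

def write_accepted(replicas_reachable: int) -> bool:
    # A CP configuration rejects writes that cannot reach W replicas.
    return replicas_reachable >= W

def reads_see_latest_write() -> bool:
    # When R + W > N, every read quorum overlaps every write quorum,
    # so at least one replica in any read holds the latest write.
    return R + W > N

print(write_accepted(replicas_reachable=1))  # False: minority side refuses
print(reads_see_latest_write())              # True: quorums overlap
```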
Tags: whitepaper, data, data management, nosql, aerospike, acid
Aerospike
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi- structured log data, providing you with the insights needed to achieve your business goals.
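As a minimal, generic sketch of machine learning over unstructured text (scikit-learn here, not Skytree's product; the messages and labels are invented), classifying free-form log messages:

```python
# Vectorize free-form text, then train a classifier on labeled examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "disk failure on node 7", "user login succeeded",
    "out of memory killing process", "backup completed normally",
]
labels = ["incident", "normal", "incident", "normal"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

# New, unseen log lines can now be scored automatically.
print(model.predict(["memory failure on node 3"]))  # likely "incident"
```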
Tags: log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data, data analysis
Skytree
Published By: AnalytixDS     Published Date: Feb 28, 2015
Business intelligence solutions are clearly evolving from data that comes from highly efficient, well-behaved systems to data that comes from the extended enterprise, where data is not necessarily so well structured or well behaved. Organizations are thus forced into a more collaborative mode of operation, with core infrastructure adapted from the consumer space and, to the extent possible, conformed to their existing repositories. This whitepaper addresses the challenges consumers face while managing enormous data sets within this complex scenario. Further, we try to answer the question: is Big Data governance really that different from traditional data governance initiatives? Finally, we show how AnalytiX™ Mapping Manager™ can help organizations accelerate the development and deployment of a successful Big Data/Business Intelligence platform and accelerate delivery of all sorts of data – structured, semi-structured, and unstructured.
Tags: big data, big data governance, data governance, analytixds
AnalytixDS
Published By: CapTech     Published Date: May 26, 2015
Big Data is the future of business. According to CloudTweaks.com, as much as 2.5 quintillion bytes of data are produced each day, and most of it flows into Big Data systems. By bringing all data sources together into one centralized place, Big Data offers opportunities: clearer visibility into customer conversations and transactions. However, with the dazzling promise of Big Data comes a potentially huge letdown: if this vast pool of information is not accessible or usable, it becomes useless. This paper examines strategies for building the most value into your Big Data system by enabling process controls to effectively mine, access, and secure Big Data.
Tags: big data, captech, data, data management, nosql
CapTech
Published By: GBG Loqate     Published Date: Jul 09, 2015
Businesses are vulnerable when they assume that their data is accurate, because they are almost always losing money without their knowledge. When it comes to data quality, the problems that you don’t suspect are often worse and more pervasive than the ones you are aware of. Addresses are subject to their own specific set of rules. Detecting and correcting address errors is a complex problem, and one that can only be solved with specialized software.
Tags: data, data management, data quality, loqate
GBG Loqate
Published By: Reltio     Published Date: Aug 11, 2017
"Forrester's research uncovered a market in which Reltio [and other companies] lead the pack,” the Forrester Wave Master Data Management, 2016 states. "Leaders demonstrated extensive and MDM capabilities for sophisticated master data scenarios, large complex ecosystems, and data governance to deliver enterprise-scale business value.”
Reltio
Published By: Reltio     Published Date: May 22, 2018
"Forrester's research uncovered a market in which Reltio [and other companies] lead the pack,” the Forrester Wave Master Data Management states. "Leaders demonstrated extensive and MDM capabilities for sophisticated master data scenarios, large complex ecosystems, and data governance to deliver enterprise-scale business value.” Reltio executes the vision for next-generation MDM by converging trusted data management with business insight solutions at scale and in the cloud. Machine learning and graph technology capabilities enable a contextual data model while also maintaining temporal and lineage changes of the master data.
Reltio
Published By: Looker     Published Date: Mar 15, 2016
Data centralization merges different data streams into a common source through unified variables. This process can provide context to overly broad metrics and enable cross-platform analytics to guide better business decisions. Investments in analytics tools are now paying back a 13.01:1 return on investment (ROI), with increased returns when these tools integrate with three or more data sources. While the perks of centralization are obvious in theory, the quantity and variety of data available in today’s landscape make this difficult to achieve. This report provides a roadmap for how to connect systems, data stores, and institutions (both technological and human). Learn:
• How data centralization enables better analytics
• How to redefine data as a vehicle for change
• How the right BI tool eliminates the data analyst bottleneck
• How to define single sources of truth for your organization
• How to build a data-driven (not just data-rich) organization
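A small sketch of the centralization idea (column and key names are illustrative): two data streams joined through a unified variable so they can be analyzed together:

```python
# Two data streams centralized through a shared key with pandas.
import pandas as pd

web = pd.DataFrame({"user_id": [1, 2, 3], "page_views": [14, 3, 22]})
crm = pd.DataFrame({"user_id": [1, 2, 4], "lifetime_value": [250.0, 40.0, 90.0]})

# The shared user_id acts as the unifying variable across sources;
# an outer join keeps rows that appear in only one stream.
centralized = web.merge(crm, on="user_id", how="outer")
print(centralized)
```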
Looker
Published By: Amazon Web Services     Published Date: Apr 04, 2016
Amazon DynamoDB is a fully managed, NoSQL database service. Many workloads implemented using a traditional Relational Database Management System (RDBMS) are good candidates for a NoSQL database such as DynamoDB. This whitepaper details the process for identifying these candidate workloads and planning and executing a migration to DynamoDB.
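As a hedged sketch of the target side of such a migration (table and attribute names are illustrative, not from the whitepaper), creating a DynamoDB table and writing a denormalized item with boto3:

```python
# Requires AWS credentials; illustrative table design only.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

table = dynamodb.create_table(
    TableName="Orders",
    KeySchema=[
        {"AttributeName": "customer_id", "KeyType": "HASH"},  # partition key
        {"AttributeName": "order_id", "KeyType": "RANGE"},    # sort key
    ],
    AttributeDefinitions=[
        {"AttributeName": "customer_id", "AttributeType": "S"},
        {"AttributeName": "order_id", "AttributeType": "S"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Denormalized item: attributes an RDBMS would join are stored together.
table.put_item(Item={"customer_id": "C1", "order_id": "O1001", "total": 42})
```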
Amazon Web Services
Published By: ROKITT     Published Date: Apr 11, 2016
Few things benefit an organization as much as information governance. Data is now one of the most valuable holdings for any business, but unfortunately in many environments much of the data is ignored and its potential value lost. Ignored data is also inherently less secure than data that’s tracked. Businesses need a way to bring hidden data out of the shadows and make it safe and useful again. Data discovery facilitates unearthing previously unknown data relationships. Mapping data flow and data lineage helps make data safe, compliant, and auditable. Good metadata makes a system more navigable. All these tools make data more accessible to staff and more useful for capitalizing on business opportunities.
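To illustrate data lineage concretely (our sketch with invented table names, not ROKITT's product): modeling data flow as a directed graph makes "where did this table come from?" a reachability query:

```python
# Lineage as a directed graph: an edge means "feeds into".
import networkx as nx

lineage = nx.DiGraph()
lineage.add_edge("crm.customers", "staging.customers")
lineage.add_edge("staging.customers", "warehouse.dim_customer")
lineage.add_edge("erp.orders", "warehouse.fact_orders")

# Upstream sources of a warehouse table: everything that can reach it.
print(nx.ancestors(lineage, "warehouse.dim_customer"))
# -> {'crm.customers', 'staging.customers'}
```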
ROKITT
Published By: Snowflake Computing     Published Date: Apr 19, 2016
Data warehouse as a service brings scalability and flexibility to organizations seeking to deliver data to all users and systems that need to analyze it. The ability to access and analyze data is the critical foundational element for competing in new and old industries alike. Yet, a recent survey of IT executives finds that most are still struggling — and frustrated — with widely used data analytics tools. Find out what your peers are saying, and see how your data analytics environment compares.
Snowflake Computing
Published By: Dataiku     Published Date: Feb 01, 2018
A proof of concept (POC) is a popular way for businesses to evaluate the viability of a system, product, or service to ensure it meets specific needs or sets of predefined requirements. But what does running a POC mean in practice specifically for data science? POCs should prove not just that a solution solves one particular, specific problem, but that the solution in question will provide widespread value to the company: that it's capable of bringing a data-driven perspective to a range of the business's strategic objectives. Get the 7 steps to running an efficient POC in this white paper.
Dataiku
Published By: Wave Computing     Published Date: Jul 06, 2018
This paper argues the case for using coarse-grained reconfigurable array (CGRA) architectures for the efficient acceleration of the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems, such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clockless, CGRA-based Systems on Chips (SoCs), each containing 16,000 processing elements (PEs). This paper describes the tools needed for efficient compilation of data flow graphs to the CGRA architecture and outlines Wave Computing’s WaveFlow software (SW) framework for the online mapping of models from popular workflows like TensorFlow, MXNet, and Caffe.
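As a toy illustration of the data flow execution model that CGRAs accelerate (not Wave Computing's WaveFlow): each node fires as soon as its inputs are available, rather than following a sequential program counter:

```python
# A toy data flow graph executor: node -> (function, input nodes).
from collections import deque

graph = {
    "a": (lambda: 3.0, []),
    "b": (lambda: 4.0, []),
    "mul": (lambda x, y: x * y, ["a", "b"]),
    "relu": (lambda x: max(0.0, x), ["mul"]),
}

def execute(graph):
    values, pending = {}, deque(graph)
    while pending:
        node = pending.popleft()
        fn, deps = graph[node]
        if all(d in values for d in deps):  # fire only when inputs are ready
            values[node] = fn(*(values[d] for d in deps))
        else:
            pending.append(node)            # inputs not ready yet; retry later
    return values

print(execute(graph)["relu"])  # -> 12.0
```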
Wave Computing
Published By: Attunity     Published Date: Oct 19, 2018
Change data capture (CDC) technology can modernize your data and analytics environment with scalable, efficient and real-time data replication that does not impact production systems. To realize these benefits, enterprises need to understand how this critical technology works, why it’s needed, and what their Fortune 500 peers have learned from their CDC implementations. This book serves as a practical guide for enterprise architects, data managers and CIOs as they enable modern data lake, streaming and cloud architectures with CDC. Read this book to understand:
• The rise of data lake, streaming and cloud platforms
• How CDC works and enables these architectures
• Case studies of leading-edge enterprises
• Planning and implementation approaches
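A generic sketch of the CDC pattern (not Attunity's API; the change records are invented): consume an ordered change stream and replay it against a target, so the target stays in sync without re-querying the source:

```python
# What a CDC tool emits after reading the source's transaction log:
# an ordered stream of insert/update/delete events.
changes = [
    {"op": "insert", "key": 1, "row": {"id": 1, "name": "Ada"}},
    {"op": "update", "key": 1, "row": {"id": 1, "name": "Ada L."}},
    {"op": "delete", "key": 1, "row": None},
]

target: dict[int, dict] = {}  # stand-in for a data lake / warehouse table

for change in changes:
    if change["op"] == "delete":
        target.pop(change["key"], None)
    else:  # insert and update both upsert on the target
        target[change["key"]] = change["row"]

print(target)  # -> {} after the insert, update, delete replay in order
```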
Attunity
Published By: graphgrid     Published Date: Oct 02, 2018
Whether it’s for a specific application, optimizing your existing operations, or innovating new customer services, graph databases are a powerful technology that turns accessing and analyzing your data into a competitive advantage. Graph databases resolve Big Data limitations and free up data architects and developers to build solutions that predict behaviors, enable data-driven decisions, and make insightful recommendations. Yet just as a car isn’t functional with only an engine, graph databases require surrounding capabilities: ingesting multi-source data, building data models unique to your business needs, easy data interaction and visualization, seamless coexistence with legacy systems, high-performance search, and integration with data analysis applications. Collectively, this comprehensive data platform turns graph capabilities into tangible insights that drive your business forward.
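As a hedged illustration of the kind of insight graphs make easy (toy data with networkx, rather than an actual graph database): recommending items via co-purchasers:

```python
# A customer-item purchase graph; recommendations follow its shape.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("alice", "laptop"), ("alice", "mouse"),
    ("bob", "laptop"), ("bob", "monitor"),
])

def recommend(customer):
    owned = set(g.neighbors(customer))
    # Co-purchasers: other customers connected to the same items.
    peers = {p for item in owned for p in g.neighbors(item)} - {customer}
    # Items peers bought that this customer does not yet own.
    return {i for p in peers for i in g.neighbors(p)} - owned

print(recommend("alice"))  # -> {'monitor'} via co-purchaser bob
```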
graphgrid