machine data

Results 1 - 25 of 236
Published By: DATAVERSITY     Published Date: Jul 24, 2014
Will the “programmable era” of computers be replaced by Cognitive Computing systems which can learn from interactions and reason through dynamic experience just like humans? With rapidly increasing volumes of Big Data, there is a compelling need for smarter machines to organize data faster, make better sense of it, discover insights, then learn, adapt, and improve over time without direct programming. This paper is sponsored by: Cognitive Scale.
Tags : 
data, data management, cognitive computing, machine learning, artificial intelligence, research paper
    
DATAVERSITY
Published By: Ataccama     Published Date: Feb 26, 2019
Machine learning is a powerful tool for the transformation of data management, and with increasing amounts of data in an organization’s possession, it is more vital than ever. Learn how Ataccama leverages cutting-edge machine learning techniques to automate manual and time-consuming tasks and optimize performance. By developing AI-powered data cataloging and master data management solutions, we can help stem the tide of data chaos and ensure that your data is orderly, correct, and easy to find.
Tags : 
    
Ataccama
Published By: CData     Published Date: Jan 04, 2019
The growth of NoSQL continues to accelerate as the industry is increasingly forced to develop new and more specialized data structures to deal with the explosion of application and device data. At the same time, new data products for BI, Analytics, Reporting, Data Warehousing, AI, and Machine Learning continue along a similar growth trajectory. Enabling interoperability between applications and data sources, each with a unique interface and value proposition, is a tremendous challenge. This paper discusses a variety of mapping and flattening techniques, and continues with examples that highlight performance and usability differences between approaches.
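A minimal sketch of one such flattening technique, using a hypothetical nested order document (the document, field names, and helper are illustrative assumptions, not taken from the CData paper): nested objects become dot-notation columns, and arrays are split into child rows that can be joined back to the parent.

import json

def flatten(doc, prefix=""):
    """Flatten nested objects into dot-notation keys; arrays become child rows."""
    row, children = {}, []
    for key, value in doc.items():
        name = prefix + key
        if isinstance(value, dict):
            sub_row, sub_children = flatten(value, prefix=name + ".")
            row.update(sub_row)
            children.extend(sub_children)
        elif isinstance(value, list):
            for item in value:
                child = item if isinstance(item, dict) else {"value": item}
                child_row, _ = flatten(child)
                children.append((name, child_row))
        else:
            row[name] = value
    return row, children

order = json.loads('{"id": 1, "customer": {"name": "Acme"}, "items": [{"sku": "A", "qty": 2}]}')
parent_row, child_rows = flatten(order)
print(parent_row)   # {'id': 1, 'customer.name': 'Acme'}
print(child_rows)   # [('items', {'sku': 'A', 'qty': 2})]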
Tags : 
data architecture, data, data management, business intelligence, data warehousing
    
CData
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues became more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and in the enterprise. It’s increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and types of data most often captured and processed today.
Tags : 
database, nosql, data, data management, white paper, why nosql, couchbase
    
Couchbase
Published By: MapR Technologies     Published Date: Aug 01, 2018
How do you get a machine learning system to deliver value from big data? Turns out that 90% of the effort required for success in machine learning is not the algorithm or the model or the learning - it's the logistics. Ted Dunning and Ellen Friedman identify what matters in machine learning logistics, what challenges arise, especially in a production setting, and they introduce an innovative solution: the rendezvous architecture. This new design for model management is based on a streaming approach in a microservices style. Rendezvous addresses the need to preserve and share raw data, to do effective model-to-model comparisons and to have new models on standby, ready for a hot hand-off when a production model needs to be replaced.
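A minimal sketch of the rendezvous idea in a few lines of Python (illustrative only, not MapR's implementation; the class, model functions, and names are made up): every request is scored by all candidate models, every result is logged for model-to-model comparison, and only the answer from the currently designated live model is returned, so a standby model can be promoted without interrupting callers.

import time

class Rendezvous:
    """Score every request with all models; serve only the live model's answer."""
    def __init__(self, models, live):
        self.models = models   # name -> callable(features) -> score
        self.live = live       # name of the model whose answer is served
        self.log = []          # raw inputs plus every model's output, kept for comparison

    def score(self, features):
        results = {name: model(features) for name, model in self.models.items()}
        self.log.append({"ts": time.time(), "features": features, "results": results})
        return results[self.live]

    def promote(self, name):
        """Hot hand-off: a standby model becomes the live model."""
        self.live = name

models = {"champion": lambda x: 0.8 * x, "challenger": lambda x: 0.9 * x}
rv = Rendezvous(models, live="champion")
print(rv.score(1.0))      # 0.8, served by the champion
rv.promote("challenger")  # swap in the standby without interrupting callers
print(rv.score(1.0))      # 0.9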
Tags : 
    
MapR Technologies
Published By: Splice Machine     Published Date: Nov 16, 2014
Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. In this white paper, learn about the different options and benefits of scale-out solutions for Oracle database users.
Tags : 
splice machine, oracle, oracle database, database, hadoop, nosql, white paper, data, data management, dataversity
    
Splice Machine
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
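A minimal sketch of the idea, assuming a made-up log format (the field names and example line are not from the Skytree brief): the semi-structured part of a log line is parsed into fixed fields, while the free-text message is turned into simple bag-of-words features that a machine learning classifier could consume.

import re
from collections import Counter

LOG_PATTERN = re.compile(r"(?P<ts>\S+) (?P<level>\w+) (?P<message>.*)")

def featurize(line):
    fields = LOG_PATTERN.match(line).groupdict()
    tokens = re.findall(r"[a-z]+", fields["message"].lower())
    return fields["level"], Counter(tokens)   # structured label + text features

level, features = featurize("2014-11-23T10:02:11Z ERROR disk quota exceeded on node 7")
print(level)     # ERROR
print(features)  # Counter({'disk': 1, 'quota': 1, 'exceeded': 1, 'on': 1, 'node': 1})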
Tags : 
log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data, data analysis
    
Skytree
Published By: VoltDB     Published Date: Feb 12, 2016
The need for fast data applications is growing rapidly, driven by the IoT, the surge in machine-to-machine (M2M) data, global mobile device proliferation, and the monetization of SaaS platforms. So how do you combine real-time, streaming analytics with real-time decisions in an architecture that’s reliable, scalable, and simple? In this report, Ryan Betts and John Hugg from VoltDB examine ways to develop apps for fast data, using pre-defined patterns. These patterns are general enough to suit both the do-it-yourself, hybrid batch/streaming approach, as well as the simpler, proven in-memory approach available with certain fast database offerings.
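A minimal sketch of one common fast-data pattern (illustrative only; the window size, threshold, and event stream are invented and not taken from the VoltDB report): keep a per-key, in-memory sliding window and make a decision on every incoming event, while the same state remains available for streaming aggregation.

from collections import defaultdict, deque

WINDOW_SECONDS = 60   # sliding window length (example value)
THRESHOLD = 5         # events per window before we act (example value)
events = defaultdict(deque)   # key -> timestamps currently inside the window

def ingest(key, ts):
    window = events[key]
    window.append(ts)
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()              # expire events that fell out of the window
    return len(window) > THRESHOLD    # per-event, real-time decision

for t in range(0, 70, 10):            # one event every 10 seconds
    print(t, ingest("device-42", t))  # flips to True once more than 5 events share the window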
Tags : 
    
VoltDB
Published By: Reltio     Published Date: May 22, 2018
"Forrester's research uncovered a market in which Reltio [and other companies] lead the pack,” the Forrester Wave Master Data Management states. "Leaders demonstrated extensive and MDM capabilities for sophisticated master data scenarios, large complex ecosystems, and data governance to deliver enterprise-scale business value.” Reltio executes the vision for next-generation MDM by converging trusted data management with business insight solutions at scale and in the cloud. Machine learning and graph technology capabilities enable a contextual data model while also maintaining temporal and lineage changes of the master data.
Tags : 
    
Reltio
Published By: Reltio     Published Date: Nov 16, 2018
Big data is growing faster than the capabilities available to manage and analyze it. Get this vendor comparison to learn how a modern master data management platform will help you to achieve better outcomes.
Tags : 
    
Reltio
Published By: Syncsort     Published Date: Jul 17, 2018
In most applications we use today, data is retrieved by the application's source code and then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work, and how the data is used. Today, in a world of AI and machine learning, data has a new role – it essentially becomes the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good-quality data related to the problem, it's impossible to create a useful model. Download this whitepaper to learn why identifying the biases present in the data is an essential step toward debugging the data that underlies machine learning predictions and improving data quality.
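A minimal sketch of one such data-debugging step, using a tiny made-up dataset (not from the Syncsort paper): compare how often each group appears in the training data and how often each group receives the positive label, so obvious skew is flagged before any model is trained on it.

from collections import Counter

rows = [                       # (group, label) pairs from a hypothetical training set
    ("A", 1), ("A", 1), ("A", 0), ("A", 1),
    ("B", 0), ("B", 0), ("B", 1),
]

counts = Counter(group for group, _ in rows)
positives = Counter(group for group, label in rows if label == 1)

for group in counts:
    rate = positives[group] / counts[group]
    print(f"group {group}: {counts[group]} rows, positive rate {rate:.2f}")
# group A: 4 rows, positive rate 0.75
# group B: 3 rows, positive rate 0.33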
Tags : 
    
Syncsort
Published By: Attivio     Published Date: Mar 14, 2018
Did you ever consider all of the examples of machine learning in your personal life? Google’s page ranking system, photo tagging on Facebook, and customized product recommendations from Amazon are all driven by machine learning under the hood. How do these same techniques improve productivity for your business? Search is the new data and content curation. Improved relevance translates to faster search results and better business outcomes across the line. Download the Five-Minute Guide to Machine Learning to find out how self-learning technologies drive increasingly relevant answers and better context for cognitive search.
Tags : 
    
Attivio
Published By: Converseon     Published Date: Apr 02, 2018
Separating signals from noisy social listening data has long been a problem for data scientists. Poor precision due to slang, sarcasm, and implicit meaning has often made this data too challenging to model effectively. Today, however, new approaches that leverage active machine learning are rapidly overtaking aging rules-based techniques and opening up the use of this data in new and important ways. This paper details the evolution of text analysis, including current best practices, and explains how data scientists can apply AI to this data for meaningful analysis.
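A minimal active-learning sketch of the general approach, using scikit-learn and invented example texts (this is not Converseon's system): a model trained on a small labeled seed set scores an unlabeled pool and surfaces the document it is least confident about, which is the one a human annotator should label next.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

labeled = ["love this phone", "great battery life", "screen died in a week", "worst purchase ever"]
labels = [1, 1, 0, 0]   # 1 = positive, 0 = negative
pool = ["yeah, 'great' battery", "camera is fine", "totally broken again"]   # unlabeled

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(labeled)
model = LogisticRegression().fit(X, labels)

# Uncertainty sampling: 1.0 means the model has no idea, 0.0 means it is certain.
proba = model.predict_proba(vectorizer.transform(pool))
uncertainty = 1 - np.abs(proba[:, 1] - 0.5) * 2
next_to_label = pool[int(np.argmax(uncertainty))]
print(next_to_label)   # the pool document to send to a human annotator next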
Tags : 
    
Converseon
Published By: Semantic Web Company     Published Date: Jun 27, 2018
Get a comprehensive introduction to AI technologies and learn why semantics should be a fundamental element of any AI strategy. Semantic enhanced artificial intelligence (Semantic AI) is based on the fusion of semantic technologies and machine learning. In this white paper, you will understand how to align the work of data scientists and subject matter experts to increase the business value of your data lake.
Tags : 
    
Semantic Web Company
Published By: Wave Computing     Published Date: Jul 06, 2018
This paper makes the case for using coarse-grained reconfigurable array (CGRA) architectures for the efficient acceleration of the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clock-less, CGRA-based Systems on Chips (SoCs), each containing 16,000 processing elements (PEs). This paper describes the tools needed for efficient compilation of data flow graphs to the CGRA architecture and outlines Wave Computing's WaveFlow software (SW) framework for the online mapping of models from popular frameworks like TensorFlow, MXNet, and Caffe.
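A toy sketch of what executing a data flow graph means (purely illustrative; real CGRA mapping and the WaveFlow compiler are far more involved): each node fires as soon as all of its inputs are available, so execution is driven by data availability rather than by a program counter.

graph = {
    "x":   (None, []),                 # input node
    "y":   (None, []),
    "mul": (lambda a, b: a * b, ["x", "y"]),
    "add": (lambda a, b: a + b, ["mul", "y"]),
}

def execute(graph, inputs):
    values = dict(inputs)
    while len(values) < len(graph):
        for name, (op, deps) in graph.items():
            if name not in values and all(d in values for d in deps):
                values[name] = op(*(values[d] for d in deps))   # node "fires"
    return values

print(execute(graph, {"x": 3, "y": 4}))   # {'x': 3, 'y': 4, 'mul': 12, 'add': 16}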
Tags : 
    
Wave Computing
Published By: Splice Machine     Published Date: May 19, 2014
SQL-on-Hadoop solutions have become very popular recently as companies solve the data access issues with Hadoop or seek a scale-out alternative to traditional relational database management systems. However, with all of the options available, choosing the solution that is right for your business can be a daunting task. This white paper discusses the options you should consider and the questions to ask, including:
· Is it really “Real-Time”?
· Is it true SQL?
· Does it support secondary indexes?
· Can it efficiently handle sparse data?
· Can it deliver fast performance on massive joins?
Read this white paper to get a better understanding of the SQL-on-Hadoop landscape and the questions you should ask to identify the best solution for your business.
Tags : 
white paper, splice machine, sql, hadoop, nosql, nosql white paper, hadoop white paper, dataversity
    
Splice Machine
Published By: IBM APAC     Published Date: Mar 06, 2019
A single environment to build, train, and deploy machine learning and deep learning models. Take advantage of machine learning and AI to analyze your data. Catalog your data to make it easy to find. All applications are free, with no time limit.
Tags : 
    
IBM APAC
Published By: Intel     Published Date: Feb 28, 2019
Keeping the lights on in a manufacturing environment remains a top priority for industrial companies. All too often, factories are in a reactive mode, relying on manual inspections that risk downtime because they don’t usually reveal actionable problem data. Find out how the Nexcom Predictive Diagnostic Maintenance (PDM) system enables uninterrupted production during outages by noninvasively monitoring each unit in the Diesel Uninterrupted Power Supplies (DUPS) system.
• Using vibration analysis, the system can detect 85% of power supply problems before they do damage or cause failure.
• Information processing for machine diagnostics is done at the edge, providing real-time alerts on potential issues with ample lead time for managers to rectify them.
• A graphical user interface offers visual representation and analysis of historical and trending data that is easily consumable.
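A minimal vibration-analysis sketch in Python (illustrative only; the sampling rate, signals, and threshold are invented and not taken from the Nexcom PDM system): the frequency spectrum of a new vibration sample is compared against a healthy baseline, and the unit is flagged when energy appears at unexpected frequencies.

import numpy as np

FS = 1000                                                 # samples per second (assumed)
t = np.arange(0, 1, 1 / FS)
healthy = np.sin(2 * np.pi * 50 * t)                      # baseline: 50 Hz hum
faulty = healthy + 0.8 * np.sin(2 * np.pi * 180 * t)      # new 180 Hz component appears

def spectrum(signal):
    return np.abs(np.fft.rfft(signal)) / len(signal)

baseline = spectrum(healthy)
current = spectrum(faulty)
excess = current - baseline
freqs = np.fft.rfftfreq(len(t), 1 / FS)
alerts = freqs[excess > 0.1]          # frequencies with unexpected energy
print(alerts)                         # ~[180.] -> raise a maintenance alert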
Tags : 
    
Intel
Published By: TIBCO Software APAC     Published Date: Feb 14, 2019
With the new TIBCO Spotfire® A(X) Experience, we are revolutionizing analytics and business intelligence. This new platform accelerates the personal and enterprise analytics experience so you can get from data to insights in the fastest possible way. With the fusion of technology enablers like machine learning, artificial intelligence, and natural language search, the Spotfire® X platform redefines what’s possible for analytics and business intelligence, simplifying for everyone how data and insights are generated, consumed, and acted on. Download this whitepaper to learn more, then check out the new Spotfire analytics. It’s unlike anything you have ever seen. Simple, yet powerful, it changes everything.
Tags : 
    
TIBCO Software APAC
Published By: Entrust Datacard     Published Date: Mar 20, 2017
As digital business evolves, we’re finding that the best form of security and enablement will likely remove any real responsibility from users. They will not be required to carry tokens, recall passwords, or execute security routines. Leveraging machine learning, artificial intelligence, device identity, and other technologies will make security stronger yet far more transparent. From a security standpoint, this will lead to better outcomes for enterprises in terms of breach prevention and data protection. Just as important, however, it will enable authorized users in new ways. They will be able to access the networks, data, and collaboration tools they need without friction, saving time and frustration. More time drives increased employee productivity, and frictionless access to critical data leads to business agility. Leveraging cloud, mobile, and Internet of Things (IoT) infrastructures, enterprises will be able to transform key metrics such as productivity and profitability.
Tags : 
    
Entrust Datacard
Published By: SAP     Published Date: Sep 28, 2018
The Industrial Machinery industry is changing slower than it ever will and faster than it ever has. And customer demands are evolving at speeds never seen before. For companies serious about innovating at scale and transforming their business in order to dominate their market, it will take innovative thinking, disruptive technology and near flawless execution. This challenge, perhaps best described as the perfect blend of art and science, is more than achievable, but only if you have the right partner. Which is why we want you to meet Leonardo, by SAP. SAP Leonardo is a digital innovation system that enables organizations of all sizes to transform at scale with minimal risk and disruption. SAP Leonardo brings new technologies and services together to help businesses power their digital transformation. SAP Leonardo proves that truly transformative and sustainable innovation happens when technology, people, and data are combined.
Tags : 
customer experience & engagement, digital transformation, sap
    
SAP
Published By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
The bar for success is rising in higher education.  University leaders and IT administrators are aware of the compelling benefits of digital transformation overall—and artificial intelligence (AI) in particular. AI can amplify human capabilities by using machine learning, or deep learning, to convert the fast-growing and plentiful sources of data about all aspects of a university into actionable insights that drive better decisions. But when planning a transformational strategy, these leaders must prioritize operational continuity. It’s critical to protect the everyday activities of learning, research, and administration that rely on the IT infrastructure to consistently deliver data to its applications.
Tags : 
    
Hewlett Packard Enterprise
Published By: Cisco EMEA     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever-increasing rates. Analytics combines statistics, machine learning, and data preprocessing to extract valuable information and insights from big data.
Tags : 
big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
    
Cisco EMEA
Published By: Commvault     Published Date: Jul 06, 2016
Today, nearly every datacenter has become heavily virtualized. In fact, according to Gartner, as many as 75% of x86 server workloads are already virtualized in the enterprise datacenter. Yet even with the growth rate of virtual machines outpacing that of physical servers industry-wide, most virtual environments continue to be protected by backup systems designed for physical servers, not the virtual infrastructure they run on. And while virtualization-focused data protection products may deliver additional support for virtual processes, there are pitfalls in selecting the right approach. This paper will discuss five common costs that can remain hidden until after a virtualization backup system has been fully deployed.
Tags : 
storage, backup, recovery, best practices, networking, it management, enterprise applications, data management
    
Commvault