machine

Results 1 - 25 of 761
Published By: DATAVERSITY     Published Date: Jul 24, 2014
Will the “programmable era” of computers be replaced by Cognitive Computing systems which can learn from interactions and reason through dynamic experience just like humans? With rapidly increasing volumes of Big Data, there is a compelling need for smarter machines to organize data faster, make better sense of it, discover insights, then learn, adapt, and improve over time without direct programming. This paper is sponsored by: Cognitive Scale.
Tags : 
data, data management, cognitive computing, machine learning, artificial intelligence, research paper
    
DATAVERSITY
Published By: Ataccama     Published Date: Feb 26, 2019
Machine learning is a powerful tool for the transformation of data management, and with increasing amounts of data in an organization’s possession, it is more vital than ever. Learn how Ataccama leverages cutting-edge machine learning techniques to automate manual and time-consuming tasks and optimize performance. By developing AI-powered data cataloging and master data management solutions, we can help stem the tide of data chaos and ensure that your data is orderly, correct, and easy to find.
Tags : 
    
Ataccama
Published By: Stardog Union     Published Date: Mar 13, 2019
Enterprises must transition to contextualizing their data instead of just collecting it in order to fully leverage their data as a strategic asset. Existing data management solutions such as databases and data lakes encourage data sprawl and duplication. However, true data unification can be achieved with a Knowledge Graph, which seamlessly layers on top of your existing data infrastructure to reveal the interrelationships in your data, no matter its source or format. The Knowledge Graph is also a highly scalable solution since it retains every analysis performed as a reusable asset, drastically reducing the need for data wrangling over time. Download Knowledge Graphs 101 to learn how this technology differs from a graph database, how it compares to MDM and data lake solutions, and how to leverage artificial intelligence and machine learning within a Knowledge Graph.
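To make the layering idea concrete, here is a minimal sketch of the knowledge graph pattern using the open-source rdflib library rather than Stardog itself; the namespace, entities, and facts are invented for illustration. Records from two separate sources are linked through a shared entity, and a single query traverses the combined graph regardless of where each fact originated.

```python
# Minimal knowledge-graph sketch with rdflib (not Stardog's API).
# Facts from two sources are linked via a shared entity and queried together.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")   # hypothetical namespace for the sketch
g = Graph()

# Facts extracted from a CRM export
g.add((EX.acme, RDF.type, EX.Customer))
g.add((EX.acme, EX.locatedIn, Literal("Berlin")))

# Facts extracted from a support-ticket system
g.add((EX.ticket42, EX.raisedBy, EX.acme))
g.add((EX.ticket42, EX.concerns, EX.billing))

# One query over the unified graph: which cities have open billing tickets?
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?city WHERE {
        ?t ex:concerns ex:billing ;
           ex:raisedBy ?cust .
        ?cust ex:locatedIn ?city .
    }
""")
for row in results:
    print(row.city)
```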
Tags : 
    
Stardog Union
Published By: CData     Published Date: Jan 04, 2019
The growth of NoSQL continues to accelerate as the industry is increasingly forced to develop new and more specialized data structures to deal with the explosion of application and device data. At the same time, new data products for BI, Analytics, Reporting, Data Warehousing, AI, and Machine Learning continue along a similar growth trajectory. Enabling interoperability between applications and data sources, each with a unique interface and value proposition, is a tremendous challenge. This paper discusses a variety of mapping and flattening techniques, and continues with examples that highlight performance and usability differences between approaches.
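As a rough illustration of one flattening technique of the kind the paper compares, the sketch below maps a nested JSON document to flat dotted-path columns that a relational or BI tool could consume; the field names and document are invented, and the approach is generic rather than CData's implementation.

```python
# Generic flattening sketch: nested JSON from a NoSQL source becomes flat
# column/value pairs that a SQL-based BI tool can consume.
def flatten(doc, parent_key="", sep="."):
    """Recursively flatten nested dicts; list items become indexed keys."""
    rows = {}
    for key, value in doc.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            rows.update(flatten(value, new_key, sep))
        elif isinstance(value, list):
            for i, item in enumerate(value):
                if isinstance(item, dict):
                    rows.update(flatten(item, f"{new_key}[{i}]", sep))
                else:
                    rows[f"{new_key}[{i}]"] = item
        else:
            rows[new_key] = value
    return rows

order = {
    "id": 1001,
    "customer": {"name": "Acme", "region": "EMEA"},
    "items": [{"sku": "A-1", "qty": 2}, {"sku": "B-7", "qty": 1}],
}
print(flatten(order))
# {'id': 1001, 'customer.name': 'Acme', 'customer.region': 'EMEA',
#  'items[0].sku': 'A-1', 'items[0].qty': 2, 'items[1].sku': 'B-7', 'items[1].qty': 1}
```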
Tags : 
data architecture, data, data management, business intelligence, data warehousing
    
CData
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It’s increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved running on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
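As a small illustration of the schema-less point, the sketch below uses a plain Python dict as a stand-in for a document store (it is not the Couchbase SDK); documents of different shapes coexist without a schema migration, which is exactly what a rigid relational schema makes difficult.

```python
# Schema-less document sketch: an in-memory dict stands in for a key/value
# document store. Documents of different shapes coexist without ALTER TABLE.
import json

bucket = {}  # key -> JSON document

bucket["user::1"] = {"name": "Ada", "email": "ada@example.com"}
# A later application version captures richer, semi-structured data;
# no schema migration is needed before the new shape can be stored.
bucket["user::2"] = {
    "name": "Grace",
    "devices": [{"type": "mobile", "os": "iOS"}],
    "preferences": {"newsletter": True},
}

# Application code handles the variety at read time.
for key, doc in bucket.items():
    devices = doc.get("devices", [])
    print(key, doc["name"], f"{len(devices)} device(s)")

print(json.dumps(bucket["user::2"], indent=2))
```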
Tags : 
database, nosql, data, data management, white paper, why nosql, couchbase
    
Couchbase
Published By: MapR Technologies     Published Date: Aug 01, 2018
How do you get a machine learning system to deliver value from big data? Turns out that 90% of the effort required for success in machine learning is not the algorithm or the model or the learning - it's the logistics. Ted Dunning and Ellen Friedman identify what matters in machine learning logistics, what challenges arise, especially in a production setting, and they introduce an innovative solution: the rendezvous architecture. This new design for model management is based on a streaming approach in a microservices style. Rendezvous addresses the need to preserve and share raw data, to do effective model-to-model comparisons and to have new models on standby, ready for a hot hand-off when a production model needs to be replaced.
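A toy sketch of the rendezvous idea follows: every request is scored by the primary model and a standby challenger, every result is logged for model-to-model comparison, and only the primary's answer is returned, so promotion is a hot hand-off. The models, inputs, and logging here are invented placeholders, not MapR's implementation.

```python
# Toy rendezvous sketch: all models score every request, all results are
# logged, only the designated primary's answer is returned to the caller.
import time

def model_a(features):   # current production model (placeholder)
    return sum(features) > 1.0

def model_b(features):   # challenger kept warm on standby (placeholder)
    return max(features) > 0.7

MODELS = {"primary": model_a, "challenger": model_b}
PRIMARY = "primary"
results_log = []  # stand-in for the raw-data / results stream

def rendezvous(features):
    answers = {}
    for name, model in MODELS.items():
        start = time.perf_counter()
        answers[name] = model(features)
        results_log.append({
            "model": name,
            "input": features,
            "output": answers[name],
            "latency_s": time.perf_counter() - start,
        })
    return answers[PRIMARY]   # callers only ever see the primary's answer

print(rendezvous([0.4, 0.9]))
# Promoting the challenger is a hot hand-off: flip the pointer, no redeploy.
PRIMARY = "challenger"
```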
Tags : 
    
MapR Technologies
Published By: Splice Machine     Published Date: Nov 16, 2014
Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. In this white paper, learn about the different options and benefits of scale-out solutions for Oracle database users.
Tags : 
splice machine, oracle, oracle database, database, hadoop, nosql, white paper, data, data management, dataversity
    
Splice Machine
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
Tags : 
log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data, data analysis
    
Skytree
Published By: VoltDB     Published Date: Feb 12, 2016
The need for fast data applications is growing rapidly, driven by the IoT, the surge in machine-to-machine (M2M) data, global mobile device proliferation, and the monetization of SaaS platforms. So how do you combine real-time, streaming analytics with real-time decisions in an architecture that’s reliable, scalable, and simple? In this report, Ryan Betts and John Hugg from VoltDB examine ways to develop apps for fast data, using pre-defined patterns. These patterns are general enough to suit both the do-it-yourself, hybrid batch/streaming approach, as well as the simpler, proven in-memory approach available with certain fast database offerings.
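As a rough sketch of one such pattern, the code below keeps a streaming aggregate in memory and takes a per-event decision against it; it is plain Python rather than VoltDB's stored-procedure API, and the event fields and threshold are invented for illustration.

```python
# Fast-data pattern sketch: per-event decisions against in-memory streaming state.
from collections import defaultdict

failed_logins = defaultdict(int)   # in-memory state, keyed by device id
ALERT_THRESHOLD = 5                # hypothetical policy

def on_event(event):
    """Called once per arriving event; updates state and decides in one step."""
    if event["type"] == "login_failed":
        failed_logins[event["device_id"]] += 1
        if failed_logins[event["device_id"]] >= ALERT_THRESHOLD:
            return {"action": "block", "device_id": event["device_id"]}
    elif event["type"] == "login_ok":
        failed_logins.pop(event["device_id"], None)   # reset on success
    return {"action": "allow", "device_id": event["device_id"]}

stream = [{"type": "login_failed", "device_id": "d1"}] * 5
for e in stream:
    decision = on_event(e)
print(decision)   # {'action': 'block', 'device_id': 'd1'}
```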
Tags : 
    
VoltDB
Published By: Reltio     Published Date: May 22, 2018
"Forrester's research uncovered a market in which Reltio [and other companies] lead the pack,” the Forrester Wave Master Data Management states. "Leaders demonstrated extensive and MDM capabilities for sophisticated master data scenarios, large complex ecosystems, and data governance to deliver enterprise-scale business value.” Reltio executes the vision for next-generation MDM by converging trusted data management with business insight solutions at scale and in the cloud. Machine learning and graph technology capabilities enable a contextual data model while also maintaining temporal and lineage changes of the master data.
Tags : 
    
Reltio
Published By: Reltio     Published Date: Nov 16, 2018
Big data is growing faster than the capabilities available to manage and analyze it. Get this vendor comparison to learn how a modern master data management platform will help you to achieve better outcomes.
Tags : 
    
Reltio
Published By: Syncsort     Published Date: Jul 17, 2018
In most applications we use today, data is retrieved by the source code of the application and is then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work and how the data is used. Today, in a world of AI and machine learning, data has a new role – becoming essentially the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good quality data related to the problem, it’s impossible to create a useful model. Download this Whitepaper to learn why the process of identifying biases present in the data is an essential step towards debugging the data that underlies machine learning predictions and improves data quality.
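As a small example of what debugging the data can mean in practice, the sketch below (generic pandas, not Syncsort's tooling; the columns and values are invented) compares outcome rates and row counts across groups so obvious sampling or label skew is visible before any model is trained.

```python
# Simple data-bias check: compare outcome rates and volumes across groups
# before training, so sampling or label skew is visible up front.
import pandas as pd

df = pd.DataFrame({
    "region":   ["north", "north", "south", "south", "south", "south"],
    "approved": [1, 1, 0, 0, 0, 1],
})

summary = df.groupby("region")["approved"].agg(["count", "mean"])
summary = summary.rename(columns={"count": "n_rows", "mean": "approval_rate"})
print(summary)
#         n_rows  approval_rate
# region
# north        2           1.00
# south        4           0.25

# A large gap in approval_rate, or a group with very few rows, is a prompt to
# investigate how the data was collected before trusting a model trained on it.
```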
Tags : 
    
Syncsort
Published By: Syncsort     Published Date: Oct 25, 2018
In most applications we use today, data is retrieved by the source code of the application and is then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work and how the data is used. Today, in a world of AI and machine learning, data has a new role – becoming essentially the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good quality data related to the problem, it’s impossible to create a useful model. Download this Whitepaper to learn why the process of identifying biases present in the data is an essential step towards debugging the data that underlies machine learning predictions and improves data quality.
Tags : 
    
Syncsort
Published By: Attivio     Published Date: Mar 14, 2018
Did you ever consider all of the examples of machine learning in your personal life? Google’s page ranking system, photo tagging on Facebook, and customized product recommendations from Amazon are all driven by machine learning under the hood. How do these same techniques improve productivity for your business? Search is the new data and content curation. Improved relevance translates to faster search results and better business outcomes across the board. Download the Five-Minute Guide to Machine Learning to find out how self-learning technologies drive increasingly relevant answers and better context for cognitive search.
Tags : 
    
Attivio
Published By: Converseon     Published Date: Apr 02, 2018
Separating signals from noisy social listening data has long been a problem for data scientists. Poor precision due to slang, sarcasm, and implicit meaning has often made this data too challenging to model effectively. Today, however, new approaches that leverage active machine learning are rapidly overtaking aging rules-based techniques and opening up use of this data in new and important ways. This paper provides some detail on the evolution of text analysis, including current best practices and how AI can be used by data scientists to put this data to meaningful analysis.
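One common form of the active machine learning mentioned above is uncertainty sampling, sketched below with scikit-learn; this is an illustrative pattern, not Converseon's product code, and the example texts and labels are invented.

```python
# Active learning sketch with uncertainty sampling: the model asks a human to
# label only the posts it is least sure about.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labeled_texts  = ["love this phone", "terrible battery", "great value", "awful support"]
labels         = [1, 0, 1, 0]                       # 1 = positive, 0 = negative
unlabeled_pool = ["yeah, 'great' battery...", "works fine", "best purchase ever"]

vec = TfidfVectorizer()
X_labeled = vec.fit_transform(labeled_texts)
X_pool    = vec.transform(unlabeled_pool)

clf = LogisticRegression()
clf.fit(X_labeled, labels)

# Pick the pool item whose predicted probability is closest to 0.5 and send it
# to a human annotator next; its label improves the model where it is weakest.
proba = clf.predict_proba(X_pool)[:, 1]
most_uncertain = int(np.argmin(np.abs(proba - 0.5)))
print("label this next:", unlabeled_pool[most_uncertain])
```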
Tags : 
    
Converseon
Published By: Semantic Web Company     Published Date: Jun 27, 2018
Get a comprehensive introduction to AI technologies and learn why semantics should be a fundamental element of any AI strategy. Semantic enhanced artificial intelligence (Semantic AI) is based on the fusion of semantic technologies and machine learning. In this white paper, you will understand how to align the work of data scientists and subject matter experts to increase the business value of your data lake.
Tags : 
    
Semantic Web Company
Published By: Wave Computing     Published Date: Jul 06, 2018
This paper argues a case for the use of coarse grained reconfigurable array (CGRA) architectures for the efficient acceleration of the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clock-less, CGRA-based System on Chips (SoCs) each containing 16,000 processing elements (PEs). This paper describes the tools needed for efficient compilation of data flow graphs to the CGRA architecture, and outlines Wave Computing’s WaveFlow software (SW) framework for the online mapping of models from popular workflows like Tensorflow, MXNet and Caffe.
Tags : 
    
Wave Computing
Published By: Splice Machine     Published Date: May 19, 2014
SQL-on-Hadoop solutions have become very popular recently as companies solve the data access issues with Hadoop or seek a scale-out alternative for traditional relational database management systems. However, with all of the options available, choosing which solution is right for your business can be a daunting task. This white paper discusses the options you should consider and questions to ask, including:
· Is it really “Real-Time”?
· Is it true SQL?
· Does it support secondary indexes?
· Can it efficiently handle sparse data?
· Can it deliver fast performance on massive joins?
Read this white paper to get a better understanding of the SQL-on-Hadoop landscape and what questions you should ask to identify the best solution for your business.
Tags : 
white paper, splice machine, sql, hadoop, nosql, nosql white paper, hadoop white paper, dataversity
    
Splice Machine
Published By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
The bar for success is rising in higher education.  University leaders and IT administrators are aware of the compelling benefits of digital transformation overall—and artificial intelligence (AI) in particular. AI can amplify human capabilities by using machine learning, or deep learning, to convert the fast-growing and plentiful sources of data about all aspects of a university into actionable insights that drive better decisions. But when planning a transformational strategy, these leaders must prioritize operational continuity. It’s critical to protect the everyday activities of learning, research, and administration that rely on the IT infrastructure to consistently deliver data to its applications.
Tags : 
    
Hewlett Packard Enterprise
Published By: TIBCO Software     Published Date: Mar 15, 2019
On-demand Webinar
The current trend in manufacturing is towards tailor-made products in smaller lots with shorter delivery times. This change may lead to frequent production modifications resulting in increased machine downtime, higher production cost, product waste, and the need to rework faulty products. Watch this webinar to learn how TIBCO’s Smart Manufacturing solutions can help you overcome these challenges. You will also see a demonstration of TIBCO technology in action around improving yield and optimizing processes while also saving costs.
What you will learn:
· Applying advanced analytics and machine learning/AI techniques to optimize complex manufacturing processes
· How multivariate statistical process control can help to detect deviations from a baseline
· How to monitor OEE in real time and produce a 360 view of your factory
The webinar also highlights customer case studies from clients who have already successfully implemented process optimization models.
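For readers unfamiliar with multivariate SPC, the sketch below scores new sensor readings with Hotelling's T-squared statistic against a baseline of in-control data and flags large deviations; the sensors, data, and control limit are invented for illustration and are not TIBCO's implementation.

```python
# Multivariate SPC sketch: Hotelling's T^2 of new readings against a baseline.
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.normal(loc=[70.0, 1.2], scale=[0.5, 0.05], size=(200, 2))  # temp, pressure

mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def t_squared(x):
    """Distance of a reading from the baseline, accounting for correlation."""
    d = x - mu
    return float(d @ cov_inv @ d)

CONTROL_LIMIT = 10.0   # in practice derived from an F-distribution quantile

for reading in [np.array([70.2, 1.21]), np.array([72.5, 1.05])]:
    score = t_squared(reading)
    status = "DEVIATION" if score > CONTROL_LIMIT else "in control"
    print(reading, round(score, 1), status)
```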
Tags : 
    
TIBCO Software
Published By: IBM APAC     Published Date: Mar 06, 2019
This Forrester study examines the cost savings and business benefits enabled by Watson Studio and Watson Knowledge Catalog. Watson Studio provides a suite of tools for data scientists, application developers, and subject matter experts to collaboratively and easily work with data and use that data to build, train, and deploy machine learning models at scale. The study gives readers a framework to evaluate the potential financial impact of a Watson Studio and Watson Knowledge Catalog investment on their organizations.
Tags : 
    
IBM APAC
Published By: IBM APAC     Published Date: Mar 06, 2019
A single environment to build, train, and deploy machine learning and deep learning models. Take advantage of machine learning and AI to analyze your data. Catalog your data to make it easy to find. All applications are free and without time limit!
Tags : 
    
IBM APAC
Published By: Avi Networks     Published Date: Mar 12, 2019
TCPdump may be old, but does that matter?
· Application proliferation as apps move from bare metal to virtual machines to containers
· Network teams are asked to do more with the same number of people
· Network analysis in public clouds and containers requires different tools
Tags : 
    
Avi Networks
Published By: Intel     Published Date: Feb 28, 2019
The confluence of AI and Industry 4.0 is transforming image processing. As image vision becomes widespread, there is an increasing need to transition stand-alone imaging to an integrated driver of automation feeding insights back into the business systems that monitor overall factory performance. Download the whitepaper to learn more about fitting multiple demands into a single platform:
• Building an industrial system with advanced functions like machine vision and Industry 4.0 connectivity
• Minimizing the footprint of the systems to save space, cost and power consumption
• Adhering to principles of long life, safety, reliability, and real-time control functionality alongside AI and IIoT capabilities
Tags : 
    
Intel
Published By: Intel     Published Date: Feb 28, 2019
Keeping the lights on in a manufacturing environment remains a top priority for industrial companies. All too often, factories are in a reactive mode, relying on manual inspections that risk downtime because they don’t usually reveal actionable problem data. Find out how the Nexcom Predictive Diagnostic Maintenance (PDM) system enables uninterrupted production during outages by noninvasively monitoring each unit in the Diesel Uninterrupted Power Supplies (DUPS) system.
• Using vibration analysis, the system can detect 85% of power supply problems before they do damage or cause failure
• Information processing for machine diagnostics is done at the edge, providing real-time alerts on potential issues with ample lead time for managers to rectify them
• The graphic user interface offers visual representation and analysis of historical and trending data that is easily consumable
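As a rough sketch of the kind of vibration analysis described, the code below takes an FFT of an accelerometer window and compares the energy in an assumed fault band against a baseline threshold; the sampling rate, band, and threshold are invented rather than taken from the Nexcom system.

```python
# Vibration-analysis sketch: FFT of an accelerometer window, then a check of
# the energy in a hypothetical fault band against a baseline threshold.
import numpy as np

FS = 2000                     # samples per second (assumed)
t = np.arange(0, 1.0, 1 / FS)

# Simulated signal: 50 Hz running speed plus a 120 Hz component and noise.
signal = np.sin(2 * np.pi * 50 * t) + 0.4 * np.sin(2 * np.pi * 120 * t)
signal += 0.1 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / FS)

band = (freqs > 110) & (freqs < 130)          # hypothetical fault band
band_energy = float(np.sum(spectrum[band] ** 2))

BASELINE_ENERGY = 0.005                       # assumed, learned from healthy history
if band_energy > BASELINE_ENERGY:
    print(f"alert: fault-band energy {band_energy:.4f} exceeds baseline")
else:
    print("vibration signature within baseline")
```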
Tags : 
    
Intel