
Published By: Databricks     Published Date: Sep 13, 2018
Learn how to get started with Apache Spark™. Apache Spark’s ability to speed analytic applications by orders of magnitude, its versatility, and its ease of use are quickly winning the market. With its appeal to developers, end users, and integrators for solving complex data problems at scale, Spark is now the most active open source project in the big data community. With rapid adoption by enterprises across a wide range of industries, Spark has been deployed at massive scale, collectively processing multiple petabytes of data on clusters of over 8,000 nodes. If you are a developer or data scientist interested in big data, learn how Spark may be the tool for you. Databricks is happy to present this ebook as a practical introduction to Spark. Download this ebook to learn:
• Spark’s basic architecture
• Why Spark is a popular choice for data analytics
• What tools and features are available
• How to get started right away through interactive sample code
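The ebook’s own interactive sample code is not reproduced in this listing. Purely as a hedged sketch of what a first Spark program can look like, the PySpark snippet below reads a JSON dataset and runs a simple aggregation; the file name ("events.json") and the "level" column are hypothetical.

    # Minimal PySpark sketch: count events by level in a JSON dataset.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("spark-intro").getOrCreate()

    # Read a JSON file into a distributed DataFrame (hypothetical file name).
    events = spark.read.json("events.json")

    # The same aggregation code runs unchanged on a laptop or a large cluster.
    counts = events.groupBy("level").count().orderBy(F.desc("count"))
    counts.show()

    spark.stop()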
Tags : 
    
Databricks
Published By: Wave Computing     Published Date: Jul 06, 2018
This paper argues a case for the use of coarse-grained reconfigurable array (CGRA) architectures for the efficient acceleration of the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems, such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clockless, CGRA-based systems on chips (SoCs), each containing 16,000 processing elements (PEs). This paper describes the tools needed for efficient compilation of data flow graphs to the CGRA architecture, and outlines Wave Computing’s WaveFlow software (SW) framework for the online mapping of models from popular frameworks like TensorFlow, MXNet and Caffe.
Tags : 
    
Wave Computing
Published By: graphgrid     Published Date: Oct 19, 2018
Graph databases are about to catapult across the famous technology adoption chasm and land in start-ups, enterprises and government agencies across the globe. The adoption antibodies are subsiding as the power of natively connected data becomes fundamental to any organization looking for data-driven insights across operations, suppliers, and customers. Moore’s Law increases in storage capacity and processing power can no longer keep up with the pace of data expansion, and how companies structure and analyze their data will ultimately determine their ability to compete. Unstructured, disconnected data is useless. Graph databases will rapidly jump from niche use cases to a transformative IT technology as they turn the data you collect into actionable insights. Data will become the single most differentiating asset for your organization.
Tags : 
    
graphgrid
Published By: DATAVERSITY     Published Date: Dec 27, 2013
There are actually many elements of such a vision that are working together. ACID and NoSQL are not the antagonists they were once thought to be; NoSQL works well under a BASE model, and some of the innovative NoSQL systems fully conform to ACID requirements. Database engineers have puzzled out how to get non-relational systems to work within an environment that demands high availability and scalability, with differing levels of recovery and partition tolerance. BASE remains a leading innovation wedded to the NoSQL model, and the two have evolved together harmoniously. But that doesn’t mean they always have to be in partnership; there are several options. So while the opening anecdote is true in many cases, organizations that need more diverse possibilities can move into the commercial arena and get the specific option that works best for them. This paper is sponsored by: MarkLogic.
Tags : 
nosql, database, acid v base, white paper
    
DATAVERSITY
Published By: MemSQL     Published Date: Jun 25, 2014
Emerging business innovations focused on realizing quick business value from new and growing data sources require “hybrid transactional and analytical processing” (HTAP), the notion of performing analysis on data directly in an operational data store. While this is not a new idea, Gartner reports that the potential for HTAP has not been fully realized due to technology limitations and inertia in IT departments. MemSQL offers a unique combination of performance, flexibility, and ease of use that allows companies to implement HTAP to power their business applications.
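As a loose illustration of the HTAP notion only (this says nothing about MemSQL’s own engine or API), the toy sketch below serves transactional writes and an analytical aggregate from the same operational table, with no separate warehouse load in between. The table and column names are made up.

    # Toy HTAP illustration: transactional writes and analytical reads share
    # one operational store (here, an in-memory SQLite database).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")

    # Operational side: transactions land as they happen.
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, "EMEA", 120.0), (2, "APAC", 75.5), (3, "EMEA", 210.0)])
    conn.commit()

    # Analytical side: aggregate directly over the live operational table.
    for region, total in conn.execute(
            "SELECT region, SUM(amount) FROM orders GROUP BY region"):
        print(region, total)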
Tags : 
    
MemSQL
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
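Skytree’s platform itself is not shown in this listing. As a generic, hedged sketch of applying machine learning to semi-structured log text, the snippet below uses scikit-learn with made-up log lines to cluster similar messages so recurring issues surface without hand-written parsing rules.

    # Generic sketch: cluster semi-structured log messages by text similarity.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    logs = [
        "ERROR db connection timed out after 30s",
        "WARN retrying payment request id=4411",
        "ERROR db connection refused by host",
        "INFO user login succeeded id=778",
    ]

    # Turn free-text messages into numeric feature vectors.
    vectors = TfidfVectorizer().fit_transform(logs)

    # Group similar messages; each cluster hints at one recurring issue.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
    print(list(zip(labels, logs)))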
Tags : 
log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data, data analysis
    
Skytree
Published By: Expert System     Published Date: Mar 19, 2015
Establishing context and knowledge capture. In today’s knowledge-infused world, it is vitally important for organizations of any size to deploy an intuitive knowledge platform that enables delivery of the right information at the right time, in a way that is useful and helpful. Semantic technology processes content for meaning, enabling it to understand words in context: this allows for better content processing and interpretation, which in turn supports content organization and navigation and increases findability.
Tags : 
enterprise data management, unstructured data, semantic technology, expert system
    
Expert System
Published By: Ontotext     Published Date: Dec 21, 2015
Learn how semantic technologies make any content intelligent and turn it into revenue for your publishing business. There is a smarter, cost-effective way for publishers to create, maintain and reuse content assets with higher accuracy. It is called dynamic semantic publishing. Putting semantic technologies to work for the publishing industry, dynamic semantic publishing is an efficient blend of those technologies that enables powerful experiences when it comes to publishers’ main stock in trade: processing and representing information.
Tags : 
    
Ontotext
Published By: WhereScape     Published Date: Aug 18, 2016
Data Vault 2.0 leverages parallel database processing for large data sets and provides an extensible approach to design that enables agile development. WhereScape provides data warehouse automation software solutions that enable Data Vault agile project delivery through accelerated development, documentation and deployment without sacrificing quality or flexibility.
Tags : 
    
WhereScape
Published By: Basho     Published Date: Nov 25, 2015
The landscape of scalable operational and analytical systems is changing and disrupting the norm of using relational databases for all workloads. With the growing need to process and analyze Big Data at scale, demand for alternative strategies has grown and has given rise to NoSQL databases for scalable processing. Mike Ferguson, Managing Director of Intelligent Business Strategies, is an independent IT analyst who specializes in Big Data, BI/analytics, data management and enterprise business integration. In this whitepaper he discusses the movement towards NoSQL databases for scalable operational and analytical systems, what is driving Big Data analytics from Hadoop to the emergence of Apache Spark, the value of operational analytics and the importance of in-memory processing, and why to use Apache Spark as your in-memory analytical platform for operational analytics.
Tags : 
    
Basho
Published By: MarkLogic     Published Date: Jun 16, 2013
The primary issue discussed within this paper boils down to two disparate database reliability models: ACID vs BASE. The first (ACID) has been around for some 30+ years, is a proven industry standard for SQL-centric and other relational databases, and works remarkably well in the older, yet still extant, world of vertical scaling. The second (BASE) has only recently gained popularity over the past 10 years or so, especially with the rise of social networking, Big Data, NoSQL, and other leviathans in the new world of Data Management. BASE requirements rose out of a need for ever-expanding horizontally scaled distributed networks, with non-relational data stores, and the real-time availability constraints of web-based transaction processing. While there are now more crossovers and negotiations between the two models, they essentially represent two competing groups, with Brewer’s CAP Theorem acting as the referee in the middle forcing tough decisions on each team.
Tags : 
data, data management, unstructured data, nosql, database, acid, base, database transactioning
    
MarkLogic
Published By: Ricoh     Published Date: Oct 02, 2018
Your business is changing. As a finance leader, you know that accounting is a labour-intensive, costly process in which systems often don’t allow for expedient exception handling, and many days are spent struggling to match invoices against other databases for reconciliation. Like most companies, you know where you want to go, but may not have the infrastructure or internal expertise to handle electronic fund transfers, credit card payments or cheque processing: all the pieces required to make your vision for an efficient, integrated operation a reality.
Tags : 
    
Ricoh
Published By: Cognizant     Published Date: Oct 23, 2018
A group of emerging technologies is rapidly creating numerous opportunities for life sciences companies to improve productivity, enhance patient care and ensure regulatory compliance. These technologies include robotic process automation (RPA), artificial intelligence (AI), machine learning (ML), blockchain, the Internet of Things (IoT), 3-D printing and augmented reality/virtual reality (AR/VR). This whitepaper presents a preview of five pivotal technology trends remaking the life sciences industry: AI and automation, human augmentation, edge analytics/processing, data ownership and protection, and the intermingling of products and services.
Tags : 
cognizant, life sciences, patient care
    
Cognizant
Published By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been a cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures, but ever-growing data volumes and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing and analyzing large volumes of data. Getting data governance right is critical to your business success: that means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse, but to get the return on investment, you must infuse data governance processes as part of offloading.
Tags : 
    
Group M_IBM Q418
Published By: Group M_IBM Q418     Published Date: Oct 23, 2018
The General Data Protection Regulation (GDPR) seeks to create a harmonized data protection framework across the European Union and aims to give EU citizens back control of their personal data by imposing stricter requirements on those hosting and processing this data, anywhere in the world. IBM is committed to putting data responsibility first and providing solutions that are secure to the core for all customers. As such, IBM Cloud has fully adopted the EU Data Protection Code of Conduct for Cloud Service Providers, meaning we agree to meet the entirety of its stringent requirements.
Tags : 
    
Group M_IBM Q418
Published By: DocuSign UK     Published Date: Aug 08, 2018
"Many financial services firms have automated the vast majority of key processes and customer experiences. However, the “last mile” of most transactions – completing the agreement– far too often relies on the same inefficient pen-and-paper processes of yesteryear. Digitising agreements using DocuSign lets you keep processes digital from end to end. Completing transactions no longer requires documents to be printed and shipped, and re-keyed on the back end. Read the whitepaper to learn how leading financial services organisations use straight-through processing by automating the last mile of business transactions to: - Speed processes by 80% or more, often going from days or weeks to just minutes - Reduce NIGO by anywhere from 55% to 93% - Achieve a 300% average ROI "
Tags : 
    
DocuSign UK
Published By: TIBCO Software EMEA     Published Date: Sep 12, 2018
By processing real-time data from machine sensors using artificial intelligence and machine learning, it’s possible to predict critical events and take preventive action to avoid problems. TIBCO helps manufacturers around the world predict issues with greater accuracy, reduce downtime, increase quality, and improve yield. Read about our top data science best practices for becoming a smart manufacturer.
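TIBCO’s own products are not shown here. As a rough, hedged illustration of the underlying idea of learning a machine’s normal behaviour from sensor data and flagging departures from it, the snippet below uses scikit-learn with made-up temperature and vibration readings.

    # Illustrative sketch: flag anomalous sensor readings with an Isolation Forest
    # so maintenance can intervene before a critical event.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Hypothetical history of (temperature, vibration) readings for one machine.
    history = np.array([[71.2, 0.31], [70.8, 0.29], [71.5, 0.33], [70.9, 0.30],
                        [71.1, 0.32], [71.3, 0.31], [70.7, 0.28], [71.0, 0.30]])

    model = IsolationForest(contamination=0.1, random_state=0).fit(history)

    # Score a new reading as it streams in; -1 means it looks abnormal.
    new_reading = np.array([[78.4, 0.52]])
    if model.predict(new_reading)[0] == -1:
        print("Alert: reading outside the normal operating envelope", new_reading[0])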
Tags : 
inter-company connectivity, real-time tracking, automate analytic models, efficient analytics, collaboration
    
TIBCO Software EMEA
Published By: Magnetrol     Published Date: Nov 05, 2018
U.S. Department of Energy surveys show that minor adjustments in process management can incrementally improve efficiency in commercial and heavy industries. These include pulp & paper, chemicals, petroleum refining, mining and food processing, where as much as 60% of total energy consumption goes to the production of steam. The information-packed Steam Generation & Condensate Recovery Process Optimization kit from Magnetrol explains how effective instrumentation solutions can:
Tags : 
    
Magnetrol
Published By: NEC     Published Date: Sep 29, 2009
Written by IDC analysts Abner Germanow, Jonathan Edwards and Lee Doyle. IDC believes the convergence of communications and mainstream IT architectures will drive significant innovation in business processes over the next decade.
Tags : 
it architecture, idc, automation, automated order processing, soa, service oriented architecture, soap, http, xml, wsdl, uddi, esbs, jee, .net, crm, san
    
NEC
Published By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that the impact of downtime affects a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags : 
vision, high availability, ibm, aix, cdp, core union, database security
    
Vision Solutions
Published By: SAS     Published Date: Aug 17, 2018
This piece, a collaboration between SAS and Intel, demonstrates the value of modernizing your analytics infrastructure by running SAS® software on Intel processors. Readers will learn:
• The benefits of applying a consistent analytic vision across all functions within the organization to make more insight-driven decisions.
• How IT plays a pivotal role in modernizing analytics infrastructures.
• The competitive advantages of modern analytics.
Tags : 
    
SAS
Published By: SAP     Published Date: Feb 03, 2017
The SAP HANA platform provides a powerful unified foundation for storing, processing, and analyzing structured and unstructured data. It runs on a single, in-memory database, eliminating data redundancy and speeding up the time for information research and analysis.
Tags : 
    
SAP
Published By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
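SAP HANA’s own spatial SQL functions are not reproduced here. Purely as a hedged illustration of the kind of geospatial question involved, such as finding which customers fall inside a sales boundary, the snippet below uses the open-source Shapely library with made-up coordinates.

    # Generic point-in-polygon sketch: which customers lie inside a territory?
    from shapely.geometry import Point, Polygon

    # Hypothetical territory boundary and customer locations (lon, lat).
    territory = Polygon([(8.6, 49.8), (8.9, 49.8), (8.9, 50.1), (8.6, 50.1)])
    customers = {"A": Point(8.7, 49.9), "B": Point(9.2, 49.9), "C": Point(8.65, 50.0)}

    inside = [name for name, loc in customers.items() if territory.contains(loc)]
    print("Customers in territory:", inside)  # -> ['A', 'C']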
Tags : 
    
SAP
Published By: Cisco EMEA     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
Tags : 
big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
    
Cisco EMEA
Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
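As a loose, hedged illustration of the in-memory idea (not SAP’s technology), the sketch below reads a dataset from disk once into RAM and then answers repeated ad hoc queries without further disk I/O; the file and column names are hypothetical.

    # Illustrative sketch: load once into memory, then query repeatedly from RAM.
    import pandas as pd

    # One-time load from disk into an in-memory DataFrame (hypothetical file).
    sales = pd.read_csv("sales.csv")  # columns: region, product, amount

    # Subsequent queries scan memory only; no per-query disk round trips.
    by_region = sales.groupby("region")["amount"].sum()
    top_products = sales.groupby("product")["amount"].sum().nlargest(10)

    print(by_region)
    print(top_products)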
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools, analytical applications
    
SAP