data processing

Published By: Wave Computing     Published Date: Jul 06, 2018
This paper argues the case for using coarse-grained reconfigurable array (CGRA) architectures to efficiently accelerate the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems, such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clockless, CGRA-based systems-on-chip (SoCs), each containing 16,000 processing elements (PEs). The paper describes the tools needed to compile data flow graphs efficiently to the CGRA architecture, and outlines Wave Computing's WaveFlow software framework for the online mapping of models from popular frameworks like TensorFlow, MXNet, and Caffe.
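To make the mapping idea concrete, here is a purely illustrative sketch (not Wave Computing's actual WaveFlow compiler) of placing the nodes of a toy data flow graph onto a fixed pool of processing elements; the graph, the round-robin placement, and all names are hypothetical.

    # Toy data flow graph mapped round-robin onto a pool of processing
    # elements (PEs). A real compiler performs far more sophisticated
    # placement; everything here is illustrative.
    from collections import namedtuple

    Node = namedtuple("Node", ["name", "op", "inputs"])

    graph = [
        Node("x", "input", []),
        Node("w", "const", []),
        Node("mul", "matmul", ["x", "w"]),
        Node("act", "relu", ["mul"]),
    ]

    NUM_PES = 16000  # each CGRA-based SoC contains 16,000 PEs, per the paper

    def map_to_pes(nodes, num_pes=NUM_PES):
        """Assign each graph node a PE index (naive round-robin placement)."""
        return {node.name: i % num_pes for i, node in enumerate(nodes)}

    print(map_to_pes(graph))  # {'x': 0, 'w': 1, 'mul': 2, 'act': 3}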
Tags : 
    
Wave Computing
Published By: MemSQL     Published Date: Jun 25, 2014
Emerging business innovations focused on realizing quick business value from new and growing data sources require "hybrid transactional and analytical processing" (HTAP): performing analysis on data directly in an operational data store. While this is not a new idea, Gartner reports that the potential of HTAP has not been fully realized due to technology limitations and inertia in IT departments. MemSQL offers a unique combination of performance, flexibility, and ease of use that allows companies to implement HTAP to power their business applications.
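As a rough illustration of the HTAP pattern, the sketch below records a transaction and immediately aggregates over the same live table, with no ETL hop to a separate warehouse. It assumes a MySQL-wire-compatible endpoint (MemSQL speaks the MySQL protocol) reachable via pymysql; the host, credentials, and schema are hypothetical.

    # HTAP in miniature: one operational write, one analytical read, same store.
    import pymysql

    conn = pymysql.connect(host="memsql-host", user="app", password="secret",
                           database="shop", autocommit=True)
    with conn.cursor() as cur:
        # OLTP side: record an order as it happens.
        cur.execute("INSERT INTO orders (customer_id, amount) VALUES (%s, %s)",
                    (42, 19.99))
        # OLAP side: aggregate directly over the live operational data.
        cur.execute("SELECT customer_id, SUM(amount) FROM orders "
                    "GROUP BY customer_id ORDER BY 2 DESC LIMIT 10")
        for customer_id, total in cur.fetchall():
            print(customer_id, total)
    conn.close()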
Tags : 
    
MemSQL
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information often takes the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
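For a flavor of what such analysis can look like, here is a generic, hedged sketch (not Skytree's product API) that clusters raw log lines using TF-IDF features and k-means from scikit-learn; the log lines are invented.

    # Surface structure in semi-structured logs: vectorize, then cluster.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    logs = [
        "ERROR db timeout connecting to replica",
        "INFO user 1042 logged in",
        "ERROR db timeout connecting to primary",
        "INFO user 3311 logged in",
    ]

    features = TfidfVectorizer().fit_transform(logs)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    for label, line in zip(labels, logs):
        print(label, line)  # error lines and login lines land in separate clusters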
Tags : 
log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data, data analysis
    
Skytree
Published By: WhereScape     Published Date: Aug 18, 2016
Data Vault 2.0 leverages parallel database processing for large data sets and provides an extensible design approach that enables agile development. WhereScape provides data warehouse automation software that enables agile Data Vault project delivery through accelerated development, documentation, and deployment without sacrificing quality or flexibility.
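One concrete Data Vault 2.0 convention behind that parallel loading is keying hub rows by a deterministic hash of the business key, so hubs, links, and satellites can load independently with no sequence or lookup dependencies. A minimal sketch, with illustrative names rather than WhereScape-generated code:

    # DV 2.0-style hub row: hash key derived from the normalized business key.
    import hashlib
    from datetime import datetime, timezone

    def hub_hash_key(business_key: str) -> str:
        """Normalize the business key, then hash it (MD5 is a common choice)."""
        return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

    row = {
        "hub_customer_hk": hub_hash_key("cust-0042"),
        "customer_bk": "cust-0042",
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": "crm",
    }
    print(row)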
Tags : 
    
WhereScape
Published By: Basho     Published Date: Nov 25, 2015
The landscape of scalable operational and analytical systems is changing, disrupting the norm of using relational databases for all workloads. The growing need to process and analyze Big Data at scale has increased demand for alternative strategies and given rise to NoSQL databases for scalable processing. Mike Ferguson, Managing Director of Intelligent Business Strategies, is an independent IT analyst specializing in Big Data, BI/analytics, data management, and enterprise business integration. In this whitepaper he discusses the movement toward NoSQL databases for scalable operational and analytical systems, what is driving Big Data analytics from Hadoop to the emergence of Apache Spark, the value of operational analytics and the importance of in-memory processing, and why Apache Spark makes sense as an in-memory analytical platform for operational analytics.
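To illustrate the in-memory analytics angle, here is a hedged PySpark sketch: operational events are loaded once, cached in memory, and then repeatedly aggregated without touching the source store again. The path and field names are hypothetical.

    # Spark as an in-memory analytical layer over operational event data.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("operational-analytics").getOrCreate()

    events = spark.read.json("/data/operational/events.json").cache()  # pin in memory

    # Subsequent queries are served from the cached in-memory dataset.
    events.groupBy("region").agg(
        F.count("*").alias("events"),
        F.avg("latency_ms").alias("avg_latency"),
    ).show()

    spark.stop()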
Tags : 
    
Basho
Published By: DATAVERSITY     Published Date: Dec 27, 2013
Many elements of such a vision are working together. ACID and NoSQL are not the antagonists they were once thought to be: NoSQL works well under a BASE model, and some innovative NoSQL systems fully conform to ACID requirements. Database engineers have puzzled out how to get non-relational systems to work in environments that demand high availability and scalability with differing levels of recovery and partition tolerance. BASE remains a leading innovation wedded to the NoSQL model, and the two have evolved together harmoniously. But that doesn't mean they always have to be in partnership; there are several options. So while the opening anecdote is true in many cases, organizations that need more diverse possibilities can move into the commercial arena and get the specific option that works best for them. This paper is sponsored by MarkLogic.
Tags : 
nosql, database, acid v base, white paper
    
DATAVERSITY
Published By: MarkLogic     Published Date: Jun 16, 2013
The primary issue discussed in this paper boils down to two disparate database reliability models: ACID vs. BASE. The first (ACID) has been around for some 30+ years, is a proven industry standard for SQL-centric and other relational databases, and works remarkably well in the older, yet still extant, world of vertical scaling. The second (BASE) has gained popularity only over the past 10 years or so, especially with the rise of social networking, Big Data, NoSQL, and other leviathans in the new world of data management. BASE requirements rose out of the need for ever-expanding, horizontally scaled distributed networks with non-relational data stores and the real-time availability constraints of web-based transaction processing. While there are now more crossovers and negotiations between the two models, they essentially represent two competing groups, with Brewer's CAP theorem acting as the referee in the middle, forcing tough decisions on each team.
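To ground the trade-off, here is a small illustrative sketch (not from the paper) of the quorum arithmetic used in Dynamo-style distributed stores: with N replicas, read quorum R, and write quorum W, choosing R + W > N guarantees that reads overlap the latest write, while smaller quorums favor availability and latency, which is BASE territory.

    # Quorum rule of thumb: R + W > N  =>  strongly consistent reads.
    def is_strongly_consistent(n: int, r: int, w: int) -> bool:
        return r + w > n

    for n, r, w in [(3, 2, 2), (3, 1, 1), (5, 1, 5)]:
        mode = "strong (ACID-leaning)" if is_strongly_consistent(n, r, w) else "eventual (BASE)"
        print(f"N={n} R={r} W={w}: {mode}")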
Tags : 
data, data management, unstructured data, nosql, database, acid, base, database transactioning
    
MarkLogic
Published By: IBM     Published Date: Jun 04, 2018
"The appearance of your reports and dashboards – the actual visual appearance of your data analysis -- is important. An ugly or confusing report may be dismissed, even though it contains valuable insights about your data. Cognos Analytics has a long track record of high quality analytic insight, and now, we added a lot of new capabilities designed to help even novice users quickly and easily produce great-looking and consumable reports you can trust. Watch this webinar to learn: • How you can more effectively communicate with data. • What constitutes an intuitive and highly navigable report • How take advantage of some of the new capabilities in Cognos Analytics to create reports that are more compelling and understandable in less time. • Some of the new and exciting capabilities coming to Cognos Analytics in 2018 (hint: more intelligent capabilities with enhancements to Natural Language Processing, data discovery and Machine Learning)."
Tags : 
data analysis, data analytics, dashboards
    
IBM
Published By: Amazon Web Services     Published Date: Jul 16, 2018
The Internet of Things (IoT) is composed of sensor-embedded devices and machines that exchange data with each other and the cloud through a secure network. Often referred to as “things” or “edge devices”, these intelligent machines connect to the internet either directly or through an IoT gateway, enabling them to send data to the cloud. Analyzing this data can reveal valuable insights about these objects and the business processes they’re part of, helping enterprises optimize their operations. Devices in IoT deployments can span nearly any industry or use case. Each one is equipped with sensors, processing power, connectivity, and software, enabling asset control and other remote interactions over the internet. Unlike traditional IT assets, these edge devices are resource-constrained (either by bandwidth, storage, or processing power) and are typically found outside of a data center, creating unique security and management considerations.
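As a concrete (and hypothetical) example of an edge device sending data to the cloud, the sketch below publishes a sensor reading over MQTT, a lightweight protocol widely used between constrained devices and IoT gateways. It assumes the paho-mqtt 1.x client library; the endpoint, topic, and payload are invented, and a production device would also configure TLS and device certificates.

    # A resource-constrained device pushing telemetry to a gateway via MQTT.
    import json
    import time
    import paho.mqtt.client as mqtt  # paho-mqtt 1.x style API

    client = mqtt.Client(client_id="pump-sensor-17")
    client.connect("iot-gateway.example.com", 1883)

    reading = {"device_id": "pump-sensor-17", "temp_c": 71.3, "ts": time.time()}
    client.publish("factory/pumps/telemetry", json.dumps(reading), qos=1)
    client.disconnect()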
Tags : 
    
Amazon Web Services
Published By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that the impact of downtime affects a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags : 
vision, high availability, ibm, aix, cdp, core union, database security
    
Vision Solutions
Published By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers, and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
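As a generic illustration of the primitive behind such location queries (this is plain Python, not SAP HANA's spatial SQL), the sketch below computes the great-circle distance between two coordinates, the building block of "customers within X km" analyses.

    # Haversine distance: the core of many radius and routing calculations.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers between two (lat, lon) points."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius of ~6371 km

    print(round(haversine_km(49.29, 8.64, 48.14, 11.58)))  # Walldorf to Munich, ~250 km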
Tags : 
    
SAP
Published By: Cisco EMEA Tier 3 ABM     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types produced at ever-increasing rates. Analytics combines statistics, machine learning, and data preprocessing to extract valuable information and insights from big data.
Tags : 
big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
    
Cisco EMEA Tier 3 ABM
Published By: SAP     Published Date: May 18, 2014
In-memory technology, in which entire datasets are pre-loaded into a computer's random access memory to avoid shuttling data between memory and disk storage every time a query is initiated, has actually been around for a number of years. However, with the onset of big data and an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
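A toy sketch of that core idea, assuming a hypothetical sales file: pay the disk cost once at load time, then serve every subsequent query from RAM (here via pandas).

    # Load once from disk; every later "query" is a pure in-memory operation.
    import pandas as pd

    df = pd.read_csv("/data/sales.csv")  # hypothetical file; one-time disk read

    by_region = df.groupby("region")["revenue"].sum()  # no disk round trip
    top_products = df.nlargest(10, "revenue")[["product", "revenue"]]
    print(by_region.head())
    print(top_products)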
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools, analytical applications
    
SAP
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags : 
database usage, database management, server usage, data protection
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would that mean for your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They are the engines that drive critical online transaction processing (OLTP) and online analytical processing (OLAP) applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : 
cost reduction, oracle database, it operation, online transaction, online analytics
    
Hewlett Packard Enterprise
Published By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
Tags : 
    
Cisco
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today's databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally maintain both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on "stale data" for business decisions. With Oracle Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data.
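The sketch below shows, in plain Python, why the layout matters: the same three records stored row-wise (each record contiguous, ideal for fetching one order in OLTP) and column-wise (each field contiguous, ideal for scanning one attribute in analytics). Dual-format in-memory stores keep both so each workload gets its preferred layout.

    # One logical table, two physical layouts.
    rows = [  # row format: all fields of a record sit together
        {"id": 1, "region": "east", "amount": 120.0},
        {"id": 2, "region": "west", "amount": 75.5},
        {"id": 3, "region": "east", "amount": 310.0},
    ]

    columns = {  # column format: all values of a field sit together
        "id": [1, 2, 3],
        "region": ["east", "west", "east"],
        "amount": [120.0, 75.5, 310.0],
    }

    record = rows[1]                # OLTP-style access: one row, every field
    total = sum(columns["amount"])  # OLAP-style access: one column, every row
    print(record, total)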
Tags : 
    
Oracle CX
Published By: IBM APAC     Published Date: Apr 27, 2018
While relying on x86 servers and Oracle databases to support its stock trading systems, Wanlian Securities found that processing a rapidly increasing number of transactions fast became a huge challenge. The firm shifted to IBM FlashSystem, which helped cut the average response time for its Oracle databases from 10 milliseconds to less than 0.4 milliseconds and improved CPU usage by 15%. Download this case study now.
Tags : 
    
IBM APAC
Published By: Dome9     Published Date: Apr 25, 2018
As of May 2017, according to a report from The Depository Trust & Clearing Corporation (DTCC), which provides financial transaction and data processing services for the global financial industry, cloud computing has reached a tipping point [1]. Today, financial services companies can benefit from the capabilities and cost efficiencies of the cloud. In October 2016, the Federal Deposit Insurance Corporation (FDIC), the Office of the Comptroller of the Currency (OCC), and the Federal Reserve Board (FRB) jointly announced enhanced cyber risk management standards for financial institutions in an Advance Notice of Proposed Rulemaking (ANPR) [2]. These proposed standards for enhanced cybersecurity aim to protect the entire financial system, not just the institution. To meet them, financial institutions will require the right cloud-based network security platform for comprehensive security management, verifiable compliance and governance, and active protection of customer data.
Tags : 
    
Dome9
Published By: Dell EMC     Published Date: Nov 09, 2015
This business-oriented white paper summarizes the wide-ranging benefits of the Hadoop platform, highlights common data processing use cases and explores examples of specific use cases in vertical industries. The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell and its partners Cloudera and Intel.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Oct 08, 2015
Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
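For a concrete example of the traditional reduction the passage refers to, here is a short illustrative sketch of reservoir sampling, which keeps a fixed-size uniform random sample of a stream too large to hold in memory, without knowing the stream's length in advance.

    # Algorithm R: each of the first i+1 items ends up in the sample with
    # probability k/(i+1).
    import random

    def reservoir_sample(stream, k, seed=0):
        rng = random.Random(seed)
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)    # fill the reservoir first
            else:
                j = rng.randint(0, i)  # uniform index into 0..i
                if j < k:
                    sample[j] = item   # replace with shrinking probability
        return sample

    print(reservoir_sample(range(1_000_000), k=5))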
Tags : 
    
Dell EMC
Published By: SAS     Published Date: Apr 16, 2015
Former Intel CEO Andy Grove once coined the phrase, "Technology happens." As true as Grove's pat aphorism has become, it's not always good news. Twenty years ago, no one ever got fired for buying IBM. In the heyday of customer relationship management (CRM), companies bought first and asked questions later. Nowadays, executives are being enlightened by the promise of big data technologies and the role data plays in the fact-based enterprise. Leaders in business and IT alike are waking up to the reality that, despite the hype around platforms and processing speeds, their companies have failed to establish sustained processes and skills around data.
Tags : 
    
SAS
Published By: EMC Converged Platforms     Published Date: Oct 22, 2015
Old Dutch Foods, known for its broad selection of snack foods in the midwestern United States and Canada, was struggling to get the right products to the right places at the right time. Its data center included outdated physical servers, and batch processing meant that inventory was not updated until the end of the day rather than in real time. In addition, recovering from power outages and disk failures could take up to two weeks. To modernize its data center, Old Dutch Foods invested in EMC Converged Infrastructure: two quickly and easily deployed VCE VBlock® systems running JD Edwards, MS Exchange, and mobile device apps, plus a backup site with replicated applications and data. The move made the IT department more responsive to the business, allowed a shift to real-time inventory, and reduced CapEx and OpEx costs. Operations were simplified, cutting the person-hours needed for infrastructure maintenance by 75 percent.
Tags : 
    
EMC Converged Platforms
Published By: Adobe     Published Date: Feb 20, 2014
San Diego County District Attorney’s Office accelerates Juvenile Court proceedings using Adobe® Acrobat® Pro in Microsoft SharePoint environment.
Tags : 
adobe, adobe acrobat pro, file management, software, data management, electronic documentation, paperless processing, pdf documents, it management, information management
    
Adobe