Data Processing

Published By: MemSQL     Published Date: Jun 25, 2014
Emerging business innovations focused on realizing quick business value on new and growing data sources require “hybrid transactional and analytical processing” (HTAP), the notion of performing analysis on data directly in an operational data store. While this is not a new idea, Gartner reports that the potential for HTAP has not been fully realized due to technology limitations and inertia in IT departments. MemSQL offers a unique combination of performance, flexibility, and ease of use that allows companies to implement HTAP to power their business applications.
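The HTAP idea is easiest to see in code: the same table serves a transactional insert and, moments later, an analytical aggregate, with no ETL hop to a warehouse. Below is a minimal Python sketch, assuming a MySQL-wire-compatible MemSQL endpoint and a hypothetical `orders` table; connection details are illustrative.

```python
# A minimal HTAP sketch: transactional write and analytical read against
# the same operational table. Host, credentials, and schema are invented.
import pymysql

conn = pymysql.connect(host="memsql-host", user="app", password="...", database="shop")
with conn.cursor() as cur:
    # Transactional write: record a new order in the operational store.
    cur.execute(
        "INSERT INTO orders (customer_id, amount, created_at) VALUES (%s, %s, NOW())",
        (42, 19.99),
    )
    conn.commit()

    # Analytical read: aggregate directly over the same live table,
    # with no ETL hop into a separate warehouse.
    cur.execute(
        "SELECT customer_id, SUM(amount) AS lifetime_value "
        "FROM orders GROUP BY customer_id ORDER BY lifetime_value DESC LIMIT 10"
    )
    for customer_id, lifetime_value in cur.fetchall():
        print(customer_id, lifetime_value)
conn.close()
```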
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
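The brief does not describe Skytree's internals, so the sketch below uses scikit-learn to show the general shape of the task: turning unstructured text into features and learning a classifier. The tickets and labels are made up.

```python
# A toy illustration of classifying unstructured text: TF-IDF features
# feeding a linear model. Training data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = [
    "payment failed with error code 500",
    "great service, very happy with support",
    "cannot log in, password reset loop",
    "love the new dashboard layout",
]
labels = ["problem", "praise", "problem", "praise"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tickets, labels)
print(model.predict(["checkout keeps returning an error"]))  # -> ['problem']
```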
Tags: log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data
Published By: WhereScape     Published Date: Aug 18, 2016
Data Vault 2.0 leverages parallel database processing for large data sets and provides an extensible approach to design that enables agile development. WhereScape provides data warehouse automation software solutions that enable Data Vault agile project delivery through accelerated development, documentation and deployment without sacrificing quality or flexibility.
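One Data Vault 2.0 convention behind that parallelism is worth illustrating: surrogate keys are derived by hashing business keys, so hubs, links, and satellites can load independently without waiting on sequence generators. A minimal sketch with hypothetical column names; this is not WhereScape-generated code.

```python
# Data Vault 2.0 commonly derives hub keys by hashing the business key,
# which lets related tables load in parallel. Columns and data are invented.
import hashlib

def hash_key(*business_key_parts: str) -> str:
    """Deterministic surrogate key from one or more business key parts."""
    normalized = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hub row: one entry per distinct business key.
hub_customer = {"customer_hk": hash_key("C-1001"), "customer_bk": "C-1001"}

# Satellite row: descriptive attributes keyed by the same hash, loadable
# without waiting for a sequence number from the hub.
sat_customer = {"customer_hk": hash_key("C-1001"), "name": "Acme Ltd."}

print(hub_customer["customer_hk"] == sat_customer["customer_hk"])  # True
```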
Published By: Basho     Published Date: Nov 25, 2015
The landscape of scalable operational and analytical systems is changing, disrupting the norm of using relational databases for all workloads. With the growing need to process and analyze Big Data at scale, demand for alternative strategies has grown and given rise to NoSQL databases for scalable processing. Mike Ferguson, Managing Director of Intelligent Business Strategies, is an independent IT analyst who specializes in Big Data, BI/analytics, data management, and enterprise business integration. In this whitepaper he discusses the movement toward NoSQL databases for scalable operational and analytical systems, what is driving Big Data analytics from Hadoop to the emergence of Apache Spark, the value of operational analytics and the importance of in-memory processing, and why Apache Spark makes a strong in-memory analytical platform for operational analytics.
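For a concrete taste of the Spark pattern Ferguson discusses, the sketch below caches a dataset in memory once and then aggregates against it. The file path and column names are hypothetical.

```python
# A minimal PySpark sketch of in-memory analytics: cache a dataset once,
# then run aggregations against memory rather than disk.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("operational-analytics").getOrCreate()

events = spark.read.json("events.json").cache()  # pin the working set in memory

summary = (
    events.groupBy("region")
          .agg(F.count("*").alias("events"), F.avg("latency_ms").alias("avg_latency"))
)
summary.show()
spark.stop()
```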
Published By: DATAVERSITY     Published Date: Dec 27, 2013
Many elements of this vision work together. ACID and NoSQL are not the antagonists they were once thought to be: NoSQL works well under a BASE model, and some innovative NoSQL systems fully conform to ACID requirements. Database engineers have puzzled out how to get non-relational systems to work within environments that demand high availability and scalability, with differing levels of recovery and partition tolerance. BASE remains a leading innovation wedded to the NoSQL model, and the two have evolved harmoniously. But that doesn't mean they always have to be in partnership; there are several options, and organizations that need more diverse possibilities can move into the commercial arena and get the specific option that works best for them. This paper is sponsored by: MarkLogic.
Tags: nosql, database, acid vs base, white paper
Published By: MarkLogic     Published Date: Jun 16, 2013
The primary issue discussed within this paper boils down to two disparate database reliability models: ACID vs BASE. The first (ACID) has been around for some 30+ years, is a proven industry standard for SQL-centric and other relational databases, and works remarkably well in the older, yet still extant, world of vertical scaling. The second (BASE) has only recently gained popularity over the past 10 years or so, especially with the rise of social networking, Big Data, NoSQL, and other leviathans in the new world of Data Management. BASE requirements rose out of a need for ever-expanding horizontally scaled distributed networks, with non-relational data stores, and the real-time availability constraints of web-based transaction processing. While there are now more crossovers and negotiations between the two models, they essentially represent two competing groups, with Brewer’s CAP Theorem acting as the referee in the middle forcing tough decisions on each team.
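The ACID side of that contrast is easiest to see in a transaction that either commits in full or rolls back in full. A minimal illustration below uses SQLite as a stand-in for any ACID-compliant store; it is not MarkLogic code.

```python
# Illustrating the "A" in ACID: a transfer either commits in full or rolls
# back in full. SQLite stands in for any ACID-compliant database here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 100), ("bob", 0)])
conn.commit()

try:
    with conn:  # one atomic transaction: commit on success, roll back on error
        conn.execute("UPDATE accounts SET balance = balance - 150 WHERE name = 'alice'")
        cur = conn.execute("SELECT balance FROM accounts WHERE name = 'alice'")
        if cur.fetchone()[0] < 0:
            raise ValueError("insufficient funds")
        conn.execute("UPDATE accounts SET balance = balance + 150 WHERE name = 'bob'")
except ValueError:
    pass  # the partial debit was rolled back automatically

print(dict(conn.execute("SELECT * FROM accounts")))  # {'alice': 100, 'bob': 0}
```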
Tags: data, data management, unstructured data, nosql, database, acid, base, database transactioning
Published By: CA Technologies EMEA     Published Date: Aug 03, 2017
Using CA Live API Creator, you can execute business policies using Reactive Logic. You write simple declarative rules defining relationships across data fields, and they’re automatically enforced when changes occur—just like formulas in a spreadsheet. Reactive Logic should cover most of your application requirements, but you also have the ability to configure event processing or external callouts using server-side JavaScript or imported Java® libraries if you so desire.
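The product expresses these rules declaratively, with optional JavaScript callouts; the toy Python sketch below only mimics the spreadsheet-style idea of a derived field declared once and recomputed whenever its inputs change. It is not Live API Creator syntax, and the `Order` class is invented for illustration.

```python
# A toy sketch of spreadsheet-style reactive logic: "total" is declared
# once and re-derived whenever a dependency changes. Concept only; not
# the CA Live API Creator rule language.
class Order:
    RULES = {"total": lambda o: o.subtotal + o.tax}  # declarative derivations

    def __init__(self, subtotal, tax):
        object.__setattr__(self, "subtotal", subtotal)
        object.__setattr__(self, "tax", tax)
        self._recompute()

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        if name in ("subtotal", "tax"):  # a dependency changed...
            self._recompute()            # ...so enforce the rules

    def _recompute(self):
        for field, rule in self.RULES.items():
            object.__setattr__(self, field, rule(self))

order = Order(subtotal=100.0, tax=8.0)
print(order.total)   # 108.0
order.subtotal = 50  # rule fires automatically, like a spreadsheet formula
print(order.total)   # 58.0
```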
Tags: api, application programming interface, psd2, open banking, json, github
Published By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that the impact of downtime affects a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags: vision, high availability, ibm, aix, cdp, credit union, database security
Published By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers, and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
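A flavor of the question spatial analysis answers, sketched with the open source shapely library rather than SAP HANA's SQL spatial functions; the boundary and customer coordinates are made up.

```python
# Which customers fall inside a delivery boundary? A point-in-polygon
# check with shapely; geometry here is illustrative.
from shapely.geometry import Point, Polygon

delivery_zone = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
customers = {"anna": Point(1, 2), "ben": Point(5, 5), "cho": Point(3.5, 0.5)}

for name, location in customers.items():
    print(name, delivery_zone.contains(location))
# anna True, ben False, cho True
```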
Published By: Cisco EMEA Tier 3 ABM     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
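Those three ingredients compose naturally; below is a toy scikit-learn sketch with invented sensor readings, where preprocessing standardizes the raw values and a clustering model extracts the structure.

```python
# "Statistics, machine learning, and data preprocessing" in one small
# pipeline: standardize raw measurements, then cluster them.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

readings = np.array([[1.0, 200], [1.2, 210], [9.8, 40], [10.1, 35]])
pipeline = make_pipeline(StandardScaler(), KMeans(n_clusters=2, n_init=10))
print(pipeline.fit_predict(readings))  # e.g. [0 0 1 1]
```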
Tags: big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
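The pattern in miniature, sketched with pandas: pay the disk cost once to load the data, then answer repeated queries from RAM. The file and columns are invented for illustration.

```python
# In-memory analytics in miniature: one disk read, then every query is
# served from RAM. The dataset written here is tiny and illustrative.
import pandas as pd

pd.DataFrame({
    "region": ["east", "west", "east", "west"],
    "amount": [120.0, 80.0, 45.5, 210.0],
}).to_csv("sales.csv", index=False)

sales = pd.read_csv("sales.csv")  # the one-time load from disk into memory

# Subsequent ad hoc queries scan memory, not disk:
print(sales.groupby("region")["amount"].sum())
print(sales["amount"].mean())
```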
Tags: sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags: database usage, database management, server usage, data protection
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They're the engines that drive critical online transaction processing (OLTP) and online analytical processing (OLAP) applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags: cost reduction, oracle database, it operation, online transaction, online analytics
Published By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
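Conceptually, network-based positioning reduces to estimating the point most consistent with measured distances to access points at known coordinates. The sketch below is generic least-squares trilateration, not Cisco's Hyperlocation algorithm; coordinates and ranges are made up.

```python
# Generic trilateration: find the position whose distances to known
# access points best match the measured ranges.
import numpy as np
from scipy.optimize import least_squares

access_points = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
measured_dist = np.array([7.07, 7.07, 7.07])  # e.g. derived from RF signal

def residuals(pos):
    return np.linalg.norm(access_points - pos, axis=1) - measured_dist

estimate = least_squares(residuals, x0=np.array([5.0, 5.0]))
print(estimate.x)  # ~[5. 5.]
```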
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today's databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on "stale data" for business decisions. With Oracle Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data in real time.
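The row-versus-column trade-off can be sketched in a few lines: an aggregate over a column store scans one contiguous array, while the row store must walk every record. A schematic illustration only; real engines add compression and vectorized execution on top.

```python
# Row vs. column layout, schematically. Same data, same answer, very
# different access patterns for an analytical aggregate.
import numpy as np

# Row format (OLTP-friendly): each record kept together, cheap to insert.
row_store = [
    {"order_id": 1, "customer": "anna", "amount": 120.0},
    {"order_id": 2, "customer": "ben",  "amount": 80.0},
    {"order_id": 3, "customer": "anna", "amount": 45.5},
]
total_rows = sum(r["amount"] for r in row_store)  # walks every whole record

# Column format (analytics-friendly): each attribute stored contiguously.
column_store = {
    "order_id": np.array([1, 2, 3]),
    "customer": np.array(["anna", "ben", "anna"]),
    "amount":   np.array([120.0, 80.0, 45.5]),
}
total_cols = column_store["amount"].sum()  # scans one dense array

print(total_rows, total_cols)  # 245.5 245.5
```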
Published By: Dell EMC     Published Date: Nov 09, 2015
This business-oriented white paper summarizes the wide-ranging benefits of the Hadoop platform, highlights common data processing use cases and explores examples of specific use cases in vertical industries. The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell and its partners Cloudera and Intel.
Published By: Dell EMC     Published Date: Oct 08, 2015
Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
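The classic reduction technique alluded to here is easy to show: reservoir sampling keeps a fixed-size uniform sample of a stream too large to hold in full. A textbook sketch of Algorithm R, not a Dell EMC tool.

```python
# Reservoir sampling (Algorithm R): a uniform k-item sample of a stream
# of unknown length, using O(k) memory.
import random

def reservoir_sample(stream, k):
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            j = random.randint(0, i)  # replace with decreasing probability
            if j < k:
                sample[j] = item
    return sample

print(reservoir_sample(range(1_000_000), k=5))
```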
Published By: SAS     Published Date: Apr 16, 2015
Former Intel CEO Andy Grove once coined the phrase, "Technology happens." As true as Grove's pat aphorism has become, it's not always good news. Twenty years ago, no one ever got fired for buying IBM. In the heyday of customer relationship management (CRM), companies bought first and asked questions later. Nowadays, executives are being enlightened by the promise of big data technologies and the role data plays in the fact-based enterprise. Leaders in business and IT alike are waking up to the reality that, despite the hype around platforms and processing speeds, their companies have failed to establish sustained processes and skills around data.
Published By: EMC Converged Platforms     Published Date: Oct 22, 2015
Old Dutch Foods, known for its broad selection of snack foods in the midwestern United States and Canada, was struggling to get the right products to the right places at the right time. Its data center included outdated physical servers, and batch processing meant that inventory would not be updated until the end of the day rather than in real time. In addition, recovering from power outages and disk failures could take up to two weeks. To modernize its data center, Old Dutch Foods invested in EMC Converged Infrastructure, quickly and easily deploying two VCE VBlock® systems running JD Edwards, Microsoft Exchange, and mobile device apps, along with a backup site holding replicated applications and data. This enhanced the IT department's responsiveness to the business, allowed a shift to real-time inventory, and reduced CapEx and OpEx costs. Operations were simplified by reducing the person-hours needed for infrastructure maintenance by 75 percent.
Published By: Adobe     Published Date: Feb 20, 2014
San Diego County District Attorney’s Office accelerates Juvenile Court proceedings using Adobe® Acrobat® Pro in Microsoft SharePoint environment.
Tags: adobe, adobe acrobat pro, file management, software, data management, electronic documentation, paperless processing, pdf documents
Published By: Adobe     Published Date: Feb 20, 2014
Gain more efficient ways of working with documents and collaborating with others on them.
Tags: adobe, adobe acrobat pro, microsoft applications, collaboration, merging documents, editing documents, pdf to office format, file formatting
Published By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at amazing rates and will continue to grow rapidly. New techniques in data processing and analytics, including AI and machine and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Processing the data requires computer systems consisting of multi-core CPUs or GPUs using parallel processing and extremely fast networks. However, legacy storage solutions are based on architectures that are decades old, unscalable, and not well suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
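The concurrency the passage describes can be pictured with a few parallel workers, each demanding input at once; in production it is the storage tier that must keep them all fed. The chunked workload below is synthetic.

```python
# Many workers processing data chunks concurrently; a stand-in for the
# parallel feature-extraction or batch-preparation stages of ML training.
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for real per-chunk computation.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    chunks = [range(i * 100_000, (i + 1) * 100_000) for i in range(8)]
    with Pool(processes=8) as pool:
        results = pool.map(process_chunk, chunks)  # chunks processed in parallel
    print(sum(results))
```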
Tags: reporting, artificial intelligence, insights, organization, institution, recognition
Published By: HP - Enterprise     Published Date: Jun 04, 2013
Businesses are overwhelmed with data; it’s a blessing and a curse. A curse because it can overwhelm traditional approaches to storing and processing it. A blessing because the data promises business insight that never existed earlier. The industry has spawned a new term, “big data,” to describe it. Now, IT itself is overwhelmed with its own big data. In the press to roll out new services and technologies—mobility, cloud, virtualization—applications, networks, and physical and virtual servers grow in a sprawl. With them comes an unprecedented volume of data such as logs, events, and flows. It takes too much time and resources to sift through it, so most of it lies unexplored and unexploited. Yet like business data, it contains insight that can help us solve problems, make decisions, and plan for the future.
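The "sifting" step can be as simple as reducing raw log lines to counts by severity, sketched below; the log format and lines are invented, and real IT-operations tools do this at vastly larger scale.

```python
# Reduce raw log lines to counts by severity. Format and contents are
# illustrative only.
import re
from collections import Counter

log_lines = [
    "2013-05-01 12:00:01 ERROR disk /dev/sdb write timeout",
    "2013-05-01 12:00:02 INFO  user login ok",
    "2013-05-01 12:00:05 ERROR disk /dev/sdb write timeout",
    "2013-05-01 12:00:09 WARN  cpu temperature high",
]

severity = re.compile(r"^\S+ \S+ (\w+)")
counts = Counter(m.group(1) for line in log_lines if (m := severity.match(line)))
print(counts)  # Counter({'ERROR': 2, 'INFO': 1, 'WARN': 1})
```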
Tags: data research, big data, virtualization, applications, networks
Published By: Teradata     Published Date: Jan 30, 2015
This report is about two such architectures: Apache™ Hadoop® YARN and the Teradata® Aster® Seamless Network Analytical Processing (SNAP) Framework™. The report describes each architecture, illustrates the use of each in a business problem, and compares the results.
Tags: teradata, big data, analytics, insights, solutions, business opportunities, challenges
Published By: Teradata     Published Date: Jan 30, 2015
It is hard for data and IT architects to understand what workloads should move, how to coordinate data movement and processing between systems, and how to integrate those systems to provide a broader and more flexible data platform. To better understand these topics, it is helpful to first understand what Hadoop and data warehouses were designed for and what uses were not originally intended as part of the design.
Tags: teradata, big data, analytics, insights, solutions, business opportunities, challenges