transactional databases

Published By: Cloudant - an IBM Company     Published Date: Aug 01, 2015
The database you pick for your next web or mobile application matters now more than ever. Today’s applications are expected to run non-stop, and to do so they must efficiently manage continuously growing volumes of transactional and multi-structured data. This has caused NoSQL to grow from a buzzword into a serious consideration for every database decision, from small shops to the enterprise. Read this whitepaper to learn why NoSQL databases have become such a popular option, explore the various types available, and assess whether you should consider implementing a NoSQL solution for your next application.
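To make the “multi-structured data” point concrete, here is a minimal sketch of storing a schemaless JSON document in a CouchDB-compatible document store such as Cloudant, via its HTTP API (PUT /{db}/{docid}). The account URL, database name, and document contents are invented placeholders, and real use would require authentication.

```python
# Minimal sketch: storing a JSON document in a CouchDB-compatible
# document store such as Cloudant. The account URL, database name,
# and document below are hypothetical placeholders.
import json
import urllib.request

BASE_URL = "https://ACCOUNT.cloudant.com"    # placeholder account
DB = "orders"                                # placeholder database

# Documents need no fixed schema: each can carry its own structure.
doc = {
    "_id": "order-1001",
    "type": "order",
    "customer": {"name": "A. Jones", "tier": "gold"},
    "items": [{"sku": "X-42", "qty": 2}],    # nested, multi-structured data
}

req = urllib.request.Request(
    url=f"{BASE_URL}/{DB}/{doc['_id']}",
    data=json.dumps(doc).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",                            # PUT /{db}/{docid} creates the doc
)
# with urllib.request.urlopen(req) as resp:  # needs a real account + auth
#     print(resp.status, resp.read().decode())
```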
Tags : 
    
Cloudant - an IBM Company
Published By: SAP     Published Date: May 18, 2014
This white paper discusses the issues involved in the traditional practice of deploying transactional and analytic applications on separate platforms using separate databases. It analyzes the results of a user survey exploring these issues, conducted by IDC on SAP's behalf.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools, analytical applications
    
SAP
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business, so it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database for advanced analytics that uncovers deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation: first, I/O bottlenecks quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data. In-memory databases have helped address performance problems like these.
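The OLTP-to-warehouse handoff described above can be sketched in a few lines. In this toy illustration, sqlite3 stands in for both the transactional database and the warehouse, and the schema and table names are invented for the example; it also shows why the analytic copy is stale the moment the extract finishes.

```python
# Toy illustration of the traditional OLTP -> ETL -> warehouse flow.
# sqlite3 stands in for both systems; schema and table names are
# invented for this example.
import sqlite3

oltp = sqlite3.connect(":memory:")       # transactional side
dw = sqlite3.connect(":memory:")         # analytic side

oltp.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
oltp.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [(1, "east", 120.0), (2, "west", 75.5), (3, "east", 42.0)])

# Extract from OLTP, transform (aggregate by region), load into warehouse.
rows = oltp.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall()
dw.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")
dw.executemany("INSERT INTO sales_by_region VALUES (?, ?)", rows)

# Analysis now runs on a copy that is already stale relative to OLTP:
# any sale inserted after the extract above is invisible here.
print(dw.execute("SELECT * FROM sales_by_region").fetchall())
```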
Tags : 
    
Oracle CX
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, degrade the performance of mission-critical production databases, and deliver recovery time objectives (RTOs) and recovery point objectives (RPOs) measured in hours or even days. For high-volume, highly transactional databases that falls far short, potentially costing millions in lost productivity and revenue, regulatory penalties, and reputation damage from an outage or data loss.
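As a rough illustration of why hours-long RPOs get expensive, worst-case data-loss exposure can be estimated from transaction rate and backup interval. Every figure in the sketch below is hypothetical, chosen only to show the arithmetic.

```python
# Back-of-the-envelope exposure estimate: with periodic backups, the
# worst-case data loss (RPO) is the full interval since the last backup.
# All figures below are hypothetical.
transactions_per_second = 500        # sustained OLTP load (assumed)
revenue_per_transaction = 4.00       # average value in dollars (assumed)

def worst_case_loss(rpo_hours: float) -> float:
    """Revenue tied to transactions committed since the last backup."""
    lost_txns = transactions_per_second * rpo_hours * 3600
    return lost_txns * revenue_per_transaction

for rpo in (24.0, 1.0, 1 / 60):      # nightly vs hourly vs ~1-minute backups
    print(f"RPO {rpo:>6.2f} h -> up to ${worst_case_loss(rpo):,.0f} exposed")
```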
Tags : 
data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
    
Oracle ZDLRA
Published By: IBM     Published Date: Oct 13, 2016
Compare IBM DB2 pureScale with any other offering being considered for implementing a clustered, scalable database configuration: see how each delivers continuous availability and why that availability matters. Download now!
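One client-side behavior that clustered configurations aim to enable is transparent failover between members. The sketch below is a generic retry loop, not the actual DB2 pureScale client configuration; the member hosts and the connect() stub are placeholders.

```python
# Generic sketch of client-side failover in a clustered database:
# if one member is down, try the next. Hosts and the connect() stub
# are placeholders, not actual DB2 pureScale APIs.
import socket

MEMBERS = [("db-member-1.example.com", 50000),   # hypothetical hosts
           ("db-member-2.example.com", 50000)]

def connect(host: str, port: int, timeout: float = 2.0) -> socket.socket:
    """Stand-in for a real database driver's connect call."""
    return socket.create_connection((host, port), timeout=timeout)

def connect_any(members):
    for host, port in members:
        try:
            return connect(host, port)
        except OSError as exc:           # member down: fail over to the next
            print(f"{host}:{port} unavailable ({exc}); trying next member")
    raise ConnectionError("no cluster member reachable")

# conn = connect_any(MEMBERS)   # uncomment with real, reachable members
```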
Tags : 
data. queries, database operations, transactional databases, clustering, it management, storage, business technology
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the recent trend is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
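The row-store versus column-store distinction this abstract leans on can be shown with plain Python data structures. This is a conceptual sketch of why columnar layouts suit aggregation, not a depiction of how BLU Acceleration works internally.

```python
# Conceptual sketch of why column stores suit analytic queries: an
# aggregate touches one dense column instead of every full row.
rows = [  # row-oriented: good for transactional "fetch whole record"
    {"id": 1, "region": "east", "amount": 120.0},
    {"id": 2, "region": "west", "amount": 75.5},
    {"id": 3, "region": "east", "amount": 42.0},
]

columns = {  # column-oriented: good for scanning one attribute
    "id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "amount": [120.0, 75.5, 42.0],
}

# The row store visits every record to sum one attribute...
total_rows = sum(r["amount"] for r in rows)
# ...while the column store scans a single contiguous list.
total_cols = sum(columns["amount"])
assert total_rows == total_cols
print(total_cols)
```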
Tags : 
memory analytics, database, efficiency, acceleration technology, aggregate data
    
IBM
Published By: Pentaho     Published Date: Mar 08, 2016
If you’re evaluating big data integration platforms, you know that with the increasing number of tools and technologies out there, it can be difficult to separate meaningful information from the hype and to identify the right technology for your unique big data problem. This analyst research provides a concise overview of big data integration technologies and reviews key things to consider when creating an integrated big data environment that blends new technologies with existing BI systems to meet your business goals. Read the Buyer’s Guide to Big Data Integration by CITO Research to learn:
• What tools are most useful for working with big data, Hadoop, and existing transactional databases
• How to create an effective “data supply chain”
• How to succeed with complex data onboarding, using automation for more reliable data ingestion (see the sketch after this list)
• The best ways to connect, transport, and transform data for data exploration, analytics, and compliance
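As a tiny, concrete instance of the “data supply chain” idea, the sketch below ingests a new CSV feed and blends it with existing transactional records. The feed contents, field names, and in-memory stand-ins are invented for illustration.

```python
# Tiny "data supply chain" sketch: ingest a new CSV feed and blend it
# with existing transactional records. Fields and data are invented.
import csv
import io

# Stand-in for a freshly onboarded feed (would normally be a file).
feed = io.StringIO("customer_id,clicks\n1,14\n2,3\n")

# Stand-in for rows already in the transactional database.
transactions = {1: {"customer_id": 1, "orders": 5},
                2: {"customer_id": 2, "orders": 1}}

# Blend: enrich each transactional record with the new behavioral data.
for row in csv.DictReader(feed):
    cid = int(row["customer_id"])
    if cid in transactions:
        transactions[cid]["clicks"] = int(row["clicks"])

print(transactions)  # unified view, ready for exploration or analytics
```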
Tags : 
data, buyer guide, integration, technology, platform, research, enterprise applications
    
Pentaho