
Published By: DATAVERSITY     Published Date: Jul 06, 2015
The growth of NoSQL data storage solutions has revolutionized the way enterprises deal with their data. The older relational platforms are still used by most organizations, while implementations of various NoSQL platforms, including Key-Value, Wide Column, Document, Graph, and Hybrid data stores, are increasing faster than ever before. These implementations are causing enterprises to revise their Data Management procedures across the board, from governance to analytics, metadata management to software development, and data modeling to regulation and compliance. The time-honored techniques for data modeling are being rewritten, reworked, and modified in a multitude of ways, often wholly dependent on the NoSQL platform under development. This research report analyzes a 2015 DATAVERSITY® survey titled “Modeling NoSQL.” The survey examined a number of crucial issues within the NoSQL world today, with a particular focus on data modeling.
Tags : 
    
DATAVERSITY
Published By: Stardog Union     Published Date: Jul 27, 2018
When enterprises consider the benefits of data analysis, what's often overlooked is the challenge of data variety, and that most successful outcomes are driven by it. Yet businesses are still struggling with how to query distributed, heterogeneous data using a unified data model. Fortunately, Knowledge Graphs provide a schema-flexible solution based on modular, extensible data models that evolve over time to create a truly unified solution. How is this possible? Download and discover: • Why businesses should organize information using nodes and edges instead of rows, columns and tables • Why schema-free and schema-rigid solutions eventually prove to be impractical • The three categories of data diversity, including semantic and structural variety
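The nodes-and-edges organization described above can be sketched in a few lines of plain Python. This is an illustrative toy, not Stardog's data model; the node ids, edge label, and helper function are all hypothetical:

```python
# Relational style: the fact "Ada works in Research" as rows in fixed-schema tables.
employees = [{"id": 1, "name": "Ada", "dept_id": 10}]
departments = [{"id": 10, "name": "Research"}]

# Graph style: the same fact as nodes plus a labeled edge, with no fixed schema.
nodes = {1: {"name": "Ada"}, 10: {"name": "Research"}}
edges = [(1, "worksIn", 10)]  # (source node, edge label, destination node)

def neighbors(node_id, label):
    """Follow every edge with the given label out of node_id."""
    return [dst for src, lbl, dst in edges if src == node_id and lbl == label]

# Traversing an edge replaces the relational join.
dept_names = [nodes[d]["name"] for d in neighbors(1, "worksIn")]
```

Adding a new kind of relationship is just a new edge label; no table migration is needed, which is the schema flexibility the abstract refers to.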
Tags : 
    
Stardog Union
Published By: MarkLogic     Published Date: Jun 17, 2015
Modern enterprises face increasing pressure to deliver business value through technological innovation that leverages all available data. At the same time, those enterprises need to reduce expenses to stay competitive, deliver results faster to respond to market demands, use real-time analytics so users can make informed decisions, and develop new applications with enhanced developer productivity. All of these factors put big data at the top of the agenda. Unfortunately, the promise of big data has often failed to deliver. With the growing volumes of unstructured and multi-structured data flooding into our data centers, the relational databases that enterprises have relied on for the last 40 years are now too limiting and inflexible. New-generation NoSQL (“Not Only SQL”) databases have gained popularity because they are ideally suited to deal with the volume, velocity, and variety of data that businesses and governments handle today.
Tags : 
data, data management, database, marklogic, column store, wide column store, nosql
    
MarkLogic
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
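The row-versus-column trade-off the abstract describes can be shown with a toy table. This is an illustrative sketch in plain Python, not Oracle's implementation; the table and queries are hypothetical:

```python
# Row format: each record stored as one unit -- ideal for OLTP point lookups.
rows = [
    {"order_id": 1, "region": "EU", "amount": 100.0},
    {"order_id": 2, "region": "US", "amount": 250.0},
    {"order_id": 3, "region": "EU", "amount": 50.0},
]

# Column format: one array per attribute -- ideal for analytic scans.
columns = {
    "order_id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [100.0, 250.0, 50.0],
}

# OLTP-style access touches exactly one row.
order_2 = next(r for r in rows if r["order_id"] == 2)

# An analytic aggregate scans only the two columns it needs,
# skipping every attribute irrelevant to the query.
eu_total = sum(a for reg, a in zip(columns["region"], columns["amount"])
               if reg == "EU")
```

Holding both layouts in memory at once, as the abstract describes, lets each kind of query use the format that suits it.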
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data. In-memory databases have helped address these problems.
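The stale-data problem in the traditional two-system model can be made concrete with a small sketch. This is hypothetical plain Python, not any vendor's ETL tooling:

```python
# The OLTP database: live transactional rows.
oltp = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]

def etl(source):
    """Extract-transform-load: take a point-in-time copy for the warehouse."""
    return [dict(row) for row in source]

warehouse = etl(oltp)                   # e.g. a nightly batch run
oltp.append({"id": 3, "amount": 50})    # a new transaction arrives afterwards

# Analytics against the warehouse misses the new row until the next ETL run.
live_total = sum(r["amount"] for r in oltp)
stale_total = sum(r["amount"] for r in warehouse)
```

The gap between `live_total` and `stale_total` is exactly the "stale data" the abstract warns about; running analytics directly on in-memory operational data closes it.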
Tags : 
    
Oracle CX
Published By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
This ESG Lab review documents the results of recent testing of the Oracle SPARC M7 processor with a focus on in-memory database performance for the real-time enterprise. Leveraging new advanced features like columnar compression and on-chip in-memory query acceleration, ESG Lab compared the in-memory database performance of a SPARC T7 system with a SPARC M7 processor to an x86-based system.
Tags : 
    
Oracle PaaS/IaaS/Hardware
Published By: IBM     Published Date: May 19, 2016
In our 21-criteria evaluation of the dynamic case management (DCM) market, we identified the 14 most significant software vendors — Appian, bpm’online, Column Technologies, DST Systems, Eccentex, IBM, Isis Papyrus, Lexmark Enterprise Software, MicroPact, Newgen Software, OnBase by Hyland, OpenText, Pegasystems, and TIBCO Software — and researched, analyzed, and scored them. The evaluation focused on providers’ adaptive, analytics, and mobile features, all critical to helping enterprises tackle increasing volumes of varied and unstructured work. This report helps enterprise architecture (EA) professionals select the best providers to meet their unique needs.
Tags : 
ibm, forrester, forrester wave, dynamic case management, dcm, software vendors, software, enterprise applications, business technology
    
IBM
Published By: IBM     Published Date: Jul 21, 2016
IBM's recently released DB2 version 11.1 for Linux, Unix and Windows (LUW) is a hybrid database that IBM says can handle transactional and analytic workloads thanks to its BLU Acceleration technology, which features an in-memory column store for analytical workloads that can scale across a massively parallel cluster.
Tags : 
ibm, db2, analytics, mpp, data warehousing
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the trend as of late is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
Tags : 
memory analytics, database, efficiency, acceleration technology, aggregate data
    
IBM
Published By: Alteryx, Inc.     Published Date: Apr 21, 2017
Analysts struggle to incorporate new sources of data into their analysis because they rely on Microsoft Excel or other tools that were not designed for data blending. Deleting columns, parsing data, and writing complicated formulas to clean and combine data every time it changes is not an efficient way for today’s analysts to spend their time. Download The Definitive Guide to Data Blending and: • Understand how analysts are empowered through data blending • Learn how to automate time-consuming, manual data preparation tasks • Gain deeper business insights in hours, not the weeks typical of traditional approaches
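The manual steps named above (deleting columns, parsing values, combining sources) can be captured once as a repeatable function instead of being redone by hand on every refresh. This is an illustrative plain-Python sketch, not Alteryx; the data and field names are hypothetical:

```python
# Raw export with messy strings and an unused column.
sales_csv = [
    {"id": "1", "amount": " 100 ", "note": "x"},
    {"id": "2", "amount": "250", "note": "y"},
]
# A second source to blend in.
regions = {"1": "EU", "2": "US"}

def blend(sales, region_lookup):
    """Drop unused columns, parse/clean values, and join in the second source."""
    out = []
    for row in sales:
        out.append({
            "id": row["id"],                             # keep only needed columns
            "amount": float(row["amount"].strip()),      # parse and clean
            "region": region_lookup[row["id"]],          # blend second source
        })
    return out

blended = blend(sales_csv, regions)
```

When the source data changes, rerunning `blend` reproduces the result with no manual column deletion or formula editing, which is the automation the guide argues for.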
Tags : 
    
Alteryx, Inc.
Published By: MarkLogic     Published Date: Mar 29, 2018
Executives, managers, and users will not trust data unless they understand where it came from. Enterprise metadata is the “data about data” that makes this trust possible. Unfortunately, many healthcare and life sciences organizations struggle to collect and manage metadata with their existing relational and column-family technology tools. MarkLogic’s multi-model architecture makes it easier to manage metadata, and build trust in the quality and lineage of enterprise data. Healthcare and life sciences companies are using MarkLogic’s smart metadata management capabilities to improve search and discovery, simplify regulatory compliance, deliver more accurate and reliable quality reports, and provide better customer service. This paper explains the essence and advantages of the MarkLogic approach.
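The idea of "data about data" that enables trust can be sketched as a record that carries its own provenance. The field names and lineage format below are illustrative, not MarkLogic's metadata model:

```python
# A clinical value bundled with metadata describing where it came from
# and what was done to it -- the basis for trusting (or auditing) it.
record = {
    "value": {"patient_id": "P-17", "ldl": 128},
    "metadata": {
        "source": "lab_feed_v2",
        "loaded_at": "2018-03-01T09:00:00Z",
        "transforms": ["unit_normalize", "dedupe"],
    },
}

def lineage(rec):
    """Render a human-readable provenance trail for audit and compliance."""
    m = rec["metadata"]
    return m["source"] + " -> " + " -> ".join(m["transforms"])

trail = lineage(record)
```

Keeping metadata alongside the value in one record, rather than scattered across relational side tables, is the kind of simplification a multi-model store makes possible.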
Tags : 
enterprise, metadata, management, organizations, technology, tools, marklogic
    
MarkLogic
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
MongoDB is an open-source, document database designed with both scalability and developer agility in mind. MongoDB bridges the gap between key-value stores, which are fast and scalable, and relational databases, which have rich functionality. Instead of storing data in rows and columns as one would with a relational database, MongoDB stores JSON documents with dynamic schemas. Customers should consider three primary factors when evaluating databases: technological fit, cost, and topline implications. MongoDB's flexible and scalable data model, robust feature set, and high-performance, high-availability architecture make it suitable for a wide range of database use cases. Given that in many cases relational databases may also be a technological fit, it is helpful to consider the relative costs of each solution when evaluating which database to adopt.
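The contrast between rows-and-columns and JSON documents with dynamic schemas can be shown with one order modeled both ways. This is a plain-Python illustration of the data shapes, not MongoDB's driver API; all names are hypothetical:

```python
# Relational style: the order normalized across two fixed-schema tables.
orders_rows = [{"order_id": 1, "customer": "Ada"}]
items_rows = [
    {"order_id": 1, "sku": "A-1", "qty": 2},
    {"order_id": 1, "sku": "B-9", "qty": 1},
]

# Document style: one self-contained JSON-like object.
order_doc = {
    "order_id": 1,
    "customer": "Ada",
    "items": [{"sku": "A-1", "qty": 2}, {"sku": "B-9", "qty": 1}],
    "gift_wrap": True,   # dynamic schema: a new field needs no migration
}

# Reading the relational form requires a join across tables...
joined = [i for i in items_rows
          if i["order_id"] == orders_rows[0]["order_id"]]
# ...while the document form is a single lookup.
doc_items = order_doc["items"]
```

Retrieving the whole order as one document, and adding fields like `gift_wrap` without schema changes, is the bridge between key-value speed and relational richness that the abstract describes.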
Tags : 
total, cost, ownership, comparison, mongodb, oracle
    
AstuteIT_ABM_EMEA
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Today’s businesses generate staggering amounts of data, and learning to get the most value from that data is paramount to success. Just as Amazon Web Services (AWS) has transformed IT infrastructure to something that can be delivered on demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Amazon Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. Organizations choose Amazon Redshift for its affordability, flexibility, and powerful feature set: • Enterprise-class relational database query and management system • Supports client connections with many types of applications, including business intelligence (BI), reporting, data, and analytics tools • Executes analytic queries in order to retrieve, compare, and evaluate large amounts of data in multiple-stage operations
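The massively parallel idea behind a store like this can be sketched in miniature: a column is split into slices, each worker aggregates its slice, and a leader combines the partial results. This is a hedged toy model in plain Python, not Redshift's architecture or API:

```python
# A single numeric column, as a columnar store would hold it.
amounts = [100.0, 250.0, 50.0, 75.0]

def split(col, n_slices):
    """Distribute a column's values round-robin across n_slices workers."""
    return [col[i::n_slices] for i in range(n_slices)]

slices = split(amounts, 2)            # "distribute" across 2 nodes
partials = [sum(s) for s in slices]   # each node scans only its own slice
total = sum(partials)                 # the leader combines partial results
```

Because each node scans a fraction of the column in parallel, adding nodes shortens the scan, which is what lets such systems work through billions of rows quickly.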
Tags : 
    
Amazon Web Services
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Just as Amazon Web Services (AWS) has transformed IT infrastructure to something that can be delivered on demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. It’s designed for speed and ease of use — but to realize all of its potential benefits, organizations still have to configure Redshift for the demands of their particular applications. Whether you’ve been using Redshift for a while, have just implemented it, or are still evaluating it as one of many cloud-based data warehouse and business analytics technology options, your organization needs to understand how to configure it to ensure it delivers the right balance of performance, cost, and scalability for your particular usage scenarios.
Tags : 
    
Amazon Web Services
Published By: Oracle Corp.     Published Date: Oct 15, 2012
Please join Tom Kyte, author of the AskTom column and Senior Technical Architect at Oracle, to learn about the threats every IT database and security administrator needs to be aware of. Tom will also discuss best practices for securing your databases.
Tags : 
database security, compliance, sql, it security, oracle
    
Oracle Corp.
Published By: Oracle Corporation     Published Date: May 11, 2012
Exadata Hybrid Columnar Compression is an enabling technology for two new Oracle Exadata Storage Server features: Warehouse Compression and Archive Compression. We will discuss each of these features in detail later in this paper, but first let's explore Exadata Hybrid Columnar Compression - the next generation in compression technology.
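The hybrid-columnar idea can be illustrated in miniature: rows are grouped into compression units, and within each unit the values are pivoted column-by-column, where repeated values compress well. This is a toy run-length-encoding sketch in plain Python, not Oracle's on-disk format:

```python
# A group of rows destined for one compression unit (region, product).
rows = [
    ("EU", "widget"), ("EU", "widget"), ("EU", "gadget"), ("US", "gadget"),
]

def compression_unit(row_group):
    """Pivot a row group to columns, then run-length encode each column."""
    unit = []
    for col in zip(*row_group):         # column-by-column within the unit
        rle, prev = [], None
        for v in col:
            if rle and v == prev:
                rle[-1] = (v, rle[-1][1] + 1)   # extend the current run
            else:
                rle.append((v, 1))              # start a new run
            prev = v
        unit.append(rle)
    return unit

unit = compression_unit(rows)
# The region column collapses to [("EU", 3), ("US", 1)].
```

Storing column-wise only *within* each unit keeps a whole row's data physically close together (the "hybrid" part), while still exploiting the redundancy within each column.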
Tags : 
oracle, data warehousing, database, exadata, database machine, infrastructure, operation, operation costs, mobile, growth, payback, architecture, demands, enterprise applications, data management
    
Oracle Corporation
Published By: Vertica     Published Date: Aug 15, 2010
If you are responsible for BI (Business Intelligence) in your organization, there are three questions you should ask yourself: - Are there applications in my organization for combining operational processes with analytical insight that we can't deploy because of performance and capacity constraints with our existing BI environment?
Tags : 
business intelligence, vertica, aggregated data, olap, rolap, sql, query, data warehouse, oltp, analytical applications, database development, data integration, data mining, data quality, service oriented architecture, information management, data warehousing
    
Vertica