Data Analysis

Published By: DATAVERSITY     Published Date: Oct 12, 2017
The foundation of this report is a survey conducted by DATAVERSITY® that included a range of question types and topics on the current state of Data Architecture. The report evaluates the topic through a discussion and analysis of each survey question, along with a deeper examination of present and future trends.
DATAVERSITY
Published By: Ted Hills     Published Date: Mar 08, 2017
This paper explores the differences between three situations that appear on the surface to be very similar: a data attribute that may occur zero or one times, a data attribute that is optional, and a data attribute whose value may be unknown. It shows how each of these different situations is represented in Concept and Object Modeling Notation (COMN, pronounced “common”). The theory behind the analysis is explained in greater detail by three papers: Three-Valued Logic, A Systematic Solution to Handling Unknown Data in Databases, and An Approach to Representing Non-Applicable Data in Relational Databases.
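As a concrete illustration of the distinction (a minimal sketch, not COMN notation itself), the following Python snippet uses SQLite to show how SQL's three-valued logic treats an unknown value: the optional middle_name column can hold NULL, and a NULL value matches neither an equality nor an inequality test. All table and column names here are invented for the example.

    # Minimal sketch of three-valued logic: NULL (unknown) is neither
    # equal nor unequal to anything, so it fails both predicates below.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE person (name TEXT NOT NULL, middle_name TEXT)")
    conn.execute("INSERT INTO person VALUES ('Ada', 'Byron')")  # value present
    conn.execute("INSERT INTO person VALUES ('Bob', NULL)")     # value unknown

    print(conn.execute("SELECT name FROM person WHERE middle_name = 'Byron'").fetchall())
    # [('Ada',)] -- the NULL row is excluded
    print(conn.execute("SELECT name FROM person WHERE middle_name <> 'Byron'").fetchall())
    # [] -- NULL <> 'Byron' evaluates to UNKNOWN, so Bob is excluded here too
    print(conn.execute("SELECT name FROM person WHERE middle_name IS NULL").fetchall())
    # [('Bob',)] -- only IS NULL matches the unknown value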
Ted Hills
Published By: CA Technologies     Published Date: Apr 24, 2013
Using ERwin Data Modeler & Microsoft SQL Azure to Move Data to the Cloud within the DaaS Lifecycle, by Nuccio Piscopo. Cloud computing is one of the major growth areas in the world of IT. This article provides an analysis of how to apply the DaaS (Database as a Service) lifecycle when working with ERwin and the SQL Azure platform. It should help enterprises obtain the benefits of DaaS and take advantage of its potential for improving and transforming data models in the Cloud. The use case introduced identifies key actions, requirements, and practices that can help formulate a plan for successfully moving data to the Cloud.
CA Technologies
Published By: MemSQL     Published Date: Jun 25, 2014
Emerging business innovations focused on realizing quick business value on new and growing data sources require “hybrid transactional and analytical processing” (HTAP), the notion of performing analysis on data directly in an operational data store. While this is not a new idea, Gartner reports that the potential for HTAP has not been fully realized due to technology limitations and inertia in IT departments. MemSQL offers a unique combination of performance, flexibility, and ease of use that allows companies to implement HTAP to power their business applications.
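A hypothetical sketch of the HTAP pattern the abstract describes, with SQLite standing in for a distributed in-memory database such as MemSQL: transactional writes and analytical aggregates run against the same operational table, with no ETL step in between. Schema and data are invented.

    # HTAP sketch: the operational store serves both the transactional
    # write stream and the analytical aggregate, with no ETL in between.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

    # Operational workload: a stream of individual order transactions.
    for i, (region, amount) in enumerate([("east", 120.0), ("west", 75.5), ("east", 200.0)]):
        db.execute("INSERT INTO orders VALUES (?, ?, ?)", (i, region, amount))
    db.commit()

    # Analytical workload: aggregate directly over the live operational data.
    for row in db.execute("SELECT region, COUNT(*), SUM(amount) FROM orders GROUP BY region"):
        print(row)  # e.g. ('east', 2, 320.0)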
MemSQL
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
Tags : log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data
Skytree
Published By: Data Ninja     Published Date: Apr 16, 2017
Text is part of every communication channel, from social media and documents to logs and databases. By adding structure to free text with text analytics and graph databases, text becomes valuable business data. To use the information in text, you need to extract it in a way that yields useful information about entities, locations, organizations, and their properties. Graph databases are powerful for showing text relationships, including nearest neighbors, clusters, and shortest paths. This paper examines a real-life use case in risk analysis, showing how the combination of text analytics and graph databases can be used to solve business problems.
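A toy sketch of the idea, assuming entities have already been extracted from the text; a real deployment would use an NLP extraction service and a graph database rather than this hand-built structure. Here a breadth-first search stands in for a graph database's shortest-path query, and all entity names are invented.

    # Toy sketch: extracted entities become graph nodes; BFS finds the
    # shortest relationship path between two of them.
    from collections import deque

    # (entity, related_entity) pairs as might be extracted from documents
    edges = [("Acme Corp", "J. Smith"), ("J. Smith", "Offshore LLC"),
             ("Offshore LLC", "Risk Event 42"), ("Acme Corp", "London")]

    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    def shortest_path(start, goal):
        """Breadth-first search over the undirected entity graph."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in graph.get(path[-1], ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    print(shortest_path("Acme Corp", "Risk Event 42"))
    # ['Acme Corp', 'J. Smith', 'Offshore LLC', 'Risk Event 42']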
Data Ninja
Published By: Alteryx     Published Date: May 24, 2017
Spreadsheets are a mainstay in almost every organization, and they are a great way to calculate and manipulate numeric data for decision-making. Unfortunately, as organizations grow, so does their data, and relying on spreadsheet-based tools like Excel for heavy data preparation, blending, and analysis becomes cumbersome and unreliable. Alteryx, Inc. is a leader in self-service data analytics and gives analysts the ability to easily prep, blend, and analyze all their data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. This paper highlights how transitioning from a spreadsheet-based environment to an Alteryx workflow approach can help analysts better understand their data, improve consistency, and operationalize analytics through a flexible deployment and consumption environment.
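Not Alteryx's own tooling, but a hedged illustration in plain Python of what a repeatable prep-blend-analyze workflow means: each step is captured as code that can be re-run on updated inputs, unlike one-off spreadsheet edits. All file contents and field names are invented.

    # Repeatable prep/blend/analyze workflow in plain Python (Alteryx
    # expresses the same idea visually). Re-running it on new inputs
    # reproduces the analysis exactly.
    import csv, io

    sales_csv = "region,amount\neast,120\nwest,75\neast,200\n"
    targets_csv = "region,target\neast,300\nwest,100\n"

    def load(text):
        return list(csv.DictReader(io.StringIO(text)))

    # Prep: parse and type the inputs. Blend: join the sources on region.
    targets = {r["region"]: float(r["target"]) for r in load(targets_csv)}
    totals = {}
    for row in load(sales_csv):
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])

    # Analyze: compare actuals to targets.
    for region, total in sorted(totals.items()):
        print(region, total, f"{total / targets[region]:.0%} of target")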
Alteryx
Published By: DATAVERSITY     Published Date: Jun 14, 2013
This report analyzes many of the challenges faced when beginning a new Data Governance program and outlines the elements crucial to executing such a program successfully. “Data Governance” is a term fraught with nuance, misunderstanding, myriad opinions, and fear; it is often enough to keep Data Stewards and senior executives awake late into the night. The modern enterprise needs reliable and sustainable control over its technological systems, business processes, and data assets. Such control is essential to competitive success in an ever-changing marketplace driven by the exponential growth of data, mobile computing, social networking, the need for real-time analytics and reporting, and increasing regulatory compliance requirements. Data Governance can enhance and buttress (or resuscitate, if needed) the strategic and tactical business drivers every enterprise needs for market success. This paper is sponsored by: ASG, DGPO and DebTech International.
Tags : data, data management, data governance, data steward, dataversity, research paper
DATAVERSITY
Published By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven by modernizing their data management deployments. These strategies bring challenges, such as managing large and growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risk, cost, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted lo
Oracle
Published By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at an amazing rate, and that growth will continue. New techniques in data processing and analytics, including AI and machine and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Processing that data requires computer systems built from multi-core CPUs or GPUs using parallel processing and extremely fast networks. However, legacy storage solutions are based on architectures that are decades old, unscalable, and not well suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet the performance needs of data analytics.
Tags : reporting, artificial intelligence, insights, organization, institution, recognition
Pure Storage
Published By: Oracle     Published Date: Oct 20, 2017
In today’s IT infrastructure, data security can no longer be treated as an afterthought, because billions of dollars are lost each year to computer intrusions and data exposures. The issue is compounded by the aggressive build-out of cloud computing. Big data and machine learning applications that perform tasks such as fraud and intrusion detection, trend detection, and click-stream and social media analysis all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Companies increasingly need to drive up the speed of business, and organizations need to support their customers with real-time data. Managing sensitive information while capturing, analyzing, and acting upon massive volumes of data every hour of every day has become critical. These challenges have dramatically changed the way IT systems are architected, provisioned, and run compared to the past few decades. Most compani
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operatio
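A schematic sketch of why column format favors analytics; this illustrates the general row-versus-column trade-off, not Oracle's in-memory implementation. An aggregate over one field scans a single contiguous array in columnar form instead of walking every full record.

    # Schematic contrast between row and column layouts. All data invented.

    # Row format: one record per transaction -- natural for OLTP inserts/updates.
    rows = [
        {"id": 1, "region": "east", "amount": 120.0},
        {"id": 2, "region": "west", "amount": 75.5},
        {"id": 3, "region": "east", "amount": 200.0},
    ]

    # Column format: one array per field -- natural for scans and aggregates.
    columns = {
        "id": [1, 2, 3],
        "region": ["east", "west", "east"],
        "amount": [120.0, 75.5, 200.0],
    }

    print(sum(r["amount"] for r in rows))   # row store: touch every record
    print(sum(columns["amount"]))           # column store: touch one array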
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business, so it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database for advanced analytics that uncovers deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data. In-memory databases have helped address p
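A toy version of the traditional two-copy model described above, using SQLite for both stores; tables and data are invented. The point it illustrates: between ETL runs, the warehouse copy is stale relative to the operational database.

    # Toy two-copy model: an ETL job copies OLTP rows into a separate
    # warehouse, so analysis between runs sees stale data.
    import sqlite3

    oltp = sqlite3.connect(":memory:")
    warehouse = sqlite3.connect(":memory:")
    oltp.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    warehouse.execute("CREATE TABLE orders_fact (id INTEGER, amount REAL)")

    oltp.execute("INSERT INTO orders VALUES (1, 120.0)")

    def run_etl():
        """Extract all OLTP rows and load them into the warehouse copy."""
        rows = oltp.execute("SELECT id, amount FROM orders").fetchall()
        warehouse.execute("DELETE FROM orders_fact")
        warehouse.executemany("INSERT INTO orders_fact VALUES (?, ?)", rows)

    run_etl()
    oltp.execute("INSERT INTO orders VALUES (2, 75.5)")  # arrives after the ETL run

    # The warehouse total stays stale until the next ETL cycle.
    print(warehouse.execute("SELECT SUM(amount) FROM orders_fact").fetchone())  # (120.0,)
    print(oltp.execute("SELECT SUM(amount) FROM orders").fetchone())            # (195.5,)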
Oracle
Published By: Ounce Labs, an IBM Company     Published Date: Dec 15, 2009
Today, when you make decisions about information technology (IT) security priorities, you must often strike a careful balance between business risk, the impact and likelihood of incidents, and the costs of prevention or cleanup. Historically, the best-understood variable in this equation was the set of methods hackers used to disrupt or invade a system.
Tags : ounce labs, it security, it risk, software applications, pci dss, hipaa, glba, data security, source code vulnerabilities
Ounce Labs, an IBM Company
Published By: Ounce Labs, an IBM Company     Published Date: Jul 08, 2009
The Business Case for Data Protection, conducted by Ponemon Institute and sponsored by Ounce Labs, is the first study to determine what senior executives think about the value proposition of corporate data protection efforts within their organizations. In times of shrinking budgets, it is important for those individuals charged with managing a data protection program to understand how key decision makers in organizations perceive the importance of safeguarding sensitive and confidential information.
Tags : ounce labs, it security, it risk, software applications, ciso, pci dss, hipaa, glba, data security
Ounce Labs, an IBM Company
Published By: SAP     Published Date: Feb 03, 2017
The SAP HANA platform provides a powerful unified foundation for storing, processing, and analyzing structured and unstructured data. It runs on a single in-memory database, eliminating data redundancy and speeding up information retrieval and analysis.
SAP
Published By: HP Enterprise Business     Published Date: Mar 02, 2017
Powered by data from 451 Research, the Right Mix web application benchmarks your current private vs. public cloud mix, business drivers, and workload deployment venues against industry peers to create a comparative analysis. See how your mix stacks up, then download the 451 Research report for robust insights into the state of the hybrid IT market.
HP Enterprise Business
Published By: Pentaho     Published Date: Nov 04, 2015
Although the phrase “next-generation platforms and analytics” can evoke images of machine learning, big data, Hadoop, and the Internet of Things, most organizations are somewhere between that technology vision and today’s reality of BI and dashboards. Next-generation platforms and analytics often mean simply pushing past reports and dashboards to more advanced forms of analytics, such as predictive analytics. Next-generation analytics might move your organization from visualization to big data visualization; from slicing and dicing data to predictive analytics; or to using more than just structured data for analysis.
Tags : pentaho, analytics, platforms, hadoop, big data, predictive analytics, networking, it management
Pentaho
Published By: Cisco     Published Date: Jun 21, 2016
The demands on IT today are staggering. Most organizations depend on their data to drive everything from product development and sales to communications, operations, and innovation. As a result, IT departments are charged with finding a way to bring new applications online quickly, accommodate massive data growth and complex data analysis, and make data available 24 hours a day, around the world, on any device. The traditional way to deliver data services is with separate infrastructure silos for various applications, processes, and locations, resulting in continually escalating costs for infrastructure and management. These infrastructure silos make it difficult to respond quickly to business opportunities and threats, cause productivity-hindering delays when you need to scale, and drive up operational costs.
Cisco
Published By: Adobe     Published Date: Nov 09, 2017
Patients are going digital, and they are taking the healthcare system with them. Learn how in the 2017 Digital Trends in Healthcare and Pharma report. Download it now to learn why two-thirds of healthcare companies are investing in data analysis, how they are building content marketing programs to boost patient knowledge, and what they plan to do with virtual and augmented reality this year and beyond.
Adobe
Published By: Cloudian     Published Date: Feb 15, 2018
We are living in an age of explosive data growth. IDC projects that the digital universe is growing 50% a year, doubling in size every 2 years. In media and entertainment, the growth is even faster as capacity-intensive formats such as 4K, 8K, and 360/VR gain traction. Fortunately, new trends in data storage are making it easier to stay ahead of the curve. In this paper, we will examine how object storage stacks up against LTO tape for media archives and backup. In addition to a detailed total cost of ownership (TCO) analysis covering both capital and operational expenses, this paper will look at the opportunity costs of not leveraging the real-time data access of object storage to monetize existing data. Finally, we will demonstrate the validity of the analysis with a real-world case study of a longstanding network TV show that made the switch from tape to object storage. The limitations of tape storage go way beyond its lack of scalability. Data that isn’t searchable is becoming
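As a rough illustration of the shape of such a TCO comparison (every figure below is an invented placeholder, not a number from the paper): capital expense plus recurring operating expense over the evaluation period, with the opportunity value of accessible archives noted separately.

    # Shape of a simple TCO comparison like the one the paper performs.
    # Every figure below is an invented placeholder, not data from the paper.
    def tco(capex, opex_per_year, years):
        """Total cost of ownership: upfront capital plus recurring operations."""
        return capex + opex_per_year * years

    YEARS = 5
    tape = tco(capex=80_000, opex_per_year=30_000, years=YEARS)       # drives, media, handling
    objstore = tco(capex=120_000, opex_per_year=15_000, years=YEARS)  # denser ops, less labor

    print(f"tape    {YEARS}-yr TCO: ${tape:,}")
    print(f"object  {YEARS}-yr TCO: ${objstore:,}")
    # A full analysis would also credit object storage's real-time access,
    # e.g. revenue from monetizing archived content that tape leaves idle.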
Cloudian