analytics

Results 201 - 225 of 2985
Published By: Attunity     Published Date: Nov 15, 2018
IT departments today face serious data integration hurdles when adopting and managing a Hadoop-based data lake. Many lack the ETL and Hadoop coding skills required to replicate data across these large environments. In this whitepaper, learn how you can provide automated Data Lake pipelines that accelerate and streamline your data lake ingestion efforts, enabling IT to deliver more data, ready for agile analytics, to the business.
Tags : 
    
Attunity
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges, including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about:
• Common data lake origins and challenges, including integrating diverse data from multiple source platforms, with lakes on premises and in the cloud.
• Delivering real-time integration with change data capture (CDC) technology that integrates live transactions with the data lake.
• Rethinking the data lake with a multi-stage methodology: continuous data ingestion and merging processes that assemble a historical data store.
• Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights.
Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
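The merge step of the multi-stage methodology described above can be sketched in a few lines. This is an illustrative toy, not Attunity's implementation; the event format and function names are hypothetical:

```python
# Hypothetical sketch of the CDC merge stage: change events are appended
# to a historical log while being merged into a current-state store.

def apply_change_events(events, current, history):
    """Merge CDC events into `current` (latest state per key) and
    append each event to `history` (the full audit trail)."""
    for event in events:
        history.append(event)              # historical data store keeps everything
        key = event["key"]
        if event["op"] == "delete":
            current.pop(key, None)
        else:                              # "insert" or "update"
            current[key] = event["data"]
    return current, history

events = [
    {"op": "insert", "key": 1, "data": {"qty": 5}},
    {"op": "update", "key": 1, "data": {"qty": 7}},
    {"op": "delete", "key": 1, "data": None},
]
state, log = apply_change_events(events, current={}, history=[])
print(state)     # {}  — the row is gone from current state
print(len(log))  # 3  — but all three changes survive for history
```

The point of the two stores is that analytics can query either the latest state or the full change history without re-reading the source system.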
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data, big data, hadoop, agile analytics, cloud data lake, cloud data warehouse, data lake ingestion
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
This technical whitepaper by Radiant Advisors covers key findings from their work with a network of Fortune 1000 companies and clients from various industries. It assesses the major trends and tips to gain access to and optimize data streaming for more valuable insights. Read this report to learn from real-world successes in modern data integration, and better understand how to maximize the use of streaming data. You will also learn about the value of populating a cloud data lake with streaming operational data, leveraging database replication, automation and other key modern data integration techniques. Download this whitepaper today to learn about the latest approaches to modern data integration and streaming data technologies.
Tags : 
streaming data, cloud data lakes, data lake, cloud, change data capture, cloud computing, modern data integration, data integration, data analytics, cloud-based data lake, enterprise data, self-service data
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
Read this technical whitepaper to learn how data architects and DBAs can avoid the struggle of complex scripting for Kafka in modern data environments. You’ll also gain tips on how to avoid the time-consuming hassle of manually configuring data producers and data type conversions. Specifically, this paper will guide you on how to overcome these challenges by leveraging innovative technology such as Attunity Replicate. The solution can easily integrate source metadata and schema changes for automated configuration of real-time data feeds, following best practices.
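As an illustration of the manual work such automation replaces, the sketch below (not the Attunity Replicate API; all names are hypothetical) shows the kind of data type conversion and metadata wrapping that otherwise has to be hand-coded for every Kafka data producer:

```python
import json
from datetime import datetime
from decimal import Decimal

# Hypothetical sketch: convert source data types automatically and attach
# table metadata so each message is self-describing, instead of writing
# a bespoke producer and conversion script per source table.

def to_message(table, row):
    def convert(value):
        if isinstance(value, Decimal):
            return float(value)            # Kafka payloads are JSON here
        if isinstance(value, datetime):
            return value.isoformat()
        return value
    payload = {
        "metadata": {"table": table, "columns": sorted(row)},
        "data": {k: convert(v) for k, v in row.items()},
    }
    # The bytes returned are what a producer would send, e.g.
    # producer.send(topic, msg) with kafka-python.
    return json.dumps(payload).encode("utf-8")

msg = to_message("orders", {"id": 1, "amount": Decimal("9.99"),
                            "ts": datetime(2019, 2, 12, 8, 30)})
print(json.loads(msg)["data"]["amount"])  # 9.99
```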
Tags : 
data streaming, kafka, metadata integration, metadata, apache kafka, data integration, data analytics, database transactions, streaming environments, real-time data replication, data configuration
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
Read this checklist report, based on survey results from the Eckerson Group and the Business Application Research Center (BARC), which found that the number of companies using the cloud for data warehousing and BI has increased by nearly 50%. BI teams must address multiple issues, including data delivery, security, and portability, before moving to the cloud for its infinite scalability and elasticity. Read this report to understand all seven considerations and what, how, and why they impact the decision to move to the cloud.
Tags : 
cloud, business intelligence, analytics, cloud data, data lake, data warehouse automation tools, dwa, data warehouse, security and compliance, data movement, hybrid cloud, hybrid cloud environment, cross-platform automation, portability
    
Attunity
Published By: Avanade DACH     Published Date: May 08, 2018
In this six-step guide, we aim to help you solve your data challenges to prepare for advanced analytics, cognitive computing, machine learning and the resulting benefits of AI. We’ll show you how to get your data house in order, scale beyond the proof of concept stage, and develop an agile approach to data management. By continually repeating the steps in this guide, you’ll sharpen your data and shape it into a truly transformational business asset. You’ll be able to overcome some of the most common business problems, and work toward making positive changes: • Improve customer satisfaction • Reduce equipment outages • Increase marketing campaign ROI • Minimize fraud loss • Improve employee retention • Increase accuracy for financial forecasts
Tags : 
    
Avanade  DACH
Published By: Avanade DACH     Published Date: Aug 01, 2018
The better the data, the better the artificial intelligence. Do you want to understand your customers and their behavior better? Offer them a tailored customer experience? Or identify new business opportunities? It may not always be obvious, but the foundation of every well-functioning AI is data. In this guide, we show you in six steps how to organize your data in innovative ways. This creates the best possible foundation for getting the most out of artificial intelligence, cognitive computing, and machine learning in the future.
Tags : 
    
Avanade  DACH
Published By: Avi Networks     Published Date: Mar 07, 2018
Maximizing Operational Efficiency and Application Performance in VMware-Based Data Centers. Some of the most common challenges in VMware-based virtual data center environments include:
- Lack of visibility into applications and end-user experience
- Complex and error-prone operations
- High capital and operational costs
Review our solution brief to learn how the Avi Controller, the industry’s first solution that integrates application delivery with real-time analytics, is able to solve these challenges.
Tags : 
    
Avi Networks
Published By: Avi Networks     Published Date: May 14, 2018
Avi Vantage is the only solution that delivers built-in application analytics in addition to enterprise-grade load balancing and application security. With millions of data points collected in real time, the platform delivers network-DVR-like capabilities with the ability to record and display application analytics over specific time intervals (last 15 minutes, hour, day, week, etc.) or for individual transactions. These application insights, including total round-trip time for each transaction, application health scores, errors, end-user statistics, and security insights (DDoS attacks, SSL vulnerabilities, ciphers, etc.), simplify troubleshooting of applications.
Tags : 
    
Avi Networks
Published By: Avi Networks     Published Date: Aug 17, 2018
This whitepaper details how ADCs and virtual appliances are inflexible, expensive, and lacking in visibility, in contrast to the central management, autoscaling features, intelligent analytics, and flexibility offered by the Avi Vantage Platform. This document also outlines 7 key considerations that you should evaluate before you refresh your current load balancers:
1. Automation and Self-Service
2. Elasticity
3. Application Analytics and Troubleshooting
4. Performance
5. Hybrid Cloud
6. Ecosystem Integrations
7. Total Cost of Ownership
Tags : 
    
Avi Networks
Published By: Aviatrix     Published Date: Jun 11, 2018
Once you've designed and secured your Global Transit Network, are you done? Are you ready to hand day-to-day responsibility over to an operations team? Or are there other elements you need to ensure that the day-to-day operation of your transit hub is efficient and effective? As part of our fact-filled AWS Bootcamp series, Aviatrix CTO Sherry Wei and Neel Kamal, head of field operations at Aviatrix, demonstrate the best practices they've gleaned from working with operations teams, all of whom require:
• Visibility: Do you have a way to centrally view your network, see performance bottlenecks, control security policies, and set other configuration details?
• Deep Analytics: Can you easily gather performance and audit data and export it to Splunk, DataDog, or other advanced reporting tools?
• Monitoring and Troubleshooting: Do you have a real-time view of network health, and how easily can you access the data needed to locate and fix issues?
• Alert Management: When issues do occur, what r…
Tags : 
aws, aws vpc, aws global transit network, aws transit vpc, cisco csr, csr 1000v
    
Aviatrix
Published By: AWS     Published Date: Jun 15, 2016
Netflix, one of the world’s leading Internet television networks, is using AWS to deliver billions of hours of content monthly, and run its analytics platform for optimum performance of its global service.
Tags : 
    
AWS
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
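A minimal schema-on-read sketch of this idea, with hypothetical paths and record shapes (a local directory stands in for a real data lake store such as Amazon S3):

```python
import json
import os
import tempfile

# Toy data lake: heterogeneous records are stored as-is, partitioned by
# source, with no predefined schema. Structure is applied only at read time.

def ingest(lake_root, source, records):
    path = os.path.join(lake_root, source)
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "part-0000.jsonl"), "a") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")        # stored as-is, schema-free

def scan(lake_root, source):
    path = os.path.join(lake_root, source, "part-0000.jsonl")
    with open(path) as f:
        return [json.loads(line) for line in f]    # schema-on-read

lake = tempfile.mkdtemp()
ingest(lake, "clickstream", [{"page": "/home", "ms": 42}])
ingest(lake, "iot", [{"sensor": "t1", "temp": 21.5, "unit": "C"}])  # different shape, same lake
print(scan(lake, "iot")[0]["temp"])  # 21.5
```

Note that the two sources have entirely different record shapes; neither had to be declared up front, which is the property the abstract contrasts with traditional warehouse schemas.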
Tags : 
    
AWS
Published By: AWS     Published Date: Dec 15, 2017
Healthcare and Life Sciences organizations are using data to generate knowledge that helps them provide better patient care, enhances biopharma research and development, and streamlines operations across the product innovation and care delivery continuum. Next-Gen business intelligence (BI) solutions can help organizations reduce time-to-insight by aggregating and analyzing structured and unstructured data sets in real or near-real time. AWS and AWS Partner Network (APN) Partners offer technology solutions to help you gain data-driven insights to improve care, fuel innovation, and enhance business performance. In this webinar, you’ll hear from APN Partners Deloitte and hc1.com about their solutions, built on AWS, that enable Next-Gen BI in Healthcare and Life Sciences. Join this webinar to learn:
• How Healthcare and Life Sciences organizations are using cloud-based analytics to fuel innovation in patient care and biopharmaceutical product development.
• How AWS supports BI solutions f…
Tags : 
    
AWS
Published By: AWS     Published Date: Apr 27, 2018
Until recently, businesses that were seeking information about their customers, products, or applications, in real time, were challenged to do so. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory, and more significantly, analyzed as it arrives, in real time. Millions to hundreds of millions of events (such as video streams or application alerts) can be collected and analyzed per hour to deliver insights that can be acted upon in an instant. From financial services to manufacturing, this rev…
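The shift from snapshot reporting to analyzing events as they arrive can be illustrated with a toy in-memory sliding-window counter. This is a hypothetical sketch, not the API of any particular streaming service:

```python
from collections import deque

# Toy stream analytics: keep only the last N seconds of events in memory,
# so "what are customers buying right now?" is answerable as data arrives
# rather than in yesterday's report.

class SlidingWindowCounter:
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()                 # (timestamp, item) pairs

    def observe(self, ts, item):
        self.events.append((ts, item))
        while self.events and self.events[0][0] <= ts - self.window:
            self.events.popleft()             # expire events outside the window

    def top_items(self):
        counts = {}
        for _, item in self.events:
            counts[item] = counts.get(item, 0) + 1
        return sorted(counts, key=counts.get, reverse=True)

w = SlidingWindowCounter(window_seconds=60)
for ts, sku in [(0, "a"), (10, "b"), (10, "b"), (65, "c")]:
    w.observe(ts, sku)                        # the t=0 event expires once t=65 arrives
print(w.top_items())  # ['b', 'c'] — 'a' has aged out of the 60-second window
```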
Tags : 
    
AWS
Published By: AWS     Published Date: May 18, 2018
We’ve become a world of instant information. We carry mobile devices that answer questions in seconds and we track our morning runs from screens on our wrists. News spreads immediately across our social feeds, and traffic alerts direct us away from road closures. As consumers, we have come to expect answers now, in real time. Until recently, businesses that were seeking information about their customers, products, or applications, in real time, were challenged to do so. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory, and more significantly, analyzed as it arrives, in real time.
Tags : 
    
AWS
Published By: AWS     Published Date: Jun 20, 2018
Data and analytics have become an indispensable part of gaining and keeping a competitive edge. But many legacy data warehouses introduce a new challenge for organizations trying to manage large data sets: only a fraction of their data is ever made available for analysis. We call this the “dark data” problem: companies know there is value in the data they collected, but their existing data warehouse is too complex, too slow, and just too expensive to use. A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: ability to run a query across heterogeneous sources of data
• Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
Tags : 
    
AWS
Published By: AWS     Published Date: Aug 20, 2018
A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: ability to run a query across heterogeneous sources of data
• Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
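The federated querying element can be illustrated with a toy example. Here two SQLite files stand in for heterogeneous sources, which is a deliberate simplification of what a modern warehouse actually federates:

```python
import os
import sqlite3
import tempfile

# Toy federated query: a single SQL statement joins tables that live in
# two physically separate databases ("sales" and "crm" sources).

d = tempfile.mkdtemp()
sales_db = os.path.join(d, "sales.db")
crm_db = os.path.join(d, "crm.db")

with sqlite3.connect(sales_db) as c:
    c.execute("CREATE TABLE orders (customer_id INT, amount REAL)")
    c.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (1, 5.0), (2, 7.5)])

with sqlite3.connect(crm_db) as c:
    c.execute("CREATE TABLE customers (id INT, name TEXT)")
    c.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

conn = sqlite3.connect(sales_db)
conn.execute("ATTACH DATABASE ? AS crm", (crm_db,))   # federate the second source
rows = conn.execute("""
    SELECT crm.customers.name, SUM(orders.amount)
    FROM orders JOIN crm.customers ON orders.customer_id = crm.customers.id
    GROUP BY crm.customers.name ORDER BY 2 DESC
""").fetchall()
print(rows)  # [('Acme', 15.0), ('Globex', 7.5)]
```

The design point is that the analyst writes one query; the engine, not the analyst, handles the fact that the data lives in more than one place.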
Tags : 
    
AWS
Published By: AWS     Published Date: Sep 04, 2018
Today’s businesses generate staggering amounts of data, and learning to get the most value from that data is paramount to success. Just as Amazon Web Services (AWS) has transformed IT infrastructure to something that can be delivered on-demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics.
Tags : 
    
AWS
Published By: AWS     Published Date: Sep 04, 2018
Just as Amazon Web Services (AWS) has transformed IT infrastructure to something that can be delivered on demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. It’s designed for speed and ease of use, but to realize all of its potential benefits, organizations still have to configure Redshift for the demands of their particular applications. Whether you’ve been using Redshift for a while, have just implemented it, or are still evaluating it as one of many cloud-based data warehouse and business analytics technology options, your organization needs to understand how to configure it to ensure it delivers the right balance of performance, cost, and scalability for your particular usage scenarios. Since starting to work with this technology…
Tags : 
    
AWS
Published By: AWS     Published Date: Sep 05, 2018
Big data alone does not guarantee better business decisions. Often that data needs to be moved and transformed so Insight Platforms can discern useful business intelligence. To deliver those results faster than traditional Extract, Transform, and Load (ETL) technologies, use Matillion ETL for Amazon Redshift. This cloud-native ETL/ELT offering, built specifically for Amazon Redshift, simplifies the process of loading and transforming data and can help reduce your development time. This white paper will focus on approaches that can help you maximize your investment in Amazon Redshift. Learn how the scalable, cloud-native architecture and fast, secure integrations can benefit your organization, and discover ways this cost-effective solution is designed with cloud computing in mind. In addition, we will explore how Matillion ETL and Amazon Redshift make it possible for you to automate data transformation directly in the data warehouse to deliver analytics and business intelligence (BI).
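The ELT pattern, loading raw data first and transforming it with SQL inside the warehouse, can be sketched as below. SQLite stands in for Amazon Redshift here, and the table names are hypothetical; Matillion's role in practice is to generate and orchestrate SQL like this:

```python
import sqlite3

# ELT sketch: (E)xtract and (L)oad raw records untouched, then
# (T)ransform them with SQL running inside the warehouse itself,
# rather than in an external ETL server.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INT, event TEXT, revenue TEXT)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", [
    (1, "purchase", "19.99"),
    (1, "purchase", "5.00"),
    (2, "refund",  "-5.00"),
])

# The transform step is plain SQL executed by the warehouse engine:
conn.execute("""
    CREATE TABLE revenue_by_user AS
    SELECT user_id, ROUND(SUM(CAST(revenue AS REAL)), 2) AS revenue
    FROM raw_events GROUP BY user_id
""")
result = conn.execute("SELECT * FROM revenue_by_user ORDER BY user_id").fetchall()
print(result)  # [(1, 24.99), (2, -5.0)]
```

Pushing the transform into the warehouse is what lets a massively parallel engine like Redshift do the heavy lifting instead of a separate ETL cluster.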
Tags : 
    
AWS
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
data, lake, amazon, web, services, aws
    
AWS
Published By: AWS     Published Date: Oct 30, 2018
Enhance the Availability & Performance of SAP HANA Apps Abstract: SAP HANA delivers powerful analytics capabilities that can help you improve business performance and drive digital transformation. You can more easily build reliable and performant SAP HANA-powered landscapes with SUSE Linux Enterprise Server for SAP Applications and Amazon Web Services (AWS). That’s because SUSE can help you achieve near zero downtime and sustain high-performance levels, while AWS delivers a broad and deep set of cloud services that are certified to fulfill the compute, memory, and storage requirements of SAP HANA.
Tags : 
availability, performance, sap, hana, apps
    
AWS
Published By: AWS     Published Date: Nov 15, 2018
It isn’t always easy to keep pace with today’s high volume of data, especially when it’s coming at you from a diverse number of sources. Tracking these analytics can place a strain on IT, who must provide the requested information to the C-suite and analysts. Unless this process can happen quickly, the insights grow stale. Download your complimentary ebook now to see how Matillion ETL for Amazon Redshift makes it easy for technical and business users alike to participate in and own the entire data and analysis process. With Matillion ETL for Amazon Redshift, everyone from CTOs to marketing analysts can generate valuable business intelligence by automating data and analytics orchestrations.
Tags : 
    
AWS
Published By: AWS     Published Date: Nov 15, 2018
Getting the right analytics, quickly and easily, is important to help grow your organization. But analytics isn’t just about collecting and exploring data. The truly important step resides in converting this data into actionable insights. Acquiring these insights requires some planning ahead. While ease of deployment, time-to-insight, and cost are all important, there are several more assessments you need to make before choosing the right solution. Learn the 8 must-have features to look for in data visualization. Download this white paper to learn how TIBCO® Spotfire® in AWS Marketplace can help provide you with advanced, cost-effective analytics.
Tags : 
    
AWS