performance data

Published By: Splice Machine     Published Date: Nov 16, 2014
Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. In this white paper, learn about the different options and benefits of scale-out solutions for Oracle database users.
Tags : 
splice machine, oracle, oracle database, database, hadoop, nosql, white paper, data, data management, dataversity
    
Splice Machine
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud - speed, scale, and agility - with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on prem and off prem
Tags : 
nosql, ibm, dashdb, database, cloud
    
Cloudant - an IBM Company
Published By: Neo Technology     Published Date: Jun 28, 2015
The future of Master Data Management lies in deriving value from data relationships, which reveal insights that become more and more important to competitive advantage in the era of data and business analytics. MDM will be about supplying consistent, meaningful views of master data and unifying data in one location, optimized for query performance and data fit. Graph databases offer exactly that type of data/performance fit. Use data relationships to unlock real business value in MDM:
- Graphs can easily model both hierarchical and non-hierarchical master data
- The logical model IS the physical model, making it easier for business users to visualize data relationships
- Deliver insights in real time from the data relationships in your master data
- Stay ahead of the business with faster development
Download and read the white paper Your Master Data Is a Graph: Are You Ready? to learn why your master data is a graph and how graph databases like Neo4j are the best technologies for MDM.
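To make the first point concrete, here is a minimal sketch (not from the white paper, and not Neo4j's API) of master data held as a graph in plain Python; the entity names and relationship types are invented for illustration. Hierarchical relations (an org chart) and non-hierarchical relations (a partnership) live in one structure and are traversed the same way.

```python
# Minimal sketch: master data as a graph. Nodes are entities; edges are
# named relationships. Hierarchical (org chart) and non-hierarchical
# (partnerships) relations coexist in the same structure.
from collections import defaultdict

class MasterDataGraph:
    def __init__(self):
        self.edges = defaultdict(list)  # node -> [(relation, neighbor)]

    def relate(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def neighbors(self, node, relation):
        return [dst for rel, dst in self.edges[node] if rel == relation]

    def descendants(self, node, relation):
        """Walk a hierarchical relation transitively (e.g. org units)."""
        found, stack = [], [node]
        while stack:
            current = stack.pop()
            for child in self.neighbors(current, relation):
                found.append(child)
                stack.append(child)
        return found

g = MasterDataGraph()
# Hierarchical master data: an organizational hierarchy (names invented).
g.relate("ACME Corp", "has_unit", "EMEA")
g.relate("EMEA", "has_unit", "Sales-DE")
# Non-hierarchical master data: a partnership between peers.
g.relate("Sales-DE", "partners_with", "Distributor-X")

print(g.descendants("ACME Corp", "has_unit"))   # all units under the root
print(g.neighbors("Sales-DE", "partners_with"))
```

In a real graph database the traversal would be a query rather than hand-written code, but the modeling idea is the same: relationships are first-class data.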
Tags : 
database, nosql, graph database, big data, master data management, mdm
    
Neo Technology
Published By: Experian     Published Date: Mar 30, 2017
Businesses today recognize the importance of the data they hold, but a general lack of trust in the quality of their data prevents them from achieving strategic business objectives. Nearly half of organizations globally say that a lack of trust in their data contributes to increased risk of non-compliance and regulatory penalties (52%) and a downturn in customer loyalty (51%). To be of value to organizations, data needs to be trustworthy. In this report, you will read about the findings from this unique study, including:
- How data powers business opportunities
- Why trusted data is essential for performance
- Challenges that affect data quality
- The current state of data management practices
- Upcoming data-related projects in 2017
Tags : 
    
Experian
Published By: Snowflake Computing     Published Date: Feb 27, 2017
Snowflake’s cloud-built data warehouse delivers the performance, concurrency, simplicity and affordability needed to store and analyze all of an organization’s data in one location. Snowflake combines the power of data warehousing, the flexibility of big data platforms and the elasticity of the cloud. Find out more at snowflake.net.
Tags : 
    
Snowflake Computing
Published By: FairCom     Published Date: May 25, 2016
As companies embrace NoSQL as the “next big thing,” they are rightly cautious of abandoning their investment in SQL. The question a responsible developer or IT manager must investigate is “in which cases are each of these technologies, SQL and NoSQL, the appropriate solution?” For example, cloud provider BigStep offered this assessment: “NoSQL is not the best model for OLTP, ad hoc queries, complicated relationships among the data, and situations when stability and reliability outweigh the importance of speed.” While that statement may be true of many NoSQL databases, c-treeACE is the exception. Its unique, No+SQL architecture offers the advantages of SQL on top of a robust, high-performance NoSQL core engine. In this white paper, you'll read five ways c-treeACE breaks the NoSQL mold in terms of:
- Data Integrity
- Availability and Reliability
- Complex Data Relationships
- Flexible Queries
- Performance
Tags : 
    
FairCom
Published By: Wave Computing     Published Date: Jul 06, 2018
This paper argues a case for the use of coarse-grained reconfigurable array (CGRA) architectures for the efficient acceleration of the data flow computations used in deep neural network training and inferencing. The paper discusses the problems with other parallel acceleration systems, such as massively parallel processor arrays (MPPAs) and heterogeneous systems based on CUDA and OpenCL, and proposes that CGRAs with autonomous computing features deliver improved performance and computational efficiency. The machine learning compute appliance that Wave Computing is developing executes data flow graphs using multiple clock-less, CGRA-based Systems on Chips (SoCs), each containing 16,000 processing elements (PEs). This paper describes the tools needed for efficient compilation of data flow graphs to the CGRA architecture, and outlines Wave Computing’s WaveFlow software (SW) framework for the online mapping of models from popular frameworks like TensorFlow, MXNet and Caffe.
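The data flow execution model the paper builds on can be sketched in a few lines of Python. This is an illustration of data flow graph evaluation in general, not of Wave Computing's WaveFlow implementation; the node names and operations are invented. The key property is that a node fires as soon as all of its inputs are available, with no global program counter, which is what makes the model a natural fit for clock-less CGRA hardware.

```python
# Minimal sketch of data flow graph execution: each node fires as soon
# as all of its inputs are available, mirroring how a data flow machine
# schedules work without a global program counter.
def run_dataflow(nodes, inputs):
    """nodes: name -> (fn, [input names]); inputs: name -> value."""
    values = dict(inputs)
    pending = dict(nodes)
    while pending:
        ready = [n for n, (_, deps) in pending.items()
                 if all(d in values for d in deps)]
        if not ready:
            raise ValueError("cycle or missing input in graph")
        for name in ready:          # in hardware, these fire in parallel
            fn, deps = pending.pop(name)
            values[name] = fn(*(values[d] for d in deps))
    return values

# A tiny neural-network-flavored graph: y = relu(w*x + b)
graph = {
    "mul":  (lambda w, x: w * x, ["w", "x"]),
    "add":  (lambda m, b: m + b, ["mul", "b"]),
    "relu": (lambda a: max(a, 0.0), ["add"]),
}
result = run_dataflow(graph, {"w": 2.0, "x": 3.0, "b": -1.0})
print(result["relu"])  # 5.0
```

Everything in the `ready` list at a given step is independent, which is exactly the parallelism a CGRA's processing elements exploit.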
Tags : 
    
Wave Computing
Published By: graphgrid     Published Date: Oct 02, 2018
Whether it’s for a specific application, optimizing your existing operations, or innovating new customer services, graph databases are a powerful technology that turns accessing and analyzing your data into a competitive advantage. Graph databases resolve Big Data limitations and free up data architects and developers to build solutions that predict behaviors, enable data-driven decisions and make insightful recommendations. Yet just as a car isn’t functional with only an engine, graph databases require surrounding capabilities, including ingesting multi-source data, building data models that are unique to your business needs, ease of data interaction and visualization, seamless co-existence with legacy systems, high-performance search, and integration of data analysis applications. Collectively, this comprehensive data platform turns graph capabilities into tangible insights that drive your business forward.
Tags : 
    
graphgrid
Published By: TIBCO Software GmbH     Published Date: Jan 15, 2019
Enterprises use data virtualization software such as TIBCO® Data Virtualization to reduce data bottlenecks so more insights can be delivered for better business outcomes. For developers, data virtualization allows applications to access and use data without needing to know its technical details, such as how it is formatted or where it is physically located. It also helps them rapidly create reusable data services that access and transform data and deliver data analytics quickly, securely, and with high performance, even for heavy-lifting reads. These data services can then be coalesced into a common data layer that can support a wide range of analytic and application use cases. Data engineers and analytics development teams are big users of data virtualization, with Gartner predicting that over 50% of these teams will adopt the technology by 202
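The core idea of a virtual data layer can be illustrated with a toy sketch in plain Python. This is not TIBCO's API; the registry interface, dataset name, and sample rows are all invented. Applications ask for data by logical name and never see where or how it is stored, which is what lets the physical sources change without breaking consumers.

```python
# Minimal sketch of data virtualization: a registry maps logical dataset
# names to source-specific fetchers, so callers never deal with formats
# or physical locations.
class VirtualDataLayer:
    def __init__(self):
        self._sources = {}

    def register(self, name, fetcher):
        """fetcher: zero-arg callable returning rows as a list of dicts."""
        self._sources[name] = fetcher

    def query(self, name, where=None):
        rows = self._sources[name]()          # hides the physical source
        if where:
            rows = [r for r in rows if where(r)]
        return rows

layer = VirtualDataLayer()
# One "source" could be a database, another a CSV file or REST API;
# here the source is stubbed with in-memory data.
layer.register("orders", lambda: [
    {"id": 1, "region": "EMEA", "total": 120.0},
    {"id": 2, "region": "APAC", "total": 80.0},
])

emea = layer.query("orders", where=lambda r: r["region"] == "EMEA")
print(emea)  # [{'id': 1, 'region': 'EMEA', 'total': 120.0}]
```

A reusable data service in the sense of the abstract would be a named, governed version of such a `query` call, shared across applications.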
Tags : 
    
TIBCO Software GmbH
Published By: IBM Watson Health     Published Date: Dec 10, 2018
In this case study, large health systems implemented IBM Watson Health to surface improvement opportunities. Using this tool, they were able to cut costs, reduce patients’ length of stay, acquire actionable data, reduce readmissions, and improve management of COPD and sepsis.
Tags : 
hospitals, health systems, healthcare, analytics, data, insights, case study, performance improvement, healthcare insights, cost, length of stay, financial performance data, readmissions, quality performance, benchmarks, dashboards, patient volume, savings
    
IBM Watson Health
Published By: Lenovo - APAC     Published Date: Feb 11, 2019
Asian ICT infrastructure investment is exploding as businesses review and modernise their data-centre architectures to keep up with the service demands of a growing and increasingly sophisticated population. Demand for cloud services, particularly to support big-data analytics initiatives, is driving this trend. Frost & Sullivan, for example, believes the Asia-Pacific cloud computing market will grow at 28.4 percent annually through 2022. Despite this growth, many businesses are also rapidly realising that public cloud is not the best solution for every need, as it does not always offer the same level of visibility, performance, and control as on-premises infrastructure. This reality is pushing many companies towards the middle ground of hybrid IT, in which applications and infrastructure are distributed across public cloud and self-managed data-centre infrastructure. Read about medical company Mutoh and how it took advantage of the latest technology.
Tags : 
lenovodcg, nutanix, hyperconvergedinfrastructure, hci
    
Lenovo - APAC
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical hands-on guide to “Day One” challenges of deploying, managing and monitoring PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against that of open source, it is both a misleading and incorrect approach when evaluating the potential for return on investment of a database technology migration. An effective monitoring and logging strategy is critical for maintaining the reliability, availability, and performance of database environments. The second section of this eBook provides a detailed analysis of all aspects of monitoring and logging PostgreSQL:
- Monitoring KPIs
- Metrics and stats
- Monitoring tools
- Passive monitoring versus active notifications
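As a concrete example of one common PostgreSQL monitoring KPI, the buffer cache hit ratio can be derived from the blks_hit and blks_read counters that PostgreSQL exposes in the pg_stat_database view. The sketch below (not from the eBook) does the arithmetic in plain Python on invented sample counter values; a real setup would read the counters through a database driver and feed the ratio into a monitoring tool.

```python
# Minimal sketch: compute the buffer cache hit ratio, a standard
# PostgreSQL monitoring KPI, from pg_stat_database-style counters.
def cache_hit_ratio(blks_hit, blks_read):
    """Fraction of block requests served from shared buffers."""
    total = blks_hit + blks_read
    if total == 0:
        return None  # no traffic yet; the ratio is undefined
    return blks_hit / total

# Invented sample counters, as if read from pg_stat_database.
stats = {"blks_hit": 9_900, "blks_read": 100}
ratio = cache_hit_ratio(stats["blks_hit"], stats["blks_read"])
print(f"cache hit ratio: {ratio:.2%}")  # cache hit ratio: 99.00%

# An active-notification rule in miniature: warn when the ratio
# drops below a chosen threshold (threshold value is illustrative).
WARN_BELOW = 0.90
if ratio is not None and ratio < WARN_BELOW:
    print("warning: cache hit ratio below threshold")
```

The same pattern of "read counters, compute a ratio, compare against a threshold" covers many of the KPIs such an eBook discusses, from transaction rollback rates to connection saturation.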
Tags : 
    
Stratoscale
Published By: Xtreme-D     Published Date: Feb 04, 2019
Download the full paper to learn more about how XTREME-Stargate offers a “super head node” for HPC cloud clusters
Tags : 
hpc, high performance computing, clusters, clustering, hybrid cloud, data center, iaas
    
Xtreme-D
Published By: Mindfire     Published Date: May 07, 2010
In this report, results from well over 650 real-life cross-media marketing campaigns across 27 vertical markets are analyzed and compared to industry benchmarks for response rates of static direct mail campaigns, to provide a solid base of actual performance data and information.
Tags : 
mindfire, response rates, personalized cross-media, marketing campaign, personalization, personalized urls, purls, performance data, conversion, marketing roi, direct mail campaign, multi channel marketing, integrated marketing campaign, lead generation, campaign management, tracking and measurement, reports and analytics, marketing dashboard
    
Mindfire
Published By: HPE & Intel®     Published Date: Oct 10, 2016
In the financial services industry (FSI), high-performance compute infrastructure is not optional; it’s a prerequisite for survival. No other industry generates more data, and few face the combination of challenges that financial services does: a rapidly changing competitive landscape, a complex regulatory environment, tightening margin pressure, exponential data growth, and demanding performance service-level agreements (SLAs).
Tags : 
    
HPE & Intel®
Published By: SAP     Published Date: Nov 17, 2016
Culture has become one of the most important business topics of 2016. CEOs and HR leaders now recognize that culture drives people's behavior, innovation, and customer service: 82 percent of survey respondents believe that "culture is a potential competitive advantage." Knowing that leadership behavior and reward systems directly impact organizational performance, customer service, employee engagement, and retention, leading companies are using data and behavioral information to manage and influence their culture.
Tags : 
sap, human resources, employee, deloitte, culture
    
SAP
Published By: Hewlett Packard Enterprise     Published Date: Oct 23, 2017
Midsized firms operate in the same hypercompetitive, digital environment as large enterprises—but with fewer technical and budget resources to draw from. That’s why it is essential for IT leaders to leverage best-practice processes and models that can help them support strategic business goals such as agility, innovation, speed-to-market, and always-on business operations. A hybrid IT implementation can provide the infrastructure flexibility to support the next generation of high-performance, data-intensive applications. A hybrid foundation can also facilitate new, collaborative processes that bring together IT and business stakeholders.
Tags : 
digital environment, hyper competitive, business goals, hybrid it, technology, hpe
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
For the typical enterprise, the volume of data that needs to be managed and protected is growing at roughly 40% per year. Add to that the performance requirements of new applications and the demands for instant response time, always-on availability, and anytime-anywhere access. With such demands, data center managers face storage challenges that cannot be addressed using traditional, spinning-disk technology.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
Very little data is available on how effectively enterprises are managing private cloud deployments in the real world. Are they doing so efficiently, or are they facing challenges in areas such as performance, TCO and capacity? Hewlett Packard Enterprise commissioned 451 Research to explore these issues through a survey of IT decision-makers and data from the Cloud Price Index.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
For the typical enterprise, the volume of data that must be managed and protected is growing by roughly 40% per year. Add to that the performance requirements of new applications and the demands for instant response times, always-on availability, and anytime-anywhere access. With such demands, data center managers face storage challenges that cannot be addressed using traditional spinning-disk technology.
Tags : 
    
Hewlett Packard Enterprise
Published By: HPE Intel     Published Date: Mar 15, 2016
Accelerate your journey to an all-flash data center with Hewlett Packard Enterprise Storage Consulting solutions. Slash costs and double performance with HPE 3PAR StoreServ All-flash arrays. Now you no longer need to choose which apps to take to flash; take them all and you won’t regret it. We deliver maximum performance, highest availability, Tier-1 data services, ease of management, and robust data protection at the lowest total cost of ownership (TCO) on the market when you engage with HPE Storage Consulting to provide an end-to-end all-flash solution.
Tags : 
    
HPE Intel
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction (OLTP) and online analytical (OLAP) processing applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : 
cost reduction, oracle database, it operation, online transaction, online analytics
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Business users expect immediate access to data, all the time and without interruption. But reality does not always meet expectations. IT leaders must constantly perform intricate forensic work to unravel the maze of issues that impact data delivery to applications. This performance gap between the data and the application creates a bottleneck that impacts productivity and ultimately damages a business’ ability to operate effectively. We term this the “app-data gap.”
Tags : 
    
Hewlett Packard Enterprise
Published By: Broadsoft     Published Date: May 25, 2017
This whitepaper is for contact center management and business executives looking for ways to optimize the business performance of their contact center.
- If yours is an existing contact center with infrastructure that has been built up over the years, this whitepaper provides best-practice steps to break down the resulting data silos, unify them, and optimize your contact center for business performance.
- If yours is a new contact center and you have the opportunity to build your infrastructure from the ground up using modern technologies, this whitepaper will provide best practices to prevent building data silos.
Tags : 
    
Broadsoft