scale

Published By: DATAVERSITY     Published Date: Jul 24, 2014
Will the “programmable era” of computers be replaced by cognitive computing systems that can learn from interactions and reason through dynamic experience, much as humans do? With rapidly increasing volumes of Big Data, there is a compelling need for smarter machines that can organize data faster, make better sense of it, discover insights, and then learn, adapt, and improve over time without direct programming. This paper is sponsored by Cognitive Scale.
Tags: data, data management, cognitive computing, machine learning, artificial intelligence, research paper
Published By: Tamr, Inc.     Published Date: Feb 08, 2019
Traditional data management practices, such as master data management (MDM), have been around for decades – as have the approaches vendors take in developing these capabilities. They were well equipped for the problem at hand: managing data of modest size and complexity. However, as enterprises mature and start to view their data assets as a source of competitive advantage, new methods of managing enterprise data become desirable. Enterprises now need approaches to data management that can solve critical issues of speed and scale in an increasingly complex data environment. This paper explores how data curation technology can be used to solve data mastering challenges at scale.
Published By: Databricks     Published Date: Sep 13, 2018
Learn how to get started with Apache Spark™. Apache Spark’s ability to speed analytic applications by orders of magnitude, its versatility, and its ease of use are quickly winning the market. With its appeal to developers, end users, and integrators solving complex data problems at scale, Spark is now the most active open source project in the big data community. With rapid adoption by enterprises across a wide range of industries, Spark has been deployed at massive scale, collectively processing multiple petabytes of data on clusters of over 8,000 nodes. If you are a developer or data scientist interested in big data, learn how Spark may be the tool for you. Databricks is happy to present this ebook as a practical introduction to Spark. Download this ebook to learn:
• Spark’s basic architecture
• Why Spark is a popular choice for data analytics
• What tools and features are available
• How to get started right away through interactive sample code (see the sketch below)
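To give a flavor of that interactive sample code, here is a minimal PySpark sketch (not taken from the ebook; the file name and column are illustrative placeholders):

```python
# Minimal getting-started sketch with PySpark. The CSV path and the
# "category" column are placeholders for illustration.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("getting-started")
         .getOrCreate())

# Read a CSV into a distributed DataFrame; Spark parallelizes this
# across the cluster (or across local cores when run locally).
df = spark.read.csv("events.csv", header=True, inferSchema=True)

# A simple aggregation: count rows per category, largest groups first.
df.groupBy("category").count().orderBy("count", ascending=False).show()

spark.stop()
```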
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded, and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It is increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
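To make the schema contrast concrete, here is a small illustrative sketch using plain Python and JSON rather than Couchbase’s SDK; the records are invented:

```python
# Illustrative sketch (not Couchbase's API): how a schema-less document
# model absorbs variety that a rigid relational schema cannot.
import json

# Two records of the same logical type with different shapes; a relational
# table would need nullable columns or side tables for the differences.
users = [
    {"id": "u1", "name": "Ada", "email": "ada@example.com"},
    {"id": "u2", "name": "Lin", "devices": ["phone", "tablet"],
     "preferences": {"theme": "dark"}},
]

# A document store keys each JSON document by id; no ALTER TABLE is
# needed when a new field appears in later documents.
store = {u["id"]: json.dumps(u) for u in users}
print(json.loads(store["u2"])["preferences"]["theme"])  # -> dark
```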
Tags: database, nosql, data, data management, white paper, why nosql, couchbase
Published By: Cambridge Semantics     Published Date: Mar 13, 2015
As the quantity and diversity of relevant data grows within and outside the enterprise, how can IT easily deploy secure, governed solutions that allow business users to identify, extract, link together, and derive value from the right data at the right time, at big data scale, while keeping up with ever-changing business needs? Smart Enterprise Data Management (Smart EDM) is a new, sensible paradigm for managing enterprise data. Anzo Smart Data solutions allow IT departments and their business users to quickly and flexibly access all of their diverse data. Built on graph data models and Semantic data standards, Anzo enables users to easily perform advanced data management and analytics through the lens of their business at a fraction of the time and cost of traditional approaches, while adhering to the governance and security required by enterprise IT groups. Download this whitepaper to learn more.
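For readers new to the graph data models and Semantic standards mentioned above, here is a brief sketch using the open-source rdflib library (not Anzo’s API); the namespace and triples are invented for illustration:

```python
# Graph/semantic modeling sketch with rdflib: data is stored as
# subject-predicate-object triples, so new relationships can be added
# without redesigning a schema.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
g = Graph()

g.add((EX.alice, EX.worksFor, EX.acme))
g.add((EX.alice, EX.name, Literal("Alice")))
g.add((EX.acme, EX.locatedIn, EX.boston))

# SPARQL query: who works for an organization located in Boston?
q = """
SELECT ?name WHERE {
  ?person <http://example.org/worksFor> ?org .
  ?person <http://example.org/name> ?name .
  ?org <http://example.org/locatedIn> <http://example.org/boston> .
}
"""
for row in g.query(q):
    print(row.name)  # -> Alice
```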
Tags: enterprise data management, data governance, data integration, cambridge semantics
Published By: Cambridge Semantics     Published Date: Aug 17, 2015
As the quantity and diversity of relevant data grows within and outside of the enterprise, business users and IT are struggling to extract maximum value from this data. Current approaches, including the rigid relational data warehouse and the unwieldy Hadoop-only data lake, are limited in their ability to provide users and IT with the answers they need under the proper governance and security. Read this whitepaper to learn how the Anzo Smart Data Lake from Cambridge Semantics solves these problems by disrupting the way IT and business alike manage and analyze data at enterprise scale, with unprecedented flexibility, insight and speed.
Published By: Cambridge Semantics     Published Date: May 11, 2016
With the explosive growth of Big Data, IT professionals find their time and resources squeezed between managing increasingly large and diverse siloed data stores and meeting growing user demands for timely, accurate data. The graph-based Anzo Smart Data Manager is built to relieve these burdens by automating the process of managing, cataloging and governing data at enterprise scale and with enterprise security. Anzo Smart Data Manager allows companies to truly understand their data ecosystems and leverage the metadata within them.
Published By: Splice Machine     Published Date: Nov 16, 2014
Organizations are now looking for ways to handle exploding data volumes while reducing costs and maintaining performance. Managing large volumes and achieving high levels of concurrency on traditional scale-up databases, such as Oracle, often means purchasing expensive scale-up hardware. In this white paper, learn about the different options and benefits of scale-out solutions for Oracle database users.
Tags: splice machine, oracle, oracle database, database, hadoop, nosql, white paper, data, data management, dataversity
Published By: Cloudant - an IBM Company     Published Date: Jun 01, 2015
Whether you're a DBA, data scientist or developer, you're probably considering how the cloud can help modernize your information management and analytics strategy. Cloud data warehousing can help you get more value from your data by combining the benefits of the cloud - speed, scale, and agility - with the simplicity and performance of traditional on-premises appliances. This white paper explores how a cloud data warehouse like IBM dashDB can reduce costs and deliver new business insights. Readers will learn about:
- How data warehousing-as-a-service helps you scale without incurring extra costs
- The benefits of in-database analytics in a cloud data warehouse (illustrated below)
- How a cloud data warehouse can integrate with the larger ecosystem of business intelligence tools, both on prem and off prem
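As a sketch of the in-database analytics idea, the snippet below pushes aggregation into the database engine so only a small result set travels to the client. The standard library’s sqlite3 stands in for a dashDB warehouse, which would be reached through its own driver:

```python
# In-database analytics sketch: the aggregation runs inside the engine,
# not in client code. sqlite3 is a stand-in; the table is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 120.0), ("east", 80.0), ("west", 200.0)])

# Only the grouped totals cross the wire, not the raw rows.
for region, total, avg in conn.execute(
        "SELECT region, SUM(amount), AVG(amount) FROM sales GROUP BY region"):
    print(region, total, avg)
```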
Tags: nosql, ibm, dashdb, database, cloud
Published By: VoltDB     Published Date: Feb 12, 2016
The need for fast data applications is growing rapidly, driven by the IoT, the surge in machine-to-machine (M2M) data, global mobile device proliferation, and the monetization of SaaS platforms. So how do you combine real-time streaming analytics with real-time decisions in an architecture that’s reliable, scalable, and simple? In this report, Ryan Betts and John Hugg from VoltDB examine ways to develop apps for fast data using pre-defined patterns. These patterns are general enough to suit both the do-it-yourself hybrid batch/streaming approach and the simpler, proven in-memory approach available with certain fast database offerings.
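A language-neutral sketch of one such fast-data pattern (ingest, in-memory aggregate, per-event decision) follows in plain Python; it is not VoltDB’s API, and the device fields and threshold are made up for illustration:

```python
# Fast-data pattern sketch: ingest events, keep a rolling in-memory
# aggregate, and make a real-time decision on each event.
from collections import defaultdict

THRESHOLD = 100  # hypothetical per-device alert threshold

counts = defaultdict(int)  # in-memory state, keyed by device id

def on_event(event):
    """Ingest one M2M event and decide immediately."""
    counts[event["device"]] += event["value"]
    if counts[event["device"]] > THRESHOLD:
        return f"alert: {event['device']} exceeded {THRESHOLD}"
    return "ok"

stream = [{"device": "d1", "value": 60}, {"device": "d1", "value": 50}]
for ev in stream:
    print(on_event(ev))  # the second event trips the alert
```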
Published By: Reltio     Published Date: Feb 12, 2016
Reltio delivers reliable data, relevant insights and recommended actions so companies can be right faster. Reltio Cloud combines data-driven applications with modern data management for better planning, customer engagement and risk management. IT streamlines data management for a complete view across all sources and formats at scale, while sales, marketing and compliance teams use data-driven applications to predict, collaborate and respond to opportunities in real time. Companies of all sizes, including leading Fortune 500 companies in healthcare and life sciences, distribution and retail, rely on Reltio.
Published By: Reltio     Published Date: Aug 11, 2017
"Forrester's research uncovered a market in which Reltio [and other companies] lead the pack,” the Forrester Wave Master Data Management, 2016 states. "Leaders demonstrated extensive and MDM capabilities for sophisticated master data scenarios, large complex ecosystems, and data governance to deliver enterprise-scale business value.”
Published By: Reltio     Published Date: May 22, 2018
"Forrester's research uncovered a market in which Reltio [and other companies] lead the pack,” the Forrester Wave Master Data Management states. "Leaders demonstrated extensive and MDM capabilities for sophisticated master data scenarios, large complex ecosystems, and data governance to deliver enterprise-scale business value.” Reltio executes the vision for next-generation MDM by converging trusted data management with business insight solutions at scale and in the cloud. Machine learning and graph technology capabilities enable a contextual data model while also maintaining temporal and lineage changes of the master data.
Published By: Alteryx     Published Date: May 24, 2017
Spreadsheets are a mainstay in almost every organization. They are a great way to calculate and manipulate numeric data to make decisions. Unfortunately, as organizations grow, so does their data, and relying on spreadsheet-based tools like Excel for heavy data preparation, blending and analysis can be cumbersome and unreliable. Alteryx, Inc. is a leader in self-service data analytics and provides analysts with the ability to easily prep, blend, and analyze all data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. This paper highlights how transitioning from a spreadsheet-based environment to an Alteryx workflow approach can help analysts better understand their data, improve consistency, and operationalize analytics through a flexible deployment and consumption environment.
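As a sketch of the repeatable-workflow idea, the snippet below uses pandas as a stand-in for an Alteryx workflow; the file names and columns are placeholders, not anything from the paper:

```python
# Repeatable prep/blend/analyze pipeline: unlike hand-edited spreadsheet
# formulas, the same steps run identically on every refresh of the data.
import pandas as pd

def run_workflow(orders_csv: str, customers_csv: str) -> pd.DataFrame:
    orders = pd.read_csv(orders_csv)             # prep: load
    orders = orders.dropna(subset=["amount"])    # prep: clean
    customers = pd.read_csv(customers_csv)
    blended = orders.merge(customers, on="customer_id")  # blend
    # analyze: revenue per region, ready to share or schedule
    return blended.groupby("region")["amount"].sum().reset_index()

# Re-running the function reproduces the result on refreshed inputs:
# print(run_workflow("orders.csv", "customers.csv"))
```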
Published By: Ataccama     Published Date: Mar 16, 2018
Discover how a major healthcare provider and facility operator in the United States implemented Ataccama ONE data quality tools to scale their data quality across the enterprise.
Published By: Basho     Published Date: Nov 25, 2015
The landscape of scalable operational and analytical systems is changing, disrupting the norm of using relational databases for all workloads. With the growing need to process and analyze Big Data at scale, demand for alternative strategies has grown, giving rise to NoSQL databases for scalable processing. Mike Ferguson, Managing Director of Intelligent Business Strategies, is an independent IT analyst who specializes in Big Data, BI/analytics, data management and enterprise business integration. In this whitepaper he discusses the movement towards NoSQL databases for scalable operational and analytical systems, what is driving Big Data analytics from Hadoop to the emergence of Apache Spark, the value of operational analytics and the importance of in-memory processing, and why you should use Apache Spark as your in-memory analytical platform for operational analytics.
Published By: MarkLogic     Published Date: Jun 16, 2013
The primary issue discussed within this paper boils down to two disparate database reliability models: ACID vs BASE. The first (ACID) has been around for some 30+ years, is a proven industry standard for SQL-centric and other relational databases, and works remarkably well in the older, yet still extant, world of vertical scaling. The second (BASE) has only recently gained popularity over the past 10 years or so, especially with the rise of social networking, Big Data, NoSQL, and other leviathans in the new world of Data Management. BASE requirements rose out of a need for ever-expanding horizontally scaled distributed networks, with non-relational data stores, and the real-time availability constraints of web-based transaction processing. While there are now more crossovers and negotiations between the two models, they essentially represent two competing groups, with Brewer’s CAP Theorem acting as the referee in the middle forcing tough decisions on each team.
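To ground the ACID side of that contrast, here is a small Python illustration using the standard library’s ACID-compliant sqlite3 module (not MarkLogic’s API): a failed transfer rolls back atomically, whereas a BASE system would tolerate temporary inconsistency in exchange for availability at scale. The accounts and amounts are invented:

```python
# Atomicity demo: the whole transfer commits or none of it does.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 0)])
conn.commit()

try:
    with conn:  # transaction: commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 150 "
                     "WHERE name = 'alice'")
        # Enforce the invariant; violating it aborts the transaction.
        (bal,) = conn.execute("SELECT balance FROM accounts "
                              "WHERE name = 'alice'").fetchone()
        if bal < 0:
            raise ValueError("insufficient funds")
        conn.execute("UPDATE accounts SET balance = balance + 150 "
                     "WHERE name = 'bob'")
except ValueError:
    pass

# Alice still has 100: the partial debit was never made visible.
print(conn.execute("SELECT * FROM accounts").fetchall())
```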
Tags: data, data management, unstructured data, nosql, database, acid, base, database transactioning
Published By: Splice Machine     Published Date: May 19, 2014
SQL-on-Hadoop solutions have become very popular recently as companies solve the data access issues with Hadoop or seek a scale-out alternative to traditional relational database management systems. However, with all of the options available, choosing which solution is right for your business can be a daunting task. This white paper discusses the options you should consider and questions to ask, including:
• Is it really “real-time”?
• Is it true SQL?
• Does it support secondary indexes?
• Can it efficiently handle sparse data?
• Can it deliver fast performance on massive joins?
Read this white paper to get a better understanding of the SQL-on-Hadoop landscape and what questions you should ask to identify the best solution for your business.
Tags: white paper, splice machine, sql, hadoop, nosql, nosql white paper, hadoop white paper, dataversity
Published By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
Powerful IT doesn’t have to be complicated. Hyperconvergence puts your entire virtualized infrastructure and advanced data services into one integrated powerhouse. Deploy HCI on an intelligent fabric that can scale with your business and you can hyperconverge the entire IT stack. This guide will help you:
• Understand the basic tenets of hyperconvergence and the software-defined data center
• Solve for common virtualization roadblocks
• Identify 3 things modern organizations want from IT
• Apply 7 hyperconverged tactics to your existing infrastructure now
Published By: Sage EMEA     Published Date: Jan 29, 2019
Transform your finance operations into a strategic, data-driven engine. Data inundation and information overload have burdened practically every large-scale enterprise today, providing great amounts of detail but often very little context on which executives can act. According to the Harvard Business Review [1], less than half of an organisation’s structured data is actively used in making decisions. The burden is felt profoundly among finance executives, who increasingly require fast and easy access to real-time data in order to make smart, timely, strategic decisions. In fact, 80% of analysts’ time is spent simply discovering and preparing data, and the average CFO receives information too late to make decisions 24% of the time [2].
Published By: Google Apigee     Published Date: Feb 05, 2019
Enterprises can extend their services with scale and speed that would have been unthinkable just a few years ago. They can focus on their core strengths while leveraging resources from other ecosystem participants, creating richer, more valuable digital experiences than most companies have the resources to produce alone and spreading their businesses to new audiences and markets. The speed of change and the range of opportunities are dizzying—which is why it’s so easy for businesses to hit hurdles or chart the wrong course.
Published By: ipoque     Published Date: Feb 14, 2019
Virtualized Evolved Packet Core (vEPC) is a major breakthrough in network function virtualization (NFV). When asked where they have deployed NFV in production networks, communication service providers (CSPs) consistently name vEPC as one of the top answers. Why is that? In order to maximize their processing capacity, CSPs virtualize a subset of their network applications, including mobile edge computing (MEC), base stations (small/macro cells) and the mobile core, because these systems use a large amount of bandwidth. The mobile packet core builds the foundation of the core network on which mobile CSPs offer IP-based services to their customers. Implementing vEPC solutions can help CSPs obtain the scale necessary to accommodate growing numbers of subscribers and large amounts of traffic or connections, while controlling costs and improving quality of experience (QoE). In the past, evolved packet core (EPC) solutions were deployed on purpose-built hardware; NFV enables operators to deploy EPC components as software.
Tags: dpi, deep packet inspection, vepc, sdn, nfv, network analytics, virtual network
Published By: Rackspace     Published Date: Feb 01, 2019
Nearly nine in 10 enterprises have adopted a multi-cloud strategy, according to the latest RightScale State of the Cloud Report, and these enterprises use eight different clouds, on average. Increasingly, their cloud of choice is a public cloud. Every public cloud, however, is different. GCP is rapidly gaining users among companies of all sizes. Today, GCP’s customers include large global brands like Disney, eBay, HSBC, The Home Depot, Schlumberger and Verizon, and smaller ones like gaming platform Smash.gg and the Rhode Island School of Design, one of the nation’s leading arts and design institutions. Whether GCP is a good fit for your company depends on a multitude of factors. To find out what these are and how Rackspace can help your business, download this whitepaper today.
Published By: Mimecast     Published Date: Jan 17, 2019
Two-thirds of all internally generated email is sent between employees within an organization*. Yet most IT organizations focus only on inbound email when it comes to protecting against cyber-attacks. In doing so, they ignore the serious risks posed by internal and outbound emails and the actions of two at-risk groups of users: the compromised and the careless employee. Mimecast Internal Email Protect extends the security capabilities of Targeted Threat Protection to provide advanced inside-the-perimeter defenses. Watch this on-demand webinar, where Mimecast’s Chief Trust Officer, Marc French, and Cyber Security Strategist, Bob Adams, discuss:
• The top things to do to optimize your Targeted Threat Protection implementation and prepare for addressing the threats on the inside
• The multiple ways internal email threats start, and why human error nearly always plays a role
• The scale and impact of attacks that spread via internal email
• How to extend your current protection with Mimecast Internal Email Protect
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical hands-on guide to analyzing and mitigating the risks of migrating to PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against that of open source, it is both a misleading and incorrect approach when evaluating the potential for return on investment of a database technology migration. A key decision criterion for adopting any technology is whether it can support requirements for existing applications while also fitting into longer-term strategies and needs. The first section of this eBook provides a detailed analysis of all aspects of migrating from legacy and commercial solutions to PostgreSQL:
• Schema and code migration
• Data migration
• Application code migration
• Testing and evaluation (see the sketch below)
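As one hedged example of the testing-and-evaluation step above, the sketch below compares per-table row counts between source and target using psycopg2. The connection strings and table names are placeholders, and a non-PostgreSQL source would need its own driver:

```python
# Post-migration validation sketch: verify that each table has the same
# number of rows on the legacy source and the PostgreSQL target.
import psycopg2

TABLES = ["customers", "orders"]  # hypothetical tables to verify

def row_counts(dsn: str) -> dict:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        counts = {}
        for table in TABLES:
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            counts[table] = cur.fetchone()[0]
        return counts

source = row_counts("host=legacy-db dbname=app user=readonly")
target = row_counts("host=pg-target dbname=app user=readonly")
for table in TABLES:
    status = "OK" if source[table] == target[table] else "MISMATCH"
    print(table, source[table], target[table], status)
```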