Processing

Published By: MemSQL     Published Date: Jun 25, 2014
Emerging business innovations focused on realizing quick business value from new and growing data sources require “hybrid transactional and analytical processing” (HTAP), the notion of performing analysis on data directly in an operational data store. While this is not a new idea, Gartner reports that the potential for HTAP has not been fully realized due to technology limitations and inertia in IT departments. MemSQL offers a unique combination of performance, flexibility, and ease of use that allows companies to implement HTAP to power their business applications.
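The abstract stops at the concept, so here is a minimal sketch of the HTAP idea: analytics running directly against the operational store rather than a separately loaded warehouse. Python's built-in SQLite serves purely as a stand-in for an operational database (this is not MemSQL's API), and the orders table and its columns are invented for illustration.

import sqlite3

# Stand-in operational store; a real HTAP system such as MemSQL is a
# distributed, memory-optimized SQL database, not SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# Transactional side: operational writes land here continuously.
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east", 120.0), ("west", 75.5), ("east", 33.2)],
)
conn.commit()

# Analytical side: the aggregate runs on the same store, with no ETL hop
# into a separate warehouse -- the core of the HTAP notion.
for region, total in conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(region, total)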
Published By: Skytree     Published Date: Nov 23, 2014
Critical business information is often in the form of unstructured and semi-structured data that can be hard or impossible to interpret with legacy systems. In this brief, discover how you can use machine learning to analyze both unstructured text data and semi-structured log data, providing you with the insights needed to achieve your business goals.
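The brief names no tooling, so the sketch below uses scikit-learn (an assumption, not Skytree's product API) to show the shape of the task: turning unstructured text into numeric features and fitting a classifier. The sample log lines and labels are made up.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy unstructured text with invented labels (1 = error report, 0 = routine).
messages = [
    "disk failure on node 7, restarting service",
    "nightly backup completed successfully",
    "timeout connecting to payment gateway",
    "user login successful from new device",
]
labels = [1, 0, 1, 0]

# Turn free text into TF-IDF features, then fit a simple classifier.
vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(messages)
model = LogisticRegression().fit(features, labels)

# Score a previously unseen log line.
new_line = vectorizer.transform(["connection timeout on node 3"])
print(model.predict(new_line))  # expect 1: likely an error report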
Tags: log data, machine learning, natural language, nlp, natural language processing, skytree, unstructured data, semi-structured data
Published By: Expert System     Published Date: Mar 19, 2015
Establishing context and knowledge capture: in today’s knowledge-infused world, it is vitally important for organizations of any size to deploy an intuitive knowledge platform that delivers the right information at the right time, in a way that is useful and helpful. Semantic technology processes content for meaning, making it possible to understand words in context. This enables better content processing and interpretation, and therefore content organization and navigation, which in turn increase findability.
Tags: enterprise data management, unstructured data, semantic technology, expert system
Published By: Ontotext     Published Date: Dec 21, 2015
Learn how semantic technologies make any content intelligent and turn it into revenue for your publishing business. There is a smarter, cost-effective way for publishers to create, maintain and reuse content assets with higher accuracy: dynamic semantic publishing. An efficient blend of semantic technologies put to work for the publishing industry, dynamic semantic publishing enables powerful experiences when it comes to publishers’ main stock in trade: processing and representing information.
Published By: WhereScape     Published Date: Aug 18, 2016
Data Vault 2.0 leverages parallel database processing for large data sets and provides an extensible approach to design that enables agile development. WhereScape provides data warehouse automation software solutions that enable Data Vault agile project delivery through accelerated development, documentation and deployment without sacrificing quality or flexibility.
Published By: Basho     Published Date: Nov 25, 2015
The landscape of scalable operational and analytical systems is changing, disrupting the norm of using relational databases for all workloads. With the growing need to process and analyze Big Data at scale, demand for alternative strategies has grown, giving rise to NoSQL databases for scalable processing. Mike Ferguson, Managing Director of Intelligent Business Strategies, is an independent IT analyst who specializes in Big Data, BI/analytics, data management and enterprise business integration. In this whitepaper he discusses the movement toward NoSQL databases for scalable operational and analytical systems; what is driving Big Data analytics from Hadoop to the emergence of Apache Spark; the value of operational analytics and the importance of in-memory processing; and why Apache Spark makes a strong in-memory analytical platform for operational analytics.
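As a rough PySpark sketch of the in-memory pattern discussed here (invented data and column names; not code from the paper), caching a dataset in cluster memory lets repeated analytical queries avoid re-reading the source:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("operational-analytics").getOrCreate()

# Toy operational events; in practice these would stream in from an
# operational (often NoSQL) store.
events = spark.createDataFrame(
    [("sensor-1", 21.5), ("sensor-2", 19.8), ("sensor-1", 22.1)],
    ["device", "reading"],
)

# cache() pins the dataset in memory, so the repeated aggregations
# below do not rescan the source each time.
events.cache()

events.groupBy("device").agg(F.avg("reading").alias("avg_reading")).show()
events.groupBy("device").count().show()

spark.stop()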
Published By: DATAVERSITY     Published Date: Dec 27, 2013
There are actually many elements of such a vision that work together. ACID and NoSQL are not the antagonists they were once thought to be; NoSQL works well under a BASE model, and some innovative NoSQL systems fully conform to ACID requirements. Database engineers have puzzled out how to get non-relational systems to work within an environment that demands high availability and scalability, with differing levels of recovery and partition tolerance. BASE is still a leading innovation wedded to the NoSQL model, and the evolution of the two together is harmonious. But that doesn’t mean they always have to be in partnership; there are several options. So while the opening anecdote is true in many cases, organizations that need more diverse possibilities can move into the commercial arena and get the specific option that works best for them. This paper is sponsored by: MarkLogic.
Tags: nosql, database, acid v base, white paper
Published By: MarkLogic     Published Date: Jun 16, 2013
The primary issue discussed within this paper boils down to two disparate database reliability models: ACID vs BASE. The first (ACID) has been around for some 30+ years, is a proven industry standard for SQL-centric and other relational databases, and works remarkably well in the older, yet still extant, world of vertical scaling. The second (BASE) has only recently gained popularity over the past 10 years or so, especially with the rise of social networking, Big Data, NoSQL, and other leviathans in the new world of Data Management. BASE requirements rose out of a need for ever-expanding horizontally scaled distributed networks, with non-relational data stores, and the real-time availability constraints of web-based transaction processing. While there are now more crossovers and negotiations between the two models, they essentially represent two competing groups, with Brewer’s CAP Theorem acting as the referee in the middle forcing tough decisions on each team.
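Neither model is shown in code in the paper; as a toy illustration of the BASE side of the trade-off, the sketch below (all names invented) models a primary and a secondary replica with asynchronous replication, so a read can briefly return stale data before the replicas converge:

# Toy eventual consistency: writes are acknowledged by the primary and
# shipped to the secondary later, so secondary reads can lag -- the
# "basically available, eventually consistent" behavior of BASE.
primary = {}
secondary = {}
replication_log = []

def write(key, value):
    primary[key] = value                   # acknowledged immediately
    replication_log.append((key, value))   # replicated asynchronously

def read_secondary(key):
    return secondary.get(key)              # may be stale

def replicate():
    while replication_log:
        key, value = replication_log.pop(0)
        secondary[key] = value

write("balance", 100)
print(read_secondary("balance"))  # None: stale read before replication
replicate()
print(read_secondary("balance"))  # 100: replicas have converged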
Tags: data, data management, unstructured data, nosql, database, acid, base, database transactioning
Published By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at amazing rates and will continue this rapid growth. New techniques in data processing and analytics, including AI and machine and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Computer systems consisting of multi-core CPUs or GPUs using parallel processing, connected by extremely fast networks, are required to process the data. However, legacy storage solutions are based on architectures that are decades old, unscalable, and not well suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
Tags: reporting, artificial intelligence, insights, organization, institution, recognition
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, providing an excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need for high availability and fast response times by optimizing mainframe performance with new machine learning and analytics capabilities.
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags: database, applications, data availability, cognitive applications
Published By: Oracle     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally maintain both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data.
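The row-versus-column point is easy to see in miniature. The pure-Python sketch below (invented data, no Oracle APIs) contrasts a row layout, convenient for fetching one whole record in a transaction, with a column layout, where an aggregate touches only the single column it needs:

# Row format: each record stored together -- suits OLTP, where a
# transaction reads or updates a whole row at a time.
rows = [
    {"id": 1, "region": "east", "amount": 120.0},
    {"id": 2, "region": "west", "amount": 75.5},
    {"id": 3, "region": "east", "amount": 33.2},
]
order = next(r for r in rows if r["id"] == 2)   # fetch one full record

# Column format: each column stored contiguously -- suits analytics,
# where a scan reads only the columns the query touches.
columns = {
    "id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "amount": [120.0, 75.5, 33.2],
}
total = sum(columns["amount"])                  # scans a single column

print(order, total)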
Published By: Workday SA     Published Date: Jan 16, 2018
Financial transformation by definition is not something you can bolt on. It requires a willingness to question long-held assumptions, a vision of where you want to go, and a total technology rethink. In the next blog, we’ll take a closer look at how one unified, cloud-based system can create the perfect environment for finance to handle transaction processing, compliance, and control while delivering the answers the business needs.
Tags: financial, technology, innovation, optimization, business, workday
Published By: Mimecast     Published Date: Apr 18, 2017
"Your Email & The EU GDPR GDPR changes how organizations need to protect personal data, including data contained in email and contact databases. Regardless of physical location, you must be in GDPR compliance for EU resident personal data by May 2018. Download the white paper to learn: - The unprecedented level of effort required for collecting and processing personal data - The specific security, privacy and protection requirements to comply with GDPR - How a majority (58%) of mid-sized and large organizations have a poor understanding of the wide scope of the regulation and associated penalties"
Tags: gdpr, email, personal data, security, data protection
Published By: Snowflake     Published Date: Jan 25, 2018
Compared with implementing and managing Hadoop or a traditional on-premises data warehouse, a data warehouse built for the cloud can deliver a multitude of unique benefits. The question is, can enterprises get the processing potential of Hadoop and the best of traditional data warehousing, and still benefit from related emerging technologies? Read this eBook to see how modern cloud data warehousing presents a dramatically simpler but more powerful approach than both Hadoop and traditional on-premises or “cloud-washed” data warehouse solutions.
Published By: NEC     Published Date: Sep 29, 2009
Written by Abner Germanow, Jonathan Edwards, and Lee Doyle of IDC. IDC believes the convergence of communications and mainstream IT architectures will drive significant innovation in business processes over the next decade.
Tags: it architecture, idc, automation, automated order processing, soa, service oriented architecture, soap, http
Published By: Ephesoft     Published Date: Jan 18, 2018
Mortgage lenders handle a variety of document-intensive processes throughout the entire loan lifecycle. Learn more about how intelligent document capture technology can improve mortgage processing, mortgage loan due diligence, loan onboarding, settlement services, loan servicing and reporting, in the cloud or on-premises, working with your existing LOS or other back-end platforms.
Tags: mortgage, financial, solutions, ephesoft, transact, reporting, cloud, back end
Published By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that the impact of downtime affects a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags: vision, high availability, ibm, aix, cdp, core union, database security
Published By: SAP     Published Date: Feb 03, 2017
The SAP HANA platform provides a powerful unified foundation for storing, processing, and analyzing structured and unstructured data. It runs on a single, in-memory database, eliminating data redundancy and speeding up the time for information research and analysis.
Published By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers, and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
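SAP HANA exposes spatial processing through its own SQL functions, which the description doesn't reproduce; purely as a generic illustration of the kind of location-specific question involved ("which customers are within 15 km of a store?"), here is a plain-Python haversine sketch with made-up coordinates:

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometers.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

store = (48.7758, 9.1829)            # invented store location
customers = {
    "c1": (48.7400, 9.3200),         # roughly 11 km away
    "c2": (48.1351, 11.5820),        # far outside the radius
}

# Location-specific filter: customers within 15 km of the store.
nearby = {cid: round(haversine_km(*store, *loc), 1)
          for cid, loc in customers.items()
          if haversine_km(*store, *loc) <= 15.0}
print(nearby)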
Published By: Cisco EMEA Tier 3 ABM     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
Tags: big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
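A minimal sketch of the pattern described above (all data and names invented): parse the data once into RAM, then answer every subsequent query from the resident structure instead of re-reading and re-parsing storage per query.

import csv, io

# Pretend on-disk data; a real system would hold gigabytes in files.
DISK = "id,amount\n1,120.0\n2,75.5\n3,33.2\n"

def query_from_storage():
    # Disk-oriented pattern: re-read and re-parse per query.
    reader = csv.DictReader(io.StringIO(DISK))
    return sum(float(row["amount"]) for row in reader)

# In-memory pattern: load once, keep the dataset resident in RAM...
table = list(csv.DictReader(io.StringIO(DISK)))

def query_in_memory():
    # ...so each query is just a scan over structures already in memory.
    return sum(float(row["amount"]) for row in table)

print(query_from_storage(), query_in_memory())  # same answer, different cost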
Tags: sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags: database usage, database management, server usage, data protection
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction processing (OLTP) and online analytical processing (OLAP) applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags: cost reduction, oracle database, it operation, online transaction, online analytics
Published By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
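The solution brief does not describe its positioning math, so the following is strictly a hypothetical, textbook illustration of how a position can be computed from distances to known points; Cisco's actual system combines RF antenna design and other techniques. The access-point coordinates and ranges are invented.

def trilaterate(p1, r1, p2, r2, p3, r3):
    # Solve for (x, y) given three anchors and a distance to each.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the first circle equation from the other two gives a
    # 2x2 linear system, solved here by Cramer's rule.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Invented access-point positions (meters) and measured ranges to a client.
ap1, ap2, ap3 = (0.0, 0.0), (10.0, 0.0), (0.0, 10.0)
print(trilaterate(ap1, 7.07, ap2, 7.07, ap3, 7.07))  # approx (5.0, 5.0)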